JavaScript
balloon, 2012-08-23 17:03:19

How to deal with a large amount of data (2M records) in the UI?

There is an internal system for working with large amounts of data.
Users should be able to quickly view this data through a web interface.
Simplified, it looks like this:
1. there is a table in MySQL/Percona (2M records, 350 MB of data)
2. there is a local web server that should deliver this data to the UI
3. the UI should display this data as a table that can be sorted, filtered, paginated, etc.

The following implementation options suggest themselves:

1. Load data for only one page via AJAX whenever the page, sorting, or filter changes.
Result: everything works slowly because of sorting and filtering on non-indexed fields.

2 (current). Load all the data at once and sort/filter on the client side.
Result: while the speed of sorting and filtering is more or less acceptable, the initial data load time is disappointing.

Questions:
1. How can I quickly load 300 MB of data into the UI?
Right now this is several AJAX requests that return the most compact JSON possible.
The data is converted to JSON through PHP, which of course affects performance.
Is it possible in JavaScript to load a CSV (a SELECT dumped to a file) and iterate over it?
Is it possible to load a file with JSON data larger than 75 MB? With anything bigger, my Chrome crashes.
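A minimal sketch of chunked loading, assuming the server can serve the table in N smaller JSON slices (the /data.php?chunk=<i> endpoint is hypothetical): parsing several smaller responses avoids the single huge JSON.parse that crashes Chrome around 75 MB. Written with today's fetch and Promise APIs; at the time, XMLHttpRequest would play the same role.

```javascript
// A sketch, not the author's code: fetch the table in N smaller JSON slices
// so that no single response has to be parsed as one 75+ MB string.
// The endpoint /data.php?chunk=<i> is hypothetical.
async function loadInChunks(totalChunks) {
  const rows = [];
  for (let i = 0; i < totalChunks; i++) {
    const response = await fetch('/data.php?chunk=' + i);
    const part = await response.json();      // each chunk is a plain array of rows
    for (const row of part) rows.push(row);  // append into one flat array
  }
  return rows;
}

loadInChunks(20).then(rows => console.log('loaded', rows.length, 'rows'));
```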

2. How do I store/sort such an amount of data in the UI?
Right now the records are simply stored in an array and sorted with underscore.
I tried SQLite; it was much slower.
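For comparison, a minimal sketch of doing the same with the browser's built-in Array methods instead of underscore; whether the native sort beats _.sortBy on 2M rows is something to measure, and the field names below are made up.

```javascript
// A sketch, assuming rows are plain objects: sort and filter with the native
// Array methods; the column names used in the usage line are hypothetical.
function sortBy(rows, field, desc) {
  const sign = desc ? -1 : 1;
  return rows.slice().sort((a, b) => {
    if (a[field] < b[field]) return -sign;
    if (a[field] > b[field]) return sign;
    return 0;
  });
}

function filterBy(rows, field, value) {
  return rows.filter(row => row[field] === value);
}

// Usage: const visible = filterBy(sortBy(allRows, 'created_at', true), 'status', 'active');
```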

Notes:
1. Browser: Chrome. Support for other browsers is not needed :)
2. Server: PHP 5.3.8/Apache2/Percona 5.1/FreeBSD
3. There are many such tables

Thanks in advance for your advice.


5 answers
Vampiro, 2012-08-23
@Vampiro

In 999 cases out of 1000, people can't do anything useful with 2k rows in front of them. Our brain simply isn't able to handle such volumes at once. A third of users apply the filter they need, another third sorts by the one column that matters to them and scrolls down to the values they want. The rest search for each record individually.
Figure out which third your users belong to. It seems to me it's much easier to build a wizard-style dialog with a filter than to dig through 2k records, 90% of which the user doesn't need :)
Then pull updates from the database via AJAX, if the database doesn't update records often.

1nd1go, 2012-08-23
@1nd1go

1. Load it in "chunks", i.e. in parts. The first request returns a response with, say, the number of partitions and the address to download each one from; then, using a Web Worker for example, pull them down in parallel (see the sketch after this list).
2. Since Web SQL/SQLite in the browser is deprecated, I would try IndexedDB.
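A rough sketch of that partitioned-download idea, with hypothetical file and endpoint names (chunk-worker.js, /data.php), written with today's fetch and Promise APIs (XMLHttpRequest would play the same role at the time): each chunk is downloaded and parsed inside a Web Worker so the UI thread stays responsive.

```javascript
// --- chunk-worker.js (hypothetical file) ---
self.onmessage = async (e) => {
  const response = await fetch('/data.php?chunk=' + e.data.index);
  const rows = await response.json();   // JSON parsing happens off the main thread
  self.postMessage(rows);
};

// --- main page ---
async function loadPartitioned() {
  const meta = await (await fetch('/data.php?meta=1')).json(); // e.g. { chunks: 20 }
  const parts = await Promise.all(
    Array.from({ length: meta.chunks }, (_, i) => new Promise(resolve => {
      const worker = new Worker('chunk-worker.js');
      worker.onmessage = (e) => { resolve(e.data); worker.terminate(); };
      worker.postMessage({ index: i });
    }))
  );
  return parts.flat(); // one array with all rows
}
```

One caveat: postMessage copies the result back to the main thread (structured clone), so the parsing is moved off the UI thread but the transfer itself still costs memory.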

pxx, 2012-08-23
@pxx

Result: everything works slowly because of sorting and filtering on non-indexed fields.

So index the tables; 2k records is not much for MySQL.
As Vampiro rightly said, there is absolutely no point in dumping everything onto the client: the user won't see even 5% of this data, but it will kill the browser's performance.

Eugene, 2012-08-23
@Methos

To reduce the size of the files (data), instead of JSON objects you can try using an array of arrays: the first element of the array holds the keys, the rest is the data. Download it and unpack it into objects on the client.
With this approach, building the JSON on the server is also faster: you can use plain arrays and implode().
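A small sketch of what that payload and the client-side unpacking could look like; the column names are invented for the example.

```javascript
// The first row carries the column names, every following row is just values,
// which makes the JSON much smaller than repeating keys in every object.
const payload = [
  ['id', 'name', 'created_at'],
  [1, 'foo', '2012-08-23'],
  [2, 'bar', '2012-08-24']
];

function unpack(data) {
  const keys = data[0];
  return data.slice(1).map(values => {
    const row = {};
    keys.forEach((key, i) => { row[key] = values[i]; });
    return row;
  });
}

const rows = unpack(payload); // [{id: 1, name: 'foo', ...}, ...]
```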

freeek, 2012-08-23
@freeek

Have you tried any of the ready-made grids like jqGrid? How do they cope with such volumes?
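For reference, a rough sketch of pointing jqGrid at a server-side endpoint so that only one page of rows reaches the browser at a time; the URL, column names, and option values are placeholders and should be checked against the jqGrid documentation.

```javascript
// A sketch: jqGrid with server-side paging; /data.php and the columns are hypothetical.
jQuery('#grid').jqGrid({
  url: '/data.php',
  datatype: 'json',
  colModel: [
    { name: 'id', index: 'id', width: 60 },
    { name: 'name', index: 'name', width: 200 },
    { name: 'created_at', index: 'created_at', width: 120 }
  ],
  rowNum: 50,          // rows per page sent by the server
  pager: '#pager',
  sortname: 'id',
  viewrecords: true
});
```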
