alexoka, 2016-05-24 23:53:27

How do you handle a large amount of data when working with the VK API in a desktop application?

Straight to the point: say there is a group of 143k people, and we want to fetch, for every member, the list of groups they belong to. A single request returns at most about 180 KB in XML format (or about 100 KB in JSON, which does not really change anything). On average a user yields roughly ten times less than the maximum, so about 18 KB per user: 18 KB × 143,000 = 2,574,000 KB, and divided by 1024 that is about 2,513 MB of raw variables sitting in RAM.
And in the end all of that data has to be processed. One option is to accept JSON and cut consumption by about 40%, but that does not solve the problem at all once you process groups with 1,000,000 members.
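One way to keep the footprint bounded is to parse each response as it arrives, keep only the integer group IDs, and discard the raw XML/JSON immediately instead of accumulating it. Below is a minimal C# sketch of that idea. It assumes the VK groups.get method with a user_id parameter; the helper names, token placeholder, stub for loading member IDs, and the rate-limit delay are illustrative, not a drop-in implementation:

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

class GroupCollector
{
    static readonly HttpClient Http = new HttpClient();

    // Hypothetical helper: fetch the group IDs of one user via groups.get
    // and keep only the int IDs, not the raw JSON response.
    static async Task<int[]> GetUserGroupIdsAsync(long userId, string accessToken)
    {
        string url = "https://api.vk.com/method/groups.get" +
                     $"?user_id={userId}&access_token={accessToken}&v=5.131";
        using var stream = await Http.GetStreamAsync(url);
        using var doc = await JsonDocument.ParseAsync(stream);
        // groups.get (without extended=1) returns { "response": { "count": N, "items": [id, id, ...] } };
        // a closed profile will instead return an "error" object, which this sketch does not handle.
        var items = doc.RootElement.GetProperty("response").GetProperty("items");
        var ids = new int[items.GetArrayLength()];
        int i = 0;
        foreach (var id in items.EnumerateArray())
            ids[i++] = id.GetInt32();
        return ids; // only ~4 bytes per group remain; the raw JSON is released here
    }

    static async Task Main()
    {
        string token = "<ACCESS_TOKEN>";        // placeholder
        long[] memberIds = LoadMemberIds();     // hypothetical: the 143k member IDs
        var groupsByUser = new Dictionary<long, int[]>(memberIds.Length);

        foreach (long userId in memberIds)
        {
            groupsByUser[userId] = await GetUserGroupIdsAsync(userId, token);
            await Task.Delay(350);              // stay under VK's ~3 requests/sec limit
        }
        Console.WriteLine($"Collected groups for {groupsByUser.Count} users.");
    }

    static long[] LoadMemberIds() => Array.Empty<long>(); // stub for illustration
}
```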


2 answers
ragimovich, 2016-05-25
@ragimovich

Where do you get 18 KB per user? Is each of your users a member of 4,600 groups? Information about one group takes 4 bytes (there are no VK groups with IDs above 2 billion yet), so a user with 100 groups takes about 400 bytes in memory. 143,000 × 400 bytes is roughly 56 MB. I have no idea what the exact overhead of C# lists and arrays is, but I doubt it is more than one or two times the data volume, i.e. in the worst case about 150 MB of RAM for 150K people.
For comparison, in binary form a database of user IDs for 90M VK groups weighs 35 GB, and here you are talking about some 2.5 GB for 140K users.
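A quick way to sanity-check that arithmetic is to price out the IDs directly. The sketch below assumes a 100-group average and a rough 2x extra for container overhead, both of which are guesses rather than measured values:

```csharp
using System;

// Back-of-the-envelope estimate: group IDs stored as 4-byte ints per user.
const int users = 143_000;
const int groupsPerUser = 100;      // assumed average, not a measured figure
const int bytesPerId = sizeof(int); // 4 bytes; VK group IDs fit in a 32-bit int

long rawBytes = (long)users * groupsPerUser * bytesPerId; // ~55 MB of pure IDs
long withOverhead = rawBytes * 3;                         // generous allowance for array/dictionary overhead

Console.WriteLine($"{rawBytes / (1024.0 * 1024):F0} MB raw, " +
                  $"~{withOverhead / (1024.0 * 1024):F0} MB with overhead");
```

The result lands in the same tens-of-megabytes range as the answer's estimate, nowhere near the 2.5 GB feared in the question.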

kpa6uu, 2016-05-25
@kpa6uu

Perform the necessary calculations on a server, and have the desktop program request the result for a specific user from it.
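With that split, the desktop client stays trivially small. This sketch assumes a hypothetical server endpoint (https://example.com/user-groups/{id}) that returns the precomputed result for one user; the URL and response format are purely illustrative:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class ThinClient
{
    static readonly HttpClient Http = new HttpClient();

    // Hypothetical endpoint: the server holds the full 143k-user dataset,
    // the desktop app only pulls one user's result on demand.
    static Task<string> GetUserGroupsAsync(long userId) =>
        Http.GetStringAsync($"https://example.com/user-groups/{userId}");

    static async Task Main()
    {
        string json = await GetUserGroupsAsync(1);
        Console.WriteLine(json);
    }
}
```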
