Alexander2014, 2014-05-04 05:19:09
Google Workspace

Is it possible, using Google App Engine, to serve 100,000 queries a day against a 20-million-record database?

Each record is 15 to 400 bytes in size; there are 20 million records in total.
About 100,000 reads of arbitrary (not known in advance) records occur per day.
Is it possible to stay within the free-tier limits?
(Tricks and programming techniques welcome.)
According to the GAE quotas, at most 50,000 read operations against the datastore are allowed per day.


2 answers
rPman, 2014-05-04
@rPman

If it is possible (you said nothing about writes to this data), caching is more than half of the solution.
The fastest but 'wrong' way is to cache in the RAM of the instance that serves your requests (static class members; on the free tier there is only one instance). You get only a few tens of megabytes of RAM, and you have no direct control over when the instance restarts (or whether two instances run at once), but this kind of access has no quota.
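A minimal sketch of the in-RAM variant, assuming the Python runtime; fetch_record and get_record are hypothetical names, not part of any GAE API:

```python
# Per-instance in-RAM cache: a module-level dict survives across requests
# handled by the same instance, but is lost on every instance restart and
# is never shared between instances.
_cache = {}

def fetch_record(key):
    """Hypothetical stand-in for the real (billed) datastore read."""
    raise NotImplementedError

def get_record(key):
    if key not in _cache:
        _cache[key] = fetch_record(key)  # costs one datastore read op
    return _cache[key]
```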
Slower but correct: use the standard memcache (a fast store with no guaranteed persistence). It has its own limits; check them against your pricing plan.
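A sketch of the memcache variant, using the memcache API of the classic GAE Python runtime; fetch_record is again a hypothetical placeholder for the billed datastore lookup:

```python
from google.appengine.api import memcache  # classic GAE Python runtime

def fetch_record(key):
    """Hypothetical stand-in for the real (billed) datastore read."""
    raise NotImplementedError

def get_record(key):
    # Memcache hits do not count against the datastore read quota,
    # but entries can be evicted at any moment, so keep the fallback.
    record = memcache.get(key)
    if record is None:
        record = fetch_record(key)
        memcache.set(key, record, time=3600)  # cache for one hour
    return record
```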
The ugliest option, if it can be done on the client side, is to duplicate the database across several accounts and, accordingly, several domains. For example, if the data were images, the client could pick one of the domains at random when building the URL.
P.S. Two years ago the free limits were in practice several times lower than stated in the pricing; see "Problems with GAE limits, why are Datastore Read Operations too high?"

stahon, 2014-07-15
@stahon

I regularly solve this problem by serializing the data and packing several records into one datastore entity.
'Several' is usually a few hundred.
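A sketch of that packing scheme, assuming the classic Python runtime with ndb, integer record ids, and JSON-serializable values (all names here are illustrative). It also assumes all records of a bucket are written together, e.g. during an initial load:

```python
import json
from google.appengine.ext import ndb  # classic GAE Python runtime

BUCKET_SIZE = 500  # records per entity; at 15-400 bytes per record this
                   # stays far below the 1 MB datastore entity limit

class Bucket(ndb.Model):
    payload = ndb.TextProperty()  # JSON dict: record id -> record value

def bucket_key(record_id):
    # Records 0..499 share bucket "0", 500..999 share bucket "1", etc.
    return ndb.Key(Bucket, str(record_id // BUCKET_SIZE))

def put_records(records):
    """records: dict of int record id -> JSON-serializable value."""
    buckets = {}
    for rid, value in records.items():
        buckets.setdefault(bucket_key(rid), {})[str(rid)] = value
    for key, data in buckets.items():
        Bucket(key=key, payload=json.dumps(data)).put()  # one write op each

def get_record(record_id):
    bucket = bucket_key(record_id).get()  # one read op fetches ~500 records
    return json.loads(bucket.payload)[str(record_id)]
```

Note that a single random read still costs one read op for the bucket fetch; the packing pays off for range reads, and when combined with the memcache idea from the previous answer, since one cached bucket then serves a few hundred different record ids (classic ndb also caches fetched entities in memcache automatically).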
