Cloud Storage or Cloud Compute?
Most problems can be solved with different algorithms and methods whose relative use of memory and computing power varies widely.
I'm interested in the same pricing trade-off for the following case (but in terms of persistent storage, not RAM):
A web application stores 3D models (say, 1 MB each). Besides the original model, the client sometimes (or perhaps quite often) needs simplified versions at lower levels of detail (LOD): the original is 1 MB, the 2nd level of detail is 500 KB, the 3rd is 250 KB, and so on.
I don't yet have statistics on which levels will be requested more often, but the reasoning is as follows:
p1, p2, p3 — prices for Cloud Compute, Storage, and traffic, respectively
Option 1: the model is generated on the server side for each request and sent to the client
- p1*s*q compute time on the server, where s is the file size and q is the number of requests
- p2*s storage
- p3*s*q*l traffic, where l is between 0 and 1 depending on the selected level of detail
Option 2: models are pre-generated for all possible levels of detail
- p1*s*L compute time (one-time), where L is the number of levels of detail
- p2*(s + s/2 + s/4 + ...) ≈ p2*2s storage
- p3*s*q*l traffic
Option 3: the client always downloads the original model and simplifies it on its side
- p1*0 compute time on the server
- p2*s storage
- p3*s*q traffic (l = 1, the full model every time)
- s*q compute time on the client's side
The third option is the cheapest for me, but not for the user, and it generates much more traffic: if we assume the original model is requested half as often as the simplified versions, then in the previous options the average l would be about 0.5, whereas here it is always 1.
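The comparison above can be sketched numerically. The unit prices below are made-up placeholders, not real cloud list prices; only the structure of the formulas comes from the options described:

```python
# Rough monthly-cost model for the three delivery options described above.
# p1, p2, p3 and all sizes are illustrative placeholders, not real prices.

def option1(p1, p2, p3, s, q, l):
    """Generate the requested LOD on the server for every request."""
    compute = p1 * s * q          # per-request generation time ~ file size
    storage = p2 * s              # only the original model is stored
    traffic = p3 * s * q * l      # l in (0, 1]: fraction of original size sent
    return compute + storage + traffic

def option2(p1, p2, p3, s, q, l, L):
    """Pre-generate all L levels of detail once, store them all."""
    compute = p1 * s * L          # one-time generation of every level
    storage = p2 * 2 * s          # s + s/2 + s/4 + ... is about 2*s
    traffic = p3 * s * q * l
    return compute + storage + traffic

def option3(p1, p2, p3, s, q):
    """Always ship the original; the client simplifies it locally."""
    storage = p2 * s
    traffic = p3 * s * q          # l = 1: full-size download every time
    return storage + traffic

# Made-up unit prices where traffic dominates, as the question suspects.
p1, p2, p3 = 0.01, 0.02, 0.12
s, q, l, L = 1.0, 1000, 0.5, 4    # 1 MB model, 1000 requests, avg l = 0.5

print(round(option1(p1, p2, p3, s, q, l), 2))     # 70.02
print(round(option2(p1, p2, p3, s, q, l, L), 2))  # 60.08
print(round(option3(p1, p2, p3, s, q), 2))        # 120.02
```

With these placeholder prices, option 3 indeed doubles traffic spend relative to an average l of 0.5, which is exactly the effect described above.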
Of course, client-side caching helps, but every new user still has to download everything the first time. What about a proxy server that caches the models generated for the first user, so that subsequent users only spend traffic on it (which no longer hits the main server)? What services exist for that, and how does their cost compare with plain storage?
Let me try to estimate what this would cost on Google Cloud with 10,000 models (10 GB) and 300,000 requests per month:
Compute — $25 per month
Storage — $0.26
Bandwidth — $15
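A back-of-envelope check of that estimate, where the unit prices are assumptions for illustration (roughly in line with published per-GB rates at the time, but not quoted from Google's pricing page):

```python
# Sanity check of the Google Cloud estimate above.
# Both unit prices below are assumptions, not official quotes.

storage_gb = 10                   # 10,000 models x ~1 MB
requests_per_month = 300_000
avg_transfer_mb = 0.5             # assuming an average LOD factor l = 0.5

storage_price = 0.026             # assumed $/GB-month for standard storage
egress_price = 0.12               # assumed $/GB for network egress

storage_cost = storage_gb * storage_price
egress_gb = requests_per_month * avg_transfer_mb / 1024
bandwidth_cost = egress_gb * egress_price

print(round(storage_cost, 2))     # 0.26 -- matches the figure above
print(round(bandwidth_cost, 2))   # ~17.6 -- same order as the $15 figure
```

So the shape of the result holds regardless of the exact rates: storage is pennies, and egress is roughly two orders of magnitude larger.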
If I've understood this correctly, storage is a rounding error and traffic is wildly expensive. It seems Amazon, Azure, and GC aren't needed to start (and not only to start) when Digital Ocean and Vultr exist.
Bottom line: in the long run, is it better to invest in computing power or in storage capacity?
Vultr and the like lose to the big clouds in scalability, but yes — their resources are cheaper.
Google gives you $300 in credit for two months — go measure it yourself.
This is laughable — at $15 a month, serve your files from your home computer for free and stop overthinking it.