Is there any public storage for files that are too big to upload to our server?
The essence of the problem: I need to work with zip archives around 300 MB in size, which are uploaded by clients.
On the front-end nodes, nginx limits the maximum request body size:
client_max_body_size 20m;
I am not allowed to change the value.
The idea was to upload the archive from the frontend to the API of some public storage service, send only a download link to our server, and then have a server-side script pull the archive with curl.
Are there free services that fit these needs? Is this the right approach? Thanks in advance for your advice.
No, there are no free ones.
For storing large files, you should look towards Amazon Glacier.
In general, I don't see a problem with implementing a chunked file upload using one of the wrappers around the JS File API.
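A chunked upload along these lines can be sketched in plain JavaScript with `Blob.slice()`, no wrapper library required. This is only a sketch: the `/upload` endpoint and the form field names are hypothetical, and the server must reassemble the chunks itself. The 10 MB chunk size is chosen to stay safely under the 20 MB nginx limit:

```javascript
// Chunk size well under nginx's client_max_body_size 20m
const CHUNK_SIZE = 10 * 1024 * 1024;

// Compute [start, end) byte ranges covering a file of the given size
function chunkRanges(totalSize, chunkSize = CHUNK_SIZE) {
  const ranges = [];
  for (let start = 0; start < totalSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, totalSize)]);
  }
  return ranges;
}

// Upload one chunk at a time; File.slice() cuts out a byte range
// without reading the whole 300 MB archive into memory.
async function uploadInChunks(file, url) {
  for (const [start, end] of chunkRanges(file.size)) {
    const form = new FormData();
    form.append("chunk", file.slice(start, end));
    form.append("offset", String(start));     // hypothetical field names:
    form.append("total", String(file.size));  // the server script uses these
    const res = await fetch(url, { method: "POST", body: form });
    if (!res.ok) throw new Error(`chunk at ${start} failed: ${res.status}`);
  }
}
```

A 300 MB archive would go up as 30 requests of 10 MB each, none of which trips the nginx limit; the server-side script appends each chunk at its `offset` until `total` bytes have arrived.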