The right tool for handling large binary files on an application server?
Hello. I have a question about choosing the right approach and tools.
There is an application server running an API written in PHP. Its job is to process large binary files (from 100 MB to 1000 GB): read a file, send its meta-information to the client as XML, receive instructions from the client on how to process it, overwrite the file with the requested changes, and record information about the file in the database. Currently, all operations on the binaries are performed by a couple of C programs that use the 64-bit functions from unistd.h; the programs are invoked from PHP via exec(). Because of its heterogeneity, the system does not always work reliably, and it is quite inconvenient to make any changes to the code.
I would like to hear advice on how to build such a system properly. How can the shell calls be integrated as cleanly as possible so that the heterogeneity stops causing problems? Should C be used here at all? Perhaps it would make sense to process the files in PHP itself (I know it sounds perverse, but how bad would the performance be?). In general, I want practical advice from colleagues. Thank you.