What is the best way to organize a file database?
Hello!
Let me say up front that MySQL, SQLite, and similar solutions are not an option: the script is being developed for cheap or even free hosting. At the moment the database is a single XML file that is parsed on every user request. The file will weigh about 2 MB on average.
Please refrain from answering if you are against file databases, or if you are about to advocate MySQL, frothing at the mouth and listing its advantages. I know all of its advantages and would gladly use it (or, better yet, Mongo), but the task is specifically a file-based one.
Perhaps there are ready-made solutions, because I suspect my current approach, with a single file, is too resource-intensive. One idea is to create a separate file for each record and include and parse it in the script; another is to load the XML file into memory once and use it in the background.
Any advice?
I found a more or less workable solution: https://github.com/wylst/fllat
In your case you need a very competent specialist who knows all the pitfalls of organizing information this way.
In particular: file locking, waiting for a file to be unlocked, and coordinating simultaneous access in general. If you just do an include and then something like file_put_contents(), you may run into trouble. In essence, you would end up rewriting SQLite in PHP.
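The locking concern above can be sketched with PHP's flock(): shared locks for readers, an exclusive lock for writers. This is a minimal illustration, not a full solution; the function names and the idea of wrapping the whole file are my assumptions.

```php
<?php
// Sketch: coordinating simultaneous access to a flat-file store with flock().
// readStore() takes a shared lock so many readers can proceed at once;
// writeStore() takes an exclusive lock so writers wait for each other.

function readStore(string $path): string
{
    $fh = fopen($path, 'r');
    flock($fh, LOCK_SH);              // shared lock: concurrent reads are fine
    $data = stream_get_contents($fh);
    flock($fh, LOCK_UN);
    fclose($fh);
    return $data;
}

function writeStore(string $path, string $data): void
{
    $fh = fopen($path, 'c');          // create if missing, do not truncate yet
    flock($fh, LOCK_EX);              // exclusive lock: block other writers/readers
    ftruncate($fh, 0);                // only truncate once we hold the lock
    fwrite($fh, $data);
    fflush($fh);
    flock($fh, LOCK_UN);
    fclose($fh);
}
```

Note that flock() is advisory: every script touching the file has to use it, otherwise the protection does nothing.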
Make a quick version and launch it, otherwise you will tinker forever. You have a single 2 MB XML file? That is not really such a problem. One file is better than many small ones, unless of course the disk is an SSD, which is rare on such cheap hosting. Perhaps you should think in the direction of a cache: keep the simple option of parsing the XML, but then put the finished response in a cache for a while, or invalidate it from the script whenever you update the data.
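The caching idea above can be sketched as a small wrapper: serve the cached result while it is fresh, and only re-parse the 2 MB XML when the cache has expired or the data file is newer. The function and parameter names are illustrative.

```php
<?php
// Sketch: cache the rendered result so the XML is not re-parsed on every
// request. $ttl is the cache lifetime in seconds; $render turns the parsed
// SimpleXMLElement into the final response string.

function cachedRender(string $xmlPath, string $cachePath, int $ttl, callable $render): string
{
    if (is_file($cachePath)
        && time() - filemtime($cachePath) < $ttl        // cache still fresh
        && filemtime($cachePath) >= filemtime($xmlPath) // data not updated since
    ) {
        return file_get_contents($cachePath);           // cheap path: cache hit
    }
    $xml  = simplexml_load_file($xmlPath);              // expensive path: parse
    $out  = $render($xml);
    file_put_contents($cachePath, $out, LOCK_EX);       // refresh the cache
    return $out;
}
```

The mtime comparison gives you the "reset according to the script" behavior for free: touching or rewriting the XML file invalidates the cache on the next request.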
In general, as the comrade above said, text files are no worse than databases, and sometimes better.
I wonder why habrahabr does not serve JSON the way Google does.
The data can be stored more simply in JSON, with the file queried (searched) via a "light" request.
For example, the user visits the site and receives their share of the information, no more and no less, rather than the entire file.
I have put this clumsily, but this is how the Google V3 API works; you can see it there.
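The "light request" idea above can be sketched as follows: decode the JSON store and return only the slice the client asked for, instead of sending the whole file. The function name and pagination parameters are made up for illustration.

```php
<?php
// Sketch: serve a page of records from a JSON flat file instead of the
// entire file. $offset/$limit play the role of the "light" request.

function pageOf(string $jsonPath, int $offset, int $limit): array
{
    $all = json_decode(file_get_contents($jsonPath), true);
    return array_slice($all, $offset, $limit);
}
```

This still reads and decodes the whole file on the server, but the response sent to the user stays small; combined with a server-side cache it covers most of what the answers above describe.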
What about a solution like Firebase? Judging by your sparse description of the requirements (assuming you are not keeping some kind of super-secret information), the free tier should be more than enough:
50 max connections, 5 GB data transfer, 100 MB data storage,
1 GB hosting storage and 100 GB hosting transfer.