PHP: what is the best way to organize storage of, and fast access to, a large social-game config?
//Update from June 17th: a solution has taken shape; the description is at the end of the post.
//--
Hello!
I'm building a "farm" game for social networks. There is a file that describes the parameters of all objects, quests, and materials in the game. At the moment it holds about 2000 objects, and the number will grow. The client side loads the config as a JSON file, and everything is fine there: it fetches it, deserializes it, and works with it.
As for the server side, I keep second-guessing myself; help me cope with the paranoia and choose the right option. When processing player actions, the server must take object data from the config and use it: check whether an object can be sold, charge the correct amount of money for the sale, check whether selling the object affects quest completion, award loot for a quest, and so on, up to several dozen config lookups per user action.
The server side is written in PHP, with php-fpm + nginx at the front, Percona for cold storage of player data, and Tarantool from Mail.ru as a fat cache of up-to-date data (users currently playing).
At first I stored the config in JSON, which I loaded at the start of script processing and deserialized into an associative array. I took the object parameters I needed from it, in the style of $chickenModel = $globalModelsStorage['Chicken01'].
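A minimal sketch of that first approach (the file name and the 'sell_price' parameter are illustrative, not from the original post):

```php
<?php
// Decode the whole JSON config at the start of the request,
// then read individual objects from the resulting array by key.
$globalModelsStorage = json_decode(
    file_get_contents(__DIR__ . '/config/game_config.json'),
    true // true = decode into associative arrays
);

$chickenModel = $globalModelsStorage['Chicken01'];
$sellPrice   = $chickenModel['sell_price']; // hypothetical parameter
```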
Then that started to seem slow to me (unserialize on every player move, plus holding a hefty array in memory), so I started fiddling. I tried storing the config as a PHP file pulled in via include. That also seemed slow (I don't remember the specific numbers from the tests, alas). I tried a few other things and looked through this article: habrahabr.ru/post/112402/.
At the moment I'm doing something that feels like a hack: I store the config in the user cache area of APC, with objects serialized via igbinary. Access to individual elements is by key, quick and dirty, and I like it.
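A sketch of that APC + igbinary scheme; the 'cfg:' key prefix and the shape of the warm-up step are my assumptions, not the exact code:

```php
<?php
// Warm-up (run once, or after a crash): one APC user-cache entry per object.
$config = json_decode(file_get_contents(__DIR__ . '/config/game_config.json'), true);
foreach ($config as $objectId => $objectData) {
    apc_store('cfg:' . $objectId, igbinary_serialize($objectData));
}

// During request handling: fetch a single object by key.
$raw = apc_fetch('cfg:Chicken01');
$chickenModel = ($raw !== false) ? igbinary_unserialize($raw) : null;
```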
But here are my questions:
1. Is it right to do it this way? Is it acceptable to store configs of this kind in shared memory via APC? Can data be lost _during_ operation (I haven't run into that yet, and for a server crash there is a script that loads the config back into APC)? How reasonable and practical is this?
2. Or should I treat shared memory purely as a cache: if the data is gone (some parameter can no longer be fetched by its key), read the JSON from disk, deserialize it, and repopulate the shared-memory cache? (See the sketch after this list.)
3. What do people generally do in this situation? I thought about Memcached, but, firstly, after a crash I would still have to restore the data, and secondly, since I already use Tarantool, I don't want to breed a zoo of in-memory software on the server. Or, given my use case (Memcached storing only a few thousand rarely changing config parameters), is there nothing wrong with that?
What other solutions are there? Maybe I'm just worrying too much about config access speed?
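A sketch of the read-through fallback from question 2 (key prefix and file path are illustrative): try APC first, and rebuild the cache from the JSON file on a miss.

```php
<?php
function getConfigObject($objectId) {
    $raw = apc_fetch('cfg:' . $objectId, $hit);
    if ($hit) {
        return igbinary_unserialize($raw);
    }
    // Miss: the shared-memory copy is gone; re-read the JSON and re-warm APC.
    $config = json_decode(file_get_contents(__DIR__ . '/config/game_config.json'), true);
    foreach ($config as $id => $data) {
        apc_store('cfg:' . $id, igbinary_serialize($data));
    }
    return isset($config[$objectId]) ? $config[$objectId] : null;
}

$chickenModel = getConfigObject('Chicken01');
```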
Thanks in advance, and I apologize for the muddled question.
//Update June 17th:
I tested the variants from pavel_salauyou and VitaZheltyakov. I liked Pavel's redis approach, but in the server code I often need to pull entire objects with all their nested structures. To achieve that with redis, I would have to write a wrapper that collects separate key-value pairs back into a single PHP object, or properly rework the server architecture.
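For illustration, a sketch of the kind of wrapper redis would have required here, using the phpredis extension (the key name is an assumption, and nested structures would still need their own serialization):

```php
<?php
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// Each object stored as a hash; hGetAll returns its field => value pairs,
// which then have to be collected back into one PHP object.
$fields = $redis->hGetAll('cfg:Chicken01');
$chickenModel = (object) $fields;
```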
In the end, I store the config in many files, which I generate from the original JSON config automatically rather than by hand. Each file is one config object. I include them as needed, everything is cached by APC, and it's all tasty and fast.
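A sketch of that final layout (paths and contents are illustrative): a build step turns the JSON config into one generated PHP file per object, and each include is then served from APC's opcode cache.

```php
<?php
// config/objects/Chicken01.php would contain, e.g.:
//     <?php return array('sell_price' => 50, 'quest_ids' => array(12, 17));

function loadConfigObject($objectId) {
    // The include is opcode-cached by APC, so repeat loads are cheap.
    return include __DIR__ . '/config/objects/' . $objectId . '.php';
}

$chickenModel = loadConfigObject('Chicken01');
```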
Thanks to those who answered :)
If you have an array that rarely changes, it's best to store it as a PHP array and include it. The opcode cache will keep that file in memory and you'll be happy.
This approach is better because there are no problems with quoting, and the include is as fast as it gets.
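A minimal sketch of this answer (file name and contents are illustrative):

```php
<?php
// config.php would contain, e.g.:
//     <?php return array('Chicken01' => array('sell_price' => 50));

$globalModelsStorage = include __DIR__ . '/config.php'; // opcode cache keeps it hot
$chickenModel = $globalModelsStorage['Chicken01'];
```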
Store the entire config in redis. Redis keeps data in RAM and also periodically saves it to disk, so if something goes wrong, the data won't be lost.
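A sketch of this suggestion using the phpredis extension (the file name and key prefix are illustrative): load the config into redis once, one serialized value per key.

```php
<?php
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// One-time import from the JSON file.
$config = json_decode(file_get_contents('game_config.json'), true);
foreach ($config as $objectId => $objectData) {
    $redis->set('cfg:' . $objectId, serialize($objectData));
}

// Per-request lookup.
$chickenModel = unserialize($redis->get('cfg:Chicken01'));
```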
If we're adding an external service anyway, then not redis but MongoDB. Redis is great, but Mongo can natively store nested data: it's just JSON.
Therefore, Mongo suits this task better than redis.
In addition, Mongo is very fast, well documented, and has aggregation capabilities, so it will be easy to run queries for sums, counts, and so on.
There are plenty of materials on Mongo (for example, on selecting and modifying data, and on aggregation functions).
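A sketch of this answer using the mongodb/mongodb composer library (database, collection, and field names are illustrative): each config object becomes one document, nested structure and all.

```php
<?php
require __DIR__ . '/vendor/autoload.php';

$collection = (new MongoDB\Client('mongodb://127.0.0.1'))->game->config;

// One-time import: one document per object, keyed by its id.
$config = json_decode(file_get_contents('game_config.json'), true);
foreach ($config as $objectId => $objectData) {
    $objectData['_id'] = $objectId;
    $collection->insertOne($objectData);
}

// Per-request lookup: a whole object with its nested fields in one call.
$chickenModel = $collection->findOne(['_id' => 'Chicken01']);

// Aggregation example: sum a (hypothetical) price field across all objects.
$cursor = $collection->aggregate([
    ['$group' => ['_id' => null, 'total' => ['$sum' => '$sell_price']]],
]);
```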
Although, if Vitaly Zheltyakov's solution fully suits you, then it is better, of course, since it doesn't introduce new entities into the project architecture. But if you need to run selections over the config, think about it.