What is the best way to cache in PHP?
I have a lot of dynamically generated pages and need to set up fast caching for them. Each page body is about 50 kilobytes, and there are about 30,000 such pages.
The system runs on PHP. Caching into the database seems redundant to me. We used memcache, but since it keeps everything only in memory and loses its cache on restart, it was abandoned. Now I use mongo, but for some reason it periodically crashes on some records after a few days of running. The load is high. Can all these pages be moved to a file cache at all?
A file cache works great for volumes like this: hot pages end up in the OS filesystem cache in memory anyway, and the rest survives a restart. I once used it for a site with 100-150 thousand pages of about the same size as yours, but later switched to memcached because I didn't need the cache to persist across restarts, the server has plenty of memory, and everything sits compressed in memory, so it's no loss.
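A minimal sketch of such a file cache in PHP; the cache directory, TTL, and the render_page() helper are illustrative assumptions, not from the answer above:

```php
<?php
// Shard cached files into subdirectories by hash so a single directory
// does not have to hold all ~30,000 files.
function cache_path(string $uri): string {
    $hash = md5($uri);
    return __DIR__ . '/cache/' . substr($hash, 0, 2) . '/' . $hash . '.html';
}

function cache_get(string $uri, int $ttl = 600): ?string {
    $path = cache_path($uri);
    if (is_file($path) && (time() - filemtime($path)) < $ttl) {
        return file_get_contents($path);
    }
    return null; // miss or stale
}

function cache_put(string $uri, string $html): void {
    $path = cache_path($uri);
    @mkdir(dirname($path), 0775, true);
    // Write atomically: dump to a temp file, then rename over the target,
    // so readers never see a half-written page.
    $tmp = $path . '.' . uniqid('', true) . '.tmp';
    file_put_contents($tmp, $html);
    rename($tmp, $path);
}

// Usage:
// if (($html = cache_get($_SERVER['REQUEST_URI'])) === null) {
//     $html = render_page();  // hypothetical page generator
//     cache_put($_SERVER['REQUEST_URI'], $html);
// }
// echo $html;
```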
You can try Redis: it can persist its data to disk. habrahabr.ru/blogs/webdev/81917/
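A minimal sketch of using Redis as the page cache via the phpredis extension; the key prefix, TTL, and render_page() are assumptions for illustration. Because Redis snapshots to disk (RDB/AOF), the cache survives a restart, unlike plain memcache:

```php
<?php
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$key  = 'page:' . md5($_SERVER['REQUEST_URI']);
$html = $redis->get($key);          // returns false on a miss

if ($html === false) {
    $html = render_page();          // hypothetical page generator
    $redis->setEx($key, 600, $html); // cache the rendered body for 10 minutes
}

echo $html;
```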
Keep it in memcache; why put extra load on the server at all? Are you serving the data straight from memcache with nginx? And what exactly is the problem with regenerating the cache after a restart?
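A sketch of the "nginx reads straight from memcache" setup this answer hints at, assuming the Memcached PHP extension and nginx's ngx_http_memcached_module; the key scheme, TTL, location block, and render_page() are assumptions, not from the answer:

```php
<?php
// Matching nginx location (assumption): nginx tries memcache first and
// only falls back to PHP on a miss.
//
//   location / {
//       set $memcached_key $uri;
//       memcached_pass 127.0.0.1:11211;
//       default_type  text/html;
//       error_page 404 502 504 = @php;   # miss -> hand the request to PHP
//   }

$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

// PHP runs only on a cache miss: render the page, store it under the key
// nginx will look up ($uri, i.e. the path without the query string),
// then output it for the current request.
$key  = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$html = render_page();              // hypothetical page generator
$mc->set($key, $html, 600);         // regenerate the cache entry, 10 min TTL
echo $html;
```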