How to reduce load when reading a database with foreach?
Sorry, I wasn't sure how to phrase the question; the essence is as follows. I read the database with foreach:
foreach (DataBase::getDb()->getRow('SELECT * FROM ' . Config::DB_TABLE_USERS) as $row) {
    // Code here
}
It all depends on the task the code is solving. If this is listing all users, for example in an admin panel, then it's fine. But if you check for a user's existence during authentication this way, it's very bad.
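To illustrate the authentication case: instead of fetching every row and looping over it in PHP, ask the database a targeted question. This is only a sketch; it uses PDO with an in-memory SQLite table as a stand-in for the question's `DataBase` class, and the `users`/`login` names are made up.

```php
<?php
// In-memory SQLite stands in for the real database.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, login TEXT)');
$pdo->exec("INSERT INTO users (login) VALUES ('alice'), ('bob')");

// Bad: SELECT * plus foreach transfers and scans every row in PHP.
// Good: a WHERE clause lets the database do the lookup and return
// at most one row.
$stmt = $pdo->prepare('SELECT 1 FROM users WHERE login = ? LIMIT 1');
$stmt->execute(['alice']);
$exists = (bool) $stmt->fetchColumn();
```

With an index on `login`, this lookup stays cheap no matter how large the table grows.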
I agree with Rsa97, but I'd add a little. When listing users, the best solution is always to cap the page size at, say, 50 entries and add pagination (the ORDER BY, LIMIT, and OFFSET keywords).
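The ORDER BY / LIMIT / OFFSET combination can be sketched like this. Again PDO with in-memory SQLite for illustration; the table, column names, and page size are assumptions, not anything from the original question.

```php
<?php
// In-memory SQLite as a stand-in; 120 fake users to page through.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, login TEXT)');
for ($i = 1; $i <= 120; $i++) {
    $pdo->exec("INSERT INTO users (login) VALUES ('user$i')");
}

$perPage = 50;
$page    = 2; // 1-based page number, e.g. taken from the request

// ORDER BY gives a stable order, LIMIT caps the page size,
// OFFSET skips the previous pages.
$stmt = $pdo->prepare('SELECT id, login FROM users ORDER BY id LIMIT ? OFFSET ?');
$stmt->bindValue(1, $perPage, PDO::PARAM_INT);
$stmt->bindValue(2, ($page - 1) * $perPage, PDO::PARAM_INT);
$stmt->execute();
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC); // rows 51..100
```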
Pagination is an eternal hassle. OFFSET slows down badly on large volumes, and if the query load is also high, it's a real disaster: to skip N rows, the database still has to walk past all of them.
It's better not to release such pagination into the open world; I usually use it in the admin panel or a user's personal account, in places where the load is near zero.
For all other cases, I'd advise studying keyset pagination based on primary keys. At first it's painful to write, but afterwards it's a pleasure ;)
This topic is covered very well elsewhere.
10 rows or 500, there is practically no difference; with 500 records the query planner will most likely not even bother using indexes :-)