Dmitry Demenkov, 2016-06-14 19:23:52

How to speed up Opencart with a large number of products?

There is an online stationery store with 17,500 products in the catalog. The store has been running for a long time, and everything is slow.
Upgrading the server didn't help.
Disabling the product counts and adding indexes brought no tangible benefit.
All pages take a long time to load.
MySQL is the bottleneck, because OpenCart generates very large queries with many nested subqueries and ORDER BY clauses.
The database currently uses MyISAM; I read somewhere that converting OpenCart to InnoDB is not advised.
Recently, Yandex.Metrica has started reporting very often that the site is unavailable.
Memcached caching is enabled and doesn't help.
How can this be optimized?
Or what other engine could I switch to that would be fast and cost-effective?
A frequently executed slow MySQL query (a mysqldumpslow summary; number and string literals are abstracted to N and 'S'):

Count: 46  Time=6.91s (317s)  Lock=0.05s (2s)  Rows=1.0 (46)
SELECT DISTINCT *, pd.name AS name, p.image, m.name AS manufacturer,
  (SELECT price FROM oc_product_discount pd2
   WHERE pd2.product_id = p.product_id AND pd2.customer_group_id = 'S'
     AND pd2.quantity = 'S'
     AND ((pd2.date_start = 'S' OR pd2.date_start < 'S')
      AND (pd2.date_end = 'S' OR pd2.date_end > 'S'))
   ORDER BY pd2.priority ASC, pd2.price ASC LIMIT N) AS discount,
  (SELECT price FROM oc_product_special ps
   WHERE ps.product_id = p.product_id AND ps.customer_group_id = 'S'
     AND ((ps.date_start = 'S' OR ps.date_start < 'S')
      AND (ps.date_end = 'S' OR ps.date_end > 'S'))
   ORDER BY ps.priority ASC, ps.price ASC LIMIT N) AS special,
  (SELECT points FROM oc_product_reward pr
   WHERE pr.product_id = p.product_id AND customer_group_id = 'S') AS reward,
  (SELECT ss.name FROM oc_stock_status ss
   WHERE ss.stock_status_id = p.stock_status_id AND ss.language_id = 'S') AS stock_status,
  (SELECT wcd.unit FROM oc_weight_class_description wcd
   WHERE p.weight_class_id = wcd.weight_class_id AND wcd.language_id = 'S') AS weight_class,
  (SELECT lcd.unit FROM oc_length_class_description lcd
   WHERE p.length_class_id = lcd.length_class_id AND lcd.language_id = 'S') AS length_class,
  (SELECT AVG(rating) AS total FROM oc_review r1
   WHERE r1.product_id = p.product_id AND r1.status = 'S'
   GROUP BY r1.product_id) AS rating,
  (SELECT COUNT(*) AS total FROM oc_review r2
   WHERE r2.product_id = p.product_id AND r2.status = 'S'
   GROUP BY r2.product_id) AS reviews,
  p.sort_order
FROM oc_product p
LEFT JOIN oc_product_description pd ON (p.product_id = pd.product_id)
LEFT JOIN oc_product_to_store p2s ON (p.product_id = p2s.product_id)
LEFT JOIN oc_manufacturer m ON (p.manufacturer_id = m.manufacturer_id)
WHERE p.product_id = 'S' AND p.status = 'S' AND p.date_available <= 'S' AND p2s.store_id = 'S'


7 answers
Valeriu Vodnicear, 2016-06-14
@vodnicear

The best option is a self-written engine.

Philipp, 2016-06-14
@zoonman

17,500 is a negligible number of records.
OpenCart's main problem is garbage code at the database-access level: lots of queries executed in loops, and so on.
This is usually solved by putting MySQL on a proper machine with plenty of memory and an SSD, migrating to InnoDB, analyzing the queries, and building the necessary indexes (a sketch follows below). Besides indexes, you also need to keep an eye on buffer sizes and the like; in your case the machine should have around 8 GB of RAM. As for the server itself, I recommend MariaDB or Percona Server rather than stock MySQL. Do not use shared (virtual) hosting.
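A minimal sketch of that migration and memory tuning (the table list and the 4 GB figure are just examples, not measured recommendations; take a backup before converting):

    -- Convert the OpenCart tables from MyISAM to InnoDB, one by one
    -- (repeat for the remaining oc_* tables).
    ALTER TABLE oc_product ENGINE=InnoDB;
    ALTER TABLE oc_product_description ENGINE=InnoDB;
    ALTER TABLE oc_product_to_store ENGINE=InnoDB;

    -- Let InnoDB keep the working set in memory. On MySQL 5.7+ this
    -- can be changed at runtime; on older versions put
    -- innodb_buffer_pool_size = 4G into my.cnf and restart.
    SET GLOBAL innodb_buffer_pool_size = 4294967296;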
Also, there is a good utility called mysqltuner; it suggests sensible settings for the DBMS.
Enable the slow query log, and also log queries that do not use indexes; see the sketch below.
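Something like this at runtime (the same settings belong in my.cnf so they survive a restart; the file path and the one-second threshold are just examples):

    -- Log every query slower than one second, plus queries
    -- that run without using any index at all.
    SET GLOBAL slow_query_log = 'ON';
    SET GLOBAL slow_query_log_file = '/var/log/mysql/slow.log';
    SET GLOBAL long_query_time = 1;
    SET GLOBAL log_queries_not_using_indexes = 'ON';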
Look at the network stack of your MySQL connection: working through a Unix socket speeds the application up noticeably. Some careless admins put a domain name in the connection settings, which the system then has to resolve periodically, adding to the connection overhead.
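To check which socket the server exposes (and, assuming a stock OpenCart config.php, it is a hostname of 'localhost', rather than an IP or domain, that makes PHP's mysqli driver connect through that socket instead of TCP):

    -- Find the socket path the server listens on:
    SHOW VARIABLES LIKE 'socket';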
When traffic is high and perfectly fresh data is not critical, another approach is to rewrite OpenCart's database abstraction layer to cache query results in Memcached. That is relevant for heavily visited stores with things like pre-orders or stock checks performed right at order time.

Egor, 2016-06-14
@egormmm

If MySQL is slow, turn on the slow query log, find where those queries are built in the engine, and optimize them.
The "Optimization" chapter of the MySQL manual can also help.
If all the indexes are in place, then most likely some query uses a function that prevents MySQL from caching it. Various modules sin with this, for example Mega Filter.
P.S. I hope the MySQL query cache is enabled; a quick check is sketched below.
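A quick way to check both points from the MySQL console (the EXPLAIN target is only an illustration; run it on whatever your slow log actually catches):

    -- Is the query cache on, and is it being hit?
    SHOW VARIABLES LIKE 'query_cache%';
    SHOW STATUS LIKE 'Qcache%';

    -- For each slow query, see whether indexes are used;
    -- "type: ALL" in the output means a full table scan.
    EXPLAIN SELECT p.product_id, pd.name
    FROM oc_product p
    LEFT JOIN oc_product_description pd ON (p.product_id = pd.product_id)
    WHERE p.status = 1;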

Dmitry, 2016-06-14
@thewind

MySQL is the bottleneck, because OpenCart generates very large queries with many nested subqueries and ORDER BY clauses.

Why not sit down and systematically rework all the problematic queries? Split them into several smaller ones, optimize them, and so on; a sketch of the idea is below.
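For example, the discount subquery from the big product query above can be pulled out and run separately, after the base row is fetched (a sketch; the literal values stand in for the 'S'/N placeholders and are purely illustrative):

    -- Step 1: the base product row, with plain joins only.
    SELECT p.*, pd.name, m.name AS manufacturer
    FROM oc_product p
    LEFT JOIN oc_product_description pd ON (p.product_id = pd.product_id)
    LEFT JOIN oc_manufacturer m ON (p.manufacturer_id = m.manufacturer_id)
    WHERE p.product_id = 42 AND p.status = 1;

    -- Step 2: the discount, as its own small indexed lookup.
    SELECT price
    FROM oc_product_discount
    WHERE product_id = 42 AND customer_group_id = 1 AND quantity = 1
      AND (date_start = '0000-00-00' OR date_start < NOW())
      AND (date_end = '0000-00-00' OR date_end > NOW())
    ORDER BY priority ASC, price ASC
    LIMIT 1;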

web-mechanic, 2016-06-15
@amfetamine

Fix the slow DB queries and set up caching where it's needed. You could also move to a VDS, or maybe a decent shared host will be enough. There was a site here recently that didn't start flying until it was moved to a VDS; it had a comparable number of products and descriptions too.

Roman Evtushenko, 2021-12-16
@devopencart2

As a rule, disabling the product count does not affect speed (although everyone writes about it, and I'm no exception),
but there are many more tweaks that can help speed up the site.
You can try to implement some or all of the points from this manual:
https://in-it24.com/uskorenie-magazina-na-opencart/

fbi_agent26, 2022-01-21
@fbi_agent26

Memcached for OpenCart is a useless thing. I turned it on and the PHP page generation time didn't drop; it actually went up, from 0.17 s on average to 0.35 s (I measured each option 10 times and averaged the results).
Indexes don't need to be added everywhere, only where they are needed. When you open a product page there are lookups by ID in several tables; add the indexes to the fattest of those tables, as in the sketch below.
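A sketch of the kind of composite indexes meant here, on the per-product lookup tables from the slow query above (the index names are arbitrary; check SHOW INDEX first, since stock OpenCart may already cover some of these):

    -- Speed up the per-product lookups in the big product query.
    CREATE INDEX idx_discount_product_group
        ON oc_product_discount (product_id, customer_group_id);
    CREATE INDEX idx_special_product_group
        ON oc_product_special (product_id, customer_group_id);
    CREATE INDEX idx_review_product_status
        ON oc_review (product_id, status);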
========== If OpenCart loads slowly and there are fewer than 100,000 products ==========
1 - Shared hosting puts you on a shared database server with a single core allocated to it, so you won't be able to speed it up with optimizations alone. Below are the statistics of my database:
ø per hour: 860,877
ø per minute: 14,347
ø per second: 239
The hoster would allow this on a shared account, but if I put it there, all the small sites start to lag; nobody accounts for the load on the database.
If you want to optimize the database, either move to your own server or buy a caching module: cached requests bypass the database, are served straight from RAM, and the result comes back immediately. There are such modules.
=============== Extra modules ===============
I have a debug module that shows the queries being run. At some point the total query execution time grew from 0.05 s to 0.10-0.15 s; I went looking and found that the SEO module I used to generate meta tags accounted for most of it, so naturally I turned it off. If modules are installed but you don't use them, they must be disabled.
=============================================
I have 500,000 products, and page load time is 0.2 s when there is no unusual traffic and bots aren't attacking.
With a caching module (I won't name which one, there are many of them, pick any), page generation takes 0.01 s. And my site is not even an online store: it is constantly updated by node.js bots that load the database, so your speed should be higher if everything is done correctly.
