elasticsearch
ArmBar, 2019-08-12 10:47:49

How to estimate the amount of resources that will be needed for elasticsearch?

The idea was to add, alongside the full-text search on product names in an online store, a faceted search across the rest of the catalog through Elasticsearch. I threw together a prototype and the search speed is good, but a roughly 10-fold growth of the catalog is expected (up to about ten million items), so I need to estimate the resources (memory, CPU) that may be required at that scale. Google gave me no answer, though most likely I phrased the query wrong. If anyone can explain how to do this calculation, or which configuration-tuning best practices to follow, I would be grateful.
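One common back-of-the-envelope approach is to index a representative sample of the real catalog, read the on-disk size from the _stats API, and extrapolate to the target document count. A minimal sketch, assuming a node at localhost:9200 and a sample index named products-sample (both hypothetical); note that disk usage does not always scale perfectly linearly, and heap demand depends heavily on mappings and aggregations:

    import requests

    ES = "http://localhost:9200"
    INDEX = "products-sample"    # hypothetical index already holding sample docs
    TARGET_DOCS = 10_000_000     # expected catalog size

    # Read primary-shard document count and on-disk size from the _stats API.
    stats = requests.get(f"{ES}/{INDEX}/_stats").json()
    primaries = stats["indices"][INDEX]["primaries"]
    docs = primaries["docs"]["count"]
    size_bytes = primaries["store"]["size_in_bytes"]

    # Linear extrapolation: a first approximation, not an exact law.
    bytes_per_doc = size_bytes / docs
    projected_gib = bytes_per_doc * TARGET_DOCS / 1024**3
    print(f"{docs} docs use {size_bytes / 1024**2:.1f} MiB "
          f"({bytes_per_doc:.0f} B/doc); "
          f"projected for {TARGET_DOCS} docs: ~{projected_gib:.1f} GiB")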

1 answer
mamontm, 2019-08-12

SphinxSearch,
or its fork, Manticore.
They consume orders of magnitude fewer resources and search much faster.
Why, then, did ivi switch from Sphinx to Elasticsearch?
They switched because Sphinx couldn't (at the time) split an index across different servers, and their index no longer fit on a single server.
In return they had to accept Elastic's trade-offs: compared to Sphinx it searches slowly and consumes a lot of memory.
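For reference, the capability ivi reportedly migrated for is sharding: Elasticsearch splits an index into primary shards at creation time and distributes them, plus replicas, across the data nodes of the cluster. A minimal sketch, with the index name and shard counts as illustrative assumptions:

    import requests

    ES = "http://localhost:9200"

    # number_of_shards is fixed at index creation; the cluster then
    # balances the primary shards and their replicas across data nodes.
    requests.put(f"{ES}/products", json={
        "settings": {
            "number_of_shards": 5,     # primaries spread over the nodes
            "number_of_replicas": 1,   # one extra copy of each shard
        }
    })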
Keep in mind that all such systems, Sphinx, Elastic, Bleve, Reindexer alike, build a full-fledged index for every little thing.
Only you know what you will put into the index.
If the products don't exist yet, randomly generate names up to the 10 million items and benchmark, as sketched below.
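A sketch of that suggestion: generate random product names and push them through the _bulk API, so the ten-million-document case can be benchmarked before real data exists. Endpoint, index name, and the name scheme are assumptions; lower TOTAL for a quick smoke test:

    import random
    import string
    import requests

    ES = "http://localhost:9200"
    INDEX = "products-bench"   # hypothetical benchmark index
    TOTAL = 10_000_000         # lower this for a quick smoke test
    BATCH = 5_000              # docs per _bulk request

    def random_name() -> str:
        def word() -> str:
            return "".join(random.choices(string.ascii_lowercase,
                                          k=random.randint(4, 10)))
        return " ".join(word() for _ in range(random.randint(2, 5)))

    def flush(lines: list) -> None:
        # The _bulk API expects newline-delimited JSON with a trailing newline.
        requests.post(f"{ES}/{INDEX}/_bulk",
                      data="\n".join(lines) + "\n",
                      headers={"Content-Type": "application/x-ndjson"})

    lines = []
    for _ in range(TOTAL):
        lines.append('{"index":{}}')                    # action line
        lines.append('{"name":"%s"}' % random_name())   # document source
        if len(lines) >= 2 * BATCH:
            flush(lines)
            lines = []
    if lines:
        flush(lines)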
