How to speed up Django search?
An online store with 6 million products (soon to be around 10 million) is already having huge problems with search performance. Exact-match lookup still somehow works, if you can call it that (13 seconds from pressing the search button until the page loads), and the similar-matches search doesn't work at all: it crashes with a 502 error.
I tried to set up Elasticsearch, but I can't index the products: the server has 4 GB of RAM, and during indexing the process gets killed for lack of memory (the manage.py search_index --rebuild command). What should I do? =(
import json

from django.contrib.postgres.search import SearchVector
from django.core.paginator import Paginator
from django.db.models import Q
from django.http import HttpResponse
from django.shortcuts import redirect, render

from .forms import SearchForm
from .models import Category, Product

def search(request):
    if request.method == 'GET':
        form = SearchForm(request.GET)
        if form.is_valid():
            name_rus = form.cleaned_data['search']
            try:
                # Exact name or SKU match: go straight to the product page.
                product = Product.objects.get(Q(name_rus__iexact=name_rus) | Q(sku=name_rus))
                return redirect('product_item', product.slug)
            except (Product.DoesNotExist, Product.MultipleObjectsReturned):
                # Full-text fallback; the vector is recomputed on every request,
                # which is what makes this slow on a large table.
                products_search = Product.objects.annotate(search=SearchVector('name_rus', 'name_poland')).filter(search=name_rus)
                last_question = '?search=%s' % name_rus
                categories = Category.objects.filter(categories_products__in=products_search).distinct()
                paginator = Paginator(products_search, 40)  # Show 40 products per page
                page = request.GET.get('page')
                products = paginator.get_page(page)
                return render(request, 'core/search.html', {'products': products, 'categories': categories, 'page': page, 'last_question': last_question})
        else:
            return HttpResponse('Search error')

def typeahead(request):
    q = request.GET.get('q', '')
    prodlist = []
    objects = Product.objects.filter(Q(name_rus__icontains=q) | Q(sku=q))[0:10]
    for i in objects:
        prodlist.append({'q': i.name_rus})
    return HttpResponse(json.dumps(prodlist), content_type="application/json")
"when indexing, the server kills the process due to lack of memory"
13 seconds for an exact-match lookup is far too long anyway. It seems to me it's failing because of your DB architecture. Watch htop during indexing: most likely it's SQL eating up all the memory with heavy queries, not Elasticsearch or Sphinx.
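If the rebuild really is materializing the whole table at once, you can make it fetch in chunks. A minimal sketch, assuming django-elasticsearch-dsl (the package that provides the manage.py search_index --rebuild command from the question); the document fields mirror the model from the question, but the rest of the definition is an assumption, not the asker's actual code:

# documents.py - sketch, not the asker's actual document definition
from django_elasticsearch_dsl import Document, fields
from django_elasticsearch_dsl.registries import registry

from .models import Product  # assumed app layout

@registry.register_document
class ProductDocument(Document):
    name_rus = fields.TextField()
    name_poland = fields.TextField()
    sku = fields.KeywordField()

    class Index:
        name = 'products'

    class Django:
        model = Product
        # Fetch rows from the DB in slices of 5000 during indexing
        # instead of loading all ~6 million products into memory at once.
        queryset_pagination = 5000

With 4 GB of RAM it's still worth watching htop while the rebuild runs, as suggested above.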
Alternatively, try moving all the products to MongoDB and getting a denormalized product database that way.
For those in a hurry: https://habrahabr.ru/post/346884/
Don't look at the object, look at the measurements.
Okay, but the best option is to upgrade the server. Yes, you can install Sphinx, and it will probably consume fewer resources, but not for long. So it's better not to skimp on RAM.
Failing that, you can grow the swap file by the standard means and enable zram.
1. In a comment on one of the answers you mentioned that you use db_index=True, but full-text search needs a special index; see the sketch after this list. Check out this section of the Postgres documentation: https://www.postgresql.org/docs/9.5/static/textsea...
2. Get rid of the name_rus__iexact=name_rus query; your SearchVector already covers it.
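A minimal sketch of what that special index can look like on the Django side, assuming PostgreSQL and django.contrib.postgres; the stored search_vector column and the field definitions are illustrative assumptions, not the asker's actual model:

# models.py - sketch: precomputed tsvector + GIN index
from django.contrib.postgres.indexes import GinIndex
from django.contrib.postgres.search import SearchVectorField
from django.db import models

class Product(models.Model):
    name_rus = models.CharField(max_length=255)
    name_poland = models.CharField(max_length=255)
    sku = models.CharField(max_length=64, db_index=True)
    slug = models.SlugField(max_length=255)

    # Precomputed tsvector; refresh it in bulk after imports, e.g.:
    #   from django.contrib.postgres.search import SearchVector
    #   Product.objects.update(
    #       search_vector=SearchVector('name_rus', 'name_poland'))
    search_vector = SearchVectorField(null=True)

    class Meta:
        indexes = [GinIndex(fields=['search_vector'])]

The view then filters against the stored column (Product.objects.filter(search_vector=name_rus)) instead of annotating a fresh SearchVector on every request, so Postgres can answer the query from the GIN index rather than recomputing to_tsvector over all 6 million rows.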
I thought I had solved this, but no such luck. Sphinx copes beautifully with 1-2-word searches, but enter a more specific 4-5-word query and that's it... it thinks for 15 seconds, CPU load hits 100%, and the server falls over. Exact-match lookups are no problem at all, they are instantaneous.