Sphinx

Alexander, 2010-12-20 18:44:03

PHP SphinxAPI limit on the number of matches returned

The function SetLimits($offset, $limit, $max=0, $cutoff=0) caps the size of the returned result set at $max=1000:
$sphinx->SetLimits(0, self::LIMIT_A, 1000);
If you raise this value, either in the call or in the sphinxapi.php file itself, the Sphinx daemon returns an error:
Query failed: searchd error: per-query max_matches=10000 out of bounds (per-server max_matches=1000).
How can I fetch the next 1000 values? The code below does not work:
$sphinx->SetLimits(1000, self::LIMIT_A, 1000);


3 answer(s)
Alexander, 2010-12-21
@akalend

Downvoting doesn't take much brains,
but an actual answer, apparently, is harder to come by...

Iskander Giniyatullin, 2010-12-21
@rednaxi

Reading the Sphinx documentation takes a lot of brains.
But writing a question, it seems, was easy enough :D
You need to raise the limit not as a constant and not in the sphinxapi.php file, but in your Sphinx config.
www.sphinxsearch.com/docs/current.html#api-func-setlimits - the SetLimits documentation, which explains max_matches
www.sphinxsearch.com/docs/current.html#conf-max-matches
Just set the required max_matches value in the config.
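A minimal sketch of the relevant fragment of sphinx.conf, per the advice above. The value 10000 is just an example; pick one at least as large as the biggest per-query max you pass to SetLimits():

```
searchd
{
    listen      = 9312
    log         = /var/log/searchd.log

    # per-server cap; per-query max_matches up to this value is accepted
    max_matches = 10000
}
```

After changing the config, restart searchd so the new limit takes effect.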

freeman08, 2017-09-09
@freeman08

Maybe this will be useful to someone. It is not the best solution, but:
my client required displaying/saving absolutely all search results.
Paginating through them takes a long time: fetching my 19567 results in steps of 1000 took about 6-7 seconds.
I solved the problem like this:

// the listing is schematic, since these are pieces pulled from the middle of the code, but the idea is clear
// pre-query
$sphinx->SetLimits(0, 1, 1);
// find out how many results in total match the key/other parameters
$pre_res = $sphinx->Query($query, 'index_name');

// set the limits to the whole range of found records at once
$sphinx->SetLimits(0, $pre_res['total_found'], $pre_res['total_found']);
$res = $sphinx->Query($query, 'index_name');

This method selects 40728 records in about 2 seconds. Naturally, the time is approximate and depends on a number of factors.
It is not the best solution, but in my case I only needed to collect the IDs of the found records and save them to the database for further processing. I did not find an elegant way, only this "axe" :( .. but the "axe" handles the task reasonably well ;)
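The two-query trick above can be sketched end-to-end as below. This is a sketch, not the author's exact code: the server address, the 'index_name' index, and a searchd max_matches raised high enough for the second query are all assumptions. Only the standard SphinxClient calls (SetServer, SetLimits, Query, GetLastError) are used.

```php
<?php
// assumes sphinxapi.php from the Sphinx distribution is on the include path
require_once 'sphinxapi.php';

$sphinx = new SphinxClient();
$sphinx->SetServer('localhost', 9312); // assumed host/port

$query = 'some search terms';

// pre-query: fetch a single match just to learn total_found
$sphinx->SetLimits(0, 1, 1);
$pre_res = $sphinx->Query($query, 'index_name');
if ($pre_res === false) {
    die('Query failed: ' . $sphinx->GetLastError());
}
$total = $pre_res['total_found'];

// second query: grab the whole range in one go
// (requires max_matches >= $total in the searchd config)
$sphinx->SetLimits(0, $total, $total);
$res = $sphinx->Query($query, 'index_name');
if ($res === false) {
    die('Query failed: ' . $sphinx->GetLastError());
}

// collect just the document IDs for later storage in the database
$ids = isset($res['matches']) ? array_keys($res['matches']) : array();
```

Note that total_found can exceed what searchd is willing to return if max_matches in the config is lower, so the second query fails unless the server-side cap was raised first, as the answer above explains.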
