Ruby on Rails
bizf, 2018-08-13 14:35:23

How to optimize loading for Rails Api?

How do you handle a database that stores about 100 thousand records, or even more, without slowdowns? For example, when a filter searches across 20 thousand records, every new letter typed into the filter causes a lag, and the same happens with pagination when switching pages. In other words, the API should not return all records as JSON at once, but in chunks of about 40-60 records.
Here's what I have now:

def index
  @posts = params[:limit] ? Post.all.limit(params[:limit]) : Post.all

  render json: @posts
end

How can I do this better? At the moment I have more than 20 thousand records and it slows down noticeably.


3 answer(s)
Artur Bordenyuk, 2018-08-13
@HighQuality

To keep the pages from slowing down, use a pagination gem - kaminari
To keep the Ajax filter from lagging on every keystroke, debounce the input in Javascript - debounce
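Debouncing is normally done client-side in JavaScript, but the idea is language-independent: delay the request until the user has stopped typing. A minimal plain-Ruby sketch of a trailing-edge debounce (the `debounce` helper and the 0.3 s delay are illustrative, not part of any library here):

```ruby
# Trailing-edge debounce: the block runs only after `delay` seconds
# have passed with no further calls. Each new call cancels the
# pending timer thread and schedules a fresh one.
def debounce(delay, &block)
  timer = nil
  mutex = Mutex.new
  lambda do |*args|
    mutex.synchronize do
      timer&.kill                 # cancel the previously scheduled run
      timer = Thread.new do
        sleep delay
        block.call(*args)
      end
    end
  end
end

search = debounce(0.3) { |q| puts "querying API for #{q.inspect}" }
search.call("r")
search.call("ra")
search.call("rai")   # only this last call survives the 0.3 s window
sleep 0.5
```

With the filter debounced, typing "rails" fires one request instead of five.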

Roman Mirilaczvili, 2018-08-15
@2ord

Instead of
Post.all.limit(params[:limit])
use
Post.limit(per_page).offset(per_page * (page_num - 1))
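A runnable sketch of the arithmetic behind that one-liner (the `paginate` helper is illustrative; on an ActiveRecord relation the same slice is produced in SQL via LIMIT/OFFSET):

```ruby
# Offset pagination: pages are one-based, so page N starts at
# offset per_page * (N - 1). Here the slice is emulated on an
# array; ActiveRecord would push it down into the SQL query.
def paginate(records, page_num, per_page)
  records[per_page * (page_num - 1), per_page] || []
end

posts = (1..100).to_a
paginate(posts, 1, 40)   # records 1..40
paginate(posts, 3, 40)   # records 81..100 (last, partial page)
paginate(posts, 4, 40)   # past the end: empty page
```

Each request then transfers at most `per_page` records instead of the whole table.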

Semyon Semyonov, 2018-08-31
@man_without_face

If params[:limit] is not sent, the code falls through to Post.all, i.e. all 20 thousand records are loaded.
You can also make the search fire only after, say, the third character is typed rather than the first.
You can also add indexes to the table columns you filter on, if there are none.
Also, judging by the code you wrote, there is no text search at all (there is no where clause), and render json: @posts serializes all columns of the posts table. Add a where for the search term and select only the columns you actually need; then an order of magnitude less data will be sent.
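A plain-Ruby emulation of that advice (the column names and sample records are made up for illustration; in ActiveRecord the equivalent chain would be along the lines of Post.where("title LIKE ?", "%#{q}%").select(:id, :title).limit(50)):

```ruby
# Emulating WHERE + SELECT(columns) + LIMIT on an array of hashes.
# Dropping the heavy :body column is what cuts the JSON payload.
posts = [
  { id: 1, title: "Rails API tips", body: "long text..." },
  { id: 2, title: "Ruby threads",   body: "long text..." },
  { id: 3, title: "Rails caching",  body: "long text..." },
]

q = "rails"
result = posts
  .select { |p| p[:title].downcase.include?(q) }  # WHERE title LIKE '%rails%'
  .map    { |p| p.slice(:id, :title) }            # SELECT id, title
  .first(50)                                      # LIMIT 50
```

`result` keeps only the matching rows and only the two columns the client needs.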
