Ruby on Rails
Bogdan, 2017-07-25 11:43:18

Database optimization?

Hello. How sound is the approach of moving away from the model layer and generating SQL queries by hand? I transfer large arrays of data, but Active Record inserts only one record at a time, and each insert also triggers lookups on the associated tables, so a single insert can cost five SELECTs against the child tables plus the INSERT itself. With more than 5,000 records the delay is already noticeable, so I am gradually switching everything to direct manual inserts. How reasonable is this, and what problems, besides losing referential-integrity checks, am I taking on? Thank you.

# Build one "(v1, v2, ...)" tuple per row, then join the tuples into a
# single multi-row VALUES clause.  NOTE: interpolated values should be
# passed through connection.quote to avoid SQL injection.
sql_values = ts_data.map do |ts|
  "(#{id}," \
  "#{children_groups[ts[:children_group_code]]}," \
  "#{children[ts[:child_code]]}," \
  "#{reasons_absences[ts[:reasons_absence_code]]}," \
  "'#{ts[:date]}'," \
  "'#{now}','#{now}')"
end.join(",")

sql = "INSERT INTO timesheet_dates (#{fields}) VALUES #{sql_values}"

ActiveRecord::Base.connection.execute(sql)
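The interpolation above concatenates raw values straight into SQL, which breaks on embedded quotes and invites injection. A minimal self-contained sketch of the same multi-row VALUES technique with quoting — the local `quote` helper is only a stand-in for `ActiveRecord::Base.connection.quote` so the example runs without a database, and the sample rows and column list are illustrative:

```ruby
# Stand-in for ActiveRecord::Base.connection.quote: numbers pass through,
# everything else is single-quoted with embedded quotes doubled.
def quote(value)
  value.is_a?(Numeric) ? value.to_s : "'#{value.to_s.gsub("'", "''")}'"
end

# Sample rows in column order (illustrative data, not from the question).
rows = [
  [1, 10, 100, 4, "2017-07-25"],
  [1, 10, 101, 4, "2017-07-25"]
]

# Each row becomes a parenthesised, quoted tuple; joining them yields one
# multi-row VALUES clause instead of thousands of separate INSERTs.
values_sql = rows.map { |row| "(#{row.map { |v| quote(v) }.join(',')})" }.join(",")

sql = "INSERT INTO timesheet_dates " \
      "(timesheet_id, children_group_id, child_id, reasons_absence_id, date) " \
      "VALUES #{values_sql}"
```

In real Rails code, replace the helper with the connection's own `quote` so type-specific escaping matches the database adapter.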

1 answer
Andrey Demidenko, 2017-07-25
@Dem1

There are the bulk_insert and activerecord-import gems — both batch many records into a single INSERT statement for you.
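Neither gem is shown in the thread, so as a hedged sketch: with activerecord-import, the question's per-row lookups could feed one multi-row INSERT through the model. The `TimesheetDate` model name is an assumption inferred from the `timesheet_dates` table, and the sample inputs stand in for the question's variables; the gem call itself is left as a comment so the example runs standalone:

```ruby
require "time"

# Sample inputs standing in for the question's id, lookup hashes and ts_data.
id = 1
children_groups  = { "G1" => 10 }
children         = { "C1" => 100 }
reasons_absences = { "R1" => 5 }
ts_data = [
  { children_group_code: "G1", child_code: "C1",
    reasons_absence_code: "R1", date: "2017-07-25" }
]
now = Time.now

columns = [:timesheet_id, :children_group_id, :child_id,
           :reasons_absence_id, :date, :created_at, :updated_at]

# One array of values per row, in the same order as `columns`.
rows = ts_data.map do |ts|
  [id,
   children_groups[ts[:children_group_code]],
   children[ts[:child_code]],
   reasons_absences[ts[:reasons_absence_code]],
   ts[:date], now, now]
end

# With the activerecord-import gem this becomes a single INSERT
# (validate: false also skips per-record validations for speed):
# TimesheetDate.import(columns, rows, validate: false)
```

The gem handles quoting and adapter-specific SQL itself, which removes both the injection risk and the leading-comma bookkeeping of the hand-built string.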
