Batch insertion in Rails 3

若如初见. Submitted on 2019-11-27 18:08:14

ActiveRecord's .create method supports bulk creation: it uses the underlying DB engine's bulk insert if the database supports it, and emulates the feature otherwise.

Just pass an array of attribute hashes.

# Create an Array of new objects
User.create([{ :first_name => 'Jamie' }, { :first_name => 'Jeremy' }])

A block is also supported, and it's the common way to set shared attributes.

# Creating an Array of new objects using a block, where the block is executed for each object:
User.create([{ :first_name => 'Jamie' }, { :first_name => 'Jeremy' }]) do |u|
  u.is_admin = false
end

I finally reached a solution based on the two answers from @Simone Carletti and @Sumit Munot.

Until the Postgres driver supports bulk insertion through ActiveRecord's .create method, I would go with the activerecord-import gem. It performs the bulk insert, and does so in a single INSERT statement.

books = []
10.times do |i|
  books << Book.new(:name => "book #{i}")
end
Book.import books

In Postgres this leads to a single INSERT statement.

Once the Postgres driver supports bulk insertion in a single INSERT statement through ActiveRecord's .create method, @Simone Carletti's solution will make more sense :)
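For larger datasets, activerecord-import also accepts separate columns and values arrays, which skips model instantiation entirely. A minimal sketch, mirroring the Book example above (the Book.import call is commented out so the snippet stands alone):

```ruby
# Build a columns list and a parallel array of value rows;
# activerecord-import turns these into one multi-row INSERT.
columns = [:name]
values  = 10.times.map { |i| ["book #{i}"] }

# Book.import columns, values, validate: false  # one INSERT, no model objects
```

Skipping validations with validate: false trades safety for speed, so only do it when the data is already trusted.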

You can add a method to your Rails model that performs the insert queries, and then run it with:

rails runner MyModelName.my_method_name

This is the approach I used in my project.

Update:

I use the following in my project, but note that it is not protected against SQL injection. If you are not using user input in this query, it may work for you:

user_string = " ('a@ao.in','a'), ('b@ao.in','b')"
User.connection.insert("INSERT INTO users (email, name) VALUES"+user_string)
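To reduce the injection risk noted above, the values can be escaped before being interpolated. A minimal sketch: in a real app you would prefer ActiveRecord's connection.quote, but here single quotes are doubled by hand so the example runs standalone (sql_values is a hypothetical helper name):

```ruby
# Build a "(email, name), (email, name)" VALUES list with single
# quotes escaped by doubling, as Postgres expects in string literals.
def sql_values(rows)
  rows.map { |email, name|
    "('#{email.gsub("'", "''")}', '#{name.gsub("'", "''")}')"
  }.join(', ')
end

values = sql_values([['a@ao.in', 'a'], ["o'brien@ao.in", 'b']])
sql = "INSERT INTO users (email, name) VALUES #{values}"
# User.connection.insert(sql)
```

This still builds raw SQL, so it remains a last resort compared to activerecord-import or .create.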

For multiple records:

new_records = [
  {:column => 'value', :column2 => 'value'}, 
  {:column => 'value', :column2 => 'value'}
]

MyModel.create(new_records)

You can do it the fast way or the Rails way ;) In my experience, the best way to import bulk data into Postgres is via CSV. What takes several minutes the Rails way takes only seconds using Postgres's native CSV import capability.

http://www.postgresql.org/docs/9.2/static/sql-copy.html

It even fires database triggers and respects database constraints.
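As a rough sketch of that workflow, you could generate the CSV from Ruby's standard library and then load it with Postgres's COPY (the file path and table here are illustrative assumptions):

```ruby
require 'csv'

# Write the rows to CSV text; Postgres can then load the file with:
#   COPY users (email, name) FROM '/tmp/users.csv' WITH (FORMAT csv)
rows = [['a@ao.in', 'a'], ['b@ao.in', 'b']]
csv = CSV.generate do |out|
  rows.each { |row| out << row }
end

# File.write('/tmp/users.csv', csv)
```

Note that COPY ... FROM a file requires the file to be readable by the database server; from a client you would use psql's \copy instead.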

Edit (after your comment): Gotcha. In that case you have correctly described your two options. I have been in the same situation before: I implemented it using the Rails strategy of 1000 individual save! calls because it was the simplest thing that worked, and then optimized it to the 'append a huge query string' strategy because it performed an order of magnitude better.

Of course, premature optimization is the root of all evil, so perhaps do it the simple, slow Rails way first, and know that building a big query string is a perfectly legitimate optimization technique at the expense of maintainability. I feel your real question is 'is there a Railsy way that doesn't involve thousands of queries?' — unfortunately, the answer to that is no.
