I was using ActiveRecord to get my stories and then generate a CSV the standard way, as done in the RailsCast. But I have a lot of rows and it takes minutes. I think if I could get the database to generate the CSV itself it would be much faster.
AFAIK you need to use the copy_data method on the underlying PostgreSQL database connection for this:
copy_data(sql) {|sql_result| ... } -> PG::Result

Execute a copy process for transferring data to or from the server.

This issues the SQL COPY command via #exec. The response to this (if there is no error in the command) is a PG::Result object that is passed to the block, bearing a status code of PGRES_COPY_OUT or PGRES_COPY_IN (depending on the specified copy direction). The application should then use #put_copy_data or #get_copy_data to receive or transmit data rows and should return from the block when finished.
And there's even an example:
conn.copy_data "COPY my_table TO STDOUT CSV" do
  while row = conn.get_copy_data
    p row
  end
end
ActiveRecord's wrapper for the raw database connection doesn't know what copy_data is, but you can use raw_connection to unwrap it:
conn = ActiveRecord::Base.connection.raw_connection
csv = []
conn.copy_data('copy stories to stdout with csv header') do
  while row = conn.get_copy_data
    csv.push(row)
  end
end
That would leave you with an array of CSV strings in csv (one CSV row per array entry). Each row string already ends in a newline as COPY emits it, so csv.join gives you the final CSV data.
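To illustrate the shape of the result, here is a sketch with made-up rows standing in for what get_copy_data would yield (the actual rows come from the database; each already ends in a newline as COPY emits it):

```ruby
# Hypothetical rows, as get_copy_data would yield them for a CSV COPY:
# one CSV line per string, newline included.
rows = [
  "id,title\n",
  "1,First story\n",
  "2,Second story\n"
]

# Because each row carries its own newline, plain concatenation
# yields the complete CSV document.
csv_data = rows.join
puts csv_data
```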
This answer builds on the answer provided by @mu-is-too-short, but streams the response instead of accumulating it in a temporary array.
headers['X-Accel-Buffering'] = 'no'
headers['Cache-Control'] = 'no-cache'
headers['Transfer-Encoding'] = 'chunked'
headers['Content-Type'] = 'text/csv; charset=utf-8'
headers['Content-Disposition'] = 'inline; filename="data.csv"'
headers.delete('Content-Length')
sql = "SELECT * FROM stories WHERE stories.id IN (#{story_ids.join(',')})" # make sure story_ids holds only integers, or this is open to SQL injection
self.response_body = Enumerator.new do |chunk|
  conn = ActiveRecord::Base.connection.raw_connection
  conn.copy_data("COPY (#{sql.chomp(';')}) TO STDOUT WITH (FORMAT CSV, HEADER TRUE, FORCE_QUOTE *, ESCAPE E'\\\\');") do
    while row = conn.get_copy_data
      chunk << "#{row.bytesize.to_s(16)}\r\n" # chunk size is in bytes, hex-encoded
      chunk << row
      chunk << "\r\n"
    end
    chunk << "0\r\n\r\n" # zero-length chunk terminates the response
  end
end
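The framing inside the loop is plain HTTP/1.1 chunked transfer encoding: each chunk is its byte length in hex, CRLF, the payload, CRLF, and a zero-length chunk ends the stream. As a standalone sketch (chunk_frame is a hypothetical helper name, not part of the pg API):

```ruby
# Frame one payload as an HTTP/1.1 chunk: hex byte-length, CRLF, data, CRLF.
# Note bytesize rather than length: chunk sizes count bytes, which matters
# once rows contain multibyte UTF-8 characters.
def chunk_frame(row)
  "#{row.bytesize.to_s(16)}\r\n#{row}\r\n"
end

frames = ["id,title\n", "1,Erzählung\n"].map { |row| chunk_frame(row) }
frames << "0\r\n\r\n" # zero-length chunk ends the response
puts frames.join
```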
You can also compress the stream on the fly: require 'zlib', wrap the yielder with gz = Zlib::GzipWriter.new(Stream.new(chunk)), and write rows with gz.write row, using a class akin to
class Stream
  def initialize(block)
    @block = block
  end

  def write(row)
    @block << "#{row.bytesize.to_s(16)}\r\n" # byte length, so multibyte rows frame correctly
    @block << row
    @block << "\r\n"
  end
end
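A quick way to convince yourself the wrapper emits valid gzip output is to feed it a plain array instead of the enumerator's yielder (both respond to <<), then strip the chunk framing and gunzip the payload. Note gz.finish rather than gz.close, so Zlib doesn't try to close the wrapped object:

```ruby
require 'zlib'

# The Stream class from above: frames each write as one HTTP chunk.
class Stream
  def initialize(block)
    @block = block
  end

  def write(row)
    @block << "#{row.bytesize.to_s(16)}\r\n"
    @block << row
    @block << "\r\n"
  end
end

chunks = [] # stands in for the Enumerator yielder; both respond to <<
gz = Zlib::GzipWriter.new(Stream.new(chunks))
gz.write "id,title\n"
gz.write "1,First story\n"
gz.finish # flush the gzip trailer without closing the underlying object

# Every write pushed three entries: hex size line, payload, CRLF.
# Re-assemble the payload bytes and decompress to check the round trip.
payload = chunks.each_slice(3).map { |_size, data, _crlf| data }.join
puts Zlib.gunzip(payload)
```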
And remember to set headers['Content-Encoding'] = 'gzip'. See also this gist.