Is it possible to stream a large SQL Server database result set using Dapper?

Submitted on 2020-07-18 03:58:31

Question


I have about 500K rows I need to return from my database (please don't ask why).

I will then need to save these results as XML (more URGH) and then FTP this file to somewhere magical.

I also need to transform each row in the result set.

Right now, this is what I'm doing with, say, the TOP 100 results:

  • using Dapper's Query<T> method, which throws the entire result set into memory
  • I then use AutoMapper to convert the database POCO to my FileResult POCO
  • Convert to XML
  • Then save this collection to the file system
  • Then FTP

This works fine for 100 rows, but I get an Out Of Memory exception with AutoMapper when trying to convert the 500K results to a new collection.

So, I was wondering if I could do this...

  • Stream data from DB using Dapper
  • For each row, map it with AutoMapper
  • Convert to XML
  • Stream result to disk
  • <repeat for each row>
  • Now ftp that file to magic-land

I'm trying to stop throwing everything into RAM. My thinking is that if I can stream the data, it's more memory efficient because I only work on a single row of the result set at a time.
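A rough sketch of what that pipeline could look like (the SQL, the DbRow/FileResult POCOs, and the element names are placeholder assumptions, not from the question; the AutoMapper step is shown as a trivial hand-mapping for brevity):

```csharp
using System.Data.SqlClient;
using System.Xml;
using Dapper;

// Placeholder POCOs standing in for the real database and file shapes.
public class DbRow      { public int Id { get; set; } public string Name { get; set; } }
public class FileResult { public int Id { get; set; } public string Name { get; set; } }

public static class Exporter
{
    public static void Export(string connectionString, string path)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var writer = XmlWriter.Create(path))   // streams XML to disk as it's written
        {
            writer.WriteStartElement("Results");

            // buffered: false tells Dapper to yield rows as they are read from
            // the data reader, instead of materialising a List<T> first.
            foreach (var row in connection.Query<DbRow>(
                "SELECT Id, Name FROM BigTable", buffered: false))
            {
                // Map one row at a time (an AutoMapper
                // mapper.Map<FileResult>(row) call would slot in here the same way).
                var result = new FileResult { Id = row.Id, Name = row.Name };

                writer.WriteStartElement("Row");
                writer.WriteElementString("Id", result.Id.ToString());
                writer.WriteElementString("Name", result.Name);
                writer.WriteEndElement();
            }

            writer.WriteEndElement();
        }
        // FTP the finished file afterwards.
    }
}
```

Because XmlWriter flushes to disk as it goes and the query is unbuffered, only one row is ever held in memory, regardless of how many rows the query returns.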


Answer 1:


using Dapper's Query<T> method, which throws the entire result set into memory

It is a good job, then, that one of the optional parameters is a bool that lets you choose whether to buffer or not ;p

Just add , buffered: false to your existing call to Query<T>.
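For example (the SQL and POCO here are placeholders):

```csharp
// With buffered: false, Query<T> returns a lazily-evaluated IEnumerable<T>;
// rows are read from the open data reader only as you enumerate them.
var rows = connection.Query<MyPoco>(
    "SELECT Id, Name FROM BigTable",
    buffered: false);

foreach (var row in rows)
{
    // process one row at a time; the full result set is never in memory
}
```

One caveat: with an unbuffered query the underlying connection and reader stay open until you finish enumerating, so do all per-row work inside the loop and don't try to run a second query on the same connection in the meantime.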



Source: https://stackoverflow.com/questions/34169362/it-is-possible-to-stream-a-large-sql-server-database-result-set-using-dapper
