Question
In C#, I have a huge dataset that I want to write into an XML file with WriteXml. This is my code:
using (var myConnection = new SqlConnection("Data Source=192.168.5.28;Initial Catalog=dbName;User ID=sa;Password=davood$;"))
{
    var da = new SqlDataAdapter("select * from tblName", myConnection);
    var ds = new DataSet();
    da.Fill(ds);
    var filestream = File.Create("D:\\tblName.xml");
    var buffered = new BufferedStream(filestream);
    ds.WriteXml(buffered);

    // ds.WriteXml("D:\\tblName.xml", System.Data.XmlWriteMode.IgnoreSchema);
    //
    //using (var fileStream = File.Create("D:\\tblName.xml"))
    //{
    //    using (var zipStream = new GZipStream(fileStream, CompressionMode.Compress))
    //    {
    //        ds.WriteXml(zipStream, XmlWriteMode.WriteSchema);
    //    }
    //}
}
But after the program executes (which takes a long time), the file cannot be opened. Thanks.
Answer 1:
For huge amounts of data (bulk operations), it is not advisable to use C#, an ORM, or ADO.NET, because of performance, memory, I/O, round-trip, and other issues.
I would prefer using the RDBMS's core functions to produce the files.
Alternatively, you can use paging to divide the query result and create a separate XML file for each page. On Windows, FAT32 volumes have a 4 GB file size limit, so the page size must be balanced against your table's row size.
First:
Fetch the table row count (pseudocode; the ADO.NET plumbing for executing the query is omitted):
var dataRowsCount = /* SELECT COUNT(*) FROM tblName */;
Second:
Choose an effective page size. Dividing the result of the first call by the page size gives your loop count, something like this:
var pageSize = 1000;
var pageCount = (dataRowsCount / pageSize) + 1;
Third:
In a loop (based on the result of the second phase), call a paged query to get the data and create several XML files.
for (var i = 0; i < pageCount; i++)
{
    // Call the paged query and create one file per page.
    // SQL Server paged query (pageSize and pageSize * i must be
    // substituted as values; TOP with an expression needs parentheses):
    //
    //   SELECT TOP (pageSize) columns
    //   FROM Table
    //   WHERE IDColumn NOT IN ( SELECT TOP (pageSize * i) IDColumn
    //                           FROM Table
    //                           ORDER BY SortColumn )
    //   ORDER BY SortColumn;
}
Paging query samples for MS SQL Server, Oracle, and MySQL can be found here.
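Putting the three steps above together, a minimal sketch could look like this. Note the assumptions: the connection string mirrors the question's (password elided), `IDColumn` is a hypothetical unique sort column, the output paths are illustrative, and `OFFSET`/`FETCH` (SQL Server 2012+) replaces the `TOP`/`NOT IN` pattern because it takes parameters cleanly:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class PagedXmlExport
{
    static void Main()
    {
        const string connStr =
            "Data Source=192.168.5.28;Initial Catalog=dbName;User ID=sa;Password=...;";
        const int pageSize = 1000;

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            // First: fetch the table row count.
            int dataRowsCount;
            using (var countCmd = new SqlCommand("SELECT COUNT(*) FROM tblName", conn))
                dataRowsCount = (int)countCmd.ExecuteScalar();

            // Second: derive the number of pages.
            var pageCount = (dataRowsCount / pageSize) + 1;

            // Third: export each page to its own XML file.
            for (var i = 0; i < pageCount; i++)
            {
                const string sql =
                    "SELECT * FROM tblName ORDER BY IDColumn " +
                    "OFFSET @skip ROWS FETCH NEXT @take ROWS ONLY";

                using (var cmd = new SqlCommand(sql, conn))
                {
                    cmd.Parameters.AddWithValue("@skip", i * pageSize);
                    cmd.Parameters.AddWithValue("@take", pageSize);

                    // Only one page of rows is ever held in memory.
                    var ds = new DataSet("tblName");
                    new SqlDataAdapter(cmd).Fill(ds);
                    ds.WriteXml($"D:\\tblName_{i}.xml", XmlWriteMode.IgnoreSchema);
                }
            }
        }
    }
}
```

Each output file is a well-formed XML document on its own, which also keeps every file comfortably under the size limit mentioned above.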
Answer 2:
My suspicion is that if the data is "huge", it's probably having memory trouble loading it into the dataset, or writing it to disk, or perhaps even in whatever application you're trying to open the resulting XML file with.
Personally I'd suggest using a DataReader to read in one record at a time, then output one record of XML at a time. That way the file size should be irrelevant. It is more work, but better practice.
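A minimal sketch of that record-at-a-time approach, assuming the question's table and output path (the password is elided; `XmlConvert.EncodeLocalName` guards against column names that are not valid XML element names):

```csharp
using System.Data.SqlClient;
using System.Xml;

class StreamingXmlExport
{
    static void Main()
    {
        const string connStr =
            "Data Source=192.168.5.28;Initial Catalog=dbName;User ID=sa;Password=...;";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("SELECT * FROM tblName", conn))
        {
            conn.Open();
            var settings = new XmlWriterSettings { Indent = true };

            using (var reader = cmd.ExecuteReader())
            using (var writer = XmlWriter.Create("D:\\tblName.xml", settings))
            {
                writer.WriteStartElement("tblName");
                while (reader.Read())
                {
                    // One <row> element per record; only the current
                    // record is ever held in memory.
                    writer.WriteStartElement("row");
                    for (var i = 0; i < reader.FieldCount; i++)
                    {
                        writer.WriteElementString(
                            XmlConvert.EncodeLocalName(reader.GetName(i)),
                            reader.IsDBNull(i) ? "" : reader.GetValue(i).ToString());
                    }
                    writer.WriteEndElement();
                }
                writer.WriteEndElement();
            } // disposing the writer flushes and closes the file,
              // so the closing tags are always written
        }
    }
}
```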
If that doesn't help:
- Were any exceptions thrown?
- Was an XML file created?
- If so, how big?
- What did you try to open it with?
- What happened when you tried to open it?
- Did the file have closing tags at the end?
Source: https://stackoverflow.com/questions/16432916/best-approach-to-write-huge-sql-dataset-into-xml-file