Question
I have an issue creating worksheets when adding large DataTables: I am getting out of memory errors. I turned off event tracking, as suggested, to improve performance and memory consumption, but I am still getting these errors.
Here is some sample code:
public void SaveWorkBook(string xlFileName, DataSet dataSet)
{
    using (var xlWorkBook = new XLWorkbook(XLEventTracking.Disabled))
    {
        xlWorkBook.AddWorksheet(dataSet);
        xlWorkBook.SaveAs(xlFileName);
    }
}
public void SaveWorkBook(string xlFileName, params DataTable[] dataTables)
{
    using (var xlWorkBook = new XLWorkbook(XLEventTracking.Disabled))
    {
        foreach (var dataTable in dataTables)
            xlWorkBook.AddWorksheet(dataTable);
        xlWorkBook.SaveAs(xlFileName);
    }
}
The issue, of course, arises when the DataTables are huge: thousands of records or more. I tried an alternative approach for large data tables: use a temporary table with a smaller record count, flush it to the worksheet, clear the data table, then refill it with new data. Fill, flush, clear, repeat. That, or the InsertData() method.
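To be concrete, here is roughly what I mean by fill, flush, clear, repeat. This is only a sketch of the idea (SaveWorkBookChunked, records, and ChunkSize are my own placeholder names), and it assumes InsertData accepts an enumerable whose items are themselves enumerables, one per row. I also realize ClosedXML keeps the whole workbook in memory, so batching the input may not prevent the out of memory errors at save time:
// requires: using System.Collections.Generic; using System.Data;
//           using System.Linq; using ClosedXML.Excel;
public void SaveWorkBookChunked(string xlFileName, DataTable template, IEnumerable<object[]> records)
{
    const int ChunkSize = 5000; // arbitrary batch size
    using (var xlWorkBook = new XLWorkbook(XLEventTracking.Disabled))
    {
        var workSheet = xlWorkBook.AddWorksheet(template.TableName);

        // Header row built from the template's column names.
        var header = template.Columns.Cast<DataColumn>()
                             .Select(c => c.ColumnName)
                             .ToArray();
        workSheet.Cell(1, 1).InsertData(new[] { header });

        var rowIndex = 1;
        var buffer = new List<object[]>(ChunkSize);
        foreach (var record in records)
        {
            buffer.Add(record);
            if (buffer.Count == ChunkSize)
            {
                // Flush the batch: each object[] should become one worksheet row.
                workSheet.Cell(rowIndex + 1, 1).InsertData(buffer);
                rowIndex += buffer.Count;
                buffer.Clear();
            }
        }
        if (buffer.Count > 0)
            workSheet.Cell(rowIndex + 1, 1).InsertData(buffer);

        xlWorkBook.SaveAs(xlFileName);
    }
}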
The issue is that whenever I write the data using the available options, only the first column gets written. Further down the file I can see the other columns of data, but they are stacked in the first column of the worksheet instead of spanning the columns of a single row as they are supposed to, and only one row of them appears, probably the last record that was written.
Can someone help me understand how this works?
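My guess is that InsertData treats each item of the enumerable it receives as a separate row, so a flat string[] ends up one field per row, all in a single column. If that is right, wrapping the split array in an outer array should produce a single row spanning the columns. A hypothetical illustration of that assumption (workSheet as in the snippet below):
var fields = "a,b,c".Split(','); // string[3]

// Each string is treated as its own row: a, b, c stack vertically in column 1.
workSheet.Cell(1, 1).InsertData(fields);

// Wrapped once, the string[] is a single row: a, b, c span three columns.
workSheet.Cell(1, 1).InsertData(new[] { fields });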
Here is a code snippet using the InsertData method. I create the worksheet using a DataTable as a template, since it has all the columns defined, then get that worksheet and try to add to it.
public void SaveWorkBook(string xlFileName, DataTable dataTable, string dataFileName, char delimiter)
{
    using (var xlWorkBook = new XLWorkbook(XLEventTracking.Disabled))
    {
        // Create the worksheet from the DataTable so all the columns are defined.
        xlWorkBook.AddWorksheet(dataTable);
        var workSheet = xlWorkBook.Worksheets.First(ws => ws.Name == dataTable.TableName);
        using (var sr = new StreamReader(dataFileName))
        {
            var rowIndex = 1;
            sr.ReadLine(); // skip the header line
            while (!sr.EndOfStream)
            {
                var line = sr.ReadLine() + string.Empty; // guard against a null last read
                workSheet.Cell(++rowIndex, 1).InsertData(line.Split(delimiter));
            }
        }
        xlWorkBook.SaveAs(xlFileName);
    }
}
As you can see, I am using a StreamReader to read the CSV file I created earlier one line at a time. Loading the entire file into the DataTable causes out of memory errors, so instead I read the file line by line and try to add each line after creating the worksheet.
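If the row-per-item behavior I guessed at above is real, the fix would be a one-line change to the loop, wrapping the split so each CSV line becomes one worksheet row:
while (!sr.EndOfStream)
{
    var line = sr.ReadLine() + string.Empty;
    // Wrap the string[] so it is treated as one row rather than one column.
    workSheet.Cell(++rowIndex, 1).InsertData(new[] { line.Split(delimiter) });
}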
Can anyone share some insight into this?
Thanks
Answer 1:
This is a known issue in ClosedXML. There is currently no solution. See the issue at https://github.com/ClosedXML/ClosedXML/issues/264
Source: https://stackoverflow.com/questions/43458813/outofmemory-errors-using-large-datatables-with-closedxml