I'm using Delphi to create an XML document from data in a relational database. It tests fine with small datasets, but when I try to expand the size of the data set to production levels it eventually bombs out with an EOutOfMemory exception during node creation.
I'm using a TXMLDocument dropped on a form (MSXML as the Vendor), and my code generally looks like this:
DS := GetDS(Conn, 'SELECT Fields. . . FROM Table WHERE InsuredID = ' + IntToStr(AInsuredID));
try
  while not DS.Eof do
    with ANode.AddChild('LE') do
    begin
      AddChild('LEProvider').Text := DS.FieldByName('LEProvider').AsString;
      // Need to handle "other" here
      AddChild('Date').Text := DateToXMLDate(DS.FieldByName('LEDate').AsDateTime);
      AddChild('Pct50').Text := DS.FieldByName('50Percent').AsString;
      AddChild('Pct80').Text := DS.FieldByName('80Percent').AsString;
      AddChild('Actuarial').Text := DS.FieldByName('CompleteActuarial').AsString;
      AddChild('Multiplier').Text := DS.FieldByName('MortalityMultiplier').AsString;
      DS.Next;
    end;
finally
  DS.Free;
end;
This section, along with numerous other similarly constructed sections for different database tables, is executed many times. In this example, ANode is an IXMLNode passed into the function for use as a container.
I do not expect the resulting XML file on disk to be more than 10 megabytes. I assume that somehow I'm leaking memory in my creation and disposal of XMLNodes, but I'm not familiar enough with Interfaces to know how to track down my problem.
TXMLDocument is a DOM-style interface and keeps the whole document in memory. Memory gets used up rather quickly that way, even when the resulting file is not that big. You don't really need TXMLDocument to write out simple XML. Why not write directly to a file in XML format?
That being said: it could also be an error due to heap fragmentation, or a real memory leak. You might want to try one of the tools mentioned here: Profiler and Memory Analysis Tools for Delphi
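The "write directly to a file" suggestion can be sketched roughly like this. This is a minimal sketch, not the asker's actual code: WriteLEElements and the Esc helper are hypothetical names, and only a few of the fields are shown.

procedure WriteLEElements(var F: TextFile; DS: TDataSet);

  function Esc(const S: string): string;
  begin
    // Escape the characters that are special in XML text content
    Result := StringReplace(S, '&', '&amp;', [rfReplaceAll]);
    Result := StringReplace(Result, '<', '&lt;', [rfReplaceAll]);
    Result := StringReplace(Result, '>', '&gt;', [rfReplaceAll]);
  end;

begin
  while not DS.Eof do
  begin
    WriteLn(F, '<LE>');
    WriteLn(F, '  <LEProvider>' + Esc(DS.FieldByName('LEProvider').AsString) + '</LEProvider>');
    WriteLn(F, '  <Pct50>' + Esc(DS.FieldByName('50Percent').AsString) + '</Pct50>');
    // ... remaining fields follow the same pattern ...
    WriteLn(F, '</LE>');
    DS.Next;
  end;
end;

Because each row is written to disk and then forgotten, memory use stays flat no matter how many records there are, which is exactly what a DOM tree cannot give you.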
Each of those AddChild calls has its result stored in a temporary IXMLNode variable declared implicitly by the compiler. They should get cleaned up automatically when the current subroutine returns (whether normally or via an exception). You can make their lifetime more explicit by declaring your own variables.
var
  le, child: IXMLNode;
begin
  DS := GetDS(Conn, Format(Query, [AInsuredID]));
  try
    while not DS.Eof do
    begin
      le := ANode.AddChild('LE');
      child := le.AddChild('LEProvider');
      child.Text := DS.FieldByName('LEProvider').AsString;
      // Need to handle "other" here
      child := le.AddChild('Date');
      child.Text := DateToXMLDate(DS.FieldByName('LEDate').AsDateTime);
      child := le.AddChild('Pct50');
      child.Text := DS.FieldByName('50Percent').AsString;
      child := le.AddChild('Pct80');
      child.Text := DS.FieldByName('80Percent').AsString;
      child := le.AddChild('Actuarial');
      child.Text := DS.FieldByName('CompleteActuarial').AsString;
      child := le.AddChild('Multiplier');
      child.Text := DS.FieldByName('MortalityMultiplier').AsString;
      DS.Next;
    end;
  finally
    DS.Free;
  end;
end;
In the above code, there are no implicit interface variables. The compiler would have declared a new implicit variable for each AddChild call, but the code above demonstrates that only two are necessary, because child can be reused for each new child node.
That code alone shouldn't cause an extreme amount of memory use, though. It seems more likely that you're keeping references to objects that you don't really need anymore, or you're creating circular references for some interface objects. The MSXML library shouldn't create any circular references of its own, but you haven't shown all the code that might be running here.
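If you suspect a genuine leak, for instance from a circular interface reference, one cheap first check, assuming Delphi 2006 or later where FastMM is the default memory manager, is to turn on leak reporting in the project file:

begin
  ReportMemoryLeaksOnShutdown := True; // FastMM lists unfreed objects when the app exits
  Application.Initialize;
  Application.Run;
end.

Objects kept alive by a reference cycle are never freed, so they will show up in that shutdown report, whereas plain DOM memory growth will not.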
Try using a SAX-style streaming approach rather than DOM; DOM keeps a representation of the whole XML file in memory.
Source: https://stackoverflow.com/questions/1499717/eoutofmemory-creating-large-xml-using-delphi