Out of memory when inserting records in SQLite, FireDac, Delphi

Posted by 时光怂恿深爱的人放手 on 2019-12-13 18:17:38

Question


I have a Delphi app that inserts some 200,000 records (about 1 GB) into an SQLite database via the FireDAC component TFDTable. As it inserts, I can see the application's memory usage increasing until I get an "Out of Memory" error. I'm guessing it has something to do with caching and paging, but I can't find anything that fixes it short of closing and reopening the database every 1,000 records or so. Thoughts?

Edited... Sorry for the thinly worded question... The code was simple so I didn't include it, but it looks basically like this:

procedure DoIt;
begin
  myDB.Insert;                                      // put the TFDTable dataset into insert mode
  myDBField1.AsString := StringOfChar('-', 1000);   // ~1 KB payload per record
  myDB.Post;                                        // post the new record
end;

Now, I expect the memory might increase as the string is likely copied to the DB caches. If I look at the allocations using GetMemoryManagerState(), I can actually see this. I would expect that at some point the cached memory would be flushed as the data is written to disk. However, it does not seem to be; it just keeps growing until I get an "Out of Memory" error.

In general, most of the object properties are left at their defaults, except for selecting SQLite in the connection and adding fields to the table.

I know there is not a lot to go on here, but I didn't think this would fail either, and I was hoping someone may have had a similar issue.


Answer 1:


TFDTable is a thin wrapper around a query object that builds SQL commands for operating on the underlying DBMS table. It has its own storage (a Table object) into which it stores both the data fetched to the client and the tuples that you insert. All of that lives in memory; there is no underlying file cache.

Although that internal storage can be cleared while you're inserting, TFDTable is not a good choice for inserting data in such amounts. Better to use a query object like TFDQuery, which, combined with the batch command execution technique called Array DML, can bring a real performance increase, even for a local DBMS engine. And TFDQuery won't cache inserted tuples.

FireDAC supports this technique for SQLite natively when you use indexed parameter binding. For example, this code should insert 200 batches of 1,000 unique tuples each:

procedure InsertBatches;
const
  BatchSize = 1000;
  TotalSize = 200000;
var
  Batch: Integer;
  Index: Integer;
begin
  FDQuery.SQL.Text := 'INSERT INTO MyTable VALUES (:p1, :p2)';
  FDQuery.Params.BindMode := pbByNumber;   // bind parameters by position, not by name
  FDQuery.Params.ArraySize := BatchSize;   // each parameter holds an array of values

  for Batch := 0 to TotalSize div BatchSize - 1 do
  begin
    // Fill the parameter arrays for one batch of rows.
    for Index := 0 to BatchSize - 1 do
    begin
      FDQuery.Params[0].AsIntegers[Index] := (Batch * BatchSize) + Index;
      FDQuery.Params[1].AsWideStrings[Index] := 'Some Unicode string value';
    end;
    // One Execute per batch: FireDAC runs the INSERT BatchSize times in a single round.
    FDQuery.Execute(BatchSize, 0);
  end;
end;
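For readers outside Delphi, the same batching idea can be sketched in Python using the standard sqlite3 module, whose executemany call likewise hands a whole batch of parameter tuples to the driver in one go. The table name, column layout, and in-memory database here are illustrative assumptions, not part of the original question:

```python
import sqlite3

# In-memory database for the sketch; a real app would open a file instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE MyTable (id INTEGER, val TEXT)")

BATCH_SIZE = 1000
TOTAL_SIZE = 200000

for batch in range(TOTAL_SIZE // BATCH_SIZE):
    # Build one batch of parameter tuples, then pass the whole batch
    # to the driver in a single call -- analogous to Array DML above.
    rows = [((batch * BATCH_SIZE) + i, "Some Unicode string value")
            for i in range(BATCH_SIZE)]
    conn.executemany("INSERT INTO MyTable VALUES (?, ?)", rows)

conn.commit()
count = conn.execute("SELECT COUNT(*) FROM MyTable").fetchone()[0]
print(count)  # 200000
```

Because the rows are materialized one batch at a time and handed off immediately, client-side memory stays bounded by the batch size rather than growing with the total row count.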


Source: https://stackoverflow.com/questions/45975298/out-of-memory-when-inserting-records-in-sqlite-firedac-delphi
