TStringList of objects taking up tons of memory in Delphi XE

遇见更好的自我 2021-01-05 14:50

I'm working on a simulation program.

One of the first things the program does is read in a huge file (28 MB, about 79,000 lines) and parse each line (about 150 fields

10 Answers
  • 2021-01-05 15:18

    Just one idea which may save memory.

    You could let the data stay on the original files, then just point to them from in-memory structures.

    For instance, it's what we do for browsing big log files almost instantly: we memory-map the log file content, then parse it quickly to build indexes of the useful information in memory, then read the content dynamically. No string is created during the reading; only pointers to the beginning of each line, with dynamic arrays containing the needed indexes. Calling TStringList.LoadFromFile would be definitely much slower and more memory-consuming.

    The code is here - see the TSynLogFile class. The trick is to read the file only once, and make all indexes on the fly.

    For instance, here is how we retrieve a line of text from the UTF-8 file content:

    function TMemoryMapText.GetString(aIndex: integer): string;
    begin
      if (self=nil) or (cardinal(aIndex)>=cardinal(fCount)) then
        result := '' else
        result := UTF8DecodeToString(fLines[aIndex],GetLineSize(fLines[aIndex],fMapEnd));
    end;
    

    We use the exact same trick to parse JSON content. Such a mixed approach is also used by the fastest XML access libraries.

    To handle your high-level data and query it fast, you may try to use dynamic arrays of records, and our optimized TDynArray and TDynArrayHashed wrappers (in the same unit). Arrays of records consume less memory and are faster to search because the data is not fragmented (even faster if you use ordered indexes or hashes), and you'll still have high-level access to the content (you can define custom functions to retrieve the data from the memory-mapped file, for instance). Dynamic arrays won't fit fast deletion of items (or you'll have to use lookup tables), but you wrote that you are not deleting much data, so it won't be a problem in your case.

    So you won't have any duplicated structure any more, only logic in RAM, and data on memory-mapped file(s) - I added a "s" here because the same logic could perfectly map to several source data files (you need some "merge" and "live refresh" AFAIK).
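
    The idea above can be sketched as follows. This is a hypothetical minimal version, not the actual TSynLogFile code: map the file once with the Win32 API, record a pointer to the start of each line in a single pass, and decode a string only when a line is actually requested (error handling omitted for brevity).

    ```pascal
    uses
      Windows, SysUtils;

    type
      TMappedLines = class
      private
        fFile, fMap: THandle;
        fBuf, fBufEnd: PAnsiChar;
        fLines: array of PAnsiChar; // pointer to the first char of each line
        fCount: Integer;
      public
        constructor Create(const aFileName: string);
        destructor Destroy; override;
        function GetString(aIndex: Integer): string; // decode on demand
        property Count: Integer read fCount;
      end;

    constructor TMappedLines.Create(const aFileName: string);
    var
      P: PAnsiChar;
      size: Cardinal;
    begin
      fFile := CreateFile(PChar(aFileName), GENERIC_READ, FILE_SHARE_READ,
        nil, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
      size := GetFileSize(fFile, nil);
      fMap := CreateFileMapping(fFile, nil, PAGE_READONLY, 0, 0, nil);
      fBuf := MapViewOfFile(fMap, FILE_MAP_READ, 0, 0, 0);
      fBufEnd := fBuf + size;
      // single pass over the mapped bytes: remember where each line begins
      P := fBuf;
      while P < fBufEnd do
      begin
        if fCount = Length(fLines) then
          SetLength(fLines, fCount * 2 + 256); // grow geometrically
        fLines[fCount] := P;
        Inc(fCount);
        while (P < fBufEnd) and (P^ <> #10) do
          Inc(P);
        if P < fBufEnd then
          Inc(P); // skip the LF
      end;
    end;

    destructor TMappedLines.Destroy;
    begin
      if fBuf <> nil then UnmapViewOfFile(fBuf);
      if fMap <> 0 then CloseHandle(fMap);
      CloseHandle(fFile);
      inherited;
    end;

    function TMappedLines.GetString(aIndex: Integer): string;
    var
      PStart, PEnd: PAnsiChar;
      tmp: UTF8String;
    begin
      PStart := fLines[aIndex];
      PEnd := PStart;
      while (PEnd < fBufEnd) and not (PEnd^ in [#10, #13]) do
        Inc(PEnd);
      SetString(tmp, PStart, PEnd - PStart);
      Result := string(tmp); // UTF-8 to UTF-16, only for this one line
    end;
    ```

    The 28 MB stay in the OS page cache; the only per-line cost in the process heap is one pointer.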

  • 2021-01-05 15:25

    Reading through the comments, it sounds like you need to lift the data out of Delphi and into a database.

    From there it is easy to match organ donors to recipients*):

    SELECT pw.* FROM patients_waiting pw
    INNER JOIN organs_available oa ON (pw.bloodtype = oa.bloodtype) 
                                  AND (pw.tissuetype = oa.tissuetype)
                                  AND (pw.organ_needed = oa.organ_offered)
    WHERE oa.id = '15484'
    

    This query shows the patients that might match against new organ donor 15484.

    In memory you only handle the few patients that match.

    *) simplified beyond all recognition, but still.
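
    From the Delphi side, a query like that can be run through any parameterized dataset. A hypothetical sketch using TADOQuery, which exists in Delphi XE (ProcessPatient is a made-up handler standing in for whatever the simulation does with a match):

    ```pascal
    // Run the matching query with a bound parameter instead of
    // concatenating the donor id into the SQL text.
    procedure ListMatches(Qry: TADOQuery; const DonorID: string);
    begin
      Qry.SQL.Text :=
        'SELECT pw.* FROM patients_waiting pw ' +
        'INNER JOIN organs_available oa ON (pw.bloodtype = oa.bloodtype) ' +
        '  AND (pw.tissuetype = oa.tissuetype) ' +
        '  AND (pw.organ_needed = oa.organ_offered) ' +
        'WHERE oa.id = :donor_id';
      Qry.Parameters.ParamByName('donor_id').Value := DonorID;
      Qry.Open;
      while not Qry.Eof do
      begin
        ProcessPatient(Qry); // hypothetical handler; only matches reach memory
        Qry.Next;
      end;
      Qry.Close;
    end;
    ```

    The database does the joining and filtering, so the 79,000-line data set never has to live in the Delphi heap at all.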

  • 2021-01-05 15:27

    Are you sure you don't suffer from a case of memory fragmentation?

    Be sure to use the latest FastMM (currently 4.97), then take a look at the UsageTrackerDemo demo that contains a memory map form showing the actual usage of the Delphi memory.

    Finally take a look at VMMap that shows you how your process memory is used.

  • 2021-01-05 15:29

    Are there many duplicate strings in your list? Maybe storing only unique strings will help reduce the memory size. See my question about a string pool for a possible (but maybe too simplistic) answer.

  • 2021-01-05 15:30

    In addition to Andreas' post:

    Before Delphi 2009, a string header occupied 8 bytes. Starting with Delphi 2009, a string header takes 12 bytes, so every unique string uses 4 more bytes than before; on top of that, each character now takes twice the memory, since the default string type became the UTF-16 UnicodeString.

    Also, starting with Delphi 2010 I believe, the TObject instance size grew from 4 to 8 bytes, so each single object created by Delphi now uses 4 more bytes. Those 4 bytes were added to support the TMonitor class, I believe.
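
    As a back-of-the-envelope check of those numbers, here is a sketch assuming 32-bit Delphi 2009 or later, ignoring allocator rounding:

    ```pascal
    // Approximate heap footprint of one unique string:
    // 12-byte header + (Length+1) UTF-16 chars (the +1 is the hidden #0).
    function ApproxStringBytes(const S: string): Integer;
    begin
      if S = '' then
        Result := 0 // an empty string is just a nil pointer, no heap block
      else
        Result := 12 + (Length(S) + 1) * SizeOf(Char);
    end;
    // e.g. a 150-character field: 12 + 151*2 = 314 bytes,
    // versus 8 + 151*1 = 159 bytes for the same AnsiString in Delphi 2007,
    // roughly double, before any per-object or per-list overhead.
    ```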

    If you're in desperate need to save memory, here's a little trick that could help if you have a lot of string values that repeat themselves.

    var
      uUniqueStrings: TStringList;

    function ReduceStringMemory(const S: string): string;
    var
      idx: Integer;
    begin
      // TStringList.Find requires the list to be sorted
      if not uUniqueStrings.Find(S, idx) then
        idx := uUniqueStrings.Add(S);
      Result := uUniqueStrings[idx];
    end;

    initialization
      uUniqueStrings := TStringList.Create;
      uUniqueStrings.Sorted := True;
    

    Note that this will help ONLY if you have a lot of string values that repeat themselves. For example, this code uses about 150 MB less on my system:

    var
      sl: TStringList;
      I: Integer;
    begin
      sl := TStringList.Create;
      try
        for I := 0 to 5000000 do
          sl.Add(ReduceStringMemory(StringOfChar('A', 5)));
      finally
        sl.Free;
      end;
    end;
    
  • 2021-01-05 15:30

    May I suggest you try the JEDI Code Library (JCL) class TAnsiStringList, which is like TStringList from Delphi 2007 in that it is made up of AnsiStrings.

    Even then, as others have mentioned, XE will use more memory than Delphi 2007.

    I really don't see the value of loading the full text of a giant flat file into a stringlist. Others have suggested a big-table approach such as Arnaud Bouchez's, or using SQLite, or something like that, and I agree with them.

    I think you could also write a simple class that loads the entire file into memory and provides line-by-line access as object links into a giant in-memory AnsiChar buffer.
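
    One possible shape for such a class, as a hypothetical sketch of the buffer-plus-offsets idea: the whole file lives in a single AnsiString, and each "line" is just a small record pointing into it, so no per-line string is ever allocated up front.

    ```pascal
    uses
      Classes, SysUtils;

    type
      TLineRef = record
        Offset, Len: Integer; // position of the line inside fBuf (1-based)
      end;

      TBufferedLines = class
      private
        fBuf: AnsiString;          // the entire file, one allocation
        fRefs: array of TLineRef;  // one small record per line
      public
        procedure LoadFromFile(const aFileName: string);
        function Line(aIndex: Integer): AnsiString;
      end;

    procedure TBufferedLines.LoadFromFile(const aFileName: string);
    var
      fs: TFileStream;
      i, start, n: Integer;
    begin
      fs := TFileStream.Create(aFileName, fmOpenRead or fmShareDenyWrite);
      try
        SetLength(fBuf, fs.Size);
        if fs.Size > 0 then
          fs.ReadBuffer(fBuf[1], fs.Size);
      finally
        fs.Free;
      end;
      start := 1;
      n := 0;
      for i := 1 to Length(fBuf) do
        if fBuf[i] = #10 then
        begin
          if n = Length(fRefs) then
            SetLength(fRefs, n * 2 + 256);
          fRefs[n].Offset := start;
          fRefs[n].Len := i - start;
          if (fRefs[n].Len > 0) and (fBuf[i - 1] = #13) then
            Dec(fRefs[n].Len); // drop the CR of a CRLF pair
          Inc(n);
          start := i + 1;
        end;
      if start <= Length(fBuf) then
      begin
        // final line without a trailing line break
        SetLength(fRefs, n + 1);
        fRefs[n].Offset := start;
        fRefs[n].Len := Length(fBuf) - start + 1;
        Inc(n);
      end;
      SetLength(fRefs, n);
    end;

    function TBufferedLines.Line(aIndex: Integer): AnsiString;
    begin
      Result := Copy(fBuf, fRefs[aIndex].Offset, fRefs[aIndex].Len);
    end;
    ```

    Each line then costs 8 bytes of index instead of a full string object, and a copy is made only when a line is actually read.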
