Exception of type 'System.OutOfMemoryException' was thrown. Why?

刺人心 2021-02-05 11:20

I have a dynamic query that returns around 590,000 records. It runs successfully the first time, but if I run it again, I keep getting a System.OutOfMemoryException.

5 Answers
  • 2021-02-05 11:37

    You're obviously not disposing of things.

    Consider the "using" statement when working with short-lived objects that implement IDisposable.
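
    A minimal sketch of that pattern, assuming SQL Server via System.Data.SqlClient (the connection string and SQL text are whatever your app supplies):

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    class QueryExample
    {
        static DataSet RunQuery(string connectionString, string sql)
        {
            // Each object is disposed when its using block exits, even if Fill throws.
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            using (var adapter = new SqlDataAdapter(command))
            {
                var results = new DataSet();
                adapter.Fill(results); // Fill opens and closes the connection as needed
                return results;
            }
        }
    }
    ```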

  • 2021-02-05 11:41

    Try to break your large result set into smaller batches. I've run into this kind of problem a number of times, in one case with over a million rows (10 lakh) across 15 columns.
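
    If you're on SQL Server 2012 or later, one way to batch is OFFSET/FETCH paging; a rough sketch, where the table name, key column, and batch size are assumptions:

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    class BatchExample
    {
        const int BatchSize = 10000; // assumed batch size; tune for your data

        static void ProcessInBatches(string connectionString)
        {
            int offset = 0;
            while (true)
            {
                var batch = new DataTable();
                string sql =
                    "SELECT * FROM Orders ORDER BY Id " + // hypothetical table and key
                    $"OFFSET {offset} ROWS FETCH NEXT {BatchSize} ROWS ONLY";

                using (var connection = new SqlConnection(connectionString))
                using (var adapter = new SqlDataAdapter(sql, connection))
                {
                    adapter.Fill(batch);
                }

                if (batch.Rows.Count == 0)
                    break; // no rows left

                // ... process this batch; it becomes collectible once out of scope ...
                offset += BatchSize;
            }
        }
    }
    ```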

  • 2021-02-05 11:56

    "It runs successfully the first time, but if I run it again, I keep getting a System.OutOfMemoryException." What are some reasons this could be happening?

    Regardless of what the others have said, the error has nothing to do with forgetting to dispose of your DbCommand or DbConnection, and you will not fix it by disposing of either of them.

    The error has everything to do with your dataset, which contains nearly 600,000 rows of data. Apparently your dataset consumes more than 50% of the available memory on your machine. Clearly, you'll run out of memory when you return another dataset of the same size before the first one has been garbage collected. Simple as that.

    You can remedy this problem in a few ways:

    • Consider returning fewer records. I personally can't imagine a time when returning 600K records has ever been useful to a user. To minimize the records returned, try:

      • Limiting your query to the first 1000 records. If there are more than 1000 results returned from the query, inform the user to narrow their search results.

      • If your users really insist on seeing that much data at once, try paging the data. Remember: Google never shows you all 22 bajillion results of a search at once; it shows you 20 or so records at a time. It probably doesn't hold all 22 bajillion results in memory either; it likely finds it more memory-efficient to requery its database to generate each new page.

    • If you just need to iterate through the data and you don't need random access, return a DataReader instead; a DataReader only loads one record into memory at a time (sketched just below).
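
    A rough sketch of the DataReader approach, with a placeholder connection string and SQL:

    ```csharp
    using System.Data.SqlClient;

    class ReaderExample
    {
        static void StreamRows(string connectionString, string sql)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Only the current row is buffered here, not all 590,000.
                        object firstColumn = reader[0]; // access columns by ordinal or name
                        // ... process the row, then move on ...
                    }
                }
            }
        }
    }
    ```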

    If none of those are an option, then you need to force .NET to free the memory used by the old dataset before re-running your query, in one of these ways:

    • Remove all references to your old dataset. Anything holding a reference to the dataset will prevent its memory from being reclaimed.

    • If you can't null out every reference to your dataset, clear all of the rows from the dataset, and from any objects bound to those rows, instead. This removes the references to the DataRows and lets the garbage collector reclaim them (sketched just below).
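
    For example, a minimal sketch; in a real app you'd also have to clear any grid bindings or session entries (hypothetical, app-specific names) that still reference the dataset:

    ```csharp
    using System.Data;

    class CleanupExample
    {
        // Release everything we control that still points at the old result set,
        // so the garbage collector is free to reclaim it before the next query runs.
        static void ReleaseOldResults(ref DataSet oldResults)
        {
            if (oldResults == null) return;
            oldResults.Clear(); // drop the DataRows themselves
            oldResults = null;  // drop the last reference we hold
        }
    }
    ```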

    I don't believe you'll need to call GC.Collect() to force a collection cycle. Not only is it generally a bad idea to call GC.Collect(), it also shouldn't be necessary: sufficient memory pressure will cause .NET to invoke the garbage collector on its own.

    Note: calling Dispose on your dataset does not free any memory, nor does it invoke the garbage collector, nor does it remove any reference to your dataset. Dispose is used to clean up unmanaged resources, but the DataSet does not have any. It only implements IDisposable because it inherits from MarshalByValueComponent, so the Dispose method on a DataSet is pretty much useless.

  • 2021-02-05 11:57

    Perhaps you're not disposing of the connection/result objects from the previous run, which means they're still hanging around in memory.

  • 2021-02-05 11:58

    Where does it fail?

    I agree: the issue is probably that your 600,000-row dataset is simply too large. I see that you are then adding it to Session. If you are using SQL Server session state, the dataset will have to be serialized as well.

    Even if you dispose of your objects properly, you will always have at least two copies of this dataset in memory if you run it twice: one in session, one in procedural code. This will never scale in a web application.
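
    A sketch of keeping only one copy alive in an ASP.NET page; the session key "QueryResults" and the RunQuery method are hypothetical:

    ```csharp
    using System.Data;
    using System.Web.UI;

    public class ResultsPage : Page
    {
        void ReloadResults()
        {
            Session.Remove("QueryResults");  // make the old copy collectible before re-querying
            DataSet fresh = RunQuery();      // hypothetical query method
            Session["QueryResults"] = fresh; // store the new copy; the old one is already gone
        }

        DataSet RunQuery()
        {
            // ... run the query and return the result ...
            return new DataSet();
        }
    }
    ```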

    Do the math: 600,000 rows at even one 128-bit GUID per row would yield 9.6 megabytes (600k × 128 / 8 bytes) of raw data alone, not to mention the DataSet overhead.

    Trim down your results.
