I'm running into the strangest thing that I can't figure out. I have a SQL table with a bunch of reports stored in an ntext field. When I copied and pasted the value of on
Wild guess here.
cmd.Parameters.Add(new SqlParameter("CompiledReportTimeID", CompiledReportTimeID));
You missed the @ sign. Does it end up replacing both instances of CompiledReportTimeID with the id, so the equality is always true and you get all the results back?
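If that's it, prefixing the parameter name with @ should line it up with the placeholder in the SQL text. Something like this, assuming the rest of your command setup stays the same:

// Match the @CompiledReportTimeID placeholder in the SQL text exactly.
cmd.Parameters.Add(new SqlParameter("@CompiledReportTimeID", CompiledReportTimeID));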
You should try to read the data sequentially by specifying the command behavior when you execute the reader. Per the documentation: "Use SequentialAccess to retrieve large values and binary data. Otherwise, an OutOfMemoryException might occur and the connection will be closed."
While sequential access is typically used on large binary data, based on the MSDN documentation you can use it to read large amounts of character data as well.
When accessing the data in the BLOB field, use the GetBytes or GetChars typed accessors of the DataReader, which fill an array with data. You can also use GetString for character data; however, to conserve system resources you might not want to load an entire BLOB value into a single string variable. You can instead specify a specific buffer size of data to be returned, and a starting location for the first byte or character to be read from the returned data. GetBytes and GetChars will return a long value, which represents the number of bytes or characters returned. If you pass a null array to GetBytes or GetChars, the long value returned will be the total number of bytes or characters in the BLOB. You can optionally specify an index in the array as a starting position for the data being read.
This MSDN example shows how to perform sequential access. I believe you can use the GetChars method to read the textual data.
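A rough sketch of what that could look like here, assuming a Reports table with a ReportText column and a hypothetical ProcessChunk consumer (needs System.Data and System.Data.SqlClient):

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "SELECT ReportText FROM Reports WHERE CompiledReportTimeID = @CompiledReportTimeID", conn))
{
    cmd.Parameters.AddWithValue("@CompiledReportTimeID", CompiledReportTimeID);
    conn.Open();

    // SequentialAccess tells the reader to stream the column
    // rather than buffering the entire row.
    using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        while (reader.Read())
        {
            char[] buffer = new char[8192];
            long offset = 0;
            long charsRead;

            // GetChars copies up to buffer.Length characters starting at offset
            // and returns how many it actually copied; 0 means we're done.
            while ((charsRead = reader.GetChars(0, offset, buffer, 0, buffer.Length)) > 0)
            {
                ProcessChunk(buffer, (int)charsRead); // hypothetical consumer
                offset += charsRead;
            }
        }
    }
}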
Fundamentally, a System.OutOfMemoryException doesn't just occur when you are out of memory, but when you cannot allocate a single contiguous block of memory for an object. You'll often see that error when trying to create a very large array, or load a large bitmap object, or sometimes when creating large XmlDocuments... Array and String typically need to be allocated contiguously, i.e. they can't be broken up into pieces and allocated into empty spaces in memory.
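As a rough illustration of the contiguity point (the size here is arbitrary):

// Roughly a 1 GB request (char is 2 bytes). On a fragmented address space this
// can throw OutOfMemoryException even when far more total memory is free,
// because the array has to fit in one contiguous block.
char[] huge = new char[500000000];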
This likely isn't a SQL issue and is more an issue with the SqlDataReader trying to allocate a string large enough to contain the data in a row.
You mentioned that it worked properly after a reboot, so let's assume your code is fundamentally correct (though it could still be optimised to expose the data as a stream rather than buffering the recordset) and that the current symptom is environmental. A freshly rebooted machine probably doesn't have much fragmented memory, but as you used it more, the memory fragmented and the error returned...
You may be able to prove the contiguous memory theory by closing as many other programs as possible, and adding code to force a GC.Collect(GC.MaxGeneration) (reference) before the code with the error. This isn't a guarantee, as the memory allocated to your process may still be fragmented.
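A minimal version of that diagnostic, placed just before the failing read, could be:

// Diagnostic only: force a full collection and let finalizers run, then see
// whether the read still fails. This does not guarantee a contiguous block.
GC.Collect(GC.MaxGeneration);
GC.WaitForPendingFinalizers();
GC.Collect(GC.MaxGeneration);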
I think streaming the value might be the way to stop the error occurring, and better than trying to buffer everything into a string. The downside is that you will keep the database connection open while the result is streamed / consumed by the rest of the program, and that brings its own overheads. I'm not sure what your code needs to do with the result, but if it needs to work with a String instance, you may need to expand the memory available to the process (there are several ways to help with that, but they may be off-topic - leave a comment and I can add to this answer if needed).
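If you're on .NET 4.5 or later, one possible shape for the streaming approach (the output file is just a stand-in for whatever actually consumes the text; needs System.IO as well):

using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
{
    if (reader.Read())
    {
        // The reader (and connection) must stay open while the TextReader is consumed.
        using (TextReader text = reader.GetTextReader(0))
        using (StreamWriter output = File.CreateText(@"C:\temp\report.txt"))
        {
            char[] buffer = new char[8192];
            int read;
            while ((read = text.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, read);
            }
        }
    }
}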