Given the opportunity to rewrite, I would, but anyway, here is the code as it stands:
List<string> foobar;
Then we add a bunch of strings to foobar.
I've posted exactly what I did here; it's worth giving it a go. Again, the steps are: fetch one batch of rows, process it, then move to the next batch until no rows come back.
// conn is assumed to be an open SqlConnection (System.Data / System.Data.SqlClient)
List<string> returnList;
int index = 0;

SqlCommand cmd = new SqlCommand("ExampleStoredProc", conn);
cmd.CommandType = CommandType.StoredProcedure;
// Add the parameter once; only its value changes per iteration.
cmd.Parameters.Add(new SqlParameter("@index", index));

while (true)
{
    cmd.Parameters["@index"].Value = index;

    using (SqlDataReader dr = cmd.ExecuteReader())
    {
        if (!dr.HasRows)
        {
            break; // no more batches
        }

        returnList = new List<string>();
        while (dr.Read())
        {
            returnList.Add(dr.GetString(0).Trim());
        }
        // transfer data here, one batch at a time
    }

    index++;
}
and the stored proc should be something like this:
CREATE PROCEDURE ExampleStoredProc
    @index INT
AS
BEGIN
    SELECT *
    FROM veryBigTable
    WHERE Id >= (@index * 1000) AND Id < ((@index + 1) * 1000)
END
GO
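For completeness, a minimal sketch of the setup the C# loop above assumes; the connection string is a placeholder:

using (SqlConnection conn = new SqlConnection("...your connection string..."))
{
    conn.Open();
    // run the paging loop shown above with this conn
}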
It'll definitely work no matter how many records you have; the more data there is, the longer it'll take to finish. One caveat: the loop stops at the first empty batch, so this assumes the Id values are reasonably contiguous (a gap of 1000 or more ids would end it early).
If it's running out even before 2^24 elements when you manually set the correct list size, then that's probably on the right track. Instead of getting to 16 million elements and then trying to double the size of the backing array, the list allocates the really large array up front and runs out of memory earlier.
That also explains why you were getting a round number: the list reached 2^24 elements, then tried to double its capacity, and that allocation is what used too much memory.
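You can watch the doubling happen yourself; a small sketch (names are mine) that prints the capacity every time List<T> grows its backing array:

using System;
using System.Collections.Generic;

class CapacityDemo
{
    static void Main()
    {
        var list = new List<string>();
        int lastCapacity = list.Capacity;

        for (int i = 0; i < 20000000; i++)
        {
            list.Add("x");
            if (list.Capacity != lastCapacity)
            {
                // Prints 4, 8, 16, ... 16777216 (2^24), 33554432 (2^25):
                // the capacity doubles each time it is exceeded.
                Console.WriteLine("Count={0}, Capacity={1}", list.Count, list.Capacity);
                lastCapacity = list.Capacity;
            }
        }
    }
}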
Sounds to me like it's some kind of 'natural' object size limit, as opposed to one in the implementation of the list: by default the CLR refuses to allocate any single object larger than 2 GB, and that includes the array backing a List<T>. If you're trying to use very large lists in 64-bit environments, you need to enable large objects in the application configuration:
http://msdn.microsoft.com/en-us/library/hh285054.aspx
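That setting lives in app.config under <runtime>; a minimal sketch (the element is the one documented at the link above, the rest is boilerplate):

<configuration>
  <runtime>
    <!-- Allow objects larger than 2 GB on 64-bit platforms (.NET 4.5+) -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>

Note that even with this enabled, a single array is still capped at roughly 2^31 elements.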
The OOM is likely due to the way List<T>/ArrayList allocate memory: each time their current capacity is reached, they allocate a new backing array of double the size and copy the elements across. The list cannot double from 2^24 here because that doubled allocation fails. You could theoretically maximize your list size by pre-specifying a capacity up front (i.e. close to the 2 GB object limit).
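A minimal sketch of pre-sizing; the capacity value is illustrative, not a computed maximum:

// Allocate the backing array once, instead of growing it by doubling.
// This avoids the moment where both the old array and the doubled copy
// are alive at the same time.
var bigList = new List<string>(16777216); // List<T>(int capacity) constructor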