I have a table with half a million rows. I need to update every single row, but the ToList() call fails:
List<Contacts> allContacts = objDatabase.Contacts.ToList();
I get a System.OutOfMemoryException every time. Is there a way around this?
I already have the App.Config workaround but still no go:
<gcAllowVeryLargeObjects enabled="true" />
I'm on a 64-bit machine with 8 GB of RAM.
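For context, here is where that element sits in the App.Config (under `<runtime>`); note that it only lifts the 2 GB per-object limit for 64-bit processes, it does not raise the total memory available to the process:

```xml
<configuration>
  <runtime>
    <!-- Allows single objects (e.g. very large arrays) over 2 GB on 64-bit.
         Does not increase the process's overall memory budget. -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```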
Here is a solution using chunking. It disposes the DbContext (and the entities it loaded) after every chunk, so the GC can release that memory long before your system runs out.
int chunkSize = 50;
int curCount = 0;
while (true)
{
    using (var db = new DbEntities())
    {
        // Skip requires an ordered query in LINQ to Entities,
        // so order by the key first (an Id column is assumed here).
        var chunk = db.Contacts
                      .OrderBy(c => c.Id)
                      .Skip(curCount)
                      .Take(chunkSize)
                      .ToArray();
        curCount += chunkSize;
        if (chunk.Length == 0) break;

        foreach (var contact in chunk)
        {
            // do any work for the contact here
            contact.Something = "SomethingNew";
        }

        db.SaveChanges();
    }
}
Feel free to play around with the chunk size. The larger the chunk, the faster the entire process should be, but it will use up more memory.
How about
IEnumerable<Contacts> allContacts = objDatabase.Contacts.AsEnumerable();
Never convert allContacts to a list. Just use it like an enumerator and iterate with a foreach loop to access each contact.
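A minimal sketch of that streaming approach, assuming the `DbEntities` context and `Contacts` set from the question. The query is enumerated row by row instead of being materialized with ToList(), and changes are flushed in batches:

```csharp
using (var db = new DbEntities())
{
    int pending = 0;
    foreach (Contacts c in db.Contacts.AsEnumerable())
    {
        c.Something = "SomethingNew"; // example property, adjust to your entity

        // Flush periodically so pending changes are persisted in batches
        // rather than in one huge SaveChanges at the end.
        if (++pending % 1000 == 0)
            db.SaveChanges();
    }
    db.SaveChanges(); // persist the final partial batch
}
```

One caveat: the DbContext still tracks every entity it materializes, so memory can still grow over half a million rows; the chunking answer above, which uses a fresh context per batch, avoids that.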
try
foreach (Contacts c in objDatabase.Contacts) c.value = newvalue;
objDatabase.SaveChanges();
Source: https://stackoverflow.com/questions/25900285/entity-tolist-generates-a-system-outofmemoryexception