Parallel.ForEach - System Out of Memory Exception

Submitted by ぐ巨炮叔叔 on 2019-12-25 02:17:15

Question


I have a problem with my website crawler: I get a System.OutOfMemoryException after it crawls around 700 URLs. Memory usage rises from the start, and at some point the program simply stops.

It is a console application written in C#.

I think the problem is that I instantiate six new objects in every iteration of the loop. Then I go through them, read their property values with reflection, and create the final object that I save to the database.
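
For reference, the reflection step described here might look roughly like the following sketch; CopyProperties and Mapper are illustrative names, not part of the original code, and the real logic presumably maps scraped values onto the final entity:

    using System.Reflection;

    static class Mapper
    {
        // Copy every readable source property onto a same-named writable
        // property of the target (simplified: no type conversion, no null checks).
        public static void CopyProperties(object source, object target)
        {
            foreach (PropertyInfo src in source.GetType().GetProperties())
            {
                PropertyInfo dst = target.GetType().GetProperty(src.Name);
                if (src.CanRead && dst != null && dst.CanWrite)
                    dst.SetValue(target, src.GetValue(source));
            }
        }
    }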

I expected .NET to destroy those objects when they are no longer used, but that is not the case. What are my options? Is a BackgroundWorker any better?

My code is something like this:

    Parallel.ForEach(Globals.Urls, url =>
    {
        progCtrl.indexIsSet = false;

        // Take the second semicolon-separated field and strip the leading tab.
        var urlHelper = url.Split(';')[1].TrimStart('\t');
        // var urlHelper = Globals.replaceGermanUmlauts(url.Split(';')[1].TrimStart('\t'));
        HtmlDocument htm = new HtmlDocument();

        try
        {
            Company comp0 = new Company();
            Company comp1 = new Company();
            Company comp2 = new Company();
            Company comp3 = new Company();
            Company comp4 = new Company();
            Company comp5 = new Company();
            Company comp6 = new Company();

            // ...then I do some logic, add those companies to a list, and go further.
        }
        catch
        {
            // (error handling elided in the original snippet)
        }
    });

How do I destroy them? I have tried making them IDisposable, but that didn't help.
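
For context, implementing IDisposable does not free managed memory; the garbage collector reclaims an object once nothing references it, so per-iteration locals like the Company instances should become collectible on their own. What typically exhausts memory in a Parallel.ForEach crawler is unbounded concurrency (many HtmlDocument instances alive at once) or a shared list that roots every parsed object. Below is a minimal sketch of one common mitigation, capping the degree of parallelism and persisting results immediately; ParseCompanies and SaveToDb are hypothetical placeholders, not part of the original code:

    using System.Threading.Tasks;
    using HtmlAgilityPack;

    class Crawler
    {
        public void Run(string[] urls)
        {
            var options = new ParallelOptions
            {
                // Bound concurrency so only a few pages are in memory at
                // any moment (4 is an assumed tuning value, not a rule).
                MaxDegreeOfParallelism = 4
            };

            Parallel.ForEach(urls, options, url =>
            {
                var htm = new HtmlDocument();
                // ... load and parse the page into htm ...

                // Save immediately and keep no reference to the parsed
                // document, so each iteration's objects become eligible
                // for collection as soon as the iteration ends.
                SaveToDb(ParseCompanies(htm));
            });
        }

        // Hypothetical helpers standing in for the asker's logic.
        object ParseCompanies(HtmlDocument doc) => new object();
        void SaveToDb(object companies) { }
    }

If the results must be accumulated, writing them to the database in fixed-size batches keeps the working set flat instead of growing with the number of crawled URLs.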

Thanks.

Source: https://stackoverflow.com/questions/22376084/parallel-foreach-system-out-of-memory-exception
