Recommendations for executing .NET HttpWebRequests in parallel in ASP.NET

2021-01-13 04:37

I have an ASP.NET MVC web application that makes REST-style web service calls to other servers. I have a scenario where I am making two HttpWebRequest calls to two separate servers, and I would like to know the recommended way to execute them in parallel.

4 answers
  • 2021-01-13 04:50

    One of the answers to Multithreading WebRequests, a good and stable approach? uses a ManualResetEvent (note that the parameterless `new ManualResetEvent()` shown there won't compile; the constructor requires an initial state, typically false), a reference counter equal to the number of in-flight requests, and Interlocked.Decrement to control the call to event.Set(). The main thread then waits by calling event.WaitOne().

    However, WaitHandles - Auto/ManualResetEvent and Mutex mentions that ManualResetEvent "can be significantly slower than using the various Monitor methods" like Wait, Pulse and PulseAll.
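
    For comparison, a Monitor-based countdown (the approach that article favors) can be sketched like this. This is my own illustration, not code from the linked answer:

    ```csharp
    using System;
    using System.Threading;

    public class MonitorCountdown
    {
        private readonly object _gate = new object();
        private int _count;

        public MonitorCountdown(int count) { _count = count; }

        // Called by each worker thread when it finishes
        public void Signal()
        {
            lock (_gate)
            {
                if (--_count == 0)
                    Monitor.PulseAll(_gate); // wake all waiting threads
            }
        }

        // Called by the main thread to block until the count reaches zero
        public void Wait()
        {
            lock (_gate)
            {
                while (_count > 0)
                    Monitor.Wait(_gate); // releases the lock while waiting
            }
        }
    }
    ```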

    I ended up basing my code on this Noah Blumenthal blog post: Run generic tasks async (fluent-ly). I made two changes: implementing IDisposable (calling .Close() on the ManualResetEvent) and switching from a lock() to Interlocked.Increment() and .Decrement().

    public class AsyncQueueManager : IDisposable {
        private readonly ManualResetEvent waitHandle = new ManualResetEvent(true);
        private int count = 0;
    
        public AsyncQueueManager Queue(Action<object> action) {
            return Queue(action, null);
        }
    
        public AsyncQueueManager Queue(Action<object> action, object state) {
            Interlocked.Increment(ref count);
            waitHandle.Reset();
            Action<object> actionWrapper = CreateActionWrapper(action);
            WaitCallback waitCallback = new WaitCallback(actionWrapper);
            ThreadPool.QueueUserWorkItem(waitCallback, state); // pass state through to the action
            return this;
        }
    
        private Action<object> CreateActionWrapper(Action<object> action) {
            Action<object> actionWrapper = (object state) =>
            {
                try {
                    action(state);
                } catch (Exception ex) {
                    // log
                } finally {
                    // Signal only when the last in-flight action completes
                    if (Interlocked.Decrement(ref count) == 0) {
                        waitHandle.Set();
                    }
                }
            };
            return actionWrapper;
        }
    
        public void Wait() {
            waitHandle.WaitOne();
        }
        public void Wait(TimeSpan timeout) {
            waitHandle.WaitOne(timeout);
        }
    
        public void Dispose() {
            waitHandle.Close();
        }
    }
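
    For context, a minimal usage sketch for the class above, where FetchFromService1/FetchFromService2 are hypothetical stand-ins for the two REST calls in the question:

    ```csharp
    // Hypothetical stand-ins for the two REST calls in the question
    static string FetchFromService1() { return "response 1"; }
    static string FetchFromService2() { return "response 2"; }

    static void RunBothInParallel()
    {
        string result1 = null, result2 = null;

        using (var queue = new AsyncQueueManager())
        {
            queue
                .Queue(_ => result1 = FetchFromService1())
                .Queue(_ => result2 = FetchFromService2())
                .Wait(TimeSpan.FromSeconds(30)); // block until both finish, or time out
        }

        // result1 and result2 are now populated (or still null on failure/timeout)
    }
    ```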
    
  • 2021-01-13 04:51

    I like to do this kind of thing a little more manually, rather than relying on asynchronous web requests or the thread pool's automatic sizing (25 threads by default). Of course, those are perfectly fine ways to solve your problem, but I think the following code is a bit more readable (in the example below, _links would contain a list of your links before processing occurs...):

    private static IList<String> _links = new List<String>();
    private const int NumberOfThreads = 2;
    private const int DefaultTimeoutSeconds = 30; // assumed value; referenced by ProcessWebRequest below
    
    public void SpawnWebRequests()
    {
        IList<Thread> threadList = new List<Thread>();
    
        for (int i = 0; i < NumberOfThreads; i++)
        {
            var thread = new Thread(ProcessWebRequests);
            threadList.Add(thread);
            thread.Start();
        }
    
        for (int i = 0; i < NumberOfThreads; i++)
        {
            threadList[i].Join();
        }
    }
    
    private static void ProcessWebRequests()
    {
        String link;
    
        while (true)
        {
            lock(_links)
            {
                if (_links.Count == 0)
                    break;
    
                link = _links[0];   // IList<T>.RemoveAt returns void,
                _links.RemoveAt(0); // so read the item first, then remove it
            }
    
            ProcessWebRequest(link);
        }
    }
    
    private static void ProcessWebRequest(String link)
    {
        try
        {
            var request = (HttpWebRequest)WebRequest.Create(link);
            request.Method = "HEAD"; // or "GET", since some sites (Amazon) don't allow HEAD
            request.Timeout = DefaultTimeoutSeconds * 1000;
    
            // Get the response (throws an exception if status != 200)
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                if (response.StatusCode == HttpStatusCode.OK)
                    Log.Debug("Working link: {0}", request.RequestUri);
            }
        }
        catch (WebException ex)
        {
            var response = ((HttpWebResponse)ex.Response);
            var status = response != null
                             ? response.StatusCode
                             : HttpStatusCode.RequestTimeout;
    
            Log.WarnException(String.Format("Broken link ({0}): {1}", status, link), ex);
    
            // Don't rethrow, as this is an expected exception in many cases
        }
        catch (Exception ex)
        {
            Log.ErrorException(String.Format("Error processing link {0}", link), ex);
    
            // Rethrow, something went wrong
            throw;
        }
    }
    

    If you just want to manage the size of the thread pool (when using ThreadPool.QueueUserWorkItem()), you can call ThreadPool.SetMaxThreads(2, 2) (worker threads and I/O completion threads, respectively).

    Of course, if you want to use the Microsoft-sanctioned async approach, check out this example: http://msdn.microsoft.com/en-us/library/86wf6409.aspx. Just be sure you clean up each response (via a "using" block or by closing the response object)!
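
    A rough sketch of that asynchronous pattern for two requests follows. The URLs are placeholders, and CountdownEvent requires .NET 4; on older frameworks an Interlocked counter, as in the other answers, works too:

    ```csharp
    using System;
    using System.Net;
    using System.Threading;

    class AsyncRequestSketch
    {
        static void Main()
        {
            // Placeholder URLs standing in for the two services in the question
            var urls = new[] { "http://example.com/service1", "http://example.com/service2" };

            using (var done = new CountdownEvent(urls.Length))
            {
                foreach (var url in urls)
                {
                    var request = (HttpWebRequest)WebRequest.Create(url);
                    request.BeginGetResponse(ar =>
                    {
                        var req = (HttpWebRequest)ar.AsyncState;
                        try
                        {
                            // Always dispose the response so the connection is released
                            using (var response = (HttpWebResponse)req.EndGetResponse(ar))
                            {
                                Console.WriteLine("{0}: {1}", req.RequestUri, response.StatusCode);
                            }
                        }
                        catch (WebException ex)
                        {
                            Console.WriteLine("{0} failed: {1}", req.RequestUri, ex.Status);
                        }
                        finally
                        {
                            done.Signal();
                        }
                    }, request);
                }

                done.Wait(TimeSpan.FromSeconds(30)); // don't block forever
            }
        }
    }
    ```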

    Hope that helps, Noah

  • 2021-01-13 04:51

    I recommend that you create a worker class that does the HttpWebRequest and start it in its own thread for each connection. You can just Join the threads and wait until they both finish, or pass a callback method. In either case you need to account for connection failures, timeouts, and other exceptions. I prefer to use a callback that returns the connection result and the thread's ManagedThreadId, which I use to keep track of threads. The worker class should catch all exceptions so that you can handle them in the calling class.
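
    A minimal sketch of such a worker class might look like the following. The RequestWorker name and the callback signature (thread id, body, error) are my own invention, not from any particular library:

    ```csharp
    using System;
    using System.IO;
    using System.Net;
    using System.Threading;

    public class RequestWorker
    {
        private readonly string _url;
        private readonly Action<int, string, Exception> _callback; // (threadId, result, error)

        public RequestWorker(string url, Action<int, string, Exception> callback)
        {
            _url = url;
            _callback = callback;
        }

        public Thread Start()
        {
            var thread = new Thread(Run);
            thread.Start();
            return thread; // the caller can Join() on this, or just rely on the callback
        }

        private void Run()
        {
            int threadId = Thread.CurrentThread.ManagedThreadId;
            try
            {
                var request = (HttpWebRequest)WebRequest.Create(_url);
                using (var response = (HttpWebResponse)request.GetResponse())
                using (var reader = new StreamReader(response.GetResponseStream()))
                {
                    _callback(threadId, reader.ReadToEnd(), null);
                }
            }
            catch (Exception ex)
            {
                // Catch everything here so the calling class decides how to handle failures
                _callback(threadId, null, ex);
            }
        }
    }
    ```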

    This article offers some insight and fixes for when you exceed the maximum number of connections: http://support.microsoft.com/kb/821268.
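
    The usual fix from that article is to raise the low per-host connection default, either in code or in configuration (the value 10 below is arbitrary):

    ```csharp
    using System.Net;

    class ConnectionLimitSetup
    {
        static void Configure()
        {
            // Set before issuing any requests; applies to ServicePoints created afterwards
            ServicePointManager.DefaultConnectionLimit = 10;
        }
    }

    // The equivalent web.config / app.config setting:
    // <system.net>
    //   <connectionManagement>
    //     <add address="*" maxconnection="10" />
    //   </connectionManagement>
    // </system.net>
    ```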

  • 2021-01-13 05:00

    Unless you are doing something fancy, the size of the ThreadPool is fixed and generally large enough to accommodate your needs. In your particular case, you might run into resource-constraint issues using async I/O calls if your web service is under heavy load and each caller has initiated two thread-pool threads for its own callbacks.

    Perhaps the easiest implementation would be to have two distinct callback functions, one for service1 and one for service2. Inside each completion event, set a trigger variable to true, and then in your main thread wait for both trigger variables to be set. You could do this with reset events, but if your server will be under load you might want to avoid them.

    Pseudo code of the process might be:

    Dictionary<string, bool> callCompleted = new Dictionary<string, bool>();
    
    string operation1Key = Guid.NewGuid().ToString();
    string operation2Key = Guid.NewGuid().ToString();
    
    callCompleted[operation1Key] = false;
    callCompleted[operation2Key] = false;
    
    //make your remote calls and pass the operation key as the data;
    //each callback sets callCompleted[itsKey] = true (under the same lock)
    //....
    
    bool waiting = true;
    while (waiting) {
       bool isFinished = true;
       lock (callCompleted) {
          foreach (string operationKey in callCompleted.Keys) {
             isFinished &= callCompleted[operationKey];
             if (!isFinished) { break; }
          }
       }
    
       waiting = !isFinished;
       if (waiting) { Thread.Sleep(10); } // avoid spinning a core while polling
    }
    

    It's a little rough since I don't know the exact nature of how you are making your calls, but it should work reasonably well.
