We are using HttpClient to send requests to a remote Web API in parallel:
public async Task<HttpResponseMessage> PostAsync(HttpRequestInfo httpRequestInfo)
{
    using (var httpClient = new HttpClient())
    {
        httpClient.BaseAddress = new Uri(httpRequestInfo.BaseUrl);
        if (httpRequestInfo.RequestHeaders.Any())
        {
            foreach (var requestHeader in httpRequestInfo.RequestHeaders)
            {
                httpClient.DefaultRequestHeaders.Add(requestHeader.Key, requestHeader.Value);
            }
        }
        return await httpClient.PostAsync(httpRequestInfo.RequestUrl, httpRequestInfo.RequestBody);
    }
}
This API can be called by several threads concurrently. After running for about four hours we found what looked like a memory leak: the profiling tool showed two ServicePoint objects, one of which was quite big, about 160 MB.
From my understanding, I can see some problems in the code above:

- We should share HttpClient instances as much as possible. In our case the request address and headers can vary a lot, so is this something we can act on, or does it not hurt performance too much? One idea that comes to mind is to prepare a dictionary to store and look up HttpClient instances.
- We didn't change the DefaultConnectionLimit of the ServicePoint, so by default it can only send two concurrent requests to the same server. If we raise this value, could that solve the memory problem? (See the sketch after this list.)
- We also suppressed HTTPS certificate validation:

      ServicePointManager.ServerCertificateValidationCallback = delegate { return true; };

  Does this have something to do with the problem?
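For reference, raising the limit is a one-time setting at application startup. The sketch below is only illustrative; the endpoint URL and the value 50 are placeholders, not values from our project:

    // Run once at application startup, before the first request is sent.
    // 50 is an arbitrary illustrative value; tune it for your workload.
    ServicePointManager.DefaultConnectionLimit = 50;

    // Alternatively, the limit can be raised for a single endpoint only:
    var servicePoint = ServicePointManager.FindServicePoint(new Uri("https://example.com/"));
    servicePoint.ConnectionLimit = 50;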
Because this issue is not easy to reproduce (it takes a lot of time), I just need some thoughts so that I can optimize our code for long-running scenarios.
Let me explain the situation myself, in case others run into this issue later.
First, this is not a memory leak; it is a performance problem.
We test our application on a virtual machine that goes through a proxy, so the internet connection is quite slow. In our case each HTTP request can take 3-4 seconds. As time goes on, more and more connections queue up on the ServicePoint. So it is not a memory leak; the earlier connections are simply not finishing fast enough.
Once each HTTP request is no longer that slow, everything goes back to normal.
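If you want to verify that connections are only queued on the ServicePoint rather than leaked, the state of the endpoint can be inspected at runtime. This is just a diagnostic sketch; the URL is a placeholder for the real base address:

    // Inspect the ServicePoint that handles a given endpoint.
    // "https://example.com/" stands in for the real base address.
    var servicePoint = ServicePointManager.FindServicePoint(new Uri("https://example.com/"));
    Console.WriteLine($"ConnectionLimit:    {servicePoint.ConnectionLimit}");
    Console.WriteLine($"CurrentConnections: {servicePoint.CurrentConnections}");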
We also tried to reduce the number of HttpClient instances to improve HTTP request performance:
private readonly ConcurrentDictionary<HttpRequestInfo, HttpClient> _httpClients =
    new ConcurrentDictionary<HttpRequestInfo, HttpClient>();

private HttpClient GetHttpClient(HttpRequestInfo httpRequestInfo)
{
    // Reuse an existing client for this request info if one is already cached.
    HttpClient existing;
    if (_httpClients.TryGetValue(httpRequestInfo, out existing))
    {
        return existing;
    }

    var httpClient = new HttpClient { BaseAddress = new Uri(httpRequestInfo.BaseUrl) };
    foreach (var requestHeader in httpRequestInfo.RequestHeaders)
    {
        httpClient.DefaultRequestHeaders.Add(requestHeader.Key, requestHeader.Value);
    }
    httpClient.DefaultRequestHeaders.ExpectContinue = false;
    httpClient.DefaultRequestHeaders.ConnectionClose = true;
    httpClient.Timeout = TimeSpan.FromMinutes(2);

    // Another thread may have cached a client for the same key in the meantime;
    // GetOrAdd returns whichever instance actually ended up in the dictionary.
    return _httpClients.GetOrAdd(httpRequestInfo, httpClient);
}
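With the cache in place, the original PostAsync no longer needs to create and dispose a client per call. A rough sketch of what it might look like (assuming RequestUrl is resolved against the client's BaseAddress, as in the original code):

    public async Task<HttpResponseMessage> PostAsync(HttpRequestInfo httpRequestInfo)
    {
        // The client is cached and shared, so it must not be wrapped in a using block here.
        var httpClient = GetHttpClient(httpRequestInfo);
        return await httpClient.PostAsync(httpRequestInfo.RequestUrl, httpRequestInfo.RequestBody);
    }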
In our case, only the request body differs for the same server address. I also overrode the Equals and GetHashCode methods of HttpRequestInfo so it can serve as the dictionary key.
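HttpRequestInfo itself is not shown in the question, so the following is only a sketch of how its equality might be defined, assuming the key is the base URL, request URL, and headers, with the body deliberately excluded:

    public class HttpRequestInfo
    {
        public string BaseUrl { get; set; }
        public string RequestUrl { get; set; }
        public IDictionary<string, string> RequestHeaders { get; set; }
        public HttpContent RequestBody { get; set; }

        public override bool Equals(object obj)
        {
            var other = obj as HttpRequestInfo;
            if (other == null) return false;

            // The body is deliberately ignored: requests that differ only in body
            // should share the same HttpClient.
            return BaseUrl == other.BaseUrl
                && RequestUrl == other.RequestUrl
                && RequestHeaders.Count == other.RequestHeaders.Count
                && !RequestHeaders.Except(other.RequestHeaders).Any();
        }

        public override int GetHashCode()
        {
            // A simple hash over the address parts; headers are left out for brevity,
            // which is fine as long as Equals still compares them.
            return (BaseUrl ?? string.Empty).GetHashCode() ^ (RequestUrl ?? string.Empty).GetHashCode();
        }
    }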
Meanwhile, we set ServicePointManager.DefaultConnectionLimit = int.MaxValue;
Hope this can help you.
Source: https://stackoverflow.com/questions/29096513/big-size-of-servicepoint-object-after-several-hours-sending-http-request-in-para