Use WebClient to download several files


Question


I'm trying to download files from a web server via an NUnit test case like this:

[TestCase("url_to_test_server/456.pdf")]
[TestCase("url_to_test_server/457.pdf")]
[TestCase("url_to_test_server/458.pdf")]
public void Test(string url) 
{
    using (WebClient client = new WebClient())
    {
        client.Headers.Add("User-Agent", "Mozilla/4.0 (compatible; MSIE 8.0)");
        client.DownloadFile(new Uri(url), @"C:\Temp\" + Path.GetFileName(url));
    }
}

This code works, but when I try to get the file size, it hangs.

[TestCase("url_to_test_server/456.pdf")]
[TestCase("url_to_test_server/457.pdf")]
[TestCase("url_to_test_server/458.pdf")]
public void Test(string url) 
{
    using (WebClient client = new WebClient())
    {
        client.Headers.Add("User-Agent", "Mozilla/4.0 (compatible; MSIE 8.0)");
        client.OpenRead(url);
        Int64 bytes_total = Convert.ToInt64(client.ResponseHeaders["Content-Length"]);
        client.DownloadFile(new Uri(url), @"C:\Temp\" + Path.GetFileName(url));
    }
}

How can I solve this?


Answer 1:


You have to make sure that your server actually returns the Content-Length header. It doesn't seem to be a client problem.

I'd download the file in a browser and check the communication using Firebug or a similar tool. You have to see Content-Length explicitly returned in the response. If it isn't, you need to check the server; otherwise your problem is on the client side. I actually can't imagine a reason why the client couldn't read the header if it is indeed being returned.
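
If you would rather check from code than a browser, here is a minimal sketch that sends a HEAD request and prints every response header so you can see whether Content-Length is present. The helper name is illustrative and the URL is the question's own placeholder:

    using System;
    using System.Net;

    static void PrintResponseHeaders()
    {
        // Placeholder; substitute one of the test-server URLs.
        string url = "url_to_test_server/456.pdf";

        // A HEAD request asks the server for headers only, not the file body.
        WebRequest request = WebRequest.Create(url);
        request.Method = "HEAD";

        using (WebResponse response = request.GetResponse())
        {
            // Look for Content-Length in this output.
            foreach (string key in response.Headers.AllKeys)
            {
                Console.WriteLine("{0}: {1}", key, response.Headers[key]);
            }
        }
    }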




Answer 2:


Waking up a dead post, but here is the answer...

This issue happens when the server does not provide Content-Length in the response headers. You have to fix that on the server side.

Another reason it happens is when we have reached the connection limit to the server. So I am assuming that your issue was similar and that it was hanging on the second or third try in a loop.
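
If the connection limit is what you are hitting, the limit itself can also be raised via ServicePointManager. This is only a workaround; the real fix is disposing the stream, as shown in the code below. A minimal sketch (the method name is illustrative):

    using System.Net;

    // Call once, e.g. in the test fixture setup, before the downloads run.
    static void RaiseConnectionLimit()
    {
        // .NET limits the number of simultaneous connections per host
        // (two by default for client applications). Streams left open by
        // OpenRead keep those connections busy, which is why later
        // requests can hang.
        ServicePointManager.DefaultConnectionLimit = 10;
    }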

When we call OpenRead, it opens up a stream. We just need to close this stream after getting our file size to make it work properly.

Here is the code I use to get the size:

    // Requires: using System; using System.IO; using System.Net;
    /// <summary>
    /// Gets the file size from a URL using the WebClient and Stream classes.
    /// </summary>
    /// <param name="address">The URL of the file.</param>
    /// <param name="useHeaderOnly">Requests only the headers (HEAD) instead of the full file.</param>
    /// <returns>File size in bytes, or -1 if there is an issue.</returns>
    static Int64 GetFileSize(string address, bool useHeaderOnly = false)
    {
        Int64 retVal = 0;
        try
        {
            if(useHeaderOnly)
            {
                WebRequest request = WebRequest.Create(address);
                request.Method = "HEAD";

                // WebResponse also has to be closed otherwise we get the same issue of hanging on the connection limit. Using statement closes it automatically.
                using (WebResponse response = request.GetResponse())
                {
                    if (response != null)
                    {
                        retVal = response.ContentLength;
                        //retVal =  Convert.ToInt64(response.Headers["Content-Length"]);
                    }
                }
                request = null;
            }
            else
            {
                using (WebClient client = new WebClient())
                {
                    // Stream has to be closed otherwise we get the issue of hanging on the connection limit. Using statement closes it automatically.
                    using (Stream response = client.OpenRead(address))
                    {
                        retVal = Convert.ToInt64(client.ResponseHeaders["Content-Length"]);
                    }
                }
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message);
            retVal = -1;
        }


        return retVal;
    }
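
For completeness, here is a sketch of how the test case from the question could use this helper before downloading, assuming GetFileSize lives in the same test class and NUnit's constraint asserts are in scope:

    [TestCase("url_to_test_server/456.pdf")]
    public void Test(string url)
    {
        // Ask for the size first via a HEAD request, then download the file.
        Int64 size = GetFileSize(url, useHeaderOnly: true);
        Assert.That(size, Is.GreaterThan(0), "Server did not report Content-Length.");

        using (WebClient client = new WebClient())
        {
            client.Headers.Add("User-Agent", "Mozilla/4.0 (compatible; MSIE 8.0)");
            client.DownloadFile(new Uri(url), @"C:\Temp\" + Path.GetFileName(url));
        }
    }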


Source: https://stackoverflow.com/questions/23950349/use-webclient-to-download-several-files
