I have a webpage which has nothing on it except some strings: no images, no background color, just some plain text that is not particularly long.
If you're downloading text then I'd recommend using WebClient and getting a StreamReader for the text:
// Requires: using System.Net;
WebClient web = new WebClient();
using (System.IO.Stream stream = web.OpenRead("http://www.yoursite.com/resource.txt"))
using (System.IO.StreamReader reader = new System.IO.StreamReader(stream))
{
    string text = reader.ReadToEnd();
}
If this is taking a long time then it is probably a network issue or a problem on the web server. Try opening the resource in a browser and see how long that takes. If the webpage is very large, you may want to look at streaming it in chunks rather than reading all the way to the end as in that example. Look at http://msdn.microsoft.com/en-us/library/system.io.stream.read.aspx to see how to read from a stream.
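For illustration, here is a minimal sketch of reading the response in fixed-size chunks and decoding it incrementally; the URL and the 8 KB buffer size are just placeholders:
// Requires: using System.IO; using System.Net; using System.Text;
WebClient web = new WebClient();
StringBuilder sb = new StringBuilder();
byte[] buffer = new byte[8192];                                         // 8 KB chunks (arbitrary size)
char[] chars = new char[Encoding.UTF8.GetMaxCharCount(buffer.Length)];
Decoder decoder = Encoding.UTF8.GetDecoder();                           // copes with multi-byte characters split across chunks

using (Stream stream = web.OpenRead("http://www.yoursite.com/resource.txt"))
{
    int bytesRead;
    // Stream.Read returns 0 once the end of the response has been reached
    while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        int charCount = decoder.GetChars(buffer, 0, bytesRead, chars, 0);
        sb.Append(chars, 0, charCount);
    }
}

string text = sb.ToString();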
The WebClient class should be more than capable of handling the functionality you describe, for example:
System.Net.WebClient wc = new System.Net.WebClient();
byte[] raw = wc.DownloadData("http://www.yoursite.com/resource/file.htm");
string webData = System.Text.Encoding.UTF8.GetString(raw);
or (further to the suggestion from Fredrick in the comments):
System.Net.WebClient wc = new System.Net.WebClient();
string webData = wc.DownloadString("http://www.yoursite.com/resource/file.htm");
When you say it took 30 seconds, can you expand on that a little more? There are many reasons why that could have happened: a slow server, a slow internet connection, a dodgy implementation, and so on.
You could go a level lower and implement something like this:
// Requires: using System.Net; using System.IO; using System.Text;
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create("http://www.yoursite.com/resource/file.htm");

// Only needed if you are posting data to the page; for a plain GET, skip the request body entirely
webRequest.Method = "POST";
string requestData = "name=value";   // placeholder request body
using (StreamWriter streamWriter = new StreamWriter(webRequest.GetRequestStream(), Encoding.UTF8))
{
    streamWriter.Write(requestData);
}

string responseData = string.Empty;
using (HttpWebResponse httpResponse = (HttpWebResponse)webRequest.GetResponse())
using (StreamReader responseReader = new StreamReader(httpResponse.GetResponseStream()))
{
    responseData = responseReader.ReadToEnd();
}
However, at the end of the day the WebClient class wraps up this functionality for you. So I would suggest that you use WebClient and investigate the causes of the 30 second delay.
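As a rough way of narrowing the delay down, you could time the call itself with a Stopwatch; a minimal sketch, reusing the placeholder URL from above:
// Requires: using System; using System.Diagnostics; using System.Net;
Stopwatch sw = Stopwatch.StartNew();

WebClient wc = new WebClient();
string webData = wc.DownloadString("http://www.yoursite.com/resource/file.htm");

sw.Stop();
Console.WriteLine("Download took {0} ms for {1} characters", sw.ElapsedMilliseconds, webData.Length);
If the stopwatch reports most of the 30 seconds, the time is being spent in the request itself rather than in your own processing code.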
Regarding the suggestion "So I would suggest that you use WebClient and investigate the causes of the 30 second delay":
According to the answers to the question "System.Net.WebClient unreasonably slow", try setting Proxy to null:
WebClient wc = new WebClient();
wc.Proxy = null;   // skip automatic proxy detection, which can add several seconds to the first request
Credit to Alex Burtsev
// Download a page and extract the URLs it contains.
// Requires: using System.Collections.Generic; using System.IO;
//           using System.Net; using System.Text.RegularExpressions;
WebClient client = new WebClient();

// 'Text' is assumed to hold the URL of the page to download
using (Stream data = client.OpenRead(Text))
using (StreamReader reader = new StreamReader(data))
{
    string content = reader.ReadToEnd();

    // Match http/https/ftp/etc. links in the downloaded text
    string pattern = @"((https?|ftp|gopher|telnet|file|notes|ms-help):((//)|(\\\\))+[\w\d:#@%/;$()~_?\+-=\\\.&]*)";
    MatchCollection matches = Regex.Matches(content, pattern);

    List<string> urls = new List<string>();
    foreach (Match match in matches)
    {
        urls.Add(match.Value);
    }
}