I need some information from a website that isn't mine. In order to get this information I need to log in to the website first, which happens through an HTML login form.
You need to use HttpWebRequest and do a POST. These links should help you get started. The key is to look at the HTML form of the page you're posting from, to see all the parameters the form needs in order to submit the POST.
http://www.netomatix.com/httppostdata.aspx
http://geekswithblogs.net/rakker/archive/2006/04/21/76044.aspx
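For example, if the login page contained a form like the one sketched below (the field names, the hidden field, and the user/pass variables are hypothetical; take the real names from the page's source), the POST body has to carry exactly those field names, including any hidden fields:
// Hypothetical form on the login page:
//   <form action="/account/login" method="post">
//     <input name="username" />
//     <input type="password" name="password" />
//     <input type="hidden" name="returnUrl" value="/" />
//   </form>
//
// The POST body repeats those field names, URL-encoded:
string postData = "username=" + Uri.EscapeDataString(user)
                + "&password=" + Uri.EscapeDataString(pass)
                + "&returnUrl=" + Uri.EscapeDataString("/");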
You'd make the request as though you'd just filled out the form. If the form uses POST, for example, you make a POST request with the same data. If you can't log in directly from the same page you want to scrape, you'll have to track whatever cookies are set by your login request and include them in your scraping request so you stay logged in.
It might look like:
// Requires: using System.IO; using System.Net; using System.Text;

// Log in by posting the form fields the login page expects.
HttpWebRequest http = WebRequest.Create(url) as HttpWebRequest;
http.KeepAlive = true;
http.Method = "POST";
http.ContentType = "application/x-www-form-urlencoded";

string postData = "FormNameForUserId=" + strUserId + "&FormNameForPassword=" + strPassword;
byte[] dataBytes = Encoding.UTF8.GetBytes(postData);
http.ContentLength = dataBytes.Length;
using (Stream postStream = http.GetRequestStream())
{
    postStream.Write(dataBytes, 0, dataBytes.Length);
}

HttpWebResponse httpResponse = http.GetResponse() as HttpWebResponse;
// Probably want to inspect httpResponse.Headers here first

// Reuse the cookies from the login response for the page you want to scrape.
http = WebRequest.Create(url2) as HttpWebRequest;
http.CookieContainer = new CookieContainer();
http.CookieContainer.Add(httpResponse.Cookies);
HttpWebResponse httpResponse2 = http.GetResponse() as HttpWebResponse;
Maybe. In some cases httpResponse.Cookies will be empty, because the Cookies collection is only populated when a CookieContainer has been assigned to the request. Use a shared CookieContainer instead:
CookieContainer cc = new CookieContainer();

HttpWebRequest http = WebRequest.Create(url) as HttpWebRequest;
http.KeepAlive = true;
http.Method = "POST";
http.ContentType = "application/x-www-form-urlencoded";
http.CookieContainer = cc;   // cookies set by the login response land in cc

string postData = "FormNameForUserId=" + strUserId + "&FormNameForPassword=" + strPassword;
byte[] dataBytes = Encoding.UTF8.GetBytes(postData);
http.ContentLength = dataBytes.Length;
using (Stream postStream = http.GetRequestStream())
{
    postStream.Write(dataBytes, 0, dataBytes.Length);
}

HttpWebResponse httpResponse = http.GetResponse() as HttpWebResponse;
// Probably want to inspect httpResponse.Headers here first

// Reuse the same CookieContainer so the session carries over to the second request.
http = WebRequest.Create(url2) as HttpWebRequest;
http.CookieContainer = cc;
HttpWebResponse httpResponse2 = http.GetResponse() as HttpWebResponse;
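From there the protected page can be read out of httpResponse2, for example (a minimal sketch; it assumes the page is UTF-8, so adjust the encoding if the site declares something else):
string html;
using (Stream responseStream = httpResponse2.GetResponseStream())
using (StreamReader reader = new StreamReader(responseStream, Encoding.UTF8))
{
    html = reader.ReadToEnd();
}
// html now holds the markup of url2, ready to be parsed for the data you need.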
You can use a WebBrowser control. Just feed it the URL of the site, then use the DOM to set the username and password into the right fields, and finally send a click to the submit button. This way you only care about the two input fields and the submit button: no cookie handling, no raw HTML parsing, no HTTP sniffing; all of that is done by the browser control.
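A rough sketch of that approach, assuming a WinForms context and a login page that exposes elements with the ids username, password and loginButton (the real ids, the login URL and the credentials are placeholders; take them from the target page):
// Requires a reference to System.Windows.Forms.
using System;
using System.Windows.Forms;

class BrowserLogin
{
    [STAThread]
    static void Main()
    {
        var browser = new WebBrowser { ScriptErrorsSuppressed = true };
        bool submitted = false;

        browser.DocumentCompleted += (sender, e) =>
        {
            if (submitted || browser.Document == null)
                return; // the event fires again for the post-login page

            // Fill the form via the DOM and click the submit button.
            browser.Document.GetElementById("username").SetAttribute("value", "myUser");
            browser.Document.GetElementById("password").SetAttribute("value", "myPassword");
            browser.Document.GetElementById("loginButton").InvokeMember("click");
            submitted = true;
        };

        browser.Navigate("https://example.com/login"); // hypothetical login URL
        Application.Run(); // keep a message pump alive so navigation can complete
    }
}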
As an addition to dlambin's answer, it is necessary to set
http.AllowAutoRedirect = false;
Otherwise, when you call
HttpWebResponse httpResponse = http.GetResponse() as HttpWebResponse;
it will follow the redirect and make another request to the initial URL, and you won't be able to retrieve url2.
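A minimal sketch of where that fits in the CookieContainer version above (url2 and cc are the same placeholders used there):
// On the login request, before calling GetResponse():
http.AllowAutoRedirect = false;

HttpWebResponse httpResponse = http.GetResponse() as HttpWebResponse;

// The session cookies are now stored in cc. If the server answered with a
// redirect, its target is exposed here instead of being followed automatically:
string redirectUrl = httpResponse.Headers[HttpResponseHeader.Location];

// url2 (or redirectUrl) can then be requested with the same CookieContainer.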