webclient

Security rules for subclassing a transparent type with a security-safe-critical constructor in Silverlight

微笑、不失礼 submitted on 2019-12-04 11:45:56
In the Silverlight (v4.0) security model, Shawn Farkas says of deriving from types: [...] we see that application types may only derive from other application types or transparent platform types. (*) The * part of this is: (*) This is true in the 99.9% case. There is another rule about the visibility of the default constructor of a class (which we'll get into next week when we dig deeper into the security model), which also requires that the base class's default constructor (if it has one) must be transparent as well. Practically speaking, you're not generally going to find interesting transparent

Read response header from WebClient in C#

China☆狼群 submitted on 2019-12-04 10:27:24
Question: I'm trying to create my first Windows client (and this is my first post here), which will communicate with a web service, but I'm having trouble reading the response header that comes back. In my response string I receive a nice JSON document (and that is my next problem), but I'm not able to see/read the header in the response, only the body. Below is the code I'm using:

WebClient MyClient = new WebClient();
MyClient.Headers.Add("Content-Type", "application/json");
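A minimal sketch of how the response headers can be read: WebClient.ResponseHeaders is populated once a request has completed. The endpoint URL and request body here are placeholders, not the asker's actual service.

```csharp
using System;
using System.Net;

class ResponseHeaderExample
{
    static void Main()
    {
        var client = new WebClient();
        client.Headers.Add("Content-Type", "application/json");

        // Hypothetical endpoint; substitute the real web service URL.
        string body = client.UploadString("https://example.com/api", "{}");

        // ResponseHeaders holds the headers of the last completed response.
        WebHeaderCollection headers = client.ResponseHeaders;
        foreach (string key in headers.AllKeys)
            Console.WriteLine("{0}: {1}", key, headers[key]);
    }
}
```

Note that ResponseHeaders is null before any request has been made, so it must be read after the call, not before.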

C# WebClient OpenRead url

可紊 submitted on 2019-12-04 10:23:23
So I have this program that fetches a page using a short link (I used the Google URL shortener). To build my example I used code from "Using WebClient in C# is there a way to get the URL of a site after being redirected?":

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using System.Net;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            MyWebClient client = new MyWebClient();
            client.OpenRead("http://tinyurl.com/345yj7x");
            Uri uri = client.ResponseUri;
            Console.WriteLine(uri.AbsoluteUri);
            Console.Read();
        }
    }

    class
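The MyWebClient class is cut off in the excerpt above. A common sketch of what such a subclass looks like (not necessarily the exact code from the linked answer) captures the final URI by overriding GetWebResponse, since WebClient follows redirects internally:

```csharp
using System;
using System.Net;

// Sketch of a WebClient subclass that exposes the URI the request
// actually ended up at after any redirects were followed.
class MyWebClient : WebClient
{
    public Uri ResponseUri { get; private set; }

    protected override WebResponse GetWebResponse(WebRequest request)
    {
        WebResponse response = base.GetWebResponse(request);
        // WebResponse.ResponseUri reflects the post-redirect address.
        ResponseUri = response.ResponseUri;
        return response;
    }
}
```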

Silverlight 4, subclassing WebClient

百般思念 submitted on 2019-12-04 10:22:30
Following advice I saw on several web pages (for example "Using CookieContainer with WebClient class"), I subclassed the WebClient class to use a cookie with it:

public class MyWebClient : System.Net.WebClient { }

Now, when I initialize MyWebClient:

MyWebClient wc = new MyWebClient();

it throws a TypeLoadException. My OS is Windows 7 (Japanese), so the error message is not in English; I can see it is related to security rules. What might be the problem?

Stephen McDaniel: WebClient's constructor is marked with the SecuritySafeCritical attribute. And it looks like that is what is causing the security
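For context, this is the usual cookie-aware subclass pattern the asker was following, sketched for desktop .NET. On Silverlight 4 the mere act of subclassing fails with TypeLoadException, because WebClient's constructor is security-safe-critical and a transparent application type may not derive from it, as the answer explains.

```csharp
using System;
using System.Net;

// Standard cookie-carrying WebClient pattern (works on desktop .NET,
// not on Silverlight 4 for the security-transparency reason above).
public class CookieAwareWebClient : WebClient
{
    public CookieContainer Cookies { get; private set; }

    public CookieAwareWebClient()
    {
        Cookies = new CookieContainer();
    }

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        var httpRequest = request as HttpWebRequest;
        if (httpRequest != null)
            httpRequest.CookieContainer = Cookies;  // attach shared cookie jar
        return request;
    }
}
```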

Trying to get authentication cookie(s) using HttpWebRequest

血红的双手。 submitted on 2019-12-04 09:44:30
Question: I have to scrape a table from a secure site and I'm having trouble logging in to the page and retrieving the authentication token and any other associated cookies. Am I doing something wrong here?

public NameValueCollection LoginToDatrose()
{
    var loginUriBuilder = new UriBuilder();
    loginUriBuilder.Host = DatroseHostName;
    loginUriBuilder.Path = BuildURIPath(DatroseBasePath, LOGIN_PAGE);
    loginUriBuilder.Scheme = "https";
    var boundary = Guid.NewGuid().ToString();
    var postData = new
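A sketch of the general approach: POST the login form with a CookieContainer attached so that any Set-Cookie headers (including the auth token) are captured. The URL and form field names here are hypothetical placeholders, not the site's real ones.

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

class LoginSketch
{
    // Returns a cookie container holding whatever cookies the login
    // response set; reuse it on subsequent requests to stay logged in.
    static CookieContainer Login(string loginUrl, string user, string pass)
    {
        var cookies = new CookieContainer();
        var request = (HttpWebRequest)WebRequest.Create(loginUrl);
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.CookieContainer = cookies;  // without this, Set-Cookie is discarded

        // Field names are assumptions for illustration.
        byte[] body = Encoding.UTF8.GetBytes(
            "username=" + Uri.EscapeDataString(user) +
            "&password=" + Uri.EscapeDataString(pass));
        using (Stream s = request.GetRequestStream())
            s.Write(body, 0, body.Length);

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            // The auth cookie(s) are now stored in the container.
            return cookies;
        }
    }
}
```

A frequent cause of "missing" auth cookies with HttpWebRequest is forgetting to assign CookieContainer, since it is null by default and response cookies are then silently dropped.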

C# detect page redirect

烈酒焚心 submitted on 2019-12-04 09:28:33
Question: I am trying to determine whether a qualification exists on http://www.accreditedqualifications.org.uk in the form http://www.accreditedqualifications.org.uk/qualification/50084811.seo.aspx, 50084811 being a qualification aim entered by the end user. If they enter an invalid one, e.g. http://www.accreditedqualifications.org.uk/qualification/50084911.seo.aspx, they are redirected to an error page (with incorrect HTTP headers, as far as I can see). Is there a way to detect the redirect in C#? I would
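One common way to detect this, sketched below: turn off automatic redirect following and inspect the status code yourself. With AllowAutoRedirect set to false, a 3xx response is returned to the caller instead of being followed.

```csharp
using System;
using System.Net;

class RedirectCheck
{
    // Returns true if the server answers with a 3xx redirect
    // (e.g. to its error page) rather than serving the page directly.
    static bool IsRedirected(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.AllowAutoRedirect = false;  // hand 3xx responses back to us
        using (var response = (HttpWebResponse)request.GetResponse())
        {
            int code = (int)response.StatusCode;
            return code >= 300 && code < 400
                && response.Headers["Location"] != null;
        }
    }
}
```

Since the asker notes the error page's headers look wrong, checking the Location header's target as well as the status code may be more robust in practice.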

How to transfer multiple files from FTP server to local directory using C#?

纵然是瞬间 submitted on 2019-12-04 08:13:04
Question: I can transfer one file from an FTP server to a local directory using the following code:

using (WebClient ftpClient = new WebClient())
{
    ftpClient.Credentials = new System.Net.NetworkCredential("username", "password");
    ftpClient.DownloadFile("ftp://website.com/abcd.docx", @"D:\WestHam\test.docx");
}

But I don't know how to transfer multiple files. Can anybody help me with this?

Answer 1: Use this code, just replace the user credentials:

static void Main(string[] args)
{
    FtpWebRequest ftpRequest =
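The answer's code is truncated above; the general technique it relies on can be sketched as two steps: list the remote directory with FtpWebRequest, then download each listed entry. Server address, credentials, and local path are the placeholders from the question.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;

class FtpFolderDownload
{
    static void Main()
    {
        // Step 1: ask the server for the names in the directory.
        var list = (FtpWebRequest)WebRequest.Create("ftp://website.com/");
        list.Method = WebRequestMethods.Ftp.ListDirectory;
        list.Credentials = new NetworkCredential("username", "password");

        var names = new List<string>();
        using (var response = list.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
                names.Add(line);
        }

        // Step 2: download each file in turn.
        using (var client = new WebClient())
        {
            client.Credentials = new NetworkCredential("username", "password");
            foreach (string name in names)
                client.DownloadFile("ftp://website.com/" + name,
                                    Path.Combine(@"D:\WestHam", name));
        }
    }
}
```

Note that ListDirectory returns subdirectory names alongside file names, so a real implementation would need to distinguish the two (e.g. via ListDirectoryDetails) before downloading.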

Google OAuth - Keeping the Client ID Secret

不打扰是莪最后的温柔 submitted on 2019-12-04 07:36:53
When using OAuth in the Google Cloud Endpoints JavaScript client, how do you preserve the secrecy of the client ID? How to implement OAuth in the Google Cloud Endpoints JavaScript client is detailed here. In the code snippet below the client ID is passed as a parameter to the OAuth method:

gapi.auth.authorize({client_id: CLIENT_ID, scope: SCOPES, immediate: mode}, callback);

Since the end user will receive the script file in clear text, regardless of the use of HTTPS, how would you avoid handing the client ID over to every user you serve? After all, it would be rather simple to comb the

Readable debug logging for http requests with spring webclient

人盡茶涼 submitted on 2019-12-04 07:36:09
I'm using the Spring reactive WebClient to send requests to an HTTP server. In order to view the underlying request and response being sent, I enabled debug logging for the reactor.ipc.netty package. The headers of outgoing requests can be viewed normally. Though I'm sending and receiving plain text over HTTP, the log contains the requests and responses in the format below (is it hex?). I'm not sure how to view the logged data in an easy-to-understand way, or better yet, how to log the request and response in an understandable form. Here is a snippet of the logged data:

+---------------------------------------------

How to download a whole folder of files/subfolders from the web in PowerShell

*爱你&永不变心* submitted on 2019-12-04 07:26:52
I can download a single file from the web using:

$wc = New-Object System.Net.WebClient
$wc.DownloadFile("http://blah/root/somefile.ext", "C:\Downloads\www\blah\root\somefile.ext")

But how do I download all the files, including subfolders? Something like the following would be nice:

$wc.DownloadFile("http://blah/root/", "C:\Downloads\www\blah\root\")

The root folder itself appears as a directory listing in IE, you know, like:

[To Parent Directory]
01 July 2012 09:00    1234 somefile.ext
01 July 2012 09:01    1234 someotherfile.ext

As a bonus, how would I download just the files in the root