webhdfs

How to Authenticate WebHDFS with C#

五迷三道 submitted on 2019-12-23 04:32:23
Question: I have been attempting to upload files into Hadoop using C# via the WebHDFS REST API. This code works fine: using (var client = new System.Net.WebClient()) { string result = client.DownloadString("http://host:50070/webhdfs/v1/user/myuser/?op=LISTSTATUS"); client.DownloadFile("http://host:50070/webhdfs/v1/user/myuser/tbible.txt?user.name=myuser&op=OPEN","d:\tbible.txt"); } This code gets a 403 Forbidden: using (var client = new System.Net.WebClient()) { client.UploadFile("http://host
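A frequent cause of the 403 here is sending the upload as a single request. WebHDFS file creation is a two-step protocol: a PUT with `op=CREATE` goes to the NameNode with no body, the NameNode answers with a 307 redirect, and the file bytes go in a second PUT to the DataNode URL in the Location header. A minimal Python sketch of the step-1 URL (host and user names are the question's placeholders):

```python
def webhdfs_create_url(host, path, user, overwrite=False):
    """Build the step-1 CREATE URL sent to the NameNode (port 50070).

    The NameNode replies with a 307 redirect whose Location header
    points at a DataNode; the file bytes go in a second PUT to that
    Location. Sending the body to the NameNode directly is a common
    source of 403 responses.
    """
    flag = "true" if overwrite else "false"
    return (f"http://{host}:50070/webhdfs/v1{path}"
            f"?op=CREATE&user.name={user}&overwrite={flag}")
```

The same two-step shape applies to `op=APPEND` (POST instead of PUT).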

Accessing Kerberos-protected WebHDFS from a .NET application (console)

你。 submitted on 2019-12-22 09:56:42
Question: I'm unable to access WebHDFS from the browser due to Kerberos security. Can anyone help me with this? Below is the error in the browser for "http://****.****/webhdfs/v1/prod/snapshot_rpx/archive?op=LISTSTATUS&user.name=us": HTTP ERROR 401 Problem accessing /webhdfs/v1/prod/snapshot_rpx/archive. Reason: Authentication required. .NET code for making a request to this URL: HttpWebRequest http = (HttpWebRequest)WebRequest.Create(requestUri); http.Timeout = timeout; http.ContentType = contentType; string
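The 401 "Authentication required" is the server issuing a SPNEGO (Kerberos-over-HTTP) challenge rather than a plain credential failure; the client must respond with an `Authorization: Negotiate <token>` header built from a GSSAPI ticket. A small, illustrative Python check for recognizing such a challenge (the header names are the standard HTTP ones):

```python
def needs_spnego(status_code, headers):
    """Return True when a response is a Kerberos/SPNEGO challenge.

    A secured WebHDFS endpoint replies 401 with a
    'WWW-Authenticate: Negotiate' header; the client must then
    retry with an 'Authorization: Negotiate <base64 GSSAPI token>'
    header obtained from its Kerberos ticket cache.
    """
    challenge = headers.get("WWW-Authenticate", "")
    return status_code == 401 and "Negotiate" in challenge
```

In .NET, setting default/network credentials on the request enables this negotiation automatically when the machine has a valid Kerberos ticket.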

Accessing Kerberos-secured WebHDFS without SPNEGO

为君一笑 submitted on 2019-12-18 07:21:31
Question: I have a working application for managing HDFS using WebHDFS. I need to be able to do this on a Kerberos-secured cluster. The problem is that there is no library or extension to negotiate the ticket for my app; I only have a basic HTTP client. Would it be possible to create a Java service which would handle the ticket exchange and, once it gets the service ticket, just pass it to the app for use in an HTTP request? In other words, my app would ask the Java service to negotiate the tickets
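One established way to get this split is HDFS delegation tokens: a Kerberos-capable helper calls WebHDFS once with `op=GETDELEGATIONTOKEN`, and the basic HTTP client then attaches the returned token to every request via the `delegation` query parameter, with no SPNEGO handshake needed. A sketch of the client side in Python:

```python
def with_delegation_token(url, token):
    """Append an HDFS delegation token to a WebHDFS URL.

    Requests carrying 'delegation=<token>' are accepted by a
    Kerberos-secured WebHDFS endpoint without a SPNEGO handshake,
    so only the helper service that fetched the token needs
    Kerberos libraries; the plain HTTP client just appends it.
    """
    sep = "&" if "?" in url else "?"
    return f"{url}{sep}delegation={token}"
```

Tokens expire and need periodic renewal, which the helper service could also handle.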

Copy a file using WebHDFS

允我心安 submitted on 2019-12-13 18:50:48
Question: Is there a way to copy a file from (let's say) hdfs://old to hdfs://new without first downloading the file and then uploading it again? Answer 1: I don't know about WebHDFS, but this is achievable using hadoop distcp. The command looks something like this: hadoop distcp hdfs://old_nn:8020/old/location/path.file hdfs://new_nn:8020/new/location/path.file Source: https://stackoverflow.com/questions/36445154/copy-a-file-using-webhdfs
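The distcp answer can be scripted from Python when this copy needs automating; a hedged sketch that only builds the argument list (paths are the answer's own examples, and the command must run on a host with Hadoop installed):

```python
def distcp_command(src, dst, hadoop_bin="hadoop"):
    """Build the 'hadoop distcp' argument list for a cluster-side copy.

    distcp runs as a MapReduce job, so data moves between the two
    clusters directly and nothing is downloaded to the client
    machine. Pass the list to subprocess.run(...) to execute it.
    """
    return [hadoop_bin, "distcp", src, dst]

cmd = distcp_command("hdfs://old_nn:8020/old/location/path.file",
                     "hdfs://new_nn:8020/new/location/path.file")
```

For copies between clusters running different Hadoop versions, the source is often addressed via `webhdfs://` instead of `hdfs://`, since WebHDFS is version-agnostic.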

WebHDFS REST API to copy/move files from a Windows server/local folder/desktop to HDFS

余生颓废 submitted on 2019-12-13 16:04:51
Question: Using WebHDFS REST API calls, can I transfer or copy files from a Windows machine (i.e. a Windows server, a local folder, or the desktop) to the Hadoop HDFS file system? If yes, is there any sample command info? I was able to do it as Windows -> (using FTP) -> Linux directory -> (using WebHDFS) -> HDFS, but this is a two-step process, and I am looking for a one-step process directly from Windows -> (WebHDFS) -> HDFS. I referred to https://hadoop.apache.org/docs/r1.0.4/webhdfs.html for helpful info
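A one-step path from Windows works with any HTTP client that can suppress automatic redirects: issue the `op=CREATE` PUT to the NameNode from the Windows machine, read the 307 Location, and PUT the file bytes there, all without an intermediate Linux hop or FTP. A minimal Python sketch of the redirect-suppressing part using only the standard library:

```python
import urllib.request

class NoFollowRedirect(urllib.request.HTTPRedirectHandler):
    """Surface the WebHDFS 307 instead of following it.

    With this handler installed, urllib raises an HTTPError for
    the 307; its .headers carry the DataNode Location to which the
    caller then PUTs the file bytes itself, completing the upload
    in two requests from the same (Windows or other) machine.
    """
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # do not auto-follow; let the caller handle it

opener = urllib.request.build_opener(NoFollowRedirect)
```

`curl -X PUT` against the two URLs achieves the same thing from a Windows shell, since curl does not follow redirects unless `-L` is given.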

Multiple response parsing in python

╄→гoц情女王★ submitted on 2019-12-12 04:17:16
Question: I am using a curl command to access Hadoop (WebHDFS), and I am using Python to parse the HTTP response. But after running the curl command, multiple responses are returned. curl -i "http://host:50070/webhdfs/v1/user/hduser/pigtest?op=GETFILESTATUS" HTTP/1.1 401 Authentication required Cache-Control: no-cache Expires: Thu, 14 Jan 2016 10:04:23 GMT Date: Thu, 14 Jan 2016 10:04:23 GMT Pragma: no-cache Expires: Thu, 14 Jan 2016 10:04:23 GMT Date: Thu, 14 Jan 2016 10:04:23 GMT Pragma: no-cache Content
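The stacked headers appear because `curl -i` prints every response in the exchange, e.g. a 401 authentication challenge followed by the final reply. Since each response begins with a status line like `HTTP/1.1 401 ...`, the raw output can be split on those lines; a sketch with an illustrative two-response sample:

```python
import re

def split_http_responses(raw):
    """Split 'curl -i' output containing several stacked responses.

    Each response starts with a status line such as 'HTTP/1.1 200 OK',
    so a multiline lookahead split yields one chunk per response.
    The JSON body lives in the last chunk, after the blank line
    that terminates its headers.
    """
    chunks = re.split(r"(?m)^(?=HTTP/\d\.\d )", raw)
    return [c for c in chunks if c.strip()]

# Hypothetical sample shaped like the question's curl output.
sample = ("HTTP/1.1 401 Authentication required\r\n"
          "Cache-Control: no-cache\r\n\r\n"
          "HTTP/1.1 200 OK\r\n"
          "Content-Type: application/json\r\n\r\n"
          '{"FileStatus": {}}')
responses = split_http_responses(sample)
body = responses[-1].split("\r\n\r\n", 1)[1]
```

Using a Python HTTP client (e.g. `urllib.request`) instead of shelling out to curl avoids the problem entirely, since it hands back only the final response.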

WebHDFS always redirects to localhost:50075

拜拜、爱过 submitted on 2019-12-12 04:14:58
Question: I have an HDFS cluster (Hadoop 2.7.1) with one namenode, one secondary namenode, and 3 datanodes. When I enable WebHDFS and test it, I find it always redirects to "localhost:50075", which is not configured as a datanode. csrd@secondarynamenode:~/lybica-hdfs-viewer$ curl -i -L "http://10.56.219.30:50070/webhdfs/v1/demo.zip?op=OPEN" HTTP/1.1 307 TEMPORARY_REDIRECT Cache-Control: no-cache Expires: Tue, 01 Dec 2015 03:29:21 GMT Date: Tue, 01 Dec 2015 03:29:21 GMT Pragma: no-cache Expires: Tue, 01 Dec 2015
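The redirect host is whatever name each DataNode registered with the NameNode, so a `localhost:50075` Location usually means the DataNodes resolved their own hostname to the loopback address (often via an `/etc/hosts` entry mapping the hostname to 127.0.0.1). One remedy, assuming that diagnosis, is to pin the advertised name explicitly in each DataNode's hdfs-site.xml (the value below is a placeholder for that node's externally resolvable name):

```xml
<!-- hdfs-site.xml on each DataNode -->
<property>
  <name>dfs.datanode.hostname</name>
  <value>datanode1.example.com</value>
</property>
```

Fixing the `/etc/hosts` entries so each DataNode's hostname resolves to its real interface, then restarting the DataNodes, achieves the same effect without extra configuration.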

Hadoop Rest API for upload / download

◇◆丶佛笑我妖孽 submitted on 2019-12-11 17:27:00
Question: I am trying to upload/download a file from a Hadoop cluster using a C# app, but I couldn't find the APIs for upload and download in the documentation. So can you please let me know how to upload and download files from Hadoop using REST APIs? Thanks. Answer 1: You can use the WebHDFS REST API as described here: http://hadoop.apache.org/docs/r1.0.4/webhdfs.html Edit: Create and Write to a File. Step 1: Submit an HTTP PUT request without automatically following redirects and without sending the
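Both directions go through the same URL scheme: `op=OPEN` with GET for download, `op=CREATE` with PUT for upload (the two-step write the answer starts to describe). A small Python helper illustrating the URL shape, with placeholder host and user names:

```python
def webhdfs_op_url(host, path, user, op):
    """Build a WebHDFS v1 URL for a given file operation.

    op='OPEN'   -> download: GET, and the client follows a 307
                   redirect to the DataNode serving the blocks.
    op='CREATE' -> upload: PUT with no body (step 1); the file
                   bytes go in a second PUT to the 307 Location.
    """
    return f"http://{host}:50070/webhdfs/v1{path}?op={op}&user.name={user}"
```

In C#, the download half maps onto `WebClient.DownloadFile` against the OPEN URL, while upload needs an `HttpWebRequest` with `AllowAutoRedirect = false` so the Location header can be read for step 2.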

WebHDFS returns wrong datanode address

[亡魂溺海] submitted on 2019-12-11 01:59:46
Question: curl -i -X PUT "http://SomeHostname:50070/webhdfs/v1/file1?op=CREATE" HTTP/1.1 307 TEMPORARY_REDIRECT Content-Type: application/octet-stream Location: http://sslave0:50075/webhdfs/v1/file1?op=CREATE&overwrite=false Content-Length: 0 Server: Jetty(6.1.26) Here it returns sslave0 for the datanode, which seems like an internal address to me. Answer 1: With WebHDFS, the NameNode web interface (at port 50070 in your case) accepts the PUT request and assigns the metadata information about the file to be stored. It then
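When the cluster-internal name in the Location header is not resolvable from the client and the cluster configuration cannot be changed, one client-side workaround is to rewrite the hostname before issuing the second request. A hedged Python sketch (the address mapping is hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit

def remap_location(location, host_map):
    """Rewrite the internal DataNode hostname in a 307 Location.

    The NameNode hands back whatever name the DataNode registered
    with (here 'sslave0'); when that name only resolves inside the
    cluster, substitute a client-reachable address before sending
    the file bytes in the follow-up PUT.
    """
    parts = urlsplit(location)
    host, _, port = parts.netloc.partition(":")
    netloc = host_map.get(host, host) + (f":{port}" if port else "")
    return urlunsplit((parts.scheme, netloc, parts.path,
                       parts.query, parts.fragment))
```

Adding the internal names to the client's hosts file, or fronting the cluster with a gateway such as HttpFS that proxies data through a single host, are the less hacky alternatives.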

Docker Kerberos WebHDFS AuthenticationException: Unauthorized

南楼画角 submitted on 2019-12-10 17:16:34
Question: I have a Spring application that reads a file from HDFS using WebHDFS. When I test it in IDEA, it works. But after I build the project and deploy the Docker image on a virtual machine locally or on a server connected to HDFS, I get: AuthenticationException: Unauthorized. On my local machine I have to regularly initialize the token with kinit for authentication. If I don't, I get the same error. I tested the app without Docker on a server, and it also works. I think the Docker image does not see the
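The symptom fits the container lacking a Kerberos ticket: `kinit` run on the host writes the ticket cache outside the container's filesystem. The usual fix, assuming that diagnosis, is to run a non-interactive `kinit` from a keytab inside the container (at startup or on a timer) so a valid ticket always exists where the app runs. A sketch of the command construction, with a hypothetical principal and keytab path:

```python
def kinit_command(principal, keytab_path):
    """Build the non-interactive kinit invocation for a container.

    'kinit -kt <keytab> <principal>' obtains a ticket from a keytab
    file instead of an interactive password, so it can run in a
    Docker entrypoint or a periodic sidecar before the Spring app
    makes WebHDFS calls. The keytab must be mounted or copied into
    the image, along with a krb5.conf pointing at the KDC.
    """
    return ["kinit", "-kt", keytab_path, principal]
```

Alternatively, mounting the host's ticket-cache directory into the container works for local testing, though the keytab approach is what survives ticket expiry unattended.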