Error while copying files from the local file system to HDFS in Hadoop


Question


I am using Hadoop to process files. At the moment I am trying to copy files from the local file system to HDFS using the command below:

hadoop fs -put d:\hadoop\weblogs /so/data/weblogs 

I got the error below:

c:\Hadoop\hadoop-1.1.0-SNAPSHOT>hadoop fs -put d:\hadoop\weblogs /so/data/weblogs
12/12/03 19:05:16 WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /so/data/weblogs/weblogs/u_ex12110418.log could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1557)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:695)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1135)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)

        at org.apache.hadoop.ipc.Client.call(Client.java:1070)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
        at $Proxy1.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at $Proxy1.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3518)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3381)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2500(DFSClient.java:2593)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2833)

12/12/03 19:05:16 WARN hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
12/12/03 19:05:16 WARN hdfs.DFSClient: Could not get block locations. Source file "/so/data/weblogs/weblogs/u_ex12110418.log" - Aborting...
put: java.io.IOException: File /so/data/weblogs/weblogs/u_ex12110418.log could only be replicated to 0 nodes, instead of 1
12/12/03 19:05:16 ERROR hdfs.DFSClient: Exception closing file /so/data/weblogs/weblogs/u_ex12110418.log : org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /so/data/weblogs/weblogs/u_ex12110418.log could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1557)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:695)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1135)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)

org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /so/data/weblogs/weblogs/u_ex12110418.log could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1557)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:695)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1135)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)

        at org.apache.hadoop.ipc.Client.call(Client.java:1070)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
        at $Proxy1.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at $Proxy1.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3518)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3381)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2500(DFSClient.java:2593)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2833)

c:\Hadoop\hadoop-1.1.0-SNAPSHOT>

Can anyone please let me know what is wrong with the above command, and what needs to be done to avoid this error?


Answer 1:


The message "could only be replicated to 0 nodes, instead of 1" means the NameNode accepted the file but could not find a single live DataNode with free space to hold its blocks. That usually comes down to one of the following (the command after this list is one way to check the first three):

  • The remote system with HDFS (as specified in the config) has not been started
  • A network problem is preventing connectivity to the remote system
  • The HDFS file system has run out of disk space
  • The wrong remote system is configured
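
A quick way to check these at once is to ask the NameNode for its cluster report (a sketch, assuming the hadoop executable is on your PATH and reads the same configuration your put command used):

c:\Hadoop\hadoop-1.1.0-SNAPSHOT>hadoop dfsadmin -report

If this command hangs or fails, suspect connectivity or a wrong fs.default.name in core-site.xml; if it reports 0 live DataNodes or no remaining DFS capacity, you have found the cause of the replication failure.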



Answer 2:


The DataNode of your Hadoop cluster is not running. Check it using the jps command, as sketched below.
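
For example (a hypothetical listing; the process IDs are illustrative), a healthy single-node Hadoop 1.x setup should show a DataNode entry:

c:\Hadoop\hadoop-1.1.0-SNAPSHOT>jps
2128 NameNode
2456 SecondaryNameNode
2784 DataNode
3012 JobTracker
3340 TaskTracker
3668 Jps

If DataNode is missing from the list, check its log in the Hadoop logs directory for the startup failure and restart it. On a typical 1.x install this would be bin/hadoop-daemon.sh start datanode, or bin/start-dfs.sh to bring up all HDFS daemons; the exact startup script may differ on a Windows build like the one shown in the question.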



Source: https://stackoverflow.com/questions/13684637/error-while-copying-the-files-from-local-file-system-to-hdfs-in-hadoop
