Hadoop Directory with Spaces

Backend · Unresolved · 4 answers · 1017 views
清歌不尽 · 2021-01-21 10:18

I'm running into a problem when providing Hadoop a directory that contains spaces, e.g.:

inputDir = /abc/xyz/folder name/abc.txt

Hadoop …

4 Answers
  • 2021-01-21 10:37

    Replacing the space with %20 works for Hadoop shell. As in

    sed 's/ /%20/g'
    

    And in the actual put command

    hadoop fs -put "$inputDir" $putDest
    

    Without the %20 you get a URI exception. (That was my clue to use %20 rather than an escape character \.)

    I realize you're doing this via Java. The fact that you're getting a java.io.FileNotFoundException makes me wonder whether the code is doing something else with inputDir rather than just passing it as the argument to the hadoop put (or an equivalent command). Any check of inputDir outside of Hadoop commands will fail: Java sees it as a path, while Hadoop sees it as a URI.
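    The encode-then-put step above can be sketched as a small shell snippet. This is only an illustration: the paths are examples from the question, and the final hadoop command is left commented out since it needs a running cluster.

    ```shell
    # Example path containing a space (from the question)
    inputDir="/abc/xyz/folder name/abc.txt"

    # Percent-encode spaces so Hadoop parses the destination as a valid URI
    putDest=$(printf '%s' "$inputDir" | sed 's/ /%20/g')

    echo "$putDest"    # /abc/xyz/folder%20name/abc.txt

    # hadoop fs -put "$inputDir" "$putDest"   # actual copy; needs a cluster
    ```

    Note that the local source stays quoted and unencoded (the shell handles the space), while only the HDFS destination is percent-encoded.
    
    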

  • 2021-01-21 10:39

    Try setting the path with set("path", "/abc/xyz/folder\\ name/abc.txt"); note the double backslash.

  • 2021-01-21 10:51
    inputDir = "/abc/xyz/folder name/"
    

    should work. For example,

    hadoop fs -ls "/abc/xyz/folder name/"
    

    works fine.

  • 2021-01-21 10:55

    Hadoop does not handle spaces in input directory paths well.

    Replace each space with _ (or another separator character of your choice) in your directory names before ingesting them.
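    The rename suggested above can be done locally before uploading. A sketch with an example path; the space-to-underscore substitution uses sed so it stays POSIX-portable.

    ```shell
    # Example local directory containing a space
    src="/tmp/folder name"
    mkdir -p "$src"

    # Build the space-free name and rename the directory
    dst=$(printf '%s' "$src" | sed 's/ /_/g')
    mv "$src" "$dst"

    ls -d "$dst"    # /tmp/folder_name
    ```

    After the rename, the path can be passed to hadoop fs -put without any quoting or encoding concerns.
    
    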
