I'm running into a problem while providing Hadoop a directory that contains spaces.
e.g.
inputDir = /abc/xyz/folder name/abc.txt
Replacing the space with %20 works for the Hadoop shell. As in
sed 's/ /%20/g'
And in the actual put command
hadoop fs -put "$inputDir" "$putDest"
Without the %20 you get a URI exception. (Which is what gave me the clue to use %20 rather than a backslash escape \ .)
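Putting those pieces together, here is a minimal sketch of the encode-then-put flow. The paths and the destination are illustrative, and the hadoop command itself is commented out since it needs a running cluster:

```shell
#!/bin/sh
# Illustrative paths; substitute your own.
inputDir="/abc/xyz/folder name/abc.txt"
putDest="/user/hadoop/dest"

# Percent-encode spaces so Hadoop's URI parsing accepts the path.
encodedDir=$(printf '%s' "$inputDir" | sed 's/ /%20/g')
echo "$encodedDir"   # /abc/xyz/folder%20name/abc.txt

# hadoop fs -put "$encodedDir" "$putDest"
```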
I realize you're doing this via Java. The fact that you're getting a java.io.FileNotFoundException makes me wonder whether the code is doing something else with inputDir, as opposed to just passing it as the argument to the hadoop put (or an equivalent command). If it does any kind of checking of inputDir outside of Hadoop commands, it will fail: Java sees it as a path, while Hadoop sees it as a URI.
Try using set("path", "/abc/xyz/folder\\ name/abc.txt"); note the double backslash.
inputDir = "/abc/xyz/folder name/"
should work
hadoop fs -ls "/abc/xyz/folder name/"
works fine
Hadoop does not support spaces in input directory paths.
Replace spaces with _ or another separator character in your directory paths.
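If renaming is acceptable, you can strip the spaces before the upload. A hedged sketch in plain shell (the names are illustrative, and the mv/hadoop lines are commented out so the snippet stands alone):

```shell
#!/bin/sh
# Illustrative source name; substitute your own.
src="folder name"

# Replace every space with an underscore.
dst=$(printf '%s' "$src" | tr ' ' '_')
echo "$dst"   # folder_name

# mv "$src" "$dst"                          # rename locally first
# hadoop fs -put "$dst" "/user/hadoop/dest" # then upload the space-free path
```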