reading a file in hdfs from pyspark

太阳男子  ·  2021-02-02 01:40

I'm trying to read a file in my HDFS. Here's a listing of my Hadoop file structure:

hduser@GVM:/usr/local/spark/bin$ hadoop fs -ls -R /
drwxr-xr-x   - hduser s         


        
4 Answers
  •  时光取名叫无心
    2021-02-02 02:13

    Since you don't provide the authority, the URI should look like this:

    hdfs:///inputFiles/CountOfMonteCristo/BookText.txt
    

    Otherwise inputFiles is interpreted as a hostname. With a correct configuration you shouldn't need the scheme at all and can use:

    /inputFiles/CountOfMonteCristo/BookText.txt
    

    instead.
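    For completeness, here is a minimal PySpark sketch of both forms, assuming fs.defaultFS is configured to point at your namenode; the app name below is just an illustration:

    from pyspark import SparkConf, SparkContext

    # Minimal setup; the application name is arbitrary.
    conf = SparkConf().setAppName("ReadHDFSExample")
    sc = SparkContext(conf=conf)

    # With fs.defaultFS set (core-site.xml pointing at the namenode),
    # a plain absolute path resolves against HDFS:
    text = sc.textFile("/inputFiles/CountOfMonteCristo/BookText.txt")

    # Equivalent form with the scheme but an empty authority (three slashes);
    # the namenode is then taken from the cluster configuration:
    # text = sc.textFile("hdfs:///inputFiles/CountOfMonteCristo/BookText.txt")

    # Simple sanity check: count the lines of the file.
    print(text.count())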
