Loading Spark data locally: incomplete HDFS URI

小蘑菇 2020-12-21 01:03

I have run into a problem with SBT loading a local CSV file. Basically, I've written a Spark program in Scala (Eclipse) which reads the following file:

    (file snippet and error message not preserved in the original post)
2 Answers
  • 2020-12-21 01:14

    Specify the full HDFS URI, including the host and port:

    val searches = sc.textFile("hdfs://host:port_no/data/searches")

    The defaults are typically:

    host: master
    port_no: 9000
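    The two URI forms in this thread differ only in scheme, host, and port. A minimal sketch using plain `java.net.URI` (an assumption here: its parsing mirrors how Hadoop's `Path` splits such strings) shows what each part resolves to; `master` and `9000` are the assumed defaults from above:

```scala
import java.net.URI

// Parse the two path styles discussed in this thread. Assumption: plain
// java.net.URI parsing matches how Hadoop splits scheme/host/port here.
object UriCheck {
  def main(args: Array[String]): Unit = {
    val hdfs  = new URI("hdfs://master:9000/data/searches")
    val local = new URI("file:///data/searches")

    println(hdfs.getScheme)  // hdfs  -> routed to the namenode
    println(hdfs.getHost)    // master
    println(hdfs.getPort)    // 9000
    println(local.getScheme) // file  -> read from the local filesystem
    println(local.getPath)   // /data/searches
  }
}
```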
    
  • 2020-12-21 01:20

    This should work:

    sc.textFile("file:///data/searches")
    

    From your error it seems like Spark is loading a Hadoop configuration. This can occur when you have a Hadoop conf file on the classpath or a Hadoop environment variable set (like HADOOP_CONF_DIR), in which case bare paths default to HDFS rather than the local filesystem.
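    One quick way to check that hypothesis is to inspect the environment from the driver before creating the context. A small sketch, assuming `HADOOP_CONF_DIR` is the variable involved (your setup may use a different one, e.g. `HADOOP_HOME`):

```scala
// Sketch: report whether a Hadoop configuration directory is visible to
// the JVM. If it is, bare paths may resolve to HDFS instead of file://.
object HadoopEnvCheck {
  def main(args: Array[String]): Unit = {
    sys.env.get("HADOOP_CONF_DIR") match {
      case Some(dir) => println(s"Hadoop conf picked up from: $dir")
      case None      => println("HADOOP_CONF_DIR not set; bare paths resolve locally")
    }
  }
}
```

    Either way, using an explicit `file:///` scheme, as above, bypasses the default filesystem entirely.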
