Loading data from GCS using Spark Local

Submitted by ◇◆丶佛笑我妖孽 on 2020-05-31 04:08:06

Question


I am trying to read data from GCS buckets on my local machine, for testing purposes; I would like to sample some of the data in the cloud. I have downloaded the GCS Hadoop Connector JAR and set up the SparkConf as follows:

from pyspark import SparkConf, SparkContext
from pyspark.sql import SparkSession

conf = SparkConf() \
    .setMaster("local[8]") \
    .setAppName("Test") \
    .set("spark.jars", "path/gcs-connector-hadoop2-latest.jar") \
    .set("spark.hadoop.google.cloud.auth.service.account.enable", "true") \
    .set("spark.hadoop.google.cloud.auth.service.account.json.keyfile", "path/to/keyfile")

sc = SparkContext(conf=conf)

spark = SparkSession.builder \
    .config(conf=sc.getConf()) \
    .getOrCreate()

spark.read.json("gs://gcs-bucket")

I have also tried to set the conf like so:

sc._jsc.hadoopConfiguration().set("fs.AbstractFileSystem.gs.impl",  "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS")
sc._jsc.hadoopConfiguration().set("fs.gs.auth.service.account.json.keyfile", "path/to/keyfile")
sc._jsc.hadoopConfiguration().set("fs.gs.auth.service.account.enable", "true")

I am using PySpark installed via pip and running the code with the unit test module from IntelliJ. I get the following error:

py4j.protocol.Py4JJavaError: An error occurred while calling o128.json.
: java.io.IOException: No FileSystem for scheme: gs

What should I do?

Thanks!


Answer 1:


To solve this issue, you need to set the fs.gs.impl property in addition to the properties that you have already configured:

sc._jsc.hadoopConfiguration().set("fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem")


Source: https://stackoverflow.com/questions/55059063/loading-data-from-gcs-using-spark-local
