PySpark sqlContext read Postgres 9.6 NullPointerException

Submitted by 孤人 on 2019-12-13 13:51:52

Question


I'm trying to read a table from a Postgres DB with PySpark. I have set up the following code and verified that a SparkContext exists:

import os

os.environ['PYSPARK_SUBMIT_ARGS'] = '--driver-class-path /tmp/jars/postgresql-42.0.0.jar --jars /tmp/jars/postgresql-42.0.0.jar pyspark-shell'


from pyspark import SparkContext, SparkConf

conf = SparkConf()
conf.setMaster("local[*]")
conf.setAppName('pyspark')

sc = SparkContext(conf=conf)


from pyspark.sql import SQLContext

properties = {
    "driver": "org.postgresql.Driver"
}
url = 'jdbc:postgresql://tom:@localhost/gqp'

sqlContext = SQLContext(sc)
sqlContext.read \
    .format("jdbc") \
    .option("url", url) \
    .option("driver", properties["driver"]) \
    .option("dbtable", "specimen") \
    .load()

I get the following error:

Py4JJavaError: An error occurred while calling o812.load. : java.lang.NullPointerException

My database is named gqp, the table is specimen, and I have verified the server is running on localhost using the Postgres.app macOS app.


Answer 1:


The URL was the problem!

Originally it was: url = 'jdbc:postgresql://tom:@localhost/gqp'

I removed the tom:@ part, and it worked. The JDBC URL must follow the pattern jdbc:postgresql://ip_address:port/db_name, whereas mine had been copied directly from a Flask project, where SQLAlchemy-style URLs embed the credentials as user:password@host.
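A corrected read might look like the sketch below: the URL carries only host, port, and database, and the username is passed as a JDBC option instead. This assumes the sqlContext from the question, a user tom with an empty password, and Postgres on the default port 5432 — adjust to your setup.

```python
def read_specimen(sqlContext):
    # Host:port/db only -- no user:password@ segment in a JDBC URL.
    url = "jdbc:postgresql://localhost:5432/gqp"
    return (sqlContext.read
            .format("jdbc")
            .option("url", url)
            .option("driver", "org.postgresql.Driver")
            .option("dbtable", "specimen")
            .option("user", "tom")  # credentials go in options, not the URL
            .load())
```

If the user does have a password, add `.option("password", "...")` alongside the user option rather than putting it back into the URL.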

If you're reading this, hope you didn't make this same mistake :)



Source: https://stackoverflow.com/questions/42686003/pyspark-sqlcontext-read-postgres-9-6-nullpointerexception
