PySpark sqlContext read Postgres 9.6 NullPointerException
Question: I'm trying to read a table from a Postgres DB with PySpark. I have set up the following code and verified that the SparkContext exists:

```python
import os

# Make the Postgres JDBC driver available to the driver and executors.
os.environ['PYSPARK_SUBMIT_ARGS'] = (
    '--driver-class-path /tmp/jars/postgresql-42.0.0.jar '
    '--jars /tmp/jars/postgresql-42.0.0.jar pyspark-shell'
)

from pyspark import SparkContext, SparkConf

conf = SparkConf()
conf.setMaster("local[*]")
conf.setAppName('pyspark')
sc = SparkContext(conf=conf)

from pyspark.sql import SQLContext

properties = {
    "driver": "org.postgresql.Driver"
}
```
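For context, a typical JDBC read with this setup looks like the sketch below. The host, port, database, table name, and credentials are placeholders, not values from the question; only the driver class and the `sqlContext.read.jdbc` call are standard.

```python
from pyspark.sql import SQLContext

sqlContext = SQLContext(sc)

# Placeholder connection details -- substitute your own host, database,
# table, and credentials.
url = "jdbc:postgresql://localhost:5432/mydb"
properties = {
    "user": "myuser",
    "password": "mypassword",
    "driver": "org.postgresql.Driver",
}

# DataFrameReader.jdbc loads the table through the JDBC driver that was
# supplied via --driver-class-path / --jars above.
df = sqlContext.read.jdbc(url=url, table="mytable", properties=properties)
df.printSchema()
```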