In PySpark, I want to calculate the correlation between two DataFrame vectors, using the following code (I have no problem importing pyspark or calling createDataFrame):
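For reference, a minimal sketch of this kind of correlation computation, assuming two numeric columns assembled into a vector column (the data and the column names "x" and "y" are illustrative, not the original snippet):

from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.stat import Correlation

spark = SparkSession.builder.getOrCreate()

# Toy data; column names are illustrative
df = spark.createDataFrame([(1.0, 2.0), (2.0, 4.1), (3.0, 6.2)], ["x", "y"])

# Assemble the two columns into a single vector column, then compute the Pearson correlation matrix
features = VectorAssembler(inputCols=["x", "y"], outputCol="features").transform(df)
corr_matrix = Correlation.corr(features, "features").head()[0]
print(corr_matrix)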
There's a (now resolved) issue tracking this:
https://issues.apache.org/jira/browse/SPARK-27335?jql=text%20~%20%22setcallsite%22
[Note: since the issue is marked resolved, if you're using a Spark release newer than October 2019 and still encounter this error, please report it on the Apache Jira.]
The poster suggests forcing your DataFrame's backing session to sync with your active Spark session and context:
# Re-point the DataFrame's underlying JVM session and SparkContext at the active ones
df.sql_ctx.sparkSession._jsparkSession = spark._jsparkSession
df._sc = spark._sc
This worked for us; hopefully it works in other cases as well.
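For context, a minimal end-to-end sketch of the workaround, assuming df is the mismatched DataFrame with the illustrative "x" and "y" columns from the sketch above and spark is the currently active session (these are private attributes, so behaviour may vary across Spark versions):

from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.stat import Correlation

spark = SparkSession.builder.getOrCreate()

# Re-sync df's internals with the active session before calling into pyspark.ml
df.sql_ctx.sparkSession._jsparkSession = spark._jsparkSession
df._sc = spark._sc

# The correlation computation that previously failed should now run
features = VectorAssembler(inputCols=["x", "y"], outputCol="features").transform(df)
print(Correlation.corr(features, "features").head()[0])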