pyspark : NameError: name 'spark' is not defined

Asked by 余生分开走 on 2020-12-24 08:40

I am copying the pyspark.ml example from the official documentation: http://spark.apache.org/docs/latest/api/python/pyspark.ml.html#pyspark.ml.Transformer. When I run it, I get: NameError: name 'spark' is not defined.



        
3 Answers
  • 2020-12-24 08:46

    You can add

    from pyspark.context import SparkContext
    from pyspark.sql.session import SparkSession
    sc = SparkContext('local')   # start a local SparkContext
    spark = SparkSession(sc)     # wrap it in a SparkSession named spark
    

    to the beginning of your code to define a SparkSession; then spark.createDataFrame() should work.
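
    For example, a minimal sketch of the call the question is about, assuming the session created above; the rows and column name here are illustrative, not from the original post:

    from pyspark.ml.linalg import Vectors

    # illustrative rows; the post's actual data was not shown
    data = [(Vectors.dense([0.0]),), (Vectors.dense([1.0]),), (Vectors.dense([2.0]),)]
    df = spark.createDataFrame(data, ["features"])
    df.show()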

  • 2020-12-24 09:02

    Answer by 率怀一 is good and will work the first time. But the second time you run it, it will throw the following exception:

    ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=pyspark-shell, master=local) created by __init__ at <ipython-input-3-786525f7559f>:10 
    

    There are two ways to avoid it.

    1) Using SparkContext.getOrCreate() instead of SparkContext():

    from pyspark.context import SparkContext
    from pyspark.sql.session import SparkSession
    sc = SparkContext.getOrCreate()
    spark = SparkSession(sc)
    

    2) Calling sc.stop() at the end, or before you start another SparkContext, as sketched below.
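
    A minimal sketch of the second approach (the actual work done with the session is elided):

    from pyspark.context import SparkContext
    from pyspark.sql.session import SparkSession

    sc = SparkContext('local')
    spark = SparkSession(sc)
    # ... use spark here ...
    sc.stop()  # release the context so the next SparkContext() call won't raise ValueError

    sc = SparkContext('local')  # safe now: no other SparkContext is running
    spark = SparkSession(sc)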

  • 2020-12-24 09:03

    Since you are calling createDataFrame(), you need to do this:

    df = sqlContext.createDataFrame(data, ["features"])
    

    instead of this:

    df = spark.createDataFrame(data, ["features"])
    

    Here, spark plays the same role as sqlContext.


    In general, some people bind that context to a variable named sc, so if the above didn't work, you could try:

    df = sc.createDataFrame(data, ["features"])
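
    If sqlContext is not defined in your session either, here is a minimal sketch of one way to create it from an existing SparkContext named sc (note that SQLContext is a legacy entry point; SparkSession is preferred in Spark 2.x and later):

    from pyspark.sql import SQLContext

    # assumes an existing SparkContext named sc
    sqlContext = SQLContext(sc)
    data = [(1.0,), (2.0,)]  # illustrative rows; the post's actual data was not shown
    df = sqlContext.createDataFrame(data, ["features"])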
    