Spark - “sbt package” - “value $ is not a member of StringContext” - Missing Scala plugin?

情话喂你 2021-02-12 14:16

When running \"sbt package\" from the command line for a small Spark Scala application, I\'m getting the \"value $ is not a member of StringContext\" compilation error on the fo

3 Answers
  • 2021-02-12 14:34

    You need to make sure you import sqlContext.implicits._

    This gives you implicit class StringToColumn extends AnyRef,

    whose Scaladoc comment reads:

    Converts $"col name" into a Column.

  • 2021-02-12 14:41

    Great answers above. If resolving the import is a concern, the following also works:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.not   // needed for not(...)

    val ss = SparkSession.builder().appName("test").getOrCreate()
    val dataDf = ...

    import ss.sqlContext.implicits._            // brings $ (StringToColumn) into scope
    dataDf.filter(not($"column_name1" === "condition"))
    
  • 2021-02-12 14:42

    In Spark 2.0+

    the $-notation for columns can be used by importing the implicits on the SparkSession object (spark):

    val spark = org.apache.spark.sql.SparkSession.builder
      .master("local")
      .appName("App name")
      .getOrCreate()
    
    import spark.implicits._
    

    Then your code can use the $ notation:

    val joined = ordered.join(empLogins, $"login" === $"username", "inner")
      .orderBy($"count".desc)
      .select("login", "count")
    