When running "sbt package" from the command line for a small Spark Scala application, I'm getting the "value $ is not a member of StringContext" compilation error.
You need to make sure you import sqlContext.implicits._
That gets you the implicit class StringToColumn extends AnyRef, which is documented as:
Converts $"col name" into a Column.
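A minimal sketch of how this plays out (df and its column "name" are hypothetical; sqlContext is assumed to already exist, as it does in a Spark shell):

```scala
// Brings the StringToColumn implicit into scope, which is what
// makes the $"..." string interpolator compile.
import sqlContext.implicits._

// Without the import above, the next line fails with
// "value $ is not a member of StringContext".
df.select($"name").show()
```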
Great answers. If resolving the import is a concern, then this will work:
import org.apache.spark.sql.{SparkSession, SQLContext}
import org.apache.spark.sql.functions.not  // needed for not(...)

val ss = SparkSession.builder().appName("test").getOrCreate()
val dataDf = ...
import ss.sqlContext.implicits._

dataDf.filter(not($"column_name1" === "condition"))
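For what it's worth, the same filter can be written without the not function by using the =!= operator on Column, so only the implicits import is needed (a sketch, with dataDf as above):

```scala
// Equivalent negated-equality filter using Column's =!= operator.
dataDf.filter($"column_name1" =!= "condition")
```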
The $-notation for columns can be used by importing the implicits on the SparkSession object (spark):
val spark = org.apache.spark.sql.SparkSession.builder
  .master("local")
  .appName("App name")
  .getOrCreate()
import spark.implicits._
Then your code with the $-notation works:
val joined = ordered.join(empLogins, $"login" === $"username", "inner")
.orderBy($"count".desc)
.select("login", "count")
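If you'd rather not depend on the implicits at all, the col function from org.apache.spark.sql.functions produces the same Column objects. A sketch of the join above rewritten that way (ordered and empLogins are the DataFrames from the original code):

```scala
import org.apache.spark.sql.functions.col

// col("login") is equivalent to $"login", with no implicit import required.
val joined = ordered.join(empLogins, col("login") === col("username"), "inner")
  .orderBy(col("count").desc)
  .select("login", "count")
```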