I have used SQL in Spark, for example:

results = spark.sql("select * from ventas")

where ventas is a DataFrame, previously registered in the catalog as a table or view.
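For reference, here is a minimal sketch of how a DataFrame like ventas gets registered before it can be queried by name; the CSV path and read options are hypothetical:

# assumes a SparkSession named spark already exists; the path is hypothetical
ventas = spark.read.csv("/data/ventas.csv", header=True, inferSchema=True)

# register the DataFrame in the catalog so spark.sql can refer to it by name
ventas.createOrReplaceTempView("ventas")

results = spark.sql("select * from ventas")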
Next, create df1 as a JavaObject. This is what you get when sqlcontext is the JVM-side SQLContext reached through py4j rather than the Python wrapper (a plain pyspark.sql.SQLContext would return a DataFrame here as well):
df1 = sqlcontext.sql("select col1, col2, col3 from table")
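For context, a hedged sketch of one way such a JVM-side sqlcontext can come about; _jsparkSession is a private PySpark attribute, so treat this as illustrative rather than an API to rely on:

# illustrative only: reach the Java SQLContext through the py4j gateway
# (_jsparkSession is PySpark-internal, used here as an assumption)
sqlcontext = spark._jsparkSession.sqlContext()

# calls made through py4j return JavaObject wrappers, not PySpark DataFrames
df1 = sqlcontext.sql("select col1, col2, col3 from table")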
Next, create df2 as a DataFrame:
df2 = spark.sql("select col1, col2, col3 from table")
Check the difference using type(df2) and type(df1):
type(df2)
type(df1)
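For reference, on the setup assumed above the two calls print something like this (exact module paths can vary across Spark versions):

type(df2)   # <class 'pyspark.sql.dataframe.DataFrame'>
type(df1)   # <class 'py4j.java_gateway.JavaObject'>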