The error "Invalid job type for this context" in a Spark SQL job with Spark Job Server

Submitted by 自古美人都是妖i on 2019-12-10 17:36:19

Question


I created a Spark SQL job with Spark Job Server, using HiveContext, following the sample below: https://github.com/spark-jobserver/spark-jobserver/blob/master/job-server-extras/src/spark.jobserver/HiveTestJob.scala

I was able to start the server, but when I run my application (my Scala class, which extends SparkSqlJob), I get the following response:

{
  "status": "ERROR",
  "result": "Invalid job type for this context"
}

Can anyone suggest what is going wrong, or provide a detailed procedure for setting up the job server for Spark SQL?

The code is below:

import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spark.sql.hive.HiveContext
import spark.jobserver.{SparkHiveJob, SparkJobValid, SparkJobValidation}

object newHiveRest extends SparkHiveJob {

  // Accept any job configuration; this job needs no input validation.
  def validate(hive: HiveContext, config: Config): SparkJobValidation = SparkJobValid

  def runJob(hive: HiveContext, config: Config): Any = {

    hive.sql("use default")
    // HiveQL quotes identifiers with backticks, not single quotes.
    val maxRdd = hive.sql("select count(*) from `default`.`passenger`")

    maxRdd.count()
  }
}
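In Spark Job Server, this error typically means the job type does not match the factory that created the context the job was posted to: a job extending SparkHiveJob must run in a context built by HiveContextFactory, and a SparkSqlJob in one built by SQLContextFactory. As a sketch, assuming the server listens on localhost:8090 and the JAR was uploaded under the app name `test` (the context name, app name, and class path below are illustrative):

```shell
# Create a context backed by HiveContextFactory; a default context
# would reject a SparkHiveJob with "Invalid job type for this context".
curl -d "" 'localhost:8090/contexts/hive-ctx?context-factory=spark.jobserver.context.HiveContextFactory'

# Submit the job into that context.
curl -d "" 'localhost:8090/jobs?appName=test&classPath=newHiveRest&context=hive-ctx&sync=true'
```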

Answer 1:


For Spark SQL, you can use the following:

https://github.com/spark-jobserver/spark-jobserver/blob/master/job-server-extras/src/spark.jobserver/SqlTestJob.scala
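The linked SqlTestJob extends SparkSqlJob and receives a SQLContext instead of a HiveContext. A minimal sketch of that shape, assuming the job-server-extras dependency is on the classpath and a table named `passenger` is already registered (the object name and table are illustrative):

```scala
import com.typesafe.config.Config
import org.apache.spark.sql.SQLContext
import spark.jobserver.{SparkJobValid, SparkJobValidation, SparkSqlJob}

// A SparkSqlJob must be submitted to a context created with
// context-factory=spark.jobserver.context.SQLContextFactory.
object PassengerCountJob extends SparkSqlJob {

  def validate(sql: SQLContext, config: Config): SparkJobValidation = SparkJobValid

  def runJob(sql: SQLContext, config: Config): Any = {
    // collect() so the result can be serialized into the HTTP response
    sql.sql("SELECT count(*) FROM passenger").collect()
  }
}
```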



Source: https://stackoverflow.com/questions/35032545/the-error-invalid-job-type-for-this-context-in-spark-sql-job-with-spark-job-se
