Integrating Spark SQL and Apache Drill through JDBC

Backend · unresolved · 1 answer · 1211 views
Asked by 無奈伤痛 on 2021-01-01 01:37

I would like to create a Spark SQL DataFrame from the results of a query performed over CSV data (on HDFS) with Apache Drill. I successfully configured Spark SQL to make it …

1 Answer
  • 2021-01-01 01:57

    You can add a custom JDBC dialect for this and register it before using the JDBC connector:

    import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
    
    case object DrillDialect extends JdbcDialect {
    
      // Apply this dialect to any connection whose JDBC URL targets Drill
      def canHandle(url: String): Boolean = url.startsWith("jdbc:drill:")
    
      // Drill rejects the double-quoted identifiers Spark emits by default,
      // so pass column names through unchanged
      override def quoteIdentifier(colName: String): String = colName
    }
    
    // Register the dialect before creating any DataFrame over a Drill connection
    JdbcDialects.registerDialect(DrillDialect)
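    With the dialect registered, a Drill query over the CSV data can be exposed as a DataFrame through Spark's standard JDBC reader. A minimal sketch, assuming a drillbit reachable at `localhost:31010`, the Drill JDBC driver on Spark's classpath, and a CSV file visible through Drill's `dfs` storage plugin (the host, path, and alias below are illustrative, not taken from the question):

    ```scala
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("drill-jdbc-example")
      .getOrCreate()

    // The dbtable option may be a parenthesized subquery, so the CSV query
    // runs inside Drill and only its result set reaches Spark.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:drill:drillbit=localhost:31010")      // placeholder host/port
      .option("driver", "org.apache.drill.jdbc.Driver")
      .option("dbtable", "(SELECT * FROM dfs.`/data/example.csv`) AS t") // placeholder path
      .load()

    df.printSchema()
    ```

    Because the dialect's `quoteIdentifier` returns column names unquoted, Spark's generated SQL stays compatible with Drill's identifier rules; without it, the quoted column names Spark emits would fail on the Drill side.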
    