“INSERT INTO …” with SparkSQL HiveContext

你的背包 2021-02-05 05:32

I'm trying to run an insert statement with my HiveContext, like this:

hiveContext.sql('insert into my_table (id, score) values (1, 10)')

This throws an "Unsupported language features in query" exception.

6 answers
  • 2021-02-05 05:43

    The accepted answer saveAsTable fails for me with an AnalysisException (I don't understand why). What works for me instead is:

    data = hc.sql("select 1 as id, 10 as score")
    data.write.mode("append").insertInto("my_table")
    

    I'm using Spark v2.1.0.

  • 2021-02-05 05:45

    Data can be appended to a Hive table using the append mode on the DataFrameWriter.

    data = hc.sql("select 1 as id, 10 as score")
    data.write.mode("append").saveAsTable("my_table")
    

    This gives the same result as an insert.

  • You tried to perform something that the data file format cannot do, hence the Unsupported language features in query exception.

    Many data file formats are write-once and do not support ACID operations.

    Apache ORC supports ACID operations if you need them.

    Instead, you can use partitioning to split your data into folders (/data/year=2017/month=10....), and append/insert data into your data lake there.
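    The partitioned layout described above can be sketched in PySpark; the path, table columns, and session setup here are illustrative assumptions, not from the original answer:

    ```python
    from pyspark.sql import SparkSession

    # Local session for illustration; a real job would already have one.
    spark = (SparkSession.builder
             .master("local[1]")
             .appName("partition-append-sketch")
             .getOrCreate())

    # Illustrative rows carrying the partition columns year and month.
    data = spark.sql("select 1 as id, 10 as score, 2017 as year, 10 as month")

    # Each (year, month) pair becomes a folder such as
    # /tmp/data_lake_demo/year=2017/month=10/; "append" mode adds new
    # files without rewriting existing partitions.
    (data.write
         .mode("append")
         .partitionBy("year", "month")
         .parquet("/tmp/data_lake_demo"))
    ```

    Readers then prune on the partition columns (e.g. where year = 2017), so only the matching folders are scanned.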

  • 2021-02-05 05:49

    I've had the same problem (Spark 1.5.1), and tried different versions.

    Given

    sqlContext.sql("create table my_table(id int, score int)")
    

    The only versions that worked looked like this:

    sqlContext.sql("insert into table my_table select t.* from (select 1, 10) t")
    sqlContext.sql("insert into       my_table select t.* from (select 2, 20) t")
    
  • 2021-02-05 05:55

    Try this:

    hiveContext.sql("insert into table my_table select 1, 10")
    

    If you haven't changed your dynamic partition mode to nonstrict, you have to do this first:

    hiveCtx.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

  • 2021-02-05 06:00

    The first time you do this

    data.write.mode("append").saveAsTable("my_table")
    

    you should replace "append" with "overwrite". After that, you can use "append".
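    A minimal sketch of that overwrite-then-append sequence, assuming a local SparkSession and an illustrative table name (saveAsTable here uses Spark's built-in catalog rather than a real Hive metastore):

    ```python
    from pyspark.sql import SparkSession

    # Local session for illustration only.
    spark = (SparkSession.builder
             .master("local[1]")
             .appName("overwrite-then-append-sketch")
             .getOrCreate())

    # First write: "overwrite" creates (or replaces) the table.
    spark.sql("select 1 as id, 10 as score") \
         .write.mode("overwrite").saveAsTable("my_table_demo")

    # Subsequent writes can use "append" to add rows.
    spark.sql("select 2 as id, 20 as score") \
         .write.mode("append").saveAsTable("my_table_demo")
    ```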
