Table loaded through Spark not accessible in Hive

暗喜 2021-01-18 12:26

Hive tables created through Spark (PySpark) are not accessible from Hive.

df.write.format("orc").mode("overwrite").saveAsTable("db.table")
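
One common cause of this symptom (an assumption here, since the question doesn't show how the session was created) is that the SparkSession was built without Hive support, in which case saveAsTable registers the table in Spark's built-in catalog rather than the shared Hive metastore, and Hive never sees it. A minimal sketch, assuming hive-site.xml is on Spark's classpath so both engines point at the same metastore; the DataFrame contents are placeholders:

from pyspark.sql import SparkSession

# enableHiveSupport() makes saveAsTable write to the shared Hive metastore
# instead of Spark's built-in catalog. Assumes hive-site.xml is on the
# Spark classpath; "spark-to-hive" is a hypothetical app name.
spark = (
    SparkSession.builder
    .appName("spark-to-hive")
    .enableHiveSupport()
    .getOrCreate()
)

spark.sql("CREATE DATABASE IF NOT EXISTS db")
df = spark.createDataFrame([(1, "a")], ["id", "val"])  # placeholder data
df.write.format("orc").mode("overwrite").saveAsTable("db.table")

# The table should now be listed in the shared metastore:
spark.sql("SHOW TABLES IN db").show()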
2 Answers
  •  栀梦 (OP) 2021-01-18 12:49

    I faced the same issue. After setting the following properties, it works fine.

    -- relax strict mode
    set hive.mapred.mode=nonstrict;
    -- enable predicate pushdown and bucket pruning
    set hive.optimize.ppd=true;
    set hive.optimize.index.filter=true;
    set hive.tez.bucket.pruning=true;
    -- plain EXPLAIN output; run real jobs instead of simple fetch tasks
    set hive.explain.user=false;
    set hive.fetch.task.conversion=none;
    -- enable Hive's lock manager and ACID transaction manager
    set hive.support.concurrency=true;
    set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
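
    Worth noting: these SET commands are session-scoped, so they have to be re-issued in each Hive session (or moved into hive-site.xml to persist). The last two turn on Hive's ACID transaction manager, while the rest relax strict mode and tune query planning, so it isn't obvious which single property resolves the visibility issue.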
    
