hadoop3

How to write a table to Hive from Spark without using the warehouse connector in HDP 3.1

Submitted by 旧巷老猫 on 2019-12-01 10:54:01
Question: when trying to use Spark 2.3 on HDP 3.1 to write to a Hive table directly into Hive's schema, without the warehouse connector, using:

    spark-shell --driver-memory 16g --master local[3] --conf spark.hadoop.metastore.catalog.default=hive
    val df = Seq(1,2,3,4).toDF
    spark.sql("create database foo")
    df.write.saveAsTable("foo.my_table_01")

it fails with:

    Table foo.my_table_01 failed strict managed table checks due to the following reason:
    Table is marked as a managed table but is not transactional

but a:
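A commonly reported workaround (the capture above cuts off before any answer) is to give the table an explicit storage path, so Spark registers it as an external table rather than a managed one and HDP's strict managed-table (transactional) check no longer applies. A minimal sketch of this idea, where the HDFS path /tmp/foo/my_table_01 is a hypothetical location of your choosing:

    spark-shell --driver-memory 16g --master local[3] --conf spark.hadoop.metastore.catalog.default=hive
    val df = Seq(1, 2, 3, 4).toDF
    spark.sql("create database foo")
    // Supplying a path makes Spark create an EXTERNAL table
    // (hypothetical path), which is exempt from the strict
    // managed-table checks that reject non-transactional managed tables.
    df.write.option("path", "/tmp/foo/my_table_01").saveAsTable("foo.my_table_01")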

HDFS_NAMENODE_USER, HDFS_DATANODE_USER & HDFS_SECONDARYNAMENODE_USER not defined

Submitted by 久未见 on 2019-11-29 06:18:36
I am new to Hadoop. I'm trying to install Hadoop on my laptop in pseudo-distributed mode. I am running it as the root user, but I'm getting the error below.

    root@debdutta-Lenovo-G50-80:~# $HADOOP_PREFIX/sbin/start-dfs.sh
    WARNING: HADOOP_PREFIX has been replaced by HADOOP_HOME. Using value of HADOOP_PREFIX.
    Starting namenodes on [localhost]
    ERROR: Attempting to operate on hdfs namenode as root
    ERROR: but there is no HDFS_NAMENODE_USER defined. Aborting operation.
    Starting datanodes
    ERROR: Attempting to operate on hdfs datanode as root
    ERROR: but there is no HDFS_DATANODE_USER defined. Aborting
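Hadoop 3 refuses to start a daemon as root unless the matching *_USER variable names the account allowed to run it. The standard fix (the capture ends before an answer) is to define those variables before running start-dfs.sh; a minimal sketch, assuming you really do want the daemons to run as root and that your configuration lives in $HADOOP_HOME/etc/hadoop/hadoop-env.sh:

    # Append to $HADOOP_HOME/etc/hadoop/hadoop-env.sh (or export in ~/.bashrc)
    export HDFS_NAMENODE_USER=root
    export HDFS_DATANODE_USER=root
    export HDFS_SECONDARYNAMENODE_USER=root

Then re-run $HADOOP_HOME/sbin/start-dfs.sh. Running Hadoop daemons as root is generally discouraged; pointing these variables at a dedicated hdfs user is the safer choice.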