./spark-shell doesn't start correctly (spark-1.6.1-bin-hadoop2.6 version)

野趣味 2021-01-20 23:45

I installed this spark version: spark-1.6.1-bin-hadoop2.6.tgz.

Now when I start Spark with the ./spark-shell command I'm getting these issues (it shows a lot of errors):

2 Answers
  • 2021-01-21 00:23

    You are using a Spark build with Hive support.

    There are two possible solutions, depending on what you want to do later in your spark-shell or in your Spark jobs:

    1. You want to access Hive tables in your Hadoop+Hive installation. Place hive-site.xml in the conf sub-directory of your Spark installation; you can find hive-site.xml in your existing Hive installation (for example, on my Cloudera VM it is at /usr/lib/hive/conf). Launching spark-shell after this step should connect to the existing Hive metastore and will not try to create a temporary local metastore database in your current working directory.
    2. You do NOT want to access Hive tables in your Hadoop+Hive installation. If you don't care about connecting to Hive tables, follow Alberto's solution: fix the permission issues in the directory from which you launch spark-shell, and make sure your user is allowed to create directories and files there.
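    Option 1 can be sketched as a couple of shell commands. The paths below are assumptions: HIVE_CONF and SPARK_HOME are placeholders for your own installation (on the Cloudera VM, hive-site.xml lives in /usr/lib/hive/conf), so adjust them before running.

    ```shell
    # Copy hive-site.xml from the existing Hive installation into Spark's conf dir.
    # HIVE_CONF and SPARK_HOME are illustrative defaults -- adjust to your machine.
    HIVE_CONF=${HIVE_CONF:-/usr/lib/hive/conf}
    SPARK_HOME=${SPARK_HOME:-$HOME/spark-1.6.1-bin-hadoop2.6}

    cp "$HIVE_CONF/hive-site.xml" "$SPARK_HOME/conf/" && echo "hive-site.xml copied"
    ```

    After this, launching $SPARK_HOME/bin/spark-shell should pick up the metastore configured in hive-site.xml instead of creating a local metastore in the current directory.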

    Hope this helps.

  • 2021-01-21 00:29

    Apparently you don't have permission to write in that directory. I recommend running ./spark-shell from your HOME directory (you might want to add that command to your PATH), or from any other directory that is accessible and writable by your user.

    This might also be relevant for you: Notebooks together with Spark
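    A quick way to check this before launching: spark-shell built with Hive support tries to create its local metastore database and derby.log in the current working directory, so you can verify that directory is writable first. A minimal sketch:

    ```shell
    # Check that the current working directory is writable before running
    # spark-shell, since it creates its local metastore files here.
    if [ -w . ]; then
      echo "current directory is writable -- safe to run spark-shell here"
    else
      echo "current directory is NOT writable -- cd to a writable dir (e.g. \$HOME) first" >&2
    fi
    ```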
