Hadoop: start-dfs.sh permission denied

攒了一身酷 2020-12-13 14:04

I am installing Hadoop on my laptop. SSH works fine, but I cannot start Hadoop.

munichong@GrindPad:~$ ssh localhost
Welcome to Ubuntu 12.10 (GNU/Linux 3.5.0

9 Answers
  • 2020-12-13 14:43

    I faced the same problem. When I tried to connect over SSH I got a message like "not found", so I went to the .ssh directory and debugged it with the following steps:

    cd ~/.ssh

    ssh-keygen -t rsa -P ""

    cat id_rsa.pub >> authorized_keys

    ... then it worked ...
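
    If SSH still rejects the key after this, overly open file permissions are a common culprit; as an extra step of my own (not part of the original answer), tightening them usually looks like:

    chmod 700 ~/.ssh
    chmod 600 ~/.ssh/authorized_keys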

  • 2020-12-13 14:43

    Try changing the ownership of the folder /var/log/hadoop/root to the user munichong. On any system, Hadoop needs to write to its log directory, so the user starting the services must have permission to modify that folder and its contents.

    sudo alone will not work here, because the permission is needed even after the script finishes its work, i.e. while the HADOOP services keep running in the background and writing to that folder.
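
    A minimal sketch of that ownership change, assuming the path and user name mentioned above:

    sudo chown -R munichong /var/log/hadoop/root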

  • 2020-12-13 14:43

    For a Hadoop installation hitting this permission-denied issue, the command below works before running start-all.sh:

    sudo chown -R hadoop /usr/local/hadoop/ 
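
    To confirm the change took effect (my own check, not part of the original answer), the owner column should now show hadoop:

    ls -ld /usr/local/hadoop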
    
  • 2020-12-13 14:44

    You are trying to SSH to your own machine (localhost) but are missing the authorized_keys file, which allows login.

    This file in SSH specifies the SSH keys that can be used for logging into the user account for which the file is configured.

    Follow the two steps below to configure it correctly.

    Generate a new key pair with the command below in a terminal:

    ssh-keygen
    

    Press Enter to accept the default file name (id_rsa, with the public key written to id_rsa.pub).

    Now register the generated key file:

    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
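
    As a quick check (my addition, not in the original answer), the key should now let you log in without a password, after which the failing script can be retried:

    ssh localhost
    start-dfs.sh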
    
  • 2020-12-13 14:49

    I solved it by setting permissions of all files to 777:

    sudo chmod 777 /usr/local/hadoop-2.7.6/* -R
    
  • 2020-12-13 14:51

    I was stuck on the same issue for the last couple of hours but finally solved it. The Hadoop installation had been extracted by the same user I use to run Hadoop, so user privileges were not the issue.
    My configuration: an Ubuntu Linux machine on Google Cloud.

    Hadoop is installed under /home/, the Hadoop data directory is /var/lib/hadoop, and its access bits are 777, so anybody can access it. I SSHed into the remote machine, made changes to the config files, and executed start-dfs.sh, which then gave me "Permission denied (publickey)". So here is the solution, in the same SSH terminal:

    1. ssh-keygen

    2. It will ask for the folder location where it will store the keys; I entered /home/hadoop/.ssh/id_rsa

    3. It will ask for a passphrase; keep it empty for simplicity.

    4. cat /home/hadoop/.ssh/id_rsa.pub >> .ssh/authorized_keys (to copy the newly generated public key into the auth file in your user's home/.ssh directory)

    5. ssh localhost

    6. start-dfs.sh (now it should work!)
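
    Condensed into a single shell session, the sequence above looks roughly like this (a sketch assuming the hadoop user and the paths from this answer):

    # generate a key pair for the hadoop user with an empty passphrase
    ssh-keygen -t rsa -P "" -f /home/hadoop/.ssh/id_rsa

    # authorize the new public key for localhost logins
    cat /home/hadoop/.ssh/id_rsa.pub >> /home/hadoop/.ssh/authorized_keys

    # confirm passwordless login, then start HDFS
    ssh localhost
    start-dfs.sh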
