What are simple commands to check if Hadoop daemons are running?
For example, if I'm trying to figure out why HDFS is not set up correctly, I'll want a way to check whether the relevant daemons actually started.
Apart from jps, another good idea is to use the web interfaces that Hadoop provides for the NameNode and JobTracker. They not only show you the processes but also give you a lot of other useful information, such as the cluster summary, ongoing jobs, etc. To reach the NameNode UI, point your web browser to "YOUR_NAMENODE_HOST:50070", and for the JobTracker UI use "YOUR_JOBTRACKER_HOST:50030" (those are the default web UI ports in Hadoop 1.x; 9000 and 9001 are typically the RPC ports from core-site.xml and mapred-site.xml, not the web UIs).
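If you only have a terminal, a rough way to confirm those web UIs are answering is to hit them with curl. This is just a sketch: it assumes the Hadoop 1.x default web UI ports and that curl is installed, so adjust the hosts and ports to your setup.
# Prints the HTTP status code; 200 means the daemon's embedded web server is up
curl -s -o /dev/null -w "NameNode UI: %{http_code}\n" http://YOUR_NAMENODE_HOST:50070/
curl -s -o /dev/null -w "JobTracker UI: %{http_code}\n" http://YOUR_JOBTRACKER_HOST:50030/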
To check whether the Hadoop nodes are running:
sudo -u hdfs hdfs dfsadmin -report
Configured Capacity: 28799380685 (26.82 GB)
Present Capacity: 25104842752 (23.38 GB)
DFS Remaining: 25012056064 (23.29 GB)
DFS Used: 92786688 (88.49 MB)
DFS Used%: 0.37%
Under replicated blocks: 436
Blocks with corrupt replicas: 0
Missing blocks: 0
Datanodes available: 1 (1 total, 0 dead)
Live datanodes:
Name: 127.0.0.1:50010 (localhost.localdomain)
Hostname: localhost.localdomain
Rack: /default
Decommission Status : Normal
Configured Capacity: 28799380685 (26.82 GB)
DFS Used: 92786688 (88.49 MB)
Non DFS Used: 3694537933 (3.44 GB)
DFS Remaining: 25012056064 (23.29 GB)
DFS Used%: 0.32%
DFS Remaining%: 86.85%
Last contact: Thu Mar 01 22:01:38 IST 2018
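If you only want a quick pass/fail instead of the full dump, a small sketch (assuming the same hdfs user) is to grep the interesting summary lines out of that report:
sudo -u hdfs hdfs dfsadmin -report | grep -E 'Datanodes available|Missing blocks|corrupt replicas'
A non-zero count of live datanodes and zero missing/corrupt blocks is usually what you are looking for.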
Try running this:
for service in /etc/init.d/hadoop-hdfs-*; do $service status; done;
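The same pattern should also cover the MapReduce daemons if they were installed as packages; the init script names vary by distribution, so the second glob below is only an assumption and you should adjust it to whatever actually exists under /etc/init.d on your machine:
for service in /etc/init.d/hadoop-hdfs-* /etc/init.d/hadoop-0.20-mapreduce-*; do $service status; done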
You can use the jps command, as vipin said, like this:
/usr/lib/java/jdk1.8.0_25/bin/jps
Of course, replace that Java path with the one on your machine (the path where you installed Java).
jps is a nifty tool for checking whether the expected Hadoop processes are running (it has been part of Sun's Java since v1.5.0).
The result will be something like this:
2287 TaskTracker
2149 JobTracker
1938 DataNode
2085 SecondaryNameNode
2349 Jps
1788 NameNode
I got this answer from this tutorial: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
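If you would rather get a yes/no per daemon than eyeball the jps listing, here is a small sketch built around it. It assumes jps is on your PATH and that you are running the classic Hadoop 1.x daemons; change the names if you run YARN instead.
for daemon in NameNode SecondaryNameNode DataNode JobTracker TaskTracker; do
  if jps | grep -qw "$daemon"; then
    echo "$daemon is running"
  else
    echo "$daemon is NOT running"
  fi
done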
Try the jps command. It lists the Java processes that are up and running.
To check whether the daemons are running:
You can check with the jps command.
You can also use the commands below:
ps -ef | grep -w namenode
ps -ef | grep -w datanode
ps -ef | grep -w tasktracker
-w matches the exact word, so you don't get partial matches.
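You can also wrap those ps checks in a loop. This is only a sketch: it assumes the lowercase daemon names appear on the process command line (which they do when the daemons are started with the standard Hadoop scripts), and it filters out the grep process itself so it doesn't count as a match.
for proc in namenode datanode secondarynamenode jobtracker tasktracker; do
  if ps -ef | grep -w "$proc" | grep -v grep > /dev/null; then
    echo "$proc: running"
  else
    echo "$proc: not running"
  fi
done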
If you have superuser privileges, you can also use the following for the same check:
./hadoop dfsadmin -report
Hope this helps!