Applies To: Hadoop HDFS Client
Category: Troubleshooting → HDFS environment
Issue Summary
When attempting to execute hdfs commands (e.g., hdfs dfs -ls), the shell returns “command not found”, indicating that the Hadoop client binaries are not correctly configured in the system PATH.
Possible Cause(s)
Common reasons this issue may occur:
Hadoop client binaries are not installed or are incomplete.
The HADOOP_HOME environment variable is not set correctly.
The $HADOOP_HOME/bin directory is not included in the system's PATH environment variable.
Step-by-Step Resolution
1. Verify Hadoop Client Installation:
Check whether Hadoop is installed by looking for the installation directory in the common locations:
ls -d /opt/hadoop
ls -d /usr/local/hadoop
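The check above can be scripted. A minimal sketch that probes a list of candidate directories for an executable bin/hdfs launcher; the paths are common defaults, not guaranteed for your distribution:

```shell
#!/bin/sh
# Probe candidate directories for a Hadoop installation by looking for
# an executable bin/hdfs underneath each one. Paths are examples only.
find_hadoop_home() {
    for dir in "$@"; do
        if [ -x "$dir/bin/hdfs" ]; then
            echo "$dir"
            return 0
        fi
    done
    return 1
}

find_hadoop_home /opt/hadoop /usr/local/hadoop \
    || echo "No Hadoop installation found in the usual locations." >&2
```

Add any site-specific install prefixes to the final call as needed.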
2. Locate hdfs Executable:
Navigate to the bin directory within your Hadoop installation.
cd /opt/hadoop/bin
cd $HADOOP_HOME/bin
Check if the hdfs executable exists.
ls -l $HADOOP_HOME/bin/hdfs
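If the launcher is not where you expect, you can search a few likely install prefixes for it; /opt and /usr/local below are assumptions, so add prefixes relevant to your system:

```shell
#!/bin/sh
# Search a few likely prefixes for the hdfs launcher script.
# -maxdepth keeps the search fast; 2>/dev/null hides permission errors.
for prefix in /opt /usr/local; do
    find "$prefix" -maxdepth 3 -name hdfs -type f 2>/dev/null || true
done
```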
3. Check HADOOP_HOME Environment Variable:
Check which directory HADOOP_HOME currently points to:
echo $HADOOP_HOME
If it's incorrect or not set, set it in your .bashrc, .profile, or /etc/profile:
export HADOOP_HOME=/path/to/your/hadoop/installation
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
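Re-running the setup step can leave duplicate entries in the profile file. A sketch of an idempotent append; the helper name and the /opt/hadoop example path are illustrative, not part of Hadoop itself:

```shell
#!/bin/sh
# Append the Hadoop environment setup to a profile file, but only if
# it is not already present. Arguments: profile file, Hadoop home.
append_hadoop_env() {
    profile="$1"
    hadoop_home="$2"
    if ! grep -q '^export HADOOP_HOME=' "$profile" 2>/dev/null; then
        printf 'export HADOOP_HOME=%s\n' "$hadoop_home" >> "$profile"
        printf 'export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin\n' >> "$profile"
    fi
}

# Example (edits your real ~/.bashrc, so uncomment only when ready):
# append_hadoop_env "$HOME/.bashrc" /opt/hadoop
```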
4. Check PATH Environment Variable:
Run echo $PATH.
Ensure that $HADOOP_HOME/bin is included in the output. If not, add it to your shell configuration file (as shown in step 3).
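PATH is easier to scan one entry per line. A quick way to check for a Hadoop bin directory on it (the grep pattern is a loose heuristic):

```shell
#!/bin/sh
# Print PATH one entry per line and look for a Hadoop bin directory.
echo "$PATH" | tr ':' '\n' | grep -i hadoop \
    || echo "No Hadoop directory found on PATH." >&2
```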
5. Source Configuration Files:
After modifying .bashrc or .profile, reload it in the current shell:
source ~/.bashrc
or
source ~/.profile
For system-wide changes in /etc/profile, a re-login or reboot might be required.
6. Test the Command:
After applying changes, try hdfs dfs -ls / again.
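Before re-running the full command, you can confirm that the shell resolves hdfs at all; command -v is a portable way to do that:

```shell
#!/bin/sh
# Check whether the hdfs launcher is now resolvable from PATH.
if command -v hdfs >/dev/null 2>&1; then
    echo "hdfs resolves to: $(command -v hdfs)"
else
    echo "hdfs is still not on PATH; re-check HADOOP_HOME and PATH." >&2
fi
```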
Additional Notes:
Ensure that the user running the command has execute permissions on the hdfs binary.
If you are running Hadoop in a multi-user environment, consider setting HADOOP_HOME and PATH in a global profile script like /etc/profile.d/hadoop.sh for all users.
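A minimal /etc/profile.d/hadoop.sh for that purpose might look like the following; the /opt/hadoop location is an example, so point it at your actual installation:

```shell
# /etc/profile.d/hadoop.sh — sourced by login shells for all users.
# /opt/hadoop is an example; use your real installation directory.
export HADOOP_HOME=/opt/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
```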