Hi, I followed https://www.linode.com/docs/databases/h … p-cluster/ to set up a cluster, and now I get the following error when running start-dfs.sh:
Starting namenodes on [master]
ERROR: Both HADOOP_WORKERS and HADOOP_WORKER_NAMES were defined. Aborting.
...
Starting secondary namenodes on [master]
ERROR: Both HADOOP_WORKERS and HADOOP_WORKER_NAMES were defined. Aborting.
Can someone tell me what's going on and how to fix it?
Last edited by vorlket (2019-11-21 07:04:17)
cat /etc/profile.d/hadoop.sh
export HADOOP_CONF_DIR=/etc/hadoop
export HADOOP_LOG_DIR=/tmp/hadoop/log
export HADOOP_WORKERS=/etc/hadoop/workers
export HADOOP_PID_DIR=/tmp/hadoop/run
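For context, the abort appears to come from a defensive check in Hadoop 3's shell framework (the real logic lives in libexec/hadoop-functions.sh; this is only a simplified sketch): when both variables are set, Hadoop refuses to guess which worker list the admin intended.

check_worker_vars() {
  # Hypothetical simplification of Hadoop's guard, not the literal source:
  # abort if both the workers-file variable and the worker-name list are set.
  if [[ -n "${HADOOP_WORKERS}" && -n "${HADOOP_WORKER_NAMES}" ]]; then
    echo "ERROR: Both HADOOP_WORKERS and HADOOP_WORKER_NAMES were defined. Aborting." >&2
    return 1
  fi
}

# With both set (as in the environment above plus whatever sets
# HADOOP_WORKER_NAMES during startup), the check fails:
HADOOP_WORKERS=/etc/hadoop/workers
HADOOP_WORKER_NAMES="node1 node2"
check_worker_vars || echo "aborted"   # prints "aborted"

So exporting HADOOP_WORKERS globally from /etc/profile.d/ can collide with anything else that defines HADOOP_WORKER_NAMES while the start scripts run.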
How about:
env | grep HADOOP_WORKER_NAMES
sudo grep -rl HADOOP_WORKER_NAMES /etc
grep -rl HADOOP_WORKER_NAMES /etc
/etc/hadoop/hadoop-user-functions.sh.example
Removing export HADOOP_WORKERS=/etc/hadoop/workers from /etc/profile.d/hadoop.sh works.
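With that line dropped, /etc/profile.d/hadoop.sh would look like the fragment below. As far as I know, the explicit export is redundant anyway, since Hadoop falls back to the workers file under HADOOP_CONF_DIR by default:

# /etc/profile.d/hadoop.sh -- HADOOP_WORKERS export removed; Hadoop
# reads ${HADOOP_CONF_DIR}/workers on its own, so listing it here
# only risks the "both defined" conflict at startup.
export HADOOP_CONF_DIR=/etc/hadoop
export HADOOP_LOG_DIR=/tmp/hadoop/log
export HADOOP_PID_DIR=/tmp/hadoop/run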