Which folder is responsible for conf in Hadoop 2.8.1?
Hello.
I downloaded Hadoop 2.8.1 and am trying to install it on CentOS 7:
http://apache.mesi.com.ar/hadoop/common/hadoop-2.8.1/hadoop-2.8.1.tar.gz
The configuration files are in ../hadoop/etc/hadoop:
[[email protected] hadoop]$ ls
capacity-scheduler.xml hadoop-metrics.properties kms-acls.xml mapred-queues.xml.template yarn-env.cmd
configuration.xsl hadoop-policy.xml kms-env.sh mapred-site.xml yarn-env.sh
container-executor.cfg hdfs-site.xml kms-log4j.properties mapred-site.xml.template yarn-site.xml
core-site.xml httpfs-env.sh kms-site.xml masters
hadoop-env.cmd httpfs-log4j.properties log4j.properties slaves
hadoop-env.sh httpfs-signature.secret mapred-env.cmd ssl-client.xml.example
hadoop-metrics2.properties httpfs-site.xml mapred-env.sh ssl-server.xml.example
[[email protected] sbin]$ ./start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Error: Cannot find configuration directory: /opt/hadoop/etc/hadoop/
starting yarn daemons
Error: Cannot find configuration directory: /opt/hadoop/etc/hadoop/
[[email protected] sbin]$ ./start-dfs.sh
Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
Starting namenodes on []
Error: Cannot find configuration directory: /opt/hadoop/etc/hadoop/
Error: Cannot find configuration directory: /opt/hadoop/etc/hadoop/
Starting secondary namenodes [0.0.0.0]
Error: Cannot find configuration directory: /opt/hadoop/etc/hadoop/
[[email protected] sbin]$ ./start-yarn.sh
starting yarn daemons
Error: Cannot find configuration directory: /opt/hadoop/etc/hadoop/
Error: Cannot find configuration directory: /opt/hadoop/etc/hadoop/
I found an inaccuracy in my question: there is in fact an environment variable that is responsible for conf. It is set in the hadoop-env.sh file. Edit the HADOOP_CONF_DIR variable there and comment/uncomment JAVA_HOME as needed:
# The java implementation to use.
export JAVA_HOME=${JAVA_HOME}
# export JAVA_HOME=/opt/jdk1.8.0_144
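# Extra JVM options for the Hadoop daemons; here the JVM is told to prefer IPv4 over IPv6.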
export HADOOP_OPTS=-Djava.net.preferIPv4Stack=true
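# The directory the Hadoop start scripts read the *.xml configuration files from.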
export HADOOP_CONF_DIR=/app/hadoop/etc/hadoop/
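To double-check that the scripts will pick up the right directory, something like this works (a rough sanity check; /app/hadoop is where my tarball is unpacked, adjust the path to your own layout):
source /app/hadoop/etc/hadoop/hadoop-env.sh
echo "$HADOOP_CONF_DIR"    # should print /app/hadoop/etc/hadoop/
ls "$HADOOP_CONF_DIR"      # core-site.xml, hdfs-site.xml and the rest must be here
If the "dfs.namenode.rpc-address is not configured" message still appears after the directory is found, it usually means fs.defaultFS has not been set in core-site.xml in that directory.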