While setting up a local pseudo-distributed Hadoop environment, running start-dfs.sh fails with:

```
Could not find or load main class org.apache.hadoop.hdfs.tools.GetConf
```
My Java version is:

```
java version "1.7.0_85"
OpenJDK Runtime Environment (IcedTea 2.6.1) (7u85-2.6.1-5ubuntu0.14.04.1)
OpenJDK 64-Bit Server VM (build 24.85-b03, mixed mode)
```
I also set JAVA_HOME in `hadoop-env.sh` under `/usr/local/hadoop-2.7.1/etc/hadoop`:

```
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
```
In etc/hadoop/core-site.xml I have:

```
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```
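As a verification step: once the classpath problem is fixed, this setting can be read back with `hdfs getconf`, which invokes the very GetConf class the error message names. A minimal sketch, guarded so it is a no-op on machines where the `hdfs` launcher is not on PATH:

```shell
# Read fs.defaultFS back out of core-site.xml via the GetConf tool.
# Guarded: does nothing if the hdfs launcher is not installed / not on PATH.
if command -v hdfs >/dev/null 2>&1; then
  hdfs getconf -confKey fs.defaultFS   # with the config above: hdfs://localhost:9000
fi
```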
In etc/hadoop/hdfs-site.xml I have:

```
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```
I also edited my /home/hduser/.bashrc file and added the following lines (all the paths are correct):

```
# HADOOP VARIABLES START
export HADOOP_PREFIX=/usr/local/hadoop-2.7.1
export HADOOP_HOME=/usr/local/hadoop-2.7.1
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=${HADOOP_HOME}
export HADOOP_COMMON_HOME=${HADOOP_HOME}
export HADOOP_HDFS_HOME=${HADOOP_HOME}
export YARN_HOME=${HADOOP_HOME}
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_PREFIX}/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_PREFIX}/lib/native"
export HADOOP_CLASSPATH=$JAVA_HOME/lib/tools.jar
# HADOOP VARIABLES END
```
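One thing worth noting in the lines above: PATH is extended with `$HADOOP_INSTALL`, a variable that is never defined anywhere in this .bashrc (every other line uses HADOOP_HOME or HADOOP_PREFIX), so the bin/ and sbin/ directories may silently never land on PATH. A sanity-check sketch, assuming the install location from the question:

```shell
# Assumes the install path from the question.
export HADOOP_HOME=/usr/local/hadoop-2.7.1

# Use the variable that is actually defined instead of the unset
# $HADOOP_INSTALL from the .bashrc above:
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"

# If the launcher is reachable, confirm the HDFS jars are on its classpath;
# org.apache.hadoop.hdfs.tools.GetConf ships in hadoop-hdfs-*.jar.
if command -v hadoop >/dev/null 2>&1; then
  hadoop classpath | tr ':' '\n' | grep -i hdfs
fi
```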
When I run start-dfs.sh, only the DataNode comes up; when I run start-all.sh, the NodeManager and DataNode come up:

```
6098 NodeManager
5691 DataNode
6267 Jps
```

Nothing is reachable at http://localhost:*****/.
1 Answer
First, format the namenode with the following command:

```
hadoop namenode -format
```

Then try starting the namenode from your terminal:

```
./hadoop-daemon.sh start namenode
```

Use the `jps` command to check.

core-site.xml:
hdfs-site.xml:
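The answer's steps can be strung together as one sketch. It assumes the install path from the question, and uses `hdfs namenode -format`, the non-deprecated spelling of the format command in Hadoop 2.x; it is guarded so it does nothing on machines without that install:

```shell
# Sketch of the answer's steps, assuming /usr/local/hadoop-2.7.1 from the question.
HADOOP_HOME=/usr/local/hadoop-2.7.1
if [ -x "$HADOOP_HOME/bin/hdfs" ]; then
  "$HADOOP_HOME/bin/hdfs" namenode -format          # re-initialize the NameNode metadata
  "$HADOOP_HOME/sbin/hadoop-daemon.sh" start namenode
  jps                                               # NameNode should now appear in the list
fi
```

Note that formatting wipes any existing HDFS metadata, so it is only appropriate on a fresh setup like the one described here.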