Hadoop installation problem:

cyvaqqii posted on 2021-05-29 in Hadoop

I followed this tutorial to install Hadoop. Unfortunately, when I run the start-all.sh script, the console shows the following errors:

hduser@dennis-HP:/usr/local/hadoop/sbin$ start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
hadoop config script is run...
hdfs script is run...
Config parameter : 
16/04/10 23:45:40 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
localhost: chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out: No such file or directory
localhost: head: cannot open ‘/usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out’ for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out: No such file or directory
localhost: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
localhost: chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out: No such file or directory
localhost: head: cannot open ‘/usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out’ for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
0.0.0.0: chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out: No such file or directory
0.0.0.0: head: cannot open ‘/usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out’ for reading: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out: No such file or directory
16/04/10 23:45:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
yarn script is run...
starting yarn daemons
mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out
/usr/local/hadoop/sbin/yarn-daemon.sh: line 124: /usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out: No such file or directory
head: cannot open ‘/usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out’ for reading: No such file or directory
/usr/local/hadoop/sbin/yarn-daemon.sh: line 129: /usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out: No such file or directory
/usr/local/hadoop/sbin/yarn-daemon.sh: line 130: /usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out: No such file or directory
localhost: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
localhost: chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 124: /usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out: No such file or directory
localhost: head: cannot open ‘/usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out’ for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 129: /usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out: No such file or directory
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 130: /usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out: No such file or directory

When I run the jps command, only the following output is shown:

hduser@dennis-HP:/usr/local/hadoop/sbin$ jps
3802 Jps

I am new to Hadoop, so please point me to an article that will help me install Hadoop without running into problems.
Or, if possible (and preferably), help resolve the issue I am facing: let me know what went wrong and how to fix it.


y1aodyip1#

I ran into a similar problem and found that the HADOOP_PREFIX path in hadoop-env.sh was incomplete. Instead of pointing to my installation directory, it pointed to a directory owned by root. Fixed it and everything works fine!
Correct path:

export HADOOP_PREFIX=/home/karan/hadoop-install/hadoop-3.2.1

Wrong path:

export HADOOP_PREFIX=/hadoop-install/hadoop-3.2.1
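
A minimal sketch of applying that fix, assuming the /usr/local/hadoop layout from the question and the usual etc/hadoop location of hadoop-env.sh (adjust both to your own install; the /home/karan path above is the answerer's own directory):

# Edit hadoop-env.sh and point HADOOP_PREFIX at the directory your user actually owns
nano /usr/local/hadoop/etc/hadoop/hadoop-env.sh
#   export HADOOP_PREFIX=/usr/local/hadoop
# Confirm the value is picked up before restarting the daemons
source /usr/local/hadoop/etc/hadoop/hadoop-env.sh
echo "$HADOOP_PREFIX"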

t3irkdon2#

Please check whether the permissions on the folder were set correctly with the chmod or chown commands.
Hadoop also provides scripts to start and stop individual daemons, i.e. hadoop-daemon.sh start [node].
Likewise, there are scripts to start/stop YARN. Detailed steps for installing Apache Hadoop are here: http://www.hadoopstrata.com/staticpost?postnbr=7
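
As a sketch of that per-daemon approach against the sbin directory from the question, which makes it easier to see which service is failing (Hadoop 2.x script names, matching the output above):

/usr/local/hadoop/sbin/hadoop-daemon.sh start namenode
/usr/local/hadoop/sbin/hadoop-daemon.sh start datanode
/usr/local/hadoop/sbin/hadoop-daemon.sh start secondarynamenode
/usr/local/hadoop/sbin/yarn-daemon.sh start resourcemanager
/usr/local/hadoop/sbin/yarn-daemon.sh start nodemanager
# Stop a daemon the same way, e.g.: hadoop-daemon.sh stop datanode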


qnakjoqk3#

Have you tried start-dfs.sh? Try the commands below and see how it reacts:

hdfs namenode -format
start-dfs.sh
start-yarn.sh
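
If those three commands run cleanly, a quick sanity check (this assumes the permission problem on /usr/local/hadoop/logs has been fixed first; note that hdfs namenode -format erases any existing HDFS metadata):

jps
# A healthy single-node setup normally lists NameNode, DataNode, SecondaryNameNode,
# ResourceManager, NodeManager and Jps.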

oyjwcjzk4#

Check the permissions on /usr/local/hadoop/logs. If it is not owned by hduser, change the ownership: sudo chown -R username:group directory
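
For the paths in the question that could look like the following, assuming the hduser:hadoop user/group pair that single-node tutorials typically create (substitute your own):

# Check current ownership of the install and the (missing) logs directory
ls -ld /usr/local/hadoop /usr/local/hadoop/logs
# Create the logs directory if needed and hand it over to hduser
sudo mkdir -p /usr/local/hadoop/logs
sudo chown -R hduser:hadoop /usr/local/hadoop/logs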


ruyhziif5#

I had the same problem.
I deleted this file:
hadoop-hadoop_amine-datanode-amine.out, located in: usr/local/hadoopy/logs
Then ran start-dfs.sh and start-yarn.sh.
jps:
14048 Jps
32226 ResourceManager
32403 NodeManager
6164 SecondaryNameNode
13548 DataNode
5806 NameNode
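
A rough equivalent of that sequence for the paths in the question (the exact .out file names depend on your user and host name, so treat them as placeholders):

# Stop everything, remove the stale .out files, then restart and verify
stop-dfs.sh
stop-yarn.sh
rm /usr/local/hadoop/logs/*.out
start-dfs.sh
start-yarn.sh
jps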


brjng4g36#

The current user only has limited permissions on /usr/local/hadoop. Try changing the permissions:
sudo chmod -R 777 /usr/local/hadoop/


wh6knrhe7#

Honestly, I have no idea why I got that error. But I removed my entire installation using the instructions provided on AskUbuntu and reinstalled it following the installation method described on the official website.
But you are right @krishna, the logs are created automatically after installation. My guess is that my earlier installation carried stale configuration details that most likely conflicted with the Hadoop installation.
