Hadoop and OpenJDK: start-dfs.sh fails (ssh?)

9w11ddsr · posted 2021-06-01 in Hadoop

I ran into a problem while following this tutorial to set up a 4-node Hadoop cluster. I have the following four (virtualized) machines:
node-master
node1
node2
node3
I set up all the conf files on the master node and copied them to the other machines with scp (see the sketch after the log below). The master node can reach the worker nodes over ssh. I set JAVA_HOME in .bashrc on every machine. Nevertheless, here is what I get:

hadoop@master-node:~$ start-dfs.sh
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.4.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Starting namenodes on [node-master]
node-master: ssh: connect to host node-master port 22: Connection timed out
node1: Error: JAVA_HOME is not set and could not be found.
node2: Error: JAVA_HOME is not set and could not be found.
node3: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
hadoop@0.0.0.0's password: 
0.0.0.0: Error: JAVA_HOME is not set and could not be found.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.4.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
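
For reference, a minimal sketch of how the conf files were pushed out with scp; the node names come from the question, and the ~/hadoop install path is inferred from the jar path in the log above:

# Copy the Hadoop config from the master to every worker (sketch;
# assumes Hadoop is installed at ~/hadoop under the hadoop user on all nodes)
for host in node1 node2 node3; do
    scp ~/hadoop/etc/hadoop/* hadoop@"$host":~/hadoop/etc/hadoop/
done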

[3 possibilities] Using OpenJDK 11 seems to be one problem, although I'm not at all sure it is what causes this mess. The errors suggest something is wrong with ssh, but i) I pushed the conf files over without any problem, and ii) I can reach every node from the master. Could it be related to the way I set the JAVA_HOME path? My .bashrc ends with:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH=PATH:$PATH/bin

Thanks in advance for any clue (I'm not much of a Java user and I feel a bit lost here).
[EDIT] Same thing with Oracle JDK 8:

hadoop@master-node:~$  readlink -f /usr/bin/java
/usr/lib/jvm/java-8-oracle/jre/bin/java
hadoop@master-node:~$ export JAVA_HOME=/usr/lib/jvm/java-8-oracle/jre
hadoop@master-node:~$ start-dfs.sh
Starting namenodes on [node-master]
node-master: ssh: connect to host node-master port 22: Connection timed out
node1: Error: JAVA_HOME is not set and could not be found.
node3: Error: JAVA_HOME is not set and could not be found.
node2: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
hadoop@0.0.0.0's password:
0.0.0.0: Error: JAVA_HOME is not set and could not be found.
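
A quick way to see the underlying problem (node1 taken from the cluster above): a variable exported in the master's shell, or in a .bashrc that only interactive shells read, is not visible in the non-interactive shells that start-dfs.sh opens over ssh.

# Exporting in the current shell only affects the master ...
export JAVA_HOME=/usr/lib/jvm/java-8-oracle/jre
# ... the remote commands that start-dfs.sh runs never see it; with the
# cluster in this state, the following prints an empty value
ssh hadoop@node1 'echo "remote JAVA_HOME: $JAVA_HOME"'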

6mzjoqzu 1#

Could you export the path like this:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH=$PATH:$JAVA_HOME/bin

Then you need to make sure your PATH picks up the JAVA_HOME variable. After appending the Java and PATH variables to the .bashrc file, run the command below:

source ~/.bashrc

Then check echo $PATH: if the value contains the JAVA_HOME path, it should work.
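
For example, a quick verification along those lines might look like this (the output shown is illustrative):

# Reload the shell configuration and confirm the variables took effect
source ~/.bashrc
echo $JAVA_HOME                            # e.g. /usr/lib/jvm/java-11-openjdk-amd64
echo $PATH | tr ':' '\n' | grep -i java    # the JDK bin directory should be listed
java -version                              # should report the expected JDK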

qyzbxkaa 2#

Found it!!! It turns out that JAVA_HOME gets lost over the ssh connection (why, I don't know; that is what led me to the answer).
To get around this,

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64

should also be added to

hadoop/etc/hadoop/hadoop-env.sh
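
This works because hadoop-env.sh is read by the Hadoop launch scripts themselves, so the setting survives the non-interactive ssh sessions that start-dfs.sh opens, where a .bashrc export is typically never reached. A minimal sketch of applying it on every node, assuming Hadoop lives in ~/hadoop as the logs above suggest:

# Append JAVA_HOME to hadoop-env.sh on the master ...
echo 'export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64' >> ~/hadoop/etc/hadoop/hadoop-env.sh
# ... and on each worker node
for host in node1 node2 node3; do
    ssh hadoop@"$host" \
        "echo 'export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64' >> ~/hadoop/etc/hadoop/hadoop-env.sh"
done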
