start-dfs.sh throws "port 22: Connection timed out" error

krcsximq, posted 2021-05-29 in Hadoop

I am trying to install Hadoop on Ubuntu in pseudo-distributed mode. start-dfs.sh gives me an error:

Starting namenodes on [10.1.37.12]

10.1.37.00: ssh: connect to host 10.1.37.12 port 22: Connection timed out

localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-superuser-datanode-superuser-Satellite-E45W-C.out

Starting secondary namenodes [0.0.0.0]

0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-superuser-secondarynamenode-superuser-Satellite-E45W-C.out
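
The host that start-dfs.sh contacts for the NameNode comes from fs.defaultFS in core-site.xml (together with the workers file), so it is worth checking what is configured there. A minimal check, assuming the /usr/local/hadoop install path that appears in the log output above:

# Show which NameNode address the start scripts will use
grep -A 2 'fs.defaultFS' /usr/local/hadoop/etc/hadoop/core-site.xml

# For a pseudo-distributed single-node setup the value is normally local:
#   <property>
#     <name>fs.defaultFS</name>
#     <value>hdfs://localhost:9000</value>
#   </property>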

I have already added port 22 to the firewall.
jps output:
2562 DataNode
3846 Jps
2743 SecondaryNameNode
Can someone help me understand what is going wrong here?
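
As a quick sanity check (a sketch, not part of the original post): a pseudo-distributed start-dfs.sh only needs passwordless SSH to the local machine, so confirming that the SSH daemon is running and that ssh to localhost works narrows the problem down:

# Is the SSH server running and listening on port 22?
sudo systemctl status ssh
sudo ss -tlnp | grep ':22'

# Does passwordless SSH to the local machine work?
ssh localhost 'echo ok'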

export HADOOP_SSH_OPTS="-p 22" -- done (see the sketch after this list)
Added port 22 to the firewall ("sudo ufw allow 22")
Tried disabling the firewall ("sudo ufw disable")
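
For reference, HADOOP_SSH_OPTS is normally set in hadoop-env.sh rather than only exported in an interactive shell, so that the start/stop scripts pick it up. A minimal sketch, assuming the /usr/local/hadoop layout from the logs (the ConnectTimeout value is only an illustrative addition):

# /usr/local/hadoop/etc/hadoop/hadoop-env.sh
# Extra options passed to ssh by start-dfs.sh / stop-dfs.sh
export HADOOP_SSH_OPTS="-p 22 -o ConnectTimeout=5"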
run ssh -vvv 10.1.37.12 and share output
OpenSSH_7.9p1 Ubuntu-10, OpenSSL 1.1.1b  26 Feb 2019
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: /etc/ssh/ssh_config line 19: Applying options for *
debug2: resolve_canonicalize: hostname 10.1.37.12 is address
debug2: ssh_connect_direct
debug1: Connecting to 10.1.37.12 [10.1.37.12] port 22.
debug1: connect to address 10.1.37.12 port 22: Connection timed out
ssh: connect to host 10.1.37.12 port 22: Connection timed out
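
A "Connection timed out" (as opposed to "Connection refused") generally means nothing answered at 10.1.37.12 at all, so it is worth checking whether that address is actually assigned to this machine. A quick check might look like:

# Which addresses does this host actually own?
hostname -I
ip addr show | grep 'inet '

# If 10.1.37.12 is not listed, the Hadoop configuration points at an
# address this machine does not have, and ssh will time out exactly as above.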

Answer 1 (92dk7w1h):

Please check your /etc/hosts file: it needs to contain the node's private IP address (within your subnet range), and the same host must also be updated in the workers file.
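
A minimal sketch of what this suggests, assuming the machine's real private address is 10.1.37.12 and a hypothetical hostname hadoop-master (use the values reported by hostname and hostname -I on your own machine):

# /etc/hosts -- map the hostname to the private IP on the subnet
127.0.0.1     localhost
10.1.37.12    hadoop-master

# /usr/local/hadoop/etc/hadoop/workers (named "slaves" in Hadoop 2.x):
# for a pseudo-distributed setup this is usually just the local host
localhost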
