hadoop | local host and destination host are different

Asked by 33qvvth1 on 2021-05-27 in Hadoop

I'm trying to set up Hadoop on my local machine, and I'm stuck at this point:

▶ hadoop fs -mkdir /home/hadoop
mkdir: Failed on local exception: org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length; Host Details : local host is: "aleph-pc.local/192.168.1.129"; destination host is: "aleph-pc":8020;

I think this has something to do with how I've configured sshd, or with the value of fs.default.name, which seems closely tied to another question. Where can I go from here? I'd appreciate any help.

▶ cat /etc/hosts
127.0.0.1 localhost
::1 localhost
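
One thing I notice is that the error mentions aleph-pc, but my /etc/hosts only maps localhost. Do I need to map the hostname there as well? Something like the following, assuming aleph-pc should simply resolve to the loopback address:

127.0.0.1 localhost aleph-pc
::1 localhost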

▶ hadoop version
Hadoop 2.9.2
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r 826afbeae31ca687bc2f8471dc841b66ed2c6704
Compiled by ajisaka on 2018-11-13T12:42Z
Compiled with protoc 2.5.0
From source with checksum 3a9939967262218aa556c684d107985
This command was run using /home/aleph/Documents/Projects/Hadoop/hadoop-2.9.2/share/hadoop/common/hadoop-common-2.9.2.jar

▶ tail hadoop-2.9.2/etc/hadoop/core-site.xml 
-->

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://aleph-pc/127.0.0.1</value>
    </property>

</configuration>
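
Rereading the docs, fs.default.name is deprecated in favor of fs.defaultFS in Hadoop 2.x, and the value is supposed to be a plain URI authority like hdfs://host:port, not the hdfs://aleph-pc/127.0.0.1 form I have above. Should it instead look something like this (just a sketch on my part, assuming the default NameNode RPC port 8020)?

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:8020</value>
    </property>
</configuration>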

▶ jps
17619 Main
32133 NodeManager
32037 ResourceManager
31879 SecondaryNameNode
17816 RemoteMavenServer36
1020 Jps
31678 DataNode
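
Also, jps does not list a NameNode process at all; only the SecondaryNameNode and DataNode are up. If the NameNode died on startup, would reformatting and restarting HDFS help? I was thinking of something like the following, using the stock scripts from the 2.9.2 tarball (keeping in mind that namenode -format erases any existing HDFS metadata):

▶ sbin/stop-dfs.sh
▶ bin/hdfs namenode -format
▶ sbin/start-dfs.sh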
