pyarrow.hdfs.connect cannot reach my Hadoop cluster

sshcrbum · posted 2021-06-01 in Hadoop

I am struggling with the first steps of getting a Python interface to my Hadoop setup working. The target is my master node (on the local network).

Here is what happens when I try to reach the master node:

import pyarrow as pa 
pa.hdfs.connect("192.168.0.37",20500)

File "/usr/local/lib/python3.5/dist-packages/pyarrow/hdfs.py", line 181, in connect
    kerb_ticket=kerb_ticket, driver=driver)
File "/usr/local/lib/python3.5/dist-packages/pyarrow/hdfs.py", line 35, in __init__
    _maybe_set_hadoop_classpath()
File "/usr/local/lib/python3.5/dist-packages/pyarrow/hdfs.py", line 134, in _maybe_set_hadoop_classpath
    classpath = subprocess.check_output([hadoop_bin, 'classpath', '--glob'])
File "/usr/lib/python3.5/subprocess.py", line 626, in check_output
    **kwargs).stdout
File "/usr/lib/python3.5/subprocess.py", line 693, in run
    with Popen(*popenargs, **kwargs) as process:
File "/usr/lib/python3.5/subprocess.py", line 947, in __init__
    restore_signals, start_new_session)
File "/usr/lib/python3.5/subprocess.py", line 1551, in _execute_child
    raise child_exception_type(errno_num, err_msg)
FileNotFoundError: [Errno 2] No such file or directory: 'hadoop'
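From the traceback, the error seems to come from pyarrow's `_maybe_set_hadoop_classpath()` shelling out to a `hadoop` executable that is not resolvable on `PATH`. A minimal diagnostic sketch of that hypothesis (the `/home/david/Apps/hadoop` path is assumed from my environment variables further down):

```python
import os
import shutil

# HADOOP_HOME as set in my environment; adjust for your install (assumed path).
hadoop_home = os.environ.get("HADOOP_HOME", "/home/david/Apps/hadoop")

# pyarrow runs `hadoop classpath --glob` via subprocess, so the `hadoop`
# binary must be resolvable on PATH -- the FileNotFoundError means it isn't.
hadoop_bin = shutil.which("hadoop")
if hadoop_bin is None:
    # Prepend $HADOOP_HOME/bin so the subprocess call can find `hadoop`.
    os.environ["PATH"] = (
        os.path.join(hadoop_home, "bin") + os.pathsep + os.environ["PATH"]
    )
```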

I do not know what I am doing wrong; I have seen this problem reported in many different places. I set the environment variables according to the documentation. Here is what I have:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64/jre
export HADOOP_HOME=/home/david/Apps/hadoop
export CLASSPATH='$HADOOP_HOME/bin/hdfs classpath --glob'
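For comparison, a sketch of how I believe these exports are usually written: single quotes store the command as a literal string rather than running it, so `$(...)` command substitution is needed, and `$HADOOP_HOME/bin` has to be on `PATH` for the `hadoop` binary to be found (paths assumed from above):

```shell
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64/jre
export HADOOP_HOME=/home/david/Apps/hadoop
# Put the `hadoop` binary on PATH so pyarrow's subprocess call can find it.
export PATH="$HADOOP_HOME/bin:$PATH"
# $(...) actually runs `hdfs classpath --glob`; single quotes would not.
export CLASSPATH="$("$HADOOP_HOME/bin/hdfs" classpath --glob)"
```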

When I run this script directly on the master node I seem to get a different error. Does that mean I cannot use this script as a client-side script? Am I missing a step?
Thanks.

No answers yet.

