libhdfs - cannot open shared library libhdfs.so.0.0.0

Asked by 4szc88ey on 2022-12-09 in HDFS

I have a running HDFS instance, as evidenced below -

reikdas@reikdas-HP-Pavilion-x360-Convertible-14-dh1xxx:~$ jps
16083 Jps
12389 NameNode
12774 SecondaryNameNode
11083

reikdas@reikdas-HP-Pavilion-x360-Convertible-14-dh1xxx:~$ hadoop fs -ls /
2021-09-27 12:06:59,520 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
drwxr-xr-x   - reikdas supergroup          0 2021-09-27 00:31 /test

I copied the standard example of using libhdfs from C given in the official documentation -

#include "hdfs.h"
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

int main(int argc, char **argv) {

    hdfsFS fs = hdfsConnect("default", 0);
    // Also tested - hdfsConnect("127.0.0.1", 9000)
    const char* writePath = "/testfile.txt";
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY |O_CREAT, 0, 0, 0);
    if(!writeFile) {
          fprintf(stderr, "Failed to open %s for writing!\n", writePath);
          exit(-1);
    }
    char* buffer = "Hello, World!";
    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void*)buffer, strlen(buffer)+1);
    if (hdfsFlush(fs, writeFile)) {
           fprintf(stderr, "Failed to 'flush' %s\n", writePath);
          exit(-1);
    }
    hdfsCloseFile(fs, writeFile);
}

I compiled it successfully with -

gcc testlhdfs.c -I $HADOOP_HDFS_HOME/include/ -L $HADOOP_HDFS_HOME/lib/native -lhdfs

but I get an error when I try to run the executable -

./a.out: error while loading shared libraries: libhdfs.so.0.0.0: cannot open shared object file: No such file or directory

I am not sure why this error occurs, because I can see that libhdfs.so.0.0.0 exists in $HADOOP_HDFS_HOME/lib/native -

reikdas@reikdas-HP-Pavilion-x360-Convertible-14-dh1xxx:~$ ls -l $HADOOP_HDFS_HOME/lib/native
total 166640
drwxr-xr-x 2 reikdas reikdas      4096 Jun 15 01:44 examples
-rw-r--r-- 1 reikdas reikdas   1507316 Jun 15 01:13 libhadoop.a
-rw-r--r-- 1 reikdas reikdas   1741256 Jun 15 01:44 libhadooppipes.a
lrwxrwxrwx 1 reikdas reikdas        18 Jun 15 01:13 libhadoop.so -> libhadoop.so.1.0.0
-rwxr-xr-x 1 reikdas reikdas    803040 Jun 15 01:13 libhadoop.so.1.0.0
-rw-r--r-- 1 reikdas reikdas    754382 Jun 15 01:44 libhadooputils.a
-rw-r--r-- 1 reikdas reikdas    551556 Jun 15 01:18 libhdfs.a
-rw-r--r-- 1 reikdas reikdas 106522330 Jun 15 01:20 libhdfspp.a
lrwxrwxrwx 1 reikdas reikdas        18 Jun 15 01:20 libhdfspp.so -> libhdfspp.so.0.1.0
-rwxr-xr-x 1 reikdas reikdas  44375064 Jun 15 01:20 libhdfspp.so.0.1.0
lrwxrwxrwx 1 reikdas reikdas        16 Jun 15 01:18 libhdfs.so -> libhdfs.so.0.0.0
-rwxr-xr-x 1 reikdas reikdas    333648 Jun 15 01:18 libhdfs.so.0.0.0
-rw-r--r-- 1 reikdas reikdas  10029114 Jun 15 01:39 libnativetask.a
lrwxrwxrwx 1 reikdas reikdas        22 Jun 15 01:39 libnativetask.so -> libnativetask.so.1.0.0
-rwxr-xr-x 1 reikdas reikdas   3985736 Jun 15 01:39 libnativetask.so.1.0.0

My other environment variables are also set appropriately -

export HADOOP_HOME=/home/reikdas/hadoop
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"

This may be the relevant configuration option set in my core-site.xml file -

<property>
    <name>fs.default.name</name>
    <value>hdfs://127.0.0.1:9000</value>
</property>

I would really appreciate any help I can get to fix this error.

xwbd5t1u1#

You can try running ldconfig $HADOOP_HDFS_HOME/lib/native so that the dynamic linker can find the library at run time.
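
To expand on that (a rough sketch, assuming a Linux system, bash, and the $HADOOP_HDFS_HOME value exported in the question): the error comes from the runtime dynamic linker, which only searches the ld.so cache, LD_LIBRARY_PATH and any rpath baked into the binary, not the -L path used at link time. Any one of the following approaches should make libhdfs.so.0.0.0 visible to it:

# 1. Add the directory to the linker cache (needs root; re-run after the cache is
#    rebuilt, or add the path to a file under /etc/ld.so.conf.d/ to make it stick)
sudo ldconfig $HADOOP_HDFS_HOME/lib/native

# 2. Point LD_LIBRARY_PATH at the directory before running the program
export LD_LIBRARY_PATH=$HADOOP_HDFS_HOME/lib/native:$LD_LIBRARY_PATH
./a.out

# 3. Bake the search path into the executable with an rpath at link time
gcc testlhdfs.c -I $HADOOP_HDFS_HOME/include/ -L $HADOOP_HDFS_HOME/lib/native \
    -lhdfs -Wl,-rpath,$HADOOP_HDFS_HOME/lib/native

You can check whether the loader now resolves the library with ldd ./a.out - libhdfs.so.0.0.0 should no longer be listed as "not found".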
