HDFS C program fails with "ExceptionUtils::getStackTrace error."

fhity93d · posted 2021-05-29 in Hadoop

I have installed and configured Hadoop 2.8.4 on my Ubuntu 18.04 machine. The hdfs command-line tools work, a Java program against HDFS works, and YARN works:

hdfs@ubuntu:$ hdfs dfs -ls /tmp
Found 8 items
drwx-w----   - hdfs supergroup          0 2018-07-30 04:25 /tmp/NOTES.txt
drwx------   - hdfs supergroup          0 2018-07-29 13:12 /tmp/hadoop-yarn
drwx-wx-wx   - hdfs supergroup          0 2018-08-19 03:16 /tmp/hive
drwx-w----   - hdfs supergroup          0 2018-07-30 04:30 /tmp/myNotes.txt
drwx-w----   - hdfs supergroup          0 2018-07-30 04:34 /tmp/myNotes2.txt
drwx-w----   - hdfs supergroup          0 2018-07-30 04:37 /tmp/myNotes3.txt
drwx-w----   - hdfs supergroup          0 2018-07-30 04:38 /tmp/myNotes4.txt
-rw-r--r--   1 hdfs supergroup         15 2018-07-30 04:24 /tmp/testfile.txt
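
By "works" I mean checks roughly like the following also come back clean (a sketch of the commands, not literal output from my machine):

hdfs dfsadmin -report | head    # datanodes registered and capacity reported
yarn node -list                 # node managers registered and RUNNING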

Then I tried the C program from the book "Hadoop 2 Quick-Start Guide"; the mytest.c file:


#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include "hdfs.h"   /* libhdfs API; also pulls in fcntl.h, where O_WRONLY/O_CREAT come from */

int main(int argc, char* argv[]) {
    /* Connect to the default filesystem configured in core-site.xml. */
    hdfsFS fs = hdfsConnect("default", 0);
    hdfsFile writeFile = hdfsOpenFile(fs, argv[1], O_WRONLY | O_CREAT, 0, 0, 0);
    if (!writeFile) {
        fprintf(stderr, "open %s for write failed\n", argv[1]);
        exit(-1);
    }
    char buffer[] = "hw";
    /* Write the buffer including its trailing '\0'. */
    tSize numWrittenBytes = hdfsWrite(fs, writeFile, (void*)buffer, strlen(buffer) + 1);
    if (hdfsFlush(fs, writeFile)) {
        fprintf(stderr, "flush failed for %s\n", argv[1]);
        exit(-1);
    }
    hdfsCloseFile(fs, writeFile);
    return 0;
}

Then I have a shell script that builds it and runs it as the hdfs user:

. $HADOOP_HOME/etc/hadoop/hadoop-env.sh

gcc mytest.c -I$HADOOP_LIB/include -I$HADOOP_HOME/include -I$JAVA_HOME/include -L$HADOOP_LIB/native -L$JAVA_HOME/jre/lib/amd64/server -ljvm -lhdfs -o mytest
export CLASSPATH=`hadoop classpath`
export LD_LIBRARY_PATH=$HADOOP_LIB/native:$JAVA_HOME/jre/lib/amd64/server:$LD_LIBRARY_PATH
./mytest "file01"
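
To debug the environment this script produces, I can check (a sketch of the commands I would use):

ldd ./mytest | grep -E 'libhdfs|libjvm'   # do the native libraries the binary links against resolve?
echo "$CLASSPATH" | tr ':' '\n' | head    # what classpath will the embedded JVM receive?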

Judging from the output, the connect and then the hdfsOpenFile call both fail:

loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsOpenFile(file01.txt): constructNewObjectOfPath error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
open file01 for write failed
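
One hypothesis I am looking into (unconfirmed): hadoop classpath prints wildcard entries such as $HADOOP_HOME/share/hadoop/common/*, and a JVM created through JNI does not expand wildcards in -Djava.class.path (only the java launcher does), so none of the Hadoop classes would be found, which would match the NoClassDefFoundError above. If that is the cause, expanding the wildcards into concrete jar paths before launching should help:

export CLASSPATH=$(hadoop classpath --glob)   # Hadoop 2.6+; expands wildcards to actual jar paths
./mytest "file01"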

Can you tell me why my hdfsOpenFile call fails and how to fix it? Thanks.

No answers yet!
