Hadoop fs filesystem error: copyToLocal([class org.apache.hadoop.fs.Path, class org.apache.hadoop.fs.Path]) does not exist

okxuctiv, posted 2021-05-29 in Hadoop

In a PySpark session, I want to copy a file from S3 to a local directory on the Hadoop cluster. When I run the following, I get the error below. Please help.

file_system.copyToLocal(false, java_path_src, java_path_dst)

Arguments:

java_path_src - s3://sandbox/metadata/2018-06-07T183915/test.jsonl
java_path_dst - /home/hadoop/output/

Error:

py4j.protocol.Py4JError: An error occurred while calling o144.copyToLocal. Trace:
py4j.Py4JException: Method copyToLocal([class org.apache.hadoop.fs.Path, class org.apache.hadoop.fs.Path]) does not exist
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:318)
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:326)
at py4j.Gateway.invoke(Gateway.java:272)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:748)
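The Py4J message means the JVM could not find a method named `copyToLocal` with that signature on the Java object: Hadoop's `org.apache.hadoop.fs.FileSystem` exposes `copyToLocalFile(boolean delSrc, Path src, Path dst)`, not `copyToLocal`. A minimal sketch of the corrected call through the Spark JVM gateway follows (the app name is illustrative, and it assumes a live PySpark session on a cluster with S3 access configured):

```python
# Sketch: copy an S3 object to the local filesystem via Hadoop's Java API.
# Assumes a running PySpark session with the S3 connector configured.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3-to-local").getOrCreate()  # app name is illustrative

jvm = spark.sparkContext._jvm
hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()

src = jvm.org.apache.hadoop.fs.Path("s3://sandbox/metadata/2018-06-07T183915/test.jsonl")
dst = jvm.org.apache.hadoop.fs.Path("/home/hadoop/output/")

# Resolve a FileSystem against the *source* URI so the S3 connector handles it.
fs = src.getFileSystem(hadoop_conf)

# The method is copyToLocalFile, not copyToLocal; delSrc=False keeps the S3 object.
fs.copyToLocalFile(False, src, dst)
```

Note also that the first argument must be Python's `False` (capitalized), which Py4J converts to a Java `boolean`; a lowercase `false` would raise a `NameError` in Python before the JVM call is even made.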

No answers yet.
