Spark program fails when writing a text file in local mode on Windows

mccptt67 · posted 2021-06-01 in Hadoop

I'm running a simple Spark program in IntelliJ on Windows. When I run the code in debug mode, I can see that it fails while trying to execute the following command:

D:\winutils\bin\winutils.exe chmod 0644 C:\Users\himanshu\git\SparkRDDs\target\output\_temporary\0\_temporary\attempt_20170529130949_0027_m_000000_405\part-00000
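For context, when Spark writes to the local file system on Windows, Hadoop's RawLocalFileSystem shells out to winutils.exe to set file permissions (visible further down in the stack trace). A minimal sketch of the setup this implies, assuming winutils is wired up via a system property (the D:\winutils path is taken from the command above; how this project actually configures it is not shown in the question):

// Assumption: point Hadoop at the winutils install before any SparkContext
// is created; D:\winutils must contain bin\winutils.exe.
System.setProperty("hadoop.home.dir", "D:\\winutils")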

This is the line of code where it fails:

join.rdd.saveAsTextFile("file:///Users/himanshu/git/SparkRDDs/target/output/")
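Here `join` is presumably a DataFrame or pair RDD produced earlier. For reference, a self-contained local-mode sketch that exercises the same write path might look like the following; this is a hypothetical stand-in, not the actual code, and the object name, sample RDDs, and master setting are all assumptions:

import org.apache.spark.{SparkConf, SparkContext}

object SaveTextFileRepro {
  def main(args: Array[String]): Unit = {
    // Assumed winutils location, taken from the failing command above.
    System.setProperty("hadoop.home.dir", "D:\\winutils")

    val conf = new SparkConf().setAppName("SaveTextFileRepro").setMaster("local[*]")
    val sc   = new SparkContext(conf)

    // Hypothetical stand-in for the question's `join` result.
    val left   = sc.parallelize(Seq((1, "a"), (2, "b")))
    val right  = sc.parallelize(Seq((1, "x"), (2, "y")))
    val joined = left.join(right)

    // saveAsTextFile goes through Hadoop's TextOutputFormat, which is where
    // the stack trace below shows winutils.exe being invoked for chmod.
    joined.saveAsTextFile("file:///Users/himanshu/git/SparkRDDs/target/output/")

    sc.stop()
  }
}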

Stack trace below:
17/05/29 14:07:48 ERROR Executor: Exception in task 0.0 in stage 27.0 (TID 410)
org.apache.hadoop.util.Shell$ExitCodeException:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)
    at org.apache.hadoop.util.Shell.run(Shell.java:379)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
    at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:905)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:798)
    at org.apache.hadoop.mapred.TextOutputFormat.getRecordWriter(TextOutputFormat.java:123)
    at org.apache.spark.SparkHadoopWriter.open(SparkHadoopWriter.scala:90)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13.apply(PairRDDFunctions.scala:1206)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13.apply(PairRDDFunctions.scala:1197)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
    at org.apache.spark.scheduler.Task.run(Task.scala:99)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Surprisingly, the program runs just fine in Eclipse.

