Spark error: Failed to add file:/home/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar to Spark environment

iibxawm4 · posted 2021-05-16 in Spark

I am running Spark 2.4.7 on Ubuntu 18.04. When I run a spark-submit command, either locally or on an AWS EMR instance, I get the following error:

20/12/01 02:25:30 ERROR SparkContext: Failed to add file:/home/reyaz/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar to Spark environment
java.io.FileNotFoundException: Jar /home/reyaz/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar not found
    at org.apache.spark.SparkContext.addJarFile$1(SparkContext.scala:1838)
    at org.apache.spark.SparkContext.addJar(SparkContext.scala:1868)
    at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:458)
    at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:458)
    at scala.collection.immutable.List.foreach(List.scala:392)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:458)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:930)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:921)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
    at com.deequTest.deequRawDataCLeanedValidation$.main(deequRawDataCLeanedValidation.scala:46)
    at com.deequTest.deequRawDataCLeanedValidation.main(deequRawDataCLeanedValidation.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:564)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
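
The FileNotFoundException suggests that spark-submit's dependency resolution recorded the jar but the file itself is missing from the local Ivy cache. A quick way to verify this (the path is taken from the log above; clearing the cached artifacts to force re-resolution is an assumption, not a confirmed fix):

    ls -l /home/reyaz/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar
    # If the file is missing or empty, removing the cached artifacts makes
    # spark-submit re-download them on the next run:
    rm -rf /home/reyaz/.ivy2/cache/net.sourceforge.f2j
    rm -f /home/reyaz/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar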

I tried adding the SourceForge dependency to my project, but still got no results. I am passing the packages with:

--packages com.amazon.deequ:deequ:1.0.4,org.apache.spark:spark-avro_2.11:2.4.0

Spark's dependency resolution also reports

found net.sourceforge.f2j#arpack_combined_all;0.1

but the error above still occurs.
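
For reference, the full spark-submit invocation described above would look roughly like this (a sketch: the main class is taken from the stack trace, while the master and the application jar path are assumptions):

    spark-submit \
      --class com.deequTest.deequRawDataCLeanedValidation \
      --master local[*] \
      --packages com.amazon.deequ:deequ:1.0.4,org.apache.spark:spark-avro_2.11:2.4.0 \
      path/to/application.jar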

No answers yet.
