Can I spark-submit an application jar directly from Maven / JFrog Artifactory?

Posted by 11dmarpk on 2021-05-27 in Spark

I am trying to spark-submit an application jar that is hosted in a remote repository (rather than locally, on HDFS, or on S3). Below is my attempt to run SparkPi directly from Maven:

spark-submit \
  --class org.apache.examples.SparkPi \
  --repositories https://mvnrepository.com/repos/central,https://repo.eclipse.org/content/repositories/paho-releases \
  --packages org.apache.spark:spark-examples_2.10:0.9.0-incubating \
  --jars https://repo1.maven.org/maven2/org/apache/spark/spark-examples_2.10/0.9.0-incubating/spark-examples_2.10-0.9.0-incubating.jar \
  spark-examples_2.10-0.9.0-incubating.jar 10000

It seems to neither succeed nor fail outright; here is the output:

---------------------------------------------------------------------
|                  |            modules            ||   artifacts   |
|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
|      default     |  123  |   0   |   0   |   21  ||  102  |   0   |
---------------------------------------------------------------------

:: retrieving :: org.apache.spark#spark-submit-parent-a0c4af8a-2537-45f2-a26d-d9d697abfb2b

confs: [default]
    0 artifacts copied, 102 already retrieved (0kB/50ms)
20/07/17 09:53:35 WARN Utils: Your hostname, ****.local resolves to a loopback address: 127.0.0.1; using **** instead (on interface en0)
20/07/17 09:53:35 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/07/17 09:53:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
log4j:WARN No appenders could be found for logger (org.apache.spark.deploy.SparkSubmit$$anon$2).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

I may be wrong in assuming this should work at all, but I would appreciate any feedback.
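
For reference, the variant I was planning to try next is below. This is only a minimal sketch under a few assumptions on my part: that spark-submit can download an http(s) application jar itself in this mode, that the SparkPi class in that artifact is the fully qualified org.apache.spark.examples.SparkPi, and that Maven Central should be addressed via https://repo1.maven.org/maven2 rather than the mvnrepository.com search site. The idea is to let --packages resolve the example artifact and its dependencies from the listed repository, and to pass the application jar as a direct HTTPS URL instead of a local path:

# minimal sketch, not verified to work -- see assumptions above
spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --repositories https://repo1.maven.org/maven2 \
  --packages org.apache.spark:spark-examples_2.10:0.9.0-incubating \
  https://repo1.maven.org/maven2/org/apache/spark/spark-examples_2.10/0.9.0-incubating/spark-examples_2.10-0.9.0-incubating.jar \
  10000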
