Apache Spark jars not being downloaded by Gradle

nue99wik · posted 2021-07-14 in Spark

Team,
I am still new to the Gradle tool and am trying to learn Apache Spark. When I run the Gradle command, it does not download the required Spark jars:
```
c:\users\documents\study\sparktutorial master>gradlew idea
:ideaModule
Could not resolve: org.apache.spark:spark-core_2.10:2.0.0
Could not resolve: org.apache.spark:spark-sql_2.10:2.1.0
:ideaProject
:ideaWorkspace
:idea

BUILD SUCCESSFUL

Total time: 2.769 secs
```

build.gradle:

```
group 'jameslee'
version '1.0-SNAPSHOT'

apply plugin: 'java'
apply plugin: 'idea'
apply plugin: 'eclipse'

sourceCompatibility = 1.8

idea {
    project {
        jdkName = '1.8'
        languageLevel = '1.8'
    }
}

repositories {
    mavenCentral()
}

dependencies {
    compile group: 'org.apache.spark', name: 'spark-core_2.10', version: '2.0.0'
    compile group: 'org.apache.spark', name: 'spark-sql_2.10', version: '2.1.0'
}

jar {
    zip64 true
    archiveName = "StackOverFlowSurvey-spark.jar"
    from {
        configurations.compile.collect {
            it.isDirectory() ? it : zipTree(it)
        }
    }
    manifest {
        attributes 'Main-Class': 'com.sparkTutorial.sparkSql.StackOverFlowSurvey'
    }

    exclude 'META-INF/*.RSA', 'META-INF/*.SF', 'META-INF/*.DSA'
}
```
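For reference, a minimal stand-alone sketch that assumes only the two Spark coordinates and the `mavenCentral()` repository from the script above. Running Gradle's built-in dependencies report against it (for example `gradlew dependencies --configuration compile`, optionally with `--refresh-dependencies` or `--info`) shows whether the artifacts resolve from Maven Central at all, independent of the `idea` and `jar` configuration:

```
// Stripped-down build used only to check dependency resolution.
// The coordinates are the same ones declared in the build.gradle above.
apply plugin: 'java'

repositories {
    mavenCentral()
}

dependencies {
    compile 'org.apache.spark:spark-core_2.10:2.0.0'
    compile 'org.apache.spark:spark-sql_2.10:2.1.0'
}
```

If these coordinates resolve here but not in the full build, the problem is likely in the project configuration; if they fail here too, it points at network or repository access rather than the build script.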

No answers yet.
