Our project uses Gradle and Scala to build a Spark application. I added the GCP KMS library, and now when the job runs on Dataproc it fails with a missing Guava method error:
java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument
I shaded the Google libraries as recommended in this guide: https://cloud.google.com/blog/products/data-analytics/managing-java-dependencies-apache-spark-applications-cloud-dataproc
My shadowJar definition in the Gradle build:
shadowJar {
    zip64 true
    relocate 'com.google', 'shadow.com.google'
    relocate 'com.google.protobuf', 'shadow.com.google.protobuf'
    relocate 'google.cloud', 'shadow.google.cloud'
    exclude 'META-INF/**'
    exclude "LICENSE*"
    mergeServiceFiles()
    archiveFileName = "myjar"
}
Running `jar tf` on the compiled fat jar shows the Guava classes, including Preconditions (which contains checkArgument), under the `shadow` prefix.
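For reference, the check was along these lines (the jar path is an assumption based on the `archiveFileName = "myjar"` setting above; adjust it to your actual build output directory):

```shell
# List the fat jar's contents and confirm Guava's Preconditions
# class was relocated under the shadow prefix
jar tf build/libs/myjar | grep 'shadow/com/google/common/base/Preconditions'
```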
But the job still fails when submitted to Dataproc with spark submit, and at runtime it seems that Hadoop's older Guava version is still being picked up. Below is the stack trace, starting from the KmsSymmetric class that calls GCP KMS decrypt:
Exception in thread "main" java.lang.NoSuchMethodError: shadow.com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;CLjava/lang/Object;)V
at io.grpc.Metadata$Key.validateName(Metadata.java:629)
at io.grpc.Metadata$Key.<init>(Metadata.java:637)
at io.grpc.Metadata$Key.<init>(Metadata.java:567)
at io.grpc.Metadata$AsciiKey.<init>(Metadata.java:742)
at io.grpc.Metadata$AsciiKey.<init>(Metadata.java:737)
at io.grpc.Metadata$Key.of(Metadata.java:593)
at io.grpc.Metadata$Key.of(Metadata.java:589)
at shadow.com.google.api.gax.grpc.GrpcHeaderInterceptor.<init>(GrpcHeaderInterceptor.java:60)
at shadow.com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:221)
at shadow.com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:194)
at shadow.com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:186)
at shadow.com.google.api.gax.rpc.ClientContext.create(ClientContext.java:155)
at shadow.com.google.cloud.kms.v1.stub.GrpcKeyManagementServiceStub.create(GrpcKeyManagementServiceStub.java:370)
at shadow.com.google.cloud.kms.v1.stub.KeyManagementServiceStubSettings.createStub(KeyManagementServiceStubSettings.java:333)
at shadow.com.google.cloud.kms.v1.KeyManagementServiceClient.<init>(KeyManagementServiceClient.java:155)
at shadow.com.google.cloud.kms.v1.KeyManagementServiceClient.create(KeyManagementServiceClient.java:136)
at shadow.com.google.cloud.kms.v1.KeyManagementServiceClient.create(KeyManagementServiceClient.java:127)
at mycompany.my.class.path.KmsSymmetric.decrypt(KmsSymmetric.scala:31)
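The stack trace shows grpc looking for the four-argument overload `checkArgument(boolean, String, char, Object)` on the relocated class. One way to see which overloads the shaded Preconditions actually bundles is to inspect the fat jar with javap (the relocated class name is taken from the stack trace; the jar path is an assumption from the build config):

```shell
# Print the public method signatures of the relocated Preconditions class;
# if the (boolean, String, char, Object) overload is absent, an older
# Guava was bundled into the shaded jar
javap -classpath build/libs/myjar shadow.com.google.common.base.Preconditions \
    | grep checkArgument
```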
My Dataproc submit command is:
gcloud dataproc jobs submit spark \
    --cluster=${CLUSTER_NAME} \
    --project ${PROJECT_ID} \
    --region=${REGION} \
    --jars=gs://${APP_BUCKET}/${JAR} \
    --class=${CLASS} \
    --app args --arg1 val1 etc
I am using Dataproc image version 1.4.
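One experiment I have not ruled out, to test whether the cluster's Hadoop/Spark jars are winning over the application's shaded classes, is submitting with the standard Spark user-classpath-first properties (a sketch only; these are standard Spark configuration keys passed via the gcloud `--properties` flag, not something from my current setup, and they are untested here):

```shell
# Hypothetical variant of the submit command: ask Spark to prefer the
# application jar's classes over the cluster-provided jars
gcloud dataproc jobs submit spark \
    --cluster=${CLUSTER_NAME} \
    --project ${PROJECT_ID} \
    --region=${REGION} \
    --jars=gs://${APP_BUCKET}/${JAR} \
    --class=${CLASS} \
    --properties=spark.driver.userClassPathFirst=true,spark.executor.userClassPathFirst=true
```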
What am I missing?