Why is SparkHadoopUtil inaccessible here when it was accessible in earlier versions of Spark, even though it is imported?
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.2
      /_/
Using Scala version 2.12.10 (OpenJDK 64-Bit Server VM, Java 1.8.0_282)
Type in expressions to have them evaluated.
Type :help for more information.
scala> import org.apache.spark.deploy.SparkHadoopUtil
import org.apache.spark.deploy.SparkHadoopUtil
scala> import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.conf.Configuration
scala>
scala>
scala> val hadoopConf: Configuration = SparkHadoopUtil.get.conf
<console>:25: error: object SparkHadoopUtil in package deploy cannot be accessed in package org.apache.spark.deploy
       val hadoopConf: Configuration = SparkHadoopUtil.get.conf
                                       ^
scala>
1 Answer
That is because the SparkHadoopUtil class was made private to the spark package in Spark 3, so it can no longer be accessed from user code. Here is how the declaration differs between Spark 2.4 and Spark 3.0.

Spark 2.4:
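In Spark 2.4 the class and its companion object are public, with the class marked as a developer API. A condensed sketch of the declarations in org/apache/spark/deploy/SparkHadoopUtil.scala (members elided):

// Spark 2.4.x: public, annotated as a developer API
@DeveloperApi
class SparkHadoopUtil extends Logging { /* members elided */ }

object SparkHadoopUtil { /* exposes the `get` accessor used above */ }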
Spark 3.0:
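In Spark 3.0 both declarations gained the private[spark] modifier, which restricts them to the org.apache.spark package and produces exactly the "cannot be accessed" error shown above. Again a condensed sketch of the same file (members elided):

// Spark 3.0.x: package-private, visible only inside org.apache.spark
private[spark] class SparkHadoopUtil extends Logging { /* members elided */ }

private[spark] object SparkHadoopUtil { /* members elided */ }

If all you need is the Hadoop configuration, you can avoid SparkHadoopUtil entirely: SparkContext exposes a public hadoopConfiguration accessor, which is built the same way (it also folds any spark.hadoop.* settings from the SparkConf into the returned Configuration). For example, in the same spark-shell session:

scala> import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.conf.Configuration

scala> val hadoopConf: Configuration = spark.sparkContext.hadoopConfiguration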