Number of partitions scanned (=32767) exceeds limit

7vhp5slm · posted 2021-06-02 in Hadoop

I am trying to stream data into Hive using the Eel SDK (eel-sdk).

// eel-sdk imports (indicative; exact package paths may differ across eel-sdk versions)
import io.eels.schema.{Field, StringType, StructType}
import io.eels.datastream.DataStream
import io.eels.component.hive.{HiveOps, HiveSink}

// assumed shape of the test records, matching the schema below
case class TestData(name: String, pk: String, pk1: String)

val sink = HiveSink(testDBName, testTableName)
  .withPartitionStrategy(new DynamicPartitionStrategy)

val hiveOps: HiveOps = ...
val schema = new StructType(Vector(
  Field("name", StringType),
  Field("pk", StringType),
  Field("pk1", StringType)
))

hiveOps.createTable(
  testDBName,
  testTableName,
  schema,
  partitionKeys = Seq("pk", "pk1"),
  dialect = ParquetHiveDialect(),
  tableType = TableType.EXTERNAL_TABLE,
  overwrite = true
)

// 100 rows, all sharing the same partition values (pk = "42", pk1 = "apple")
val items = Seq.tabulate(100)(i => TestData(i.toString, "42", "apple"))
val ds = DataStream(items)
ds.to(sink)

I get the error: Number of partitions scanned (=32767) exceeds limit (=10000). 32767 is 2^15 - 1 (Short.MaxValue), but I still can't figure out what is actually going wrong. Any ideas?
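A minimal sketch (not part of the original post) that may help narrow this down: 32767 is Short.MaxValue, and the Hive metastore Thrift API expresses "maximum partitions to return" as a 16-bit short, so the error may reflect an "all partitions" request capped at Short.MaxValue rather than 32767 real partitions. Assuming the standard HiveMetaStoreClient is on the classpath, you could count what the metastore actually holds for the table; the database and table names below are placeholders for the test values above.

import org.apache.hadoop.hive.conf.HiveConf
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient

object CountPartitions {
  def main(args: Array[String]): Unit = {
    // HiveConf picks up hive-site.xml from the classpath
    val client = new HiveMetaStoreClient(new HiveConf())
    try {
      // max_parts is a Thrift i16; Short.MaxValue (32767) conventionally means "return everything"
      val names = client.listPartitionNames("testdb", "testtable", Short.MaxValue)
      println(s"partitions in metastore: ${names.size()}")
    } finally client.close()
  }
}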


ejk8hzay · answer #1

This is the same class of error as "spark + hive: number of partitions scanned exceeds the limit (=4000)"; the workaround there was:

--conf "spark.sql.hive.convertMetastoreOrc=false"
--conf "spark.sql.hive.metastorePartitionPruning=false"
