Using HiveContext, I cannot run SQL that loads data into a partitioned table, even though I have enabled dynamic partitioning. I still get an error.
SQL statement:
insert overwrite table target_table PARTITION (column1, column2) select *, deletion_flag, '2018-12-23' as date_feed from source_table
HiveContext setConf:
hiveContext.setConf("hive.exec.dynamic.partition","true")
hiveContext.setConf("hive.exec.max.dynamic.partitions","2048")
hiveContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
Error:
org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions(org.apache.hadoop.fs.Path, java.lang.String, java.util.Map, boolean, int, boolean, boolean, boolean)
Maven dependencies:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.6.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>1.6.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>1.1.0</version>
</dependency>
Thanks.
1 Answer
I resolved this after getting all the Maven dependencies from the Cloudera repo, so the Hive jars on the client classpath match the Hive build running on the cluster.
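For reference, a sketch of what that pom.xml change can look like. The repository URL is Cloudera's public artifact repo; the -cdh5.7.0 version suffixes are an assumed example and must be replaced with the CDH release your cluster actually runs:

<repositories>
    <repository>
        <id>cloudera</id>
        <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
</repositories>

<!-- Versions below assume CDH 5.7.0 purely as an example -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>1.6.0-cdh5.7.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>1.1.0-cdh5.7.0</version>
</dependency>

Matching hive-exec to the cluster's Hive build keeps the Hive.loadDynamicPartitions method signature consistent between client and cluster, which is presumably the mismatch the error above points to.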