Spark 1.6 HiveContext setConf issue

gkl3eglg · posted 2021-06-26 in Hive

In HiveContext, I am unable to run the SQL that loads data into a partitioned table. I have set dynamic partition = true, but I still get an error.

SQL statement:

    insert overwrite table target_table PARTITION (column1, column2) select *, deletion_flag, '2018-12-23' as date_feed from source_table

HiveContext setConf:

hiveContext.setConf("hive.exec.dynamic.partition","true")
  hiveContext.setConf("hive.exec.max.dynamic.partitions","2048")
  hiveContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

Error:

    org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions(org.apache.hadoop.fs.Path, java.lang.String, java.util.Map, boolean, int, boolean, boolean, boolean)
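For reference, a minimal sketch of the full flow on Spark 1.6 (table and column names are taken from the statement above; the app name and SparkContext setup are assumptions, not part of the original code):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object DynamicPartitionLoad {
      def main(args: Array[String]): Unit = {
        // Assumed app name; any SparkContext/HiveContext setup works the same way on 1.6.
        val sc = new SparkContext(new SparkConf().setAppName("DynamicPartitionLoad"))
        val hiveContext = new HiveContext(sc)

        // Enable dynamic partitioning before running the INSERT OVERWRITE.
        hiveContext.setConf("hive.exec.dynamic.partition", "true")
        hiveContext.setConf("hive.exec.max.dynamic.partitions", "2048")
        hiveContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

        // Hive maps the last SELECT expressions to the partition columns (column1, column2).
        hiveContext.sql(
          """insert overwrite table target_table PARTITION (column1, column2)
            |select *, deletion_flag, '2018-12-23' as date_feed from source_table""".stripMargin)

        sc.stop()
      }
    }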
Maven dependencies:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.6.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>1.6.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-exec</artifactId>
        <version>1.1.0</version>
    </dependency>

Thanks.

brjng4g3 · Answer 1

I resolved the issue after pulling all the Maven dependencies from the Cloudera repo.

<dependencies>
    <!-- Scala and Spark dependencies -->

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.0-cdh5.9.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.6.0-cdh5.9.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>1.6.0-cdh5.9.2</version>
    </dependency>
     <!-- https://mvnrepository.com/artifact/org.apache.hive/hive-exec -->
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-exec</artifactId>
        <version>1.1.0-cdh5.9.2</version>
    </dependency>
    <dependency>
        <groupId>org.scalatest</groupId>
        <artifactId>scalatest_2.10</artifactId>
        <version>3.0.0-SNAP4</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.11</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>1.4.1</version>
    </dependency>
    <dependency>
        <groupId>commons-dbcp</groupId>
        <artifactId>commons-dbcp</artifactId>
        <version>1.2.2</version>
    </dependency>
    <dependency>
        <groupId>com.databricks</groupId>
        <artifactId>spark-csv_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>com.databricks</groupId>
        <artifactId>spark-xml_2.10</artifactId>
        <version>0.2.0</version>
    </dependency>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-java-sdk</artifactId>
        <version>1.0.12</version>
    </dependency>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-java-sdk-s3</artifactId>
        <version>1.11.172</version>
    </dependency>
    <dependency>
        <groupId>com.github.scopt</groupId>
        <artifactId>scopt_2.10</artifactId>
        <version>3.2.0</version>
    </dependency>
    <dependency>
        <groupId>javax.mail</groupId>
        <artifactId>mail</artifactId>
        <version>1.4</version>
    </dependency>
</dependencies>
<repositories>
    <repository>
        <id>maven-hadoop</id>
        <name>Hadoop Releases</name>
        <url>https://repository.cloudera.com/content/repositories/releases/</url>
    </repository>
    <repository>
        <id>cloudera-repos</id>
        <name>Cloudera Repos</name>
        <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
</repositories>
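If the error persists after updating the POM, an older hive-exec is usually still on the runtime classpath. A quick, generic JVM check (not specific to this project) to see which jar the Hive class is actually loaded from:

    // Print the jar that provides org.apache.hadoop.hive.ql.metadata.Hive at runtime,
    // to confirm it is the CDH (1.1.0-cdh5.9.2) hive-exec artifact.
    val hiveClass = Class.forName("org.apache.hadoop.hive.ql.metadata.Hive")
    val source = Option(hiveClass.getProtectionDomain.getCodeSource)
    println(source.map(_.getLocation.toString).getOrElse("(no code source found)"))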
