Sqoop import into Hive throws error org.apache.sqoop.hive.HiveConfig?

qyuhtwio asked on 2021-06-03 in Sqoop

I have installed Hue 3.10 on Ambari HDP 2.5.0 and fully configured hue.ini.
My problem is that when syncing data from MySQL to Hive via Sqoop, it throws an exception:

```
[main] ERROR org.apache.sqoop.hive.HiveConfig - Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
[main] ERROR org.apache.sqoop.hive.HiveConfig - Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
[main] ERROR org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
    at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
    at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:397)
    at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:384)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:342)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:246)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:524)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
    at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:202)
    at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:182)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:51)
    at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:48)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:242)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
```

However, if I run the same Sqoop script directly from the command line, it works fine!
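(For reference, the working command-line run is of this general shape; the connection string, credentials, and table names below are placeholders, not taken from the original post:)

```
sqoop import \
  --connect jdbc:mysql://mysql-host:3306/testdb \
  --username sqoop_user \
  --password-file /user/hue/sqoop.password \
  --table orders \
  --hive-import \
  --hive-database default \
  --hive-table orders \
  --num-mappers 1
```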
I added the environment variable HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/hdp/current/hive-client/lib to /etc/profile, but it still does not work. I have tried several times to solve this on my own and failed.
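That export is not expected to help: Oozie launches the Sqoop action inside a YARN container, which never sources /etc/profile, so the variable never reaches the launcher JVM. The usual route is to put the Hive jars on the action's classpath via the Oozie share lib, for example in the workflow's job.properties (both property names are standard Oozie settings; the value list is an illustrative assumption):

```
# job.properties (sketch)
oozie.use.system.libpath=true
# include the hive share-lib directory alongside sqoop for this action type
oozie.action.sharelib.for.sqoop=sqoop,hive
```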
The script /usr/hdp/2.5.0.0-1245/hive/bin/hive is shown below. It seems ${HADOOP_CLASSPATH} points to /usr/hdp/2.5.0.0-1245/atlas/hook/hive/*?

```
#!/bin/bash

if [ -d "/usr/hdp/2.5.0.0-1245/atlas/hook/hive" ]; then
  if [ -z "${HADOOP_CLASSPATH}" ]; then
    export HADOOP_CLASSPATH=/usr/hdp/2.5.0.0-1245/atlas/hook/hive/*
  else
    export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:/usr/hdp/2.5.0.0-1245/atlas/hook/hive/*
  fi
fi

BIGTOP_DEFAULTS_DIR=${BIGTOP_DEFAULTS_DIR-/etc/default}
[ -n "${BIGTOP_DEFAULTS_DIR}" -a -r ${BIGTOP_DEFAULTS_DIR}/hbase ] && . ${BIGTOP_DEFAULTS_DIR}/hbase

export HIVE_HOME=${HIVE_HOME:-/usr/hdp/2.5.0.0-1245/hive}
export HADOOP_HOME=${HADOOP_HOME:-/usr/hdp/2.5.0.0-1245/hadoop}
export ATLAS_HOME=${ATLAS_HOME:-/usr/hdp/2.5.0.0-1245/atlas}

HCATALOG_JAR_PATH=/usr/hdp/2.5.0.0-1245/hive-hcatalog/share/hcatalog/hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:/usr/hdp/2.5.0.0-1245/hive-hcatalog/share/hcatalog/hive-hcatalog-server-extensions-1.2.1000.2.5.0.0-1245.jar:/usr/hdp/2.5.0.0-1245/hive-hcatalog/share/webhcat/java-client/hive-webhcat-java-client-1.2.1000.2.5.0.0-1245.jar

if [ -z "${HADOOP_CLASSPATH}" ]; then
  export HADOOP_CLASSPATH=${HCATALOG_JAR_PATH}
else
  export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:${HCATALOG_JAR_PATH}
fi

exec "${HIVE_HOME}/bin/hive.distro" "$@"
How can I solve this problem?

avwztpqn1#

For me, this problem appeared in the Ambari workflow editor. To fix it, create a symlink on every Sqoop client node to hive-exec.jar in the Hive lib directory. Then put hive-exec.jar into the Oozie share-lib folder on HDFS.

```
# On each Sqoop client node, link hive-exec.jar from the Hive lib directory
su root
cd /usr/hdp/current/sqoop-client/
ln -s /usr/hdp/current/hive-client/lib/hive-exec.jar hive-exec.jar
cp hive-exec.jar lib/

# Upload the jar into the Oozie share lib on HDFS
# (use the full path: `su -l` starts in the hdfs user's home directory)
su -l hdfs
hdfs dfs -put /usr/hdp/current/sqoop-client/hive-exec.jar /user/oozie/share/lib/sqoop
hdfs dfs -put /usr/hdp/current/sqoop-client/hive-exec.jar /user/oozie/share/lib/lib_20161117191926/sqoop
```
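To confirm the jar landed and have a running Oozie server pick it up without a restart, something like the following should work (the Oozie server URL is a placeholder):

```
# Verify the jar is now in the Sqoop share lib
hdfs dfs -ls /user/oozie/share/lib/sqoop | grep hive-exec
# Refresh the server's view of the share lib
oozie admin -oozie http://oozie-host:11000/oozie -sharelibupdate
```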
