Apache Hive 2.1.1

insrf1ej · posted 2021-05-29 in Hadoop

When connecting to Hive through Beeline, it fails to create a Spark client.
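For context, the session is opened roughly like this (host, port, and user are assumptions, not from the original post):

  beeline -u jdbc:hive2://localhost:10000 -n hadoop

The failing query and its output: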

  select count(*) from student;
  Query ID = hadoop_20180208184224_f86b5aeb-f27b-4156-bd77-0aab54c0ec67
  Total jobs = 1
  Launching Job 1 out of 1
  In order to change the average load for a reducer (in bytes):
    set hive.exec.reducers.bytes.per.reducer=<number>
  In order to limit the maximum number of reducers:
    set hive.exec.reducers.max=<number>
  In order to set a constant number of reducers:
    set mapreduce.job.reduces=<number>
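For what it's worth, those settings can also be passed per session from Beeline; a hedged example with illustrative values only:

  beeline -u jdbc:hive2://localhost:10000 -n hadoop \
      --hiveconf hive.exec.reducers.max=64 \
      -e 'select count(*) from student;'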

Running the Spark job fails with an exception:
org.apache.hadoop.hive.ql.metadata.HiveException (Failed to create Spark client.)

  FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
  Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask (state=08S01,code=1)
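This one-line SparkTask error hides the underlying cause; more detail usually appears when running the statement with console logging enabled, or in the Hive log file (the path below is the stock default, an assumption for this setup):

  # client-side verbose logging
  hive --hiveconf hive.root.logger=DEBUG,console -e 'select count(*) from student;'
  # or inspect the Hive log, by default /tmp/<user>/hive.log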

Spark (prebuilt 2.0) is installed in standalone cluster mode. My hive-site.xml is also placed in spark/conf, and the hive-* jars were removed from the jars uploaded to the HDFS path.
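A sketch of those two steps (HIVE_HOME and SPARK_HOME are assumed install paths; the HDFS target matches the spark.yarn.jars value below):

  # share Hive's config with Spark
  cp $HIVE_HOME/conf/hive-site.xml $SPARK_HOME/conf/
  # upload the Spark jars referenced by spark.yarn.jars, with the hive-* jars removed first
  hdfs dfs -mkdir -p /user/spark/spark-jars
  hdfs dfs -put $SPARK_HOME/jars/*.jar /user/spark/spark-jars/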
Updated hive-site.xml:

  <property>
    <name>hive.execution.engine</name>
    <value>spark</value>
  </property>
  <property>
    <name>hive.metastore.uris</name>
    <value/>
    <description>Thrift URI for the remote metastore. Used by metastore client to connect to remote metastore.</description>
  </property>
  <property>
    <name>spark.master</name>
    <value>yarn</value>
    <description>Spark Master URL</description>
  </property>
  <property>
    <name>spark.eventLog.enabled</name>
    <value>true</value>
    <description>Spark Event Log</description>
  </property>
  <property>
    <name>spark.eventLog.dir</name>
    <value>hdfs://10.196.220.131:9000/user/spark/eventLogging</value>
    <description>Spark event log folder</description>
  </property>
  <property>
    <name>spark.executor.memory</name>
    <value>512m</value>
    <description>Spark executor memory</description>
  </property>
  <property>
    <name>spark.serializer</name>
    <value>org.apache.spark.serializer.KryoSerializer</value>
    <description>Spark serializer</description>
  </property>
  <property>
    <name>spark.yarn.jars</name>
    <value>hdfs://10.196.220.131:9000/user/spark/spark-jars/*</value>
  </property>
  <property>
    <name>spark.submit.deployMode</name>
    <value>cluster</value>
    <description>Spark deploy mode</description>
  </property>
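One detail worth checking with the config above: Spark does not create the spark.eventLog.dir directory itself, so it has to exist on HDFS before a job runs (a minimal sketch, path taken from the config):

  hdfs dfs -mkdir -p /user/spark/eventLogging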

MySQL is used for the metastore connection (local metastore mode):

  a) ConnectionURL
     <name>javax.jdo.option.ConnectionURL</name>
     <value>jdbc:mysql://localhost/metastore_db?createDatabaseIfNotExist=true</value>
  b) ConnectionUserName
     <name>javax.jdo.option.ConnectionUserName</name>
     <value>hiveuser</value>
  c) ConnectionPassword
     <name>javax.jdo.option.ConnectionPassword</name>
     <value>xxxx</value>
  d) ConnectionDriver
     <name>javax.jdo.option.ConnectionDriverName</name>
     <value>com.mysql.jdbc.Driver</value>
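The metastore connection can be sanity-checked with Hive's bundled schematool, which prints the URL, driver, and schema version it resolves from hive-site.xml:

  $HIVE_HOME/bin/schematool -dbType mysql -info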

My yarn-site.xml:

  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>40960</value>
  </property>
  <property>
    <name>yarn.scheduler.minimum-allocation-mb</name>
    <value>2048</value>
  </property>
  <property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>8192</value>
  </property>
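Note that with yarn.scheduler.minimum-allocation-mb=2048, the 512m executor request above is rounded up to a 2 GB container by YARN. Per-node capacity can be checked with the stock YARN CLI:

  yarn node -list                # node IDs and states
  yarn node -status <node-id>    # memory/vcore capacity of one node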

Logs from my RM web UI:

  18/02/09 18:08:39 INFO spark.SecurityManager: Changing view acls groups to:
  18/02/09 18:08:39 INFO spark.SecurityManager: Changing modify acls groups to:
  18/02/09 18:08:39 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
  18/02/09 18:08:39 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
  Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/JavaSparkListener
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
    at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
    at java.lang.Class.getMethod0(Class.java:3018)
    at java.lang.Class.getMethod(Class.java:1784)
    at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:622)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:379)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:245)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:749)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:71)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:70)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:70)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:747)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
  Caused by: java.lang.ClassNotFoundException: org.apache.spark.JavaSparkListener
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 29 more
  18/02/09 18:08:39 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 16, (reason: Shutdown hook called before final status was reported.)
  18/02/09 18:08:39 INFO yarn.ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: Shutdown hook called before final status was reported.)
  18/02/09 18:08:39 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://10.196.220.131:9000/user/hadoop/.sparkStaging/application_1518178947017_0002
  18/02/09 18:08:39 INFO util.ShutdownHookManager: Shutdown hook called
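The NoClassDefFoundError in this trace suggests a version mismatch rather than a config problem: org.apache.spark.JavaSparkListener existed in Spark 1.x but was removed in Spark 2.0, so a Hive build whose Spark client was compiled against 1.x cannot start an ApplicationMaster on 2.0 jars. A quick way to confirm the class is absent from the jars that were uploaded (paths as above; no output means the class is missing):

  for j in $SPARK_HOME/jars/*.jar; do
    unzip -l "$j" | grep -q 'org/apache/spark/JavaSparkListener.class' && echo "found in $j"
  done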
