I am trying to export data from HBase to MySQL.
When I run the following export in Sqoop:
sqoop export --connect jdbc:mysql://kraptor/kraptor --username root
--password-file file:///var/lib/hadoop-hdfs/sqoop.password --table Demo_blog --update-key id --update-mode updateonly --export-dir /user/hdfs/demoblog.csv -m4 --lines-terminated-by '\n'
--input-fields-terminated-by ',' --driver com.mysql.jdbc.Driver;
everything appears to run normally, but at the end it reports that the export job failed.
Below is the console log:
hdfs@node-2:/$ sqoop export --connect jdbc:mysql://kraptor/kraptor --username root --password-file file:///var/lib/hadoop-hdfs/sqoop.password --table Demo_blog --update-key id --update-mode updateonly --export-dir /user/hdfs/demoblog.csv -m4 --lines-terminated-by '\n' --input-fields-terminated-by ',' --driver com.mysql.jdbc.Driver;
Warning: /opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/07/25 09:28:33 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.12.0
17/07/25 09:28:34 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
17/07/25 09:28:34 INFO manager.SqlManager: Using default fetchSize of 1000
17/07/25 09:28:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Demo_blog AS t WHERE 1=0
17/07/25 09:28:34 INFO tool.CodeGenTool: Beginning code generation
17/07/25 09:28:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Demo_blog AS t WHERE 1=0
17/07/25 09:28:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Demo_blog AS t WHERE 1=0
17/07/25 09:28:34 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/bin/../lib/sqoop/../hadoop-mapreduce
Note: /tmp/sqoop-hdfs/compile/953360bccb8d21472993a7ad36ca8dac/Demo_blog.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/07/25 09:28:35 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/953360bccb8d21472993a7ad36ca8dac/Demo_blog.jar
17/07/25 09:28:35 INFO mapreduce.ExportJobBase: Beginning export of Demo_blog
17/07/25 09:28:36 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/07/25 09:28:36 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Demo_blog AS t WHERE 1=0
17/07/25 09:28:36 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
17/07/25 09:28:36 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
17/07/25 09:28:36 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/07/25 09:28:36 INFO client.RMProxy: Connecting to ResourceManager at node-2.c.k-raptor.internal/10.140.0.3:8032
17/07/25 09:28:38 INFO input.FileInputFormat: Total input paths to process : 1
17/07/25 09:28:38 INFO input.FileInputFormat: Total input paths to process : 1
17/07/25 09:28:38 INFO mapreduce.JobSubmitter: number of splits:4
17/07/25 09:28:38 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1500463014055_0245
17/07/25 09:28:39 INFO impl.YarnClientImpl: Submitted application application_1500463014055_0245
17/07/25 09:28:39 INFO mapreduce.Job: The url to track the job: http://node-2.c.k-raptor.internal:8088/proxy/application_1500463014055_0245/
17/07/25 09:28:39 INFO mapreduce.Job: Running job: job_1500463014055_0245
17/07/25 09:28:45 INFO mapreduce.Job: Job job_1500463014055_0245 running in uber mode : false
17/07/25 09:28:45 INFO mapreduce.Job: map 0% reduce 0%
17/07/25 09:28:50 INFO mapreduce.Job: map 75% reduce 0%
17/07/25 09:28:51 INFO mapreduce.Job: map 100% reduce 0%
17/07/25 09:28:51 INFO mapreduce.Job: Job job_1500463014055_0245 failed with state FAILED due to: Task failed task_1500463014055_0245_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
17/07/25 09:28:51 INFO mapreduce.Job: Counters: 32
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=460611
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=723
                HDFS: Number of bytes written=0
                HDFS: Number of read operations=12
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=0
        Job Counters
                Failed map tasks=1
                Launched map tasks=4
                Data-local map tasks=1
                Rack-local map tasks=3
                Total time spent by all maps in occupied slots (ms)=12088
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=12088
                Total vcore-milliseconds taken by all map tasks=12088
                Total megabyte-milliseconds taken by all map tasks=12378112
        Map-Reduce Framework
                Map input records=0
                Map output records=0
                Input split bytes=426
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=59
                CPU time spent (ms)=1940
                Physical memory (bytes) snapshot=1031741440
                Virtual memory (bytes) snapshot=4260794368
                Total committed heap usage (bytes)=2472542208
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=0
17/07/25 09:28:51 INFO mapreduce.ExportJobBase: Transferred 723 bytes in 14.8595 seconds (48.6558 bytes/sec)
17/07/25 09:28:51 INFO mapreduce.ExportJobBase: Exported 0 records.
17/07/25 09:28:51 ERROR tool.ExportTool: Error during export:
Export job failed!
at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:439)
at org.apache.sqoop.manager.SqlManager.updateTable(SqlManager.java:965)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:70)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
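What I have gathered so far: with 0 records exported and one failed map task, the driver-side log above does not show the real error; the per-task log (e.g. `yarn logs -applicationId application_1500463014055_0245`) should, and in Sqoop exports that error is often a parse failure on a malformed input row, such as a column-count mismatch. As a quick sanity check I can sketch something like the following (the sample rows and the expected count of 3 columns for Demo_blog are assumptions for illustration, not my actual data):

```shell
# Flag rows whose comma-separated field count differs from the expected
# column count (3 here is an assumed column count -- substitute the real
# demoblog.csv contents and the real count for the Demo_blog table).
printf '1,alice,hello\n2,bob\n3,carol,world\n' > demoblog_sample.csv
awk -F',' 'NF != 3 { print "bad row " NR ": " $0 }' demoblog_sample.csv
# prints: bad row 2: 2,bob
```

Against the real file, the same awk can be run over `hdfs dfs -cat /user/hdfs/demoblog.csv`; any flagged row would presumably make the corresponding map task fail.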
If anyone can help, please do. Thank you.