Sqoop import reports an error, then succeeds

bakd9h0s · posted 2021-05-29 in Hadoop

When I run a Sqoop import I get an error message, yet a few seconds later the job completes successfully and the table records are imported from SQL Server into Hive. The record counts in SQL Server and Hive also match.
I cannot work out what caused the error. I am sharing the log below; can anyone explain the reason behind this? For security, I have masked the IP address.

sqoop import \
  --connect "jdbc:sqlserver://10.128.**.***:1433;database=COCO_Pilot" \
  --username sa --password Passw0rd \
  --table RO_Transaction \
  --hive-import --create-hive-table \
  --hive-table coco_pilot.ro_transaction \
  --warehouse-dir /user/landing \
  --hive-overwrite -m 1
Warning: /usr/hdp/2.6.2.0-205/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/hdp/2.6.2.0-205/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/12/06 19:54:45 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.2.0-205
17/12/06 19:54:45 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/12/06 19:54:45 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
17/12/06 19:54:45 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
17/12/06 19:54:45 INFO manager.SqlManager: Using default fetchSize of 1000
17/12/06 19:54:45 INFO tool.CodeGenTool: Beginning code generation
17/12/06 19:54:46 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [RO_Transaction] AS t WHERE 1=0
17/12/06 19:54:46 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.6.2.0-205/hadoop-mapreduce
Note: /tmp/sqoop-hdfs/compile/c592fdb7dc832a5adea6b13f299abeeb/RO_Transaction.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/12/06 19:54:48 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/c592fdb7dc832a5adea6b13f299abeeb/RO_Transaction.jar
17/12/06 19:54:48 INFO mapreduce.ImportJobBase: Beginning import of RO_Transaction
17/12/06 19:54:49 INFO client.RMProxy: Connecting to ResourceManager at slave1.snads.com/10.20.30.5:8050
17/12/06 19:54:49 INFO client.AHSProxy: Connecting to Application History server at slave1.snads.com/10.20.30.5:10200
17/12/06 19:54:52 INFO db.DBInputFormat: Using read commited transaction isolation
17/12/06 19:54:52 INFO mapreduce.JobSubmitter: number of splits:1
17/12/06 19:54:52 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1512027414889_0102
17/12/06 19:54:53 INFO impl.YarnClientImpl: Submitted application application_1512027414889_0102
17/12/06 19:54:53 INFO mapreduce.Job: The url to track the job: http://slave1.snads.com:8088/proxy/application_1512027414889_0102/
17/12/06 19:54:53 INFO mapreduce.Job: Running job: job_1512027414889_0102
17/12/06 19:55:00 INFO mapreduce.Job: Job job_1512027414889_0102 running in uber mode : false
17/12/06 19:55:00 INFO mapreduce.Job:  map 0% reduce 0%
17/12/06 19:55:19 INFO mapreduce.Job: Task Id : attempt_1512027414889_0102_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host 10.128.**.***, port 1433 has failed. Error: "connect timed out. Verify the connection properties, check that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port, and that no firewall is blocking TCP connections to the port.".
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:749)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host 10.128.**.***, port 1433 has failed. Error: "connect timed out. Verify the connection properties, check that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port, and that no firewall is blocking TCP connections to the port.".
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:220)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:165)
        ... 9 more
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host 10.128.**.***, port 1433 has failed. Error: "connect timed out. Verify the connection properties, check that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port, and that no firewall is blocking TCP connections to the port.".
        at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:170)
        at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:1049)
        at com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:833)
        at com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:716)
        at com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:841)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:247)
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:213)
        ... 10 more

17/12/06 19:55:40 INFO mapreduce.Job:  map 100% reduce 0%
17/12/06 19:55:41 INFO mapreduce.Job: Job job_1512027414889_0102 completed successfully
17/12/06 19:55:41 INFO mapreduce.Job: Counters: 31
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=165681
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=87
                HDFS: Number of bytes written=199193962
                HDFS: Number of read operations=4
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters
                Failed map tasks=1
                Launched map tasks=2
                Other local map tasks=2
                Total time spent by all maps in occupied slots (ms)=475111
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=36547
                Total vcore-milliseconds taken by all map tasks=36547
                Total megabyte-milliseconds taken by all map tasks=486513664
        Map-Reduce Framework
                Map input records=1827459
                Map output records=1827459
                Input split bytes=87
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=136
                CPU time spent (ms)=21330
                Physical memory (bytes) snapshot=1431515136
                Virtual memory (bytes) snapshot=13562888192
                Total committed heap usage (bytes)=1369964544
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=199193962
17/12/06 19:55:41 INFO mapreduce.ImportJobBase: Transferred 189.9662 MB in 52.4188 seconds (3.624 MB/sec)
17/12/06 19:55:41 INFO mapreduce.ImportJobBase: Retrieved 1827459 records.
17/12/06 19:55:41 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners
17/12/06 19:55:41 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [RO_Transaction] AS t WHERE 1=0
17/12/06 19:55:42 WARN hive.TableDefWriter: Column id had to be cast to a less precise type in Hive
17/12/06 19:55:42 WARN hive.TableDefWriter: Column Transaction_Date had to be cast to a less precise type in Hive
17/12/06 19:55:42 WARN hive.TableDefWriter: Column Pump_No had to be cast to a less precise type in Hive
17/12/06 19:55:42 WARN hive.TableDefWriter: Column Nozzle_No had to be cast to a less precise type in Hive
17/12/06 19:55:42 WARN hive.TableDefWriter: Column product had to be cast to a less precise type in Hive
17/12/06 19:55:42 WARN hive.TableDefWriter: Column unit_price had to be cast to a less precise type in Hive
17/12/06 19:55:42 WARN hive.TableDefWriter: Column Volume had to be cast to a less precise type in Hive
17/12/06 19:55:42 WARN hive.TableDefWriter: Column Amount had to be cast to a less precise type in Hive
17/12/06 19:55:42 WARN hive.TableDefWriter: Column Start_Totlizer had to be cast to a less precise type in Hive
17/12/06 19:55:42 WARN hive.TableDefWriter: Column End_Totlizer had to be cast to a less precise type in Hive
17/12/06 19:55:42 INFO hive.HiveImport: Loading uploaded data into Hive
17/12/06 19:55:42 WARN conf.HiveConf: HiveConf of name hive.custom-extensions.root does not exist
17/12/06 19:55:42 WARN conf.HiveConf: HiveConf of name hive.custom-extensions.root does not exist

Logging initialized using configuration in jar:file:/usr/hdp/2.6.2.0-205/hive/lib/hive-common-1.2.1000.2.6.2.0-205.jar!/hive-log4j.properties
OK
Time taken: 2.146 seconds
Loading data to table coco_pilot.ro_transaction
Table coco_pilot.ro_transaction stats: [numFiles=1, numRows=0, totalSize=199193962, rawDataSize=0]
OK
Time taken: 0.725 seconds
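The counters in the log explain the apparent contradiction: the first task attempt (attempt_1512027414889_0102_m_000000_0) timed out connecting to SQL Server, YARN relaunched the map task (Launched map tasks=2, Failed map tasks=1), and the second attempt succeeded, so the job as a whole completed. If such transient connect timeouts recur, a longer JDBC login timeout can widen the connect window. A minimal sketch of the same command with that change; the `loginTimeout=60` value and the `-P` password prompt (suggested by Sqoop's own warning in the log) are assumptions, not taken from the original post:

```shell
# Sketch only: loginTimeout (seconds) is a Microsoft JDBC driver
# connection property that lengthens the connect window; -P prompts
# for the password instead of putting it on the command line.
sqoop import \
  --connect "jdbc:sqlserver://10.128.**.***:1433;database=COCO_Pilot;loginTimeout=60" \
  --username sa -P \
  --table RO_Transaction \
  --hive-import --create-hive-table \
  --hive-table coco_pilot.ro_transaction \
  --warehouse-dir /user/landing \
  --hive-overwrite -m 1
```

Note that even without this change, MapReduce retries a failed map attempt (by default up to `mapreduce.map.maxattempts`, normally 4) before failing the job, which is why the import recovered on its own here.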

No answers yet.
