Sqoop: "code too large" when importing a table with a very large definition?

yzuktlbb · posted 2021-06-03 in Hadoop

I am trying to import data from a Teradata table with 2000 columns (its table definition is about 90k characters) into HDFS. When I execute the script, I get:

/tmp/sqoop-hadoopi/compile/636c527afc3baa6fdf33464f02430602/table.java:21971: code too large

My Sqoop script:

sqoop import \
 -libjars $LIB_JARS \
 --connect jdbc:teradata://PRD/Database=database \
 --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
 --table table \
 --username login \
 --password pass

My output log:

13/11/07 14:54:50 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/11/07 14:54:50 INFO manager.SqlManager: Using default fetchSize of 1000
13/11/07 14:54:50 INFO tool.CodeGenTool: Beginning code generation
13/11/07 14:55:31 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM table AS t WHERE 1=0
13/11/07 14:55:46 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop/libexec/..
13/11/07 14:55:46 INFO orm.CompilationManager: Found hadoop core jar at: /usr/lib/hadoop/libexec/../hadoop-core.jar
/tmp/sqoop-hadoopi/compile/636c527afc3baa6fdf33464f02430602/table.java:21971: code too large
  public boolean equals(Object o) {
                 ^
/tmp/sqoop-hadoopi/compile/636c527afc3baa6fdf33464f02430602/table.java:37949: code too large
  public void write(DataOutput __dataOut) throws IOException {
              ^
/tmp/sqoop-hadoopi/compile/636c527afc3baa6fdf33464f02430602/table.java:49925: code too large
  public String toString(DelimiterSet delimiters, boolean useRecordDelim) {
                ^
/tmp/sqoop-hadoopi/compile/636c527afc3baa6fdf33464f02430602/table.java:53970: code too large
  private void __loadFromFields(List<String> fields) {
               ^
Note: /tmp/sqoop-hadoopi/compile/636c527afc3baa6fdf33464f02430602/table.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
4 errors
13/11/07 14:55:51 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Error returned by javac
        at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:205)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:390)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)

Maybe someone has already imported a table this large... Thanks!

goucqfw6 #1

I don't know whether you have already tried it, but there is a Teradata Connector for Hadoop:
http://developer.teradata.com/connectivity/articles/teradata-connector-for-hadoop-now-available

41ik7eoe #2

Java limits the bytecode of each method to 64 KB. I'm afraid the current version of Sqoop has no facility for breaking the long generated methods in your case into multiple sub-methods, so I would suggest opening a new feature request on the Sqoop JIRA.
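A possible workaround, as a sketch of my own rather than anything from the answers above: split the import into several smaller jobs with `--columns`, so each generated ORM class covers fewer fields and its methods stay under the 64 KB limit. Everything below is illustrative: the column names, chunk size, target paths, and the `split_columns` helper are placeholders, and each chunk would need to include the table's key column so the parts can be rejoined afterwards.

```shell
#!/bin/sh
# Sketch of a workaround: break a very wide table into several imports,
# each selecting a subset of columns via --columns.
# All names here are placeholders; adapt them to the real table.

COLUMNS="col1,col2,col3,col4,col5,col6"  # in practice: the full column list
CHUNK=3                                  # columns per import job (tune as needed)

# Split a comma-separated column list ($1) into groups of $2 columns,
# printing one comma-separated group per line.
split_columns() {
  echo "$1" | tr ',' '\n' \
    | awk -v n="$2" '{buf = buf ? buf "," $0 : $0}
                     NR % n == 0 {print buf; buf = ""}
                     END {if (buf) print buf}'
}

# Print one sqoop command per column group (dry run: drop the leading
# "echo" to actually execute; remember to add the key column to every
# group if the parts must be rejoined later).
i=0
split_columns "$COLUMNS" "$CHUNK" | while read -r cols; do
  i=$((i + 1))
  echo sqoop import \
    --connect "jdbc:teradata://PRD/Database=database" \
    --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
    --table table \
    --columns "$cols" \
    --target-dir "/tmp/table_part_$i" \
    --username login -P
done
```

Note that the Teradata Connector for Hadoop mentioned in the other answer may sidestep the limit entirely if it performs its own serialization instead of relying on Sqoop's generated ORM class; that is worth verifying against its documentation before resorting to column splitting.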
