What is the correct way to specify multiple generic arguments in Sqoop?

hzbexzde  posted on 2021-06-01 in Hadoop

Looking at the documentation, I see the list of available generic arguments is as follows:

Generic options supported are
-conf <configuration file>     specify an application configuration file
-D <property=value>            use value for given property
-fs <local|namenode:port>      specify a namenode
-jt <local|jobtracker:port>    specify a job tracker
-files <comma separated list of files>    specify comma separated files to be copied to the map reduce cluster
-libjars <comma separated list of jars>    specify comma separated jar files to include in the classpath.
-archives <comma separated list of archives>    specify comma separated archives to be unarchived on the compute machines.

I want to specify the following two properties in my sqoop command:

mapred.job.queuename=batch 
mapred.child.java.opts="\-Djava.security.egd=file:///dev/urandom"

So I wrote the sqoop import script as follows:

sqoop import -Dmapred.job.queuename=batch \
        mapred.child.java.opts="\-Djava.security.egd=file:///dev/urandom" \
        --connect $connection \
        --username $username\
        --password $password \
        --table $sourceTable \
        --columns "$columns"\
        --hive-import \
        --hive-overwrite \
        --hive-table $targetTable \
        --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
        --hive-delims-replacement "SPECIAL" \
        --null-string '\\N' \
        --null-non-string '\\N' \
        -m 1

But it doesn't work...

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.4.0-91/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.4.0-91/accumulo/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/03/07 08:00:04 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.4.0-91
18/03/07 08:00:04 ERROR tool.BaseSqoopTool: Error parsing arguments for import:
18/03/07 08:00:04 ERROR tool.BaseSqoopTool: Unrecognized argument: mapred.child.java.opts=\-Djava.security.egd=file:///dev/urandom
18/03/07 08:00:04 ERROR tool.BaseSqoopTool: Unrecognized argument: --connect

I also thought about doing the following:

sqoop import \
        -D mapred.job.queuename=batch \
        -D mapred.child.java.opts="\-Djava.security.egd=file:///dev/urandom" \
        --connect $connection \
        --username $username \
        --password $password \
        --table $sourceTable \
        --columns "$columns"\
        --hive-import \
        --hive-overwrite \
        --hive-table $targetTable \
        --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
        --hive-delims-replacement "SPECIAL" \
        --null-string '\\N' \
        --null-non-string '\\N' \
        -m 1

But I'm afraid only one of the -D generic arguments will be parsed.
Which approach is correct?

kq4fsx7k

The Java -D flag does not need a space between it and the property:

sqoop import \
    -Dmapred.child.java.opts="-Djava.security.egd=file:///dev/urandom" \
    -Dmapred.job.queue.name=batch \

This also fixes the queue name property (it should be mapred.job.queue.name, not mapred.job.queuename).
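Putting it together, a minimal sketch of the full command, combining the fix above with the options and shell variables already shown in the question:

sqoop import \
    -Dmapred.child.java.opts="-Djava.security.egd=file:///dev/urandom" \
    -Dmapred.job.queue.name=batch \
    --connect $connection \
    --username $username \
    --password $password \
    --table $sourceTable \
    --columns "$columns" \
    --hive-import \
    --hive-overwrite \
    --hive-table $targetTable \
    --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
    --hive-delims-replacement "SPECIAL" \
    --null-string '\\N' \
    --null-non-string '\\N' \
    -m 1

The generic options must come before the tool-specific arguments such as --connect, and each -Dproperty=value pair is handled independently, so both properties should be picked up.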
