Passing a properties file to an Oozie Java action

axr492tv · posted 2021-06-02 in Hadoop

I have set up an Oozie Java action workflow that I plan to schedule with an Oozie coordinator. The Java action runs a Camus job, and I have put its jar and properties configuration file in the workflow/lib directory. Does anyone know how I can pass the -P argument to it? Currently I am doing something like this:

<workflow-app xmlns="uri:oozie:workflow:0.5" name="camus-wf">
    <start to="camusJob"/>
    <action name="camusJob">
        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.name</name>
                    <value>camusJob</value>
                </property>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <main-class>com.linkedin.camus.etl.kafka.CamusJob</main-class>
            <arg>-P</arg>
            <arg>${camusJobProperties}</arg>
        </java>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>${wf:errorMessage(wf:lastErrorNode())}</message>
    </kill>
    <end name="end"/>
</workflow-app>

and camusJobProperties looks like

hdfs://10.0.2.15:8020/coordCamusJob/workflowAppPath/lib/config.properties

However, the workflow does not seem to run (it gets stuck in preparation). Is there a way to fix this?
Thanks!
EDIT: After correcting my NameNode URL, I can see the following error:

ACTION[0000002-150804091125207-oozie-oozi-W@camusJob] Launcher exception: java.lang.IllegalArgumentException: Wrong FS: hdfs://10.0.2.15:8020/user/root/app/workflow/lib/config.properties, expected: file:///
org.apache.oozie.action.hadoop.JavaMainException: java.lang.IllegalArgumentException: Wrong FS: hdfs://10.0.2.15:8020/user/root/app/workflow/lib/config.properties, expected: file:///
    at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:58)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:39)
    at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:36)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:226)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.IllegalArgumentException: Wrong FS: hdfs://10.0.2.15:8020/user/root/app/workflow/lib/config.properties, expected: file:///
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645)
    at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:82)
    at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:603)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:821)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:598)
    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:414)
    at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:140)
    at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:341)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:766)
    at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:679)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at com.linkedin.camus.etl.kafka.CamusJob.main(CamusJob.java:646)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:55)
    ... 15 more

So, basically, my question is: how do I pass the properties file argument when the file is located in HDFS (specifically, in the workflow/lib directory)?
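
For reference, one pattern commonly used with Oozie actions in this situation (a sketch only, not something taken from the original question or from the answer below) is to declare the properties file with a <file> element. Oozie then symlinks the HDFS file into the task's working directory, so the job can be handed a plain local filename instead of an hdfs:// URI:

<!-- Sketch only: assumes config.properties sits under the workflow app's lib/ directory -->
<java>
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <main-class>com.linkedin.camus.etl.kafka.CamusJob</main-class>
    <arg>-P</arg>
    <!-- local symlink name created by the <file> element below -->
    <arg>config.properties</arg>
    <!-- localize the HDFS file into the container's working directory as "config.properties" -->
    <file>lib/config.properties#config.properties</file>
</java>

Because the job then opens the file through the local filesystem, the "Wrong FS ... expected: file:///" error does not apply.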

2admgd59 (answer 1):

For the first part of the question: this is most likely due to an incorrect NameNode or JobTracker address.
For the second part: you have to configure the core-site.xml property fs.defaultFS to hdfs://host:port/. Alternatively, you can set the same property in the Java program, on the config (Configuration) object.
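
As a minimal illustration of the first suggestion (the host and port below are taken from the question and only stand in for the actual NameNode address), fs.defaultFS would be set in core-site.xml like this:

<!-- core-site.xml; the value must point at the real NameNode -->
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://10.0.2.15:8020</value>
</property>

The same value can also be set programmatically on the Hadoop Configuration object before any FileSystem call is made, which is what the second suggestion about the config object refers to.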
