Java — how to properly wait for an Apache Spark Launcher job when launching it from another application?

f0ofjuux · published 2021-07-13 in Java

I'm trying to avoid a "while(true)" polling solution for waiting until the Apache Spark job completes, but so far without success.
I have a Spark application that is supposed to process some data and put the result into a database. I call it from a Spring service and want to wait until the job is done.
Example:
The launcher method:

@Override
public void run(UUID docId, String query) throws Exception {
    launcher.addAppArgs(docId.toString(), query);

    // startApplication() is asynchronous: it returns a handle right away
    // while the job keeps running in the background
    SparkAppHandle sparkAppHandle = launcher.startApplication();

    sparkAppHandle.addListener(new SparkAppHandle.Listener() {
        @Override
        public void stateChanged(SparkAppHandle handle) {
            System.out.println(handle.getState() + " new state");
        }

        @Override
        public void infoChanged(SparkAppHandle handle) {
            System.out.println(handle.getState() + " info changed");
        }
    });

    // Printed immediately after launch — the state here is usually still
    // UNKNOWN, because the job has not finished yet
    System.out.println(sparkAppHandle.getState().toString());
}

Since startApplication() returns before the job completes, how do I correctly wait until the handle's state is FINISHED?

uajslkp6 · #1

I also use SparkLauncher from a Spring application. Here is a summary of the approach I took (following the example in the javadocs).
The @Service that launches the job also implements SparkAppHandle.Listener and passes a reference to itself via startApplication(this).

...
...
@Service
public class JobLauncher implements SparkAppHandle.Listener {
...
...
...
private SparkAppHandle launchJob(String mainClass, String[] args) throws Exception {

    String appResource = getAppResourceName();

    SparkAppHandle handle = new SparkLauncher()
        .setAppResource(appResource).addAppArgs(args)
        .setMainClass(mainClass)
        .setMaster(sparkMaster)
        .setDeployMode(sparkDeployMode)
        .setSparkHome(sparkHome)
        .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
        .startApplication(this);

    LOG.info("Launched [" + mainClass + "] from [" + appResource + "] State [" + handle.getState() + "]");

    return handle;
}

/**
 * Callback method for changes to the Spark Job
 */

@Override
public void infoChanged(SparkAppHandle handle) {

    LOG.info("Spark App Id [" + handle.getAppId() + "] Info Changed.  State [" + handle.getState() + "]");

}

/**
 * Callback method for changes to the Spark Job's state
 */

@Override
public void stateChanged(SparkAppHandle handle) {

    LOG.info("Spark App Id [" + handle.getAppId() + "] State Changed. State [" + handle.getState() + "]");

}

With this approach, you can take action when the state changes to FAILED, FINISHED, or KILLED; a minimal sketch of how the caller can block on that is shown below.
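The listener above only reacts in the callback; it does not itself block the calling thread. Here is a minimal sketch of one way to bridge the two, assuming java.util.concurrent.CompletableFuture. The "finished" field and the waitForCompletion helper are my own illustrative additions, not part of the original answer; SparkAppHandle.State.isFinal() returns true for the terminal states, including FINISHED, FAILED, and KILLED.

// Illustrative field on the @Service: completed from the callback so a caller can block on it
private final CompletableFuture<SparkAppHandle.State> finished = new CompletableFuture<>();

@Override
public void stateChanged(SparkAppHandle handle) {
    SparkAppHandle.State state = handle.getState();
    LOG.info("Spark App Id [" + handle.getAppId() + "] State Changed. State [" + state + "]");
    // isFinal() is true for terminal states such as FINISHED, FAILED, and KILLED
    if (state.isFinal()) {
        finished.complete(state);
    }
}

// Hypothetical helper the calling code can use to wait with a timeout
public SparkAppHandle.State waitForCompletion(long timeout, TimeUnit unit)
        throws InterruptedException, ExecutionException, TimeoutException {
    return finished.get(timeout, unit);
}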
I hope this information helps you.

6psbrbz9 · #2

I implemented this with a CountDownLatch, and it works as expected.

...
final CountDownLatch countDownLatch = new CountDownLatch(1);
SparkAppListener sparkAppListener = new SparkAppListener(countDownLatch);
SparkAppHandle appHandle = sparkLauncher.startApplication(sparkAppListener);
// Note: this thread is effectively a no-op here, since run() is empty;
// the latch is released from the stateChanged() callback, not from this thread
Thread sparkAppListenerThread = new Thread(sparkAppListener);
sparkAppListenerThread.start();
long timeout = 120;
// Blocks until stateChanged() counts the latch down, or until the timeout elapses
countDownLatch.await(timeout, TimeUnit.SECONDS);
...

private static class SparkAppListener implements SparkAppHandle.Listener, Runnable {
    private static final Log log = LogFactory.getLog(SparkAppListener.class);
    // SPARK_STATE_MSG (used below) is presumably a Map<State, String> of
    // human-readable state descriptions, defined elsewhere in the original code
    private final CountDownLatch countDownLatch;
    public SparkAppListener(CountDownLatch countDownLatch) {
        this.countDownLatch = countDownLatch;
    }
    @Override
    public void stateChanged(SparkAppHandle handle) {
        String sparkAppId = handle.getAppId();
        State appState = handle.getState();
        if (sparkAppId != null) {
            log.info("Spark job with app id: " + sparkAppId + ",\t State changed to: " + appState + " - "
                    + SPARK_STATE_MSG.get(appState));
        } else {
            log.info("Spark job's state changed to: " + appState + " - " + SPARK_STATE_MSG.get(appState));
        }
        if (appState != null && appState.isFinal()) {
            countDownLatch.countDown();
        }
    }
    @Override
    public void infoChanged(SparkAppHandle handle) {}
    @Override
    public void run() {}
}
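One caveat with the usage snippet above: await(timeout, TimeUnit.SECONDS) returns false when the timeout elapses before the latch reaches zero, and a final state is not necessarily a successful one. A small sketch of how the calling code might check both cases (the exception handling is my own illustration):

// Returns true only if the latch was counted down before the timeout
boolean reachedFinalState = countDownLatch.await(timeout, TimeUnit.SECONDS);
if (!reachedFinalState) {
    throw new IllegalStateException("Spark job did not finish within " + timeout + " seconds");
}
// FAILED and KILLED also count the latch down, since they are final states too
if (appHandle.getState() != SparkAppHandle.State.FINISHED) {
    throw new IllegalStateException("Spark job ended in state " + appHandle.getState());
}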
