DB2: Spring Batch JobExplorer is not using the correct schema

azpvetkf  · asked 2022-11-07 · in DB2
Follow (0) | Answers (1) | Views (182)

When I run the following code:

jobExplorer.getJobInstance(161L); /* valid no matter what */

I get this exception:

org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME, JOB_KEY, VERSION from BATCH_JOB_INSTANCE where JOB_INSTANCE_ID = ?]; nested exception is com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=3AFBED1C.BATCH_JOB_INSTANCE, DRIVER=4.19.66
    at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:234)
    at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72)
    at org.springframework.jdbc.core.JdbcTemplate.translateException(JdbcTemplate.java:1442)
    at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:632)
    at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:668)
    at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:699)
    at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:711)
    at org.springframework.jdbc.core.JdbcTemplate.queryForObject(JdbcTemplate.java:789)
    at org.springframework.batch.core.repository.dao.JdbcJobInstanceDao.getJobInstance(JdbcJobInstanceDao.java:176)
    at org.springframework.batch.core.explore.support.SimpleJobExplorer.getJobInstance(SimpleJobExplorer.java:163)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:564)
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:343)
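For context: SQLCODE=-204 with SQLSTATE=42704 is DB2's "name is undefined" error, and the SQLERRMC value shows that the unqualified table name BATCH_JOB_INSTANCE was resolved against the connection's default schema (here 3AFBED1C, typically the authorization ID of the connecting user) rather than the schema that actually holds the Spring Batch tables. One general way to steer where unqualified names resolve is the IBM Data Server JDBC driver's currentSchema connection property. A minimal sketch, assuming a Spring Boot style datasource configuration (host, port, database, credentials, and the BLUECOST schema name below are illustrative placeholders, not values from this post):

```properties
# Hypothetical datasource settings -- host, port, database, and credentials
# are placeholders. currentSchema tells the IBM Data Server JDBC driver which
# schema to use when resolving unqualified names such as BATCH_JOB_INSTANCE.
spring.datasource.url=jdbc:db2://dbhost:50000/MYDB:currentSchema=BLUECOST;
spring.datasource.username=batchuser
spring.datasource.password=secret
```

Note that driver properties in a DB2 JDBC URL follow the database name after a colon, and each property must be terminated with a semicolon.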

SoftLayerJobService.java

public class SoftLayerJobService {
    private final Logger logger = LoggerFactory.getLogger(SoftLayerJobService.class);
    UploadedFileRepository uploadedFileRepository;
    private final String inputPrefix;
    private final String inputSpecialMappingPrefix;
    private final String inputPath;
    private final String outputDateFormat;
    private JobLauncher jobLauncher;
    private JobLauncher asyncJobLauncher;
    private Job softlayerUploadJob;
    private Job softlayerSpecialMappingJob;
    private Job mappingBlueReportOutJob;
    private Job softlayerNoGoJob;
    private FileWriter fileWriter;
    private RulesService rulesService;
    @Value("${softlayer.mapping.bluereport.output.path}") private String softlayerOutputPath;

    private JobExplorer jobExplorer; 

    @Autowired
    public SoftLayerJobService(JobLauncher jobLauncher, @Qualifier("softlayerUploadJob") Job softlayerUploadJob, @Qualifier("softlayerSpecialMappingJob") Job softlayerSpecialMappingJob,
                               @Qualifier("mappingBlueReportOutJob") Job mappingBlueReportOutJob, @Qualifier("softlayerNoGoJob") Job softlayerNoGoJob,
                               FileWriter fileWriter, SoftlayerProperties softlayerProperties, RulesService rulesService ,UploadedFileRepository uploadedFileRepository, @Qualifier("asyncJobLauncher") JobLauncher asyncJobLauncher, JobExplorer jobExplorer){
        this.softlayerUploadJob = softlayerUploadJob;
        this.softlayerSpecialMappingJob = softlayerSpecialMappingJob;
        this.mappingBlueReportOutJob=mappingBlueReportOutJob;
        this.softlayerNoGoJob=softlayerNoGoJob;
        this.jobLauncher = jobLauncher;
        this.fileWriter = fileWriter;
        this.rulesService = rulesService;
        this.inputPrefix = softlayerProperties.getInputPrefix();
        this.inputSpecialMappingPrefix = softlayerProperties.getInputSpecialMappingPrefix();
        this.inputPath = softlayerProperties.getInputPath();
        this.outputDateFormat = softlayerProperties.getOutputDateFormat();
        this.uploadedFileRepository = uploadedFileRepository;
        this.asyncJobLauncher = asyncJobLauncher;
        this.jobExplorer = jobExplorer;
    }

    /**
     * @param jobInstanceId the Spring Batch job instance id to look up
     * @param jobExecutionId the job execution id whose status is wanted
     * @return the execution result, including exit status and progress
     */
    public AsyncJobExecutionResult getJobExecutionResult(long jobInstanceId, long jobExecutionId) {
        AsyncJobExecutionResult result = new AsyncJobExecutionResult();
        result.setJobInstanceId(jobInstanceId);
        result.setJobExecutionId(jobExecutionId);
        logger.debug("Looking for status of job instance id {} job execution id {}", jobInstanceId, jobExecutionId);
        JobInstance jobInstance = jobExplorer.getJobInstance(jobInstanceId);
        if(jobInstance == null) {
            throw new IllegalArgumentException("Spring Batch Job ID "+jobInstanceId +" could not be found.");
        }
        JobExecution jobExecution = jobExplorer.getJobExecution(jobExecutionId);
        if(jobExecution == null) {
            throw new IllegalArgumentException("Spring Batch Job Execution ID"+jobExecutionId +" could not be found.");
        }
        result.setExitStatus(jobExecution.getExitStatus().toString());
        StepExecution softlayerUploadFileStep = null; 
        Iterator<StepExecution> stepExecutionIterator = jobExecution.getStepExecutions().iterator();
        while (stepExecutionIterator.hasNext() && softlayerUploadFileStep == null) {
            StepExecution current = stepExecutionIterator.next();
            logger.debug("Found StepExecution stepname="+current.getStepName());
            if (current.getStepName().equals("softlayerUploadFile")) {
                softlayerUploadFileStep = current;
            }
        }
        if (softlayerUploadFileStep == null) {
            logger.error("Failed to find proper step execution");
        } else {
            logger.debug("Found async job execution object");
            long recordsCompleted = softlayerUploadFileStep.getWriteCount();
            final String TOTAL_RECORDS_READ_KEY = "softlayerUploadFile_TOTAL_RECORDS_READ";
            long totalRecords = 0;
            if(softlayerUploadFileStep.getExecutionContext().containsKey(TOTAL_RECORDS_READ_KEY)) {
                totalRecords = softlayerUploadFileStep.getExecutionContext().getLong(TOTAL_RECORDS_READ_KEY);
                logger.debug("Softlayer readCount found in context");
            } else {
                logger.debug("Softlayer readCount not found in context");
            }
            logger.debug("Records completed = " + recordsCompleted);
            result.setNumberRecords(recordsCompleted);
            logger.debug("Total records = " + totalRecords);
            // Guard against divide-by-zero when the total read count was not found in the context
            long percentComplete = (totalRecords == 0) ? 0 : (recordsCompleted * 100) / totalRecords;
            logger.debug("softlayer percent complete = " + percentComplete);
            result.setPercentComplete(percentComplete);
        }
        return result;
    }
...
}

I am using DB2, and this error essentially means the object could not be found. There is a BATCH_JOB_INSTANCE table in one schema, and other calls reach it just fine. But somehow this instance of JobExplorer goes to some other schema, presumably the default one. How do I tell Spring Batch to use the correct schema?
Thank you, Woodsman

icomxhvb · answer #1

In my opinion this is not the best answer, but it does work, at least for DB2. To fix the problem, I created my own JobExplorer bean and set the table prefix to include the schema. My new code is:

@Bean(destroyMethod = "")
public JobExplorer softLayerJobExplorer(@Qualifier("batchDataSource") DataSource batchDataSource) throws Exception {
    JobExplorerFactoryBean jobExplorerFactoryBean = new JobExplorerFactoryBean();
    jobExplorerFactoryBean.setDataSource(batchDataSource);
    // Prefix the table names with the schema; set this before initializing the factory bean
    jobExplorerFactoryBean.setTablePrefix("BLUECOST.BATCH_");
    jobExplorerFactoryBean.afterPropertiesSet();
    JobExplorer result = jobExplorerFactoryBean.getObject();
    logger.debug("Created softLayerJobExplorer, identityHashCode=" + System.identityHashCode(result));
    return result;
}

The call to jobExplorerFactoryBean.setTablePrefix is really just a String prepend; it does not check that the result is only a valid table name. That turns out to be lucky, because Spring decided not to provide a setSchema method.
Am I happy with this? Not entirely. But it works. And no, I don't want to rewrite dozens of classes for this. I am just trying to specify a schema. Just a schema. That's all. It shouldn't be this hard.
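For readers on Spring Boot, the same prefix trick can usually be applied declaratively instead of defining a custom bean. This is an assumption to verify against your Boot version's documentation, since the property name has moved between versions:

```properties
# Spring Boot 2.5+ uses spring.batch.jdbc.table-prefix; earlier 2.x versions
# used spring.batch.table-prefix. The BLUECOST.BATCH_ value mirrors the
# schema-qualified prefix used in the custom bean above.
spring.batch.jdbc.table-prefix=BLUECOST.BATCH_
```

This only affects the auto-configured Spring Batch components; a hand-built JobExplorerFactoryBean still needs setTablePrefix called explicitly.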
