This article collects code examples of the Java method org.apache.hadoop.mapreduce.OutputFormat.checkOutputSpecs() and shows how OutputFormat.checkOutputSpecs() is used in practice. The examples come from selected open-source projects found on GitHub, Stack Overflow, Maven, and similar platforms, so they should serve as useful references. Details of the OutputFormat.checkOutputSpecs() method are as follows:
Package path: org.apache.hadoop.mapreduce.OutputFormat
Class name: OutputFormat
Method name: checkOutputSpecs
Check for validity of the output-specification for the job.
This is meant to validate the output specification for the job when it is submitted. Typically it checks that the output does not already exist, throwing an exception if it does, so that existing output is not overwritten.
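Most of the examples below either delegate the check to a wrapped OutputFormat or loop over several delegate formats. For orientation, here is a minimal sketch of an implementation that performs the existence check itself; the class name ExistingPathCheckingOutputFormat and the configuration key example.output.dir are illustrative and are not taken from any of the projects listed below.

import java.io.IOException;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.OutputCommitter;
import org.apache.hadoop.mapreduce.OutputFormat;
import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;

// Hypothetical OutputFormat that refuses to accept a job whose output directory already exists.
public class ExistingPathCheckingOutputFormat extends OutputFormat<Text, NullWritable> {

  // Assumed custom configuration key naming the output directory.
  public static final String OUTPUT_DIR_KEY = "example.output.dir";

  @Override
  public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
    String dir = context.getConfiguration().get(OUTPUT_DIR_KEY);
    if (dir == null) {
      throw new IOException("Output directory not set (" + OUTPUT_DIR_KEY + ")");
    }
    Path outputPath = new Path(dir);
    FileSystem fs = outputPath.getFileSystem(context.getConfiguration());
    // Fail at submission time so existing output is never overwritten.
    if (fs.exists(outputPath)) {
      throw new IOException("Output directory " + outputPath + " already exists");
    }
  }

  @Override
  public RecordWriter<Text, NullWritable> getRecordWriter(TaskAttemptContext context) {
    throw new UnsupportedOperationException("Not needed for this sketch");
  }

  @Override
  public OutputCommitter getOutputCommitter(TaskAttemptContext context) {
    throw new UnsupportedOperationException("Not needed for this sketch");
  }
}

The built-in FileOutputFormat follows the same idea: its checkOutputSpecs() rejects the job when the configured output path already exists.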
Code example source: apache/hive

/**
 * Check for validity of the output-specification for the job.
 * @param context information about the job
 * @throws IOException when output should not be attempted
 */
@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  getOutputFormat(context).checkOutputSpecs(context);
}
Code example source: elastic/elasticsearch-hadoop

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  List<OutputFormat> formats = getNewApiFormats(CompatHandler.jobContext(context).getConfiguration());
  for (OutputFormat format : formats) {
    format.checkOutputSpecs(context);
  }
}
Code example source: apache/hive

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  for (String alias : getOutputFormatAliases(context)) {
    LOGGER.debug("Calling checkOutputSpecs for alias: " + alias);
    JobContext aliasContext = getJobContext(alias, context);
    OutputFormat<?, ?> outputFormat = getOutputFormatInstance(aliasContext);
    outputFormat.checkOutputSpecs(aliasContext);
    // Copy credentials and any new config added back to JobContext
    context.getCredentials().addAll(aliasContext.getCredentials());
    setAliasConf(alias, context, aliasContext);
  }
}
Code example source: apache/ignite

/** {@inheritDoc} */
@Override protected void run0(HadoopV2TaskContext taskCtx) throws IgniteCheckedException {
  try {
    JobContextImpl jobCtx = taskCtx.jobContext();
    OutputFormat outputFormat = getOutputFormat(jobCtx);
    outputFormat.checkOutputSpecs(jobCtx);
    OutputCommitter committer = outputFormat.getOutputCommitter(hadoopContext());
    if (committer != null)
      committer.setupJob(jobCtx);
  }
  catch (ClassNotFoundException | IOException e) {
    throw new IgniteCheckedException(e);
  }
  catch (InterruptedException e) {
    Thread.currentThread().interrupt();
    throw new IgniteInterruptedCheckedException(e);
  }
}
Code example source: com.datasalt.pangool/pangool-core

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  instantiateWhenNeeded();
  instance.checkOutputSpecs(context);
}
Code example source: io.hops/hadoop-mapreduce-client-core

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  getBaseOut().checkOutputSpecs(context);
}
Code example source: org.spark-project.hive.hcatalog/hive-hcatalog-core

/**
 * Check for validity of the output-specification for the job.
 * @param context information about the job
 * @throws IOException when output should not be attempted
 */
@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  getOutputFormat(context).checkOutputSpecs(context);
}
Code example source: com.facebook.presto.hive/hive-apache

/**
 * Check for validity of the output-specification for the job.
 * @param context information about the job
 * @throws IOException when output should not be attempted
 */
@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  getOutputFormat(context).checkOutputSpecs(context);
}
Code example source: com.github.jiayuhan-it/hadoop-mapreduce-client-core

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  getBaseOut().checkOutputSpecs(context);
}
Code example source: io.prestosql.hadoop/hadoop-apache

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  getBaseOut().checkOutputSpecs(context);
}
Code example source: com.github.hyukjinkwon.hcatalog/hive-hcatalog-core

/**
 * Check for validity of the output-specification for the job.
 * @param context information about the job
 * @throws IOException when output should not be attempted
 */
@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  getOutputFormat(context).checkOutputSpecs(context);
}
Code example source: org.apache.hive.hcatalog/hive-hcatalog-core

/**
 * Check for validity of the output-specification for the job.
 * @param context information about the job
 * @throws IOException when output should not be attempted
 */
@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  getOutputFormat(context).checkOutputSpecs(context);
}
Code example source: ch.cern.hadoop/hadoop-mapreduce-client-core

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  getBaseOut().checkOutputSpecs(context);
}
Code example source: com.twitter.elephantbird/elephant-bird-core

@Override
public void checkOutputSpecs(FileSystem ignored, JobConf job) throws IOException {
  initOutputFormat(job);
  try {
    realOutputFormat.checkOutputSpecs(HadoopCompat.newJobContext(job, null));
  } catch (InterruptedException e) {
    throw new IOException(e);
  }
}
Code example source: cdapio/cdap

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  for (String name : MultipleOutputs.getNamedOutputsList(context)) {
    Class<? extends OutputFormat> namedOutputFormatClass =
      MultipleOutputs.getNamedOutputFormatClass(context, name);
    JobContext namedContext = MultipleOutputs.getNamedJobContext(context, name);
    OutputFormat<K, V> outputFormat =
      ReflectionUtils.newInstance(namedOutputFormatClass, namedContext.getConfiguration());
    outputFormat.checkOutputSpecs(namedContext);
  }
}
Code example source: org.elasticsearch/elasticsearch-hadoop-mr

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  List<OutputFormat> formats = getNewApiFormats(CompatHandler.jobContext(context).getConfiguration());
  for (OutputFormat format : formats) {
    format.checkOutputSpecs(context);
  }
}
Code example source: org.elasticsearch/elasticsearch-spark

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  List<OutputFormat> formats = getNewApiFormats(CompatHandler.jobContext(context).getConfiguration());
  for (OutputFormat format : formats) {
    format.checkOutputSpecs(context);
  }
}
Code example source: com.github.hyukjinkwon.hcatalog/hive-hcatalog-core

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  for (String alias : getOutputFormatAliases(context)) {
    LOGGER.debug("Calling checkOutputSpecs for alias: " + alias);
    JobContext aliasContext = getJobContext(alias, context);
    OutputFormat<?, ?> outputFormat = getOutputFormatInstance(aliasContext);
    outputFormat.checkOutputSpecs(aliasContext);
    // Copy credentials and any new config added back to JobContext
    context.getCredentials().addAll(aliasContext.getCredentials());
    setAliasConf(alias, context, aliasContext);
  }
}
Code example source: org.apache.hive.hcatalog/hive-hcatalog-core

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  for (String alias : getOutputFormatAliases(context)) {
    LOGGER.debug("Calling checkOutputSpecs for alias: " + alias);
    JobContext aliasContext = getJobContext(alias, context);
    OutputFormat<?, ?> outputFormat = getOutputFormatInstance(aliasContext);
    outputFormat.checkOutputSpecs(aliasContext);
    // Copy credentials and any new config added back to JobContext
    context.getCredentials().addAll(aliasContext.getCredentials());
    setAliasConf(alias, context, aliasContext);
  }
}
Code example source: com.facebook.presto.hive/hive-apache

@Override
public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
  for (String alias : getOutputFormatAliases(context)) {
    LOGGER.debug("Calling checkOutputSpecs for alias: " + alias);
    JobContext aliasContext = getJobContext(alias, context);
    OutputFormat<?, ?> outputFormat = getOutputFormatInstance(aliasContext);
    outputFormat.checkOutputSpecs(aliasContext);
    // Copy credentials and any new config added back to JobContext
    context.getCredentials().addAll(aliasContext.getCredentials());
    setAliasConf(alias, context, aliasContext);
  }
}
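Note that application code rarely calls checkOutputSpecs() directly; the MapReduce framework invokes it on the job's configured OutputFormat during submission. Below is a minimal driver sketch, assuming a plain-text job that relies on the default identity mapper and reducer; the class name SubmitExample and the argument layout are illustrative. With TextOutputFormat (a FileOutputFormat subclass), submission fails if the output path already exists.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class SubmitExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "check-output-specs-example");
    job.setJarByClass(SubmitExample.class);
    // Default identity mapper/reducer: TextInputFormat emits LongWritable/Text pairs.
    job.setOutputFormatClass(TextOutputFormat.class);
    job.setOutputKeyClass(LongWritable.class);
    job.setOutputValueClass(Text.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));

    // waitForCompletion() submits the job; during submission the framework calls
    // TextOutputFormat.checkOutputSpecs(), which throws if args[1] already exists.
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}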
This content was collected from the web; if it infringes on your rights, please contact the author to have it removed.