Java: we are getting the following exception when running a MapReduce program that parses a blockchain file (encrypted file.dat)

7dl7o3gd  posted on 2021-06-02  in Hadoop

We ran the Hadoop command below; the program uses the MapReduce API. When we run it, it throws an exception even though the file exists at the input location. Please guide us.

[cloudera@quickstart mapreduce-bitcoinblock-1.0]$ hadoop jar example-hcl-mr-bitcoinblock-0.1.0.jar org.zuinnote.hadoop.bitcoin.example.driver.BitcoinBlockCounterDriver /user/cloudera/bitcoin/input /user/cloudera/bitcoin/output1
args[0]=/user/cloudera/bitcoin/input
 args[1]=/user/cloudera/bitcoin/output1

##### before the setoutputpath

##### after the setoutputpath

16/10/07 00:04:09 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/10/07 00:04:10 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/10/07 00:04:11 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
16/10/07 00:04:11 INFO mapred.FileInputFormat: Total input paths to process : 1
16/10/07 00:04:12 WARN hdfs.DFSClient: Caught exception 
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1281)
    at java.lang.Thread.join(Thread.java:1355)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:862)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:600)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:789)
16/10/07 00:04:12 INFO mapreduce.JobSubmitter: number of splits:1
16/10/07 00:04:12 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1475232088753_0022
16/10/07 00:04:13 INFO impl.YarnClientImpl: Submitted application application_1475232088753_0022
16/10/07 00:04:13 INFO mapreduce.Job: The url to track the job: http://quickstart.cloudera:8088/proxy/application_1475232088753_0022/
16/10/07 00:04:13 INFO mapreduce.Job: Running job: job_1475232088753_0022
16/10/07 00:04:24 INFO mapreduce.Job: Job job_1475232088753_0022 running in uber mode : false
16/10/07 00:04:24 INFO mapreduce.Job:  map 0% reduce 0%
16/10/07 00:04:38 INFO mapreduce.Job: Task Id : attempt_1475232088753_0022_m_000000_0, Status : FAILED
Error: java.lang.NullPointerException
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.close(MapTask.java:210)
    at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:1976)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:468)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

16/10/07 00:04:59 INFO mapreduce.Job: Task Id : attempt_1475232088753_0022_m_000000_1, Status : FAILED
Error: java.lang.NullPointerException
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.close(MapTask.java:210)
    at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:1976)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:468)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

16/10/07 00:05:20 INFO mapreduce.Job: Task Id : attempt_1475232088753_0022_m_000000_2, Status : FAILED
Error: java.lang.NullPointerException
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.close(MapTask.java:210)
    at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:1976)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:468)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

16/10/07 00:05:39 INFO mapreduce.Job:  map 100% reduce 0%
16/10/07 00:05:40 INFO mapreduce.Job:  map 100% reduce 100%
16/10/07 00:05:41 INFO mapreduce.Job: Job job_1475232088753_0022 failed with state FAILED due to: Task failed task_1475232088753_0022_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

16/10/07 00:05:41 INFO mapreduce.Job: Counters: 10
    Job Counters 
        Failed map tasks=4
        Killed reduce tasks=1
        Launched map tasks=4
        Other local map tasks=3
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=65085
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=65085
        Total vcore-seconds taken by all map tasks=65085
        Total megabyte-seconds taken by all map tasks=66647040
Exception in thread "main" java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:876)
    at org.zuinnote.hadoop.bitcoin.example.driver.BitcoinBlockCounterDriver.main(BitcoinBlockCounterDriver.java:66)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

The program is shown below:

/**

* Copyright 2016 ZuInnoTe (Jörn Franke) <zuinnote@gmail.com>
* 
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* 
* http://www.apache.org/licenses/LICENSE-2.0
* 
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
**/

/**
 * Simple Driver for a MapReduce job counting the number of transactions in the blocks from the specified files containing Bitcoin blockchain data
 */
package org.zuinnote.hadoop.bitcoin.example.driver;

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.util.*;
import org.zuinnote.hadoop.bitcoin.example.tasks.BitcoinBlockMap;
import org.zuinnote.hadoop.bitcoin.example.tasks.BitcoinBlockReducer;

import org.zuinnote.hadoop.bitcoin.format.*;

/**
 * Author: Jörn Franke <zuinnote@gmail.com>
 */

public class BitcoinBlockCounterDriver  {

 public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf(new Configuration(), BitcoinBlockCounterDriver.class);

    conf.setJobName("example-hadoop-bitcoin-transactioncounter-job");

    conf.setMapOutputKeyClass(Text.class);
    conf.setMapOutputValueClass(IntWritable.class);

    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(LongWritable.class);

    conf.setMapperClass(BitcoinBlockMap.class);
    conf.setReducerClass(BitcoinBlockReducer.class);

    conf.setInputFormat(BitcoinBlockFileInputFormat.class);
    conf.setOutputFormat(TextOutputFormat.class);

    /**Set as an example some of the options to configure the Bitcoin fileformat**/
     /**Find here all configuration options: https://github.com/ZuInnoTe/hadoopcryptoledger/wiki/Hadoop-File-Format**/
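    // F9BEB4D9 is the magic of the Bitcoin main network (mainnet)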
    conf.set("hadoopcryptoledger.bitcoinblockinputformat.filter.magic","F9BEB4D9");

    System.out.println("args[0]=" + args[0] + "\n args[1]=" + args[1]);
    FileInputFormat.addInputPath(conf, new Path(args[0]));
    System.out.println("##### before the setoutputpath");
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));
    System.out.println("##### after the setoutputpath");    

    JobClient.runJob(conf);
 }

}
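
For reference, the "Hadoop command-line option parsing not performed" warning in the log appears because this driver uses a plain main() instead of implementing the Tool interface. Below is a minimal, untested sketch of the same job wrapped in ToolRunner (the class name BitcoinBlockCounterTool is ours; the job configuration itself is unchanged from the driver above):

package org.zuinnote.hadoop.bitcoin.example.driver;

import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.util.*;
import org.zuinnote.hadoop.bitcoin.example.tasks.BitcoinBlockMap;
import org.zuinnote.hadoop.bitcoin.example.tasks.BitcoinBlockReducer;
import org.zuinnote.hadoop.bitcoin.format.*;

public class BitcoinBlockCounterTool extends Configured implements Tool {

    public int run(String[] args) throws Exception {
        // same job configuration as in BitcoinBlockCounterDriver above
        JobConf conf = new JobConf(getConf(), BitcoinBlockCounterTool.class);
        conf.setJobName("example-hadoop-bitcoin-transactioncounter-job");
        conf.setMapOutputKeyClass(Text.class);
        conf.setMapOutputValueClass(IntWritable.class);
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(LongWritable.class);
        conf.setMapperClass(BitcoinBlockMap.class);
        conf.setReducerClass(BitcoinBlockReducer.class);
        conf.setInputFormat(BitcoinBlockFileInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);
        conf.set("hadoopcryptoledger.bitcoinblockinputformat.filter.magic", "F9BEB4D9");
        FileInputFormat.addInputPath(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));
        JobClient.runJob(conf);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner parses the generic Hadoop options (-D, -files, ...) before calling run()
        System.exit(ToolRunner.run(new Configuration(), new BitcoinBlockCounterTool(), args));
    }
}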

And here is the other program (the mapper):

/**

* Copyright 2016 ZuInnoTe (Jörn Franke) <zuinnote@gmail.com>
* 
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* 
* http://www.apache.org/licenses/LICENSE-2.0
* 
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
**/

/**
 * Simple Mapper for counting the number of Bitcoin transactions in a file on HDFS
 */
package org.zuinnote.hadoop.bitcoin.example.tasks;

/**
 * Author: Jörn Franke <zuinnote@gmail.com>
 */

import java.io.IOException;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.io.*;
import org.zuinnote.hadoop.bitcoin.format.*;

import java.util.*;

public class BitcoinBlockMap extends MapReduceBase implements Mapper<BytesWritable, BitcoinBlock, Text, IntWritable> {

    private final static Text defaultKey = new Text("Transaction Count:");

    public void map(BytesWritable key, BitcoinBlock value, OutputCollector<Text, IntWritable> output, Reporter reporter) throws IOException {
        // get the number of transactions in the current block
        output.collect(defaultKey, new IntWritable(value.getTransactions().size()));
    }

}
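
BitcoinBlockReducer is configured in the driver but its source was not included in this post. Judging from the driver settings (Text/IntWritable map output, Text/LongWritable job output), it presumably just sums the per-block transaction counts, roughly like this sketch (ours, not the actual library code):

package org.zuinnote.hadoop.bitcoin.example.tasks;

import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;

public class BitcoinBlockReducer extends MapReduceBase implements Reducer<Text, IntWritable, Text, LongWritable> {

    public void reduce(Text key, Iterator<IntWritable> values, OutputCollector<Text, LongWritable> output, Reporter reporter) throws IOException {
        long sum = 0;
        // sum the per-block transaction counts emitted by BitcoinBlockMap
        while (values.hasNext()) {
            sum += values.next().get();
        }
        output.collect(key, new LongWritable(sum));
    }
}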

No answers yet.
