Writing a file with the SequenceFile class

kxkpmulp  asked on 2021-06-03  in Hadoop

I am using the code below to write some data into a SequenceFile. After the program has been running for a while, I stop it with the red (terminate) button on the Eclipse console. However, when I then check the data file on HDFS, the sequence file's size is zero, and I cannot view it with the 'hadoop fs -text filename' command either. When I try to read the previously created file with SequenceFile.Reader, I get 'Exception in thread "main" java.io.EOFException'. What should I do in this situation? My development environment is Eclipse 3.7 on Windows 7, and the Hadoop cluster (Hadoop 1.0.3) runs on CentOS 6.
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

class Sequence extends Thread {

    private String uri = "hdfs://172.20.11.60:9000";
    private String filePath = "/user/hadoop/input/";
    private String fileName = "Sequence-01.seq";
    public SequenceFile.Writer writer;
    private static int cnt = 0;

    private void init() {
        Configuration conf = new Configuration();
        try {
            FileSystem fs = FileSystem.get(URI.create(uri), conf);
            writer = SequenceFile.createWriter(fs, conf, new Path(filePath
                    + fileName), LongWritable.class, Text.class);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public Sequence() {
        init();
    }

    @Override
    public void run() {
        while (true) {
            try {
                writer.append(new LongWritable(100), new Text("hello,world"));
                cnt++;
                if (cnt % 100 == 0) {
                    System.out.println("flush current data to file system");
                    writer.syncFs();
                }
            } catch (IOException e) {
                System.out.println("append data error");
                e.printStackTrace();
            }

            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                System.out.println("thread interrupted");
                e.printStackTrace();
            }
        }
    }
}
public class TestSequenceFile {

    /**
     * @param args
     */
    public static void main(String[] args) {
        new Sequence().start();
    }
}

qlckcl4x  #1

Yes, Hadoop: The Definitive Guide is the best reference; it has examples of reading and writing sequence files.
A sequence file is essentially a flat file of binary key/value records (Hadoop Writables), and it is mainly used to pack many small files together so they can be fed to the map function.
http://javatute.com/javatute/faces/post/hadoop/2014/creating-sequence-file-using-hadoop.xhtml
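
For completeness, here is a minimal read-side sketch using the old Hadoop 1.x SequenceFile.Reader API (the same API family the question uses), in the style of the Definitive Guide's read example. The file path is only a placeholder; the key/value types are recovered from the file header via ReflectionUtils.

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.util.ReflectionUtils;

public class SequenceFileReadDemo {

    public static void main(String[] args) throws IOException {
        String uri = "hdfs://localhost:9900/out1.seq"; // placeholder path
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        Path path = new Path(uri);

        SequenceFile.Reader reader = null;
        try {
            reader = new SequenceFile.Reader(fs, path, conf);
            // Instantiate key/value objects of whatever types the file was written with.
            Writable key = (Writable) ReflectionUtils.newInstance(reader.getKeyClass(), conf);
            Writable value = (Writable) ReflectionUtils.newInstance(reader.getValueClass(), conf);
            while (reader.next(key, value)) {
                System.out.printf("%s\t%s\n", key, value);
            }
        } finally {
            IOUtils.closeStream(reader);
        }
    }
}

Note that this only reads cleanly from a file whose writer was properly closed; a file whose writer was never closed can still fail with EOFException, which matches the symptom described in the question.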

ljo96ir5  #2

General advice: don't interrupt the process like that.
Solution: the following code works fine for me.

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SequenceFileWriteDemo {

    private static final String[] DATA = {
        "One, two, buckle my shoe",
        "Three, four, shut the door",
        "Five, six, pick up sticks",
        "Seven, eight, lay them straight",
        "Nine, ten, a big fat hen"
    };

    public static void main(String[] args) throws IOException {
        // String uri = "/home/Desktop/inputSort.txt";
        String uri = "hdfs://localhost:9900/out1.seq";

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        Path path = new Path(uri);
        IntWritable key = new IntWritable();
        Text value = new Text();
        SequenceFile.Writer writer = null;

        try {
            writer = SequenceFile.createWriter(fs, conf, path,
                    key.getClass(), value.getClass());

            for (int i = 0; i < 130; i++) {
                key.set(100 - i);
                value.set(DATA[i % DATA.length]);
                // getLength() reports the current position in the file.
                System.out.printf("[%s]\t%s\t%s\n", writer.getLength(), key, value);
                writer.append(key, value);
            }
        } finally {
            // Closing the writer flushes and finalizes the file on HDFS.
            IOUtils.closeStream(writer);
        }
    }
}
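
Applied back to the original long-running Sequence thread, the key point is the finally block: the writer must be closed before the JVM goes away, otherwise the file on HDFS stays empty. As a rough illustration only (the running flag and stopWriting() method are illustrative names, not from the original code), the thread could be made stoppable like this:

import java.io.IOException;

import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

// Sketch only: a cooperatively-stoppable variant of the asker's thread.
class StoppableSequence extends Thread {

    private volatile boolean running = true;
    private final SequenceFile.Writer writer;

    StoppableSequence(SequenceFile.Writer writer) {
        this.writer = writer;
    }

    public void stopWriting() {
        running = false;   // ask the loop to finish
        this.interrupt();  // wake it up if it is sleeping
    }

    @Override
    public void run() {
        try {
            while (running) {
                writer.append(new LongWritable(100), new Text("hello,world"));
                try {
                    Thread.sleep(1000);
                } catch (InterruptedException e) {
                    // fall through; the running flag decides whether to continue
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            // Always close the writer so the SequenceFile is finalized on HDFS.
            IOUtils.closeStream(writer);
        }
    }
}

The caller would then invoke stopWriting() and join() instead of killing the process from the Eclipse console.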

For more details on writing sequence files, see Hadoop: The Definitive Guide (O'Reilly).
