Hadoop compression

jjhzyzn0 · posted 2021-06-03 in Hadoop

I am trying to compress a file with the code below. When the file is small (around 1 GB), compression works fine. But when the file is around 5 GB, the program does not fail; it just keeps running for about two days without producing any result. From the INFO messages I am getting, it looks like a cluster issue, although I am not sure.
Here is the error output I got:

[error screenshot not reproduced]

The code I am using:

public void compressData(final String inputFilePath, final String outputPath) throws DataFabricAppendException {
    CompressionOutputStream compressionOutputStream = null;
    FSDataOutputStream fsDataOutputStream = null;
    FSDataInputStream fsDataInputStream = null;
    CompressionCodec compressionCodec = null;
    CompressionCodecFactory compressionCodecFactory = null;
    try {
        compressionCodecFactory = new CompressionCodecFactory(conf);
        final Path compressionFilePath = new Path(outputPath);
        // Create the compressed output file on HDFS.
        fsDataOutputStream = fs.create(compressionFilePath);

        // Wrap the raw output stream in a BZip2 compression stream.
        compressionCodec = compressionCodecFactory
                .getCodecByClassName(BZip2Codec.class.getName());
        compressionOutputStream = compressionCodec
                .createOutputStream(fsDataOutputStream);

        // Open the uncompressed input file.
        fsDataInputStream = new FSDataInputStream(fs.open(new Path(
                inputFilePath)));

        // Copy all bytes through the compressor; 'false' means copyBytes
        // does not close the streams (they are closed in finally).
        IOUtils.copyBytes(fsDataInputStream, compressionOutputStream, conf,
                false);

        // Flush the compressor and write the BZip2 trailer.
        compressionOutputStream.finish();
    } catch (IOException ex) {
        throw new DataFabricAppendException(
                "Error while compressing non-partitioned file : "
                        + inputFilePath, ex);
    } catch (Exception ex) {
        throw new DataFabricAppendException(
                "Error while compressing non-partitioned file : "
                        + inputFilePath, ex);
    } finally {
        try {
            if (compressionOutputStream != null) {
                compressionOutputStream.close();
            }
            if (fsDataInputStream != null) {
                fsDataInputStream.close();
            }
            if (fsDataOutputStream != null) {
                fsDataOutputStream.close();
            }
        } catch (IOException e1) {
            LOG.warn("Could not close necessary objects");
        }
    }
}
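
For context, the method above relies on class members (conf, fs, LOG, DataFabricAppendException) that are defined elsewhere. Below is a minimal, self-contained sketch of the same copy-and-compress flow with the Hadoop Configuration and FileSystem set up explicitly; the class name and command-line paths are illustrative assumptions, not part of the original code.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.compress.BZip2Codec;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionCodecFactory;
import org.apache.hadoop.io.compress.CompressionOutputStream;

// Hypothetical standalone driver for the same compression logic.
public class CompressDriver {

    public static void main(String[] args) throws Exception {
        // These correspond to the 'conf' and 'fs' fields the posted method uses.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path input = new Path(args[0]);   // e.g. an uncompressed file on HDFS
        Path output = new Path(args[1]);  // e.g. the target .bz2 path

        CompressionCodec codec = new CompressionCodecFactory(conf)
                .getCodecByClassName(BZip2Codec.class.getName());

        try (FSDataInputStream in = fs.open(input);
             FSDataOutputStream rawOut = fs.create(output);
             CompressionOutputStream out = codec.createOutputStream(rawOut)) {
            // Stream the file through the BZip2 compressor; 'false' leaves
            // stream closing to the try-with-resources block.
            IOUtils.copyBytes(in, out, conf, false);
            out.finish();
        }
    }
}

This sketch runs the copy as a single client-side stream, just like the posted method; the only difference is that the setup of conf and fs is shown explicitly and stream cleanup uses try-with-resources.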

No answers yet.
