Decompressing a gz archive containing multiple files on HDFS, urgent help needed!
无边的绿波 2013-05-30 03:54:04 In a Java program I am decompressing a .gz file stored on HDFS. When the archive contains multiple files, everything gets decompressed into a single output file and the data ends up jumbled together. How should I handle this? Thanks in advance, and I hope the experts here can point me in the right direction.
The code is as follows:
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionCodecFactory;

public class FileDecompressor {
    public static void main(String[] args) throws Exception {
        String uri = "hdfs://namenode:8020/user/wb/data/test/TempDir.gz";
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        Path inputPath = new Path(uri);

        // Infer the codec (here GzipCodec) from the file extension.
        CompressionCodecFactory factory = new CompressionCodecFactory(conf);
        CompressionCodec codec = factory.getCodec(inputPath);
        if (codec == null) {
            System.err.println("No codec found for " + uri);
            System.exit(1);
        }

        // Output path: the input URI with the ".gz" suffix stripped.
        String outputUri = CompressionCodecFactory.removeSuffix(uri,
                codec.getDefaultExtension());

        InputStream in = null;
        OutputStream out = null;
        try {
            // Decompress the gzip stream and copy it into one output file.
            in = codec.createInputStream(fs.open(inputPath));
            out = fs.create(new Path(outputUri));
            IOUtils.copyBytes(in, out, conf);
        } finally {
            IOUtils.closeStream(in);
            IOUtils.closeStream(out);
        }
    }
}
This is urgent. Any pointers from the experts would be much appreciated!
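One likely explanation for the jumbled output: gzip compresses a single byte stream and stores no per-file boundaries, so `codec.createInputStream` can only hand back one continuous stream. A ".gz" that "contains multiple files" is almost certainly a tarball (tar.gz): after undoing the gzip layer you must also parse the tar framing and write one output file per entry (in a real Hadoop job, one `fs.create(...)` per entry; Apache Commons Compress's `TarArchiveInputStream` would do the parsing). The sketch below is a self-contained, JDK-only illustration of that idea under simplifying assumptions: it hand-rolls a *minimal* subset of the tar header (name and octal size fields only, no checksum or mode), builds a tiny two-file tar.gz in memory to stand in for `TempDir.gz`, and then splits it back into separate entries. The class and method names are illustrative, not a standard API.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class TarGzSplitDemo {

    /** Append one entry: a minimal 512-byte tar header (name at offset 0,
     *  octal size at offset 124; a real header also carries mode, checksum,
     *  etc.), the data, then zero padding up to the next 512-byte boundary. */
    static void putEntry(ByteArrayOutputStream tar, String name, byte[] data)
            throws IOException {
        byte[] header = new byte[512];
        byte[] n = name.getBytes(StandardCharsets.US_ASCII);
        System.arraycopy(n, 0, header, 0, n.length);
        byte[] size = String.format("%011o", data.length)
                .getBytes(StandardCharsets.US_ASCII);
        System.arraycopy(size, 0, header, 124, size.length);
        tar.write(header);
        tar.write(data);
        tar.write(new byte[(512 - data.length % 512) % 512]);  // block padding
    }

    /** Read exactly buf.length bytes (GZIPInputStream may short-read). */
    static void readFully(InputStream in, byte[] buf) throws IOException {
        int off = 0;
        while (off < buf.length) {
            int r = in.read(buf, off, buf.length - off);
            if (r < 0) throw new IOException("unexpected EOF");
            off += r;
        }
    }

    /** Undo the gzip layer, then walk the tar framing entry by entry, so
     *  each archived file comes out separately instead of concatenated. */
    static Map<String, byte[]> split(InputStream compressed) throws IOException {
        Map<String, byte[]> entries = new LinkedHashMap<>();
        try (InputStream in = new GZIPInputStream(compressed)) {
            byte[] header = new byte[512];
            while (true) {
                readFully(in, header);
                if (header[0] == 0) break;  // all-zero block: end of archive
                // Name is NUL-padded; size is an ASCII octal number.
                String name = new String(header, 0, 100,
                        StandardCharsets.US_ASCII).trim();
                int size = Integer.parseInt(new String(header, 124, 12,
                        StandardCharsets.US_ASCII).trim(), 8);
                byte[] data = new byte[size];
                readFully(in, data);
                entries.put(name, data);
                readFully(in, new byte[(512 - size % 512) % 512]);  // skip pad
            }
        }
        return entries;
    }

    public static void main(String[] args) throws IOException {
        // Build a tiny two-file tar.gz in memory to stand in for TempDir.gz.
        ByteArrayOutputStream tar = new ByteArrayOutputStream();
        putEntry(tar, "a.txt", "hello".getBytes(StandardCharsets.US_ASCII));
        putEntry(tar, "b.txt", "world!".getBytes(StandardCharsets.US_ASCII));
        tar.write(new byte[1024]);  // end-of-archive marker: two zero blocks
        ByteArrayOutputStream gz = new ByteArrayOutputStream();
        try (GZIPOutputStream g = new GZIPOutputStream(gz)) {
            tar.writeTo(g);
        }

        // Each entry is recovered under its own name, not merged into one.
        Map<String, byte[]> entries =
                split(new ByteArrayInputStream(gz.toByteArray()));
        for (Map.Entry<String, byte[]> e : entries.entrySet()) {
            System.out.println(e.getKey() + " -> "
                    + new String(e.getValue(), StandardCharsets.US_ASCII));
        }
        // Prints:
        // a.txt -> hello
        // b.txt -> world!
    }
}
```

In the HDFS program above, the same pattern would mean: open `fs.open(inputPath)`, wrap it in the codec's decompressing stream, iterate the archive entries, and call `fs.create(new Path(outputDir, entryName))` once per entry instead of writing a single `outputUri`. If the archive is a zip rather than a tar.gz, `java.util.zip.ZipInputStream` already provides the per-entry iteration directly.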