I need to filter epoch values (stored as longs) against start and end epoch values in an HBase MapReduce job

qni6mghb · Posted 2021-06-10 in HBase

I have written a class for comparing long values in HBase by extending WritableByteArrayComparable. Here is the code for reference:

import org.apache.hadoop.hbase.filter.WritableByteArrayComparable;
import org.apache.hadoop.hbase.util.Bytes;

public class LongWritableComparable extends WritableByteArrayComparable {

    public LongWritableComparable() {
        super();
    }

    public LongWritableComparable(byte[] value) {
        super(value);
    }

    public LongWritableComparable(Long value) {
        super(Bytes.toBytes(value));
    }

    @Override
    public int compareTo(byte[] otherValue, int offset, int length) {
        // Decode both sides as big-endian longs and compare numerically.
        long thisLong = Bytes.toLong(this.getValue());
        long otherLong = Bytes.toLong(otherValue, offset, length);

        if (thisLong == otherLong) {
            return 0;
        }
        if (thisLong < otherLong) {
            return -1;
        }
        return 1;
    }
}
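The comparator above simply orders longs by their 8-byte big-endian encodings. A minimal, HBase-free sketch of the same logic (using `java.nio.ByteBuffer` as a stand-in for HBase's `Bytes` utility, under the assumption that `Bytes.toBytes(long)`/`Bytes.toLong` use the same big-endian layout) can be tested in isolation:

```java
import java.nio.ByteBuffer;

// Stand-alone sketch of the comparator's ordering logic.
// ByteBuffer's default byte order is big-endian, matching the
// assumed encoding of HBase's Bytes.toBytes(long).
public class LongBytesCompare {

    // Encode a long as 8 big-endian bytes, like Bytes.toBytes(long).
    static byte[] toBytes(long v) {
        return ByteBuffer.allocate(8).putLong(v).array();
    }

    // Decode 8 bytes starting at offset, like Bytes.toLong(b, offset, 8).
    static long toLong(byte[] b, int offset) {
        return ByteBuffer.wrap(b, offset, 8).getLong();
    }

    // Same contract as the custom comparator's compareTo:
    // negative if thisLong < other, 0 if equal, positive if greater.
    static int compareTo(long thisLong, byte[] other, int offset) {
        long otherLong = toLong(other, offset);
        return Long.compare(thisLong, otherLong);
    }
}
```

This also shows why the encode/decode round trip must be symmetric: the filter compares raw cell bytes, so any unit or encoding mismatch between writer and comparator silently breaks the ordering.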

I use the comparator in my driver class as follows:

long endtimelongval = Long.valueOf(datetime.get("endDate").getMillis()).longValue();

LongWritableComparable etval=new LongWritableComparable(endtimelongval);

SingleColumnValueFilter eventCreationEndTimeFilter = new SingleColumnValueFilter(Bytes.toBytes("d"), Bytes.toBytes("et"), CompareOp.LESS, etval);

When I execute the above code, it throws the following error:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.filter.SingleColumnValueFilter.<init>([B[BLorg/apache/hadoop/hbase/filter/CompareFilter$CompareOp;Lorg/apache/hadoop/hbase/filter/WritableByteArrayComparable;)V

at the SingleColumnValueFilter line.
Can someone help me resolve this? Thanks in advance.


fwzugrvs1#

I found an approach that does not require writing a custom comparator at all. getMillis() returns the timestamp in milliseconds, but in my HBase table the epoch values are stored in seconds, so I changed the code to the following and it works correctly:
long startLongVal = Long.valueOf(datetime.get("startDate").getMillis()).longValue() / 1000L;
SingleColumnValueFilter eventCreationStartTimeFilter = new SingleColumnValueFilter(Bytes.toBytes("d"), Bytes.toBytes("et"), CompareOp.GREATER, Bytes.toBytes(startLongVal));
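The unit fix above boils down to milliseconds-to-seconds integer division; a tiny illustration of that conversion (the class and method names here are hypothetical, not part of the original code):

```java
public class EpochUnits {
    // Convert an epoch timestamp in milliseconds (e.g. what Joda-Time's
    // getMillis() returns) to whole seconds, matching the units stored
    // in the HBase table. Integer division truncates the sub-second part.
    static long millisToSeconds(long epochMillis) {
        return epochMillis / 1000L;
    }
}
```

With the value passed to Bytes.toBytes() now in seconds, the raw byte comparison done by SingleColumnValueFilter lines up with what the table actually stores.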
