I get an error when batch-writing data with the HBase client. It works fine at first, but after a while it starts failing. The detailed error is:

1 time, org.apache.hadoop.hbase.exceptions.FailedSanityCheckException: Requested row out of range for HRegion idcard,bfef6945ac273d83\x00\x00\x00\x00\x17\xcc$,1461584032622.dadb8843fe441dac4a3d7669597ef5., startKey='bfef6945ac273d83\x00\x00\x00\x17\xcc$', getEndKey()='', row='9a6ec957205e1d74\x00\x00\x00\x00\x01\x90\x1f\xf5'
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.doBatchOp(RSRpcServices.java:712)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:662)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2046)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32393)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2117)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
    at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
    at java.lang.Thread.run(Thread.java:745)
The environment is:
hbase-1.1.3 (server)
hadoop-2.6
hbase-client 1.2.0
The HBase client code is:
public static void batchPutData(Connection connection, long startNum, long count) throws IOException, ParseException {
    // table
    Table table = connection.getTable(TableName.valueOf(TABLE_NAME));
    // index table
    Table index_table = connection.getTable(TableName.valueOf(INDEX_TABLE_NAME));
    // random name
    RandomChineseName randomChineseName = new RandomChineseName();
    // random car
    RandomCar randomCar = new RandomCar();

    List<Put> puts = new ArrayList<Put>();
    List<Put> indexPlateputs = new ArrayList<Put>();

    for (long i = 0; i < count; i++) {
        long index = startNum + i;
        Date birthdate = RandomUtils.randomDate();
        String birthdateStr = DateUtil.dateToStr(birthdate, "yyyy-MM-dd");
        boolean isBoy = i % 2 == 0 ? true : false;
        String name = isBoy ? randomChineseName.randomBoyName() : randomChineseName.randomGirlName();
        String nation = RandomUtils.randomNation();
        String plate = randomCar.randomPlate();

        byte[] idbuff = Bytes.toBytes(index);
        String hashPrefix = MD5Hash.getMD5AsHex(idbuff).substring(0, 16);

        // create a put for table
        Put put = new Put(Bytes.add(Bytes.toBytes(hashPrefix), idbuff));
        put.addColumn(Bytes.toBytes("idcard"), Bytes.toBytes("name"), Bytes.toBytes(name));
        put.addColumn(Bytes.toBytes("idcard"), Bytes.toBytes("sex"), Bytes.toBytes(isBoy ? 1 : 0));
        put.addColumn(Bytes.toBytes("idcard"), Bytes.toBytes("birthdate"), Bytes.toBytes(birthdateStr));
        put.addColumn(Bytes.toBytes("idcard"), Bytes.toBytes("nation"), Bytes.toBytes(nation));
        put.addColumn(Bytes.toBytes("idcard"), Bytes.toBytes("plate"), Bytes.toBytes(plate));
        puts.add(put);

        // create a put for index table
        String namehashPrefix = MD5Hash.getMD5AsHex(Bytes.toBytes(name)).substring(0, 16);
        byte[] bprf = Bytes.add(Bytes.toBytes(namehashPrefix), Bytes.toBytes(name));
        bprf = Bytes.add(bprf, Bytes.toBytes(SPLIT), Bytes.toBytes(birthdateStr));
        Put namePut = new Put(Bytes.add(bprf, Bytes.toBytes(SPLIT), Bytes.toBytes(index)));
        namePut.addColumn(Bytes.toBytes("index"), Bytes.toBytes("idcard"), Bytes.toBytes(0));
        indexPlateputs.add(namePut);

        // insert for every ten thousands
        if (i % 10000 == 0) {
            table.put(puts);
            index_table.put(indexPlateputs);
            puts.clear();
            indexPlateputs.clear();
        }
    }
}
1 Answer
This looks like an HBase version conflict. Change the HBase client version to 1.1.4 or 1.0.0 or another stable release and try again.
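Before changing the dependency, it may help to confirm the client/server mismatch at runtime. Below is a minimal sketch (not part of the original post) that prints the version of the hbase-client jar on the classpath via VersionInfo and the version the cluster reports via Admin/ClusterStatus; the class name CheckVersions is made up for illustration, and the snippet assumes an hbase-site.xml on the classpath.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.util.VersionInfo;

// Hypothetical helper, only used to compare the client jar and cluster versions.
public class CheckVersions {
    public static void main(String[] args) throws Exception {
        // Version of the hbase-client jar actually on the classpath (e.g. 1.2.0).
        System.out.println("client jar version: " + VersionInfo.getVersion());

        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Admin admin = connection.getAdmin()) {
            // Version the HBase cluster is running (e.g. 1.1.3).
            System.out.println("server version: " + admin.getClusterStatus().getHBaseVersion());
        }
    }
}

If the two versions differ (here a 1.2.0 client against a 1.1.3 server), aligning the hbase-client dependency in the build file with the server's release line, as the answer suggests, is the thing to try first.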