Problems Encountered with Hadoop and HBase (continued): the java.io.IOException: Non-increasing Bloom keys Exception

Posted by 一隻鳥的天空 on 2014-05-30

While bulk-loading data into HBase, generating the HFiles with a hand-written Mapper and the KeyValueSortReducer, the job failed with the following exception:

java.io.IOException: Non-increasing Bloom keys: 201301025200000000000003520000000000000500 after 201311195100000000000000010000000000001600

    at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.appendGeneralBloomfilter(StoreFile.java:869)
    at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.append(StoreFile.java:905)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat$1.write(HFileOutputFormat.java:180)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat$1.write(HFileOutputFormat.java:136)
    at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:586)
    at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
    at org.apache.hadoop.hbase.mapreduce.KeyValueSortReducer.reduce(KeyValueSortReducer.java:53)
    at org.apache.hadoop.hbase.mapreduce.KeyValueSortReducer.reduce(KeyValueSortReducer.java:36)
    at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:177)
    at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:649)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:418)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)


The exception comes from the StoreFile class, i.e. it is thrown while StoreFile is writing the HFile. The stack trace shows that it originates at StoreFile.java line 905, inside the append method, which calls appendGeneralBloomfilter to add the row's Bloom key; that method requires the Bloom keys to arrive in non-decreasing order.
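In simplified form the failing check can be pictured as follows. This is only an illustrative paraphrase, not the actual HBase source, which additionally deduplicates rows and handles ROWCOL Bloom filters:

import java.io.IOException;
import org.apache.hadoop.hbase.util.Bytes;

// Illustrative paraphrase of the ordering check performed while the Bloom
// filter of an HFile is being built: the writer remembers the last Bloom key
// it appended and rejects any later key that compares smaller.
public class BloomKeyCheckSketch {
    private byte[] lastBloomKey;

    public void appendBloomKey(byte[] bloomKey) throws IOException {
        if (lastBloomKey != null && Bytes.compareTo(bloomKey, lastBloomKey) < 0) {
            throw new IOException("Non-increasing Bloom keys: "
                    + Bytes.toStringBinary(bloomKey) + " after "
                    + Bytes.toStringBinary(lastBloomKey));
        }
        lastBloomKey = bloomKey;
        // ...the key would then be added to the Bloom filter...
    }
}

In other words, the KeyValues must arrive at the writer already sorted by row key. The Mapper I had written to generate the KeyValues was the following: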

public static class HFileGenerateMapper extends
        Mapper<LongWritable, Text, ImmutableBytesWritable, KeyValue> {

    private static int familyIndex = 0;
    private static Configuration conf = null;
    private static MyMD5 md5 = new MyMD5();

    @Override
    protected void setup(Context context) throws IOException,
            InterruptedException {
        conf = context.getConfiguration();
        familyIndex = conf.getInt("familyIndex", 0);
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Map output key used by the shuffle for sorting and partitioning:
        // built from the first input field only.
        ImmutableBytesWritable mykey = new ImmutableBytesWritable(
                value.toString().split(",")[0].getBytes());
        List<KeyValue> list = createKeyValue(value.toString());
        for (KeyValue kv : list) {
            if (kv != null) {
                context.write(mykey, kv);
            }
        }
    }

    /**
     * Input line format:
     * a.CITY_NO,to_char(DT,'yyyy-MM-dd'),DATA_TYPE,E0,E1,E2,E3,E4,E5,
     * MEASUREPOINTID,TRANSFORMERID,ZONEID,CAPACITY
     * @param str one comma-separated input line
     * @return the KeyValues for the current column family
     */
    private List<KeyValue> createKeyValue(String str) {
        List<KeyValue> list = new ArrayList<KeyValue>(
                CONSTANT_HBASE.TB2_FNColNames[familyIndex].length);
        String[] values = str.split(",");
        String[] qualifiersName = CONSTANT_HBASE.TB2_FNColNames[familyIndex];
        for (int i = 0; i < qualifiersName.length; i++) {
            // The row key is the concatenation of several fields...
            String rowkey = values[1] + values[0] + values[11] + values[12];
            // ...followed by a 32-character MD5 of that string.
            rowkey += md5.getMD5Code(rowkey);
            String family = CONSTANT_HBASE.TB2_FamilyNames[familyIndex];
            String qualifier = qualifiersName[i];
            String valueStr = values[i + CONSTANT_HBASE.TB2_FNColIndex[familyIndex] - 1];

            KeyValue kv = new KeyValue(Bytes.toBytes(rowkey),
                    Bytes.toBytes(family), Bytes.toBytes(qualifier),
                    CONSTANT_HBASE.timeStamp, Bytes.toBytes(valueStr));
            list.add(kv);
        }
        return list;
    }
}

The crucial mistake is this line:

ImmutableBytesWritable mykey = new ImmutableBytesWritable(value.toString().split(",")[0].getBytes());

The row key that actually ends up in HBase is built from several fields plus a 32-character MD5 suffix, but mykey, which MapReduce uses as its global sort key, is built from the first field alone. The KeyValues therefore reach StoreFile.Writer sorted by the first field rather than by their full row keys, so mykey has to match the row key of each KeyValue kv.
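To see the effect, compare the two row keys quoted in the exception message directly (a tiny stand-alone demonstration; the keys are copied verbatim from the error above):

import org.apache.hadoop.hbase.util.Bytes;

public class KeyOrderDemo {
    public static void main(String[] args) {
        // The writer had already appended a cell with this row key...
        byte[] previous = Bytes.toBytes("201311195100000000000000010000000000001600");
        // ...and then received this one, because the shuffle had sorted the
        // records by the first input field instead of by the full row key.
        byte[] next = Bytes.toBytes("201301025200000000000003520000000000000500");
        // Prints a negative number: "next" sorts before "previous", i.e. the
        // row keys are no longer in increasing order.
        System.out.println(Bytes.compareTo(next, previous));
    }
}

The fix is therefore to change the map method as follows: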

@Override
protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
    List<KeyValue> list = createKeyValue(value.toString());
    for (KeyValue kv : list) {
        if (kv != null) {
            // Sort by the KeyValue's own key so that the shuffle order
            // matches the row key order written into the HFile.
            context.write(new ImmutableBytesWritable(kv.getKey()), kv);
        }
    }
}

After this change the job ran successfully; its status can be monitored at http://localhost:50030/jobtracker.jsp.
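For reference, below is a minimal sketch of a driver that wires this Mapper to KeyValueSortReducer and HFileOutputFormat and then bulk-loads the result, assuming the HBase 0.94-era API visible in the stack trace. The class name BulkLoadDriver, the table name "TB2", the input/output paths and the familyIndex value are placeholders of mine, not details from this post:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat;
import org.apache.hadoop.hbase.mapreduce.KeyValueSortReducer;
import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class BulkLoadDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        conf.setInt("familyIndex", 0);                 // consumed in the Mapper's setup()

        Job job = new Job(conf, "hfile-bulkload");
        job.setJarByClass(BulkLoadDriver.class);
        job.setMapperClass(HFileGenerateMapper.class); // the Mapper shown above
        job.setMapOutputKeyClass(ImmutableBytesWritable.class);
        job.setMapOutputValueClass(KeyValue.class);
        job.setReducerClass(KeyValueSortReducer.class);

        // configureIncrementalLoad also installs TotalOrderPartitioner with the
        // target table's region boundaries as split points.
        HTable table = new HTable(conf, "TB2");
        HFileOutputFormat.configureIncrementalLoad(job, table);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        if (job.waitForCompletion(true)) {
            // Move the generated HFiles into the table's regions.
            new LoadIncrementalHFiles(conf).doBulkLoad(new Path(args[1]), table);
        }
    }
}

Because both the partitioning and the final sort are driven by the map output key, that key has to reflect the real row key, which is exactly why the fix above works.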
