Flume 1.6.0 error: protobuf "This is supposed to be overridden by subclasses"
1. Scenario:
Apache Flume agent: [http --> memory --> hdfs (CDH4)] (requests are posted over HTTP, buffered in a memory channel, and written to HDFS on a CDH4 cluster).
The Flume agent machine only has the CDH4 environment files installed (no Apache Hadoop distribution),
hence JAVA_HOME=/opt/cloudera/parcels/CDH/lib/hadoop
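For reference, a minimal agent configuration for this topology might look like the sketch below; the agent name a1, the listen port 50000 and the channel capacity are illustrative assumptions, while the HDFS path follows the one seen in the log in section 2:

a1.sources = r1
a1.channels = c1
a1.sinks = k1
# HTTP source: accepts JSON events POSTed to port 50000 (port is an assumption)
a1.sources.r1.type = http
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 50000
a1.sources.r1.channels = c1
# Memory channel
a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000
# HDFS sink writing to the CDH4 cluster, bucketed by date/hour as in the log below
a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
a1.sinks.k1.hdfs.path = hdfs://alish1-dataservice-01.mypna.cn:8022/testwjp/%Y-%m-%d/%H
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.useLocalTimeStamp = true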
2. Error:
2016-05-21 19:31:27,756 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:234)] Creating hdfs://alish1-dataservice-01.mypna.cn:8022/testwjp/2016-05-21/19/FlumeData.1463830281582.tmp
2016-05-21 19:31:27,791 (SinkRunner-PollingRunner-DefaultSinkProcessor) [ERROR - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:459)] process failed
java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
at org.apache.hadoop.hdfs.protocol.proto.HdfsProtos$FsPermissionProto.getSerializedSize(HdfsProtos.java:5407)
at com.google.protobuf.CodedOutputStream.computeMessageSizeNoTag(CodedOutputStream.java:749)
at com.google.protobuf.CodedOutputStream.computeMessageSize(CodedOutputStream.java:530)
3. Analysis:
java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
This exception is typical of a protobuf version mismatch: a message class generated with protoc 2.4.x (here HdfsProtos from the CDH4 Hadoop client) is running against the protobuf-java 2.5.0 runtime, whose GeneratedMessage.getUnknownFields() throws unless the generated subclass overrides it. In other words, two different versions of the protobuf jar are probably on the classpath.
4. Verification:
[root@xxx-01 ~]# find / -name protobuf*.jar
/usr/lib/hadoop-hdfs/protobuf-java-2.4.0a.jar
/usr/share/cmf/lib/cdh4/protobuf-java-2.4.0a.jar
/data/01/local/apache-flume-1.6.0-bin/lib/protobuf-java-2.5.0.jar
/data/01/local/apache-tomcat-7.0.42/webapps/logshedcollector/WEB-INF/lib/protobuf-java-2.4.0a.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/flume-ng/lib/protobuf-java-2.4.1.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/oozie/libtools/protobuf-java-2.4.0a.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hadoop-0.20-mapreduce/lib/protobuf-java-2.4.0a.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hbase/lib/protobuf-java-2.4.0a.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hadoop-httpfs/webapps/webhdfs/WEB-INF/lib/protobuf-java-2.4.0a.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/mahout/lib/protobuf-java-2.4.0a.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hadoop-hdfs/lib/protobuf-java-2.4.0a.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hadoop/client-0.20/protobuf-java-2.4.0a.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hadoop/client-0.20/protobuf-java.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hadoop/lib/protobuf-java-2.4.0a.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hadoop/client/protobuf-java-2.4.0a.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hadoop/client/protobuf-java.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hadoop-yarn/lib/protobuf-java-2.4.0a.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hcatalog/share/webhcat/svr/lib/protobuf-java-2.4.0a.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hadoop-mapreduce/lib/protobuf-java-2.4.0a.jar
Sure enough, two different versions are present:
/data/01/local/apache-flume-1.6.0-bin/lib/protobuf-java-2.5.0.jar
/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hadoop/lib/protobuf-java-2.4.0a.jar
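As an optional sanity check (assuming unzip is available on the box), you can confirm that both jars really ship the colliding com.google.protobuf classes:

unzip -l /data/01/local/apache-flume-1.6.0-bin/lib/protobuf-java-2.5.0.jar | grep GeneratedMessage.class
unzip -l /opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hadoop/lib/protobuf-java-2.4.0a.jar | grep GeneratedMessage.class

Both should list com/google/protobuf/GeneratedMessage.class; whichever copy appears first on the agent's classpath wins, and here that appears to be Flume's own 2.5.0 jar, which then ends up serving the 2.4.0a-generated Hadoop classes.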
5. Solution
According to http://caiguangguang.blog.51cto.com/1652935/1592804,
the message "This is supposed to be overridden by subclasses" is only produced by the newer protobuf-java-2.5.0.jar,
so move the newer jar out of the way, restart Flume, replay the simulated HTTP requests, and the data is written to HDFS successfully.
mv /data/01/local/apache-flume-1.6.0-bin/lib/protobuf-java-2.5.0.jar /data/01/local/apache-flume-1.6.0-bin/lib/protobuf-java-2.5.0.jar.bak
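To verify the fix after the move, a sequence along these lines can be used (the config file name, agent name and port match the illustrative sketch in section 1, so adjust them to your setup):

cd /data/01/local/apache-flume-1.6.0-bin
bin/flume-ng agent --name a1 --conf conf --conf-file conf/http-memory-hdfs.conf -Dflume.root.logger=INFO,console &
# POST one test event in the JSON format expected by the default HTTPSource handler
curl -X POST -H "Content-Type: application/json" \
     -d '[{"headers":{},"body":"test event"}]' http://localhost:50000
# A .tmp file should appear (and later roll) under the bucketed path on HDFS
hadoop fs -ls /testwjp/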
6. Reflection: a remaining concern
flume 1.6.0 --> protobuf-java-2.5.0.jar
cdh4.8.6 --> protobuf-java-2.4.0a.jar
cdh5.4.8 --> protobuf-java-2.5.0.jar
With its own protobuf-java-2.5.0.jar removed, Flume now effectively runs against the 2.4.0 jar.
The concern is that Flume 1.6.0 itself may call methods that only exist in protobuf 2.5.0; if it does, errors will be thrown at that point, so for now we can only wait and see.
7. Recommendation
If you run CDH, it is best to use the Flume version bundled with that CDH release;
see http://blog.itpub.net/30089851/viewspace-2092318/ for how to look up the matching version on the Cloudera site.
CDH4.8.6 --> Flume 1.4.0
CDH5.4.8 --> Flume 1.5.0
If you run Apache Hadoop, likewise match the Flume version to your Hadoop version.