ambari 2.8 + ambari-metrics 3.0 + bigtop 3.2: compiling, packaging, and installing

Posted by zhangxuhui on 2024-07-30
  • bigtop compilation
    • Resources:
      • Software and source-release mirrors
      • Development-package mirrors
      • Accessing GitHub
    • Background for the build
      • Technical notes
      • Bigtop build flow and lessons learned
      • Per-module build difficulty and rough duration (pure build time, excluding downloads and troubleshooting)
  • Building branch-3.2 on a physical CentOS machine
    • Hardware:
    • Build steps
      • Clone the code and switch branches
      • Domestic (CN) mirror configuration
      • Base environment preparation
        • Installing dependencies (yum)
        • Configuring dependencies
        • Domestic mirrors | global tool configuration
      • Patching component sources
        • Downloading component sources
        • Modifying the code
          • Modifying the hadoop code
          • Modifying the flink code
          • Modifying the tez code
          • Modifying the zeppelin code
          • Modifying the spark code
        • Repacking component sources
      • Whole-tree build [not recommended]
        • Whole-tree build command (run from the bigtop root)
      • Component-by-component build
        • Build the peripheral dependency components first:
        • zookeeper (21 subprojects, 6 min)
        • hadoop (111 subprojects, 48 + 12 min)
        • hbase (44 subprojects, 11 + 5 + 10 min)
        • hive (42 subprojects, 9 min)
        • phoenix (15 subprojects, 14 min)
        • tez (29 subprojects, 4 min)
        • spark (29 subprojects, 49 + 62 min)
        • kafka (5 min)
        • flink (207 subprojects, 42 min)
        • solr
        • zeppelin
      • Disk usage after the build
    • branch-3.2 per-component build times [pure build time, excluding package downloads]
      • zookeeper (21 subprojects, 6 min, OK)
      • hadoop (111 subprojects, 60 min, OK)
      • hbase (44 subprojects, 30 min, OK)
      • tez (29 subprojects, 4 min)
      • Phoenix (15 subprojects, 14 min)
      • spark (29 subprojects, 2 h, OK)
      • flink (207 subprojects, 42 min, OK)
  • docker-centos build
  • bigtop source code analysis
    • packages.gradle

bigtop compilation

Resources:

Software and source-release mirrors

  • Apache source and release downloads
    • archive (almost all versions): https://archive.apache.org/dist/
    • archive (recent versions): https://apache.osuosl.org/
    • CN archive (almost all versions): https://repo.huaweicloud.com/apache/
    • CN archive (recent versions): https://mirrors.aliyun.com/apache/
    • Tencent Cloud: https://mirrors.cloud.tencent.com/
    • Tsinghua University mirror: https://mirrors.tuna.tsinghua.edu.cn/
  • CN mirrors for language runtimes and related installers
    • gradle
      • https://mirrors.aliyun.com/macports/distfiles/ -> https://mirrors.aliyun.com/macports/distfiles/gradle/
      • https://mirrors.cloud.tencent.com/gradle/
    • nodejs:
      • https://nodejs.org/dist/
      • https://mirrors.tuna.tsinghua.edu.cn/nodejs-release/
      • https://mirrors.cloud.tencent.com/nodejs-release/
      • https://mirrors.aliyun.com/nodejs-release/
    • npm
      • official: https://registry.npmjs.org/npm/-/npm-6.9.0.tgz
      • Aliyun: https://mirrors.aliyun.com/macports/distfiles/

Development-package mirrors

  • jar packages [maven mirrors]:

  • node packages [npm / yarn mirrors]
    • CN mirror: https://registry.npmmirror.com
    • usage:
      • npm config set registry https://registry.npmmirror.com
      • yarn config set registry https://registry.npmmirror.com
      • bower config set registry https://registry.npmmirror.com
  • bower:
    • purpose: 1) installs packages straight from an npm mirror; 2) can git clone a GitHub release branch and install from it
    • background reading: https://www.oschina.net/p/bower?hmsr=aladdin1e1
    • upstream registry: https://bower.herokuapp.com
    • mirror reachable from CN: https://registry.bower.io
    • command bower runs under the hood:
      • list git branches/tags: git ls-remote --tags --heads https://github.com/components/ember.git

Accessing GitHub

Edit the hosts file so GitHub is reachable. Look up each domain's current IP at https://www.ip138.com/ and add the domain/IP pairs to hosts:

140.82.112.4 github.com
199.232.69.194 github.global.ssl.fastly.net
185.199.108.133 raw.githubusercontent.com
185.199.109.133 raw.githubusercontent.com
185.199.110.133 raw.githubusercontent.com
185.199.111.133 raw.githubusercontent.com

Background for the build

Technical notes

  • Maven command-line flags worth knowing
    • The different ways to skip things
      • -Drat.skip=true: the RAT plugin checks source files for license problems (i.e. that every file carries a proper license header); this flag skips the license check
      • -Dmaven.test.skip=true: skips both test compilation and test execution
      • -DskipTests: skips test execution but still compiles the test code
      • Some components (e.g. most flink subprojects) need the test code compiled; others (the io.confluent-related ones) do not.
    • Timestamps in the maven log: -Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat="yyyy-MM-dd HH:mm:ss.SSS"
    • When a maven build fails, fix the cause (e.g. network trouble) and resume from the failed module with mvn <args> -rf :xxx, where xxx is the module that failed last time
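The resume flag is easy to get wrong (the colon belongs to the module name, as in -rf :xxx). A small sketch of pulling the failed module out of a reactor log line and assembling the resume command; the error line and module name below are made-up examples, not output from this build:

```shell
# Extract the failed module from a typical Maven error line
# ("Failed to execute goal ... on project <module>: ...") and build
# the resume command. The log line here is a fabricated example.
log='[ERROR] Failed to execute goal ... on project flink-runtime-web: Compilation failure'
module=$(printf '%s' "$log" | sed -n 's/.*on project \([^:]*\):.*/\1/p')
echo "mvn package install -DskipTests -rf :$module"
```

Running this prints the command to resume from the failed module and everything after it.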

Bigtop build flow and lessons learned

  • Key files and directories
    • bigtop.bom: pins the version of every component
    • packages.gradle: defines the overall gradle build flow for components
    • component config dir ("packages" for short): bigtop/bigtop-packages/src/common/xxx/, where xxx is the component name
      • do-component-build: usually wraps the maven command so gradle can call it
    • component source-tarball download dir ("dl"): bigtop/dl/xxx, where xxx is the component name
    • component build dir ("build"): bigtop/build/xxx, where xxx is the component name
      • rpm/SOURCES: the files from the component config dir are copied here, along with the source tarball from the dl dir
      • rpm/BUILD: the component source is unpacked and compiled here (this dir is regenerated on every build, so editing files here before a build has no effect)
      • rpm/RPMS: where the built rpm packages land; if files already exist here, gradle skips the build
      • rpm/SRPMS: where the srpm packages land
      • tar: holds the source tarball
  • Component download-and-build flow (reading packages.gradle in full is recommended):
    • file flow: dl/spark-xxx [tar.gz|zip|tgz] -> build/spark/rpm/SOURCES/ [tar.gz & source code] -> build/spark/rpm/BUILD/spark/spark-xxx [source code] -> build/spark/rpm/RPMS/ [src.rpm] -> output/spark/ [src.rpm]
      1. Download the component source tarball into dl; if the tarball or build/xxx/.download already exists, it is not downloaded again (use this rule to pre-download tarballs and save time, or to repack after changing the source)
      2. Unpack the tarball from dl into the build dir and apply the config from the packages dir. (This step repeats until the build succeeds, so editing the unpacked code is pointless: the next unpack overwrites it)
      3. Build inside the build dir. If a config change does not take effect (e.g. the npm/yarn/bower mirror settings), delete the build dir and re-run the xxx-rpm gradle task
  • Lessons learned
    • Do the configuration in the packages dir before building, above all switching to CN mirrors.
    • Pre-download the source tarballs into dl; some of them need code changes and a repack
    • If a frontend build fails (e.g. tez-ui), cd into its directory and run the node install commands yourself: npm install, yarn install, bower install
      • Note: do not use the global npm/yarn/bower; locate the project-local binaries and run them with the full path
    • If a maven build fails, fix the cause, then prefer mvn ... -rf :xxx to rebuild that module and everything after it; once maven succeeds end-to-end, rebuild the whole component with gradle
      • The exact maven command can be taken from the gradle build log
    • Keep the gradle build log, so that after a failure you can find the cause, the maven and frontend commands, and the paths of the project-local npm/yarn/bower binaries
  • Parallel builds: bigtop 3.2.0 does not appear to support them; 3.3.0 adds --parallel --parallel-threads=N
  • Log everything, to make it easy to recover the mvn command and to debug:
    • gradle build:
      • ./gradlew tez-rpm -PparentDir=/usr/bigtop >> tez-rpm.log 2>> tez-rpm.log
    • maven build (take the exact mvn command from the gradle log):
      • mvn package install -DskipTests -Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat="yyyy-MM-dd HH:mm:ss.SSS" >> /soft/code/bigtop/spark-rpm.log 2>> /soft/code/bigtop/spark-rpm.log
    • frontend builds:
      • npm install
      • yarn install
      • bower install --allow-root
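Step 1 of the flow (the download is skipped when the tarball already exists) can be put to work by seeding dl/ by hand. A sketch, simulated here with touch; the spark file name matches the one this build uses, but check the exact name your bigtop.bom expects (renaming the downloaded file may be needed):

```shell
# Pre-seed bigtop's dl/ directory so the <component>-download gradle task
# finds the tarball and skips the network. In a real run you would fetch
# the archive from a mirror first, e.g.:
#   wget https://repo.huaweicloud.com/apache/spark/spark-3.2.3/spark-3.2.3.tgz
# and rename it if the name bigtop expects differs.
mkdir -p bigtop/dl
touch bigtop/dl/spark-3.2.3.tar.gz   # stand-in for the real archive
ls bigtop/dl/
```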

Per-module build difficulty and rough duration (pure build time, excluding downloads and troubleshooting)

  • component | difficulty (out of 5 stars) | pure build time
  • bigtop-ambari-mpack * (16 s) [OK]
  • bigtop-groovy * (5 s) [OK]
  • bigtop-jsvc * (5 s) [OK]
  • bigtop-select * (5 s) [OK]
  • bigtop-utils * (5 s) [OK]
  • hadoop *** (1 h) [OK, no log kept]
  • zookeeper * (6 min) [OK, no log kept]
  • hive * (1 h) [OK]
  • hbase * (30 min) [OK]
  • phoenix * (14 min) [OK]
  • tez *** (4 min) [OK]
  • ranger (not included)
  • solr *** (5 min) [OK]
  • kafka *** (5 min) [OK]
  • spark ***** (2 h) [OK; dropped the local R make, but kept SparkR itself]
  • flink ***** (42 min) []
  • zeppelin () []

Building branch-3.2 on a physical CentOS machine

  • References (strongly recommended reading before you start)
    • BigTop 3.2.0 component builds -- base environment: https://blog.csdn.net/m0_48319997/article/details/128037302
    • BigTop 3.2.0 component builds -- the components: https://blog.csdn.net/m0_48319997/article/details/128101667
    • ambari 2.8.0 + bigtop 3.2.0 distribution build guide: https://blog.csdn.net/m0_48319997/article/details/130046296
    • Ambari + Bigtop platform install guide (CentOS 7), part 1: https://blog.csdn.net/m0_48319997/article/details/130069050
    • Ambari + Bigtop platform install guide (CentOS 7), part 2: https://blog.csdn.net/m0_48319997/article/details/130233526

Hardware:

  • Lenovo ThinkPad W540: 32 GB RAM, 4-core/8-thread CPU, 1 TB SSD
  • Build environment: VMware CentOS 7.9, 2 cores / 4 threads (half the physical machine), 12 GB RAM, 500 GB disk (at least 100 GB recommended; see "Disk usage after the build")
  • Note: the CentOS VM image sits on an SSD partition to speed things up

Build steps

Clone the code and switch branches

git clone https://gitee.com/piaolingzxh/bigtop.git
cd bigtop/
git checkout -b branch-3.2 origin/branch-3.2
  • Notes:
    • This build was done 2024-07-18 through 2024-07-29
    • branch-3.2 keeps receiving commits, so your checkout may be newer than mine; by then some of the problems below may be fixed, and new ones may appear.
    • The latest branch-3.2 commit at the time was dated 2024-06-02, git hash 3ffe75e05e8428f353a018aafc9c003be72ca6ff
    • branch-3.2 already contains the code of the 3.2.0 and 3.2.1 release tags, i.e. it is newer than both, and some commits are even newer than master.

Domestic (CN) mirror configuration

# Edit bigtop/bigtop.bom; there are two places to change (change both, or just the one matching your bigtop 3.2.x version)
# 1. point the mirrors at CN mirror sites (lines 103-104)
    APACHE_MIRROR = "https://repo.huaweicloud.com/apache"
    APACHE_ARCHIVE = "https://mirrors.aliyun.com/apache"
# 2. un-comment the bigtop-select component (delete lines 273 and 281)

Note: some artifacts cannot be found on the Aliyun mirror and need the original addresses:
     APACHE_MIRROR = "https://apache.osuosl.org"
     APACHE_ARCHIVE = "https://archive.apache.org/dist"
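One way to decide between the mirror and the archive is to probe for the exact artifact path before editing bigtop.bom. A sketch that assembles the candidate URLs; the zookeeper path is an illustrative example, and the real curl probe is left commented out so the snippet works offline:

```shell
# Build the candidate download URLs for one artifact from the two
# bigtop.bom variables. The artifact path below is an example; take the
# real one from a failed download in the gradle log.
APACHE_MIRROR="https://repo.huaweicloud.com/apache"
APACHE_ARCHIVE="https://archive.apache.org/dist"
artifact="zookeeper/zookeeper-3.6.4/apache-zookeeper-3.6.4.tar.gz"
for base in "$APACHE_MIRROR" "$APACHE_ARCHIVE"; do
  echo "$base/$artifact"
  # curl -sfI "$base/$artifact" >/dev/null && echo "  -> available"   # the real probe
done
```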

Base environment preparation

Installing dependencies (yum)

# Install the dependencies the component builds need
# 1. hadoop
yum -y install fuse-devel cmake cmake3 lzo-devel openssl-devel protobuf* cyrus-*
# make cmake3 the default cmake
mv /usr/bin/cmake /usr/bin/cmake.bak
ln -s /usr/bin/cmake3 /usr/bin/cmake
# 2. zookeeper
yum -y install cppunit-devel
# 3. spark
yum -y install R* harfbuzz-devel fribidi-devel libcurl-devel libxml2-devel freetype-devel libpng-devel libtiff-devel libjpeg-turbo-devel pandoc* libgit2-devel
# I never got the R packages fully installed, and the spark build skips the local R make anyway
Rscript -e "install.packages(c('knitr', 'rmarkdown', 'devtools', 'testthat', 'e1071', 'survival'), repos='http://mirrors.tuna.tsinghua.edu.cn/CRAN/')"
Rscript -e "install.packages(c('devtools'), repos='http://mirrors.tuna.tsinghua.edu.cn/CRAN/')"
Rscript -e "install.packages(c('evaluate'), repos='https://mirrors.bfsu.edu.cn/CRAN/')"

# errors from the local R make:
package 'evaluate' is not available (for R version 3.6.0)
dependency 'evaluate' is not available for package 'knitr'
Rscript -e "install.packages(c('knitr'), repos='http://mirrors.tuna.tsinghua.edu.cn/CRAN/')"
Rscript -e "install.packages(c('evaluate'), repos='http://mirrors.tuna.tsinghua.edu.cn/CRAN/')"

Configuring dependencies

  • GitHub-related hosts entries
140.82.112.4 github.com
199.232.69.194 github.global.ssl.fastly.net
185.199.108.133 raw.githubusercontent.com
185.199.109.133 raw.githubusercontent.com
185.199.110.133 raw.githubusercontent.com
185.199.111.133 raw.githubusercontent.com

Domestic mirrors | global tool configuration

  • maven mirror config; note: ~/.m2/settings.xml takes precedence over per-project settings
  • node mirror config: ~/.npmrc, or npm config set registry https://registry.npmmirror.com
  • yarn mirror config: ~/.yarnrc
  • bower mirror config: ~/.bowerrc
Put the following under the mirrors node of ~/.m2/settings.xml:
<mirror>
  <id>aliyunmaven</id>
  <mirrorOf>*</mirrorOf>
  <name>Aliyun public repository</name>
  <url>https://maven.aliyun.com/repository/public</url>
</mirror>
<mirror>
  <id>aliyunspring</id>
  <mirrorOf>*</mirrorOf>
  <name>Aliyun public repository</name>
  <url>https://maven.aliyun.com/repository/spring</url>
</mirror>
# ~/.npmrc
registry=https://registry.npmmirror.com/
strict-ssl=false
# ~/.yarnrc
registry "https://registry.npmmirror.com/"
sass_binary_site "https://registry.npmmirror.com/node-sass/"
phantomjs_cdnurl "https://registry.npmmirror.com/phantomjs"
electron_mirror "https://registry.npmmirror.com/electron"
#sqlite3_binary_host_mirror "https://foxgis.oss-cn-shanghai.aliyuncs.com/"
#profiler_binary_host_mirror "https://npm.taobao.org/mirrors/node-inspector/"
#chromedriver_cdnurl "https://cdn.npm.taobao.org/dist/chromedriver"
# ~/.bowerrc
{
  "directory": "bower_components",
  "registry": "https://registry.bower.io",
  "analytics": false,
  "resolvers": [
    "bower-shrinkwrap-resolver-ext"
  ],
  "strict-ssl": false
}

Patching component sources

Downloading component sources

# 1. download first
./gradlew hadoop-download tez-download zeppelin-download flink-download spark-download
# 2. enter the download directory
cd dl
# 3. unpack these tarballs
tar -zxvf hadoop-3.3.6.tar.gz
tar -zxvf apache-tez-0.10.2-src.tar.gz
tar -zxvf spark-3.2.3.tar.gz
tar -zxvf flink-1.15.3.tar.gz
tar -zxvf zeppelin-0.10.1.tar.gz

Modifying the code

  • Note: for each change below, also check the corresponding component's build section further down, which may add details
Modifying the hadoop code

-- dl/hadoop-3.3.6-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/pom.xml: between the yarn install and bower install executions, add a node like the following

<execution>
  <phase>generate-resources</phase>
  <id>bower install moment-timezone</id>
  <configuration>
    <arguments>install moment-timezone=https://github.com/moment/moment-timezone.git#=0.5.1 --allow-root</arguments>
  </configuration>
  <goals>
    <goal>bower</goal>
  </goals>
</execution>
Modifying the flink code
1) node/npm versions and related settings
vi flink-1.15.3/flink-runtime-web/pom.xml
line 275: change nodeVersion to v12.22.1
line 276: change npmVersion to 6.14.12
bigtop/dl/flink-1.15.3/flink-runtime-web/pom.xml
<arguments>ci --cache-max=0 --no-save ${npm.proxy}</arguments>  delete this line and replace it with the next one; note that ci is gone
<arguments> install -g --registry=https://registry.npmmirror.com --cache-max=0 --no-save</arguments>
2) rename the failing test code out of the way
cd dl/flink-1.15.3/flink-formats/flink-avro-confluent-registry/src/test/java/org/apache/flink/formats/avro/registry/confluent/
mv CachedSchemaCoderProviderTest.java CachedSchemaCoderProviderTest.java1
mv RegistryAvroFormatFactoryTest.java RegistryAvroFormatFactoryTest.java1
cd dl/flink-1.15.3/flink-end-to-end-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/
mv SQLClientSchemaRegistryITCase.java SQLClientSchemaRegistryITCase.java1
3) the io.confluent jars cannot be downloaded: fetch them by hand and install them into the local repository
wget https://packages.confluent.io/maven/io/confluent/common-config/6.2.2/common-config-6.2.2.jar
wget https://packages.confluent.io/maven/io/confluent/common-utils/6.2.2/common-utils-6.2.2.jar
wget https://packages.confluent.io/maven/io/confluent/kafka-avro-serializer/6.2.2/kafka-avro-serializer-6.2.2.jar
wget http://packages.confluent.io/maven/io/confluent/kafka-schema-registry-client/6.2.2/kafka-schema-registry-client-6.2.2.jar
# install the jars into the local repository
mvn install:install-file -Dfile=/soft/ambari-develop/common-config-6.2.2.jar -DgroupId=io.confluent -DartifactId=common-config -Dversion=6.2.2 -Dpackaging=jar
mvn install:install-file -Dfile=/soft/ambari-develop/common-utils-6.2.2.jar -DgroupId=io.confluent -DartifactId=common-utils -Dversion=6.2.2 -Dpackaging=jar
mvn install:install-file -Dfile=/soft/ambari-develop/kafka-avro-serializer-6.2.2.jar -DgroupId=io.confluent -DartifactId=kafka-avro-serializer -Dversion=6.2.2 -Dpackaging=jar
mvn install:install-file -Dfile=/soft/ambari-develop/kafka-schema-registry-client-6.2.2.jar -DgroupId=io.confluent -DartifactId=kafka-schema-registry-client -Dversion=6.2.2 -Dpackaging=jar

npm install -g @angular/cli@13.0.0

Modifying the tez code
vi apache-tez-0.10.2-src/tez-ui/pom.xml
line 37: change allow-root-build to --allow-root=true

bower CN mirror: change https://bower.herokuapp.com to https://registry.bower.io
- files involved:
  - bigtop/bigtop-packages/src/common/ambari/patch13-AMBARI-25946.diff
  - bigtop/bigtop-packages/src/common/tez/patch6-TEZ-4492.diff
Modifying the zeppelin code
vi zeppelin-0.10.1/pom.xml
line 209: change plugin.gitcommitid.useNativeGit to true
vi zeppelin-0.10.1/spark/pom.xml
line 50: change spark.src.download.url to https://repo.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}.tgz
line 53: change spark.bin.download.url to https://repo.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}-bin-without-hadoop.tgz
vi zeppelin-0.10.1/rlang/pom.xml
line 41: change spark.src.download.url to https://repo.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}.tgz
line 44: change spark.bin.download.url to https://repo.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}-bin-without-hadoop.tgz
vi zeppelin-0.10.1/flink/flink-scala-parent/pom.xml
line 45: change flink.bin.download.url to https://repo.huaweicloud.com/apache/flink/flink-${flink.version}/flink-${flink.version}-bin-scala_${flink.scala.binary.version}.tgz
Modifying the spark code
cd bigtop/dl
vim spark-3.2.3/dev/make-distribution.sh
#BUILD_COMMAND=("$MVN" clean package -DskipTests $@)
BUILD_COMMAND=("$MVN" clean package -DskipTests -Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat="yyyy-MM-dd HH:mm:ss.SSS" $@) # add the timestamp flags to this line
tar zcvf spark-3.2.3.tar.gz spark-3.2.3

Repacking component sources

  • Note: the repacked archive must have exactly the same file name (including extension) as the one you unpacked
tar zcvf hadoop-3.3.6.tar.gz hadoop-3.3.6-src
tar zcvf apache-tez-0.10.2-src.tar.gz apache-tez-0.10.2-src
tar zcvf spark-3.2.3.tar.gz spark-3.2.3
tar zcvf zeppelin-0.10.1.tar.gz zeppelin-0.10.1
tar zcvf flink-1.15.3.tar.gz flink-1.15.3
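Besides the file name, the archive should still list the same top-level directory it had before (i.e. keep the directory name you extracted; do not rename it), since the build unpacks it by that name. A quick check, simulated here with a throwaway tree:

```shell
# Verify that a repacked archive keeps its original top-level directory.
# The hadoop tree here is a stub standing in for the real unpacked source.
mkdir -p hadoop-3.3.6-src && echo demo > hadoop-3.3.6-src/README.txt
tar zcf hadoop-3.3.6.tar.gz hadoop-3.3.6-src
tar tzf hadoop-3.3.6.tar.gz | head -n 1   # should begin with hadoop-3.3.6-src/
```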

Whole-tree build [not recommended]

  • Expect many hours, with assorted errors along the way; build component by component instead, and only run the whole-tree build once every component builds cleanly
  • Build parameters
    • -PparentDir=/usr/bigtop: default install prefix of the generated rpm packages, the equivalent of ambari's default /usr/hdp
    • -Dbuildwithdeps=true: if a dependency component has not been built yet, build it first
    • -PpkgSuffix:
      • the bigtop stack version, analogous to an hdp version number, e.g. 3.2.0, 3.2.1, 3.3.0
      • the version ends up in the package file names; always pass this flag, or ambari cannot find the packages later
      • example package name: hadoop_3_2_1-hdfs-namenode-3.3.6-1.el7.x86_64.rpm, where 3_2_1 is the bigtop version
    • $component-rpm: component is the component name, e.g. spark, hive, hbase
    • allclean: deletes build/*, output/, and dist/; use with great care, since rebuilding is extremely slow and network blocks can surface errors you did not have before. Back up those three directories by hand before running it

One thing to stress:
Always pass -PpkgSuffix; without it the packages carry no bigtop version, ambari cannot recognize them when installing the components later, and you get errors like:
Error getting repository data for BIGTOP-3.2.0, repository not found
No package found for hadoop_${stack_version}(expected name: hadoop_3_2_0)
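Before moving on to installation, it can save grief to confirm the suffix actually made it into the package names. A sketch over the output/ tree; the placeholder rpm stands in for a real build, and 3_2_1 matches the example name above:

```shell
# Check that the bigtop stack version appears in the generated rpm names.
# A placeholder file simulates a real build here; on a real tree, just
# run the find over your existing output/ directory.
mkdir -p output/hadoop
touch output/hadoop/hadoop_3_2_1-hdfs-namenode-3.3.6-1.el7.x86_64.rpm
find output -name '*_3_2_1*.rpm'
```

If the find prints nothing on a real build, the -PpkgSuffix flag was missing.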

Whole-tree build command (run from the bigtop root)

./gradlew bigtop-groovy-rpm bigtop-jsvc-rpm bigtop-select-rpm bigtop-utils-rpm flink-rpm hadoop-rpm hbase-rpm hive-rpm kafka-rpm solr-rpm spark-rpm tez-rpm zeppelin-rpm zookeeper-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-full.log 2>> logs/bigtop-full.log

Component-by-component build

Build the peripheral dependency components first:

  • bigtop-ambari-mpack-rpm
  • bigtop-groovy-rpm
  • bigtop-jsvc-rpm
  • bigtop-select-rpm
  • bigtop-utils-rpm
./gradlew bigtop-ambari-mpack-rpm bigtop-groovy-rpm bigtop-jsvc-rpm bigtop-select-rpm bigtop-utils-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-dep.log 2>> logs/bigtop-dep.log

zookeeper (21 subprojects, 6 min)

  • ./gradlew zookeeper-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-zookeeper.log 2>> logs/bigtop-zookeeper.log

hadoop (111 subprojects, 48 + 12 min)

  • gradle command: ./gradlew hadoop-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-hadoop.log 2>> logs/bigtop-hadoop.log
  • example maven command:
cd /soft/code/bigtop/build/hadoop/rpm/BUILD/hadoop-3.3.6-src
mvn -Pdist -Pnative -Psrc -Pyarn-ui -Dtar -Dzookeeper.version=3.6.4 -Dhbase.profile=2.0 -DskipTests -DskipITs install -rf :hadoop-yarn-ui >> /soft/code/bigtop/logs/bigtop-hadoop-mvn.log 2>> /soft/code/bigtop/logs/bigtop-hadoop-mvn.log

Problems and fixes

  • 1) cmake3: see the base environment preparation above
  • 2) the hadoop-yarn-ui build
    • hadoop-yarn-ui pom.xml -> add: bower install moment-timezone=https://hub.nuaa.cf/moment/moment-timezone.git#=0.5.1 --allow-root
  • 3) bower needs to reach github.com; make sure the network path is open

Concrete steps

  • 1) before building hadoop-yarn-ui, test GitHub connectivity
  • git ls-remote --tags --heads https://github.com/moment/moment-timezone.git -- this command tests whether GitHub branch information is reachable
  • 2) if moment-timezone#0.5.1 or ember#2.8.0 fails to install with "ENORES component No version found that was able to satisfy"
  • then in hadoop-yarn-ui's pom.xml, before the bower install execution, add: bower install moment-timezone=https://hub.nuaa.cf/moment/moment-timezone.git#=0.5.1 --allow-root (see "Modifying the hadoop code" above)
  • frontend command locations:
    • where the node/yarn/bower commands run (take the exact path from the log): bigtop/build/hadoop/rpm/BUILD/hadoop-3.3.6-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/component/webapp
    • project-local yarn: webapp/node/yarn/dist/bin/yarn install
    • project-local bower: webapp/node_modules/bower/bin/bower; when building as root, add --allow-root

hbase (44 subprojects, 11 + 5 + 10 min)

  • ./gradlew hbase-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-hbase.log 2>> logs/bigtop-hbase.log

hive (42 subprojects, 9 min)

  • ./gradlew hive-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-hive.log 2>> logs/bigtop-hive.log

phoenix (15 subprojects, 14 min)

  • ./gradlew bigtop-ambari-mpack-rpm phoenix-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-phoenix.log 2>> logs/bigtop-phoenix.log

tez (29 subprojects, 4 min)

  • ./gradlew tez-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-tez.log 2>> logs/bigtop-tez.log

Problems | fixes

  • 1) building as root: in apache-tez-0.10.2-src/tez-ui/pom.xml, line 37, change allow-root-build to --allow-root=true
  • 2) bower mirror: change https://bower.herokuapp.com to https://registry.bower.io
    • files involved:
      • bigtop/bigtop-packages/src/common/ambari/patch13-AMBARI-25946.diff
      • bigtop/bigtop-packages/src/common/tez/patch6-TEZ-4492.diff
  • 3) expired certificate: codemirror#5.11.0 certificate has expired
    • export BOWER_STRICT_SSL=false
    • or in the global bower config: "strict-ssl": false, see above

Error messages

  • 1) bower ember-cli-shims#0.0.6 ECONNREFUSED Request to https://bower.herokuapp.com/packages/ember-cli-shims failed: connect ECONNREFUSED 108.160.169.186:443
  • fix: set the bower CN mirror, see above
  • 2) bower codemirror#5.11.0 CERT_HAS_EXPIRED Request to https://registry.bower.io/packages/codemirror failed: certificate has expired
  • fix: the expired-certificate settings, see above

spark (29 subprojects, 49 + 62 min)

  • gradle command: ./gradlew spark-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-spark.log 2>> logs/bigtop-spark.log
  • maven command (take the exact one from the log): mvn clean package -DskipTests -Divy.home=/root/.ivy2 -Dsbt.ivy.home=/root/.ivy2 -Duser.home=/root -Drepo.maven.org= -Dreactor.repo=file:///root/.m2/repository -Dhadoop.version=3.3.6 -Dyarn.version=3.3.6 -Pyarn -Phadoop-3.2 -Phive -Phive-thriftserver -Psparkr -Pkubernetes -Pscala-2.12 -Dguava.version=27.0-jre -DskipTests -Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat="yyyy-MM-dd HH:mm:ss.SSS" >> /soft/code/bigtop/spark-rpm-mvn.log 2>> /soft/code/bigtop/spark-rpm-mvn.log
  • Note: gradle first runs dev/make-distribution.sh in the unpacked spark source (mvn clean package), then runs maven install, so the total is twice a single pass; pure build time comes to roughly 150 minutes

Problems | fixes

  • 1) SparkR packages fine, but the local MAKE_R step kept failing; consider dropping the local make_r
  • fix: in bigtop/bigtop-packages/src/common/spark/do-component-build, remove --r from the line ./dev/make-distribution.sh --mvn mvn --r $BUILD_OPTS -DskipTests
  • 2) the mixed scala/java build is slow; make the maven log show timestamps
  • vim dl/spark-3.2.3/dev/make-distribution.sh
  • comment out: BUILD_COMMAND=("$MVN" clean package -DskipTests $@)

  • BUILD_COMMAND=("$MVN" clean package -DskipTests -Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat="yyyy-MM-dd HH:mm:ss.SSS" $@) # add the timestamp flags to this line
  • vim bigtop/bigtop-packages/src/common/spark/do-component-build
  • mvn $BUILD_OPTS install -DskipTests=$SPARK_SKIP_TESTS -Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat="yyyy-MM-dd HH:mm:ss.SSS" # add the timestamp flags to this line

Error messages

  • 1) Error in loadVignetteBuilder(pkgdir, TRUE) : vignette builder 'knitr' not found
  • fix: drop the local make_r

kafka (5 min)

  • ./gradlew kafka-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-kafka.log 2>> logs/bigtop-kafka.log

Notes

  • 1) needs access to raw.githubusercontent.com
  • fix: the hosts entries above

flink (207 subprojects, 42 min)

  • ./gradlew flink-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-flink.log 2>> logs/bigtop-flink.log

  • Note: this project cannot skip compiling its test code

Problems | fixes

  • 1) the io.confluent jars cannot be downloaded
    • fix: download them by hand and install them into the local repository, see above
  • 2) failing test code (renamed out of the way, see above):
    • CachedSchemaCoderProviderTest.java
    • RegistryAvroFormatFactoryTest.java
    • SQLClientSchemaRegistryITCase.java
  • 3) node and npm downloads fail:
    • Downloading https://nodejs.org/dist/v16.13.2/node-v16.13.2-linux-x64.tar.gz to /root/.m2/repository/com/github/eirslett/node/16.13.2/node-16.13.2-linux-x64.tar.gz
    • Downloading https://registry.npmjs.org/npm/-/npm-8.1.2.tgz to /root/.m2/repository/com/github/eirslett/npm/8.1.2/npm-8.1.2.tar.gz
    • fix: download by hand and place the files at the target paths (keep the target file names, including the changed extension)
    • mv /home/zxh/soft/ambari-develop/node-v16.13.2-linux-x64.tar.gz /root/.m2/repository/com/github/eirslett/node/16.13.2/node-16.13.2-linux-x64.tar.gz
    • mv /home/zxh/soft/ambari-develop/npm-8.1.2.tgz /root/.m2/repository/com/github/eirslett/npm/8.1.2/npm-8.1.2.tar.gz

solr

  • ./gradlew solr-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-solr.log 2>> logs/bigtop-solr.log
  • Notes:
    • 1) this project builds with ivy; configuring a CN ivy mirror is strongly recommended

zeppelin

  • ./gradlew zeppelin-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/bigtop-zeppelin.log 2>> logs/bigtop-zeppelin.log
# download the zeppelin source tarball
./gradlew zeppelin-download
# unpack the zeppelin source
cd dl
tar -zxvf zeppelin-0.10.1.tar.gz
# edit the pom files
vi zeppelin-0.10.1/pom.xml
line 209: change plugin.gitcommitid.useNativeGit to true
vi zeppelin-0.10.1/spark/pom.xml
line 50: change spark.src.download.url to https://repo.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}.tgz
line 53: change spark.bin.download.url to https://repo.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}-bin-without-hadoop.tgz
vi zeppelin-0.10.1/rlang/pom.xml
line 41: change spark.src.download.url to https://repo.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}.tgz
line 44: change spark.bin.download.url to https://repo.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}-bin-without-hadoop.tgz
vi zeppelin-0.10.1/flink/flink-scala-parent/pom.xml
line 45: change flink.bin.download.url to https://repo.huaweicloud.com/apache/flink/flink-${flink.version}/flink-${flink.version}-bin-scala_${flink.scala.binary.version}.tgz
# repack the zeppelin source
tar -zcvf zeppelin-0.10.1.tar.gz zeppelin-0.10.1
# build
./gradlew zeppelin-rpm -PparentDir=/usr/bigtop

Problem:
[INFO] Downloading https://registry.npmjs.org/npm/-/npm-6.9.0.tgz to /root/.m2/repository/com/github/eirslett/npm/6.9.0/npm-6.9.0.tar.gz
I/O exception (java.net.SocketException) caught when processing request to {s}->https://registry.npmjs.org:443: Connection reset
Fix: download it from Aliyun or Huawei Cloud and place it at the target path above; note the file gets renamed
https://mirrors.aliyun.com/macports/distfiles/npm6/npm-6.9.0.tgz
mvn -Dhadoop3.2.version=3.3.6 -Dlivy.version=0.7.1-incubating -Pscala-2.11 -Phadoop3 -Pbuild-distr -DskipTests clean package -pl '!beam,!hbase,!pig,!jdbc,!flink,!ignite,!kylin,!lens,!cassandra,!elasticsearch,!bigquery,!alluxio,!scio,!groovy,!sap,!java,!geode,!neo4j,!hazelcastjet,!submarine,!sparql,!mongodb,!ksql,!scalding' -am -rf :zeppelin-client-examples >> /soft/code/bigtop/logs/bigtop-zeppelin.log 2>> /soft/code/bigtop/logs/bigtop-zeppelin.log
Problem: Bower resolver not found: bower-shrinkwrap-resolver-ext
See https://www.cnblogs.com/jameBo/p/10615444.html

Disk usage after the build

  • Disk usage:
    • bigtop: 16 GB
    • ambari: 2.1 GB
    • ambari: 6 GB
    • shared caches:
      • .m2: 16 GB
      • .ant: 2 MB
      • .ivy2: 200 MB
      • .gradle: 1.2 GB
      • .nvm: 75 MB
      • .npm: 400 MB
      • .cache: 400 MB
  • total: 50 GB+
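The table above can be reproduced for your own tree (sizes will of course differ per build); run from the directory holding the checkouts:

```shell
# Sum the build tree and the per-user caches that the compile fills up,
# largest first. Directories that do not exist are skipped silently.
du -sh bigtop ~/.m2 ~/.ant ~/.ivy2 ~/.gradle ~/.nvm ~/.npm ~/.cache 2>/dev/null | sort -rh
```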

branch-3.2 per-component build times [pure build time, excluding package downloads]

  • Note: assuming nothing goes wrong

zookeeper (21 subprojects, 6 min, OK)

mvn clean install -DskipTests -Pfull-build
[INFO] Reactor Summary for Apache ZooKeeper 3.6.4:
[INFO] 
[INFO] Apache ZooKeeper ................................... SUCCESS [ 19.276 s]
[INFO] Apache ZooKeeper - Documentation ................... SUCCESS [ 13.275 s]
[INFO] Apache ZooKeeper - Jute ............................ SUCCESS [ 24.500 s]
[INFO] Apache ZooKeeper - Server .......................... SUCCESS [ 40.075 s]
[INFO] Apache ZooKeeper - Metrics Providers ............... SUCCESS [  9.652 s]
[INFO] Apache ZooKeeper - Prometheus.io Metrics Provider .. SUCCESS [ 11.991 s]
[INFO] Apache ZooKeeper - Client .......................... SUCCESS [  9.944 s]
[INFO] Apache ZooKeeper - Client - C ...................... SUCCESS [ 43.346 s]
[INFO] Apache ZooKeeper - Recipes ......................... SUCCESS [  9.404 s]
[INFO] Apache ZooKeeper - Recipes - Election .............. SUCCESS [  9.763 s]
[INFO] Apache ZooKeeper - Recipes - Lock .................. SUCCESS [  9.288 s]
[INFO] Apache ZooKeeper - Recipes - Queue ................. SUCCESS [  9.637 s]
[INFO] Apache ZooKeeper - Assembly ........................ SUCCESS [ 19.397 s]
[INFO] Apache ZooKeeper - Compatibility Tests ............. SUCCESS [  9.336 s]
[INFO] Apache ZooKeeper - Compatibility Tests - Curator ... SUCCESS [  8.593 s]
[INFO] Apache ZooKeeper - Tests ........................... SUCCESS [ 13.297 s]
[INFO] Apache ZooKeeper - Contrib ......................... SUCCESS [  9.006 s]
[INFO] Apache ZooKeeper - Contrib - Fatjar ................ SUCCESS [ 16.370 s]
[INFO] Apache ZooKeeper - Contrib - Loggraph .............. SUCCESS [ 16.161 s]
[INFO] Apache ZooKeeper - Contrib - Rest .................. SUCCESS [ 16.152 s]
[INFO] Apache ZooKeeper - Contrib - ZooInspector .......... SUCCESS [ 19.938 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  05:39 min
[INFO] Finished at: 2024-07-27T01:06:56+08:00
[INFO] ------------------------------------------------------------------------

hadoop (111 subprojects, 60 min, OK)

mvn -Pdist -Pnative -Psrc -Pyarn-ui -Dtar -Dzookeeper.version=3.6.4 -Dhbase.profile=2.0 -DskipTests -DskipITs install
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Hadoop Main 3.3.6:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [ 29.205 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  1.466 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  3.582 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  6.002 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.466 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  2.587 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  8.592 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  3.445 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 15.306 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  5.702 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:23 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 17.947 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 16.682 s]
[INFO] Apache Hadoop Registry ............................. SUCCESS [ 13.522 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.180 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 52.640 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [02:12 min]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [03:42 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 17.544 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 12.443 s]
[INFO] Apache Hadoop HDFS-RBF ............................. SUCCESS [ 54.272 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.151 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [  0.142 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 30.541 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [01:04 min]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [  0.151 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [ 24.234 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [01:04 min]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [ 11.881 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [ 14.836 s]
[INFO] Apache Hadoop YARN Timeline Service ................ SUCCESS [ 12.752 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 46.104 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [  9.362 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [ 16.279 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [ 11.508 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [ 11.695 s]
[INFO] Apache Hadoop YARN TimelineService HBase Backend ... SUCCESS [  0.116 s]
[INFO] Apache Hadoop YARN TimelineService HBase Common .... SUCCESS [ 12.285 s]
[INFO] Apache Hadoop YARN TimelineService HBase Client .... SUCCESS [ 20.129 s]
[INFO] Apache Hadoop YARN TimelineService HBase Servers ... SUCCESS [  0.181 s]
[INFO] Apache Hadoop YARN TimelineService HBase Server 2.0  SUCCESS [ 15.300 s]
[INFO] Apache Hadoop YARN TimelineService HBase tests ..... SUCCESS [ 12.552 s]
[INFO] Apache Hadoop YARN Router .......................... SUCCESS [ 13.316 s]
[INFO] Apache Hadoop YARN TimelineService DocumentStore ... SUCCESS [ 11.596 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [  0.131 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [ 13.113 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [  9.545 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [  6.801 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 14.716 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 18.491 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [ 11.763 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [ 22.129 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [ 16.263 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [ 17.511 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 10.385 s]
[INFO] Apache Hadoop YARN Services ........................ SUCCESS [  0.100 s]
[INFO] Apache Hadoop YARN Services Core ................... SUCCESS [ 12.803 s]
[INFO] Apache Hadoop YARN Services API .................... SUCCESS [ 10.791 s]
[INFO] Apache Hadoop YARN Application Catalog ............. SUCCESS [  0.148 s]
[INFO] Apache Hadoop YARN Application Catalog Webapp ...... SUCCESS [ 42.258 s]
[INFO] Apache Hadoop YARN Application Catalog Docker Image  SUCCESS [  3.458 s]
[INFO] Apache Hadoop YARN Application MaWo ................ SUCCESS [  0.125 s]
[INFO] Apache Hadoop YARN Application MaWo Core ........... SUCCESS [  9.859 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [  0.161 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [  4.323 s]
[INFO] Apache Hadoop YARN UI .............................. SUCCESS [05:15 min]
[INFO] Apache Hadoop YARN CSI ............................. SUCCESS [ 26.939 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [ 31.039 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [  9.021 s]
[INFO] Apache Hadoop MapReduce NativeTask ................. SUCCESS [01:08 min]
[INFO] Apache Hadoop MapReduce Uploader ................... SUCCESS [  8.951 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 15.563 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [ 10.572 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 15.810 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 13.630 s]
[INFO] Apache Hadoop Client Aggregator .................... SUCCESS [  8.084 s]
[INFO] Apache Hadoop Dynamometer Workload Simulator ....... SUCCESS [ 11.176 s]
[INFO] Apache Hadoop Dynamometer Cluster Simulator ........ SUCCESS [ 12.493 s]
[INFO] Apache Hadoop Dynamometer Block Listing Generator .. SUCCESS [ 10.631 s]
[INFO] Apache Hadoop Dynamometer Dist ..................... SUCCESS [ 15.383 s]
[INFO] Apache Hadoop Dynamometer .......................... SUCCESS [  0.080 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 10.879 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [ 11.712 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 15.125 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 13.488 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 11.425 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 12.758 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  5.375 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 40.051 s]
[INFO] Apache Hadoop Kafka Library support ................ SUCCESS [  7.605 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 21.646 s]
[INFO] Apache Hadoop Aliyun OSS support ................... SUCCESS [ 11.321 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 12.938 s]
[INFO] Apache Hadoop Resource Estimator Service ........... SUCCESS [ 13.073 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [ 10.130 s]
[INFO] Apache Hadoop Image Generation Tool ................ SUCCESS [ 14.060 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 44.517 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  0.169 s]
[INFO] Apache Hadoop Common Benchmark ..................... SUCCESS [ 20.616 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.124 s]
[INFO] Apache Hadoop Client API ........................... SUCCESS [02:44 min]
[INFO] Apache Hadoop Client Runtime ....................... SUCCESS [02:33 min]
[INFO] Apache Hadoop Client Packaging Invariants .......... SUCCESS [  3.377 s]
[INFO] Apache Hadoop Client Test Minicluster .............. SUCCESS [04:11 min]
[INFO] Apache Hadoop Client Packaging Invariants for Test . SUCCESS [  4.710 s]
[INFO] Apache Hadoop Client Packaging Integration Tests ... SUCCESS [  4.934 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:15 min]
[INFO] Apache Hadoop Client Modules ....................... SUCCESS [  0.072 s]
[INFO] Apache Hadoop Tencent COS Support .................. SUCCESS [ 12.562 s]
[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [ 18.067 s]
[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [  0.094 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  47:53 min
[INFO] Finished at: 2024-07-27T20:11:32+08:00
[INFO] ------------------------------------------------------------------------
mvn site site:stage -Dzookeeper.version=3.6.4 -Dhbase.profile=2.0 -DskipTests -DskipITs
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Hadoop Main 3.3.6:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [06:14 min]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  0.007 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  2.179 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  0.543 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.190 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.203 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  1.732 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.409 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  2.345 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.579 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [ 55.981 s]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  2.518 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [  3.182 s]
[INFO] Apache Hadoop Registry ............................. SUCCESS [  2.197 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.178 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 19.055 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [ 32.371 s]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [  0.991 s]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [  9.106 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  2.081 s]
[INFO] Apache Hadoop HDFS-RBF ............................. SUCCESS [  5.866 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.104 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [  0.161 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 11.008 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [ 11.283 s]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [  0.146 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [  5.983 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [  8.138 s]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [  1.694 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [  3.020 s]
[INFO] Apache Hadoop YARN Timeline Service ................ SUCCESS [  2.306 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 15.166 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [  2.273 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [  3.446 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [  1.712 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [  1.716 s]
[INFO] Apache Hadoop YARN TimelineService HBase Backend ... SUCCESS [  0.106 s]
[INFO] Apache Hadoop YARN TimelineService HBase Common .... SUCCESS [  1.774 s]
[INFO] Apache Hadoop YARN TimelineService HBase Client .... SUCCESS [  2.472 s]
[INFO] Apache Hadoop YARN TimelineService HBase Servers ... SUCCESS [  0.123 s]
[INFO] Apache Hadoop YARN TimelineService HBase Server 2.0  SUCCESS [  1.879 s]
[INFO] Apache Hadoop YARN TimelineService HBase tests ..... SUCCESS [  3.484 s]
[INFO] Apache Hadoop YARN Router .......................... SUCCESS [  2.241 s]
[INFO] Apache Hadoop YARN TimelineService DocumentStore ... SUCCESS [  1.670 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [  0.117 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [  2.072 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [  1.382 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [  0.562 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [  7.220 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [  3.748 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [  2.503 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [  5.192 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [  3.178 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [  5.272 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  1.769 s]
[INFO] Apache Hadoop YARN Services ........................ SUCCESS [  0.098 s]
[INFO] Apache Hadoop YARN Services Core ................... SUCCESS [  3.884 s]
[INFO] Apache Hadoop YARN Services API .................... SUCCESS [  2.252 s]
[INFO] Apache Hadoop YARN Application Catalog ............. SUCCESS [  0.096 s]
[INFO] Apache Hadoop YARN Application Catalog Webapp ...... SUCCESS [  4.489 s]
[INFO] Apache Hadoop YARN Application Catalog Docker Image  SUCCESS [  0.142 s]
[INFO] Apache Hadoop YARN Application MaWo ................ SUCCESS [  0.111 s]
[INFO] Apache Hadoop YARN Application MaWo Core ........... SUCCESS [  1.289 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [  2.222 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [  0.751 s]
[INFO] Apache Hadoop YARN UI .............................. SUCCESS [  0.107 s]
[INFO] Apache Hadoop YARN CSI ............................. SUCCESS [  5.498 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [  2.222 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [  2.176 s]
[INFO] Apache Hadoop MapReduce NativeTask ................. SUCCESS [  2.437 s]
[INFO] Apache Hadoop MapReduce Uploader ................... SUCCESS [  1.411 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  2.802 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [  0.645 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  2.216 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [  2.968 s]
[INFO] Apache Hadoop Client Aggregator .................... SUCCESS [  1.283 s]
[INFO] Apache Hadoop Dynamometer Workload Simulator ....... SUCCESS [  1.399 s]
[INFO] Apache Hadoop Dynamometer Cluster Simulator ........ SUCCESS [  1.998 s]
[INFO] Apache Hadoop Dynamometer Block Listing Generator .. SUCCESS [  1.356 s]
[INFO] Apache Hadoop Dynamometer Dist ..................... SUCCESS [  1.456 s]
[INFO] Apache Hadoop Dynamometer .......................... SUCCESS [  0.166 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  1.999 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [  1.946 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  2.649 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  2.565 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  1.736 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  1.759 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  0.097 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [  7.156 s]
[INFO] Apache Hadoop Kafka Library support ................ SUCCESS [  1.221 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [  5.531 s]
[INFO] Apache Hadoop Aliyun OSS support ................... SUCCESS [  1.532 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  1.862 s]
[INFO] Apache Hadoop Resource Estimator Service ........... SUCCESS [  1.758 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [  1.407 s]
[INFO] Apache Hadoop Image Generation Tool ................ SUCCESS [  1.865 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [  1.174 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  0.080 s]
[INFO] Apache Hadoop Common Benchmark ..................... SUCCESS [  1.178 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.083 s]
[INFO] Apache Hadoop Client API ........................... SUCCESS [  1.051 s]
[INFO] Apache Hadoop Client Runtime ....................... SUCCESS [  1.091 s]
[INFO] Apache Hadoop Client Packaging Invariants .......... SUCCESS [  0.127 s]
[INFO] Apache Hadoop Client Test Minicluster .............. SUCCESS [  1.269 s]
[INFO] Apache Hadoop Client Packaging Invariants for Test . SUCCESS [  0.141 s]
[INFO] Apache Hadoop Client Packaging Integration Tests ... SUCCESS [  0.125 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [  0.697 s]
[INFO] Apache Hadoop Client Modules ....................... SUCCESS [  0.177 s]
[INFO] Apache Hadoop Tencent COS Support .................. SUCCESS [  1.327 s]
[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [  0.698 s]
[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [  0.070 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  12:05 min
[INFO] Finished at: 2024-07-27T20:23:39+08:00
[INFO] ------------------------------------------------------------------------
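With summaries this long, it helps to pull out the slowest modules programmatically. Below is a hypothetical sketch (not from the original post) that ranks reactor modules by build time, assuming the output was saved with `mvn ... | tee build.log`; it handles both the `55.981 s` and `05:15 min` timing formats. A few lines from the Hadoop summary above are inlined as sample input.

```shell
#!/bin/sh
# Inline a few sample reactor-summary lines; in practice point awk at the
# real log captured via `mvn ... | tee build.log`.
cat > /tmp/reactor-sample.log <<'EOF'
[INFO] Apache Hadoop Common ............................... SUCCESS [ 55.981 s]
[INFO] Apache Hadoop YARN UI .............................. SUCCESS [05:15 min]
[INFO] Apache Hadoop Client API ........................... SUCCESS [02:44 min]
EOF

# Split each line on '[' and ']': $3 is the module name + dot leaders,
# $4 is the timing. Convert everything to seconds and sort descending.
ranked=$(awk -F'[][]' '/SUCCESS \[/ {
    name = $3; t = $4
    gsub(/^ +| +$/, "", t)
    if (t ~ /min/) { split(t, m, /[: ]/); secs = m[1] * 60 + m[2] }  # "05:15 min"
    else           { split(t, m, / /);    secs = m[1] }              # "55.981 s"
    sub(/ \.+.*/, "", name)        # strip dot leaders and the SUCCESS marker
    gsub(/^ +| +$/, "", name)
    printf "%7.1f  %s\n", secs, name
}' /tmp/reactor-sample.log | sort -rn)
printf '%s\n' "$ranked"
```

On the sample lines this ranks `Apache Hadoop YARN UI` first at 315.0 s, which matches the full log above, where the YARN UI module dominates the build.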

hbase (44 sub-projects, ~30 minutes, OK)

mvn -Phadoop-3.0 -Dhadoop-three.version=3.3.6 -Dhadoop.guava.version=27.0-jre -Djetty.version=9.3.29.v20201019 -Dzookeeper.version=3.6.4 -DskipTests -Dcheckstyle.skip=true -Dadditionalparam=-Xdoclint:none clean install

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache HBase 2.4.17:
[INFO] 
[INFO] Apache HBase ....................................... SUCCESS [  9.339 s]
[INFO] Apache HBase - Checkstyle .......................... SUCCESS [  1.467 s]
[INFO] Apache HBase - Annotations ......................... SUCCESS [  3.171 s]
[INFO] Apache HBase - Build Configuration ................. SUCCESS [  0.350 s]
[INFO] Apache HBase - Logging ............................. SUCCESS [  1.649 s]
[INFO] Apache HBase - Shaded Protocol ..................... SUCCESS [ 41.400 s]
[INFO] Apache HBase - Common .............................. SUCCESS [ 18.882 s]
[INFO] Apache HBase - Metrics API ......................... SUCCESS [  7.207 s]
[INFO] Apache HBase - Hadoop Compatibility ................ SUCCESS [  7.854 s]
[INFO] Apache HBase - Metrics Implementation .............. SUCCESS [  6.949 s]
[INFO] Apache HBase - Hadoop Two Compatibility ............ SUCCESS [  9.814 s]
[INFO] Apache HBase - Protocol ............................ SUCCESS [ 14.866 s]
[INFO] Apache HBase - Client .............................. SUCCESS [ 19.238 s]
[INFO] Apache HBase - Zookeeper ........................... SUCCESS [  8.541 s]
[INFO] Apache HBase - Replication ......................... SUCCESS [  8.806 s]
[INFO] Apache HBase - Resource Bundle ..................... SUCCESS [  0.763 s]
[INFO] Apache HBase - HTTP ................................ SUCCESS [ 13.134 s]
[INFO] Apache HBase - Asynchronous FileSystem ............. SUCCESS [ 11.537 s]
[INFO] Apache HBase - Procedure ........................... SUCCESS [  8.098 s]
[INFO] Apache HBase - Server .............................. SUCCESS [ 51.306 s]
[INFO] Apache HBase - MapReduce ........................... SUCCESS [ 15.222 s]
[INFO] Apache HBase - Testing Util ........................ SUCCESS [ 12.845 s]
[INFO] Apache HBase - Thrift .............................. SUCCESS [ 21.466 s]
[INFO] Apache HBase - RSGroup ............................. SUCCESS [ 12.444 s]
[INFO] Apache HBase - Shell ............................... SUCCESS [ 14.736 s]
[INFO] Apache HBase - Coprocessor Endpoint ................ SUCCESS [ 13.530 s]
[INFO] Apache HBase - Integration Tests ................... SUCCESS [ 15.545 s]
[INFO] Apache HBase - Rest ................................ SUCCESS [ 11.912 s]
[INFO] Apache HBase - Examples ............................ SUCCESS [ 15.016 s]
[INFO] Apache HBase - Shaded .............................. SUCCESS [  0.453 s]
[INFO] Apache HBase - Shaded - Client (with Hadoop bundled) SUCCESS [ 34.695 s]
[INFO] Apache HBase - Shaded - Client ..................... SUCCESS [ 19.313 s]
[INFO] Apache HBase - Shaded - MapReduce .................. SUCCESS [ 30.992 s]
[INFO] Apache HBase - External Block Cache ................ SUCCESS [  9.254 s]
[INFO] Apache HBase - HBTop ............................... SUCCESS [  9.288 s]
[INFO] Apache HBase - Assembly ............................ SUCCESS [ 34.417 s]
[INFO] Apache HBase - Shaded - Testing Util ............... SUCCESS [01:19 min]
[INFO] Apache HBase - Shaded - Testing Util Tester ........ SUCCESS [ 12.668 s]
[INFO] Apache HBase Shaded Packaging Invariants ........... SUCCESS [ 13.913 s]
[INFO] Apache HBase Shaded Packaging Invariants (with Hadoop bundled) SUCCESS [  9.286 s]
[INFO] Apache HBase - Archetypes .......................... SUCCESS [  0.113 s]
[INFO] Apache HBase - Exemplar for hbase-client archetype . SUCCESS [  9.375 s]
[INFO] Apache HBase - Exemplar for hbase-shaded-client archetype SUCCESS [ 11.940 s]
[INFO] Apache HBase - Archetype builder ................... SUCCESS [  0.774 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  10:57 min
[INFO] Finished at: 2024-07-27T21:08:15+08:00
[INFO] ------------------------------------------------------------------------

mvn -Phadoop-3.0 -Dhadoop-three.version=3.3.6 -Dhadoop.guava.version=27.0-jre -Djetty.version=9.3.29.v20201019 -Dzookeeper.version=3.6.4 -DskipTests -Dcheckstyle.skip=true -Dadditionalparam=-Xdoclint:none site


[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache HBase 2.4.17:
[INFO] 
[INFO] Apache HBase ....................................... SUCCESS [04:46 min]
[INFO] Apache HBase - Checkstyle .......................... SUCCESS [  0.008 s]
[INFO] Apache HBase - Annotations ......................... SUCCESS [  0.012 s]
[INFO] Apache HBase - Build Configuration ................. SUCCESS [  0.007 s]
[INFO] Apache HBase - Logging ............................. SUCCESS [  0.010 s]
[INFO] Apache HBase - Shaded Protocol ..................... SUCCESS [  0.006 s]
[INFO] Apache HBase - Common .............................. SUCCESS [  0.023 s]
[INFO] Apache HBase - Metrics API ......................... SUCCESS [  0.019 s]
[INFO] Apache HBase - Hadoop Compatibility ................ SUCCESS [  0.018 s]
[INFO] Apache HBase - Metrics Implementation .............. SUCCESS [  0.015 s]
[INFO] Apache HBase - Hadoop Two Compatibility ............ SUCCESS [  0.020 s]
[INFO] Apache HBase - Protocol ............................ SUCCESS [  0.005 s]
[INFO] Apache HBase - Client .............................. SUCCESS [  0.028 s]
[INFO] Apache HBase - Zookeeper ........................... SUCCESS [  0.027 s]
[INFO] Apache HBase - Replication ......................... SUCCESS [  0.029 s]
[INFO] Apache HBase - Resource Bundle ..................... SUCCESS [  0.005 s]
[INFO] Apache HBase - HTTP ................................ SUCCESS [  0.057 s]
[INFO] Apache HBase - Asynchronous FileSystem ............. SUCCESS [  0.057 s]
[INFO] Apache HBase - Procedure ........................... SUCCESS [  0.023 s]
[INFO] Apache HBase - Server .............................. SUCCESS [  0.099 s]
[INFO] Apache HBase - MapReduce ........................... SUCCESS [  0.106 s]
[INFO] Apache HBase - Testing Util ........................ SUCCESS [  0.105 s]
[INFO] Apache HBase - Thrift .............................. SUCCESS [  0.094 s]
[INFO] Apache HBase - RSGroup ............................. SUCCESS [  0.119 s]
[INFO] Apache HBase - Shell ............................... SUCCESS [  0.116 s]
[INFO] Apache HBase - Coprocessor Endpoint ................ SUCCESS [  0.076 s]
[INFO] Apache HBase - Integration Tests ................... SUCCESS [  0.130 s]
[INFO] Apache HBase - Rest ................................ SUCCESS [  0.101 s]
[INFO] Apache HBase - Examples ............................ SUCCESS [  0.140 s]
[INFO] Apache HBase - Shaded .............................. SUCCESS [  0.004 s]
[INFO] Apache HBase - Shaded - Client (with Hadoop bundled) SUCCESS [  0.021 s]
[INFO] Apache HBase - Shaded - Client ..................... SUCCESS [  0.020 s]
[INFO] Apache HBase - Shaded - MapReduce .................. SUCCESS [  0.050 s]
[INFO] Apache HBase - External Block Cache ................ SUCCESS [  0.041 s]
[INFO] Apache HBase - HBTop ............................... SUCCESS [  0.025 s]
[INFO] Apache HBase - Assembly ............................ SUCCESS [  0.229 s]
[INFO] Apache HBase - Shaded - Testing Util ............... SUCCESS [  0.088 s]
[INFO] Apache HBase - Shaded - Testing Util Tester ........ SUCCESS [  0.105 s]
[INFO] Apache HBase Shaded Packaging Invariants ........... SUCCESS [  0.055 s]
[INFO] Apache HBase Shaded Packaging Invariants (with Hadoop bundled) SUCCESS [  0.017 s]
[INFO] Apache HBase - Archetypes .......................... SUCCESS [  0.003 s]
[INFO] Apache HBase - Exemplar for hbase-client archetype . SUCCESS [  0.094 s]
[INFO] Apache HBase - Exemplar for hbase-shaded-client archetype SUCCESS [  0.090 s]
[INFO] Apache HBase - Archetype builder ................... SUCCESS [  0.005 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  04:51 min
[INFO] Finished at: 2024-07-27T21:13:08+08:00
[INFO] ------------------------------------------------------------------------

mvn -Phadoop-3.0 -Dhadoop-three.version=3.3.6 -Dhadoop.guava.version=27.0-jre -Djetty.version=9.3.29.v20201019 -Dzookeeper.version=3.6.4 -DskipTests -Dcheckstyle.skip=true -Dadditionalparam=-Xdoclint:none package assembly:single
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache HBase 2.4.17:
[INFO] 
[INFO] Apache HBase ....................................... SUCCESS [  5.820 s]
[INFO] Apache HBase - Checkstyle .......................... SUCCESS [  1.286 s]
[INFO] Apache HBase - Annotations ......................... SUCCESS [  1.009 s]
[INFO] Apache HBase - Build Configuration ................. SUCCESS [  0.382 s]
[INFO] Apache HBase - Logging ............................. SUCCESS [  1.215 s]
[INFO] Apache HBase - Shaded Protocol ..................... SUCCESS [ 42.784 s]
[INFO] Apache HBase - Common .............................. SUCCESS [ 11.707 s]
[INFO] Apache HBase - Metrics API ......................... SUCCESS [  6.980 s]
[INFO] Apache HBase - Hadoop Compatibility ................ SUCCESS [  7.306 s]
[INFO] Apache HBase - Metrics Implementation .............. SUCCESS [  8.228 s]
[INFO] Apache HBase - Hadoop Two Compatibility ............ SUCCESS [  8.730 s]
[INFO] Apache HBase - Protocol ............................ SUCCESS [  9.967 s]
[INFO] Apache HBase - Client .............................. SUCCESS [ 10.097 s]
[INFO] Apache HBase - Zookeeper ........................... SUCCESS [  7.798 s]
[INFO] Apache HBase - Replication ......................... SUCCESS [  7.468 s]
[INFO] Apache HBase - Resource Bundle ..................... SUCCESS [  0.290 s]
[INFO] Apache HBase - HTTP ................................ SUCCESS [  9.759 s]
[INFO] Apache HBase - Asynchronous FileSystem ............. SUCCESS [ 10.864 s]
[INFO] Apache HBase - Procedure ........................... SUCCESS [  6.501 s]
[INFO] Apache HBase - Server .............................. SUCCESS [ 20.587 s]
[INFO] Apache HBase - MapReduce ........................... SUCCESS [ 11.919 s]
[INFO] Apache HBase - Testing Util ........................ SUCCESS [ 13.675 s]
[INFO] Apache HBase - Thrift .............................. SUCCESS [ 14.494 s]
[INFO] Apache HBase - RSGroup ............................. SUCCESS [ 11.399 s]
[INFO] Apache HBase - Shell ............................... SUCCESS [ 13.935 s]
[INFO] Apache HBase - Coprocessor Endpoint ................ SUCCESS [ 12.549 s]
[INFO] Apache HBase - Integration Tests ................... SUCCESS [ 14.051 s]
[INFO] Apache HBase - Rest ................................ SUCCESS [ 10.351 s]
[INFO] Apache HBase - Examples ............................ SUCCESS [ 14.691 s]
[INFO] Apache HBase - Shaded .............................. SUCCESS [  0.431 s]
[INFO] Apache HBase - Shaded - Client (with Hadoop bundled) SUCCESS [ 33.310 s]
[INFO] Apache HBase - Shaded - Client ..................... SUCCESS [ 18.827 s]
[INFO] Apache HBase - Shaded - MapReduce .................. SUCCESS [ 30.286 s]
[INFO] Apache HBase - External Block Cache ................ SUCCESS [  9.316 s]
[INFO] Apache HBase - HBTop ............................... SUCCESS [  8.675 s]
[INFO] Apache HBase - Assembly ............................ SUCCESS [02:01 min]
[INFO] Apache HBase - Shaded - Testing Util ............... SUCCESS [01:11 min]
[INFO] Apache HBase - Shaded - Testing Util Tester ........ SUCCESS [ 10.106 s]
[INFO] Apache HBase Shaded Packaging Invariants ........... SUCCESS [ 11.550 s]
[INFO] Apache HBase Shaded Packaging Invariants (with Hadoop bundled) SUCCESS [  8.374 s]
[INFO] Apache HBase - Archetypes .......................... SUCCESS [  0.099 s]
[INFO] Apache HBase - Exemplar for hbase-client archetype . SUCCESS [  9.118 s]
[INFO] Apache HBase - Exemplar for hbase-shaded-client archetype SUCCESS [ 11.979 s]
[INFO] Apache HBase - Archetype builder ................... SUCCESS [  0.576 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  10:44 min
[INFO] Finished at: 2024-07-27T21:23:53+08:00
[INFO] ------------------------------------------------------------------------
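The three hbase phases above (install, site, package assembly:single) each report their own "Total time". A quick sketch to sum them, with the MM:SS values copied from the logs above, cross-checks the section heading's 30-minute estimate:

```shell
#!/bin/sh
# Sum the three "Total time" values (10:57, 04:51, 10:44) from the hbase logs.
# Leading zeros are stripped before arithmetic so shells that treat
# zero-prefixed numbers as octal (e.g. bash with "08") do not choke.
total=0
for t in 10:57 04:51 10:44; do
    mins=${t%%:*}; secs=${t##*:}
    total=$(( total + ${mins#0} * 60 + ${secs#0} ))
done
printf 'combined build time: %d min %d s\n' $((total / 60)) $((total % 60))
```

This prints a combined time of 26 min 32 s of pure compilation; the gap to the heading's "~30 minutes" is download and retry overhead, which the timing deliberately excludes.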

tez (29 sub-projects, ~4 minutes)

mvn clean package -Dtar -Dhadoop.version=3.3.6 -Phadoop28 -DskipTests
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for tez 0.10.2:
[INFO] 
[INFO] tez ................................................ SUCCESS [  2.515 s]
[INFO] hadoop-shim ........................................ SUCCESS [  3.211 s]
[INFO] tez-api ............................................ SUCCESS [ 10.015 s]
[INFO] tez-build-tools .................................... SUCCESS [  0.142 s]
[INFO] tez-common ......................................... SUCCESS [  2.432 s]
[INFO] tez-runtime-internals .............................. SUCCESS [  3.497 s]
[INFO] tez-runtime-library ................................ SUCCESS [  6.568 s]
[INFO] tez-mapreduce ...................................... SUCCESS [  4.433 s]
[INFO] tez-examples ....................................... SUCCESS [  2.743 s]
[INFO] tez-dag ............................................ SUCCESS [  9.771 s]
[INFO] tez-tests .......................................... SUCCESS [  4.260 s]
[INFO] tez-ext-service-tests .............................. SUCCESS [  3.605 s]
[INFO] tez-ui ............................................. SUCCESS [01:08 min]
[INFO] tez-plugins ........................................ SUCCESS [  0.064 s]
[INFO] tez-protobuf-history-plugin ........................ SUCCESS [  5.118 s]
[INFO] tez-yarn-timeline-history .......................... SUCCESS [  5.678 s]
[INFO] tez-yarn-timeline-history-with-acls ................ SUCCESS [  5.366 s]
[INFO] tez-yarn-timeline-cache-plugin ..................... SUCCESS [ 20.169 s]
[INFO] tez-yarn-timeline-history-with-fs .................. SUCCESS [  4.842 s]
[INFO] tez-history-parser ................................. SUCCESS [ 20.014 s]
[INFO] tez-aux-services ................................... SUCCESS [ 13.834 s]
[INFO] tez-tools .......................................... SUCCESS [  0.075 s]
[INFO] tez-perf-analyzer .................................. SUCCESS [  0.062 s]
[INFO] tez-job-analyzer ................................... SUCCESS [  3.651 s]
[INFO] tez-javadoc-tools .................................. SUCCESS [  1.221 s]
[INFO] hadoop-shim-impls .................................. SUCCESS [  0.048 s]
[INFO] hadoop-shim-2.8 .................................... SUCCESS [  0.988 s]
[INFO] tez-dist ........................................... SUCCESS [ 41.472 s]
[INFO] Tez ................................................ SUCCESS [  0.316 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  04:05 min
[INFO] Finished at: 2024-07-27T22:25:28+08:00
[INFO] ------------------------------------------------------------------------

Phoenix (15 sub-projects, ~14 minutes)

mvn -DskipTests -Dhadoop.version=3.3.6 -Dhbase.version=2.4.17 -Dhbase.profile=2.4 clean install
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Phoenix 5.1.3:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [  4.768 s]
[INFO] Phoenix Hbase 2.5.0 compatibility .................. SUCCESS [ 10.642 s]
[INFO] Phoenix Hbase 2.4.1 compatibility .................. SUCCESS [  4.776 s]
[INFO] Phoenix Hbase 2.4.0 compatibility .................. SUCCESS [  4.079 s]
[INFO] Phoenix Hbase 2.3.0 compatibility .................. SUCCESS [  4.855 s]
[INFO] Phoenix Hbase 2.2.5 compatibility .................. SUCCESS [  3.527 s]
[INFO] Phoenix Hbase 2.1.6 compatibility .................. SUCCESS [  3.922 s]
[INFO] Phoenix Core ....................................... SUCCESS [01:00 min]
[INFO] Phoenix - Pherf .................................... SUCCESS [ 13.966 s]
[INFO] Phoenix - Tracing Web Application .................. SUCCESS [ 13.353 s]
[INFO] Phoenix Client Parent .............................. SUCCESS [  0.034 s]
[INFO] Phoenix Client ..................................... SUCCESS [05:16 min]
[INFO] Phoenix Client Embedded ............................ SUCCESS [05:08 min]
[INFO] Phoenix Server JAR ................................. SUCCESS [01:00 min]
[INFO] Phoenix Assembly ................................... SUCCESS [ 26.774 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  13:58 min
[INFO] Finished at: 2024-07-28T13:05:51+08:00
[INFO] ------------------------------------------------------------------------

spark (29 sub-projects, ~2 hours, OK)

./dev/make-distribution.sh --mvn mvn -Divy.home=/root/.ivy2 -Dsbt.ivy.home=/root/.ivy2 -Duser.home=/root -Drepo.maven.org= -Dreactor.repo=file:///root/.m2/repository -Dhadoop.version=3.3.6 -Dyarn.version=3.3.6 -Pyarn -Phadoop-3.2 -Phive -Phive-thriftserver -Psparkr -Pkubernetes -Pscala-2.12 -Dguava.version=27.0-jre -DskipTests
which make-distribution.sh expands to the following mvn invocation (the dateTimeFormat value contains a space, so it is quoted here for direct shell use):
mvn package -DskipTests -Dorg.slf4j.simpleLogger.showDateTime=true "-Dorg.slf4j.simpleLogger.dateTimeFormat=yyyy-MM-dd HH:mm:ss.SSS" -Divy.home=/root/.ivy2 -Dsbt.ivy.home=/root/.ivy2 -Duser.home=/root -Drepo.maven.org= -Dreactor.repo=file:///root/.m2/repository -Dhadoop.version=3.3.6 -Dyarn.version=3.3.6 -Pyarn -Phadoop-3.2 -Phive -Phive-thriftserver -Psparkr -Pkubernetes -Pscala-2.12 -Dguava.version=27.0-jre -DskipTests

2024-07-27 23:18:30.330 [INFO] ------------------------------------------------------------------------
2024-07-27 23:18:30.330 [INFO] Reactor Summary for Spark Project Parent POM 3.2.3:
2024-07-27 23:18:30.330 [INFO] 
2024-07-27 23:18:30.331 [INFO] Spark Project Parent POM ........................... SUCCESS [  5.161 s]
2024-07-27 23:18:30.332 [INFO] Spark Project Tags ................................. SUCCESS [  9.465 s]
2024-07-27 23:18:30.332 [INFO] Spark Project Sketch ............................... SUCCESS [ 10.156 s]
2024-07-27 23:18:30.332 [INFO] Spark Project Local DB ............................. SUCCESS [  2.971 s]
2024-07-27 23:18:30.333 [INFO] Spark Project Networking ........................... SUCCESS [  8.322 s]
2024-07-27 23:18:30.333 [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  3.003 s]
2024-07-27 23:18:30.333 [INFO] Spark Project Unsafe ............................... SUCCESS [ 14.102 s]
2024-07-27 23:18:30.333 [INFO] Spark Project Launcher ............................. SUCCESS [  2.603 s]
2024-07-27 23:18:30.334 [INFO] Spark Project Core ................................. SUCCESS [04:39 min]
2024-07-27 23:18:30.334 [INFO] Spark Project ML Local Library ..................... SUCCESS [ 52.463 s]
2024-07-27 23:18:30.335 [INFO] Spark Project GraphX ............................... SUCCESS [01:04 min]
2024-07-27 23:18:30.335 [INFO] Spark Project Streaming ............................ SUCCESS [01:57 min]
2024-07-27 23:18:30.336 [INFO] Spark Project Catalyst ............................. SUCCESS [05:23 min]
2024-07-27 23:18:30.336 [INFO] Spark Project SQL .................................. SUCCESS [08:20 min]
2024-07-27 23:18:30.336 [INFO] Spark Project ML Library ........................... SUCCESS [06:02 min]
2024-07-27 23:18:30.336 [INFO] Spark Project Tools ................................ SUCCESS [ 17.453 s]
2024-07-27 23:18:30.336 [INFO] Spark Project Hive ................................. SUCCESS [03:53 min]
2024-07-27 23:18:30.337 [INFO] Spark Project REPL ................................. SUCCESS [ 58.167 s]
2024-07-27 23:18:30.337 [INFO] Spark Project YARN Shuffle Service ................. SUCCESS [ 14.265 s]
2024-07-27 23:18:30.337 [INFO] Spark Project YARN ................................. SUCCESS [02:02 min]
2024-07-27 23:18:30.338 [INFO] Spark Project Kubernetes ........................... SUCCESS [01:57 min]
2024-07-27 23:18:30.339 [INFO] Spark Project Hive Thrift Server ................... SUCCESS [01:58 min]
2024-07-27 23:18:30.339 [INFO] Spark Project Assembly ............................. SUCCESS [  6.100 s]
2024-07-27 23:18:30.339 [INFO] Kafka 0.10+ Token Provider for Streaming ........... SUCCESS [ 54.621 s]
2024-07-27 23:18:30.340 [INFO] Spark Integration for Kafka 0.10 ................... SUCCESS [01:15 min]
2024-07-27 23:18:30.341 [INFO] Kafka 0.10+ Source for Structured Streaming ........ SUCCESS [02:21 min]
2024-07-27 23:18:30.341 [INFO] Spark Project Examples ............................. SUCCESS [01:34 min]
2024-07-27 23:18:30.341 [INFO] Spark Integration for Kafka 0.10 Assembly .......... SUCCESS [ 14.814 s]
2024-07-27 23:18:30.341 [INFO] Spark Avro ......................................... SUCCESS [01:54 min]
2024-07-27 23:18:30.341 [INFO] ------------------------------------------------------------------------
2024-07-27 23:18:30.342 [INFO] BUILD SUCCESS
2024-07-27 23:18:30.342 [INFO] ------------------------------------------------------------------------
2024-07-27 23:18:30.342 [INFO] Total time:  49:01 min
2024-07-27 23:18:30.343 [INFO] Finished at: 2024-07-27T23:18:30+08:00
2024-07-27 23:18:30.343 [INFO] ------------------------------------------------------------------------

mvn -Divy.home=/root/.ivy2 -Dsbt.ivy.home=/root/.ivy2 -Duser.home=/root -Drepo.maven.org= -Dreactor.repo=file:///root/.m2/repository -Dhadoop.version=3.3.6 -Dyarn.version=3.3.6 -Pyarn -Phadoop-3.2 -Phive -Phive-thriftserver -Psparkr -Pkubernetes -Pscala-2.12 -Dguava.version=27.0-jre install -DskipTests=true -Dorg.slf4j.simpleLogger.showDateTime=true '-Dorg.slf4j.simpleLogger.dateTimeFormat=yyyy-MM-dd HH:mm:ss.SSS'
2024-07-28 00:20:33.546 [INFO] ------------------------------------------------------------------------
2024-07-28 00:20:33.546 [INFO] Reactor Summary for Spark Project Parent POM 3.2.3:
2024-07-28 00:20:33.546 [INFO] 
2024-07-28 00:20:33.547 [INFO] Spark Project Parent POM ........................... SUCCESS [  8.573 s]
2024-07-28 00:20:33.547 [INFO] Spark Project Tags ................................. SUCCESS [ 10.006 s]
2024-07-28 00:20:33.548 [INFO] Spark Project Sketch ............................... SUCCESS [  7.142 s]
2024-07-28 00:20:33.548 [INFO] Spark Project Local DB ............................. SUCCESS [  8.655 s]
2024-07-28 00:20:33.548 [INFO] Spark Project Networking ........................... SUCCESS [ 18.733 s]
2024-07-28 00:20:33.548 [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 18.477 s]
2024-07-28 00:20:33.548 [INFO] Spark Project Unsafe ............................... SUCCESS [  8.548 s]
2024-07-28 00:20:33.548 [INFO] Spark Project Launcher ............................. SUCCESS [  7.839 s]
2024-07-28 00:20:33.549 [INFO] Spark Project Core ................................. SUCCESS [05:28 min]
2024-07-28 00:20:33.549 [INFO] Spark Project ML Local Library ..................... SUCCESS [01:07 min]
2024-07-28 00:20:33.549 [INFO] Spark Project GraphX ............................... SUCCESS [01:22 min]
2024-07-28 00:20:33.549 [INFO] Spark Project Streaming ............................ SUCCESS [02:28 min]
2024-07-28 00:20:33.549 [INFO] Spark Project Catalyst ............................. SUCCESS [07:59 min]
2024-07-28 00:20:33.549 [INFO] Spark Project SQL .................................. SUCCESS [10:54 min]
2024-07-28 00:20:33.550 [INFO] Spark Project ML Library ........................... SUCCESS [07:34 min]
2024-07-28 00:20:33.550 [INFO] Spark Project Tools ................................ SUCCESS [  7.285 s]
2024-07-28 00:20:33.550 [INFO] Spark Project Hive ................................. SUCCESS [04:07 min]
2024-07-28 00:20:33.551 [INFO] Spark Project REPL ................................. SUCCESS [01:07 min]
2024-07-28 00:20:33.551 [INFO] Spark Project YARN Shuffle Service ................. SUCCESS [ 23.160 s]
2024-07-28 00:20:33.551 [INFO] Spark Project YARN ................................. SUCCESS [02:25 min]
2024-07-28 00:20:33.551 [INFO] Spark Project Kubernetes ........................... SUCCESS [02:23 min]
2024-07-28 00:20:33.551 [INFO] Spark Project Hive Thrift Server ................... SUCCESS [02:25 min]
2024-07-28 00:20:33.551 [INFO] Spark Project Assembly ............................. SUCCESS [  5.738 s]
2024-07-28 00:20:33.551 [INFO] Kafka 0.10+ Token Provider for Streaming ........... SUCCESS [01:07 min]
2024-07-28 00:20:33.552 [INFO] Spark Integration for Kafka 0.10 ................... SUCCESS [01:31 min]
2024-07-28 00:20:33.552 [INFO] Kafka 0.10+ Source for Structured Streaming ........ SUCCESS [02:46 min]
2024-07-28 00:20:33.552 [INFO] Spark Project Examples ............................. SUCCESS [02:29 min]
2024-07-28 00:20:33.552 [INFO] Spark Integration for Kafka 0.10 Assembly .......... SUCCESS [ 16.961 s]
2024-07-28 00:20:33.552 [INFO] Spark Avro ......................................... SUCCESS [02:18 min]
2024-07-28 00:20:33.552 [INFO] ------------------------------------------------------------------------
2024-07-28 00:20:33.553 [INFO] BUILD SUCCESS
2024-07-28 00:20:33.553 [INFO] ------------------------------------------------------------------------
2024-07-28 00:20:33.553 [INFO] Total time:  01:02 h
2024-07-28 00:20:33.554 [INFO] Finished at: 2024-07-28T00:20:33+08:00
2024-07-28 00:20:33.554 [INFO] ------------------------------------------------------------------------

flink (207 subprojects, 42 minutes, ok)

mvn install -Drat.skip=true -DskipTests -Dhadoop.version=3.3.6
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Flink : 1.15.3:
[INFO] 
[INFO] Flink : ............................................ SUCCESS [  7.746 s]
[INFO] Flink : Annotations ................................ SUCCESS [  9.409 s]
[INFO] Flink : Architecture Tests ......................... SUCCESS [  0.393 s]
[INFO] Flink : Architecture Tests : Base .................. SUCCESS [  1.935 s]
[INFO] Flink : Test utils : ............................... SUCCESS [  0.334 s]
[INFO] Flink : Test utils : Junit ......................... SUCCESS [ 10.334 s]
[INFO] Flink : Metrics : .................................. SUCCESS [  0.805 s]
[INFO] Flink : Metrics : Core ............................. SUCCESS [  4.101 s]
[INFO] Flink : Core ....................................... SUCCESS [01:12 min]
[INFO] Flink : Table : .................................... SUCCESS [  0.385 s]
[INFO] Flink : Table : Common ............................. SUCCESS [ 22.496 s]
[INFO] Flink : Table : API Java ........................... SUCCESS [ 10.875 s]
[INFO] Flink : Java ....................................... SUCCESS [ 14.422 s]
[INFO] Flink : Connectors : ............................... SUCCESS [  0.227 s]
[INFO] Flink : Connectors : File Sink Common .............. SUCCESS [  1.395 s]
[INFO] Flink : RPC : ...................................... SUCCESS [  0.299 s]
[INFO] Flink : RPC : Core ................................. SUCCESS [  1.355 s]
[INFO] Flink : RPC : Akka ................................. SUCCESS [ 19.480 s]
[INFO] Flink : RPC : Akka-Loader .......................... SUCCESS [  7.192 s]
[INFO] Flink : Queryable state : .......................... SUCCESS [  0.329 s]
[INFO] Flink : Queryable state : Client Java .............. SUCCESS [  2.441 s]
[INFO] Flink : FileSystems : .............................. SUCCESS [  0.308 s]
[INFO] Flink : FileSystems : Hadoop FS .................... SUCCESS [  7.778 s]
[INFO] Flink : Runtime .................................... SUCCESS [02:13 min]
[INFO] Flink : Streaming Java ............................. SUCCESS [ 32.967 s]
[INFO] Flink : Table : API bridge base .................... SUCCESS [  1.123 s]
[INFO] Flink : Table : API Java bridge .................... SUCCESS [  2.388 s]
[INFO] Flink : Table : Code Splitter ...................... SUCCESS [  4.890 s]
[INFO] Flink : Optimizer .................................. SUCCESS [  8.934 s]
[INFO] Flink : Clients .................................... SUCCESS [  6.233 s]
[INFO] Flink : DSTL ....................................... SUCCESS [  0.231 s]
[INFO] Flink : DSTL : DFS ................................. SUCCESS [  1.950 s]
[INFO] Flink : State backends : ........................... SUCCESS [  0.222 s]
[INFO] Flink : State backends : RocksDB ................... SUCCESS [  4.026 s]
[INFO] Flink : State backends : Changelog ................. SUCCESS [  1.786 s]
[INFO] Flink : Test utils : Utils ......................... SUCCESS [  4.068 s]
[INFO] Flink : Libraries : ................................ SUCCESS [  0.225 s]
[INFO] Flink : Libraries : CEP ............................ SUCCESS [  7.600 s]
[INFO] Flink : Table : Runtime ............................ SUCCESS [ 18.623 s]
[INFO] Flink : Scala ...................................... SUCCESS [01:45 min]
[INFO] Flink : Table : SQL Parser ......................... SUCCESS [ 12.453 s]
[INFO] Flink : Table : SQL Parser Hive .................... SUCCESS [  6.941 s]
[INFO] Flink : Table : API Scala .......................... SUCCESS [ 26.600 s]
[INFO] Flink : Test utils : Connectors .................... SUCCESS [  3.125 s]
[INFO] Flink : Architecture Tests : Test .................. SUCCESS [  1.333 s]
[INFO] Flink : Connectors : Base .......................... SUCCESS [  3.567 s]
[INFO] Flink : Connectors : Files ......................... SUCCESS [  7.590 s]
[INFO] Flink : Examples : ................................. SUCCESS [  0.403 s]
[INFO] Flink : Examples : Batch ........................... SUCCESS [ 28.380 s]
[INFO] Flink : Connectors : Hadoop compatibility .......... SUCCESS [ 14.366 s]
[INFO] Flink : Tests ...................................... SUCCESS [01:20 min]
[INFO] Flink : Streaming Scala ............................ SUCCESS [01:04 min]
[INFO] Flink : Table : API Scala bridge ................... SUCCESS [ 21.147 s]
[INFO] Flink : Table : Planner ............................ SUCCESS [06:23 min]
[INFO] Flink : Formats : .................................. SUCCESS [  0.183 s]
[INFO] Flink : Format : Common ............................ SUCCESS [  0.461 s]
[INFO] Flink : Formats : Csv .............................. SUCCESS [  2.454 s]
[INFO] Flink : Formats : Hadoop bulk ...................... SUCCESS [  2.788 s]
[INFO] Flink : Formats : Orc .............................. SUCCESS [  3.154 s]
[INFO] Flink : Formats : Orc nohive ....................... SUCCESS [  2.782 s]
[INFO] Flink : Formats : Avro ............................. SUCCESS [  8.346 s]
[INFO] Flink : Formats : Parquet .......................... SUCCESS [ 17.089 s]
[INFO] Flink : Connectors : Hive .......................... SUCCESS [ 35.185 s]
[INFO] Flink : Python ..................................... SUCCESS [01:06 min]
[INFO] Flink : Table : SQL Client ......................... SUCCESS [  4.754 s]
[INFO] Flink : Connectors : AWS Base ...................... SUCCESS [  2.323 s]
[INFO] Flink : Connectors : Cassandra ..................... SUCCESS [  7.680 s]
[INFO] Flink : Formats : Json ............................. SUCCESS [  2.482 s]
[INFO] Flink : Connectors : Elasticsearch base ............ SUCCESS [  3.874 s]
[INFO] Flink : Connectors : Elasticsearch 6 ............... SUCCESS [  2.679 s]
[INFO] Flink : Connectors : Elasticsearch 7 ............... SUCCESS [  2.081 s]
[INFO] Flink : Connectors : Google PubSub ................. SUCCESS [  1.840 s]
[INFO] Flink : Connectors : HBase base .................... SUCCESS [  3.534 s]
[INFO] Flink : Connectors : HBase 1.4 ..................... SUCCESS [  7.560 s]
[INFO] Flink : Connectors : HBase 2.2 ..................... SUCCESS [  6.232 s]
[INFO] Flink : Connectors : JDBC .......................... SUCCESS [  5.505 s]
[INFO] Flink : Metrics : JMX .............................. SUCCESS [  0.838 s]
[INFO] Flink : Formats : Avro confluent registry .......... SUCCESS [  0.941 s]
[INFO] Flink : Connectors : Kafka ......................... SUCCESS [  9.958 s]
[INFO] Flink : Connectors : Amazon Kinesis Data Streams ... SUCCESS [  2.815 s]
[INFO] Flink : Connectors : Kinesis ....................... SUCCESS [ 32.955 s]
[INFO] Flink : Connectors : Nifi .......................... SUCCESS [  1.290 s]
[INFO] Flink : Connectors : Pulsar ........................ SUCCESS [ 22.619 s]
[INFO] Flink : Connectors : RabbitMQ ...................... SUCCESS [  1.688 s]
[INFO] Flink : Architecture Tests : Production ............ SUCCESS [  3.075 s]
[INFO] Flink : FileSystems : Hadoop FS shaded ............. SUCCESS [  7.840 s]
[INFO] Flink : FileSystems : S3 FS Base ................... SUCCESS [  2.241 s]
[INFO] Flink : FileSystems : S3 FS Hadoop ................. SUCCESS [ 14.848 s]
[INFO] Flink : FileSystems : S3 FS Presto ................. SUCCESS [01:40 min]
[INFO] Flink : FileSystems : OSS FS ....................... SUCCESS [ 24.308 s]
[INFO] Flink : FileSystems : Azure FS Hadoop .............. SUCCESS [ 29.502 s]
[INFO] Flink : FileSystems : Google Storage FS Hadoop ..... SUCCESS [ 35.837 s]
[INFO] Flink : Runtime web ................................ SUCCESS [04:48 min]
[INFO] Flink : Connectors : HCatalog ...................... SUCCESS [ 12.331 s]
[INFO] Flink : Connectors : Amazon Kinesis Data Firehose .. SUCCESS [  1.719 s]
[INFO] Flink : Connectors : SQL : Elasticsearch 6 ......... SUCCESS [  9.443 s]
[INFO] Flink : Connectors : SQL : Elasticsearch 7 ......... SUCCESS [ 12.243 s]
[INFO] Flink : Connectors : SQL : HBase 1.4 ............... SUCCESS [  8.191 s]
[INFO] Flink : Connectors : SQL : HBase 2.2 ............... SUCCESS [ 18.302 s]
[INFO] Flink : Connectors : SQL : Hive 1.2.2 .............. SUCCESS [ 14.127 s]
[INFO] Flink : Connectors : SQL : Hive 2.2.0 .............. SUCCESS [ 14.409 s]
[INFO] Flink : Connectors : SQL : Hive 2.3.6 .............. SUCCESS [ 13.488 s]
[INFO] Flink : Connectors : SQL : Hive 3.1.2 .............. SUCCESS [ 19.408 s]
[INFO] Flink : Connectors : SQL : Kafka ................... SUCCESS [  2.206 s]
[INFO] Flink : Connectors : SQL : Amazon Kinesis Data Streams SUCCESS [  3.958 s]
[INFO] Flink : Connectors : SQL : Amazon Kinesis Data Firehose SUCCESS [  3.819 s]
[INFO] Flink : Connectors : SQL : Kinesis ................. SUCCESS [ 12.975 s]
[INFO] Flink : Connectors : SQL : Pulsar .................. SUCCESS [  6.255 s]
[INFO] Flink : Connectors : SQL : RabbitMQ ................ SUCCESS [  0.450 s]
[INFO] Flink : Formats : Sequence file .................... SUCCESS [  1.351 s]
[INFO] Flink : Formats : Compress ......................... SUCCESS [  1.467 s]
[INFO] Flink : Formats : Avro AWS Glue Schema Registry .... SUCCESS [  3.423 s]
[INFO] Flink : Formats : JSON AWS Glue Schema Registry .... SUCCESS [  3.054 s]
[INFO] Flink : Formats : SQL Orc .......................... SUCCESS [  0.695 s]
[INFO] Flink : Formats : SQL Parquet ...................... SUCCESS [  1.107 s]
[INFO] Flink : Formats : SQL Avro ......................... SUCCESS [  1.702 s]
[INFO] Flink : Formats : SQL Avro Confluent Registry ...... SUCCESS [  1.819 s]
[INFO] Flink : Examples : Streaming ....................... SUCCESS [ 28.345 s]
[INFO] Flink : Examples : Table ........................... SUCCESS [ 17.437 s]
[INFO] Flink : Examples : Build Helper : .................. SUCCESS [  0.260 s]
[INFO] Flink : Examples : Build Helper : Streaming State machine SUCCESS [  1.097 s]
[INFO] Flink : Examples : Build Helper : Streaming Google PubSub SUCCESS [  6.774 s]
[INFO] Flink : Container .................................. SUCCESS [  0.685 s]
[INFO] Flink : Queryable state : Runtime .................. SUCCESS [  2.076 s]
[INFO] Flink : Dist-Scala ................................. SUCCESS [  3.018 s]
[INFO] Flink : Kubernetes ................................. SUCCESS [ 15.094 s]
[INFO] Flink : Yarn ....................................... SUCCESS [  4.876 s]
[INFO] Flink : Table : API Java Uber ...................... SUCCESS [  5.322 s]
[INFO] Flink : Table : Planner Loader Bundle .............. SUCCESS [  6.538 s]
[INFO] Flink : Table : Planner Loader ..................... SUCCESS [  4.900 s]
[INFO] Flink : Libraries : Gelly .......................... SUCCESS [  7.432 s]
[INFO] Flink : Libraries : Gelly scala .................... SUCCESS [ 38.454 s]
[INFO] Flink : Libraries : Gelly Examples ................. SUCCESS [ 21.924 s]
[INFO] Flink : External resources : ....................... SUCCESS [  0.153 s]
[INFO] Flink : External resources : GPU ................... SUCCESS [  0.445 s]
[INFO] Flink : Metrics : Dropwizard ....................... SUCCESS [  0.717 s]
[INFO] Flink : Metrics : Graphite ......................... SUCCESS [  0.533 s]
[INFO] Flink : Metrics : InfluxDB ......................... SUCCESS [  1.629 s]
[INFO] Flink : Metrics : Prometheus ....................... SUCCESS [  1.235 s]
[INFO] Flink : Metrics : StatsD ........................... SUCCESS [  0.576 s]
[INFO] Flink : Metrics : Datadog .......................... SUCCESS [  1.279 s]
[INFO] Flink : Metrics : Slf4j ............................ SUCCESS [  0.642 s]
[INFO] Flink : Libraries : CEP Scala ...................... SUCCESS [ 26.066 s]
[INFO] Flink : Libraries : State processor API ............ SUCCESS [  3.937 s]
[INFO] Flink : Dist ....................................... SUCCESS [ 26.987 s]
[INFO] Flink : Yarn Tests ................................. SUCCESS [ 11.118 s]
[INFO] Flink : E2E Tests : ................................ SUCCESS [  0.195 s]
[INFO] Flink : E2E Tests : CLI ............................ SUCCESS [  0.386 s]
[INFO] Flink : E2E Tests : Parent Child classloading program SUCCESS [  0.363 s]
[INFO] Flink : E2E Tests : Parent Child classloading lib-package SUCCESS [  0.243 s]
[INFO] Flink : E2E Tests : Dataset allround ............... SUCCESS [  0.454 s]
[INFO] Flink : E2E Tests : Dataset Fine-grained recovery .. SUCCESS [  0.532 s]
[INFO] Flink : E2E Tests : Datastream allround ............ SUCCESS [  1.913 s]
[INFO] Flink : E2E Tests : Batch SQL ...................... SUCCESS [  0.288 s]
[INFO] Flink : E2E Tests : Stream SQL ..................... SUCCESS [  0.352 s]
[INFO] Flink : E2E Tests : Distributed cache via blob ..... SUCCESS [  0.311 s]
[INFO] Flink : E2E Tests : High parallelism iterations .... SUCCESS [  9.859 s]
[INFO] Flink : E2E Tests : Stream stateful job upgrade .... SUCCESS [  0.861 s]
[INFO] Flink : E2E Tests : Queryable state ................ SUCCESS [  2.495 s]
[INFO] Flink : E2E Tests : Local recovery and allocation .. SUCCESS [  0.338 s]
[INFO] Flink : E2E Tests : Elasticsearch 6 ................ SUCCESS [  3.868 s]
[INFO] Flink : Quickstart : ............................... SUCCESS [  1.611 s]
[INFO] Flink : Quickstart : Java .......................... SUCCESS [  0.927 s]
[INFO] Flink : Quickstart : Scala ......................... SUCCESS [  0.390 s]
[INFO] Flink : E2E Tests : Quickstart ..................... SUCCESS [  0.698 s]
[INFO] Flink : E2E Tests : Confluent schema registry ...... SUCCESS [  3.283 s]
[INFO] Flink : E2E Tests : Stream state TTL ............... SUCCESS [  8.303 s]
[INFO] Flink : E2E Tests : SQL client ..................... SUCCESS [  1.151 s]
[INFO] Flink : E2E Tests : File sink ...................... SUCCESS [  0.492 s]
[INFO] Flink : E2E Tests : State evolution ................ SUCCESS [  1.045 s]
[INFO] Flink : E2E Tests : RocksDB state memory control ... SUCCESS [  0.867 s]
[INFO] Flink : E2E Tests : Common ......................... SUCCESS [  1.695 s]
[INFO] Flink : E2E Tests : Metrics availability ........... SUCCESS [  0.452 s]
[INFO] Flink : E2E Tests : Metrics reporter prometheus .... SUCCESS [  0.549 s]
[INFO] Flink : E2E Tests : Heavy deployment ............... SUCCESS [ 10.037 s]
[INFO] Flink : E2E Tests : Connectors : Google PubSub ..... SUCCESS [  1.107 s]
[INFO] Flink : E2E Tests : Streaming Kafka base ........... SUCCESS [  0.475 s]
[INFO] Flink : E2E Tests : Streaming Kafka ................ SUCCESS [  9.267 s]
[INFO] Flink : E2E Tests : Plugins : ...................... SUCCESS [  0.138 s]
[INFO] Flink : E2E Tests : Plugins : Dummy fs ............. SUCCESS [  0.198 s]
[INFO] Flink : E2E Tests : Plugins : Another dummy fs ..... SUCCESS [  0.216 s]
[INFO] Flink : E2E Tests : TPCH ........................... SUCCESS [  0.979 s]
[INFO] Flink : E2E Tests : Streaming Kinesis .............. SUCCESS [ 27.913 s]
[INFO] Flink : E2E Tests : Elasticsearch 7 ................ SUCCESS [  5.258 s]
[INFO] Flink : E2E Tests : Common Kafka ................... SUCCESS [  1.509 s]
[INFO] Flink : E2E Tests : TPCDS .......................... SUCCESS [  1.184 s]
[INFO] Flink : E2E Tests : Netty shuffle memory control ... SUCCESS [  0.310 s]
[INFO] Flink : E2E Tests : Python ......................... SUCCESS [  9.581 s]
[INFO] Flink : E2E Tests : HBase .......................... SUCCESS [  3.361 s]
[INFO] Flink : E2E Tests : Pulsar ......................... SUCCESS [  1.123 s]
[INFO] Flink : E2E Tests : Avro AWS Glue Schema Registry .. SUCCESS [  2.192 s]
[INFO] Flink : E2E Tests : JSON AWS Glue Schema Registry .. SUCCESS [  2.127 s]
[INFO] Flink : E2E Tests : Scala .......................... SUCCESS [ 11.629 s]
[INFO] Flink : E2E Tests : Kinesis SQL tests .............. SUCCESS [  0.717 s]
[INFO] Flink : E2E Tests : Kinesis Firehose SQL tests ..... SUCCESS [  0.666 s]
[INFO] Flink : E2E Tests : SQL ............................ SUCCESS [  0.558 s]
[INFO] Flink : State backends : Heap spillable ............ SUCCESS [  1.245 s]
[INFO] Flink : Table : Test Utils ......................... SUCCESS [  1.048 s]
[INFO] Flink : Contrib : .................................. SUCCESS [  0.134 s]
[INFO] Flink : Contrib : Connectors : Wikiedits ........... SUCCESS [  0.674 s]
[INFO] Flink : FileSystems : Tests ........................ SUCCESS [  1.941 s]
[INFO] Flink : Docs ....................................... SUCCESS [  3.384 s]
[INFO] Flink : Walkthrough : .............................. SUCCESS [  0.285 s]
[INFO] Flink : Walkthrough : Common ....................... SUCCESS [  2.847 s]
[INFO] Flink : Walkthrough : Datastream Java .............. SUCCESS [  0.316 s]
[INFO] Flink : Walkthrough : Datastream Scala ............. SUCCESS [  0.424 s]
[INFO] Flink : Tools : CI : Java .......................... SUCCESS [  0.584 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  41:53 min
[INFO] Finished at: 2024-07-27T00:58:18+08:00
[INFO] ------------------------------------------------------------------------

docker-centos build

  • Bigtop series - how to build RPM/DEB packages with Bigtop: https://www.bilibili.com/video/BV1DL411X7xZ
  • Download the build environment:
docker pull docker.fxxk.dedyn.io/bigtop/slaves:3.2.0-centos-7
# Map the local source directory G:\OpenSource\Data\platform\bigtop to /ws
# Map the local Maven repository directory to /root so maven, gradle, and ant can reuse it later
docker run -d -it -p 8000:8000 --network ambari -v G:\OpenSource\Data\platform\bigtop:/ws -v F:\docker\data\bigtop:/root --workdir /ws --name repo bigtop/slaves:3.2.0-centos-7


docker pull docker.fxxk.dedyn.io/bigtop/slaves:3.2.0-centos-7
docker pull docker.fxxk.dedyn.io/bigtop/slaves:trunk-centos-7
docker pull docker.fxxk.dedyn.io/bigtop/puppet:trunk-centos-7
docker pull mariadb:10.2
docker pull centos:7  
docker pull mysql:5.7
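The `docker run` above uses Windows-style host paths; on a Linux host the same container can be started with paths of your own. A minimal sketch, assuming hypothetical `BIGTOP_SRC` and `M2_CACHE` locations (it only builds and prints the command so it can be reviewed before running):

```shell
# BIGTOP_SRC and M2_CACHE are example paths; substitute your own bigtop
# checkout and Maven cache directory.
BIGTOP_SRC="$HOME/src/bigtop"
M2_CACHE="$HOME/docker-data/bigtop-root"

# Assemble the equivalent of the Windows command above for a Linux host.
cmd="docker run -d -it -p 8000:8000 --network ambari \
  -v ${BIGTOP_SRC}:/ws -v ${M2_CACHE}:/root \
  --workdir /ws --name repo bigtop/slaves:3.2.0-centos-7"
echo "$cmd"
```

Mapping the Maven cache to /root is what lets repeated builds skip re-downloading dependencies.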

bigtop source code analysis

packages.gradle

  • Task list
    • packages-help: All package build related tasks information
    • bom-json: List the components of the stack in json format
    • all-components: List the components of the stack
    • Per-component tasks (e.g. zookeeper)
      • ${component}-download: Download $component artifacts; skipped (no re-download) if the archive or build/xxx/.download already exists
      • ${component}-tar: Preparing a tarball for $component artifacts; skipped if build/xxx/.tar exists, otherwise extracts into build/spark/tar/xxxx-xxx
      • $component-deb: Building DEB for $component artifacts
      • $component-sdeb: Building SDEB for $component artifacts
      • $component-rpm: Building RPM for $component artifacts; skipped if build/xxx/.rpm exists, otherwise calls rpmbuild and copies the files under build/spark/rpm/RPMS to output/zookeeper
      • $component-srpm: Building SRPM for $component artifacts
      • $component-pkg: Invoking a native binary packaging component $ptype
      • $component-spkg: Invoking a native binary packaging component s$ptype
      • $component-pkg-ind: Invoking a native binary packaging for $component in Docker
      • $component-version: Show version of $component component
      • ${component}_vardefines: defines build variables for $component
      • $component-info: Info about $component component build
      • $component-relnotes: Preparing release notes for $component. No yet implemented!!!
      • $component-clean: Removing $component component build and output directories
      • $component-help: List of available tasks for $component
    • Tasks for all components
      • srpm: Build all SRPM packages for the stack components
      • rpm: Build all RPM packages for the stack
      • sdeb: Build all SDEB packages for the stack components
      • deb: Build all DEB packages for the stack components
      • pkgs: Build all native packages for the stack components
      • pkgs-ind: Build all native packages for the stack components inside Docker
      • allclean: Removing $BUILD_DIR, $OUTPUT_DIR, and $DIST_DIR; cleans all components' build and output directories
      • realclean: Removing $DL_DIR
    • apt: Creating APT repository
    • yum: Creating YUM repository
    • repo: Invoking a native repository target $
    • repo-ind: Invoking a native repository in Docker
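The download/tar/rpm tasks above skip work by checking hidden marker files under build/<component>. A minimal shell sketch of that skip-if-built pattern, with illustrative paths and a simulated rpmbuild step (this is not the actual gradle code):

```shell
# A marker file such as build/<component>/.rpm records that the step
# already ran; reruns see the marker and skip the build.
component=zookeeper
build_dir="build/${component}"
out_dir="output/${component}"

if [ -f "${build_dir}/.rpm" ]; then
    echo "${component}: .rpm marker present, skipping rebuild"
else
    mkdir -p "${build_dir}/rpm/RPMS" "${out_dir}"
    # The real task invokes rpmbuild here; we only simulate its output file.
    touch "${build_dir}/rpm/RPMS/${component}.rpm"
    # Copy the built RPMs into the component's output directory.
    cp -r "${build_dir}/rpm/RPMS/." "${out_dir}/"
    # Write the marker so subsequent runs take the skip branch.
    touch "${build_dir}/.rpm"
fi
```

This is why deleting build/<component> (or running $component-clean) forces a full rebuild of that component.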
