1 - Why Compile Hadoop Yourself
Most individual installations use Apache Hadoop (there are also distributions such as CDH Hadoop).
The binary packages downloaded from the Apache website were compiled on a handful of specific machines, so they are not compatible with every environment, especially the native libraries (used for compression, C program support, and so on); different platforms have different constraints.
2 - Preparing the Build Environment
1) Local system: macOS Big Sur, version 11.0.1;
Make sure the machine can reach the Internet. On a Linux system, the firewall and SELinux also need to be turned off:
service iptables stop
chkconfig iptables off
# Disable SELinux:
vim /etc/selinux/config
# Comment out: SELINUX=enforcing
# Add:         SELINUX=disabled
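On newer, systemd-based distributions (e.g. CentOS 7 and later), the iptables service commands above are replaced by firewalld; the equivalent steps are:
# Equivalent on systemd-based systems (CentOS 7+):
systemctl stop firewalld
systemctl disable firewalld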
2) Configure the JDK environment variables; version 1.8.0_162 is used here;
On Linux, the Java environment bundled with the system usually has to be removed first:
# List the installed versions:
rpm -qa | grep java
# Uninstall them:
rpm -e java-1.6.0-openjdk-1.6.0.41-1.13.13.1.el6_8.x86_64 java-1.7.0-openjdk-1.7.0.131-2.6.9.0.el6_8.x86_64
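With the old packages removed, a quick check confirms that the intended JDK is the active one (assuming the environment variables from item 4 below are already in place):
# Should report: java version "1.8.0_162"
java -version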
3) Install Maven, version 3.5.2;
To speed up dependency downloads, add the Aliyun Maven mirror inside the <mirrors> element of ~/.m2/settings.xml (or of conf/settings.xml under the Maven installation):
<mirror>
    <id>alimaven</id>
    <name>aliyun maven repo</name>
    <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
    <mirrorOf>central</mirrorOf>
</mirror>
4) Environment variable settings for the software above:
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home
export CLASSPATH=$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar:.
export PATH=$JAVA_HOME/bin:$PATH:.
export MAVEN_HOME=/usr/local/apache-maven-3.5.2
export PATH=$PATH:$MAVEN_HOME/bin
export HADOOP_HOME=/Users/healchow/bigdata/hadoop-3.2.1
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
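After reloading the profile (assuming the lines above live in ~/.bash_profile, as in the next section), mvn -version is a convenient single check: it prints both the Maven version and the JDK that Maven resolved:
source ~/.bash_profile
# Expect Apache Maven 3.5.2, with Java home under jdk1.8.0_162.jdk:
mvn -version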
3 - Installing Dependent Libraries
On macOS, dependencies are usually installed through Homebrew, which is quick and convenient.
It is advisable to point Homebrew at a mirror inside China, so that slow downloads do not make installs fail: https://zhuanlan.zhihu.com/p/351199589.
1) Install gcc, cmake, and the related GNU autotools:
brew install gcc cmake autoconf automake libtool
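Since the ZLIB_ROOT fix in section 5.1 below relies on CMake 3.12+ semantics, it is worth confirming the installed version right away:
# The CMP0074 policy used in section 5.1 requires CMake >= 3.12 (3.20.5 here):
cmake --version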
2) Install the gzip, bzip2, zlib, snappy and other compression libraries:
brew install gzip bzip2 zlib
Install snappy 1.1.4 manually: other versions will cause build errors later!
# Download and extract:
wget https://github.com/google/snappy/archive/1.1.4.tar.gz
tar -zxf 1.1.4.tar.gz
cd snappy-1.1.4
# Specify the install prefix so brew can link it (otherwise it installs to /usr/local/bin):
./autogen.sh
./configure --prefix=/usr/local/Cellar/snappy/1.1.4
# Compile and install to the path above:
make && make install
# Link it onto the standard paths:
brew link snappy
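As a sanity check (assuming the default Homebrew layout under /usr/local), the linked headers and libraries should now be visible on the standard paths:
# Both should list the freshly linked snappy 1.1.4 files:
ls /usr/local/include/snappy*.h
ls /usr/local/lib/libsnappy*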
3) Install the openssl dependency and configure its environment variables:
brew install openssl
Add these environment variables to ~/.bash_profile:
export OPENSSL_ROOT_DIR="/usr/local/opt/openssl@1.1"
export OPENSSL_INCLUDE_DIR="$OPENSSL_ROOT_DIR/include"
export PKG_CONFIG_PATH="${OPENSSL_ROOT_DIR}/lib/pkgconfig"
# After saving, make the variables take effect immediately:
source ~/.bash_profile
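If pkg-config is available (brew install pkg-config), a quick check confirms that brew's OpenSSL is the one being resolved, rather than the LibreSSL bundled with macOS:
# Should print a 1.1.x version:
pkg-config --modversion openssl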
4) Manually install protobuf 2.5.0:
This exact version is required; otherwise the source build fails with:
org.apache.maven.plugin.MojoExecutionException: protoc version is 'libprotoc 3.11.2', expected version is '2.5.0'
The brew repository only carries the latest version, not 2.5.0, so it has to be downloaded and built from source.
Download link: https://github.com/protocolbuffers/protobuf/releases/tag/v2.5.0. After extracting, compile and install:
wget https://github.com/google/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz
tar -zxf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
# Specify the install prefix so brew can link it (otherwise it installs to /usr/local/bin):
./configure --prefix=/usr/local/Cellar/protobuf/2.5.0
# Compile and install to the path above:
make && make install
# Link it onto the standard paths:
brew link protobuf
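A quick check that the expected protoc is now first on the PATH; this is exactly the version string the Maven plugin compares against:
# Should print: libprotoc 2.5.0
protoc --version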
5) Optionally, install isa-l:
First install nasm: brew install nasm
Then download the source package (https://github.com/intel/isa-l/releases) and compile and install it:
cd isa-l-2.28.0
# Generate the configure script:
autoreconf --install --symlink -f
./configure --prefix=/usr/local/Cellar/isa-l --libdir=/usr/local/Cellar/isa-l/lib AS=yasm --target=darwin
# Compile and install:
make && make install
# Create the symlinks:
cd /usr/local/lib
ln -s /usr/local/Cellar/isa-l/lib/libisal.2.dylib libisal.2.dylib
ln -s /usr/local/Cellar/isa-l/lib/libisal.a libisal.a
ln -s /usr/local/Cellar/isa-l/lib/libisal.dylib libisal.dylib
ln -s /usr/local/Cellar/isa-l/lib/libisal.la libisal.la
cd /usr/local/lib/pkgconfig
ln -s /usr/local/Cellar/isa-l/lib/pkgconfig/libisal.pc libisal.pc
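With the libisal.pc symlink in place, pkg-config (if installed) should now resolve the library, which confirms the links point to the right place:
# Should print the isa-l version, e.g. 2.28.0:
pkg-config --modversion libisal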
4 - Compiling the Hadoop Source
Download the Apache Hadoop source package; version 3.2.1 is used here (https://archive.apache.org/dist/hadoop/core/hadoop-3.2.1/);
After downloading, extract it to ${HOME}/bigdata/.
The build command is:
cd ${HOME}/bigdata/hadoop-3.2.1-src
# To build snappy support, openssl.prefix must be specified; otherwise the openssl bundled with macOS is used and the build fails:
# The -e -X flags print the complete build log:
mvn clean package -DskipTests -Pdist,native -Dmaven.javadoc.skip -Dtar \
-Drequire.bzip2 -Dbzip2.prefix=/usr/local/Cellar/bzip2/1.0.8 \
-Drequire.openssl -Dopenssl.prefix=/usr/local/Cellar/openssl@1.1/1.1.1k \
-Drequire.snappy -Dsnappy.lib=/usr/local/Cellar/snappy/1.1.4/lib \
-Drequire.isal -Disal.prefix=/usr/local/Cellar/isa-l -Disal.lib=/usr/local/Cellar/isa-l/lib \
-e -X
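One tip that saves time with the failures described in section 5: after fixing a failing module, Maven's standard -rf (resume-from) flag restarts the reactor at that module instead of rebuilding everything. A sketch reusing the flags above (the module :hadoop-common is just an example; take the real name from the failure line of the reactor summary):
# Resume a failed build from a specific module, keeping the same flags (but dropping clean):
mvn package -DskipTests -Pdist,native -Dmaven.javadoc.skip -Dtar \
-Drequire.bzip2 -Dbzip2.prefix=/usr/local/Cellar/bzip2/1.0.8 \
-Drequire.openssl -Dopenssl.prefix=/usr/local/Cellar/openssl@1.1/1.1.1k \
-Drequire.snappy -Dsnappy.lib=/usr/local/Cellar/snappy/1.1.4/lib \
-Drequire.isal -Disal.prefix=/usr/local/Cellar/isa-l -Disal.lib=/usr/local/Cellar/isa-l/lib \
-rf :hadoop-common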
5 - Problems Encountered and How They Were Solved
5.1 Build error in the hadoop-common module
[WARNING] CMake Warning (dev) at CMakeLists.txt:47 (find_package):
[WARNING] Policy CMP0074 is not set: find_package uses <PackageName>_ROOT variables.
[WARNING] Run "cmake --help-policy CMP0074" for policy details. Use the cmake_policy
[WARNING] command to set the policy and suppress this warning.
[WARNING]
[WARNING] Environment variable ZLIB_ROOT is set to:
[WARNING]
[WARNING] /usr/local/Cellar/zlib/1.2.11/
[WARNING]
[WARNING] For compatibility, CMake is ignoring the variable.
[WARNING] This warning is for project developers. Use -Wno-dev to suppress it.
[WARNING]
[WARNING] CMake Error at /usr/local/Cellar/cmake/3.20.5/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
[WARNING] Could NOT find ZLIB (missing: ZLIB_LIBRARY) (found version "1.2.11")
[WARNING] Call Stack (most recent call first):
[WARNING] /usr/local/Cellar/cmake/3.20.5/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:594 (_FPHSA_FAILURE_MESSAGE)
[WARNING] /usr/local/Cellar/cmake/3.20.5/share/cmake/Modules/FindZLIB.cmake:120 (FIND_PACKAGE_HANDLE_STANDARD_ARGS)
[WARNING] CMakeLists.txt:47 (find_package)
It says that ZLIB_LIBRARY could not be found and that ZLIB_ROOT was ignored. My environment variables at the time:
export ZLIB_ROOT=/usr/local/Cellar/zlib/1.2.11
export ZLIB_LIBRARY=/usr/local/Cellar/zlib/1.2.11/lib
export ZLIB_INCLUDE_DIR=/usr/local/Cellar/zlib/1.2.11/include
After some digging it turned out that, since CMake 3.12, find_package honors the <PackageName>_ROOT variables, but only when policy CMP0074 is set to NEW; for backward compatibility, CMake otherwise ignores them, which is exactly what the warning above says.
See also the analysis in this excellent write-up (https://github.com/MarkDana/Compile-Hadoop2.2.0-on-MacOS):
If zlib was installed through brew, the relevant CMake module can be inspected at this path:
cd /usr/local/Cellar/cmake/3.20.5/share/cmake/Modules
vim FindZLIB.cmake
Around line 60, it first consults the ZLIB_ROOT variable:
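The logic there looks roughly like this (paraphrased from CMake's bundled FindZLIB.cmake; exact line numbers vary between CMake versions):
# Search ZLIB_ROOT first if it is set.
if(ZLIB_ROOT)
  set(_ZLIB_SEARCH_ROOT PATHS ${ZLIB_ROOT} NO_DEFAULT_PATH)
  list(APPEND _ZLIB_SEARCHES _ZLIB_SEARCH_ROOT)
endif()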
So it is enough to set just ZLIB_ROOT; but for that variable to take effect, CMake's CMP0074 policy has to be enabled in the CMakeLists.
Edit the CMake configuration of the failing project (vim hadoop-common-project/hadoop-common/src/CMakeLists.txt) and enable the new behavior:
# Add this line right after cmake_minimum_required(VERSION 3.1 FATAL_ERROR):
cmake_policy(SET CMP0074 NEW)
Finally, only this one line needs to remain among the environment variables:
export ZLIB_ROOT=/usr/local/Cellar/zlib/1.2.11
After that, this error was gone.
5.2 hadoop-common module: another error
[WARNING] CMake Warning (dev) in CMakeLists.txt:
[WARNING] No project() command is present. The top-level CMakeLists.txt file must
[WARNING] contain a literal, direct call to the project() command. Add a line of
[WARNING] code such as
[WARNING]
[WARNING] project(ProjectName)
[WARNING]
[WARNING] near the top of the file, but after cmake_minimum_required().
[WARNING]
[WARNING] CMake is pretending there is a "project(Project)" command on the first
[WARNING] line.
[WARNING] This warning is for project developers. Use -Wno-dev to suppress it.
[WARNING]
[WARNING] CMake Error at CMakeLists.txt:68 (message):
[WARNING] Required bzip2 library and/or header files could not be found.
[WARNING]
[WARNING]
[WARNING] -- Configuring incomplete, errors occurred!
The bzip2 library or its header files could not be found... yet my bzip2 environment variables were all set:
export BZIP2_ROOT=/usr/local/Cellar/bzip2/1.0.8
export BZIP2_INCLUDE_DIR=/usr/local/Cellar/bzip2/1.0.8/include
export BZIP2_LIBRARY=/usr/local/Cellar/bzip2/1.0.8
All the other environment variables and the LDFLAGS/CPPFLAGS settings had no effect either.
After much searching, it seems that bzip2 support simply cannot be compiled this way on macOS. So I changed the check in the same CMakeLists.txt to skip it:
# Change the line below to test REQUIRE_BZIP2 directly (it is TRUE here, since -Drequire.bzip2 was passed):
# if(BZIP2_INCLUDE_DIR AND BZIP2_LIBRARIES)
if(REQUIRE_BZIP2)
5.3 Build error in the MapReduce NativeTask module
[WARNING] 2 warnings and 12 errors generated.
[WARNING] make[2]: *** [CMakeFiles/nttest.dir/main/native/test/TestCompressions.cc.o] Error 1
[WARNING] make[2]: *** Waiting for unfinished jobs....
[WARNING] make[1]: *** [CMakeFiles/nttest.dir/all] Error 2
[WARNING] make: *** [all] Error 2
......
[INFO] Apache Hadoop MapReduce NativeTask ................. FAILURE [ 21.506 s]
Searching revealed that the snappy installed by brew is the latest 1.1.9, which is compiled with C++11, and the Hadoop 3.2.1 build does not support C++11.
Along the way I also tried installing snappy 1.1.5, but the build still failed:
[WARNING] CMake Error at CMakeLists.txt:96 (message):
[WARNING] Required snappy library could not be found.
[WARNING] SNAPPY_LIBRARY=SNAPPY_LIBRARY-NOTFOUND, SNAPPY_INCLUDE_DIR=,
[WARNING] CUSTOM_SNAPPY_INCLUDE_DIR=, CUSTOM_SNAPPY_PREFIX=, CUSTOM_SNAPPY_INCLUDE=
Adding these environment variables did not help either:
export SNAPPY_LIBRARY=/usr/local/Cellar/snappy/1.1.5
export SNAPPY_INCLUDE_DIR=/usr/local/Cellar/snappy/1.1.5/include
# The following error still appears:
Required snappy library could not be found.
[WARNING] SNAPPY_LIBRARY=SNAPPY_LIBRARY-NOTFOUND, SNAPPY_INCLUDE_DIR=,
[WARNING] CUSTOM_SNAPPY_INCLUDE_DIR=, CUSTOM_SNAPPY_PREFIX=, CUSTOM_SNAPPY_INCLUDE=
I found one method of compiling snappy, but verified that it does not work on macOS Big Sur: download https://github.com/electrum/hadoop-snappy.git, build it, and copy the output into the native directory of the Hadoop cluster.
There is also another write-up that solves the snappy version problem; I did not try it, but it is listed here for reference: https://blog.csdn.net/weixin_44570264/article/details/106846117
So I installed snappy 1.1.4 as described at the top, ran the build again, and it finally succeeded ✌️
6 - Build Succeeded: Testing and Verification
The build command was already given in section 4 above. Here is the proof of the successful build ✌️
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 1.893 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 4.338 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 1.560 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 2.337 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.359 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 1.777 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 3.786 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 0.951 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 6.846 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 1.994 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [ 54.530 s]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 3.630 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 5.173 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.118 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 27.638 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [ 31.633 s]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [02:38 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 4.768 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 1.722 s]
[INFO] Apache Hadoop HDFS-RBF ............................. SUCCESS [ 5.303 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.042 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [ 0.054 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 6.738 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [ 9.302 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [ 2.945 s]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [ 0.133 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [ 8.103 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [ 40.942 s]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [ 1.310 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [ 2.386 s]
[INFO] Apache Hadoop YARN Timeline Service ................ SUCCESS [ 1.992 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 12.021 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [ 1.714 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [ 2.445 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [ 1.740 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [ 1.592 s]
[INFO] Apache Hadoop YARN TimelineService HBase Backend ... SUCCESS [ 0.061 s]
[INFO] Apache Hadoop YARN TimelineService HBase Common .... SUCCESS [ 2.382 s]
[INFO] Apache Hadoop YARN TimelineService HBase Client .... SUCCESS [ 2.167 s]
[INFO] Apache Hadoop YARN TimelineService HBase Servers ... SUCCESS [ 0.124 s]
[INFO] Apache Hadoop YARN TimelineService HBase Server 1.2 SUCCESS [ 2.625 s]
[INFO] Apache Hadoop YARN TimelineService HBase tests ..... SUCCESS [ 3.917 s]
[INFO] Apache Hadoop YARN Router .......................... SUCCESS [ 1.785 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [ 0.119 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [ 1.679 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [ 1.112 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [ 0.196 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 5.185 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 2.387 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [ 1.852 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [ 3.299 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [ 1.948 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [ 3.972 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 1.252 s]
[INFO] Apache Hadoop YARN Services ........................ SUCCESS [ 0.040 s]
[INFO] Apache Hadoop YARN Services Core ................... SUCCESS [ 2.626 s]
[INFO] Apache Hadoop YARN Services API .................... SUCCESS [ 1.434 s]
[INFO] Apache Hadoop Image Generation Tool ................ SUCCESS [ 0.980 s]
[INFO] Yet Another Learning Platform ...................... SUCCESS [ 1.346 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [ 0.044 s]
[INFO] Apache Hadoop YARN UI .............................. SUCCESS [ 0.069 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [ 9.978 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [ 0.671 s]
[INFO] Apache Hadoop MapReduce NativeTask ................. SUCCESS [ 39.343 s]
[INFO] Apache Hadoop MapReduce Uploader ................... SUCCESS [ 0.862 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 1.086 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [ 4.303 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 0.906 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 1.362 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 0.496 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [ 0.599 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 1.424 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 0.968 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 0.466 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 0.543 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 4.609 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 0.960 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 3.611 s]
[INFO] Apache Hadoop Kafka Library support ................ SUCCESS [ 0.838 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 2.333 s]
[INFO] Apache Hadoop Aliyun OSS support ................... SUCCESS [ 0.449 s]
[INFO] Apache Hadoop Client Aggregator .................... SUCCESS [ 3.429 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 2.420 s]
[INFO] Apache Hadoop Resource Estimator Service ........... SUCCESS [ 1.536 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [ 0.592 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 10.087 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.048 s]
[INFO] Apache Hadoop Client API ........................... SUCCESS [01:24 min]
[INFO] Apache Hadoop Client Runtime ....................... SUCCESS [01:05 min]
[INFO] Apache Hadoop Client Packaging Invariants .......... SUCCESS [ 0.262 s]
[INFO] Apache Hadoop Client Test Minicluster .............. SUCCESS [02:00 min]
[INFO] Apache Hadoop Client Packaging Invariants for Test . SUCCESS [ 0.174 s]
[INFO] Apache Hadoop Client Packaging Integration Tests ... SUCCESS [ 0.151 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 22.983 s]
[INFO] Apache Hadoop Client Modules ....................... SUCCESS [ 0.053 s]
[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [ 0.607 s]
[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [ 0.054 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14:07 min
[INFO] Finished at: 2021-06-30T00:10:40+08:00
[INFO] Final Memory: 406M/2110M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "dev" could not be activated because it does not exist.
The compiled distribution files are in the directory below; the native libraries we need are in its lib/native subdirectory:
${source_dir}/hadoop-dist/target/hadoop-3.2.1
# The native libraries are here:
${source_dir}/hadoop-dist/target/hadoop-3.2.1/lib/native
Copy the files under native into the installation directory of the existing Hadoop cluster, then check the cluster's native library support:
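The check uses Hadoop's built-in checknative command, run on the cluster machine:
# -a checks and reports all native libraries (zlib, snappy, bzip2, openssl, ISA-L):
hadoop checknative -a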
The annoying WARN messages are gone, and the zlib, snappy and other compression codecs are all available ✌️
7 - Lessons Learned
1) Prefer CentOS for this build. On macOS, most of the native libraries fail to build, and the process gets stuck at the CMake stage.
2) The extra environment variables used were:
# For a native Hadoop build, ZLIB_ROOT must be set, and CMake's CMP0074 policy must be enabled in the CMakeLists:
export ZLIB_ROOT=/usr/local/Cellar/zlib/1.2.11
# export ZLIB_LIBRARY=/usr/local/Cellar/zlib/1.2.11/lib
# export ZLIB_INCLUDE_DIR=/usr/local/Cellar/zlib/1.2.11/include
export OPENSSL_ROOT_DIR="/usr/local/opt/openssl@1.1"
export OPENSSL_INCLUDE_DIR="$OPENSSL_ROOT_DIR/include"
export PKG_CONFIG_PATH="${OPENSSL_ROOT_DIR}/lib/pkgconfig"
References (in order of usefulness)
- mac系統編譯3.2.1版本hadoop
- Mac OSX 編譯hadoop本地庫
- Compile-Hadoop2.2.0-on-MacOS
- hadoop原始碼研究 編譯錯誤記錄