Compiling Hadoop 2.8.1 and Deploying HDFS in Pseudo-Distributed Mode on CentOS 6.5
The three major components of Hadoop 2.x:
MapReduce (and others) -- computation
YARN -- resource and job scheduling platform
HDFS -- storage
Environment preparation:
[root@hadoop001 software]# pwd
/opt/software
-rw-r--r--. 1 root root 8617253 May 14 07:05 apache-maven-3.3.9-bin.zip
-rw-r--r--. 1 root root 7546219 May 14 07:05 findbugs-1.3.9.zip
-rw-r--r--. 1 root root 34523353 May 14 07:05 hadoop-2.8.1-src.tar.gz
-rw-r--r--. 1 root root 424555111 May 14 07:09 hadoop-2.8.1.tar.gz
-rw-r--r--. 1 root root 173271626 May 14 07:09 jdk-8u45-linux-x64.gz
-rw-r--r--. 1 root root 96721446 May 14 07:07 .m2.tar.gz
-rw-r--r--. 1 root root 2401901 May 14 07:04 protobuf-2.5.0.tar.gz
1. Download the Hadoop source
[root@hadoop001 software]# tar -xzvf hadoop-2.8.1-src.tar.gz
# Inspect the build requirements shipped with the source
[root@hadoop001 hadoop-2.8.1-src]# cat BUILDING.txt
----------------------------------------------------------------------------------
Requirements:
* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
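Before starting the compile it is worth confirming that every prerequisite is actually on the PATH. A minimal sanity check, to be run after completing sections 2-5 below (versions shown are the ones used in this walkthrough):
java -version       # needs 1.7+; 1.8.0_45 here
mvn -version        # needs Maven 3.0+; 3.3.9 here
protoc --version    # must report exactly "libprotoc 2.5.0"
findbugs -version   # 1.3.9, only needed if running findbugs
cmake --version     # only needed when compiling native code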
2. Install Java
[root@hadoop001 ~]# mkdir -p /usr/java
[root@hadoop001 ~]# mv jdk-8u45-linux-x64.gz /usr/java
[root@hadoop001 ~]# cd /usr/java
[root@hadoop001 ~]# tar -xzvf jdk-8u45-linux-x64.gz
# Fix the owner and group of the extracted JDK
[root@hadoop001 java]# ll
total 169388
drwxr-xr-x. 8 uucp 143 4096 Apr 10 2015 jdk1.8.0_45
-rw-r--r--. 1 root root 173271626 May 14 07:09 jdk-8u45-linux-x64.gz
[root@hadoop001 java]# chown -R root:root jdk1.8.0_45
[root@hadoop001 java]# ll
total 169388
drwxr-xr-x. 8 root root 4096 Apr 11 2015 jdk1.8.0_45
-rw-r--r--. 1 root root 173271626 Mar 16 15:25 jdk-8u45-linux-x64.gz
# Configure Java environment variables globally
[root@hadoop001 java]# vi /etc/profile
export JAVA_HOME=/usr/java/jdk1.8.0_45
export PATH=$JAVA_HOME/bin:$PATH
Note: there is no need to uninstall the existing JDK; putting $JAVA_HOME/bin first on the PATH overrides the old one.
[root@hadoop001 java]# source /etc/profile
[root@hadoop001 java]# which java
/usr/java/jdk1.8.0_45/bin/java
[root@hadoop001 java]# java -version
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)
3. Install and configure Maven
[root@hadoop001 software]# ll
total 466948
-rw-r--r--. 1 root root 8617253 May 14 07:05 apache-maven-3.3.9-bin.zip
[root@hadoop001 software]# unzip apache-maven-3.3.9-bin.zip
[root@hadoop001 software]# ll
total 466952
drwxr-xr-x. 6 root root 4096 Nov 10 2015 apache-maven-3.3.9
-rw-r--r--. 1 root root 8617253 May 14 07:05 apache-maven-3.3.9-bin.zip
# Maven global environment variables
[root@hadoop001 java]# vi /etc/profile
export MAVEN_HOME=/opt/software/apache-maven-3.3.9
export PATH=$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@hadoop001 software]# source /etc/profile
[root@hadoop001 software]# which mvn
/opt/software/apache-maven-3.3.9/bin/mvn
[root@hadoop001 software]# mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-10T11:41:47-05:00)
Maven home: /opt/software/apache-maven-3.3.9
Java version: 1.8.0_45, vendor: Oracle Corporation
Java home: /usr/java/jdk1.8.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"
# Manually import a pre-populated Maven local repository (.m2)
[root@hadoop001 software]# mv .m2.tar.gz ~
[root@hadoop001 software]# cd ~
[root@hadoop001 ~]# tar -xzvf .m2.tar.gz
# Default Maven local repository path, and where to change it
[root@hadoop001 conf]# pwd
/opt/software/apache-maven-3.3.9/conf
[root@hadoop001 conf]# cat settings.xml
  <!-- localRepository
   | The path to the local repository maven will use to store artifacts.
   |
   | Default: ${user.home}/.m2/repository
  <localRepository>/path/to/local/repo</localRepository>
  -->
-- For a project built from a pom.xml, mvn resolves dependencies from this local repository when compiling, packaging and testing.
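If the imported repository should live somewhere other than the default ~/.m2/repository, Maven can be pointed at it. A small sketch, using a hypothetical path /opt/software/.m2/repository (not part of the original setup):
# Option 1: uncomment localRepository in conf/settings.xml, e.g.
#   <localRepository>/opt/software/.m2/repository</localRepository>
# Option 2: override it per build on the command line
mvn clean package -Dmaven.repo.local=/opt/software/.m2/repository -DskipTests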
4. Build and install protobuf from source
[root@hadoop001 software]# pwd
/opt/software
[root@hadoop001 software]# tar -xzvf protobuf-2.5.0.tar.gz
[root@hadoop001 software]# ll
total 466956
drwxr-xr-x. 10 109965 5000 4096 Feb 26 2013 protobuf-2.5.0
-rw-r--r--. 1 root root 2401901 May 14 07:04 protobuf-2.5.0.tar.gz
[root@hadoop001 software]# chown -R root:root protobuf-2.5.0
[root@hadoop001 software]# cd protobuf-2.5.0
[root@hadoop001 protobuf-2.5.0]# yum install -y gcc gcc-c++ make cmake
[root@hadoop001 protobuf-2.5.0]# ./configure --prefix=/usr/local/protobuf
[root@hadoop001 protobuf-2.5.0]# make && make install
# protobuf global environment variables
[root@hadoop001 java]# vi /etc/profile
export PROTOC_HOME=/usr/local/protobuf
export PATH=$PROTOC_HOME/bin:$FINDBUGS_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@hadoop001 protobuf-2.5.0]# source /etc/profile
[root@hadoop001 protobuf-2.5.0]# protoc --version
libprotoc 2.5.0
[root@hadoop001 protobuf-2.5.0]#
5. Install Findbugs
[root@hadoop001 software]# pwd
/opt/software
[root@hadoop001 software]# unzip findbugs-1.3.9.zip
[root@hadoop001 software]# ll
total 466960
drwxr-xr-x. 7 root root 4096 Aug 21 2009 findbugs-1.3.9
-rw-r--r--. 1 root root 7546219 May 14 07:05 findbugs-1.3.9.zip
# Findbugs global environment variables
[root@hadoop001 software]# vi /etc/profile
export FINDBUGS_HOME=/opt/software/findbugs-1.3.9
export PATH=$FINDBUGS_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@hadoop001 software]#
[root@hadoop001 software]# source /etc/profile
[root@hadoop001 software]# findbugs -version
1.3.9
6. Other dependencies
yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake
7. Compile the Hadoop source
[root@hadoop001 hadoop-2.8.1-src]# mvn clean package -Pdist,native -DskipTests -Dtar
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 3.050 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 7.795 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 5.657 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 6.914 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 5.264 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 7.341 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 1.698 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 6.050 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 6.549 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [ 3.666 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.016 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 47.066 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:39 min
[INFO] Finished at: 2018-05-14T22:30:38-04:00
[INFO] Final Memory: 190M/454M
[INFO] ------------------------------------------------------------------------
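With -Pdist and -Dtar, the build writes the binary distribution under hadoop-dist/target/ inside the source tree. A quick way to confirm the tarball exists and the native libraries were actually compiled in (a sketch, run from hadoop-2.8.1-src):
ls -lh hadoop-dist/target/hadoop-2.8.1.tar.gz                # tarball produced by -Dtar
hadoop-dist/target/hadoop-2.8.1/bin/hadoop checknative -a    # reports zlib/snappy/openssl support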
----------------------------------------------------------------------------------------------------------------------------
Hadoop deployment
Standalone          -- no daemons run
Pseudo-distributed  -- daemons run on 1 node; for development
Cluster deployment  -- daemons run on n nodes; for development/production
[root@hadoop001 software]# tar -xzvf hadoop-2.8.1.tar.gz
[root@hadoop001 software]# chown -R root:root hadoop-2.8.1
[root@hadoop001 hadoop-2.8.1]# ll
drwxrwxr-x. 2 root root 4096 Jun 2 2017 bin -- executable commands (shell scripts)
drwxrwxr-x. 3 root root 4096 Jun 2 2017 etc -- configuration files
drwxrwxr-x. 2 root root 4096 Jun 2 2017 include
drwxrwxr-x. 3 root root 4096 Jun 2 2017 lib -- libraries
drwxrwxr-x. 2 root root 4096 Jun 2 2017 libexec
-rw-rw-r--. 1 root root 99253 Jun 2 2017 LICENSE.txt
-rw-rw-r--. 1 root root 15915 Jun 2 2017 NOTICE.txt
-rw-r--r--. 1 root root 1366 Jun 2 2017 README.txt
drwxrwxr-x. 2 root root 4096 Jun 2 2017 sbin -- scripts to start/stop Hadoop
drwxrwxr-x. 4 root root 4096 Jun 2 2017 share -- jars
# Hadoop global environment variables
[root@hadoop001 hadoop-2.8.1]# vim /etc/profile
export HADOOP_HOME=/opt/software/hadoop-2.8.1
export PATH=$HADOOP_HOME/bin:$PROTOC_HOME/bin:$FINDBUGS_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@hadoop001 hadoop-2.8.1]# source /etc/profile
[root@hadoop001 hadoop-2.8.1]# which hadoop
/opt/software/hadoop-2.8.1/bin/hadoop
[root@hadoop001 bin]# rm -rf *.cmd
[root@hadoop001 sbin]# rm -rf *.cmd
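The *.cmd files are Windows launch scripts and are not needed on Linux. With the environment variable in place, a quick check that the freshly deployed binaries are the ones being picked up (a small sketch):
hadoop version      # should report Hadoop 2.8.1
which hdfs          # should resolve to /opt/software/hadoop-2.8.1/bin/hdfs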
# Configure core-site.xml and hdfs-site.xml (in $HADOOP_HOME/etc/hadoop)
[root@hadoop001 hadoop]# vim core-site.xml
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value> -- HDFS URI: NameNode host and port
    </property>
</configuration>
[root@hadoop001 hadoop]# vim hdfs-site.xml
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value> -- replication factor; set to 1 on a single-node deployment
    </property>
</configuration>
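To confirm that the two values are really being read from etc/hadoop, they can be queried back with hdfs getconf (a quick sketch; no daemons need to be running):
hdfs getconf -confKey fs.defaultFS       # expect hdfs://localhost:9000
hdfs getconf -confKey dfs.replication    # expect 1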
# Configure passwordless SSH to localhost
[root@hadoop001 hadoop]# cd ~
[root@hadoop001 ~]# cd .ssh
[root@hadoop001 .ssh]# ll
total 0
[root@hadoop001 ~]# rm -rf .ssh
[root@hadoop001 ~]# ssh-keygen
[root@hadoop001 ~]# cd .ssh
[root@hadoop001 .ssh]# ll
total 8
-rw-------. 1 root root 1675 May 14 23:38 id_rsa
-rw-r--r--. 1 root root 396 May 14 23:38 id_rsa.pub
[root@hadoop001 .ssh]# cat id_rsa.pub >> authorized_keys
[root@hadoop001 .ssh]# ll
total 12
-rw-r--r--. 1 root root 396 May 14 23:40 authorized_keys
-rw-------. 1 root root 1675 May 14 23:38 id_rsa
-rw-r--r--. 1 root root 396 May 14 23:38 id_rsa.pub
# Required on the first connection; note that the SSH port on this host has been changed from the default
[root@hadoop001 ~]# ssh localhost date
ssh: connect to host localhost port 22: Connection refused
[root@hadoop001 ~]# ssh -p2222 localhost date
The authenticity of host '[localhost]:2222 ([::1]:2222)' can't be established.
RSA key fingerprint is 09:b9:67:65:cb:e6:ca:31:5d:33:6c:3b:92:9e:c2:1a.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[localhost]:2222' (RSA) to the list of known hosts.
Mon May 14 23:42:37 EDT 2018
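Because sshd on this host listens on 2222 instead of 22, anything that shells out to ssh has to be told about the port. One option (a sketch, not part of the original post) is a per-host entry in ~/.ssh/config so that a plain `ssh localhost` works; the alternative actually used further below is HADOOP_SSH_OPTS in hadoop-env.sh.
# ~/.ssh/config
Host localhost 0.0.0.0
    Port 2222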
# Format the HDFS filesystem
[root@hadoop001 ~]# which hdfs
/opt/software/hadoop-2.8.1/bin/hdfs
[root@hadoop001 ~]# hdfs namenode -format
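On success the format output ends with a "Storage directory ... has been successfully formatted" message. Since no dfs.namenode.name.dir was configured, the NameNode metadata goes under the default hadoop.tmp.dir, i.e. /tmp/hadoop-${user.name}. A quick look (a sketch; path assumes the root user and default settings):
ls /tmp/hadoop-root/dfs/name/current/    # VERSION and fsimage_* appear after a successful format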
# Start HDFS
[root@hadoop001 sbin]# /opt/software/hadoop-2.8.1/sbin/start-dfs.sh
Starting namenodes on [localhost]
localhost: ssh: connect to host localhost port 22: Connection refused
localhost: ssh: connect to host localhost port 22: Connection refused
Starting secondary namenodes [0.0.0.0]
0.0.0.0: ssh: connect to host 0.0.0.0 port 22: Connection refused
# Fix the "port 22: Connection refused" error
[root@hadoop001 hadoop]# vi hadoop-env.sh
Add: export HADOOP_SSH_OPTS="-p 2222"
# Fix the "JAVA_HOME is not set" error
[root@hadoop001 hadoop]# vi hadoop-env.sh
export JAVA_HOME=/usr/java/jdk1.8.0_45
# Start again -- this time it succeeds
[root@hadoop001 hadoop]# /opt/software/hadoop-2.8.1/sbin/start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /opt/software/hadoop-2.8.1/logs/hadoop-root-namenode-hadoop001.out
localhost: starting datanode, logging to /opt/software/hadoop-2.8.1/logs/hadoop-root-datanode-hadoop001.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /opt/software/hadoop-2.8.1/logs/hadoop-root-secondarynamenode-hadoop001.out
# Verify the daemons are running
[root@hadoop001 hadoop-2.8.1]# jps
16243 Jps
15943 DataNode
5127 Launcher
16139 SecondaryNameNode
15853 NameNode
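With NameNode, DataNode and SecondaryNameNode all showing up in jps, a short smoke test confirms HDFS is actually usable (a sketch; port 50070 is the Hadoop 2.x NameNode web UI default):
hdfs dfs -mkdir -p /tmp/test
hdfs dfs -put /opt/software/hadoop-2.8.1/README.txt /tmp/test/
hdfs dfs -ls /tmp/test
# NameNode web UI: http://hadoop001:50070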