Hive 0.14.0, local standalone mode, with MySQL as the metastore database

Posted by abraham_dba_2013 on 2014-12-24

OS: Red Hat Linux 6.4
Hadoop version: 2.5.2
Hive version: 0.14.0
MySQL version: 5.6.15
1. Install the JDK. This was already done when the Hadoop cluster was deployed, so it can be skipped here.

2. Install the MySQL database
     Installation order:
          (1) MySQL-server
          (2) MySQL-client
          (3) Start the mysql service
          (4) Log in as root and create the hive metastore database and hive user via a grant:
     grant all on hive.* to 'hive'@'%' identified by 'hive';
     This database name, user, and password are needed later when configuring Hive!!!
Note:
For installing MySQL on Red Hat Linux, see http://blog.itpub.net/28929558/viewspace-1192693/
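The metastore bootstrap can also be scripted. A minimal sketch (database, user, and password all named hive as above; the file name is my own choice) that writes the SQL out for review before you feed it to mysql as root:

```shell
# Generate the metastore bootstrap SQL from step 2 (db/user/password 'hive').
# Review the file, then run:  mysql -u root -p < init_hive_metastore.sql
cat > init_hive_metastore.sql <<'EOF'
CREATE DATABASE IF NOT EXISTS hive;
GRANT ALL ON hive.* TO 'hive'@'%' IDENTIFIED BY 'hive';
FLUSH PRIVILEGES;
EOF
cat init_hive_metastore.sql
```

On MySQL 5.6 the `GRANT ... IDENTIFIED BY` form creates the user if it does not exist yet, which is why a single statement covers both the user and the privileges.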

3. Deploy Hive
(1) Download Hive from the official mirror: http://mirror.bit.edu.cn/apache/hive/
(2) Unpack it to your chosen install path
(3) Configure the Hive environment variables


  #set hive_env
  export HIVE_HOME=/home/zhang/hive
  export PATH=$PATH:/home/zhang/hive/bin
  export CLASSPATH=$CLASSPATH:/home/zhang/hive/lib
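After appending those lines to your shell profile (the paths are from my install; substitute your own), a quick sanity check confirms they took effect:

```shell
# Environment as configured above (adjust /home/zhang/hive to your install path)
export HIVE_HOME=/home/zhang/hive
export PATH=$PATH:$HIVE_HOME/bin
export CLASSPATH=$CLASSPATH:$HIVE_HOME/lib
# hive's bin directory should now be on PATH
echo "$PATH" | grep -q "$HIVE_HOME/bin" && echo "PATH ok"
```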

(4) Edit the configuration files under $HIVE_HOME/conf (copy the two templates and rename them):


  cp hive-env.sh.template hive-env.sh
  cp hive-default.xml.template hive-site.xml
(5) Edit hive-env.sh and point HADOOP_HOME at your Hadoop install:


  # Set HADOOP_HOME to point to a specific hadoop install directory
  HADOOP_HOME=/home/zhang/hadoop-2.5.2
(6) Edit hive-site.xml to set the MySQL JDBC driver, database name, user name, and password.
The modified properties are:


  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
    <description>username to use against metastore database</description>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
    <description>password to use against metastore database</description>
  </property>

  <property>
    <name>hive.metastore.local</name>
    <value>true</value>
    <description></description>
  </property>
These properties are unchanged from 0.11.0 (note that hive.metastore.local is deprecated in 0.14.0 and only produces a warning, as the logs below show), but the following ones need extra attention in 0.14.0:
Create a tmp I/O directory under the Hive install directory, then set its path in the following parameters:


  <property>
    <name>hive.querylog.location</name>
    <value>/home/zhang/hive/iotmp</value>
    <description>Location of Hive run time structured log file</description>
  </property>

  <property>
    <name>hive.exec.local.scratchdir</name>
    <value>/home/zhang/hive/iotmp</value>
    <description>Local scratch space for Hive jobs</description>
  </property>

  <property>
    <name>hive.downloaded.resources.dir</name>
    <value>/home/zhang/hive/iotmp</value>
    <description>Temporary local directory for added resources in the remote file system.</description>
  </property>
Note: without these settings, starting or using Hive fails with the following error:


  [zhang@namenode ~]$ hive
  14/12/17 03:04:45 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
  14/12/17 03:04:45 WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist
  Logging initialized using configuration in jar:file:/home/zhang/hive/lib/hive-common-0.14.0.jar!/hive-log4j.properties
  SLF4J: Class path contains multiple SLF4J bindings.
  SLF4J: Found binding in [jar:file:/home/zhang/hadoop-2.5.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: Found binding in [jar:file:/home/zhang/hive/lib/hive-jdbc-0.14.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
  SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
  Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
          at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:444)
          at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:672)
          at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:483)
          at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
  Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
          at org.apache.hadoop.fs.Path.initialize(Path.java:206)
          at org.apache.hadoop.fs.Path.<init>(Path.java:172)
          at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:487)
          at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:430)
          ... 7 more
  Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
          at java.net.URI.checkPath(URI.java:1823)
          at java.net.URI.<init>(URI.java:745)
          at org.apache.hadoop.fs.Path.initialize(Path.java:203)
          ... 10 more
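The root cause is that hive-site.xml (copied from hive-default.xml.template) still contains ${system:java.io.tmpdir}/${system:user.name} placeholders, which Hive 0.14.0 does not expand for these properties. Rather than editing each property by hand, the placeholders can be replaced in one pass with sed; a sketch, demonstrated on a throwaway sample file (for real use, point SITE at your $HIVE_HOME/conf/hive-site.xml and pick your own paths):

```shell
# Demo file standing in for $HIVE_HOME/conf/hive-site.xml
SITE=$(mktemp)
echo '<value>${system:java.io.tmpdir}/${system:user.name}</value>' > "$SITE"
# Replace the unexpanded placeholders with the iotmp dir and a fixed user name
sed -i 's|\${system:java.io.tmpdir}|/home/zhang/hive/iotmp|g; s|\${system:user.name}|hive|g' "$SITE"
cat "$SITE"   # -> <value>/home/zhang/hive/iotmp/hive</value>
```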

(7) Download mysql-connector-java-5.1.34-bin.jar from the MySQL website and put it in the $HIVE_HOME/lib directory.
Without this jar, starting Hive fails with the following error:


  Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
          at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:58)
          at org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)
          at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)
          ... 66 more
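A pre-flight check for the driver jar avoids this failure. A sketch, simulated here against a throwaway directory (for the real check, set LIB to "$HIVE_HOME/lib" and skip the touch line):

```shell
# Throwaway directory standing in for $HIVE_HOME/lib
LIB=$(mktemp -d)
touch "$LIB/mysql-connector-java-5.1.34-bin.jar"   # simulates the downloaded jar
if ls "$LIB"/mysql-connector-java-*.jar >/dev/null 2>&1; then
  echo "JDBC driver jar found"
else
  echo "driver jar missing - Hive will fail with DatastoreDriverNotFoundException"
fi
```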

Deployment is now complete, so start Hive and run a quick test.
Output like the following means the startup succeeded:


  [zhang@namenode ~]$ hive
  14/12/17 18:48:22 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
  14/12/17 18:48:22 WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist
  Logging initialized using configuration in jar:file:/home/zhang/hive/lib/hive-common-0.14.0.jar!/hive-log4j.properties
  SLF4J: Class path contains multiple SLF4J bindings.
  SLF4J: Found binding in [jar:file:/home/zhang/hadoop-2.5.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: Found binding in [jar:file:/home/zhang/hive/lib/hive-jdbc-0.14.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
  SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
  hive> show tables;
  OK
  Time taken: 0.924 seconds
  hive> show databases;
  OK
  default
  Time taken: 0.051 seconds, Fetched: 1 row(s)

Create a table as a test. If this part raises no errors, go check whether the corresponding files were generated in HDFS.


  hive> create table test(t_id int,t_name string) row format delimited fields terminated by '|' stored as textfile;
  OK
  Time taken: 1.586 seconds
  hive> show tables;
  OK
  test
  Time taken: 0.078 seconds, Fetched: 1 row(s)
  hive> select * from test;
  OK
  Time taken: 0.599 seconds
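The empty select is expected, since nothing has been loaded yet. To push a couple of rows through the new table, one could prepare a '|'-delimited file and load it with LOAD DATA (a sketch; the file path and sample rows are my own):

```shell
# Two sample rows matching test(t_id int, t_name string) with '|' as the delimiter
printf '1|alice\n2|bob\n' > /tmp/test_rows.txt
cat /tmp/test_rows.txt
# Then, inside the hive CLI:
#   LOAD DATA LOCAL INPATH '/tmp/test_rows.txt' INTO TABLE test;
#   select * from test;
```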
Check whether the related files were generated in HDFS:


  [zhang@datanode01 ~]$ hdfs dfs -ls /
  14/12/17 18:55:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  Found 3 items
  drwxr-xr-x   - zhang supergroup          0 2014-12-16 02:21 /input
  drwx-wx-wx   - zhang supergroup          0 2014-12-17 01:25 /tmp
  drwxr-xr-x   - zhang supergroup          0 2014-12-17 18:54 /user
  [zhang@datanode01 ~]$ hdfs dfs -ls /user/
  14/12/17 18:55:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  Found 1 items
  drwxr-xr-x   - zhang supergroup          0 2014-12-17 18:54 /user/hive
  [zhang@datanode01 ~]$ hdfs dfs -ls /user/hive/
  14/12/17 18:55:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  Found 1 items
  drwxr-xr-x   - zhang supergroup          0 2014-12-17 18:54 /user/hive/warehouse
  [zhang@datanode01 ~]$ hdfs dfs -ls /user/hive/warehouse/
  14/12/17 18:55:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  Found 1 items
  drwxr-xr-x   - zhang supergroup          0 2014-12-17 18:54 /user/hive/warehouse/test
The files are there, so the verification is complete.







From the ITPUB blog: http://blog.itpub.net/29439655/viewspace-1378888/. Please credit the source when reposting.
