1. Error when starting Hive:
java.lang.ExceptionInInitializerError
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:190)
at org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsPublisher.init(JDBCStatsPublisher.java:265)
at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:412)
Caused by: java.lang.SecurityException: sealing violation: package org.apache.derby.impl.jdbc.authentication is sealed
at java.net.URLClassLoader.getAndVerifyPackage(URLClassLoader.java:388)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:417)
Solution:
Copy the mysql-connector-java-5.1.6-bin.jar package into the $HIVE_HOME/lib directory.
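A minimal sketch of that step, assuming the connector jar was downloaded to /home/hadoop/soft (a hypothetical path):
# Put the MySQL JDBC driver on Hive's classpath
cp /home/hadoop/soft/mysql-connector-java-5.1.6-bin.jar $HIVE_HOME/lib/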
2. Error when starting Hive:
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.TerminalFactory.create(TerminalFactory.java:101)
at jline.TerminalFactory.get(TerminalFactory.java:158)
Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.console.ConsoleReader.<init>(ConsoleReader.java:230)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
Solution:
Copy the jline-2.12.jar package from the current Hive version's $HIVE_HOME/lib directory into $HADOOP_HOME/share/hadoop/yarn/lib, and delete the older jline jar shipped with Hadoop from that same $HADOOP_HOME/share/hadoop/yarn/lib directory, as shown below.
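A minimal sketch of the two steps; jline-0.9.94.jar is assumed to be the old jline jar name (the one typically shipped with Hadoop 2.x), so verify the actual file name before deleting:
# Copy Hive's newer jline into Hadoop's YARN lib directory
cp $HIVE_HOME/lib/jline-2.12.jar $HADOOP_HOME/share/hadoop/yarn/lib/
# Remove the old jline jar (name assumed; check with: ls $HADOOP_HOME/share/hadoop/yarn/lib/jline-*.jar)
rm $HADOOP_HOME/share/hadoop/yarn/lib/jline-0.9.94.jar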
3. Error when starting Hive:
Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
at org.apache.hadoop.fs.Path.initialize(Path.java:206)
at org.apache.hadoop.fs.Path.<init>(Path.java:172)
Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
at java.net.URI.checkPath(URI.java:1804)
at java.net.URI.<init>(URI.java:752)
at org.apache.hadoop.fs.Path.initialize(Path.java:203)
... 11 more
Solution:
1. Check hive-site.xml; you will see configuration entries whose values contain "system:java.io.tmpdir".
2. Create the directory ${HIVE_HOME}/logs (see the mkdir sketch after the properties below).
3. Change the values of the entries containing "system:java.io.tmpdir" to subdirectories of ${HIVE_HOME}/logs.
That is, add the following properties:
<property>
    <name>hive.exec.local.scratchdir</name>
    <value>${HIVE_HOME}/logs/HiveJobsLog</value>
    <description>Local scratch space for Hive jobs</description>
</property>
<property>
    <name>hive.downloaded.resources.dir</name>
    <value>${HIVE_HOME}/logs/ResourcesLog</value>
    <description>Temporary local directory for added resources in the remote file system.</description>
</property>
<property>
    <name>hive.querylog.location</name>
    <value>${HIVE_HOME}/logs/HiveRunLog</value>
    <description>Location of Hive run time structured log file</description>
</property>
<property>
    <name>hive.server2.logging.operation.log.location</name>
    <value>${HIVE_HOME}/logs/OpertitionLog</value>
    <description>Top level directory where operation logs are stored if logging functionality is enabled</description>
</property>
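A minimal sketch for creating the matching directories (the subdirectory names are simply the ones used in the property values above):
mkdir -p $HIVE_HOME/logs/{HiveJobsLog,ResourcesLog,HiveRunLog,OpertitionLog}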
4. Error when starting Hive:
Caused by: java.sql.SQLException: Access denied for user 'root'@'master' (using password: YES)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:946)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2870)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:812)
at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:3269)
Solution:
The MySQL password is incorrect; check whether the password configured in hive-site.xml matches the actual MySQL password, as in the check below.
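A minimal sketch of that check, assuming the metastore database host is master and the user is root, as in the error above:
# Show the password Hive is configured to use
grep -A 1 'javax.jdo.option.ConnectionPassword' $HIVE_HOME/conf/hive-site.xml
# Verify that the same password actually works against MySQL (enter it when prompted)
mysql -h master -u root -p -e "SELECT 1;"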
5. Error when working with table data (loading data into a table):
FAILED: RuntimeException org.apache.hadoop.security.AccessControlException: Permission denied: user=services02, access=EXECUTE, inode="/tmp":services01:supergroup:drwx------
解決方案:
user=services02與inode="/tmp":services01:supergroup不同時,說明hive登入的主機與HDFS的active狀態的主機不一樣
應把user=services02的主機變為HDFS的active狀態的主機.
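A minimal sketch for finding the active NameNode in an HA setup, assuming the NameNode IDs are nn1 and nn2 (hypothetical IDs; substitute the ones from hdfs-site.xml):
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2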
II. Extending Parquet functionality:
- 1. Create-table statement for a table stored as Parquet (Hive version 0.12):
create table parquet_test(x int, y string)
row format serde 'parquet.hive.serde.ParquetHiveSerDe'
stored as inputformat 'parquet.hive.DeprecatedParquetInputFormat'
outputformat 'parquet.hive.DeprecatedParquetOutputFormat';
Error:
FAILED: SemanticException [Error 10055]: Output Format must implement
HiveOutputFormat, otherwise it should be either IgnoreKeyTextOutputFormat or
SequenceFileOutputFormat
Solution:
The class parquet.hive.DeprecatedParquetOutputFormat is not on Hive's CLASSPATH.
Download the parquet-hive-1.2.5.jar package separately (this class ships under $IMPALA_HOME/lib) and create a soft link to it in the $HIVE_HOME/lib directory:
cd $HIVE_HOME/lib
ln -s /home/hadoop/soft/gz.zip/parquet-hive-1.2.5.jar
2. Submit the create-table statement again (Hive version 0.12):
create table parquet_test(x int, y string)
row format serde 'parquet.hive.serde.ParquetHiveSerDe'
stored as inputformat 'parquet.hive.DeprecatedParquetInputFormat'
outputformat 'parquet.hive.DeprecatedParquetOutputFormat';
Error:
Exception in thread "main" java.lang.NoClassDefFoundError: parquet/hadoop/api/WriteSupport
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
Solution:
Install Parquet via yum:
sudo yum -y install parquet
The Parquet jars are placed under /usr/lib/parquet; copy every jar in /usr/lib/parquet (except the javadoc.jar and sources.jar files) into the $HIVE_HOME/lib directory, as in the sketch below.
If yum cannot download the parquet package, you need to configure a yum repository that provides it; search online for instructions.
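A minimal sketch of the copy step, assuming the jars sit directly under /usr/lib/parquet:
# Copy all Parquet jars except the javadoc and sources jars into Hive's lib directory
find /usr/lib/parquet -maxdepth 1 -name '*.jar' ! -name '*javadoc*' ! -name '*sources*' -exec cp {} $HIVE_HOME/lib/ \;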