Sqoop Installation, Configuration, and Usage
Transferring data between Hive and MySQL/PostgreSQL
Installing Sqoop
Upload the installation packages to the hive-server container:
$ docker cp mysql-connector-java-5.1.28.jar dc387ff5c56d:/opt/mysql-connector-java-5.1.28.jar
$ docker cp sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz dc387ff5c56d:/opt/sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz
Enter the hive-server container and install Sqoop:
$ docker exec -it dc387ff5c56d /bin/bash
# tar -zxvf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz
# rm sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz
# mv sqoop-1.4.6.bin__hadoop-2.0.4-alpha/ sqoop
# mv mysql-connector-java-5.1.28.jar sqoop/lib/mysql-connector-java-5.1.28.jar
# cp sqoop/conf/sqoop-env-template.sh sqoop/conf/sqoop-env.sh
# vim sqoop/conf/sqoop-env.sh
#Set the path to where bin/hive is available
export HIVE_HOME=/opt/hive/
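Setting only `HIVE_HOME` is enough for the MySQL→Hive imports below, but the warnings in the next section come from the other unset homes. A fuller `sqoop-env.sh` might look like the following sketch; the Hadoop paths are an assumption based on the `/opt/hadoop-2.7.1` directory visible in the logs later, so adjust them to your container:

```shell
# sqoop/conf/sqoop-env.sh -- minimal sketch for this container layout.
# The Hadoop paths below are assumptions (taken from /opt/hadoop-2.7.1
# seen in the import logs); verify them inside your own container.

# Where Hadoop common and MapReduce live (used by import jobs)
export HADOOP_COMMON_HOME=/opt/hadoop-2.7.1
export HADOOP_MAPRED_HOME=/opt/hadoop-2.7.1

# Set the path to where bin/hive is available
export HIVE_HOME=/opt/hive/

# HBASE_HOME / HCAT_HOME / ACCUMULO_HOME / ZOOKEEPER_HOME are left unset;
# Sqoop prints warnings for them, but plain MySQL->Hive imports still work.
```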
Verify the connection by listing the MySQL databases through Sqoop:
# /opt/sqoop/bin/sqoop list-databases --connect jdbc:mysql://192.168.20.204:3306/ --username ad_user --password 'dl_ad!@#123'
Warning: /opt/sqoop/bin/../../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /opt/sqoop/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/sqoop/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /opt/sqoop/bin/../../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
21/07/30 10:24:23 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
21/07/30 10:24:23 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
21/07/30 10:24:23 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
addata
dl_ad
dl_college
e_statistics
i_commodity
i_marketing
i_member
i_operation
i_order
information_schema
log
mysql
performance_schema
test
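Note the `BaseSqoopTool` warning above: a password on the command line ends up in shell history and process listings. Sqoop also accepts `-P` (interactive prompt) or `--password-file`. A sketch of the password-file approach, using an illustrative path:

```shell
# Sketch: keep the DB password out of the command line and shell history,
# as the BaseSqoopTool warning suggests. The path is illustrative only.
PWFILE=/tmp/sqoop.pwd
rm -f "$PWFILE"                     # allow re-runs even with 400 perms
printf 'dl_ad!@#123' > "$PWFILE"    # printf: no trailing newline
chmod 400 "$PWFILE"                 # owner read-only

# Then reference it with --password-file (file:// for a local path):
#   /opt/sqoop/bin/sqoop list-databases \
#       --connect jdbc:mysql://192.168.20.204:3306/ \
#       --username ad_user \
#       --password-file "file://$PWFILE"
ls -l "$PWFILE"
```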
Create the databases in Hive that correspond to the MySQL databases to be synced.
[work@Lan-Kvm-20157-Reptile2 ~]$ docker exec -it dc387ff5c56d /bin/bash
root@dc387ff5c56d:/opt# hive
hive> show databases;
OK
default
Time taken: 1.081 seconds, Fetched: 1 row(s)
# Create the corresponding databases in Hive
hive> create database i_member;
OK
Time taken: 0.155 seconds
hive> create database i_order;
OK
Time taken: 0.074 seconds
hive> create database i_commodity;
OK
Time taken: 0.074 seconds
hive> create database i_marketing;
OK
Time taken: 0.058 seconds
hive> create database i_operation;
OK
Time taken: 0.069 seconds
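The five `CREATE DATABASE` statements above can also be issued as one batch from the shell instead of an interactive session; a sketch, assuming `hive` is on the `PATH` inside the container:

```shell
# Build one HiveQL batch that creates every database to be synced.
# IF NOT EXISTS makes the batch safe to re-run.
DBS="i_member i_order i_commodity i_marketing i_operation"

HQL=""
for db in $DBS; do
    HQL="${HQL}CREATE DATABASE IF NOT EXISTS ${db}; "
done
echo "$HQL"

# Run the whole batch in a single Hive session:
#   hive -e "$HQL"
```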
Import a MySQL table into Hive:
$ /opt/sqoop/bin/sqoop import \
  --connect jdbc:mysql://192.168.20.204:3306/i_member \
  --username ad_user \
  --password 'dl_ad!@#123' \
  --table t_member \
  --num-mappers 1 \
  --hive-import \
  --fields-terminated-by "\t" \
  --hive-overwrite \
  --hive-database i_member \
  --hive-table t_member
21/08/01 08:48:01 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 3.1657 seconds (0 bytes/sec)
21/08/01 08:48:01 INFO mapreduce.ImportJobBase: Retrieved 1000 records.
21/08/01 08:48:01 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `t_member` AS t LIMIT 1
21/08/01 08:48:01 WARN hive.TableDefWriter: Column create_time had to be cast to a less precise type in Hive
21/08/01 08:48:01 WARN hive.TableDefWriter: Column update_time had to be cast to a less precise type in Hive
21/08/01 08:48:01 INFO hive.HiveImport: Loading uploaded data into Hive
21/08/01 08:48:09 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
21/08/01 08:48:09 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
21/08/01 08:48:09 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
21/08/01 08:48:09 INFO hive.HiveImport: SLF4J: See for an explanation.
21/08/01 08:48:10 INFO hive.HiveImport:
21/08/01 08:48:10 INFO hive.HiveImport: Logging initialized using configuration in file:/opt/hive/conf/hive-log4j2.properties Async: true
21/08/01 08:48:12 INFO hive.HiveImport: OK
21/08/01 08:48:12 INFO hive.HiveImport: Time taken: 1.583 seconds
21/08/01 08:48:13 INFO hive.HiveImport: Loading data to table i_member.t_member
21/08/01 08:48:15 INFO hive.HiveImport: OK
21/08/01 08:48:15 INFO hive.HiveImport: Time taken: 2.832 seconds
21/08/01 08:48:16 INFO hive.HiveImport: Hive import complete.
# You can now open http://192.168.20.157:50070/explorer.html#/user/hive/warehouse/i_member.db to see the t_member table in the i_member database.
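With one table verified, the same import can be repeated for the remaining tables of a database. A sketch that builds the per-table command (the table list here is hypothetical; for whole databases Sqoop also ships an `import-all-tables` tool):

```shell
# Sketch: repeat the import for several tables of i_member.
# TABLES is a hypothetical list -- substitute your real table names.
TABLES="t_member t_member_addr"
DB=i_member

for t in $TABLES; do
    CMD="/opt/sqoop/bin/sqoop import \
  --connect jdbc:mysql://192.168.20.204:3306/${DB} \
  --username ad_user --password 'dl_ad!@#123' \
  --table ${t} --num-mappers 1 \
  --hive-import --hive-overwrite --fields-terminated-by '\t' \
  --hive-database ${DB} --hive-table ${t}"
    echo "$CMD"
    # eval "$CMD"   # uncomment to actually run against the live cluster
done
```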
Verify the imported data by querying it in Hive:
hive> use i_member;
OK
Time taken: 0.04 seconds
hive> show tables;
OK
t_member
Time taken: 0.064 seconds, Fetched: 1 row(s)
hive> select * from t_member;
OK
1 bywind1 password 2 15321761517 1 2 k8fy5od-tunhgvlx 1 2019-11-10 09:00:00.0 2019-11-23 10:22:01.0
2 bywind2 password 1 15321761517 2 1 u694h5d2r0j-znib 1 2019-11-10 09:00:00.0 2019-11-29 10:22:01.0
3 bywind3 password 2 15321761517 3 1 _5836va1ck7hx9mt 1 2019-11-10 09:00:00.0 2019-11-22 10:22:01.0
4 bywind4 password 2 15321761517 4 3 t8a2wf_o3k0zdeiu 1 2019-11-10 09:00:00.0 2019-11-27 10:22:01.0
5 bywind5 password 2 15321761517 5 1 -g0m6qku4sfv97yn 1 2019-11-10 09:00:00.0 2019-11-23 10:22:01.0
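The reverse direction (Hive back to MySQL) works the same way via `sqoop export`, pointing `--export-dir` at the table's warehouse directory seen in the HDFS browser above. A sketch; `t_member_copy` is a hypothetical target table that must already exist in MySQL with a matching schema:

```shell
# Sketch of the reverse path: export the Hive warehouse files back to MySQL.
# t_member_copy is a hypothetical, pre-created target table.
EXPORT_CMD="/opt/sqoop/bin/sqoop export \
  --connect jdbc:mysql://192.168.20.204:3306/i_member \
  --username ad_user --password 'dl_ad!@#123' \
  --table t_member_copy \
  --export-dir /user/hive/warehouse/i_member.db/t_member \
  --input-fields-terminated-by '\t' \
  --num-mappers 1"
echo "$EXPORT_CMD"
# eval "$EXPORT_CMD"   # uncomment to run against a live cluster
```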
Source: the ITPUB blog, http://blog.itpub.net/2035/viewspace-2807042/. Please credit the source when reposting.