19 [Online Log Analysis] Running sparkdemo.jar on YARN
1. Upload the previously built jar
[root@sht-sgmhadoopnn-01 spark]# pwd
/root/learnproject/app/spark
[root@sht-sgmhadoopnn-01 spark]# rz
rz waiting to receive.
Starting zmodem transfer. Press Ctrl+C to cancel.
Transferring sparkdemo.jar...
100% 164113 KB 421 KB/sec 00:06:29 0 Errors
2. Errors encountered
2.1 ERROR1: Exception in thread "main" java.lang.SecurityException: Invalid signature file digest for Manifest main attributes
A jar built by IDEA carries leftover signature files that break manifest verification; delete them from META-INF with zip (the patterns are quoted so zip, not the shell, expands them):
zip -d sparkdemo.jar 'META-INF/*.RSA' 'META-INF/*.DSA' 'META-INF/*.SF'
2.2 ERROR2: Exception in thread "main" java.lang.UnsupportedClassVersionError: com/learn/java/main/OnLineLogAnalysis2 : Unsupported major.minor version 52.0
The JDK on the YARN nodes is older than the JDK that compiled the jar. The runtime JDK must be the same version or newer: install a matching JDK on every node and point the JAVA_HOME setting in each node's hadoop-env.sh at it.
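The `major.minor version 52.0` in the error is the class-file version (52 = Java 8, 51 = Java 7, 50 = Java 6), stored big-endian at byte offsets 6-7 of every `.class` file. A quick way to read it without javap — the header bytes below are hand-crafted for illustration; in practice point `od` at a class extracted from sparkdemo.jar:

```shell
# Craft the first 8 bytes of a Java 8 class file: magic CAFEBABE,
# minor version 0x0000, major version 0x0034 (= 52, Java 8).
printf '\xca\xfe\xba\xbe\x00\x00\x00\x34' > Demo.class

# The two bytes at offset 6 hold the big-endian major version.
od -An -j6 -N2 -t u1 Demo.class | awk '{print "major version:", $1*256 + $2}'
# → major version: 52
```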
2.3 ERROR3: java.lang.NoSuchMethodError: com.google.common.base.Stopwatch.createStarted()Lcom/google/common/base/Stopwatch;
17/02/15 17:30:35 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.NoSuchMethodError: com.google.common.base.Stopwatch.createStarted()Lcom/google/common/base/Stopwatch;
java.lang.NoSuchMethodError: com.google.common.base.Stopwatch.createStarted()Lcom/google/common/base/Stopwatch;
at org.influxdb.impl.InfluxDBImpl.ping(InfluxDBImpl.java:178)
at org.influxdb.impl.InfluxDBImpl.version(InfluxDBImpl.java:201)
at com.learn.java.main.OnLineLogAnalysis2.main(OnLineLogAnalysis2.java:69)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:627)
A NoSuchMethodError like this means multiple Guava versions are probably on the classpath and an older one is being loaded first (Stopwatch.createStarted() was only added in Guava 15). Locate the old copies and replace them:
[root@sht-sgmhadoopnn-01 app]# pwd
/root/learnproject/app
[root@sht-sgmhadoopnn-01 app]# ll
total 470876
-rw-r--r-- 1 root root 7509833 Jan 16 22:11 AdminLTE.zip
drwxr-xr-x 12 root root 4096 Feb 14 11:21 hadoop
-rw-r--r-- 1 root root 197782815 Dec 24 21:16 hadoop-2.7.3.tar.gz
drwxr-xr-x 7 root root 4096 Feb 7 11:16 kafka-manager-1.3.2.1
-rw-r--r-- 1 root root 59682993 Dec 26 14:44 kafka-manager-1.3.2.1.zip
drwxr-xr-x 2 root root 4096 Jan 7 16:21 kafkaoffsetmonitor
drwxr-xr-x 2 777 root 4096 Feb 14 14:48 pid
drwxrwxr-x 4 1000 1000 4096 Oct 29 01:46 sbt
-rw-r--r-- 1 root root 1049906 Dec 25 21:29 sbt-0.13.13.tgz
drwxrwxr-x 6 root root 4096 Mar 4 2016 scala
-rw-r--r-- 1 root root 28678231 Mar 4 2016 scala-2.11.8.tgz
drwxr-xr-x 13 root root 4096 Feb 15 17:01 spark
-rw-r--r-- 1 root root 187426587 Nov 12 06:54 spark-2.0.2-bin-hadoop2.7.tgz
[root@sht-sgmhadoopnn-01 app]#
[root@sht-sgmhadoopnn-01 app]# find ./ -name '*guava*'
[root@sht-sgmhadoopnn-01 app]# mv ./hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar ./hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar.bak
[root@sht-sgmhadoopnn-01 app]# cp ./spark/libs/guava-20.0.jar ./hadoop/share/hadoop/yarn/lib/
[root@sht-sgmhadoopnn-01 app]# mv ./spark/jars/guava-14.0.1.jar ./spark/jars/guava-14.0.1.jar.bak
[root@sht-sgmhadoopnn-01 app]# cp ./spark/libs/guava-20.0.jar ./spark/jars/
[root@sht-sgmhadoopnn-01 app]# mv ./hadoop/share/hadoop/common/lib/guava-11.0.2.jar ./hadoop/share/hadoop/common/lib/guava-11.0.2.jar.bak
[root@sht-sgmhadoopnn-01 app]# cp ./spark/libs/guava-20.0.jar ./hadoop/share/hadoop/common/lib/
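Before swapping jars it helps to see every Guava copy at once. A self-contained sketch, using a fake directory tree that stands in for the real hadoop/spark installs above:

```shell
# Stand-in tree mirroring the layout above (the real paths would be
# ./hadoop/share/hadoop/yarn/lib, ./spark/jars, and so on).
mkdir -p demo_cp/hadoop/lib demo_cp/spark/jars
touch demo_cp/hadoop/lib/guava-11.0.2.jar demo_cp/spark/jars/guava-14.0.1.jar

# Quote the pattern so the shell does not expand it before find runs;
# every listed version must end up identical (or .bak'd) after the fix.
find demo_cp -name 'guava-*.jar' | sort
```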
3. Submit the jar to run in the background
[root@sht-sgmhadoopnn-01 spark]#
[root@sht-sgmhadoopnn-01 spark]# nohup /root/learnproject/app/spark/bin/spark-submit \
> --name onlineLogsAnalysis \
> --master yarn \
> --deploy-mode cluster \
> --conf "spark.scheduler.mode=FAIR" \
> --conf "spark.sql.codegen=true" \
> --driver-memory 2G \
> --executor-memory 2G \
> --executor-cores 1 \
> --num-executors 3 \
> --class com.learn.java.main.OnLineLogAnalysis2 \
> /root/learnproject/app/spark/sparkdemo.jar &
[1] 22926
[root@sht-sgmhadoopnn-01 spark]# nohup: ignoring input and appending output to `nohup.out'
[root@sht-sgmhadoopnn-01 spark]#
[root@sht-sgmhadoopnn-01 spark]#
[root@sht-sgmhadoopnn-01 spark]# tail -f nohup.out
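The submit-and-watch pattern above (nohup … & then tailing the log) can be sketched with a stand-in command; replace the `sh -c` payload with the real spark-submit invocation:

```shell
# Background a long-running command, detached from the terminal,
# with all output collected in one log file (stand-in for spark-submit).
nohup sh -c 'echo "application submitted"; sleep 1' > submit.out 2>&1 &
pid=$!

# In practice you would `tail -f submit.out`; here we just wait and peek.
wait "$pid"
tail -n 1 submit.out
# → application submitted
```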
4. View the runtime logs in the YARN web UI
ApplicationMaster: this link opens the Spark web UI (served by the Spark history server)
logs: view the stderr and stdout logs (System.out.println output goes to the stdout log)
5. View the Spark history web UI
6. View the DashBoard for real-time visualization
From "ITPUB Blog", link: http://blog.itpub.net/30089851/viewspace-2133917/. Please credit the source when reposting; otherwise legal action may be pursued.