Notes on a "No such file or directory" problem with spark2-submit on CDH Spark2
On the test CDH Spark2 cluster, a Spark Streaming job was submitted with the following command:
spark2-submit \
  --class com.telenav.dataplatform.demo.realtimecases.WeatherAlerts \
  --master yarn --deploy-mode cluster \
  /usr/local/sparkProject/realtimeCases-0.0.1-SNAPSHOT.jar
Error:
17/03/02 21:01:56 INFO cluster.YarnClusterScheduler: Adding task set 0.0 with 1 tasks
17/03/02 21:01:56 WARN net.ScriptBasedMapping: Exception running /etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py 172.16.102.64
java.io.IOException: Cannot run program "/etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py" (in directory "/yarn/nm/usercache/spark/appcache/application_1488459089260_0003/container_1488459089260_0003_01_000001"): error=2, No such file or directory
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:548)
    at org.apache.hadoop.util.Shell.run(Shell.java:504)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:786)
    at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.runResolveCommand(ScriptBasedMapping.java:251)
    at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.resolve(ScriptBasedMapping.java:188)
    at org.apache.hadoop.net.CachedDNSToSwitchMapping.resolve(CachedDNSToSwitchMapping.java:119)
    at org.apache.hadoop.yarn.util.RackResolver.coreResolve(RackResolver.java:101)
    at org.apache.hadoop.yarn.util.RackResolver.resolve(RackResolver.java:81)
    at org.apache.spark.scheduler.cluster.YarnScheduler.getRackForHost(YarnScheduler.scala:37)
    at org.apache.spark.scheduler.TaskSetManager$$anonfun$org$apache$spark$scheduler$TaskSetManager$$addPendingTask$1.apply(TaskSetManager.scala:201)
    at org.apache.spark.scheduler.TaskSetManager$$anonfun$org$apache$spark$scheduler$TaskSetManager$$addPendingTask$1.apply(TaskSetManager.scala:182)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at org.apache.spark.scheduler.TaskSetManager.org$apache$spark$scheduler$TaskSetManager$$addPendingTask(TaskSetManager.scala:182)
    at org.apache.spark.scheduler.TaskSetManager$$anonfun$1.apply$mcVI$sp(TaskSetManager.scala:161)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
    at org.apache.spark.scheduler.TaskSetManager.<init>(TaskSetManager.scala:160)
    at org.apache.spark.scheduler.TaskSchedulerImpl.createTaskSetManager(TaskSchedulerImpl.scala:222)
    at org.apache.spark.scheduler.TaskSchedulerImpl.submitTasks(TaskSchedulerImpl.scala:186)
    at org.apache.spark.scheduler.DAGScheduler.submitMissingTasks(DAGScheduler.scala:1058)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:933)
    at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:873)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1632)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1624)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1613)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Caused by: java.io.IOException: error=2, No such file or directory
    at java.lang.UNIXProcess.forkAndExec(Native Method)
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:247)
    at java.lang.ProcessImpl.start(ProcessImpl.java:134)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
Troubleshooting:
1. Analyze the key message:
17/03/02 21:01:56 WARN net.ScriptBasedMapping: Exception running /etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py 172.16.102.64
java.io.IOException: Cannot run program "/etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py" (in directory "/yarn/nm/usercache/spark/appcache/application_1488459089260_0003/container_1488459089260_0003_01_000001"): error=2, No such file or directory
It means the rack-topology script topology.py, referenced by the deployed Spark2 client configuration, does not exist on the machine with that IP (172.16.102.64).
2. Log on to that machine and confirm that the file really is missing, for example as sketched below.
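A minimal check, assuming SSH access to the node reported in the warning (the IP is taken from the log above):
# Does the topology script exist on the node named in the warning?
ssh 172.16.102.64 "ls -l /etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py"
# On the affected node this returns "No such file or directory";
# on machine 01, where the Spark2 client configuration was deployed, the file is present.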
3. Copy the whole configuration directory from machine 01 to the other four machines (the scp destinations below are truncated in the original post; a complete sketch follows them):
scp -r /etc/spark2/conf.cloudera.spark2_on_yarn
scp -r /etc/spark2/conf.cloudera.spark2_on_yarn
scp -r /etc/spark2/conf.cloudera.spark2_on_yarn
scp -r /etc/spark2/conf.cloudera.spark2_on_yarn
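A complete form would look roughly like this; node02 through node05 are hypothetical hostnames standing in for the other four machines, and the command assumes a user with write access to /etc/spark2 on the targets:
# Distribute the Spark2 client configuration to the remaining nodes (placeholder hostnames).
for host in node02 node03 node04 node05; do
  scp -r /etc/spark2/conf.cloudera.spark2_on_yarn ${host}:/etc/spark2/
done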
Verification:
Re-submit the job with the spark2-submit command above; the ScriptBasedMapping warning no longer appears and the job runs normally.
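As a quick cluster-wide double-check before re-submitting, the script's presence can be confirmed on every node in one pass (hostnames are placeholders, not from the original post):
# Verify topology.py now exists on all five machines.
for host in node01 node02 node03 node04 node05; do
  ssh ${host} "ls /etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py"
done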
Source: ITPUB blog, http://blog.itpub.net/30089851/viewspace-2134627/ (please credit the source when reposting).