Word frequency counting with Spark
Here is a simple statement, not yet optimized:
scala> sc.
| textFile("/etc/profile").
| flatMap((s:String)=>s.split("\\s")).
| map(_.toUpperCase).
| map((s:String)=>(s, 1)).
| filter((pair)=>pair._1.forall((ch)=>ch>'A'&&ch<'Z')).
| reduceByKey(_+_).
| sortByKey().
| foreach(println)
Note that this code can be tightened further with Scala's placeholder syntax:
scala> sc.
| textFile("/etc/profile").
| flatMap(_.split("\\s")).
| map(_.toUpperCase).
| map((_, 1)).
| filter(_._1.forall((ch)=>ch>'A'&&ch<'Z')).
| reduceByKey(_+_).
| sortByKey().
| foreach(println)
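One detail worth noting: the filter uses strict comparisons (ch > 'A' && ch < 'Z'), so only tokens made entirely of characters strictly between 'A' and 'Z' survive. Words containing 'A' or 'Z' (or any digit or punctuation) are dropped, while the empty string passes forall vacuously, which is why (,260) appears in the results below. As a rough sketch of my own (the object name, app name, master URL and the use of collect are assumptions, not part of the original shell session), the same pipeline could be packaged as a standalone application:

import org.apache.spark.{SparkConf, SparkContext}

// Minimal standalone sketch of the same word-count pipeline.
// Object name, app name and master URL are illustrative assumptions.
object ProfileWordCount {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("ProfileWordCount").setMaster("local[2]"))

    sc.textFile("/etc/profile").
      flatMap(_.split("\\s")).
      map(_.toUpperCase).
      map((_, 1)).
      filter(_._1.forall((ch) => ch > 'A' && ch < 'Z')). // same strict bounds as the shell version
      reduceByKey(_ + _).
      sortByKey().
      collect(). // bring the counts back to the driver before printing
      foreach(println)

    sc.stop()
  }
}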
The output is as follows:
15/03/06 08:50:44 INFO MemoryStore: ensureFreeSpace(75904) called with curMem=259812, maxMem=277842493
15/03/06 08:50:44 INFO MemoryStore: Block broadcast_10 stored as values in memory (estimated size 74.1 KB, free 264.7 MB)
15/03/06 08:50:44 INFO FileInputFormat: Total input paths to process : 1
15/03/06 08:50:44 INFO SparkContext: Starting job: sortByKey at <console>:20
15/03/06 08:50:44 INFO DAGScheduler: Registering RDD 25 (filter at <console>:18)
15/03/06 08:50:44 INFO DAGScheduler: Got job 4 (sortByKey at <console>:20) with 2 output partitions (allowLocal=false)
15/03/06 08:50:44 INFO DAGScheduler: Final stage: Stage 10(sortByKey at <console>:20)
15/03/06 08:50:44 INFO DAGScheduler: Parents of final stage: List(Stage 11)
15/03/06 08:50:44 INFO DAGScheduler: Missing parents: List(Stage 11)
15/03/06 08:50:44 INFO DAGScheduler: Submitting Stage 11 (FilteredRDD[25] at filter at <console>:18), which has no missing parents
15/03/06 08:50:44 INFO MemoryStore: ensureFreeSpace(3736) called with curMem=335716, maxMem=277842493
15/03/06 08:50:44 INFO MemoryStore: Block broadcast_11 stored as values in memory (estimated size 3.6 KB, free 264.6 MB)
15/03/06 08:50:44 INFO DAGScheduler: Submitting 2 missing tasks from Stage 11 (FilteredRDD[25] at filter at <console>:18)
15/03/06 08:50:44 INFO TaskSchedulerImpl: Adding task set 11.0 with 2 tasks
15/03/06 08:50:44 INFO TaskSetManager: Starting task 0.0 in stage 11.0 (TID 16, localhost, PROCESS_LOCAL, 1162 bytes)
15/03/06 08:50:44 INFO TaskSetManager: Starting task 1.0 in stage 11.0 (TID 17, localhost, PROCESS_LOCAL, 1162 bytes)
15/03/06 08:50:44 INFO Executor: Running task 1.0 in stage 11.0 (TID 17)
15/03/06 08:50:44 INFO Executor: Running task 0.0 in stage 11.0 (TID 16)
15/03/06 08:50:44 INFO HadoopRDD: Input split: file:/etc/profile:1189+1189
15/03/06 08:50:44 INFO HadoopRDD: Input split: file:/etc/profile:0+1189
15/03/06 08:50:44 INFO Executor: Finished task 1.0 in stage 11.0 (TID 17). 1863 bytes result sent to driver
15/03/06 08:50:44 INFO TaskSetManager: Finished task 1.0 in stage 11.0 (TID 17) in 43 ms on localhost (1/2)
15/03/06 08:50:44 INFO Executor: Finished task 0.0 in stage 11.0 (TID 16). 1863 bytes result sent to driver
15/03/06 08:50:44 INFO TaskSetManager: Finished task 0.0 in stage 11.0 (TID 16) in 51 ms on localhost (2/2)
15/03/06 08:50:44 INFO DAGScheduler: Stage 11 (filter at <console>:18) finished in 0.054 s
15/03/06 08:50:44 INFO DAGScheduler: looking for newly runnable stages
15/03/06 08:50:44 INFO DAGScheduler: running: Set()
15/03/06 08:50:44 INFO DAGScheduler: waiting: Set(Stage 10)
15/03/06 08:50:44 INFO DAGScheduler: failed: Set()
15/03/06 08:50:44 INFO TaskSchedulerImpl: Removed TaskSet 11.0, whose tasks have all completed, from pool
15/03/06 08:50:44 INFO DAGScheduler: Missing parents for Stage 10: List()
15/03/06 08:50:44 INFO DAGScheduler: Submitting Stage 10 (MapPartitionsRDD[28] at sortByKey at <console>:20), which is now runnable
15/03/06 08:50:44 INFO MemoryStore: ensureFreeSpace(2856) called with curMem=339452, maxMem=277842493
15/03/06 08:50:44 INFO MemoryStore: Block broadcast_12 stored as values in memory (estimated size 2.8 KB, free 264.6 MB)
15/03/06 08:50:44 INFO DAGScheduler: Submitting 2 missing tasks from Stage 10 (MapPartitionsRDD[28] at sortByKey at <console>:20)
15/03/06 08:50:44 INFO TaskSchedulerImpl: Adding task set 10.0 with 2 tasks
15/03/06 08:50:44 INFO TaskSetManager: Starting task 0.0 in stage 10.0 (TID 18, localhost, PROCESS_LOCAL, 948 bytes)
15/03/06 08:50:44 INFO TaskSetManager: Starting task 1.0 in stage 10.0 (TID 19, localhost, PROCESS_LOCAL, 948 bytes)
15/03/06 08:50:44 INFO Executor: Running task 0.0 in stage 10.0 (TID 18)
15/03/06 08:50:44 INFO Executor: Running task 1.0 in stage 10.0 (TID 19)
15/03/06 08:50:44 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight: 50331648, targetRequestSize: 10066329
15/03/06 08:50:44 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
15/03/06 08:50:44 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 0 remote fetches in 0 ms
15/03/06 08:50:44 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight: 50331648, targetRequestSize: 10066329
15/03/06 08:50:44 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
15/03/06 08:50:44 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 0 remote fetches in 0 ms
15/03/06 08:50:44 INFO Executor: Finished task 0.0 in stage 10.0 (TID 18). 1165 bytes result sent to driver
15/03/06 08:50:44 INFO TaskSetManager: Finished task 0.0 in stage 10.0 (TID 18) in 18 ms on localhost (1/2)
15/03/06 08:50:44 INFO Executor: Finished task 1.0 in stage 10.0 (TID 19). 1293 bytes result sent to driver
15/03/06 08:50:44 INFO TaskSetManager: Finished task 1.0 in stage 10.0 (TID 19) in 28 ms on localhost (2/2)
15/03/06 08:50:44 INFO DAGScheduler: Stage 10 (sortByKey at <console>:20) finished in 0.031 s
15/03/06 08:50:44 INFO TaskSchedulerImpl: Removed TaskSet 10.0, whose tasks have all completed, from pool
15/03/06 08:50:44 INFO SparkContext: Job finished: sortByKey at <console>:20, took 0.107864348 s
15/03/06 08:50:44 INFO SparkContext: Starting job: foreach at <console>:21
15/03/06 08:50:44 INFO MapOutputTrackerMaster: Size of output statuses for shuffle 4 is 144 bytes
15/03/06 08:50:44 INFO DAGScheduler: Registering RDD 26 (reduceByKey at <console>:19)
15/03/06 08:50:44 INFO DAGScheduler: Got job 5 (foreach at <console>:21) with 2 output partitions (allowLocal=false)
15/03/06 08:50:44 INFO DAGScheduler: Final stage: Stage 12(foreach at <console>:21)
15/03/06 08:50:44 INFO DAGScheduler: Parents of final stage: List(Stage 14)
15/03/06 08:50:44 INFO DAGScheduler: Missing parents: List(Stage 14)
15/03/06 08:50:44 INFO DAGScheduler: Submitting Stage 14 (ShuffledRDD[26] at reduceByKey at <console>:19), which has no missing parents
15/03/06 08:50:44 INFO MemoryStore: ensureFreeSpace(2472) called with curMem=342308, maxMem=277842493
15/03/06 08:50:44 INFO MemoryStore: Block broadcast_13 stored as values in memory (estimated size 2.4 KB, free 264.6 MB)
15/03/06 08:50:44 INFO DAGScheduler: Submitting 2 missing tasks from Stage 14 (ShuffledRDD[26] at reduceByKey at <console>:19)
15/03/06 08:50:44 INFO TaskSchedulerImpl: Adding task set 14.0 with 2 tasks
15/03/06 08:50:44 INFO TaskSetManager: Starting task 0.0 in stage 14.0 (TID 20, localhost, PROCESS_LOCAL, 937 bytes)
15/03/06 08:50:44 INFO TaskSetManager: Starting task 1.0 in stage 14.0 (TID 21, localhost, PROCESS_LOCAL, 937 bytes)
15/03/06 08:50:44 INFO Executor: Running task 1.0 in stage 14.0 (TID 21)
15/03/06 08:50:44 INFO Executor: Running task 0.0 in stage 14.0 (TID 20)
15/03/06 08:50:44 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight: 50331648, targetRequestSize: 10066329
15/03/06 08:50:44 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
15/03/06 08:50:44 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 0 remote fetches in 0 ms
15/03/06 08:50:44 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight: 50331648, targetRequestSize: 10066329
15/03/06 08:50:44 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
15/03/06 08:50:44 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 0 remote fetches in 1 ms
15/03/06 08:50:44 INFO Executor: Finished task 1.0 in stage 14.0 (TID 21). 996 bytes result sent to driver
15/03/06 08:50:44 INFO TaskSetManager: Finished task 1.0 in stage 14.0 (TID 21) in 14 ms on localhost (1/2)
15/03/06 08:50:44 INFO Executor: Finished task 0.0 in stage 14.0 (TID 20). 996 bytes result sent to driver
15/03/06 08:50:44 INFO TaskSetManager: Finished task 0.0 in stage 14.0 (TID 20) in 21 ms on localhost (2/2)
15/03/06 08:50:44 INFO TaskSchedulerImpl: Removed TaskSet 14.0, whose tasks have all completed, from pool
15/03/06 08:50:44 INFO DAGScheduler: Stage 14 (reduceByKey at <console>:19) finished in 0.022 s
15/03/06 08:50:44 INFO DAGScheduler: looking for newly runnable stages
15/03/06 08:50:44 INFO DAGScheduler: running: Set()
15/03/06 08:50:44 INFO DAGScheduler: waiting: Set(Stage 12)
15/03/06 08:50:44 INFO DAGScheduler: failed: Set()
15/03/06 08:50:44 INFO DAGScheduler: Missing parents for Stage 12: List()
15/03/06 08:50:44 INFO DAGScheduler: Submitting Stage 12 (ShuffledRDD[29] at sortByKey at <console>:20), which is now runnable
15/03/06 08:50:44 INFO MemoryStore: ensureFreeSpace(2304) called with curMem=344780, maxMem=277842493
15/03/06 08:50:44 INFO MemoryStore: Block broadcast_14 stored as values in memory (estimated size 2.3 KB, free 264.6 MB)
15/03/06 08:50:44 INFO DAGScheduler: Submitting 2 missing tasks from Stage 12 (ShuffledRDD[29] at sortByKey at <console>:20)
15/03/06 08:50:44 INFO TaskSchedulerImpl: Adding task set 12.0 with 2 tasks
15/03/06 08:50:44 INFO TaskSetManager: Starting task 0.0 in stage 12.0 (TID 22, localhost, PROCESS_LOCAL, 948 bytes)
15/03/06 08:50:44 INFO TaskSetManager: Starting task 1.0 in stage 12.0 (TID 23, localhost, PROCESS_LOCAL, 948 bytes)
15/03/06 08:50:45 INFO Executor: Running task 1.0 in stage 12.0 (TID 23)
15/03/06 08:50:45 INFO Executor: Running task 0.0 in stage 12.0 (TID 22)
15/03/06 08:50:45 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight: 50331648, targetRequestSize: 10066329
15/03/06 08:50:45 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
15/03/06 08:50:45 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 0 remote fetches in 0 ms
15/03/06 08:50:45 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight: 50331648, targetRequestSize: 10066329
15/03/06 08:50:45 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
15/03/06 08:50:45 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 0 remote fetches in 0 ms
(LOGIN,2)
(MERGING,1)
(MUCH,1)
(NEED,1)
(NOT,1)
(PREVENT,1)
(RESERVED,1)
(SCRIPT,1)
(SETS,1)
(SETUP,1)
(SHELL,2)
(SYSTEM,2)
(THE,1)
(THEN,8)
(THIS,3)
(THRESHOLD,1)
(TO,5)
(UIDGID,1)
(UNLESS,1)
(UNSET,2)
(USER,1)
(WE,1)
(WIDE,1)
(WILL,1)
(YOU,3)
(YOUR,1)
15/03/06 08:50:45 INFO Executor: Finished task 1.0 in stage 12.0 (TID 23). 826 bytes result sent to driver
15/03/06 08:50:45 INFO TaskSetManager: Finished task 1.0 in stage 12.0 (TID 23) in 13 ms on localhost (1/2)
(,260)
(BETTER,1)
(BY,1)
(CHECK,1)
(COULD,1)
(CURRENT,1)
(CUSTOM,1)
(DO,1)
(DONE,1)
(ELSE,5)
(ENVIRONMENT,1)
(EXPORT,15)
(FI,8)
(FILE,2)
(FOR,5)
(FUNCTIONS,1)
(FUTURE,1)
(GET,1)
(GO,1)
(GOOD,1)
(HISTCONTROL,1)
(I,2)
(IF,8)
(IN,6)
(IS,1)
(IT,1)
(KNOW,1)
(KSH,1)
15/03/06 08:50:45 INFO Executor: Finished task 0.0 in stage 12.0 (TID 22). 826 bytes result sent to driver
15/03/06 08:50:45 INFO TaskSetManager: Finished task 0.0 in stage 12.0 (TID 22) in 27 ms on localhost (2/2)
15/03/06 08:50:45 INFO TaskSchedulerImpl: Removed TaskSet 12.0, whose tasks have all completed, from pool
15/03/06 08:50:45 INFO DAGScheduler: Stage 12 (foreach at <console>:21) finished in 0.025 s
15/03/06 08:50:45 INFO SparkContext: Job finished: foreach at <console>:21, took 0.07397057 s
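Because foreach(println) runs on the executors rather than on the driver, the counts above are printed in two chunks (one per partition) and are interleaved with the INFO log lines. A small sketch of how to get cleaner output (wordCounts is a hypothetical name for the sorted RDD built by the pipeline above, and setLogLevel is only available on Spark 1.4 and later):

sc.setLogLevel("WARN")                 // Spark 1.4+; silences the INFO noise in the shell
wordCounts.collect().foreach(println)  // gather on the driver so the counts print contiguously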
The following code prints the names of the nodes that participated in the computation. Note that you need to run start-all and launch the shell with the --master option:
spark-shell --master spark://bluejoe0:7077
The code is as follows (rdd here is any previously created RDD; the !! operator needs scala.sys.process in scope):
import scala.sys.process._
rdd.mapPartitions(_=>Array[String](("hostname" !!).trim).iterator, false).collect
res28: Array[String] = Array(bluejoe4, bluejoe5)
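An alternative that avoids spawning an external process is to ask the JVM for the hostname directly. This is a sketch of my own using java.net.InetAddress, not something from the original session (rdd again stands for any previously created RDD):

import java.net.InetAddress

// Resolve the hostname inside the JVM instead of shelling out
// to the external `hostname` command; distinct removes duplicates
// when several partitions run on the same node.
rdd.mapPartitions(
  _ => Iterator(InetAddress.getLocalHost.getHostName),
  preservesPartitioning = false
).collect().distinct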