Using Spark in Docker

Posted by weixin_34019929 on 2018-10-31
Available Spark images can be browsed at
https://hub.docker.com/r/mesosphere/spark/tags/
1. Pull the image
sudo docker pull mesosphere/spark:2.4.0-2.2.1-3-hadoop-2.6
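Once the pull finishes, you can confirm the image is present locally (a generic Docker check, not specific to this image):
sudo docker images mesosphere/spark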
2. Start a container from the image
sudo docker run --name spark2 --hostname spark2 -it mesosphere/spark:2.4.0-2.2.1-3-hadoop-2.6 bash
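If you also want the Spark web UI reachable from the host, a variation of the same command can publish port 4040 (an assumption on my part: 4040 is Spark's default driver UI port, and it is only bound while an application such as spark-shell is actually running):
sudo docker run --name spark2 --hostname spark2 -p 4040:4040 -it mesosphere/spark:2.4.0-2.2.1-3-hadoop-2.6 bash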
3. Configure the environment variables inside the container
root@spark2:/opt/spark/dist# export PATH=$JAVA_HOME/bin:$PATH
root@spark2:/opt/spark/dist# export SPARK_HOME=/opt/spark/dist
root@spark2:/opt/spark/dist# export PATH=$SPARK_HOME/bin:$PATH
root@spark2:/opt/spark/dist# spark-shell
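As a quick smoke test (assuming the shell starts normally and defines the usual sc SparkContext, as spark-shell does by default), summing the numbers 1 to 100 at the scala> prompt should return 5050:
scala> sc.parallelize(1 to 100).reduce(_ + _)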

