There are four ways to run Spark:
- Local
- Standalone
- YARN
- Mesos
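Which of these you get is controlled by the `--master` URL passed to `spark-shell` (or `spark-submit`). A rough sketch of the four forms follows; the host names and ports are placeholders, not real endpoints, so these commands assume a Spark installation and a reachable cluster:

```shell
# Local mode: everything in one JVM, one worker thread per logical core.
spark-shell --master "local[*]"

# Standalone mode: connect to a Spark Standalone Master (default port 7077).
# "master-host" is a placeholder for your Master's address.
spark-shell --master spark://master-host:7077

# YARN mode: the ResourceManager is located via the Hadoop config
# pointed to by HADOOP_CONF_DIR, so no host appears in the URL.
spark-shell --master yarn

# Mesos mode: connect to a Mesos master (default port 5050).
# "mesos-host" is a placeholder for your Mesos master's address.
spark-shell --master mesos://mesos-host:5050
```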
When Spark runs in local mode (as you're doing on your laptop), no separate Spark Master, Spark Workers, or Executors are launched.
Instead, a single Java Virtual Machine (JVM) is launched with one Executor, whose ID is <driver>. This special Executor runs the Driver (which is the "Spark shell" application in this instance) and also
runs our Scala code. By default, this single Executor is started with X threads, where X equals the number of cores on your machine.
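A minimal plain-Scala sketch (no Spark required) of where that X comes from: the `local[*]` default asks the JVM for the logical core count, the same value this snippet prints.

```scala
// Minimal sketch: in local mode the default master is "local[*]",
// i.e. one executor thread per logical core. Plain Scala is enough
// to see which number the shell would pick on this machine.
object LocalParallelism {
  // The thread count "local[*]" resolves to: the JVM's logical core count.
  def defaultLocalThreads: Int = Runtime.getRuntime.availableProcessors()

  def main(args: Array[String]): Unit =
    println(s"local[*] on this machine behaves like local[$defaultLocalThreads]")
}
```

If you want fewer threads than cores (say, to leave your laptop responsive), you can pass an explicit count instead, e.g. `--master "local[2]"`.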
Local mode is used when you want to run Spark on a single machine rather than on a distributed cluster. Note that local mode is different from Standalone mode, which is still designed for a cluster setup.
To summarize, in local mode the Spark shell application (aka the Driver) and the single Spark Executor run within the same JVM. More precisely, the one Executor that is launched is named <driver>, and it runs both the driver code and our Spark Scala transformations and actions.
So, although there is no Master UI in local mode, if you are curious, here is a screenshot of what a Master UI looks like:

https://blueplastic.gitbooks.io/how-to-light-your-spark-on-a-stick/content/spark_web_uis/spark_stages_ui.html