Deployment Environments — Run Modes
A Spark application is composed of a driver and executors that can run locally (in a single JVM) or using cluster resources (CPU, RAM and disk managed by a cluster manager).
Note: You can specify where to run the driver using the deploy mode (the --deploy-mode option of spark-submit or the spark.submit.deployMode Spark property).
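As an illustration, a cluster-mode submission might look like the following sketch. The master URL, application class and jar name are placeholders; --deploy-mode, --conf and spark.submit.deployMode are standard spark-submit mechanisms.

```shell
# Run the driver on a cluster node (cluster deploy mode); the class
# and jar are placeholders for your application.
./bin/spark-submit \
  --master spark://master:7077 \
  --deploy-mode cluster \
  --class org.example.MyApp \
  my-app.jar

# Equivalent, using the Spark property instead of the option:
./bin/spark-submit \
  --master spark://master:7077 \
  --conf spark.submit.deployMode=cluster \
  --class org.example.MyApp \
  my-app.jar
```

Omitting both defaults to client deploy mode, i.e. the driver runs on the machine that executed spark-submit.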
Master URLs
Spark supports the following master URLs (see private object SparkMasterRegex):
- local, local[N] and local[*] for Spark local
- local[N, maxRetries] for Spark local-with-retries
- local-cluster[N, cores, memory] for simulating a Spark cluster of N executors (threads), cores CPUs and memory locally (aka Spark local-cluster)
- spark://host:port,host1:port1,… for connecting to Spark Standalone cluster(s)
- mesos:// for Spark on Mesos cluster
- yarn for Spark on YARN
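The URL shapes above can be checked with patterns modeled on (not copied from) Spark's private SparkMasterRegex object. The function name and messages below are illustrative, not Spark's actual API:

```shell
#!/usr/bin/env bash
# Sketch: classify a master URL into its run mode, using regexes
# approximating the ones in Spark's SparkMasterRegex.
classify_master() {
  local url=$1
  local local_n='^local\[([0-9]+|\*)\]$'
  local local_retries='^local\[([0-9]+|\*), ?([0-9]+)\]$'
  local local_cluster='^local-cluster\[([0-9]+), ?([0-9]+), ?([0-9]+)\]$'
  if [[ $url == local ]]; then
    echo "local (1 thread)"
  elif [[ $url =~ $local_retries ]]; then
    echo "local (${BASH_REMATCH[1]} threads, ${BASH_REMATCH[2]} retries)"
  elif [[ $url =~ $local_n ]]; then
    echo "local (${BASH_REMATCH[1]} threads)"
  elif [[ $url =~ $local_cluster ]]; then
    echo "local-cluster (${BASH_REMATCH[1]} executors, ${BASH_REMATCH[2]} cores, ${BASH_REMATCH[3]} memory)"
  elif [[ $url == spark://* ]]; then
    echo "standalone"
  elif [[ $url == mesos://* ]]; then
    echo "mesos"
  elif [[ $url == yarn ]]; then
    echo "yarn"
  else
    echo "invalid"
  fi
}

classify_master 'local[*]'         # -> local (* threads)
classify_master 'local[2, 3]'      # -> local (2 threads, 3 retries)
classify_master 'spark://host:7077'  # -> standalone
```

Note that the retries pattern must be tried before the plain local[N] pattern, since both start with local[.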
You can specify the master URL of a Spark application using the --master command-line option, the spark.master property, or programmatically with SparkConf.setMaster before the SparkContext is created.
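For example (the master URLs and application jar below are placeholders; --master, --conf and conf/spark-defaults.conf are standard spark-submit mechanisms):

```shell
# 1. Command-line option of spark-submit / spark-shell:
./bin/spark-shell --master local[4]

# 2. Spark property, on the command line or in conf/spark-defaults.conf:
./bin/spark-submit --conf spark.master=spark://host:7077 my-app.jar

# 3. Programmatically, before the SparkContext is created (Scala):
#    new SparkConf().setMaster("local[*]")
```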