# SparkLauncher — Launching Spark Applications Programmatically

SparkLauncher is an interface for launching Spark applications programmatically, i.e. from code rather than through spark-submit directly. It uses a builder pattern to configure a Spark application and launch it as a child process using spark-submit:

```scala
import org.apache.spark.launcher.SparkLauncher

val command = new SparkLauncher()
  .setAppResource("SparkPi")
  .setVerbose(true)

val appHandle = command.startApplication()
```
SparkLauncher belongs to the org.apache.spark.launcher package in the spark-launcher build module.
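To use SparkLauncher in your own project, add the spark-launcher artifact as a dependency. A minimal sbt sketch (the version is an assumption; match it and the Scala suffix to the Spark distribution you launch against):

```scala
// build.sbt (the version below is a placeholder; use your Spark version)
libraryDependencies += "org.apache.spark" %% "spark-launcher" % "2.4.8"
```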
Internally, SparkLauncher uses SparkSubmitCommandBuilder to build the spark-submit command that launches the Spark application.
| Setter | Description |
|---|---|
| `addAppArgs(String... args)` | Adds command-line arguments for a Spark application. |
| `addFile(String file)` | Adds a file to be submitted with a Spark application. |
| `addJar(String jar)` | Adds a jar file to be submitted with the application. |
| `addPyFile(String file)` | Adds a Python file / zip / egg to be submitted with a Spark application. |
| `addSparkArg(String arg)` | Adds a no-value argument to the Spark invocation. |
| `addSparkArg(String name, String value)` | Adds an argument with a value to the Spark invocation. It recognizes known command-line arguments, i.e. `--master`, `--properties-file`, `--conf`, `--class`, `--jars`, `--files` and `--py-files`. |
| `directory(File dir)` | Sets the working directory of spark-submit. |
| `redirectError()` | Redirects stderr to stdout. |
| `redirectError(File errFile)` | Redirects error output to the specified `errFile` file. |
| `redirectError(ProcessBuilder.Redirect to)` | Redirects error output to the specified `to` Redirect. |
| `redirectOutput(File outFile)` | Redirects output to the specified `outFile` file. |
| `redirectOutput(ProcessBuilder.Redirect to)` | Redirects standard output to the specified `to` Redirect. |
| `redirectToLog(String loggerName)` | Sets all output to be logged and redirected to a logger with the specified name. |
| `setAppName(String appName)` | Sets the name of a Spark application. |
| `setAppResource(String resource)` | Sets the main application resource, i.e. the location of a jar file for Scala/Java applications. |
| `setConf(String key, String value)` | Sets a Spark property. Expects `key` to start with the `spark.` prefix. |
| `setDeployMode(String mode)` | Sets the deploy mode. |
| `setJavaHome(String javaHome)` | Sets a custom `JAVA_HOME`. |
| `setMainClass(String mainClass)` | Sets the main class. |
| `setMaster(String master)` | Sets the master URL. |
| `setPropertiesFile(String path)` | Sets the internal `propertiesFile`. |
| `setSparkHome(String sparkHome)` | Sets a custom `SPARK_HOME`. |
| `setVerbose(boolean verbose)` | Enables verbose reporting for SparkSubmit. |
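Taken together, a typical invocation chains several of these setters. A minimal sketch, where all paths, class names and values are placeholder assumptions (only the method names come from the API above):

```scala
import org.apache.spark.launcher.SparkLauncher

// All paths, class names and values below are placeholders.
val process = new SparkLauncher()
  .setSparkHome("/opt/spark")               // custom SPARK_HOME
  .setAppResource("/path/to/my-app.jar")    // main application jar
  .setMainClass("com.example.MySparkApp")   // entry point
  .setMaster("local[*]")                    // master URL
  .setConf("spark.executor.memory", "1g")   // any spark.* property
  .addAppArgs("arg1", "arg2")               // application arguments
  .redirectError()                          // merge stderr into stdout
  .launch()                                 // spawns spark-submit, returns java.lang.Process

// Block until spark-submit (and so the application) terminates.
process.waitFor()
```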
Once the invocation of a Spark application is set up, use the launch() method to launch a sub-process that runs the configured Spark application. It is, however, recommended to use the startApplication method instead, which returns a SparkAppHandle for monitoring and controlling the application, as the sketch below shows.
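A minimal sketch of startApplication with a state listener; the jar path and class name are placeholders, while SparkAppHandle.Listener and SparkAppHandle.State are part of the org.apache.spark.launcher API:

```scala
import java.util.concurrent.CountDownLatch
import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

val done = new CountDownLatch(1)

// Placeholder resource and class; the listener is notified of state
// transitions (e.g. CONNECTED, RUNNING, FINISHED, FAILED).
val handle: SparkAppHandle = new SparkLauncher()
  .setAppResource("/path/to/my-app.jar")
  .setMainClass("com.example.MySparkApp")
  .setMaster("local[*]")
  .startApplication(new SparkAppHandle.Listener {
    override def stateChanged(h: SparkAppHandle): Unit = {
      println(s"Application state: ${h.getState}")
      if (h.getState.isFinal) done.countDown()
    }
    override def infoChanged(h: SparkAppHandle): Unit = ()
  })

done.await() // wait until the application reaches a final state
```

Unlike launch(), the returned SparkAppHandle can also stop or kill the running application (stop(), kill()) and report its application ID (getAppId).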