Spark Properties and spark-defaults.conf Properties File

Spark properties are the means of tuning the execution environment for your Spark applications.

The default Spark properties file is $SPARK_HOME/conf/spark-defaults.conf, which can be overridden using spark-submit's --properties-file command-line option.
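For example, a custom properties file can be passed on the command line (the file path, application class, and jar name below are illustrative):

```
$ spark-submit --properties-file /path/to/custom-spark.conf \
    --class com.example.MyApp myapp.jar
```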

Table 1. Environment Variables

Environment Variable: SPARK_CONF_DIR
Default Value: ${SPARK_HOME}/conf
Description: Spark’s configuration directory (with spark-defaults.conf)

Tip
Read the official documentation of Apache Spark on Spark Configuration.

spark-defaults.conf — Default Spark Properties File

spark-defaults.conf (under SPARK_CONF_DIR or $SPARK_HOME/conf) is the default properties file with the Spark properties of your Spark applications.
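A spark-defaults.conf file lists one Spark property per line, a key followed by whitespace and a value, e.g. (property values illustrative):

```
spark.master            spark://master:7077
spark.executor.memory   4g
spark.serializer        org.apache.spark.serializer.KryoSerializer
```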

Note
spark-defaults.conf is loaded by AbstractCommandBuilder’s loadPropertiesFile internal method.
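Under the hood, the properties file is in java.util.Properties format. A minimal Scala sketch of loading such a file into a Map (the helper name loadSparkProperties is illustrative, not Spark's API):

```scala
import java.io.{File, FileInputStream, InputStreamReader}
import java.nio.charset.StandardCharsets
import java.util.Properties
import scala.jdk.CollectionConverters._

// Load Spark properties from a spark-defaults.conf-style file into a Map.
// (Illustrative helper; Spark's launcher does this in Java inside
// AbstractCommandBuilder.loadPropertiesFile.)
def loadSparkProperties(file: File): Map[String, String] = {
  val props = new Properties()
  val in = new InputStreamReader(new FileInputStream(file), StandardCharsets.UTF_8)
  try props.load(in) finally in.close()
  // Trim values, as trailing whitespace in a properties file is easy to miss.
  props.asScala.toMap.map { case (k, v) => (k, v.trim) }
}
```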

Calculating Path of Default Spark Properties — Utils.getDefaultPropertiesFile method

getDefaultPropertiesFile(env: Map[String, String] = sys.env): String

getDefaultPropertiesFile calculates the absolute path to the spark-defaults.conf properties file, which can be either in the directory specified by the SPARK_CONF_DIR environment variable or in the $SPARK_HOME/conf directory.
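The resolution logic can be sketched as follows (a simplified standalone version, assuming SPARK_CONF_DIR takes precedence over $SPARK_HOME/conf and that the file must exist):

```scala
import java.io.File

// Sketch of the path resolution described above: prefer SPARK_CONF_DIR,
// fall back to $SPARK_HOME/conf, and return null when no
// spark-defaults.conf file is found in either place.
def getDefaultPropertiesFile(env: Map[String, String] = sys.env): String = {
  env.get("SPARK_CONF_DIR")
    .orElse(env.get("SPARK_HOME").map(h => s"$h${File.separator}conf"))
    .map(dir => new File(dir, "spark-defaults.conf"))
    .filter(_.isFile)
    .map(_.getAbsolutePath)
    .orNull
}
```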

Note
getDefaultPropertiesFile is part of the private[spark] org.apache.spark.util.Utils object.
