Access private members in Scala in Spark shell

If you ever wanted to use private[spark] members of Spark from Scala, e.g. to toy with org.apache.spark.scheduler.DAGScheduler or similar, you have to use the following trick in the Spark shell: :paste -raw, as described in REPL: support for package definition.
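The trick is needed because private[spark] is Scala's package-qualified access modifier: such members are visible only to code that is itself compiled inside the org.apache.spark package (or one of its subpackages), while regular spark-shell input is compiled into the REPL's own synthetic wrappers. The following sketch only illustrates the modifier with made-up names; it is not Spark's actual source:

package org.apache.spark.scheduler

// Illustration of a package-qualified modifier (names are hypothetical):
// anything marked private[spark] is accessible only from code compiled
// under the org.apache.spark package, which regular REPL input is not.
private[spark] object SomeScheduler {
  val SOME_TIMEOUT = 200
}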

Open spark-shell and execute :paste -raw, which allows you to enter any valid Scala code, including package definitions.

The following snippet shows how to access the private[spark] member DAGScheduler.RESUBMIT_TIMEOUT:

scala> :paste -raw
// Entering paste mode (ctrl-D to finish)

package org.apache.spark

object spark {
  def test = {
    import org.apache.spark.scheduler._
    println(DAGScheduler.RESUBMIT_TIMEOUT == 200)
  }
}

scala> spark.test
true

scala> sc.version
res0: String = 1.6.0-SNAPSHOT
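
The same approach lets you re-expose a private[spark] value so it stays usable later in the session. A minimal sketch, assuming DAGScheduler.RESUBMIT_TIMEOUT is still private[spark] in your Spark version; the PrivateAccess helper object is made up for this example:

scala> :paste -raw
// Entering paste mode (ctrl-D to finish)

package org.apache.spark

// Hypothetical helper that re-exposes a private[spark] constant so it
// can be referenced from the default REPL scope afterwards.
object PrivateAccess {
  import org.apache.spark.scheduler._
  val resubmitTimeout = DAGScheduler.RESUBMIT_TIMEOUT
}

scala> org.apache.spark.PrivateAccess.resubmitTimeout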
