Checking Status of Spark Standalone

jps

Since you are using Java tools to run Spark, use jps -lm to get the status of any JVMs on a box, including Spark's. Consult the jps documentation for more details on command-line options besides -lm.

$ jps -lm
999 org.apache.spark.deploy.master.Master --ip japila.local --port 7077 --webui-port 8080
397
669 org.jetbrains.idea.maven.server.RemoteMavenServer
1198 sun.tools.jps.Jps -lm

If, however, you want to narrow the output down to the JVM processes that really belong to Spark, pipe the command's output to an OS-specific tool like grep.

$ jps -lm | grep -i spark
999 org.apache.spark.deploy.master.Master --ip japila.local --port 7077 --webui-port 8080
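If you run this check often, a tiny shell helper saves some typing. The sketch below is only an illustration and not part of Spark's scripts; the function name spark_jvms and the org.apache.spark pattern are my own choices.

# Hypothetical helper -- not shipped with Spark.
# Lists only the Spark JVMs that jps reports on this box.
spark_jvms() {
  jps -lm | grep 'org\.apache\.spark' || echo "No Spark JVMs found."
}

$ spark_jvms
999 org.apache.spark.deploy.master.Master --ip japila.local --port 7077 --webui-port 8080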
spark-daemon.sh status
You can also use ./sbin/spark-daemon.sh status.
When you start Spark Standalone using the scripts under sbin, the PIDs are stored in the /tmp directory by default. ./sbin/spark-daemon.sh status can read them and do the "boilerplate" for you, i.e. check the status of the process with a given PID.
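The command takes the class of the daemon you started and its instance number (the same number that appears in the PID file's name). The general form below reflects my reading of sbin/spark-daemon.sh; check the script's usage message in your Spark version.

$ ./sbin/spark-daemon.sh status <spark-command-class> <instance-number>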
$ jps -lm | grep -i spark
999 org.apache.spark.deploy.master.Master --ip japila.local --port 7077 --webui-port 8080
$ ls /tmp/spark-*.pid
/tmp/spark-jacek-org.apache.spark.deploy.master.Master-1.pid
$ ./sbin/spark-daemon.sh status org.apache.spark.deploy.master.Master 1
org.apache.spark.deploy.master.Master is running.
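The same check should work for any daemon started through the sbin scripts, e.g. a standalone Worker. The example below is an assumption on my part: instance number 1 presumes a single worker was started on the box, and the output mirrors the Master example above.

$ ./sbin/spark-daemon.sh status org.apache.spark.deploy.worker.Worker 1
org.apache.spark.deploy.worker.Worker is running.

If the PID files are not under /tmp, the daemons were most likely started with SPARK_PID_DIR pointing elsewhere (see conf/spark-env.sh.template); export the same value before running spark-daemon.sh status so it looks in the right directory.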