Killing a Spark Application

Killing a Spark application is required when you have updated a KPI model or the configurations for Kafka, ZooKeeper, or Spark.

Note!

As a prerequisite, the script kpi_params.sh must be prepared according to 5.2 Preparing and Creating Scripts for KPI Management, as it contains connection information for Spark, Kafka, and so on.
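
The exact contents of kpi_params.sh are defined in 5.2; purely as an illustration, a script along the following lines would carry the connection information mentioned above. All variable names and host addresses here are assumptions, not the real script:

#!/bin/bash
# Illustrative sketch only - the real kpi_params.sh is created in 5.2.
# All names and addresses below are assumed placeholders.

# Spark standalone master used when submitting and killing applications.
export SPARK_MASTER_URL="spark://sparkmaster:7077"

# Kafka broker list used by the KPI application.
export KAFKA_BROKERS="kafka1:9092,kafka2:9092"

# ZooKeeper connection string.
export ZOOKEEPER_CONNECT="zookeeper1:2181,zookeeper2:2181"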

Stopping the Spark Cluster

To stop the Spark cluster, run the stop.sh script:

$ stop.sh
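
As with kpi_params.sh, stop.sh is prepared in 5.2. The following is a minimal sketch of what such a script could do, assuming it sources kpi_params.sh and kills a standalone cluster-mode driver with spark-submit --kill; the submission-Id handling is an assumption:

#!/bin/bash
# Sketch of a stop script - not the shipped stop.sh.
# Assumes kpi_params.sh exports SPARK_MASTER_URL and that SPARK_HOME
# points at the Spark installation.
source ./kpi_params.sh

# Submission Id of the driver to kill, e.g. driver-20240101120000-0000
# (hypothetical); it is shown in the Spark Master web interface.
SUBMISSION_ID="$1"

# --kill terminates a driver that was submitted in standalone cluster mode.
"$SPARK_HOME/bin/spark-submit" --master "$SPARK_MASTER_URL" --kill "$SUBMISSION_ID"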

You can also kill applications from the Spark Master web interface by clicking the kill link next to the Application Id.

Example using the web interface:

  1. Kill a submitted application by clicking on the kill link next to the Application Id.

    (Figure: Spark Master UI - Running Applications)
  2. Identify the Spark driver that coordinates the application. This driver must be killed manually so that the application can be submitted again.

  3. When you are running a single application, there is only one driver. When you are running multiple applications, you must click the Worker name of each driver to find the name of the application it coordinates.

    (Figure: Spark UI - Running drivers)
  4. The Spark application is listed in the Job Details column.

  5. To kill a driver, click the kill link next to the Submission Id. A scriptable alternative to the kill link is sketched after this list.
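
As an alternative to clicking the kill link, the Spark standalone master exposes a REST submission server (normally on port 6066, and only when spark.master.rest.enabled is set to true) that can kill a driver by its Submission Id. The host name and driver Id below are placeholders:

# Kill a driver through the standalone REST submission server.
# "sparkmaster" and the driver Id are placeholders - substitute your own.
curl -X POST http://sparkmaster:6066/v1/submissions/kill/driver-20240101120000-0000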
