Killing Spark applications is required when you have updated a KPI model or the configurations for Kafka, ZooKeeper, or Spark.

Note!

As a prerequisite, the script kpi_params.sh must be prepared according to 5.2 Preparing and Creating Scripts for KPI Management, as it contains the connection information for Spark, Kafka, and related services.
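
For orientation only, the sketch below shows the kind of connection settings such a parameter script typically exports. All variable names, hosts, and ports are illustrative assumptions; the actual contents are produced in 5.2.

Code Block
# Illustrative sketch only -- the real kpi_params.sh is created in
# 5.2 Preparing and Creating Scripts for KPI Management.
# All variable names, hosts, and ports are placeholder assumptions.
export SPARK_MASTER_URL="spark://spark-master.example.com:7077"        # Spark master endpoint
export KAFKA_BROKERS="kafka1.example.com:9092,kafka2.example.com:9092" # Kafka bootstrap servers
export ZOOKEEPER_QUORUM="zk1.example.com:2181"                         # ZooKeeper connection string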

Stopping the Spark Cluster

Code Block
$ stop.sh
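
The exact contents of stop.sh are produced in 5.2, but for a standalone Spark cluster such a script commonly wraps the stock sbin scripts shipped with Spark. A minimal sketch, assuming Spark is installed under $SPARK_HOME:

Code Block
# Sketch only -- the real stop.sh is prepared in 5.2 and may differ.
# stop-all.sh stops the Spark master and all registered workers.
$SPARK_HOME/sbin/stop-all.sh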

You can also use the Spark Master web interface: click the kill link next to the Application Id.

Example using the web interface:

  1. Kill a submitted application by clicking on the kill link next to the Application Id.

  2. Identify the Spark driver that coordinates the application. This driver must be killed manually to ensure that the application can be submitted again.

  3. When you are running a single application, there is only one driver. When you are running multiple applications, click the Worker name of each driver to find the name of the application it coordinates.

  4. The Spark application is listed in the Job Details column.

  5. To kill a driver, click the kill link next to the Submission Id (a command-line alternative is sketched after this list).

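
If the web interface is unavailable, a driver in a standalone cluster can also be killed from the command line with spark-submit. The sketch below assumes the master's REST endpoint listens on the default port 6066; the host and Submission Id are placeholders.

Code Block
# Kill the driver with the given Submission Id via the master's REST endpoint.
# Host, port, and submission id are placeholders -- substitute your own values.
$ spark-submit --master spark://spark-master.example.com:6066 \
    --kill driver-20240101120000-0000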