Killing Spark applications is required when you have updated a KPI model or the configuration for Kafka, ZooKeeper, or Spark.

Note!

As a prerequisite, the script kpi_params.sh must be prepared according to 5.2 Preparing and Creating Scripts for KPI Management, as it contains connection information for Spark, Kafka, and related services.
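The exact contents of kpi_params.sh are defined in 5.2 Preparing and Creating Scripts for KPI Management; the sketch below only illustrates the kind of connection settings such a script typically exports, and all variable names and addresses shown here are hypothetical.

Code Block
# Illustrative sketch only; the real variable names are given in section 5.2.
export SPARK_MASTER_URL="spark://spark-master:7077"    # hypothetical Spark master address
export KAFKA_BROKERS="kafka1:9092,kafka2:9092"         # hypothetical Kafka bootstrap servers
export ZOOKEEPER_CONNECT="zk1:2181,zk2:2181"           # hypothetical ZooKeeper ensemble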

Stopping the Spark Cluster

Code Block
$ stop.sh
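If you want to confirm that the cluster is down after running stop.sh, one way (assuming a standalone Spark deployment with the JDK tools on the path) is to check that no Spark Master or Worker JVMs remain:

Code Block
# List running JVMs; no "Master" or "Worker" entries should remain after stop.sh.
$ jps | grep -E 'Master|Worker'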

You can also kill an application from the Spark Master web interface by clicking the kill button next to its Application ID.
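If the web interface is not reachable, a driver submitted in cluster mode on a standalone Spark deployment can usually also be killed from the command line; the sketch below assumes a standalone master at spark://spark-master:7077, and both the host name and the driver ID are placeholders:

Code Block
# Kill a cluster-mode driver on a standalone Spark master (placeholder host and driver ID).
$ spark-class org.apache.spark.deploy.Client kill spark://spark-master:7077 driver-20240101000000-0001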

...