
Killing Spark applications is required when you have updated a service model or the service configuration for Kafka, ZooKeeper, or Spark.

Note!

As a prerequisite, the scripts must be prepared according to Deprecated Preparing and Creating Scripts for KPI Management.

Stopping the Spark Cluster

$ stop.sh
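The prepared stop.sh script shuts down the Spark cluster. If you also need to kill individual Spark applications still registered on the resource manager, a sketch like the following can help locate them. This assumes Spark runs on YARN with the `yarn` CLI available; the helper name and the grep pattern are illustrative assumptions, not part of the prepared scripts:

```shell
#!/bin/sh
# Hypothetical helper: extract YARN application IDs from
# `yarn application -list` output on stdin.
list_yarn_app_ids() {
  grep -o 'application_[0-9]\{1,\}_[0-9]\{1,\}'
}

# Usage (requires a running YARN cluster; shown commented out):
# yarn application -list -appStates RUNNING | list_yarn_app_ids |
#   while read -r app_id; do
#     yarn application -kill "$app_id"
#   done
```

After the applications are killed and the cluster is stopped, the service model or configuration changes take effect on the next start.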

