
When you change the number of Spark slaves, you may also want to update the property spark.default.parallelism in the service configuration to optimize the performance of the Spark cluster. For the change in the Spark configuration to take effect, you must restart the cluster and resubmit the Spark application. For further information about spark.default.parallelism, see 4.3 KPI Management - External Software.
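As a rule of thumb, spark.default.parallelism is often sized at two to three tasks per CPU core in the cluster, so it should be recalculated when the number of slaves changes. The sketch below illustrates that calculation; the worker and core counts are hypothetical example values, not taken from this document.

```shell
# Sketch: recompute spark.default.parallelism after resizing the cluster.
# WORKERS and CORES_PER_WORKER are hypothetical values for illustration.
WORKERS=4
CORES_PER_WORKER=8
TOTAL_CORES=$((WORKERS * CORES_PER_WORKER))

# A common starting point is 2 tasks per core across the cluster.
PARALLELISM=$((TOTAL_CORES * 2))
echo "spark.default.parallelism = $PARALLELISM"

# The property can also be overridden per application at submit time:
# spark-submit --conf spark.default.parallelism=$PARALLELISM <application>
```

The multiplier of 2 is only a starting point; the best value depends on the workload and should be verified against the cluster's actual behavior.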

If the Spark UI indicates that a worker has status "DEAD", you must restart it with the worker-restart command:

$ mzsh spark worker-restart spark/<service-instance>



Example - Restarting a worker:

$ mzsh spark worker-restart spark/spark1
Note! It is not recommended to run more than one Spark worker per host. The command above only restarts one worker per host.

Next: