In preparation for using KPI Management in MediationZone, a number of scripts must be extracted and some setup is required. This page describes these steps.

...

These scripts are used in the procedures described in the sections for KPI Management - Distributed Processing.

Preparations before extracting scripts:

A prerequisite is that Spark, ZooKeeper, and Kafka are installed and up and running. For more information about this, see LINK needed
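A quick way to sanity-check that these services are running is to probe their listening ports. The sketch below uses bash's /dev/tcp and the common default ports (2181 for ZooKeeper, 9092 for Kafka, 7077 for the Spark master); adjust host names and ports if your installation differs.

```shell
#!/bin/bash
# Probe a TCP port to see whether a service is listening (bash-only /dev/tcp).
check_port() {
    (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null \
        && echo "$1:$2 reachable" \
        || echo "$1:$2 not reachable"
}

check_port localhost 2181   # ZooKeeper default port
check_port localhost 9092   # Kafka default port
check_port localhost 7077   # Spark master default port
```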

Before running the command to extract the scripts, the following parameters must be set as environment variables, as their values will be inserted into some of the scripts:

...

1. Set up your preferred KPI profile, or use the simplified example configuration found in kpi_tst.zip.

2. Find the kpi_spark*.mzp file among the installation files and copy it to the location where you want to keep your KPI application files.

...

Here, all that needs to be changed is the master host IP, and the ports if needed. SPARK_LOCAL_IP sets the IP address that Spark binds to on the local machine.
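As an illustration, the settings might look like this (hypothetical example values; SPARK_MASTER_HOST and SPARK_MASTER_PORT are standard Spark spark-env.sh variables, but whether this script uses exactly these names is an assumption):

```shell
# Hypothetical example values; replace with the addresses and ports of
# your own environment.
export SPARK_MASTER_HOST=192.168.1.10   # master host IP (change this)
export SPARK_MASTER_PORT=7077           # master port (Spark default)
export SPARK_LOCAL_IP=192.168.1.10      # IP address Spark binds to locally
```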

submit.sh only needs to be updated if you want to alter the log format. In that case, add this parameter:

...

  1. Access the conf folder of Apache Spark, rename the spark-defaults.conf.template file to spark-defaults.conf, and add the following configuration variables and options:

    Code Block
    spark.driver.defaultJavaOptions    --add-opens java.base/java.lang=ALL-UNNAMED \
    --add-opens java.base/java.lang.invoke=ALL-UNNAMED \
    --add-opens java.base/java.lang.reflect=ALL-UNNAMED \
    --add-opens java.base/java.util=ALL-UNNAMED \
    --add-opens java.base/java.util.concurrent=ALL-UNNAMED \
    --add-opens java.base/java.util.concurrent.atomic=ALL-UNNAMED \
    --add-opens java.base/java.io=ALL-UNNAMED \
    --add-opens java.base/java.net=ALL-UNNAMED \
    --add-opens java.base/java.nio=ALL-UNNAMED \
    --add-opens java.base/sun.nio.ch=ALL-UNNAMED \
    --add-opens java.base/sun.nio.cs=ALL-UNNAMED \
    --add-opens java.base/sun.util.calendar=ALL-UNNAMED \
    --add-opens java.base/sun.security.action=ALL-UNNAMED
    
    spark.executor.defaultJavaOptions    --add-opens java.base/java.lang=ALL-UNNAMED \
    --add-opens java.base/java.lang.invoke=ALL-UNNAMED \
    --add-opens java.base/java.lang.reflect=ALL-UNNAMED \
    --add-opens java.base/java.util=ALL-UNNAMED \
    --add-opens java.base/java.util.concurrent=ALL-UNNAMED \
    --add-opens java.base/java.util.concurrent.atomic=ALL-UNNAMED \
    --add-opens java.base/java.io=ALL-UNNAMED \
    --add-opens java.base/java.net=ALL-UNNAMED \
    --add-opens java.base/java.nio=ALL-UNNAMED \
    --add-opens java.base/sun.nio.ch=ALL-UNNAMED \
    --add-opens java.base/sun.nio.cs=ALL-UNNAMED \
    --add-opens java.base/sun.util.calendar=ALL-UNNAMED \
    --add-opens java.base/sun.security.action=ALL-UNNAMED
    
    spark.master.rest.enabled true
  2. Add the following to the jvmargs section of the execution context definition for the EC that will run the KPI Management workflows. You can open the definition with, for example, “mzsh mzadmin/<password> topo open kpi_ec”:

Code Block
jvmargs {
    args=[
            "--add-opens", "java.base/java.lang.invoke=ALL-UNNAMED",
            "--add-opens", "java.base/java.lang.reflect=ALL-UNNAMED",
            "--add-opens", "java.base/java.util=ALL-UNNAMED"
    ]
}

NB! The lines “jvmargs {“, “args=[“, “]” and “}” may already exist in the definition; they are included here only to show where to add the options.
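The file handling in step 1 above can be sketched as a shell snippet. This assumes Spark is installed under $SPARK_HOME, which is an assumption about your layout; the appended options are abbreviated here, so use the full list from step 1.

```shell
# Assumes SPARK_HOME points at the Apache Spark installation directory.
cd "$SPARK_HOME/conf"

# Keep the template and create the active config from it, if not done already.
if [ ! -f spark-defaults.conf ]; then
    cp spark-defaults.conf.template spark-defaults.conf
fi

# Append the required options (abbreviated; use the full list from step 1).
cat >> spark-defaults.conf <<'EOF'
spark.master.rest.enabled true
EOF
```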

...

Code Block
$ submit.sh kpiapp ...

11. You should now see two workers and two executors:

Code Block
$ jps

This will give you output similar to:

Code Block
pid1 Worker
pid2 Worker
pid3 CoarseGrainedExecutorBackend
pid4 CoarseGrainedExecutorBackend
pid5 DriverWrapper
pid6 CodeServerMain
pid8 Master
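To check the worker and executor counts programmatically, you can aggregate the jps output, for example:

```shell
# Count how many JVMs of each Spark process type jps reports.
# jps ships with the JDK; in the setup above the result should be "2 2".
jps | awk '{count[$2]++} END {print count["Worker"]+0, count["CoarseGrainedExecutorBackend"]+0}'
```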

...