In preparation for using KPI Management in MediationZone, some scripts need to be extracted and some setup is required. This is described on this page.

The scripts are as follows:

...

These scripts will be used for different procedures in the KPI Management - Distributed Processing sections.

Preparations before extracting scripts:

A prerequisite is that Spark, ZooKeeper, and Kafka are installed, and that ZooKeeper and Kafka are up and running. For more information, see 5.3 KPI Management - External Software.
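As a quick sanity check before proceeding, you can verify that the ZooKeeper and Kafka ports accept TCP connections. This is only a sketch: the hosts and ports below are placeholders and must be replaced with the addresses of your actual installation.

```shell
#!/usr/bin/env bash
# Quick reachability check for ZooKeeper and Kafka before extracting the
# scripts. Hosts and ports are placeholders -- adjust to your installation.

# Return 0 if a TCP connection to $1:$2 succeeds within 2 seconds.
check_port() {
  timeout 2 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

check_port localhost 2181 && echo "ZooKeeper reachable" || echo "ZooKeeper NOT reachable"
check_port localhost 9092 && echo "Kafka broker reachable" || echo "Kafka broker NOT reachable"
```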

Before running the command to extract the scripts, the following parameters need to be set as environment variables, as they will be inserted into some of the scripts:

...

1. Set up your preferred KPI profile, or use the simplified example configuration, which can be found in kpi_tst.zip.

2. Find the kpi_spark*.mzp among the installation files and copy it to where you want to keep your KPI application files.

...

7. The next step is to modify the scripts in the bin folder according to your specifications and requirements. As extracted, they should be considered examples rather than a finished configuration. The scripts kpi_params.sh and spark_common_params.sh are the ones that need to be updated. In addition, the settings in spark-defaults.conf need to be added to the Spark configuration file of the same name. In most cases, the changes that need to be made are the following:

In kpi_params.sh, KAFKA_BROKERS needs to be configured with the hosts and ports of the Kafka brokers. For example:

export KAFKA_BROKERS="192.168.1.100:9092,192.168.1.101:9092,192.168.1.102:9092"

The username and password for a user with access to the profile need to be entered in the property MZ_PLATFORM_AUTH, unless the default username and password, mzadmin/dr, are used. The password is encrypted using the mzsh command encryptpassword.

The memory settings may need to be altered depending on the expected load, as well as the UI port for the KPI App inside Spark (default 4040).
In addition, the addresses and ports of the platform, Kafka, and ZooKeeper may need to be updated.
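Taken together, an edited kpi_params.sh might look like the sketch below. KAFKA_BROKERS and MZ_PLATFORM_AUTH are the properties named on this page; the memory and UI-port variable names, and the exact format of the MZ_PLATFORM_AUTH value, are assumptions — check the extracted script for the names and formats used in your release.

```shell
#!/usr/bin/env bash
# Sketch of kpi_params.sh settings. Only KAFKA_BROKERS and MZ_PLATFORM_AUTH
# are named in the documentation; the remaining variable names are
# assumptions -- verify against your extracted script.

# Kafka brokers (host:port, comma-separated).
export KAFKA_BROKERS="192.168.1.100:9092,192.168.1.101:9092,192.168.1.102:9092"

# Credentials for a user with access to the profile. The password part is
# the output of the mzsh encryptpassword command, not plain text; the
# "user:password" format shown here is an assumption.
export MZ_PLATFORM_AUTH="mzadmin:<encrypted password>"

# Memory settings -- tune for the expected load (hypothetical names).
export SPARK_DRIVER_MEMORY="2g"
export SPARK_EXECUTOR_MEMORY="4g"

# UI port for the KPI App inside Spark (default 4040; hypothetical name).
export SPARK_UI_PORT=4040
```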
In spark_common_params.sh, you may need to change the master host IP, as well as the other IPs and ports if applicable. Edit kpiapp/bin/spark_common_params.sh so that it contains the correct SPARK_HOME path.
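An edited spark_common_params.sh might look like the sketch below. SPARK_HOME is named on this page; the master host and port variable names follow Spark's standard environment variables, but whether the extracted script uses those exact names is an assumption — verify against your copy.

```shell
#!/usr/bin/env bash
# Sketch of spark_common_params.sh settings. SPARK_HOME is required by the
# documentation; the master host/port names below follow Spark's standard
# environment variables and are assumptions -- verify against your script.

# Path to the Spark installation (placeholder path).
export SPARK_HOME="/opt/spark"

# Master host IP and ports -- change if applicable (placeholder values).
export SPARK_MASTER_HOST="192.168.1.100"
export SPARK_MASTER_PORT=7077
export SPARK_MASTER_WEBUI_PORT=8080
```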

  1. Access the conf folder of Apache Spark. Rename the spark-defaults.conf.template file to spark-defaults.conf, and add the following configuration variables and options:

    Code Block
    spark.driver.defaultJavaOptions    --add-opens java.base/java.lang=ALL-UNNAMED \
    --add-opens java.base/java.lang.invoke=ALL-UNNAMED \
    --add-opens java.base/java.lang.reflect=ALL-UNNAMED \
    --add-opens java.base/java.util=ALL-UNNAMED \
    --add-opens java.base/java.util.concurrent=ALL-UNNAMED \
    --add-opens java.base/java.util.concurrent.atomic=ALL-UNNAMED \
    --add-opens java.base/java.io=ALL-UNNAMED \
    --add-opens java.base/java.net=ALL-UNNAMED \
    --add-opens java.base/java.nio=ALL-UNNAMED \
    --add-opens java.base/sun.nio.ch=ALL-UNNAMED \
    --add-opens java.base/sun.nio.cs=ALL-UNNAMED \
    --add-opens java.base/sun.util.calendar=ALL-UNNAMED \
    --add-opens java.base/sun.security.action=ALL-UNNAMED
    
    spark.executor.defaultJavaOptions    --add-opens java.base/java.lang=ALL-UNNAMED \
    --add-opens java.base/java.lang.invoke=ALL-UNNAMED \
    --add-opens java.base/java.lang.reflect=ALL-UNNAMED \
    --add-opens java.base/java.util=ALL-UNNAMED \
    --add-opens java.base/java.util.concurrent=ALL-UNNAMED \
    --add-opens java.base/java.util.concurrent.atomic=ALL-UNNAMED \
    --add-opens java.base/java.io=ALL-UNNAMED \
    --add-opens java.base/java.net=ALL-UNNAMED \
    --add-opens java.base/java.nio=ALL-UNNAMED \
    --add-opens java.base/sun.nio.ch=ALL-UNNAMED \
    --add-opens java.base/sun.nio.cs=ALL-UNNAMED \
    --add-opens java.base/sun.util.calendar=ALL-UNNAMED \
    --add-opens java.base/sun.security.action=ALL-UNNAMED
    
    spark.master.rest.enabled true
  2. Add this to the jvmargs section of the execution context definition for the EC that will run the KPI Management workflows. You can open the configuration by running:
    mzsh mzadmin/<password> topo open kpi_ec

Code Block
jvmargs {
    args=[
            "--add-opens", "java.base/java.lang.invoke=ALL-UNNAMED",
            "--add-opens", "java.base/java.lang.reflect=ALL-UNNAMED",
            "--add-opens", "java.base/java.util=ALL-UNNAMED"
    ]
}

...