In preparation for using KPI Management in MediationZone, a number of scripts need to be extracted and some setup is required. This page describes that procedure.

...

These scripts are used in the procedures described in the sections for KPI Management - Distributed Processing.

Preparations before extracting scripts:

A prerequisite is that Spark, ZooKeeper, and Kafka are installed and up and running. For more information about this, see LINK needed

Before running the command to extract the scripts, set the following parameters as environment variables, as they will be inserted into some of the scripts:

Code Block
export KAFKA_BROKERS="127.0.0.1:9092"
export SPARK_UI_PORT=4040 
export MZ_PLATFORM_AUTH="mzadmin:DR-4-1D2E6A059AF8120841E62C87CFDB3FF4"
export MZ_KPI_PROFILE_NAME="kpi_common.SalesModel"
export MZ_PLATFORM_URL="http://127.0.0.1:9036"
export ZOOKEEPER_HOSTS="127.0.0.1:2181"
export SPARK_HOME=/opt/spark-3.3.2-bin-hadoop3-scala2.13
export KAFKA_HOME=/opt/kafka_2.13-3.3.1
export PATH=$SPARK_HOME/bin:$KAFKA_HOME/bin:$PATH
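Before extracting, it can be convenient to verify that everything in the list above is actually set. The following is a convenience sketch (not part of the product) that reports any missing variable:

```shell
# Convenience sketch (not part of the product): report any of the required
# variables that are still unset before running the extraction command.
check_kpi_env() {
  missing=""
  for v in KAFKA_BROKERS SPARK_UI_PORT MZ_PLATFORM_AUTH MZ_KPI_PROFILE_NAME \
           MZ_PLATFORM_URL ZOOKEEPER_HOSTS SPARK_HOME KAFKA_HOME; do
    eval "val=\${$v}"                     # indirect lookup of the variable named in $v
    [ -n "$val" ] || missing="$missing $v"
  done
  if [ -n "$missing" ]; then
    echo "missing:$missing"
    return 1
  fi
  echo "ok"
}

# Example usage: check_kpi_env || exit 1
```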

Extracting the scripts and the KPI app:

1. Set up your preferred KPI profile, or use the simplified example configuration found in kpi_tst.zip.

2. Find the kpi_spark*.mzp file among the installation files and copy it to the location where you want to keep your KPI application files.

3. To install the KPI app and extract its scripts, run the following command. It extracts the software needed by Spark for the KPI app, as well as the scripts needed for starting and configuring Spark.

Code Block
$ cd release/packages

$ java -jar kpi_spark_9.1.0.0.mzp install

...

Code Block
$ export SPARK_HOME="your spark home"

7. The next step is to modify the scripts under the bin folder according to your specifications and requirements. As extracted, they should be considered examples rather than a finished configuration.

The scripts submit.sh, kpi_params.sh, and spark_common_params.sh need to be updated. In addition, the settings in spark-defaults.conf need to be added to the Spark configuration file of the same name.

The changes that need to be done in most cases are the following:

In kpi_params.sh:

KAFKA_BROKERS needs to be configured with the hosts and ports of the Kafka brokers, e.g.:

export KAFKA_BROKERS="192.168.1.100:9092,192.168.1.101:9092,192.168.1.102:9092"

The memory settings may need to be altered depending on the expected load, as may the UI port for the KPI app inside Spark (default 4040).
In addition, the addresses and ports of the platform, Kafka, and ZooKeeper may need to be updated.
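As a sketch, the typical edits to kpi_params.sh might look like the following. KAFKA_BROKERS, ZOOKEEPER_HOSTS, MZ_PLATFORM_URL, and SPARK_UI_PORT appear in the environment setup earlier on this page; the memory variable names are assumptions (check the extracted script for the actual names), and all values are placeholders:

```shell
# Sketch of typical kpi_params.sh edits; values are placeholders for your
# environment. The memory variable names below are hypothetical -- check the
# extracted script for the names it actually uses.
export KAFKA_BROKERS="192.168.1.100:9092,192.168.1.101:9092,192.168.1.102:9092"
export ZOOKEEPER_HOSTS="192.168.1.100:2181,192.168.1.101:2181,192.168.1.102:2181"
export MZ_PLATFORM_URL="http://192.168.1.200:9036"
export SPARK_UI_PORT=4040                  # UI port for the KPI app inside Spark
export SPARK_DRIVER_MEMORY="2g"            # hypothetical name; size for expected load
export SPARK_EXECUTOR_MEMORY="4g"          # hypothetical name; size for expected load
```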
In spark_common_params.sh:

Here, all that needs to be changed is the master host IP, as well as the ports if needed. SPARK_LOCAL_IP sets the address that Spark binds to on this host.
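As a sketch, an edited spark_common_params.sh might look like this. SPARK_MASTER_HOST, SPARK_MASTER_PORT, SPARK_MASTER_WEBUI_PORT, and SPARK_LOCAL_IP are standard Spark environment variables; whether the extracted script uses exactly these names is an assumption, and the IP is a placeholder:

```shell
# Hypothetical sketch of spark_common_params.sh after editing; the IP is a
# placeholder, the ports are Spark's defaults.
export SPARK_MASTER_HOST="192.168.1.100"   # IP of the Spark master host
export SPARK_MASTER_PORT=7077              # default Spark master port
export SPARK_MASTER_WEBUI_PORT=8080        # default master web UI port
export SPARK_LOCAL_IP="192.168.1.100"      # address Spark binds to on this host
```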

submit.sh only needs to be updated if the log format is to be altered. In that case, add this parameter:

log4j_setting="-Dlog4j.configuration=/home/davids/userstories/xe10095/log4j/spark_log4j.properties"

with the path pointing to your log4j properties file. Below is an example of such a file:

Code Block
log4j.rootLogger = INFO, FILE
log4j.appender.FILE=org.apache.log4j.FileAppender
log4j.appender.FILE.File=/tmp/mz_kpiapp_log4j.out
log4j.appender.FILE.ImmediateFlush=true
log4j.appender.FILE.Threshold=INFO
log4j.appender.FILE.Append=true
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{HH:mm:ss}-%t-%x-%-5p-%-10c:%m%n

Edit kpiapp/bin/spark_common_params.sh so that it contains the correct SPARK_HOME path.
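As a quick sanity check (a convenience sketch, not part of the product), you can verify that the SPARK_HOME you set actually points at a Spark installation before running the scripts:

```shell
# Sketch: succeeds if the given directory looks like a Spark home,
# i.e. it contains an executable bin/spark-submit.
looks_like_spark_home() {
  [ -x "$1/bin/spark-submit" ]
}

# Example usage (path from the environment setup on this page):
if looks_like_spark_home "/opt/spark-3.3.2-bin-hadoop3-scala2.13"; then
  echo "SPARK_HOME looks valid"
else
  echo "bin/spark-submit not found; check SPARK_HOME" >&2
fi
```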

...