
  1. A user provisions a service model configuration directly via a REST interface or via a KPI profile in the Desktop.

    Hint!

    Existing profiles and service models can be viewed in the Web UI as well.


  2. The user then submits a configurable application to the Spark cluster, which performs the KPI calculations.

  3. Input data is received by the KPI Cluster In agent via KDR UDRs. The agent encapsulates these UDRs in KafkaUDRs that are then sent to a dedicated input topic via a Kafka Producer agent.  

  4. The Spark cluster periodically polls the input topic and performs the KPI calculations that are based on the service model and the input data. The polling interval depends on the duration of the Spark batch intervals.

  5. When the timestamps of the input data indicate that a configurable time period has elapsed, the Spark cluster sends the calculated KPIs to a dedicated output topic. There is also a separate topic for alarm output. If the service model has been configured to produce immediate alarms, the Spark cluster sends KPIs that hit an alarm level within a Spark batch, potentially before their KPI period closes.

  6. The data on the output and alarm topics are collected via Kafka Collector agents. The KPI Cluster Out agent extracts and decodes the KPI data to KPIAggregatedOutput UDRs.
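The interplay between periodic KPI output and immediate alarms in steps 4–6 can be sketched as a small aggregator. This is an illustrative Python sketch only; the class, field, and threshold names are assumptions for the example and do not reflect the product's actual API. It shows the two emission paths: an alarm sent as soon as a KPI crosses the alarm level within a batch, and the regular KPI emitted once the input timestamps show the period has elapsed.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the period-close vs. immediate-alarm behaviour.
# All names and thresholds here are illustrative assumptions.

@dataclass
class KpiAggregator:
    period_seconds: int          # configurable KPI period length
    alarm_threshold: float       # alarm level from the service model
    immediate_alarms: bool       # service-model flag (step 5)
    _buckets: dict = field(default_factory=dict)   # period start -> values
    _alarmed: set = field(default_factory=set)     # periods already alarmed

    def add(self, timestamp: int, value: float):
        """Route one input value into its KPI period; may emit an alarm."""
        start = timestamp - timestamp % self.period_seconds
        bucket = self._buckets.setdefault(start, [])
        bucket.append(value)
        alarms = []
        # Immediate alarm: emitted within the batch, before the period closes.
        if self.immediate_alarms and start not in self._alarmed:
            if sum(bucket) / len(bucket) >= self.alarm_threshold:
                self._alarmed.add(start)
                alarms.append(("alarm", start, sum(bucket) / len(bucket)))
        return alarms

    def close_periods(self, watermark: int):
        """Emit KPIs for all periods fully elapsed according to the
        timestamps seen so far (the 'configurable time period')."""
        out = []
        for start in sorted(self._buckets):
            if start + self.period_seconds <= watermark:
                values = self._buckets.pop(start)
                out.append(("kpi", start, sum(values) / len(values)))
        return out
```

For example, with a 60-second period and an alarm level of 0.9, a value of 0.95 at timestamp 10 triggers an immediate alarm for the period starting at 0, while the aggregated KPI for that period is only emitted once an input timestamp at or beyond 60 arrives.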



This chapter includes the following sections: