Runtime errors that occur while the Spark application is running are logged in the file MZ_HOME/external/spark/runtime/work/driver-<number>/stderr.
Runtime errors on the executor level are logged in the file MZ_HOME/external/spark/runtime/work/app-<number>/<executorId>/stderr.
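When several driver directories exist under the work directory, the numbered suffix alone does not tell you which run is the most recent. As a hypothetical helper (a sketch, not part of the product; it assumes the caller resolves MZ_HOME), the newest driver stderr can be located by modification time:

```python
import glob
import os


def latest_driver_stderr(work_dir):
    """Return the path to the stderr file of the most recently
    modified driver-<number> directory under work_dir, or None."""
    candidates = glob.glob(os.path.join(work_dir, "driver-*", "stderr"))
    if not candidates:
        return None
    # Pick the stderr file that was written to most recently.
    return max(candidates, key=os.path.getmtime)
```

For example, `latest_driver_stderr(os.path.join(os.environ["MZ_HOME"], "external/spark/runtime/work"))` would return the stderr file of the latest driver run.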
You can also access these logs from the Spark Master Web UI:
Click a Worker id under Running Drivers.
(Figure: Spark UI - Master)
Click stderr under Logs.
(Figure: Spark UI - Worker)
KPI Processing Accumulators
When a Spark batch has finished processing, a set of accumulators is logged in the file MZ_HOME/external/spark/runtime/work/driver-<number>/stdout. These accumulators serve as a summary of what has been collected and calculated within the batch.
The following accumulators are logged:
| Accumulator | Description |
|---|---|
| CalculatedKPIs | This accumulator includes |
| DiscardedKPIs | This accumulator is incremented by one for each calculated KPI that belongs to a previously closed period. |
| FailedMetricCalculations | This accumulator is incremented by one for each metric calculation that fails, e.g. due to invalid data in the input records. If several nodes in the node tree(s) contain the metric, one input record may affect several metric calculations. |
| FailedKPICalculations | This accumulator is incremented by one for each KPI calculation that fails due to undefined metrics in the KPI expression. For the accumulator to be incremented, the following conditions must apply: the period for the KPI ends during the Spark batch, and the KPI expression uses multiple metrics, one or more of which are undefined. |
| GeneratedKPIOutputs | This accumulator is incremented by one for each successfully calculated and delivered KPI. |
| MissingExpressionForInputType | This accumulator is increased by one for each input record that does not match a |
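To make the counter semantics concrete, the sketch below simulates one batch in plain Python. This is a hypothetical illustration only: the record layout, period logic, and function name are assumptions, and the real accumulators are maintained by Spark itself.

```python
from collections import Counter


def process_batch(records, open_periods):
    """Tally per-batch counters in the spirit of the accumulators above.

    Each record is a (period, value) tuple. A value of None stands for
    invalid input data; a period not in open_periods stands for a
    previously closed period.
    """
    counters = Counter()
    for period, value in records:
        if value is None:
            # Metric calculation fails on invalid input data.
            counters["FailedMetricCalculations"] += 1
            continue
        counters["CalculatedKPIs"] += 1
        if period not in open_periods:
            # KPI belongs to a previously closed period.
            counters["DiscardedKPIs"] += 1
        else:
            # KPI successfully calculated and delivered.
            counters["GeneratedKPIOutputs"] += 1
    return counters


counts = process_batch(
    [("p1", 10), ("p0", 5), ("p1", None), ("p1", 7)],
    open_periods={"p1"},
)
```

Here three records yield calculated KPIs, one of which is discarded as late, and one record fails its metric calculation.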
Info |
---|
Example - Counters in stdout: The example below indicates that 20 input records failed to match both a |
You can also access these accumulators from the Spark Master Web UI:
Click a Worker id under Running Drivers.
Click stdout under Logs.
Note |
---|
Note! The accumulators are logged using log4j, which means that the configured log level determines whether the accumulators are logged. The log level is configured in the property |
Note |
---|
Note! It is possible to log the accumulators to a separate log file by adding the required configuration to log4j.properties. The file accumulators.log will then be created under the driver folder in MZ_HOME/external/spark/runtime/work. |