...

Info

Example. Configuration filenames

Default configuration file:

Code Block
$MZ_HOME/etc/logging/apl-log4j.properties

EC specific configuration file for ec1:

Code Block
$MZ_HOME/etc/logging/ec1-apl-log4j.properties

Copying the default configuration file into an EC specific configuration file:

Code Block
$ cp apl-log4j.properties ec1-apl-log4j.properties

Copying a template configuration file into an EC specific configuration file:

Code Block
$ cp apl-log4j-trace.properties ec1-apl-log4j.properties

The following template configuration files are included by default:

...

The files listed above differ only in their log level setting and are otherwise identical.

Whenever an APL log.* function is called in the Analysis agent, the function invokes logging with log4j. After a workflow has finished executing on a specific EC, a new EC specific configuration file <ec-name>-apl-log4j.properties is created. By default, this new EC specific configuration file is created with the same configuration as the file apl-log4j.properties. For instance, if the EC name is ec1, then ec1-apl-log4j.properties is created in $MZ_HOME/etc/logging with identical settings to apl-log4j.properties.
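For instance, after ec1 has run a workflow, the logging directory could contain something like the following. The listing is only illustrative; your installation may include additional template files:

Code Block
$ ls $MZ_HOME/etc/logging
apl-log4j.properties
apl-log4j-trace.properties
ec1-apl-log4j.properties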

MZSH does not provide dedicated commands for changing the log level. Changes to the log level and other properties in the configuration file must be made manually. Any changes made to the configuration file take effect at the next workflow run, without the need to restart the EC.
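For example, to raise the log level for ec1 you edit the EC specific file directly. This is a minimal sketch; the appender name Default and the existing INFO level are assumptions, and your file may route aplLogger differently:

Code Block
# In $MZ_HOME/etc/logging/ec1-apl-log4j.properties
# Change the APL log level from INFO to DEBUG; the change is picked up
# at the next workflow run, no EC restart is required.
log4j.logger.aplLogger=DEBUG, Default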

In a multi-host installation, the EC specific configuration file is always created on the host where the EC is located, and log4j always reads the EC specific configuration file from that location. For instance, if a workflow is run by ec1 located on the host my-host-name, log4j reads the EC specific configuration file ec1-apl-log4j.properties located in $MZ_HOME/etc/logging on my-host-name.

The content of these files defines the logging behavior.

...

Info

Examples. Appender Configurations

Code Block
# Default
log4j.appender.Default=com.digitalroute.apl.log.DRRollingFileAppender
log4j.appender.Default.file=${mz.home}/log/log4j/{workflow}.log
log4j.appender.Default.layout=org.apache.log4j.PatternLayout
log4j.appender.Default.layout.ConversionPattern=[%d{dd MMM yyyy HH:mm:ss,SSS}];[%-5p];[pico=%X{pico}];[%t];[tag=%X{tag}];[%c]:%m%n
log4j.appender.Default.MaxFileSize=10MB
log4j.appender.Default.MaxBackupIndex=20
log4j.logger.aplLogger.Default=TRACE, Default

The appender named Default will write a single file for all workflows contained under the Default folder.

Code Block
# PRIMARY
log4j.appender.PRIMARY=com.digitalroute.apl.log.DRRollingMultiFileAppender
log4j.appender.PRIMARY.file=${mz.home}/log/log4j/{workflow}.log
log4j.appender.PRIMARY.layout=org.apache.log4j.PatternLayout
log4j.appender.PRIMARY.layout.ConversionPattern=[%d{dd MMM yyyy HH:mm:ss,SSS}];[%-5p];[pico=%X{pico}];[%t];[tag=%X{tag}];[%c]:%m%n
log4j.appender.PRIMARY.MaxFileSize=10MB
log4j.appender.PRIMARY.MaxBackupIndex=20
log4j.logger.aplLogger.RT_Folder.RT_TEST_WF=TRACE, PRIMARY

The appender named PRIMARY will create multiple files, one for each workflow instance based on the RT_Folder.RT_TEST_WF workflow.

Code Block
# SECONDARY
log4j.appender.SECONDARY=com.digitalroute.apl.log.DRRollingFileAppender
log4j.appender.SECONDARY.file=${mz.home}/log/log4j/{workflow}.log
log4j.appender.SECONDARY.layout=org.apache.log4j.PatternLayout
log4j.appender.SECONDARY.layout.ConversionPattern=[%d{dd MMM yyyy HH:mm:ss,SSS}];[%-5p];[pico=%X{pico}];[%t];[tag=%X{tag}];[%c]:%m%n
log4j.appender.SECONDARY.MaxFileSize=10MB
log4j.appender.SECONDARY.MaxBackupIndex=20
log4j.logger.aplLogger.RT_Folder.RT_TEST_WF=TRACE, SECONDARY

The appender named SECONDARY will create a single file for all workflow instances based on the RT_Folder.RT_TEST_WF workflow. The file takes the name of the first workflow instance it encounters, for example "RT_Folder.RT_TEST_WF.workflow_1".

Tip

Hint!

You can change the maximum file size and the number of backup files by adding the following lines:
log4j.appender.a.MaxFileSize=100MB
log4j.appender.a.MaxBackupIndex=10

You can add a filtering rule by adding the line log4j.logger.<configuration name>=<log level>. This is useful when you want to set different log levels for specific folders or configurations.

If you want to apply the filtering rule to all APL configurations in the default folder, change the last line in the previous example to log4j.logger.Default=DEBUG.
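For instance, assuming a folder named Default that contains a configuration named debug.workflow_1 (both names are only illustrative), the filtering rules could look as follows:

Code Block
# DEBUG for one specific configuration
log4j.logger.Default.debug.workflow_1=DEBUG
# DEBUG for all APL configurations in the Default folder
log4j.logger.Default=DEBUG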

Info

...

Example. Sets the general log level to ERROR and to DEBUG for the agent named agent_1

log4j.logger.aplLogger=ERROR, a 
log4j.appender.a=com.digitalroute.apl.log.DRRollingFileAppender
log4j.appender.a.file=${mz.home}/log/{pico}_{workflow}.log
log4j.appender.a.layout=com.digitalroute.apl.log.JsonLayout
log4j.logger.Default.debug.workflow_1.agent_1=DEBUG

Note

Note!

For performance reasons it is recommended to use the DRRollingFileAppender and configure individual appenders for each workflow. Only use the DRRollingMultiFileAppender if you need individual files on a workflow instance level.

For more information about available settings, see the log4j documentation at https://logging.apache.org/log4j/1.2/manual.html.
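As an illustration of this recommendation, the sketch below configures one DRRollingFileAppender per workflow instead of a single DRRollingMultiFileAppender. The workflow names RT_Folder.WF_A and RT_Folder.WF_B and the appender names are hypothetical:

Code Block
# One DRRollingFileAppender per workflow
log4j.appender.WFA=com.digitalroute.apl.log.DRRollingFileAppender
log4j.appender.WFA.file=${mz.home}/log/log4j/{workflow}.log
log4j.appender.WFA.layout=org.apache.log4j.PatternLayout
log4j.appender.WFA.layout.ConversionPattern=[%d{dd MMM yyyy HH:mm:ss,SSS}];[%-5p];[pico=%X{pico}];[%t];[tag=%X{tag}];[%c]:%m%n
log4j.logger.aplLogger.RT_Folder.WF_A=DEBUG, WFA

log4j.appender.WFB=com.digitalroute.apl.log.DRRollingFileAppender
log4j.appender.WFB.file=${mz.home}/log/log4j/{workflow}.log
log4j.appender.WFB.layout=org.apache.log4j.PatternLayout
log4j.appender.WFB.layout.ConversionPattern=[%d{dd MMM yyyy HH:mm:ss,SSS}];[%-5p];[pico=%X{pico}];[%t];[tag=%X{tag}];[%c]:%m%n
log4j.logger.aplLogger.RT_Folder.WF_B=INFO, WFB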

...


The fields in the log output are described below.

Field

Description

timestamp

The time when the message was logged. The UTC timezone and the international standard date and time notation are used by default.
You can specify a different date format by adding the following line in the configuration file (see the example after this table):

log4j.appender.a.layout.DateFormat=<Java SimpleDateFormat pattern>

For information about how to use SimpleDateFormat patterns, see:
https://docs.oracle.com/javase/8/docs/api/java/text/SimpleDateFormat.html

level

The log level, i.e. FATAL, ERROR, WARN, INFO, DEBUG, or TRACE.

thread

The name of the workflow thread.

category

The logged configuration. This field contains the category class of the appender that is defined in the configuration file.

message

The log message specified in the APL command.

pico

The name of the Execution Context.
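For example, to use a date format such as 2024-01-31 13:45:07.123 for the timestamp field described above, you could add a line like the following. The appender name a is simply the name used in the earlier example, and the pattern shown is one possible SimpleDateFormat pattern:

Code Block
log4j.appender.a.layout.DateFormat=yyyy-MM-dd HH:mm:ss.SSS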

Warning

Warning!

The ECs must be restarted if you manually delete or rename active log files or backup log files.

Tip

Hint!

If the log files are not generated as expected, review the EC logs. Your configuration files may contain errors.