Inter Workflow Collection Agent

The collecting Inter Workflow agent collects batch files from a storage server, which can be based on file storage or database storage. The data it collects has previously been submitted to the storage server by a forwarding Inter Workflow agent.

Note!

  • An Inter Workflow profile cannot be used by more than one Inter Workflow collection agent at a time. A workflow trying to use an already locked profile will abort.

  • In a batch workflow, the collecting Inter Workflow agent hands over the data, in UDR form, to the next agent in turn, one at a time. In a real-time workflow, it routes the UDRs into the workflow, one batch at a time.

You can restrict memory consumption by setting the property mz.iwf.max_size_block in <ec>.conf or platform.conf on the EC or Platform that runs the Inter Workflow storage. For further information on how to modify properties, see Updating Pico Configurations. If the agent attempts to allocate more memory than the configured value during collection, the collection aborts instead of risking a possible "out of memory" error. The value is specified in bytes. See the following example:

Example - Restricting memory consumption

mz.iwf.max_size_block="65535"

Note!
The minimum value is 32000 bytes; if a lower value is configured, 32000 bytes will apply.
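The behavior described above can be sketched as follows. This is an illustrative model only, not the actual Inter Workflow implementation; all names apart from the property itself are hypothetical.

```python
# Hypothetical sketch of how the mz.iwf.max_size_block limit behaves
# during collection. Only the property name comes from the product;
# the function names are illustrative.

MIN_BLOCK_SIZE = 32_000  # configured values below this floor are raised to 32000 bytes


def effective_limit(configured: int) -> int:
    """Return the limit actually enforced for a configured property value."""
    return max(configured, MIN_BLOCK_SIZE)


def collect_block(block_size: int, configured_limit: int) -> None:
    """Abort collection if a block would exceed the memory limit."""
    limit = effective_limit(configured_limit)
    if block_size > limit:
        # The agent aborts the collection rather than risk an
        # out-of-memory error in the EC or Platform.
        raise RuntimeError(
            f"Collection aborted: block of {block_size} bytes "
            f"exceeds mz.iwf.max_size_block ({limit} bytes)"
        )
    # ...otherwise the block is read into memory and processed...
```

For instance, with the example value 65535 configured, a 100000-byte block would abort the collection, while a configured value of 10000 would still be enforced as 32000.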

Every batch file that the agent routes to the workflow is preceded by a special UDR called NewFileUDR, which contains the name of the batch file.
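The NewFileUDR convention means a downstream consumer can tell where one batch file ends and the next begins. The sketch below models this; the UDR classes and the grouping function are hypothetical stand-ins, not the actual MediationZone API.

```python
# Illustrative model (not the real API) of consuming a UDR stream in
# which each batch file's data UDRs are preceded by a NewFileUDR that
# carries the file name.

from dataclasses import dataclass
from typing import Iterable


@dataclass
class NewFileUDR:
    filename: str  # name of the batch file whose UDRs follow


@dataclass
class DataUDR:
    payload: bytes


def group_by_file(stream: Iterable[object]) -> dict[str, list[DataUDR]]:
    """Group the data UDRs that follow each NewFileUDR under its file name."""
    files: dict[str, list[DataUDR]] = {}
    current: list[DataUDR] | None = None
    for udr in stream:
        if isinstance(udr, NewFileUDR):
            # A NewFileUDR marks the start of a new batch file.
            current = files.setdefault(udr.filename, [])
        elif current is not None:
            current.append(udr)
    return files
```

A stream such as `[NewFileUDR("a.dat"), DataUDR(b"1"), DataUDR(b"2"), NewFileUDR("b.dat"), DataUDR(b"3")]` would thus yield two groups, keyed by file name.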

Figure: Inter workflow collection process

PostgreSQL Large Object Cleanup

When the Inter Workflow profile uses Database Storage, it stores file payloads as large objects in the database. The system task called Interwf LOCleaner cleans up unused large objects from PostgreSQL or SAP HANA databases. It has optional configuration settings to control how and when the cleanup runs. For details, see Inter Workflow Large Object Cleanup