...

Assume that you have a batch use case in which you collect files and need to perform duplicate checks and aggregation. You want the solution to scale out so that processing times stay low during periods of high usage. A batch scaling solution of this kind requires two to three workflows; in this example, we use three.

...

  1. The File collection workflow(s) manage the InterWF (inter-workflow) partitions and use an ID field (e.g. customer ID) to determine which shard/partition a UDR belongs to; a sketch of this ID-based partition assignment is shown after this list.

  2. The number of partitions created is determined by the Max Scale Factor parameter. This is configured in ….
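Conceptually, the mapping from the ID field to an InterWF partition is a stable hash of the ID modulo the number of partitions given by Max Scale Factor. The sketch below only illustrates that idea; the class name, method names, and example values are assumptions made for this example and do not correspond to product configuration.

```java
// Illustrative only: hash-based assignment of a UDR to an InterWF partition.
// The field name (customer ID) and the Max Scale Factor value are assumptions.
public final class PartitionAssigner {

    private final int maxScaleFactor;

    public PartitionAssigner(int maxScaleFactor) {
        // Max Scale Factor determines how many InterWF partitions are created.
        this.maxScaleFactor = maxScaleFactor;
    }

    /** Returns the partition (0 .. maxScaleFactor-1) that a UDR with this ID belongs to. */
    public int partitionFor(String idField) {
        // A stable hash ensures the same ID always lands in the same partition.
        return Math.floorMod(idField.hashCode(), maxScaleFactor);
    }

    public static void main(String[] args) {
        PartitionAssigner assigner = new PartitionAssigner(8); // e.g. Max Scale Factor = 8
        System.out.println(assigner.partitionFor("customer-1001")); // same ID -> same partition
        System.out.println(assigner.partitionFor("customer-1002"));
    }
}
```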

...

  1. The Aggregation workflow(s) collect data from an inter-workflow topic and use a separate topic for aggregation session storage, as sketched below.
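As a conceptual sketch of this data flow, the example below consumes UDRs from an inter-workflow topic, aggregates them per ID in memory, and writes the session state to a separate session storage topic. The topic names (interwf-udrs, aggregation-sessions), broker address, and serialization are assumptions made for the example; in practice this flow is handled by the configured workflow agents.

```java
// Conceptual sketch only: an aggregation step that reads UDRs from an inter-workflow
// topic and keeps its session state in a separate session storage topic.
import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.clients.producer.*;
import org.apache.kafka.common.serialization.*;

import java.time.Duration;
import java.util.*;

public class AggregationSketch {
    public static void main(String[] args) {
        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        consumerProps.put("group.id", "aggregation-workflow");
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());

        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());

        Map<String, Long> sessions = new HashMap<>(); // in-memory aggregation state per ID

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
             KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {

            consumer.subscribe(Collections.singletonList("interwf-udrs")); // hypothetical topic

            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Aggregate per ID field (the record key, e.g. customer ID).
                    sessions.merge(record.key(), 1L, Long::sum);
                }
                // Persist updated session state to the separate session storage topic.
                sessions.forEach((id, count) -> producer.send(
                        new ProducerRecord<>("aggregation-sessions", id, count.toString())));
            }
        }
    }
}
```

Keying both topics by the same ID field keeps all UDRs for a given ID in one partition, so each aggregation workflow instance sees a complete view of its own sessions.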

...

Prerequisites for Kafka?

Are there any prerequisites for configuring batch scaling with Kafka storage?

...