Kafka Consumer
Note!
Before you configure the Kafka Consumer function in Usage Engine Cloud Edition:
Ensure that the Kafka cluster is reachable from Usage Engine Cloud Edition.
Configure Kafka ACLs so that the client identity has the required TOPIC and GROUP permissions.
For more information, see Prerequisites for using Kafka functions | Kafka Consumer permissions and operations.
The purpose of this collector is to enable easy message collection from Kafka. With the Kafka Consumer, you collect messages from one or more topics on the configured Kafka brokers and process them in your stream. You can also configure the collection method and the batch size of the messages. Once the messages are collected, they are converted to readable strings and passed to the next function in JSON format.
You configure this function with the following settings. All of these configurations are mandatory.
Kafka Consumer | Kafka brokers to set the Kafka brokers to collect the messages from.
Kafka Consumer | Authentication to set the secret from the Secrets wallet with the credentials of the Kafka broker.
Kafka Consumer | Kafka topic to set the topic to collect the messages from.
Kafka Consumer | Collection method to define how you want the Kafka Consumer to collect the messages.
Kafka Consumer | Batch size to set the number of messages to collect per batch.
Kafka brokers
Fill in the host and the port number of the Kafka broker that you want your stream to collect messages from.
| Field | Description |
|---|---|
| Host | The hostname, IPv4, or IPv6 address of the Kafka broker. |
| Port | The listening port of the Kafka broker. The default port is 9092. |
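Outside Usage Engine, a broker list like this corresponds to a standard Kafka client's bootstrap-servers string. A minimal sketch in Python (the hostnames are hypothetical examples, not real brokers):

```python
# Build a Kafka bootstrap-servers string from host/port pairs,
# mirroring the Host and Port fields above.
brokers = [
    ("kafka-1.example.com", 9092),  # default Kafka listener port
    ("kafka-2.example.com", 9092),  # added via "Add broker"
]

# Kafka clients accept a comma-separated "host:port" list.
bootstrap_servers = ",".join(f"{host}:{port}" for host, port in brokers)
print(bootstrap_servers)  # kafka-1.example.com:9092,kafka-2.example.com:9092
```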
You can add additional Kafka brokers by clicking Add broker.
Authentication
| Setting | Description |
|---|---|
| Select from secrets wallet | Note! This option only appears when the Authentication toggle is enabled. Select the Secrets wallet radio button to select a secret with the Kafka broker secret type. For more details, see Secrets wallet. |
| Direct input | Note! This option only appears when the Authentication toggle is enabled. Select the Direct input radio button to choose between the two authentication types. Note! The Password field only accepts a parameter reference and requires you to use Global parameters. Note! The Client Secret field only accepts a parameter reference and requires you to use Global parameters. |
| None | If the Kafka broker does not require authentication, you can disable Authentication by toggling it off. Note! SSL is enabled for all authentication types. |
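In standard Kafka client terms, "SSL enabled for all authentication types" means the connection is always encrypted, with SASL credentials layered on top when authentication is used. The sketch below uses common client option names (as in confluent-kafka), not Usage Engine's internal settings, and assumes username/password maps to SASL/PLAIN:

```python
def client_config(bootstrap, username=None, password=None):
    """Sketch: build a Kafka client config where SSL is always on,
    matching the note above. Option names follow common Kafka client
    conventions and are illustrative only."""
    conf = {
        "bootstrap.servers": bootstrap,
        "security.protocol": "SSL",  # Authentication toggled off: SSL only
    }
    if username is not None and password is not None:
        conf.update({
            "security.protocol": "SASL_SSL",  # credentials over SSL
            "sasl.mechanism": "PLAIN",        # assumption: password-based auth
            "sasl.username": username,
            "sasl.password": password,
        })
    return conf
```

In practice the credentials would come from the secret or Global parameter rather than being hard-coded.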
Kafka topic
Key in the Topic name for the Kafka Consumer to collect the messages from. You may add more than one topic.
Note!
Ensure that you enter the full and correct topic name.
Collection method
Provides options for how messages are collected from the Kafka topics.
Option | Description |
|---|---|
All messages | With All messages selected, the function collects all messages in the Kafka topic, including the messages that have already been committed in previous executions. A Consumer group ID is arbitrarily assigned by Usage Engine. |
New messages | With New messages selected, the function begins collecting from the last committed message of the Kafka topic. You assign an alphanumeric Consumer group ID. Only one Consumer group ID is allowed. |
Note!
The Kafka Consumer can collect messages in Buffer format and supports GZIP or ZSTD compression formats.
The diagram below is an example of how the Kafka Consumer collects messages using either method.
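In standard Kafka terms, the two methods map roughly to consumer-group and offset-reset behaviour. The sketch below uses common client option names rather than Usage Engine internals, and a `uuid4` stands in for the arbitrarily assigned group ID:

```python
import uuid

def consumer_config(method, group_id=None):
    """Sketch: map each collection method to Kafka consumer options."""
    if method == "All messages":
        # A fresh, arbitrary group ID has no committed offsets, so
        # reading from "earliest" collects every message in the topic.
        return {
            "group.id": f"ue-{uuid.uuid4()}",
            "auto.offset.reset": "earliest",
        }
    if method == "New messages":
        # A fixed, user-assigned group ID resumes from that group's
        # last committed offset.
        return {
            "group.id": group_id,
            "auto.offset.reset": "latest",
        }
    raise ValueError(f"unknown collection method: {method}")
```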
Batch size
Set the number of messages for each batch so that the stream commits a fixed number of messages per batch collection. If the topic contains 150 messages and the batch size is set to 100, the stream collects the first 100 messages from the topic and passes them on to the subsequent operation. The remaining 50 messages are then collected in the next batch, and so on.
If there are fewer messages than the batch size, all the messages are collected. For example, if the topic has 50 messages and the batch size is 100, all 50 messages are collected, and the stream continues with the next operation.
The default value of the batch size is 100, which is also the minimum value. If you do not enter a value, the default value is used.
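The batching behaviour described above can be sketched in plain Python (illustrative only; Usage Engine handles this internally):

```python
def batches(messages, batch_size=100):
    """Yield successive batches of at most batch_size messages.
    The final batch may be smaller, as described above."""
    for start in range(0, len(messages), batch_size):
        yield messages[start:start + batch_size]

# A topic with 150 messages and a batch size of 100 yields
# one batch of 100 followed by one batch of 50.
topic = [f"msg-{i}" for i in range(150)]
print([len(b) for b in batches(topic)])  # [100, 50]
```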