Kafka Consumer

Note!

Before you configure the Kafka Consumer function in Usage Engine Cloud Edition:

  • Ensure that the Kafka cluster is reachable from Usage Engine Cloud Edition.

  • Configure Kafka ACLs so that the client identity has the required TOPIC and GROUP permissions.

For more information, see Prerequisites for using Kafka functions | Kafka Consumer permissions and operations.

The purpose of this collector is to enable easy message collection from Kafka. With the Kafka Consumer, you collect messages from one or more topics on the configured Kafka brokers and process them in your stream. You can also configure the collection method and the batch size. Once the messages are collected, they are converted to readable strings and passed to the next function in JSON format.
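As a rough illustration of the decode step described above, the sketch below turns raw Kafka message bytes into readable strings and emits them as JSON. The sample payloads are hypothetical, and the actual JSON structure produced by Usage Engine may differ:

```python
import json

# Hypothetical raw Kafka records (message values arrive as bytes).
raw_messages = [b'{"user": "alice", "mb": 12}', b'plain text record']

# Decode each message value to a readable string, then emit the
# collected batch as JSON for the next function in the stream.
decoded = [m.decode("utf-8") for m in raw_messages]
payload = json.dumps(decoded)
print(payload)
```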

You configure this function using the following settings. All of these settings are mandatory.

Kafka brokers

Fill in the server host IP and the port number of the Kafka broker you want your stream to collect messages from.

Field

Description

Host

The hostname, IPv4 address, or IPv6 address of the Kafka broker.

Port

The listening port of the Kafka broker. The default port is 9092.

Kafka brokers configuration

You can add additional Kafka brokers by clicking Add broker.

Authentication

Setting

Description

Select from secrets wallet

Note!

This option only appears when the Authentication toggle is enabled.

Select the Secrets wallet radio button to select a secret with the Kafka broker secret type. For more details, see Secrets wallet.

Direct input

Note!

This option only appears when the Authentication toggle is enabled.

Direct input authentication options

Select the Direct input radio button to choose between the two authentication types:

  • PLAIN/SCRAM

    • SASL mechanism - Select the SASL authentication mechanism that the Kafka broker is configured to use.

    • Username - The username used to authenticate the Kafka client with the broker.

    • Password - The password associated with the username used to authenticate the Kafka client with the broker.

Note!

The Password field only accepts a parameter reference, which requires you to use Global parameters.

  • OAuth

    • Host - The hostname or IP address of the OAuth server that issues access tokens.

    • Path - The HTTP path on the OAuth server used to request access tokens.

    • Client ID - The OAuth client identifier that represents this Kafka client application when requesting tokens from the OAuth server.

    • Client Secret - The secret associated with the OAuth client ID, used by the Kafka client to authenticate with the OAuth server when requesting tokens.

Note!

The Client Secret field only accepts a parameter reference, which requires you to use Global parameters.

None

If the Kafka broker does not require authentication, you can disable Authentication by toggling it off.

Note!

SSL is enabled for all authentication types.
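As a rough illustration, the Direct input fields map onto standard Kafka client properties. The sketch below builds a configuration dictionary using librdkafka/confluent-kafka key names; the broker addresses and credentials are hypothetical placeholders, and this is a simplified model rather than the product's internal configuration:

```python
# Hypothetical broker host:port pairs, as entered under Kafka brokers.
brokers = ["broker1.example.com:9092", "broker2.example.com:9092"]

# SASL over SSL, mirroring the PLAIN/SCRAM option in the UI.
# Key names follow the librdkafka/confluent-kafka convention.
config = {
    "bootstrap.servers": ",".join(brokers),
    "security.protocol": "SASL_SSL",    # SSL is enabled for all auth types
    "sasl.mechanism": "SCRAM-SHA-256",  # SASL mechanism: PLAIN, SCRAM-SHA-256, ...
    "sasl.username": "my-user",         # Username field (hypothetical)
    "sasl.password": "my-secret",       # Password field (use a Global parameter)
}
print(config["bootstrap.servers"])
```

For the OAuth option, the Host and Path fields would instead identify the token endpoint (for example, `https://<Host><Path>`) that the client calls with the Client ID and Client Secret to obtain access tokens.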

Kafka topic

Key in the Topic name for the Kafka Consumer to collect the messages from. You may add more than one topic.

Note!

Ensure that you enter the full and correct topic name.

Kafka topic configuration

Collection method

Select the method that the function uses to collect messages from the Kafka topics.

Option

Description

All messages

With All messages selected, the function collects all messages in the Kafka topic, including messages that have already been committed in previous executions. A Consumer group ID is arbitrarily assigned by Usage Engine.

New messages

With New messages selected, the function begins collecting from the last committed message of the Kafka topic. You assign an alphanumeric Consumer group ID. Only one Consumer group ID is allowed.

Note!

The Kafka Consumer can collect messages in Buffer format and supports GZIP or ZSTD compression formats.

Collection method configuration

The diagram below shows an example of how the Kafka Consumer collects messages using either method.

Kafka Consumer offset diagram
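The difference between the two methods can be sketched as a simplified offset model: a fresh, arbitrary consumer group has no committed offset and therefore starts from the beginning of the topic, while a fixed, user-assigned group resumes after the last committed message. The group IDs below are hypothetical, and this is an interpretation of the behavior described above, not the product's actual implementation:

```python
import uuid

def starting_offset(method, committed_offset):
    """Return (group_id, start_offset) for a collection method.

    Simplified model: "All messages" starts at offset 0 under an
    arbitrary group ID; "New messages" resumes from the committed
    offset of a user-assigned group ID.
    """
    if method == "All messages":
        group_id = f"auto-{uuid.uuid4()}"  # arbitrarily assigned by Usage Engine
        return group_id, 0
    if method == "New messages":
        group_id = "my-billing-group"      # hypothetical user-assigned ID
        return group_id, committed_offset
    raise ValueError(f"unknown collection method: {method}")

_, start_all = starting_offset("All messages", committed_offset=120)
_, start_new = starting_offset("New messages", committed_offset=120)
print(start_all, start_new)  # 0 120
```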

Batch size

Set the number of messages for each batch so that the stream commits a fixed number of messages per batch collection. If the topic contains 150 messages and the batch size is set to 100, the stream collects the first 100 messages from the topic and passes them to the subsequent operation. The next 50 messages are then collected in the next batch, and the operation continues.

Batch size configuration

If there are fewer messages than the batch size, all the messages are collected. For example, if the topic has 50 messages and the batch size is 100, all 50 messages are collected, and the stream continues with the next operation.

The default value of the batch size is 100, which is also the minimum value. If you do not enter a value, then the default value will be used.
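The batching behavior described above can be sketched as follows. This is a simplified model of the setting, not the product's implementation; it also applies the rule that 100 is both the default and the minimum batch size:

```python
def collect_in_batches(messages, batch_size=100):
    """Split collected messages into batches, one commit per batch.

    Simplified model of the Batch size setting: 100 is both the
    default and the minimum value.
    """
    batch_size = max(batch_size, 100)  # values below 100 fall back to the minimum
    return [messages[i:i + batch_size] for i in range(0, len(messages), batch_size)]

# 150 messages with batch size 100 -> two batches of 100 and 50 messages.
batches = collect_in_batches(list(range(150)))
print([len(b) for b in batches])  # [100, 50]

# 50 messages with batch size 100 -> a single batch of all 50 messages.
small = collect_in_batches(list(range(50)))
print([len(b) for b in small])  # [50]
```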