
The Kafka profile enables you to configure which Kafka topic to use and, when using embedded Kafka, which service key to use.
 
The Kafka profile is loaded when you start a workflow that depends on it. Changes to the profile become effective when you restart the workflow.

Configuration

To create a new Kafka profile configuration, click the New Configuration button in the upper left part of the Desktop window, and then select Kafka Profile from the menu.

The contents of the menus in the menu bar may change depending on which configuration type has been opened. The Kafka profile uses the standard menu items and buttons that are visible for all configurations, and these are described in 2.1 Menus and Buttons.

The Kafka profile configuration contains two tabs: Connectivity and Advanced.

Connectivity tab

The Connectivity tab is displayed by default when creating or opening a Kafka profile.

Kafka profile configuration - Connectivity tab

The Connectivity tab contains the following settings:

Kafka Version

The Kafka agents support Kafka clients 0.8 to 2.4.1. Since there are compatibility issues between versions 0.9 and 0.10, as well as between 2.4.0 and 2.4.1, you have three options: 0.8 - 0.9, 0.10 - 2.4.0, or 2.4.1.

Note!

Authentication with Kerberos and Encryption with SSL, as described in 9.48.2 Kafka Agents Overview, are only available for Kafka version 0.9.

Kafka Topic

Enter the Kafka topic that you want to use for your configuration.

For information on how to create a Kafka topic, refer to MZSH Command Line Tool User's Guide.

Use Embedded Kafka

If you want to use the Kafka Service, i.e. the embedded Kafka, select this check box. When you select this check box, you only need to populate the Kafka Service Key field.

Note!

This option is only available when running Kafka client version 0.8.

Kafka Service Key

If you have selected to use the Kafka Service, you must enter the Kafka Service Key. To determine which service key to use for Kafka Services, refer to MZ_HOME/common/config/cell/default/master/services/custom.conf.

Host

If you are using external Kafka, enter the host name for Zookeeper.

Port

If you are using external Kafka, enter the port for Zookeeper.

Kafka Brokers

A Broker is a node in a Kafka cluster. If you are using external Kafka, you must add Kafka Brokers. Use the Add button to enter the addresses of the Kafka Brokers that you want to connect to.
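
For example, when connecting to an external Kafka cluster, the Connectivity settings could be populated as in the sketch below. The host names are placeholders, and 2181 and 9092 are only the common Zookeeper and Kafka broker defaults:

Host: zookeeper1.example.com
Port: 2181
Kafka Brokers: broker1.example.com:9092, broker2.example.com:9092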

Advanced tab

In the Advanced tab you can configure properties for optimizing the performance of the Kafka Producer and Consumer. The Advanced tab contains two tabs: Producer and Consumer.

Producer tab

In the Producer tab, you can configure the properties of the Kafka forwarding agent. 

Kafka profile configuration - Producer tab in the Advanced tab

The property producer.abortunknown=true sets the agent to abort if the broker replies with Unknown topic or partition. For further information on the other properties, see the text in the Advanced producer properties field, or refer to https://kafka.apache.org.

When running in Acknowledged execution mode, the property producer.full.response determines whether the data sent to the Kafka log is also included in the response UDR. The value is set to true by default. Setting the value to false reduces the memory footprint.
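
As an illustration, assuming that the Advanced producer properties field takes one key=value pair per line, the two properties mentioned above could be entered as follows (the values are only examples):

producer.abortunknown=true
producer.full.response=false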

For information on how to configure the properties for SSL and Kerberos, please refer to https://www.cloudera.com/documentation/kafka/latest/topics/kafka_security.html.

Note!

Once you have edited the JAAS file required for Kerberos, you need to restart the EC for the changes to take effect.
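
For reference, a minimal KafkaClient section in the JAAS file, assuming keytab-based Kerberos authentication, could look like the sketch below; the keytab path and principal are placeholders that must be adapted to your environment:

KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/etc/security/keytabs/kafka_client.keytab"
    principal="kafka-client@EXAMPLE.COM";
};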



Enabling Compression for Kafka

Compression of messages sent to Kafka brokers can be enabled from the Advanced producer properties. The compression codecs follow the standard Kafka library, where Gzip, Lz4, and Snappy are supported.

To enable compression, add the property compression.type to the Advanced producer properties, followed by the value: gzip, lz4, snappy, or none.

Example - Enabling Compression for Kafka using Gzip
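
A sketch of what the Advanced producer properties could contain to enable Gzip compression, assuming one key=value pair per line:

compression.type=gzip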

When using Lz4 compression

When using the Lz4 compression codec, additional steps are required when running the embedded Kafka solution.

First, ensure that the lz4-1.2.0.jar file is located in the $MZ_HOME/3pp/ directory.

Next, run the following topo commands:

mzsh topo set topo://container:<container_name>/pico:sc1/obj:config.classpath '{jars:["lib/picostart.jar", "3pp/lz4-1.2.0.jar"]}'
mzsh topo set topo://container:<container_name>/pico:sc2/obj:config.classpath '{jars:["lib/picostart.jar", "3pp/lz4-1.2.0.jar"]}'
mzsh topo set topo://container:<container_name>/pico:sc3/obj:config.classpath '{jars:["lib/picostart.jar", "3pp/lz4-1.2.0.jar"]}'


Consumer tab

In the Consumer tab, you can configure the properties of the Kafka collection agent.

Kafka profile configuration - Consumer tab in the Advanced tab

See the text in the Advanced consumer properties field for further information about the properties.
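
As an illustration only, assuming that the Advanced consumer properties field also takes one key=value pair per line, standard Apache Kafka consumer properties could be entered as in the sketch below. The property names come from the Apache Kafka client documentation, not from this profile, and the values are only examples:

fetch.min.bytes=1024
max.poll.records=500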

