
The Kafka profile enables you to configure which Kafka topic to use and, when using embedded Kafka, which service key to use.
 
The Kafka profile is loaded when you start a workflow that depends on it. Changes to the profile become effective when you restart the workflow.

Configuration

To create a new Kafka profile configuration, click the New Configuration button in the upper left part of the Desktop window, and then select Kafka Profile from the menu.

The contents of the menus in the menu bar may change depending on which configuration type is open. The Kafka profile uses the standard menu items and buttons that are visible for all configurations, and these are described in 2.1 Menus and Buttons.

The Kafka profile configuration contains two tabs: Connectivity and Advanced.

Connectivity tab

The Connectivity tab is displayed by default when creating or opening a Kafka profile.

Kafka profile configuration - Connectivity tab

The Connectivity tab contains the following settings:

Kafka Version

The Kafka agents support Kafka client versions 0.8 to 2.4.1. Since there are compatibility issues between versions 0.9 and 0.10, as well as between 2.4.0 and 2.4.1, there are three options: 0.8 - 0.9, 0.10 - 2.4.0, or 2.4.1.

Note!

Authentication with Kerberos and encryption with SSL, as described in 9.49.2 Kafka Agents Overview, are only available for Kafka version 0.9.

Kafka Topic

Enter the Kafka topic that you want to use for your configuration.

For information on how to create a Kafka topic, refer to 2.2.11 kafka. For external Kafka, a sketch using the standard Kafka command line tools is shown after these settings.

Use Embedded Kafka

If you want to use the Kafka Service, which is the Kafka embedded in MediationZone, select this check box. When you select this check box, you are only required to populate the Kafka Service Key field.

Note!

This option is only available when running Kafka client version 0.8.

Kafka Service Key

If you have selected Use Embedded Kafka, you must enter the Kafka Service Key. To determine which service key to use for the Kafka Service, refer to MZ_HOME/common/config/cell/default/master/services/custom.conf.

Host

If you are using external Kafka, enter the hostname for Zookeeper.

Port

If you are using external Kafka, enter the port for Zookeeper.

Kafka Brokers

A broker is a node in a Kafka cluster. If you are using external Kafka, you must add Kafka brokers. Use the Add button to enter the addresses of the brokers that you want to connect to; see the sketch below for the typical address format.
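
The sketch below illustrates, for external Kafka, the typical host:port format of broker addresses and how a topic is commonly created with the standard Kafka command line tools. The host names, port, topic name, partition count, and replication factor are assumptions for illustration only; for Kafka versions older than 2.2, kafka-topics.sh uses --zookeeper <host:port> instead of --bootstrap-server.

# Broker addresses as entered in the Kafka Brokers list (host:port)
kafka1.example.com:9092
kafka2.example.com:9092

# Creating a topic on an external cluster with the standard Kafka CLI (Kafka 2.2+)
kafka-topics.sh --create --topic my_topic \
  --bootstrap-server kafka1.example.com:9092 \
  --partitions 3 --replication-factor 2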

Advanced tab

In the Advanced tab you can configure properties for optimizing the performance of the Kafka Producer and Consumer. The Advanced tab contains two tabs: Producer and Consumer.

Producer tab

In the Producer tab, you can configure the properties of the Kafka forwarding agent. 

Kafka profile configuration - Producer tab in the Advanced tab

Setting the property producer.abortunknown=true makes the agent abort if the broker replies with Unknown topic or partition. For further information on the other properties, see the text in the Advanced producer properties field, or refer to https://kafka.apache.org.

When running in Acknowledged execution mode, the property producer.full.response determines whether the data sent to the Kafka log is also included in the response UDR. The value is set to true by default. Setting the value to false reduces the memory footprint.
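
As a minimal sketch, assuming the Advanced producer properties field accepts standard key=value entries with # comments, the two properties described above could be combined as follows; the values shown are examples only:

# Abort the workflow if the broker replies with Unknown topic or partition
producer.abortunknown=true
# In Acknowledged execution mode, do not include the sent data in the response UDR
producer.full.response=false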

For information on how to configure the properties for SSL and Kerberos, please refer to https://www.cloudera.com/documentation/kafka/latest/topics/kafka_security.html.
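
As an illustration only, the standard Kafka client properties below are typically involved when enabling SSL encryption and Kerberos (SASL/GSSAPI) authentication; the exact property set, service name, and file paths depend on your environment and must be taken from the security documentation referenced above:

# Assumed example values - adapt to your own truststore and Kerberos setup
security.protocol=SASL_SSL
sasl.kerberos.service.name=kafka
ssl.truststore.location=/path/to/truststore.jks
ssl.truststore.password=changeit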

Note!

Once you have edited the JAAS file required for Kerberos, you will need to restart the EC to register the changes made.

Note!

If you make any changes to the security configuration of the Kafka Producer, any topics in use must be recreated before they can be used again.


Enabling Compression for Kafka

Compression of messages sent to Kafka brokers can be enabled from the Advanced producer properties. The supported compression codecs follow the standard Kafka library: Gzip, Lz4, and Snappy.

To enable compression, add the property compression.type to the Advanced producer properties, followed by the value gzip, lz4, snappy, or none.

Example - Enabling Compression for Kafka using Gzip
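
Assuming the Advanced producer properties field accepts standard key=value entries, the Gzip example corresponds to a single line:

compression.type=gzip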

When using Lz4 compression

When using the Lz4 compression codec, additional steps are required when running the embedded Kafka solution.

First, ensure that the lz4-1.2.0.jar file is located in the $MZ_HOME/3pp/ directory.
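
For example, a quick check from a shell on the host, assuming a standard installation layout with MZ_HOME set:

ls $MZ_HOME/3pp/lz4-1.2.0.jar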

Next, run the following topo commands:

mzsh topo set topo://container:<container_name>/pico:sc1/obj:config.classpath '{jars:["lib/picostart.jar", "3pp/lz4-1.2.0.jar"]}'
mzsh topo set topo://container:<container_name>/pico:sc2/obj:config.classpath '{jars:["lib/picostart.jar", "3pp/lz4-1.2.0.jar"]}'
mzsh topo set topo://container:<container_name>/pico:sc3/obj:config.classpath '{jars:["lib/picostart.jar", "3pp/lz4-1.2.0.jar"]}'


Consumer tab

In the Consumer tab, you can configure the properties of the Kafka collection agent.

Kafka profile configuration - Consumer tab in the Advanced tab

See the text in the Advanced consumer properties field for further information about the properties.
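
As an illustration only, standard Kafka consumer properties such as the following can typically be entered as key=value lines in the Advanced consumer properties field; the exact property names and allowed values depend on the selected Kafka client version, and the values below are assumptions for the example:

# Consumer group used by the collection agent
group.id=my_consumer_group
# Where to start reading when no committed offset exists (new consumer API)
auto.offset.reset=earliest
# Maximum number of records returned per poll (Kafka 0.10 and later)
max.poll.records=500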


Note!

The sasl.jaas.config client property has been added to the Advanced producer properties and Advanced consumer properties in the Kafka profile.

This new property is used to configure SASL authentication directly in the client's properties instead of using a JAAS file. This simplification lets you run multiple clients in the same JVM by using different sets of credentials, which is not possible with a JAAS file.

You can still use the existing java.security.auth.login.config system property which points to a JAAS file. However, this option allows only one set of user credentials for all client connections from a JVM. This means that MediationZone users won’t be able to run multiple Kafka workflows against different Kafka brokers on the same EC/ECSA. 

When both the JAAS configuration system property (java.security.auth.login.config) and client property (sasl.jaas.config) are specified, the client property will be used.
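
As a hedged sketch, a PLAIN-mechanism entry in the Advanced producer properties or Advanced consumer properties could look like the lines below; the mechanism, login module, username, and password are placeholders that must match your broker's SASL configuration:

# Example only - credentials and mechanism depend on the broker setup
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="client" password="client-secret";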

