The Kafka profile enables you to configure which topic and which embedded service key to use.
The Kafka profile is loaded when you start a workflow that depends on it. Changes to the profile become effective when you restart the workflow.
Configuration
To create a new Kafka profile configuration, click the New Configuration button in the upper left part of the Desktop window, and then select Kafka Profile from the menu.
...
The Kafka profile configuration contains two tabs: Connectivity and Advanced.
Connectivity tab
The Connectivity tab is displayed by default when creating or opening a Kafka profile.
Kafka profile configuration - Connectivity tab
...
The Connectivity tab contains the following settings:
Setting | Description
---|---
Kafka Version | The Kafka agents support Kafka client 0.8 to 2.4.1. Since there are compatibility issues between versions 0.9 and 0.10, as well as between 2.4.0 and 2.4.1, you have three options: 0.8 - 0.9, 0.10 - 2.4.0, or 2.4.1.
Kafka Topic | Enter the Kafka topic that you want to use for your configuration. For information on how to create a Kafka topic, refer to 2.2.11 kafka, or see the example command after this table.
Use Embedded Kafka | Select this check box if you want to use the Kafka Service, which is the Kafka embedded in MediationZone. When you select this check box, you are only required to populate the Kafka Service Key field.
Kafka Service Key | If you have selected to use the Kafka Service, you must complete the Kafka Service Key. To determine which service key to use for Kafka Services, refer to the Kafka Services documentation.
Host | If you are using external Kafka, enter the hostname for Zookeeper.
Port | If you are using external Kafka, enter the port for Zookeeper.
Kafka Brokers | A Broker is a node in a Kafka cluster. If you are using external Kafka, you must add Kafka Brokers. Use the Add button to enter the addresses of the Kafka Brokers that you want to connect to (see the address format in the example after this table).
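As an illustration, a Kafka topic for the profile can be created with the standard command line tool shipped with Kafka; the hostname, port, topic name, and partition and replication values below are placeholders only:

    bin/kafka-topics.sh --create --zookeeper zk-host.example.com:2181 --replication-factor 1 --partitions 1 --topic my_topic

Kafka Broker addresses are entered on the form host:port, for example kafka1.example.com:9092.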
Advanced tab
In the Advanced tab you can configure properties for optimizing the performance of the Kafka Producer and Consumer. The Advanced tab contains two tabs: Producer and Consumer.
Producer tab
In the Producer tab, you can configure the properties of the Kafka forwarding agent.
...
...
Caution! The property producer.abortunknown=true sets the agent to abort if the broker replies with Unknown topic or partition. For further information on the other properties, see the text in the Advanced producer properties field, or refer to https://kafka.apache.org.
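As a sketch, the Advanced producer properties field could for example contain the following entries; apart from producer.abortunknown, the property names are standard Kafka producer properties and the values are illustrative only:

    producer.abortunknown=true
    acks=all
    retries=3
    batch.size=16384
    linger.ms=5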
...
For information on how to configure the properties for SSL and Kerberos, please refer to https://www.cloudera.com/documentation/kafka/latest/topics/kafka_security.html.
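As a rough sketch, SSL and Kerberos are configured with the standard Kafka client security properties in the Advanced producer properties, for example as below; the paths, passwords, and service name are placeholders, and the exact set of properties depends on your cluster setup:

    security.protocol=SASL_SSL
    sasl.kerberos.service.name=kafka
    ssl.truststore.location=/opt/certs/kafka.client.truststore.jks
    ssl.truststore.password=changeit

For Kerberos, the client principal and keytab are typically supplied through a JAAS file referenced by the java.security.auth.login.config system property, or through the sasl.jaas.config client property described in the note further down.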
Note! Once you have edited the JAAS file required for Kerberos, you will need to restart the EC to register the changes made.
...
Note! If you make any changes to the security configuration of the Kafka Producer, any topics used must be recreated before they can be used.
Enabling Compression for Kafka
Compression for messages sent to Kafka brokers can be enabled from the Advanced producer properties. The compression codecs follow the standard Kafka library, where Gzip, Lz4, and Snappy are supported.
To enable compression, add the property compression.type to the Advanced producer properties, followed by the value gzip, lz4, snappy, or none.
Example - Enabling Compression for Kafka using Gzip
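A minimal sketch of the entry to add to the Advanced producer properties:

    compression.type=gzip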
...
Warning! When using the Lz4 compression codec, additional steps are required when running the embedded Kafka solution. First, ensure that the lz4-1.2.0.jar file is located in the $MZ_HOME/3pp/ directory. Next, run the following topo commands:
Consumer tab
In the Consumer tab, you can configure the properties of the Kafka collection agent.
...
See the text in the Advanced consumer properties field for further information about the properties.
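As an illustration, the Advanced consumer properties field could contain entries such as the following; the property names are standard Kafka consumer properties, the values are placeholders only, and the available properties depend on the Kafka client version selected in the profile:

    group.id=my_consumer_group
    auto.offset.reset=earliest
    max.poll.records=500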
...
Note! The sasl.jaas.config client property has been added to the Advanced Producer Properties and Advanced Consumer Properties in the Kafka profile. This property is used to configure SASL authentication directly in the client's properties instead of using a JAAS file. This simplification lets you run multiple clients in the same JVM with different sets of credentials, which is not possible with a JAAS file. You can still use the existing java.security.auth.login.config system property, which points to a JAAS file. However, that option allows only one set of user credentials for all client connections from a JVM, which means that MediationZone users cannot run multiple Kafka workflows against different Kafka brokers on the same EC/ECSA. When both the JAAS configuration system property (java.security.auth.login.config) and the client property (sasl.jaas.config) are specified, the client property is used.
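As a sketch, a sasl.jaas.config entry for SASL/PLAIN could look as follows; the login module depends on the SASL mechanism used, and the credentials are placeholders:

    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="kafka_user" password="kafka_secret";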