
Hitachi Vantara Lumada and Pentaho Documentation

Kafka Producer


The Kafka Producer step allows you to publish messages in near-real-time to a Kafka cluster, where multiple subscribed consumers can access them. A Kafka Producer step publishes a stream of records to a single Kafka topic.

General

Enter the following information in the transformation step name field.

  • Step name: Specifies the unique name of the step on the canvas. The Step name is set to Kafka Producer by default.

Options

The Kafka Producer step features a Setup tab for the Kafka connection and an Options tab for configuration properties. Each tab is described below.

Setup tab


Fill in the following fields.

Fill in the following fields.

  • Connection: Select a connection type:

      • Direct: Specify the Bootstrap servers to which you want to publish the Kafka streaming data.

      • Cluster: Specify the Hadoop cluster configuration from which to retrieve the Kafka connection details. In a Hadoop cluster configuration, you can specify information such as host names and ports for HDFS, Job Tracker, security, and other big data cluster components. Multiple servers can be specified if they are part of the same cluster. For information on Hadoop clusters, see Set up Pentaho to connect to a Hadoop cluster.

  • Client ID: The unique client identifier, passed to the server with each request. It is used to set up a durable connection path and to distinguish between different clients.

  • Topic: The category to which records are published.

  • Key Field: In Kafka, all messages can be keyed, allowing messages to be distributed to partitions based on their keys in the default routing scheme. If no key is present, messages are distributed to partitions randomly.

  • Message Field: The field containing the individual record to publish to the topic.
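As a rough illustration of how the Setup tab fields correspond to standard Kafka producer settings, the sketch below builds a configuration dictionary and mimics key-based partition routing. The configuration keys (`bootstrap.servers`, `client.id`) are standard Kafka producer property names; the hash-mod routing function is a simplification for illustration only, since Kafka's default partitioner actually uses a murmur2 hash of the serialized key.

```python
import random

def build_producer_config(bootstrap_servers, client_id):
    """Translate the step's Direct-connection fields into standard
    Kafka producer configuration keys."""
    return {
        "bootstrap.servers": bootstrap_servers,  # Setup tab: Bootstrap servers
        "client.id": client_id,                  # Setup tab: Client ID
    }

def route_to_partition(key, num_partitions):
    """Pick a partition for a record, mimicking key-based routing.

    Illustrative only: Kafka's default partitioner uses a murmur2 hash
    of the serialized key, not this simple byte-sum scheme.
    """
    if key is None:
        # With no key, records are spread across partitions
        # (round-robin or sticky, depending on client version).
        return random.randrange(num_partitions)
    return sum(key.encode()) % num_partitions

config = build_producer_config("broker1:9092,broker2:9092", "pdi-producer-1")
# Records with the same key always land in the same partition:
same = route_to_partition("order-42", 6) == route_to_partition("order-42", 6)
print(config["client.id"], same)
```

The point of keying is shown in the last line: two records with the same Key Field value are always routed to the same partition, which preserves per-key ordering.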

Options tab


Use this tab to configure additional Kafka producer configuration properties. For more information on these properties, see the Apache Kafka documentation site: https://kafka.apache.org/documentation/.
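For instance, a handful of commonly set producer properties might look like the sketch below. The keys are standard Kafka producer configuration names; the values are illustrative examples, not recommendations.

```python
# Example properties you might enter on the Options tab, expressed as
# key/value pairs. All keys are standard Kafka producer configuration
# names; tune the values for your own workload.
advanced_options = {
    "acks": "all",               # wait for all in-sync replicas to acknowledge
    "compression.type": "gzip",  # compress record batches before sending
    "linger.ms": "5",            # wait up to 5 ms to batch records together
    "batch.size": "16384",       # maximum batch size in bytes
    "retries": "3",              # retry transient send failures
}

for key, value in sorted(advanced_options.items()):
    print(f"{key}={value}")
```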

Metadata injection support

All fields of this step support metadata injection. You can use this step with ETL metadata injection to pass metadata to your transformation at runtime.