
Loader Kafka

Loader Kafka sends input data as messages to Kafka brokers.

The component can produce two types of message payloads:

  • JSON message payloads 

  • Avro message payloads 

The component builds on the Python Kafka client, specifically the C/C++ librdkafka implementation and the confluent-kafka Python module.
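For orientation, a minimal sketch of the underlying client usage (this is not the component's own code; the broker address and topic name are assumptions):

```python
from confluent_kafka import Producer  # Python bindings over librdkafka

# Assumed broker address and topic, for illustration only.
producer = Producer({"bootstrap.servers": "kafka-broker1.meiro.io:9094"})
producer.produce("example-topic", value=b'{"id": 1}')
producer.flush()  # block until all queued messages are delivered
```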

In terms of authentication and communication encryption, the component supports the following modes of operation:

  • PLAINTEXT: No authentication and no encryption

  • SSL: Authentication and broker communication encryption using SSL/TLS certificates 

  • SASL_SSL: Authentication using Kerberos (GSSAPI) with encryption using SSL/TLS certificates

  • SASL_PLAINTEXT: Authentication using Kerberos (GSSAPI) without communication encryption
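For illustration, the modes above correspond roughly to librdkafka-style settings such as the following; the exact properties the component sets, as well as the certificate paths and Kerberos principal shown, are assumptions:

```python
# Hypothetical client settings per security protocol (paths and principal are placeholders).
ssl_config = {
    "bootstrap.servers": "kafka-broker1.meiro.io:9094",
    "security.protocol": "SSL",
    "ssl.ca.location": "/path/to/ca.pem",
    "ssl.certificate.location": "/path/to/client.pem",
    "ssl.key.location": "/path/to/client.key",
}

sasl_ssl_config = {
    "bootstrap.servers": "kafka-broker1.meiro.io:9094",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "GSSAPI",                      # Kerberos authentication
    "sasl.kerberos.service.name": "kafka",
    "sasl.kerberos.principal": "loader@EXAMPLE.COM",
    "sasl.kerberos.keytab": "/path/to/loader.keytab",
    "ssl.ca.location": "/path/to/ca.pem",
}
```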

Visit the relevant section of the documentation, as configuration parameters might have a specific meaning or behavior depending on the message payload type or authentication scheme.

Warning: The topic must exist prior to loading. The component does not create a target topic automatically.

Warning: The component only supports schema discovery and message production for Avro payloads.

Warning: Avro schemas are only supported for message values, not for message keys.

Warning: The component was tested against the Confluent Kafka Schema Registry. Other registry implementations were not tested but may work.
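As a pre-flight check for the topic-existence warning above, the target topic can be verified with the same client library; this is a hedged sketch, and the broker address and topic name are assumptions:

```python
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "kafka-broker1.meiro.io:9094"})
metadata = admin.list_topics(timeout=10)        # fetch cluster metadata
if "example-topic" not in metadata.topics:      # the loader will not create the topic
    raise SystemExit("Topic 'example-topic' does not exist; create it before loading.")
```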

Data In/Data Out

Data In

The component searches for input files in the directory /data/in/files.

 

The files should be in newline-delimited JSON format (*.ndjson). 
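For illustration, an input file might look like this (the field names are hypothetical):

```
{"user_id": "u-1", "event": "purchase", "value": 42}
{"user_id": "u-2", "event": "view", "value": 7}
```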

 

Each input JSON object found in an input file is translated into a single Kafka message.

 

The expected structure of the input JSON object depends on the chosen message payload format.

For the JSON message format, the input JSON object is used directly as the message payload.
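A minimal sketch of this mapping, assuming a file name and topic name for illustration:

```python
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "kafka-broker1.meiro.io:9094"})

# Each ndjson line becomes one message; the JSON object itself is the payload.
with open("/data/in/files/input.ndjson") as f:   # hypothetical file name
    for line in f:
        record = json.loads(line)
        producer.produce("example-topic", value=json.dumps(record).encode("utf-8"))

producer.flush()
```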

 

For the Avro message format, the input JSON object must provide the properties required by the Avro schema. The final message is then created according to the discovered schema by populating it with values from the JSON object.
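A hedged sketch of the Avro path using the Confluent Schema Registry client; the registry URL, subject name, topic name, and record fields are assumptions, and the component's actual schema-discovery logic may differ:

```python
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

registry = SchemaRegistryClient({"url": "https://schema-registry.example.com"})

# Discover the latest value schema registered for the topic (assumed subject name).
schema_str = registry.get_latest_version("example-topic-value").schema.schema_str
avro_serializer = AvroSerializer(registry, schema_str)

producer = Producer({"bootstrap.servers": "kafka-broker1.meiro.io:9094"})

record = {"user_id": "u-1", "event": "purchase"}  # must provide the fields the schema requires
value = avro_serializer(record, SerializationContext("example-topic", MessageField.VALUE))
producer.produce("example-topic", value=value)
producer.flush()
```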

Data Out

N/A

Learn more about the folder structure here.

Parameters

Connection Settings


Server(s)

(required)

List of Kafka brokers the loader should attempt the initial connection with.

If multiple hosts are provided, the loader will attempt to establish a connection in the same order as the brokers are specified.

Format: host:port

Example: kafka-broker1.meiro.io:9094

Security protocol

(required)

Authentication and encryption protocol the loader should use for communication with Kafka brokers.

Possible values:

  • plaintext

  • ssl

  • sasl_ssl

  • sasl_plaintext

Connection log level

Log level for diagnostic logging of the Kafka connection. The value is passed to the librdkafka configuration property debug.

Possible options:

  • Default (normal level of logging)

  • Security (authentication and connection initiation messages)

  • All (full diagnostic logging, very granular)
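For illustration, these options map onto the librdkafka debug property roughly as follows; the exact values the component passes are an assumption:

```python
# Hypothetical producer settings enabling security-related diagnostics.
debug_config = {
    "bootstrap.servers": "kafka-broker1.meiro.io:9094",
    "debug": "security",   # "all" would enable full, very granular diagnostics
}
```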