Loader Kafka
Loader Kafka sends input data as messages to Kafka brokers.
The component can produce two types of message payloads:
- JSON message payloads
- Avro message payloads
The component builds on the confluent-kafka Python package, Confluent's Kafka client for Python, which wraps the C/C++ librdkafka library.
In terms of authentication and communication encryption, the component supports the following modes of operation:
- PLAINTEXT: No authentication and no encryption
- SSL: Authentication and broker communication encryption using SSL/TLS certificates
- SASL_SSL: Authentication using Kerberos (GSSAPI) with SSL/TLS communication encryption
- SASL_PLAINTEXT: Authentication using Kerberos (GSSAPI) without communication encryption
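As a sketch of how these four modes map onto client configuration, the librdkafka-style properties below illustrate each scheme. The broker addresses, certificate paths, and Kerberos service name are placeholders, not values from this document:

```python
# Illustrative librdkafka/confluent-kafka client configurations for each
# supported mode. All hosts, file paths, and service names are placeholders.

plaintext_config = {
    "bootstrap.servers": "broker:9092",
    "security.protocol": "PLAINTEXT",  # no authentication, no encryption
}

ssl_config = {
    "bootstrap.servers": "broker:9093",
    "security.protocol": "SSL",  # TLS certificates for auth and encryption
    "ssl.ca.location": "/path/to/ca.pem",
    "ssl.certificate.location": "/path/to/client.pem",
    "ssl.key.location": "/path/to/client.key",
}

sasl_ssl_config = {
    "bootstrap.servers": "broker:9094",
    "security.protocol": "SASL_SSL",  # Kerberos auth over TLS
    "sasl.mechanism": "GSSAPI",
    "sasl.kerberos.service.name": "kafka",
    "ssl.ca.location": "/path/to/ca.pem",
}

sasl_plaintext_config = {
    "bootstrap.servers": "broker:9092",
    "security.protocol": "SASL_PLAINTEXT",  # Kerberos auth, no encryption
    "sasl.mechanism": "GSSAPI",
    "sasl.kerberos.service.name": "kafka",
}
```

Such a dictionary would typically be passed to the client's producer constructor; the exact parameter names the component exposes for these properties are described in the Parameters section below.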
Visit the relevant section of the documentation, as configuration parameters may have a specific meaning or behavior depending on the message payload type or authentication scheme.
Warning: The topic must exist prior to loading. The component does not create the target topic automatically.
Warning: Schema discovery, and message production based on a discovered schema, are supported only for Avro payloads.
Warning: Avro schemas are supported only for message values, not for message keys.
Warning: The component was tested against the Confluent Kafka Schema Registry. Other registry implementations were not tested but may work.
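Because schemas apply to message values only, lookups against a Confluent-style registry follow the `<topic>-value` subject naming convention. A minimal sketch of that lookup via the registry's REST API; the registry URL and topic are placeholders, and `latest_value_schema` is a hypothetical helper, not part of the component:

```python
import json
import urllib.request


def value_subject(topic: str) -> str:
    # Confluent Schema Registry convention: a topic's value schema is
    # registered under the subject "<topic>-value".
    return f"{topic}-value"


def latest_value_schema(registry_url: str, topic: str) -> dict:
    # Hypothetical helper: fetch the latest registered value schema.
    # The registry returns JSON whose "schema" field is itself a JSON
    # string containing the Avro schema, hence the double decode.
    url = f"{registry_url}/subjects/{value_subject(topic)}/versions/latest"
    with urllib.request.urlopen(url) as resp:
        body = json.loads(resp.read())
    return json.loads(body["schema"])
```

Other registry implementations that expose the same REST endpoints would likely work with the same subject convention, which matches the warning above.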
Data In/Data Out
Data In | The component searches for input files in the directory /data/in/files. The files must be in newline-delimited JSON format (*.ndjson). Each JSON object found in an input file is translated into a single Kafka message. The expected structure of the input JSON object depends on the chosen message payload format. For the Avro message format, the input JSON object must provide the properties required by the Avro schema; the final message is then created according to the discovered schema by populating it with values from the JSON object. |
Data Out | N/A |
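The Data In behavior can be sketched in plain Python: each non-empty line of an .ndjson file yields one message payload, and for Avro payloads the object must supply every field the schema requires. Both `ndjson_messages` and `check_required` are illustrative helpers, not the component's actual code:

```python
import io
import json


def ndjson_messages(stream):
    # One JSON object per non-empty line -> one Kafka message payload.
    for line in stream:
        line = line.strip()
        if line:
            yield json.loads(line)


def check_required(record, avro_schema):
    # Fields without a default must be present in the input JSON object.
    required = {f["name"] for f in avro_schema["fields"] if "default" not in f}
    missing = required - record.keys()
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    return record


# Hypothetical discovered schema: "id" is required, "note" has a default.
schema = {
    "type": "record",
    "name": "User",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "note", "type": "string", "default": ""},
    ],
}

data = io.StringIO('{"id": 1}\n{"id": 2, "note": "hi"}\n')
messages = [check_required(m, schema) for m in ndjson_messages(data)]
# two input lines -> two messages
```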
Learn more about the folder structure here.
Parameters
Connection Settings