The target-kafka loader sends data into Kafka after it has been pulled from a source using an extractor.
Getting Started
Prerequisites
If you haven't already, follow the initial steps of the Getting Started guide:
Installation and configuration
- Add the target-kafka loader to your project using meltano add:
  meltano add target-kafka
- Configure the target-kafka settings using meltano config:
  meltano config target-kafka set --interactive

Next steps
Follow the remaining steps of the Getting Started guide:
If you run into any issues, learn how to get help.
Capabilities
The current capabilities for
target-kafka
may have been automatically set when originally added to the Hub. Please review the
capabilities when using this loader. If you find they are out of date, please
consider updating them by making a pull request to the YAML file that defines the
capabilities for this loader.
This plugin has the following capabilities:
- about
- schema-flattening
- stream-maps
- structured-logging
- validate-records
You can
override these capabilities or specify additional ones
in your meltano.yml by adding the capabilities key.
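For example, the capabilities listed above could be pinned explicitly in meltano.yml (a sketch following Meltano's plugin definition conventions):

```yaml
# meltano.yml (excerpt): pinning the capabilities shown above
plugins:
  loaders:
    - name: target-kafka
      capabilities:
        - about
        - schema-flattening
        - stream-maps
        - structured-logging
        - validate-records
```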
Settings
The
target-kafka settings that are known to Meltano are documented below. To quickly
find the setting you're looking for, click on any setting name from the list:
- batch_size
- batch_size_rows
- bootstrap_servers
- compression_type
- flattening_max_key_length
- include_sdc_properties
- common_topic
- key_properties
- topic_prefix
You can also list these settings using meltano config with the list subcommand:
meltano config target-kafka list
You can
override these settings or specify additional ones
in your meltano.yml by adding the settings key.
Please consider adding any settings you have defined locally to this definition on MeltanoHub by making a pull request to the YAML file that defines the settings for this plugin.
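For orientation, a minimal loader configuration in meltano.yml might look like the following (all values are illustrative placeholders; any of them can equally be supplied through the environment variables listed below):

```yaml
# meltano.yml (excerpt): illustrative placeholder values
plugins:
  loaders:
    - name: target-kafka
      config:
        bootstrap_servers: localhost:9092
        compression_type: snappy
        batch_size: 100
```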
Batch Size (batch_size)
- Environment variable: TARGET_KAFKA_BATCH_SIZE
- Default Value: 100
Number of records to batch before sending to Kafka.
Configure this setting directly using the following Meltano command:
meltano config target-kafka set batch_size [value]

Batch Size Rows (batch_size_rows)
- Environment variable: TARGET_KAFKA_BATCH_SIZE_ROWS
Maximum number of rows in each batch.
Configure this setting directly using the following Meltano command:
meltano config target-kafka set batch_size_rows [value]

Bootstrap Servers (bootstrap_servers)
- Environment variable: TARGET_KAFKA_BOOTSTRAP_SERVERS

Kafka bootstrap servers (comma-separated, e.g. localhost:9092,localhost:9093).
Configure this setting directly using the following Meltano command:
meltano config target-kafka set bootstrap_servers [value]

Compression Type (compression_type)
- Environment variable: TARGET_KAFKA_COMPRESSION_TYPE
- Default Value: snappy

Kafka message compression.
Configure this setting directly using the following Meltano command:
meltano config target-kafka set compression_type [value]

Max Key Length (flattening_max_key_length)
- Environment variable: TARGET_KAFKA_FLATTENING_MAX_KEY_LENGTH
The maximum length of a flattened key.
Configure this setting directly using the following Meltano command:
meltano config target-kafka set flattening_max_key_length [value]

Include SDC Properties (include_sdc_properties)
- Environment variable: TARGET_KAFKA_INCLUDE_SDC_PROPERTIES
- Default Value: true

Include Meltano-specific metadata (_sdc_extracted_at, _sdc_received_at, etc.).
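For example, with this setting left at its default of true, a published message body might look like the following (the record fields and timestamp values are invented for illustration):

```json
{
  "id": 42,
  "name": "example",
  "_sdc_extracted_at": "2024-01-01T00:00:00+00:00",
  "_sdc_received_at": "2024-01-01T00:00:05+00:00"
}
```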
Configure this setting directly using the following Meltano command:
meltano config target-kafka set include_sdc_properties [value]

Common Topic (common_topic)
- Environment variable: TARGET_KAFKA_COMMON_TOPIC
When set, all streams write to this topic; validation is disabled and each record gets a "stream" field with the source stream name. When unset, each stream uses its name as the topic (with optional topic_prefix).
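For example, to route every stream into a single topic (all_streams is a hypothetical topic name):

```yaml
# meltano.yml (excerpt): "all_streams" is a hypothetical topic name
plugins:
  loaders:
    - name: target-kafka
      config:
        common_topic: all_streams
```

With this in place, a record from a users stream would be published to all_streams with an added "stream": "users" field, instead of to a users topic.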
Configure this setting directly using the following Meltano command:
meltano config target-kafka set common_topic [value]

Key Properties (key_properties)
- Environment variable: TARGET_KAFKA_KEY_PROPERTIES
- Default Value: []

Properties to use as the Kafka message key (an empty list means round-robin partitioning).
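For instance, to key each message by a hypothetical id property, so that records sharing an id are routed to the same partition:

```yaml
# meltano.yml (excerpt): "id" is a hypothetical key property
plugins:
  loaders:
    - name: target-kafka
      config:
        key_properties: [id]
```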
Configure this setting directly using the following Meltano command:
meltano config target-kafka set key_properties [value]

Topic Prefix (topic_prefix)
- Environment variable: TARGET_KAFKA_TOPIC_PREFIX

Prefix to add to all topic names.
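For example, with the prefix below, a stream named users would presumably be written to a topic named meltano_users:

```yaml
# meltano.yml (excerpt)
plugins:
  loaders:
    - name: target-kafka
      config:
        topic_prefix: meltano_
```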
Configure this setting directly using the following Meltano command:
meltano config target-kafka set topic_prefix [value]

SDK Settings
Add Record Metadata (add_record_metadata)
- Environment variable: TARGET_KAFKA_ADD_RECORD_METADATA
Whether to add metadata fields to records.
Configure this setting directly using the following Meltano command:
meltano config target-kafka set add_record_metadata [value]

Faker Locale (faker_config.locale)
- Environment variable: TARGET_KAFKA_FAKER_CONFIG_LOCALE
One or more LCID locale strings to produce localized output for: https://faker.readthedocs.io/en/master/#localization
Configure this setting directly using the following Meltano command:
meltano config target-kafka set faker_config locale [value]

Faker Seed (faker_config.seed)
- Environment variable: TARGET_KAFKA_FAKER_CONFIG_SEED
Value to seed the Faker generator for deterministic output: https://faker.readthedocs.io/en/master/#seeding-the-generator
Configure this setting directly using the following Meltano command:
meltano config target-kafka set faker_config seed [value]

Enable Schema Flattening (flattening_enabled)
- Environment variable: TARGET_KAFKA_FLATTENING_ENABLED
'True' to enable schema flattening and automatically expand nested properties.
Configure this setting directly using the following Meltano command:
meltano config target-kafka set flattening_enabled [value]

Max Flattening Depth (flattening_max_depth)
- Environment variable: TARGET_KAFKA_FLATTENING_MAX_DEPTH
The max depth to flatten schemas.
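To build intuition for how the depth limit interacts with flattening, here is a rough standalone sketch; the real behavior is implemented by the Meltano Singer SDK, and the "__" separator and edge-case handling shown here are assumptions:

```python
# Rough sketch of depth-limited record flattening, for intuition only.
# The real implementation lives in the Meltano Singer SDK; the "__"
# separator and edge-case handling here are assumptions.
def flatten(record: dict, max_depth: int, parent_key: str = "", sep: str = "__") -> dict:
    """Flatten nested dicts up to max_depth levels, joining keys with sep."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict) and max_depth > 0:
            # Still within the depth budget: descend one level.
            items.update(flatten(value, max_depth - 1, new_key, sep))
        else:
            # Depth exhausted (or not a dict): keep the value as-is.
            items[new_key] = value
    return items

record = {"id": 1, "user": {"name": "Ada", "address": {"city": "Paris"}}}
print(flatten(record, max_depth=1))
# {'id': 1, 'user__name': 'Ada', 'user__address': {'city': 'Paris'}}
```

With max_depth=1, the nested user object is expanded one level, while the deeper address object is left intact as a value.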
Configure this setting directly using the following Meltano command:
meltano config target-kafka set flattening_max_depth [value]

Load Method (load_method)
- Environment variable: TARGET_KAFKA_LOAD_METHOD
- Default Value: append-only

The method to use when loading data into the destination. append-only will always write all input records, whether those records already exist or not. upsert will update existing records and insert new records. overwrite will delete all existing records and insert all input records.
Configure this setting directly using the following Meltano command:
meltano config target-kafka set load_method [value]

User Stream Map Configuration (stream_map_config)
- Environment variable: TARGET_KAFKA_STREAM_MAP_CONFIG
User-defined config values to be used within map expressions.
Configure this setting directly using the following Meltano command:
meltano config target-kafka set stream_map_config [value]

Stream Maps (stream_maps)
- Environment variable: TARGET_KAFKA_STREAM_MAPS
Config object for stream maps capability. For more information check out Stream Maps.
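As an illustrative sketch, a stream map could add a derived field or drop a column before records are published; users, full_name, and secret below are hypothetical names, and the expressions follow the Singer SDK's simplified Python syntax:

```yaml
# meltano.yml (excerpt): hypothetical stream and field names
plugins:
  loaders:
    - name: target-kafka
      config:
        stream_maps:
          users:
            full_name: first_name + ' ' + last_name  # derived field
            secret: __NULL__                         # drop this field
```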
Configure this setting directly using the following Meltano command:
meltano config target-kafka set stream_maps [value]

Validate Records (validate_records)
- Environment variable: TARGET_KAFKA_VALIDATE_RECORDS
- Default Value: true
Whether to validate the schema of the incoming streams.
Configure this setting directly using the following Meltano command:
meltano config target-kafka set validate_records [value]

Something missing?
This page is generated from a YAML file that you can contribute changes to.
Edit it on GitHub!

Looking for help?
Join the Meltano Slack workspace and ask in the #plugins-general channel.


