DuckLake

target-ducklake (definite variant)

DuckLake is an integrated data lake and catalog format.

The target-ducklake loader sends data into DuckLake after it has been pulled from a source using an extractor.

Getting Started

Prerequisites

If you haven't already, follow the initial steps of the Getting Started guide:

  1. Install Meltano
  2. Create your Meltano project

Installation and configuration

  1. Add the target-ducklake loader to your project using meltano add:

     meltano add target-ducklake

  2. Configure the target-ducklake settings using meltano config:

     meltano config target-ducklake set --interactive
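Once installed, the loader is registered in your meltano.yml. As a sketch (all config values below are illustrative placeholders; the individual settings are documented in the Settings section):

```yaml
# meltano.yml -- sketch with placeholder values
plugins:
  loaders:
    - name: target-ducklake
      variant: definite
      config:
        # connection string for the DuckLake catalog database (placeholder)
        catalog_url: postgresql://user:password@localhost:5432/ducklake_catalog
        # where table data files are written (placeholder)
        data_path: s3://my-bucket/ducklake/
        storage_type: S3
```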

Next steps

If you run into any issues, learn how to get help.

Capabilities

The current capabilities for target-ducklake may have been automatically set when originally added to the Hub. Please review the capabilities when using this loader. If you find they are out of date, please consider updating them by making a pull request to the YAML file that defines the capabilities for this loader.

This plugin has the following capabilities:

  • about
  • schema-flattening
  • stream-maps
  • validate-records

You can override these capabilities or specify additional ones in your meltano.yml by adding the capabilities key.
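For example, a minimal sketch of pinning the capabilities in meltano.yml (listing the same capabilities documented above):

```yaml
plugins:
  loaders:
    - name: target-ducklake
      capabilities:
        - about
        - schema-flattening
        - stream-maps
        - validate-records
```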

Settings

The target-ducklake settings that are known to Meltano are documented below.

You can also list these settings using meltano config with the list subcommand:

meltano config target-ducklake list

You can override these settings or specify additional ones in your meltano.yml by adding the settings key.
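As a sketch, a locally defined setting in meltano.yml might look like this (my_custom_setting is a hypothetical name used only to show the shape of the settings key):

```yaml
plugins:
  loaders:
    - name: target-ducklake
      settings:
        - name: my_custom_setting  # hypothetical setting name
          kind: string
      config:
        my_custom_setting: some-value
```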

Please consider adding any settings you have defined locally to this definition on MeltanoHub by making a pull request to the YAML file that defines the settings for this plugin.

Auto Cast Timestamps (auto_cast_timestamps)

  • Environment variable: TARGET_DUCKLAKE_AUTO_CAST_TIMESTAMPS
  • Default Value: false

When True, automatically attempts to cast timestamp-like fields to timestamp types in DuckLake.


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set auto_cast_timestamps [value]

Batch Size Rows (batch_size_rows)

  • Environment variable: TARGET_DUCKLAKE_BATCH_SIZE_ROWS

Maximum number of rows in each batch.


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set batch_size_rows [value]

Catalog URL (catalog_url)

  • Environment variable: TARGET_DUCKLAKE_CATALOG_URL

URL connection string to your catalog database


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set catalog_url [value]

Data Path (data_path)

  • Environment variable: TARGET_DUCKLAKE_DATA_PATH

GCS, S3, or local folder path for data storage


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set data_path [value]

Flattening Max Level (flatten_max_level)

  • Environment variable: TARGET_DUCKLAKE_FLATTEN_MAX_LEVEL
  • Default Value: 0

Maximum depth for flattening nested fields. Set to 0 to disable flattening.


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set flatten_max_level [value]

Max Batch Size (max_batch_size)

  • Environment variable: TARGET_DUCKLAKE_MAX_BATCH_SIZE
  • Default Value: 10000

Maximum number of records to process in a single batch


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set max_batch_size [value]

Partition Fields (partition_fields)

  • Environment variable: TARGET_DUCKLAKE_PARTITION_FIELDS

Object mapping stream names to arrays of partition column definitions. Each stream key maps directly to an array of column definitions.


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set partition_fields [value]
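As an illustration, partitioning a hypothetical users stream by a created_at column might be configured like this (the stream name, column name, and column-definition keys are assumptions, since the exact definition shape is not documented on this page):

```yaml
# Sketch only -- verify the column-definition shape against the
# target-ducklake repository before relying on it.
plugins:
  loaders:
    - name: target-ducklake
      config:
        partition_fields:
          users:                       # hypothetical stream name
            - column_name: created_at  # hypothetical partition column
```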

Public Key (public_key)

  • Environment variable: TARGET_DUCKLAKE_PUBLIC_KEY

Public key for private GCS and S3 storage authentication (optional)


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set public_key [value]

Region (region)

  • Environment variable: TARGET_DUCKLAKE_REGION

AWS region for S3 storage type (required when using S3 with explicit credentials)


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set region [value]

Secret Key (secret_key)

  • Environment variable: TARGET_DUCKLAKE_SECRET_KEY

Secret key for private GCS and S3 storage authentication (optional)


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set secret_key [value]

Storage Type (storage_type)

  • Environment variable: TARGET_DUCKLAKE_STORAGE_TYPE
  • Default Value: local

Type of storage: GCS, S3, or local


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set storage_type [value]

Target Schema Prefix (target_schema_prefix)

  • Environment variable: TARGET_DUCKLAKE_TARGET_SCHEMA_PREFIX

Prefix to add to the target schema name. If not provided, no prefix is added. This may be useful when the target schema name is inferred from the stream name and you want to namespace the resulting schemas.


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set target_schema_prefix [value]

Temporary File Directory (temp_file_dir)

  • Environment variable: TARGET_DUCKLAKE_TEMP_FILE_DIR
  • Default Value: temp_files/

Directory path for storing temporary parquet files


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set temp_file_dir [value]
SDK Settings

Add Singer Record Metadata (add_record_metadata)

  • Environment variable: TARGET_DUCKLAKE_ADD_RECORD_METADATA
  • Default Value: false

When True, automatically adds Singer Data Capture (SDC) metadata columns to target tables.


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set add_record_metadata [value]

Default Target Schema Name (default_target_schema)

  • Environment variable: TARGET_DUCKLAKE_DEFAULT_TARGET_SCHEMA

Default database schema where data should be written. If not provided, the schema is derived from the stream name where possible (e.g. database taps include the schema name in the stream name).


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set default_target_schema [value]

Faker Locale (faker_config.locale)

  • Environment variable: TARGET_DUCKLAKE_FAKER_CONFIG_LOCALE

One or more LCID locale strings to produce localized output for: https://faker.readthedocs.io/en/master/#localization


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set faker_config locale [value]

Faker Seed (faker_config.seed)

  • Environment variable: TARGET_DUCKLAKE_FAKER_CONFIG_SEED

Value to seed the Faker generator for deterministic output: https://faker.readthedocs.io/en/master/#seeding-the-generator


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set faker_config seed [value]

Enable Schema Flattening (flattening_enabled)

  • Environment variable: TARGET_DUCKLAKE_FLATTENING_ENABLED

'True' to enable schema flattening and automatically expand nested properties.


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set flattening_enabled [value]

Max Flattening Depth (flattening_max_depth)

  • Environment variable: TARGET_DUCKLAKE_FLATTENING_MAX_DEPTH

The max depth to flatten schemas.


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set flattening_max_depth [value]

Load Method (load_method)

  • Environment variable: TARGET_DUCKLAKE_LOAD_METHOD
  • Default Value: append-only

The method to use when loading data into the destination. append-only always writes all input records, whether or not a record already exists. upsert updates existing records and inserts new ones. overwrite deletes all existing records and then inserts all input records.


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set load_method [value]
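For example, switching to upsert in meltano.yml (a sketch; upsert generally requires that the incoming stream declares a primary key so existing records can be matched):

```yaml
plugins:
  loaders:
    - name: target-ducklake
      config:
        load_method: upsert
```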

User Stream Map Configuration (stream_map_config)

  • Environment variable: TARGET_DUCKLAKE_STREAM_MAP_CONFIG

User-defined config values to be used within map expressions.


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set stream_map_config [value]

Stream Maps (stream_maps)

  • Environment variable: TARGET_DUCKLAKE_STREAM_MAPS

Config object for stream maps capability. For more information check out Stream Maps.


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set stream_maps [value]
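As an illustration, a stream map that derives one field and masks another might look like this in meltano.yml (the customers stream and its field names are hypothetical; the expression syntax follows the Meltano SDK stream maps feature):

```yaml
plugins:
  loaders:
    - name: target-ducklake
      config:
        stream_maps:
          customers:                # hypothetical stream name
            email_hash: md5(email)  # derived column from an expression
            ssn: __NULL__           # null out a sensitive field
```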

Validate Records (validate_records)

  • Environment variable: TARGET_DUCKLAKE_VALIDATE_RECORDS
  • Default Value: true

Whether to validate the schema of the incoming streams.


Configure this setting directly using the following Meltano command:

meltano config target-ducklake set validate_records [value]

Something missing?

This page is generated from a YAML file that you can contribute changes to.

Edit it on GitHub!

Looking for help?

If you're having trouble getting the target-ducklake loader to work, look for an existing issue in its repository, file a new issue, or join the Meltano Slack community and ask for help in the #plugins-general channel.

Maintenance Status

  • Built with the Meltano SDK

Repo

https://github.com/definite-app/target-ducklake

Maintainer

  • Definite

Keywords

  • meltano_sdk