The target-snowflake Meltano loader sends data into Snowflake after it has been pulled from a source using an extractor.

Alternative variants #

Multiple variants of target-snowflake are available. This document describes the datamill-co variant.


Getting Started #

Prerequisites #

If you haven't already, follow the initial steps of the Getting Started guide:

  1. Install Meltano
  2. Create your Meltano project
  3. Add an extractor to pull data from a source

Dependencies #

target-snowflake requires the libpq library to be available on your system. If you’ve installed PostgreSQL, you should already have it, but you can also install it by itself using the libpq-dev package on Ubuntu/Debian or the libpq Homebrew formula on macOS.
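For reference, the packages mentioned above can be installed as follows (assuming sudo access on Ubuntu/Debian, and Homebrew on macOS):

sudo apt-get install libpq-dev

brew install libpq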

Installation and configuration #

  1. Add the target-snowflake loader to your project using meltano add:

    meltano add loader target-snowflake --variant datamill-co
  2. Configure the settings below using meltano config.
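
Once the loader is added, you can list every setting Meltano knows about, along with its current value, using the standard meltano config list subcommand:

meltano config target-snowflake list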

Next steps #

Follow the remaining steps of the Getting Started guide:

  1. Run a data integration (EL) pipeline

If you run into any issues, learn how to get help.

Capabilities #

These capabilities can also be overridden by specifying the capabilities key in your meltano.yml file.

Settings #

target-snowflake requires several of the settings below to be configured before it can be used.

The settings for loader target-snowflake that are known to Meltano are documented below. To quickly find the setting you're looking for, use the Table of Contents at the top of the page.

You can override these settings or specify additional ones in your meltano.yml by adding the settings key. Please consider adding any settings you have defined locally to this definition on MeltanoHub by making a pull request to the YAML file that defines the settings for this loader.
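
As a sketch, an override in meltano.yml could look like the following. The my_custom_setting name is a hypothetical placeholder for a setting not yet known to Meltano; replace it with the actual setting you need:

plugins:
  loaders:
    - name: target-snowflake
      variant: datamill-co
      settings:
        # hypothetical definition for a setting Meltano doesn't know about
        - name: my_custom_setting
          kind: string
      config:
        my_custom_setting: some_value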

Account (snowflake_account) #

ACCOUNT might require the region and cloud platform where your account is located, in the form of: <your_account_name>.<region_id>.<cloud> (e.g. xy12345.east-us-2.azure)

Refer to Snowflake’s documentation about Accounts.
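
For example, using the Azure-hosted account format shown above:

meltano config target-snowflake set snowflake_account xy12345.east-us-2.azure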

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set snowflake_account <snowflake_account>

export TARGET_SNOWFLAKE_SNOWFLAKE_ACCOUNT=<snowflake_account>

Username (snowflake_username) #

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set snowflake_username <snowflake_username>

export TARGET_SNOWFLAKE_SNOWFLAKE_USERNAME=<snowflake_username>

Password (snowflake_password) #

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set snowflake_password <snowflake_password>

export TARGET_SNOWFLAKE_SNOWFLAKE_PASSWORD=<snowflake_password>

Role (snowflake_role) #

If not specified, Snowflake will use the user’s default role.

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set snowflake_role <snowflake_role>

export TARGET_SNOWFLAKE_SNOWFLAKE_ROLE=<snowflake_role>

Snowflake Database (snowflake_database) #

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set snowflake_database <snowflake_database>

export TARGET_SNOWFLAKE_SNOWFLAKE_DATABASE=<snowflake_database>

Snowflake Authenticator (snowflake_authenticator) #

Specifies the authentication provider for Snowflake to use. Valid options are the internal one (“snowflake”), a browser session (“externalbrowser”), or Okta (“https://<your_okta_account_name>.okta.com”). See the Snowflake docs for more details.
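
For example, to opt into browser-based authentication (one of the options listed above):

meltano config target-snowflake set snowflake_authenticator externalbrowser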

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set snowflake_authenticator <snowflake_authenticator>

export TARGET_SNOWFLAKE_SNOWFLAKE_AUTHENTICATOR=<snowflake_authenticator>

Warehouse (snowflake_warehouse) #

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set snowflake_warehouse <snowflake_warehouse>

export TARGET_SNOWFLAKE_SNOWFLAKE_WAREHOUSE=<snowflake_warehouse>

Invalid Records Detect (invalid_records_detect) #

Include false in your config to disable crashing on invalid records.

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set invalid_records_detect false

export TARGET_SNOWFLAKE_INVALID_RECORDS_DETECT=false

Invalid Records Threshold (invalid_records_threshold) #

Include a positive value n in your config to allow at most n invalid records per stream before giving up.

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set invalid_records_threshold 0

export TARGET_SNOWFLAKE_INVALID_RECORDS_THRESHOLD=0

Disable Collection (disable_collection) #

Include true in your config to disable Singer Usage Logging: https://github.com/datamill-co/target-snowflake#usage-logging

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set disable_collection true

export TARGET_SNOWFLAKE_DISABLE_COLLECTION=true

Logging Level (logging_level) #

  • Environment variable: TARGET_SNOWFLAKE_LOGGING_LEVEL
  • Options: DEBUG, INFO, WARNING, ERROR, CRITICAL
  • Default: INFO

The level for logging. Set to DEBUG to get things like queries executed, timing of those queries, etc. See Python’s Logger Levels for information about valid values.

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set logging_level DEBUG

export TARGET_SNOWFLAKE_LOGGING_LEVEL=DEBUG

Persist Empty Tables (persist_empty_tables) #

Whether the target should create tables for streams that have no records present in the remote data.

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set persist_empty_tables true

export TARGET_SNOWFLAKE_PERSIST_EMPTY_TABLES=true

Snowflake Schema (snowflake_schema) #

Note that $MELTANO_EXTRACT__LOAD_SCHEMA will expand to the value of the load_schema extra for the extractor used in the pipeline, which defaults to the extractor’s namespace, e.g. tap_gitlab for tap-gitlab. Values are automatically converted to uppercase before they’re passed on to the plugin, so tap_gitlab becomes TAP_GITLAB.
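
As a concrete illustration: in a pipeline using tap-gitlab with no load_schema override, the schema resolves to TAP_GITLAB. To pin the schema to an explicit name instead (the name below is only an example):

meltano config target-snowflake set snowflake_schema analytics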

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set snowflake_schema <snowflake_schema>

export TARGET_SNOWFLAKE_SNOWFLAKE_SCHEMA=<snowflake_schema>

State Support (state_support) #

Whether the Target should emit STATE messages to stdout for further consumption. In this mode, which is on by default, STATE messages are buffered in memory until all the records that occurred before them are flushed according to the batch flushing schedule the target is configured with.

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set state_support false

export TARGET_SNOWFLAKE_STATE_SUPPORT=false

Target S3 Bucket (target_s3.bucket) #

When included, S3 is used to stage files. This setting names the bucket that staging files should be uploaded to.

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set target_s3.bucket <target_s3.bucket>

export TARGET_SNOWFLAKE_TARGET_S3_BUCKET=<target_s3.bucket>

Target S3 Key Prefix (target_s3.key_prefix) #

Prefix for staging file uploads, to allow for better delineation of tmp files.

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set target_s3.key_prefix <target_s3.key_prefix>

export TARGET_SNOWFLAKE_TARGET_S3_KEY_PREFIX=<target_s3.key_prefix>

Target S3 AWS Access Key ID (target_s3.aws_access_key_id) #

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set target_s3.aws_access_key_id <target_s3.aws_access_key_id>

export TARGET_SNOWFLAKE_TARGET_S3_AWS_ACCESS_KEY_ID=<target_s3.aws_access_key_id>

Target S3 AWS Secret Access Key (target_s3.aws_secret_access_key) #

How to use #

Manage this setting using meltano config or an environment variable:

meltano config target-snowflake set target_s3.aws_secret_access_key <target_s3.aws_secret_access_key>

export TARGET_SNOWFLAKE_TARGET_S3_AWS_SECRET_ACCESS_KEY=<target_s3.aws_secret_access_key>
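
Taken together, a sketch of the S3 staging configuration in meltano.yml might look like this, relying on Meltano's convention that nested keys flatten to the dotted setting names above (bucket and prefix are placeholders; keep the AWS credentials in environment variables rather than in the file):

plugins:
  loaders:
    - name: target-snowflake
      variant: datamill-co
      config:
        target_s3:
          bucket: my-staging-bucket
          key_prefix: target-snowflake/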

Troubleshooting #

Error: pg_config executable not found or libpq-fe.h: No such file or directory #

This error message indicates that the libpq dependency is missing.

To resolve this, refer to the “Dependencies” section above.

Looking for help? #

If you're having trouble getting the target-snowflake loader to work, look for an existing issue in its repository, file a new issue, or join the Meltano Slack community and ask for help in the #plugins-general channel.
