The target-snowflake Singer target sends data into a Snowflake data warehouse after it has been pulled from a source using a Singer tap.

Alternative variants

Multiple variants of target-snowflake are available. This document describes the datamill-co variant.

Standalone usage

Install the package using pip:

pip install target-snowflake

For additional instructions, refer to the README in the repository.
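
As a rough sketch of standalone usage (the tap name and config file paths here are hypothetical), a Singer tap is piped into the target, which reads its settings from a JSON config file:

tap-postgres --config tap_config.json | target-snowflake --config config.json

The config.json file would contain the settings documented below.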

Usage with Meltano

Install Meltano, create your Meltano project, and add the target to your project as a loader:

meltano add loader target-snowflake --variant datamill-co

For additional instructions, refer to the Meltano-specific documentation for target-snowflake.
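
As a sketch, settings can then be supplied with meltano config and the pipeline run with meltano elt (the tap name tap-gitlab and the setting values are only examples, and assume that extractor has already been added to the project):

meltano config target-snowflake set snowflake_account xy12345.east-us-2.azure
meltano config target-snowflake set snowflake_username LOADER_USER
meltano elt tap-gitlab target-snowflake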

Capabilities

Settings

snowflake_account

The account identifier might require the region and cloud platform where your account is located, in the form <your_account_name>.<region_id>.<cloud> (e.g. xy12345.east-us-2.azure). Refer to Snowflake’s documentation about account names and URLs: https://docs.snowflake.net/manuals/user-guide/connecting.html#your-snowflake-account-name-and-url

snowflake_username

snowflake_password

snowflake_role

If not specified, Snowflake will use the user’s default role.

snowflake_database

snowflake_authenticator

Specifies the authentication provider for Snowflake to use. Valid options are the internal authenticator (“snowflake”), a browser session (“externalbrowser”), or an Okta endpoint URL (e.g. “https://<your_okta_account_name>.okta.com”). See the Snowflake docs for more details.

snowflake_warehouse

snowflake_schema
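
Taken together, the connection-related settings above could appear in a config file along these lines (all values are placeholders, not working credentials):

{
  "snowflake_account": "xy12345.east-us-2.azure",
  "snowflake_username": "LOADER_USER",
  "snowflake_password": "<password>",
  "snowflake_role": "LOADER_ROLE",
  "snowflake_database": "ANALYTICS",
  "snowflake_authenticator": "snowflake",
  "snowflake_warehouse": "LOADING",
  "snowflake_schema": "PUBLIC"
}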

invalid_records_detect

Include false in your config to disable the target crashing on invalid records.

invalid_records_threshold

Include a positive value n in your config to allow at most n invalid records per stream before giving up.
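
For example, to keep invalid-record detection enabled but tolerate up to 10 invalid records per stream before failing, a config could include the following (the value 10 is arbitrary):

{
  "invalid_records_detect": true,
  "invalid_records_threshold": 10
}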

disable_collection

Include true in your config to disable Singer Usage Logging: https://github.com/datamill-co/target-snowflake#usage-logging

logging_level

The logging level. Set to DEBUG to log details such as the queries executed and their timing.

persist_empty_tables

Whether the target should create tables which have no records present in the remote destination.

state_support

Whether the target should emit STATE messages to stdout for further consumption. In this mode, which is on by default, STATE messages are buffered in memory until all records that preceded them have been flushed according to the target's batch flushing schedule.
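
As an illustration, the behavioral settings above could be combined like this (the values are chosen only for the example):

{
  "disable_collection": true,
  "logging_level": "DEBUG",
  "persist_empty_tables": false,
  "state_support": true
}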

target_s3.bucket

When included, S3 is used to stage files. The bucket to which staging files should be uploaded.

target_s3.key_prefix

Prefix for staging file uploads, to allow for better delineation of temporary files.

target_s3.aws_access_key_id

target_s3.aws_secret_access_key
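
Assuming the dotted names above denote a nested target_s3 object in the config file (check the repository README for the exact shape), S3 staging could be configured along these lines (bucket name, prefix, and credentials are placeholders):

{
  "target_s3": {
    "bucket": "my-staging-bucket",
    "key_prefix": "target-snowflake-tmp/",
    "aws_access_key_id": "<access key id>",
    "aws_secret_access_key": "<secret access key>"
  }
}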

Looking for help?

If you're having trouble getting target-snowflake to work by itself or with Meltano, look for an existing issue in its repository, file a new issue, or join the Meltano Slack community and ask for help in the #plugins-general channel.

Found an issue on this page?

This page is generated from a YAML file that you can contribute changes to! It is also validated against a JSON Schema used for taps and targets.