Snowflake

target-snowflake (datamill-co variant)

Snowflake database loader

The target-snowflake loader sends data into Snowflake after it has been pulled from a source using an extractor.

Alternate Implementations

Other variants of this loader are available on MeltanoHub; the datamill-co variant is the one documented on this page.

Getting Started

Prerequisites

If you haven't already, follow the initial steps of the Getting Started guide:

  1. Install Meltano
  2. Create your Meltano project

Dependencies

target-snowflake requires the libpq library to be available on your system. If you've installed PostgreSQL, you should already have it, but you can also install it by itself using the libpq-dev package on Ubuntu/Debian or the libpq Homebrew formula on macOS.
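
For example, using the packages named above (commands assume the standard package managers on each platform):

# Ubuntu/Debian
sudo apt-get install libpq-dev

# macOS (Homebrew)
brew install libpq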

Installation and configuration

  1. Add the target-snowflake loader to your project using meltano add:

     meltano add loader target-snowflake --variant datamill-co

  2. Configure the target-snowflake settings using meltano config:

     meltano config target-snowflake set --interactive
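
After the add step, your meltano.yml should contain an entry along these lines (a sketch; the pip_url shown is the plugin's PyPI package name and may differ depending on the Hub definition in use):

plugins:
  loaders:
    - name: target-snowflake
      variant: datamill-co
      pip_url: target-snowflake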

Next steps

If you run into any issues, learn how to get help.
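
Once the loader is installed and configured, the natural next step is to run a pipeline that pairs it with an extractor. A minimal sketch, assuming an extractor such as tap-gitlab is already installed and a recent Meltano version (older versions use meltano elt instead of meltano run):

meltano run tap-gitlab target-snowflake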

Capabilities

This plugin currently has no capabilities defined. If you know the capabilities required by this plugin, please contribute!

Settings

The target-snowflake settings that are known to Meltano are documented below. To quickly find the setting you're looking for, scan the setting headings that follow.

You can also list these settings using meltano config with the list subcommand:

meltano config target-snowflake list

You can override these settings or specify additional ones in your meltano.yml by adding the settings key.
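
For instance, a minimal sketch of what such an override could look like in meltano.yml (the database and warehouse names are placeholders):

plugins:
  loaders:
    - name: target-snowflake
      variant: datamill-co
      config:
        snowflake_database: ANALYTICS    # placeholder
        snowflake_warehouse: LOADER_WH   # placeholder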

Please consider adding any settings you have defined locally to this definition on MeltanoHub by making a pull request to the YAML file that defines the settings for this plugin.

Disable Collection (disable_collection)

  • Environment variable: TARGET_SNOWFLAKE_DISABLE_COLLECTION
  • Default Value: false

Include true in your config to disable Singer Usage Logging: https://github.com/datamill-co/target-snowflake#usage-logging


Configure this setting directly using the following Meltano command:

meltano config target-snowflake set disable_collection [value]
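
For example, to opt out of usage logging:

meltano config target-snowflake set disable_collection true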

Invalid Records Detect (invalid_records_detect)

  • Environment variable: TARGET_SNOWFLAKE_INVALID_RECORDS_DETECT
  • Default Value: true

Include false in your config to disable crashing on invalid records.


Configure this setting directly using the following Meltano command:

meltano config target-snowflake set invalid_records_detect [value]

Invalid Records Threshold (invalid_records_threshold)

  • Environment variable: TARGET_SNOWFLAKE_INVALID_RECORDS_THRESHOLD
  • Default Value: 0

Include a positive value n in your config to allow at most n invalid records per stream before giving up.


Configure this setting directly using the following Meltano command:

meltano config target-snowflake set invalid_records_threshold [value]
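
For example, to tolerate up to 10 invalid records per stream (the number here is arbitrary):

meltano config target-snowflake set invalid_records_threshold 10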

Logging Level (logging_level)

  • Environment variable: TARGET_SNOWFLAKE_LOGGING_LEVEL
  • Default Value: INFO

The level for logging. Set to DEBUG to get things like queries executed, timing of those queries, etc. See Python's Logger Levels for information about valid values.


Configure this setting directly using the following Meltano command:

meltano config target-snowflake set logging_level [value]
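
For example, to enable verbose logging of queries and their timing:

meltano config target-snowflake set logging_level DEBUG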

Persist Empty Tables (persist_empty_tables)

  • Environment variable: TARGET_SNOWFLAKE_PERSIST_EMPTY_TABLES
  • Default Value: false

Whether the target should create tables even for streams that have no records.


Configure this setting directly using the following Meltano command:

meltano config target-snowflake set persist_empty_tables [value]

Account (snowflake_account)

  • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_ACCOUNT

ACCOUNT might require the region and cloud platform where your account is located, in the form of: <your_account_name>.<region_id>.<cloud> (e.g. xy12345.east-us-2.azure).

Refer to Snowflake's documentation about Accounts.


Configure this setting directly using the following Meltano command:

meltano config target-snowflake set snowflake_account [value]
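
For example, using the account identifier format described above (the identifier itself is a placeholder):

meltano config target-snowflake set snowflake_account xy12345.east-us-2.azure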

Snowflake Authenticator (snowflake_authenticator)

  • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_AUTHENTICATOR
  • Default Value: snowflake

Specifies the authentication provider for Snowflake to use. Valid options are the internal one ("snowflake"), a browser session ("externalbrowser"), or Okta ("https://<your_okta_account_name>.okta.com"). See the Snowflake docs for more details.


Configure this setting directly using the following Meltano command:

meltano config target-snowflake set snowflake_authenticator [value]
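
For example, to authenticate through a browser-based SSO session instead of the internal authenticator:

meltano config target-snowflake set snowflake_authenticator externalbrowser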

Snowflake Database (snowflake_database)

  • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_DATABASE
[No description provided.]

Configure this setting directly using the following Meltano command:

meltano config target-snowflake set snowflake_database [value]

Password (snowflake_password)

  • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_PASSWORD
[No description provided.]

Configure this setting directly using the following Meltano command:

meltano config target-snowflake set snowflake_password [value]
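
Since this is a secret, you may prefer to supply it via the environment variable shown above instead of storing it in meltano.yml; for example, export it in your shell (or add the equivalent line, without export, to your project's .env file). The value below is a placeholder:

export TARGET_SNOWFLAKE_SNOWFLAKE_PASSWORD='<your-password>'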

Role (snowflake_role)

  • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_ROLE

If not specified, Snowflake will use the user's default role.


Configure this setting directly using the following Meltano command:

meltano config target-snowflake set snowflake_role [value]

Snowflake Schema (snowflake_schema)

  • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_SCHEMA
  • Default Value: $MELTANO_EXTRACT__LOAD_SCHEMA

Note $MELTANO_EXTRACT__LOAD_SCHEMA will expand to the value of the load_schema extra for the extractor used in the pipeline, which defaults to the extractor's namespace, e.g. tap_gitlab for tap-gitlab. Values are automatically converted to uppercase before they're passed on to the plugin, so tap_gitlab becomes TAP_GITLAB.


Configure this setting directly using the following Meltano command:

meltano config target-snowflake set snowflake_schema [value]
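
For example, to override the default expansion and load into a fixed schema (the schema name is a placeholder):

meltano config target-snowflake set snowflake_schema ANALYTICS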

Username (snowflake_username)

  • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_USERNAME
[No description provided.]

Configure this setting directly using the following Meltano command:

meltano config target-snowflake set snowflake_username [value]

Warehouse (snowflake_warehouse)

  • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_WAREHOUSE
[No description provided.]

Configure this setting directly using the following Meltano command:

meltano config target-snowflake set snowflake_warehouse [value]

State Support (state_support)

  • Environment variable: TARGET_SNOWFLAKE_STATE_SUPPORT
  • Default Value: true

Whether the Target should emit STATE messages to stdout for further consumption. In this mode, which is on by default, STATE messages are buffered in memory until all the records that occurred before them are flushed according to the batch flushing schedule the target is configured with.


Configure this setting directly using the following Meltano command:

meltano config target-snowflake set state_support [value]

Target S3 AWS Access Key ID (target_s3.aws_access_key_id)

  • Environment variable: TARGET_SNOWFLAKE_TARGET_S3_AWS_ACCESS_KEY_ID
[No description provided.]

Configure this setting directly using the following Meltano command:

meltano config target-snowflake set target_s3 aws_access_key_id [value]

Target S3 AWS Secret Access Key (target_s3.aws_secret_access_key)

  • Environment variable: TARGET_SNOWFLAKE_TARGET_S3_AWS_SECRET_ACCESS_KEY
[No description provided.]

Configure this setting directly using the following Meltano command:

meltano config target-snowflake set target_s3 aws_secret_access_key [value]

Target S3 Bucket (target_s3.bucket)

  • Environment variable: TARGET_SNOWFLAKE_TARGET_S3_BUCKET

When included, S3 is used to stage files. This is the bucket that staging files should be uploaded to.


Configure this setting directly using the following Meltano command:

meltano config target-snowflake set target_s3 bucket [value]

Target S3 Key Prefix (target_s3.key_prefix)

  • Environment variable: TARGET_SNOWFLAKE_TARGET_S3_KEY_PREFIX

Prefix for staging file uploads, to allow for better delineation of temporary files.


Configure this setting directly using the following Meltano command:

meltano config target-snowflake set target_s3 key_prefix [value]
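
A sketch of enabling S3 staging by setting the whole target_s3 group in meltano.yml (the bucket and prefix are placeholders; keep the AWS credentials in the environment variables shown above rather than in the file):

plugins:
  loaders:
    - name: target-snowflake
      variant: datamill-co
      config:
        target_s3:
          bucket: my-staging-bucket   # placeholder
          key_prefix: meltano/tmp/    # placeholder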

Troubleshooting

Error: pg_config executable not found or libpq-fe.h: No such file or directory

This error message indicates that the libpq dependency is missing.

To resolve this, refer to the "Dependencies" section above.

Something missing?

This page is generated from a YAML file that you can contribute changes to.

Edit it on GitHub!

Looking for help?

If you're having trouble getting the target-snowflake loader to work, look for an existing issue in its repository, file a new issue, or join the Meltano Slack community and ask for help in the #plugins-general channel.

Install

meltano add loader target-snowflake --variant datamill-co

Repo

https://github.com/datamill-co/target-snowflake

Maintainer

  • Data Mill

Keywords

  • database