The target-snowflake loader loads data into Snowflake after it has been pulled from a source using an extractor.
Getting Started
Prerequisites
If you haven't already, follow the initial steps of the Getting Started guide.
Dependencies
target-snowflake requires the libpq library to be available on your system. If you've installed PostgreSQL, you should already have it, but you can also install it by itself using the libpq-dev package on Ubuntu/Debian or the libpq Homebrew formula on macOS.
Installation and configuration
- Add the target-snowflake loader to your project using meltano add
- Configure the target-snowflake settings using meltano config
meltano add loader target-snowflake --variant datamill-co
meltano config target-snowflake set --interactive
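Running the commands above adds an entry to your meltano.yml along these lines (a sketch; the exact pip_url Meltano records may differ in your project):

```yaml
plugins:
  loaders:
    - name: target-snowflake
      variant: datamill-co
      pip_url: target-snowflake
```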
Next steps
Follow the remaining steps of the Getting Started guide.
If you run into any issues, learn how to get help.
Capabilities
This plugin currently has no capabilities defined. If you know the capabilities required by this plugin, please contribute!
Settings
The target-snowflake settings that are known to Meltano are documented below. To quickly find the setting you're looking for, click on any setting name from the list:
disable_collection
invalid_records_detect
invalid_records_threshold
logging_level
persist_empty_tables
snowflake_account
snowflake_authenticator
snowflake_database
snowflake_password
snowflake_role
snowflake_schema
snowflake_username
snowflake_warehouse
state_support
target_s3.aws_access_key_id
target_s3.aws_secret_access_key
target_s3.bucket
target_s3.key_prefix
You can also list these settings using the meltano config list subcommand:
meltano config target-snowflake list
You can override these settings or specify additional ones in your meltano.yml by adding the settings key.
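For example, you can set values under a config key or declare an extra setting under a settings key in meltano.yml (a sketch; the setting name and values below are hypothetical):

```yaml
plugins:
  loaders:
    - name: target-snowflake
      variant: datamill-co
      config:
        snowflake_database: ANALYTICS   # hypothetical value
        snowflake_warehouse: LOADING    # hypothetical value
      settings:
        - name: my_extra_setting        # hypothetical additional setting
```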
Please consider adding any settings you have defined locally to this definition on MeltanoHub by making a pull request to the YAML file that defines the settings for this plugin.
Disable Collection (disable_collection)
- Environment variable: TARGET_SNOWFLAKE_DISABLE_COLLECTION
- Default Value: false

Include true in your config to disable Singer Usage Logging: https://github.com/datamill-co/target-snowflake#usage-logging
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set disable_collection [value]
Invalid Records Detect (invalid_records_detect)
- Environment variable: TARGET_SNOWFLAKE_INVALID_RECORDS_DETECT
- Default Value: true

Include false in your config to disable crashing on invalid records.
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set invalid_records_detect [value]
Invalid Records Threshold (invalid_records_threshold)
- Environment variable: TARGET_SNOWFLAKE_INVALID_RECORDS_THRESHOLD
- Default Value: 0

Include a positive value n in your config to allow at most n invalid records per stream before giving up.
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set invalid_records_threshold [value]
Logging Level (logging_level)
- Environment variable: TARGET_SNOWFLAKE_LOGGING_LEVEL
- Default Value: INFO

The level for logging. Set to DEBUG to get things like queries executed, timing of those queries, etc. See Python's Logger Levels for information about valid values.
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set logging_level [value]
Persist Empty Tables (persist_empty_tables)
- Environment variable: TARGET_SNOWFLAKE_PERSIST_EMPTY_TABLES
- Default Value: false

Whether the target should create tables that have no records present in the remote source.
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set persist_empty_tables [value]
Account (snowflake_account)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_ACCOUNT

ACCOUNT might require the region and cloud platform where your account is located, in the form of <your_account_name>.<region_id>.<cloud> (e.g. xy12345.east-us-2.azure).
Refer to Snowflake's documentation about Accounts.
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set snowflake_account [value]
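In meltano.yml this could look as follows, using the example account locator from above (yours will differ):

```yaml
config:
  snowflake_account: xy12345.east-us-2.azure  # <your_account_name>.<region_id>.<cloud>
```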
Snowflake Authenticator (snowflake_authenticator)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_AUTHENTICATOR
- Default Value: snowflake

Specifies the authentication provider for Snowflake to use. Valid options are the internal one ("snowflake"), a browser session ("externalbrowser"), or an Okta endpoint URL (beginning with "https://").
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set snowflake_authenticator [value]
Snowflake Database (snowflake_database)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_DATABASE
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set snowflake_database [value]
Password (snowflake_password)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_PASSWORD
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set snowflake_password [value]
Role (snowflake_role)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_ROLE
If not specified, Snowflake will use the user's default role.
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set snowflake_role [value]
Snowflake Schema (snowflake_schema)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_SCHEMA
- Default Value: $MELTANO_EXTRACT__LOAD_SCHEMA

Note that $MELTANO_EXTRACT__LOAD_SCHEMA will expand to the value of the load_schema extra for the extractor used in the pipeline, which defaults to the extractor's namespace, e.g. tap_gitlab for tap-gitlab. Values are automatically converted to uppercase before they're passed on to the plugin, so tap_gitlab becomes TAP_GITLAB.
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set snowflake_schema [value]
Username (snowflake_username)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_USERNAME
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set snowflake_username [value]
Warehouse (snowflake_warehouse)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_WAREHOUSE
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set snowflake_warehouse [value]
State Support (state_support)
- Environment variable: TARGET_SNOWFLAKE_STATE_SUPPORT
- Default Value: true

Whether the target should emit STATE messages to stdout for further consumption. In this mode, which is on by default, STATE messages are buffered in memory until all the records that occurred before them are flushed according to the batch flushing schedule the target is configured with.
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set state_support [value]
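The buffering behavior described above can be sketched in a few lines of Python. This is an illustrative model only, not target-snowflake's actual implementation; the class name and batch size are hypothetical:

```python
import json

class StateBuffer:
    """Toy model of a Singer target that holds STATE messages in memory
    until the records received before them have been flushed."""

    def __init__(self, batch_size):
        self.batch_size = batch_size   # stand-in for the batch flushing schedule
        self.pending_records = 0
        self.latest_state = None       # most recent STATE not yet safe to emit
        self.emitted = []              # what would be written to stdout

    def handle(self, message):
        if message["type"] == "RECORD":
            self.pending_records += 1
            if self.pending_records >= self.batch_size:
                self.flush()
        elif message["type"] == "STATE":
            # Buffer rather than emit: records received before this STATE
            # may not have been loaded into the warehouse yet.
            self.latest_state = message["value"]

    def flush(self):
        # In a real target, the pending batch would be loaded into Snowflake here.
        self.pending_records = 0
        if self.latest_state is not None:
            self.emitted.append(json.dumps({"type": "STATE", "value": self.latest_state}))
            self.latest_state = None

buffer = StateBuffer(batch_size=2)
buffer.handle({"type": "RECORD", "stream": "users"})
buffer.handle({"type": "STATE", "value": {"users": 1}})  # buffered, not emitted
buffer.handle({"type": "RECORD", "stream": "users"})     # batch full, flush and emit
print(buffer.emitted)
```

The key point the sketch shows: a STATE message is only written out once a flush has persisted every record that preceded it.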
Target S3 AWS Access Key ID (target_s3.aws_access_key_id)
- Environment variable: TARGET_SNOWFLAKE_TARGET_S3_AWS_ACCESS_KEY_ID
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set target_s3 aws_access_key_id [value]
Target S3 AWS Secret Access Key (target_s3.aws_secret_access_key)
- Environment variable: TARGET_SNOWFLAKE_TARGET_S3_AWS_SECRET_ACCESS_KEY
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set target_s3 aws_secret_access_key [value]
Target S3 Bucket (target_s3.bucket)
- Environment variable: TARGET_SNOWFLAKE_TARGET_S3_BUCKET
When included, use S3 to stage files. Bucket where staging files should be uploaded to.
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set target_s3 bucket [value]
Target S3 Key Prefix (target_s3.key_prefix)
- Environment variable: TARGET_SNOWFLAKE_TARGET_S3_KEY_PREFIX

Prefix for staging file uploads to allow for better delineation of tmp files.
Configure this setting directly using the following Meltano command:
meltano config target-snowflake set target_s3 key_prefix [value]
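Together, the target_s3 settings could look like this in meltano.yml (a sketch; the bucket name and prefix are hypothetical, and credentials are best supplied via environment variables rather than committed to the file):

```yaml
config:
  target_s3:
    bucket: my-staging-bucket   # hypothetical bucket for staging files
    key_prefix: meltano/tmp/    # hypothetical prefix for tmp files
# Supply credentials via the environment instead of meltano.yml:
#   export TARGET_SNOWFLAKE_TARGET_S3_AWS_ACCESS_KEY_ID='...'
#   export TARGET_SNOWFLAKE_TARGET_S3_AWS_SECRET_ACCESS_KEY='...'
```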
Troubleshooting
Error: pg_config executable not found or libpq-fe.h: No such file or directory

This error message indicates that the libpq dependency is missing.
To resolve this, refer to the "Dependencies" section above.
Something missing?
This page is generated from a YAML file that you can contribute changes to.
Edit it on GitHub!
Looking for help?
If you run into any issues, join the Meltano Slack workspace and ask in the #plugins-general channel.