Snowflake (datamill-co variant)

Table of Contents
- Alternative variants
- Standalone usage
- Usage with Meltano
- Capabilities
- Settings
  - snowflake_account
  - snowflake_username
  - snowflake_password
  - snowflake_role
  - snowflake_database
  - snowflake_authenticator
  - snowflake_warehouse
  - snowflake_schema
  - invalid_records_detect
  - invalid_records_threshold
  - disable_collection
  - logging_level
  - persist_empty_tables
  - state_support
  - target_s3.bucket
  - target_s3.key_prefix
  - target_s3.aws_access_key_id
  - target_s3.aws_secret_access_key
- Looking for help?
The target-snowflake Singer target sends data into Snowflake after it was pulled from a source using a Singer tap.
Alternative variants #
Multiple variants of target-snowflake are available. This document describes the datamill-co variant.
Alternative variants are:
- transferwise (default)
- meltano
Standalone usage #
Install the package using pip:
pip install target-snowflake
For additional instructions, refer to the README in the repository.
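As with any Singer target, target-snowflake reads messages on stdin and is configured through a JSON file. A minimal config sketch using the settings documented below (all values are placeholders; adjust to your account):

```json
{
  "snowflake_account": "xy12345.east-us-2.azure",
  "snowflake_username": "LOADER_USER",
  "snowflake_password": "********",
  "snowflake_database": "ANALYTICS",
  "snowflake_warehouse": "LOADING",
  "snowflake_schema": "PUBLIC"
}
```

It would typically be used by piping a tap into the target, e.g. `tap-postgres --config tap_config.json | target-snowflake --config target_config.json` (the tap name here is illustrative).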
Usage with Meltano #
Install Meltano, create your Meltano project, and add the target to your project as a loader:
meltano add loader target-snowflake --variant datamill-co
For additional instructions, refer to the Meltano-specific documentation for target-snowflake.
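Once added, the loader can be configured in your project's meltano.yml. A minimal sketch using the settings documented below (all values are placeholders):

```yaml
plugins:
  loaders:
    - name: target-snowflake
      variant: datamill-co
      config:
        snowflake_account: xy12345.east-us-2.azure
        snowflake_username: LOADER_USER
        snowflake_database: ANALYTICS
        snowflake_warehouse: LOADING
        snowflake_schema: PUBLIC
```

Sensitive values such as snowflake_password are better supplied via the environment or `meltano config target-snowflake set` than committed to meltano.yml.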
Capabilities #
Settings #
snowflake_account
#
ACCOUNT might require the region and cloud platform where your account is located, in the form <your_account_name>.<region_id>.<cloud> (e.g. xy12345.east-us-2.azure). Refer to Snowflake's documentation about account names: https://docs.snowflake.net/manuals/user-guide/connecting.html#your-snowflake-account-name-and-url
snowflake_username
#
snowflake_password
#
snowflake_role
#
If not specified, Snowflake will use the user’s default role.
snowflake_database
#
snowflake_authenticator
#
- Default: snowflake
Specifies the authentication provider for Snowflake to use. Valid options are the internal authenticator ("snowflake"), a browser session ("externalbrowser"), or Okta (the https:// URL of your Okta endpoint).
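For example, to authenticate through a browser-based SSO session rather than the internal authenticator, the config would include (sketch):

```json
{
  "snowflake_authenticator": "externalbrowser"
}
```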
snowflake_warehouse
#
snowflake_schema
#
invalid_records_detect
#
- Default: true
Include false in your config to disable crashing on invalid records.
invalid_records_threshold
#
- Default: 0
Include a positive value n in your config to allow at most n invalid records per stream before giving up.
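The two settings above work together. A sketch that keeps invalid-record detection on but tolerates up to 5 invalid records per stream before the target gives up:

```json
{
  "invalid_records_detect": true,
  "invalid_records_threshold": 5
}
```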
disable_collection
#
- Default: false
Include true in your config to disable Singer Usage Logging: https://github.com/datamill-co/target-snowflake#usage-logging
logging_level
#
- Default: INFO
The level for logging. Set to DEBUG to get details such as the queries executed and their timing.
persist_empty_tables
#
- Default: false
Whether the target should create tables that have no records present in the remote.
state_support
#
- Default: true
Whether the target should emit STATE messages to stdout for further consumption. In this mode, which is on by default, STATE messages are buffered in memory until all the records that preceded them have been flushed according to the target's configured batch flushing schedule.
target_s3.bucket
#
When included, S3 is used to stage files. The bucket to which staging files should be uploaded.
target_s3.key_prefix
#
Prefix for staging file uploads, allowing better delineation of temporary files.
target_s3.aws_access_key_id
#
target_s3.aws_secret_access_key
#
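Taken together, the target_s3.* settings form a nested object in the target's JSON config. A sketch for enabling S3 staging (bucket name, prefix, and credentials are placeholders):

```json
{
  "target_s3": {
    "bucket": "my-staging-bucket",
    "key_prefix": "target-snowflake-tmp/",
    "aws_access_key_id": "<your-access-key-id>",
    "aws_secret_access_key": "<your-secret-access-key>"
  }
}
```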
Looking for help? #
If you're having trouble getting target-snowflake to work by itself or with Meltano, look for an existing issue in its repository, file a new issue, or join the Meltano Slack community and ask for help in the #plugins-general channel.
Found an issue on this page? #
This page is generated from a YAML file that you can contribute changes to! It is also validated against a JSON Schema used for taps and targets.