The target-postgres Singer target sends data into a PostgreSQL database after it has been pulled from a source using a Singer tap.

Alternative variants

Multiple variants of target-postgres are available. This document describes the datamill-co variant.

Standalone usage

Install the package using pip:

pip install singer-target-postgres

For additional instructions, refer to the README in the repository.
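
For illustration, a minimal standalone setup might look like the following. The connection values are hypothetical placeholders, and `tap-example` stands in for whichever Singer tap you are using:

```shell
# Write a minimal target-postgres config file.
# All values below are placeholders -- substitute your own connection details.
cat > config.json <<'EOF'
{
  "postgres_host": "localhost",
  "postgres_port": 5432,
  "postgres_database": "warehouse",
  "postgres_username": "loader",
  "postgres_password": "secret",
  "postgres_schema": "public"
}
EOF

# Sanity-check that the file is valid JSON.
python3 -m json.tool config.json > /dev/null && echo "config.json OK"

# A Singer tap's output is then piped into the target:
#   tap-example --config tap_config.json | target-postgres --config config.json
```

The last (commented) line shows the standard Singer pattern: the tap writes SCHEMA, RECORD, and STATE messages to stdout, and the target consumes them on stdin.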

Usage with Meltano

Install Meltano, create your Meltano project, and add the target to your project as a loader:

meltano add loader target-postgres --variant datamill-co

For additional instructions, refer to the Meltano-specific documentation for target-postgres.
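
As a sketch of how the loader can then be configured in a Meltano project (setting names come from the list below; all values here are hypothetical):

```shell
# Meltano derives environment variables for plugin settings from the
# plugin name and setting name; for target-postgres, the postgres_host
# setting becomes TARGET_POSTGRES_POSTGRES_HOST, and so on.
export TARGET_POSTGRES_POSTGRES_HOST=localhost
export TARGET_POSTGRES_POSTGRES_PORT=5432
export TARGET_POSTGRES_POSTGRES_DATABASE=warehouse
export TARGET_POSTGRES_POSTGRES_USERNAME=loader
export TARGET_POSTGRES_POSTGRES_PASSWORD=secret

# Non-sensitive settings can also be stored in the project itself, e.g.:
#   meltano config target-postgres set postgres_host localhost
# and a pipeline run end to end with:
#   meltano run tap-example target-postgres
echo "$TARGET_POSTGRES_POSTGRES_HOST"
```

Supplying credentials through environment variables keeps them out of the version-controlled `meltano.yml`.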

Settings

postgres_host

The host name or IP address of the PostgreSQL server.

postgres_port

The port the PostgreSQL server is listening on.

postgres_database

The name of the database to load data into.

postgres_username

The user name to connect as.

postgres_password

The password used to authenticate.

postgres_schema

The schema in which target-postgres will create and populate tables.

postgres_sslmode

The SSL mode to use for the connection, e.g. disable, prefer, require, verify-ca, or verify-full.

Refer to the libpq docs for more information about SSL: https://www.postgresql.org/docs/current/libpq-connect.html#LIBPQ-PARAMKEYWORDS

postgres_sslcert

Only used if an SSL connection with a client certificate is being made.

postgres_sslkey

Only used if an SSL connection with a client certificate is being made.

postgres_sslrootcert

Used to verify the server's SSL certificate.

postgres_sslcrl

Specifies the file containing the SSL certificate revocation list (CRL), checked when verifying the server's SSL certificate.
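
Taken together, the SSL settings above might appear in a config file as follows (all file paths are hypothetical examples):

```json
{
  "postgres_sslmode": "verify-full",
  "postgres_sslrootcert": "/etc/ssl/certs/server-ca.pem",
  "postgres_sslcrl": "/etc/ssl/certs/server-ca.crl",
  "postgres_sslcert": "/etc/ssl/certs/client-cert.pem",
  "postgres_sslkey": "/etc/ssl/private/client-key.pem"
}
```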

invalid_records_detect

Include false in your config to prevent target-postgres from crashing on invalid records.

invalid_records_threshold

Include a positive value n in your config to allow target-postgres to encounter at most n invalid records per stream before giving up.
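
The two settings above work together: with detection left enabled, the threshold caps how many bad records are tolerated per stream. For example, a config fragment allowing up to 10 invalid records per stream (a sketch; combine with your connection settings):

```json
{
  "invalid_records_detect": true,
  "invalid_records_threshold": 10
}
```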

disable_collection

Include true in your config to disable Singer Usage Logging: https://github.com/datamill-co/target-postgres#usage-logging

logging_level

The level for logging. Set to DEBUG to get things like queries executed, timing of those queries, etc.

persist_empty_tables

Whether the target should create tables that have no records present in the remote source.

max_batch_rows

The maximum number of rows to buffer in memory before writing to the destination table in Postgres.

max_buffer_size

The maximum number of bytes to buffer in memory before writing to the destination table in Postgres. Default: 100 MB, expressed in bytes.

batch_detection_threshold

How often, in rows received, to count the buffered rows and bytes to check if a flush is necessary. There’s a slight performance penalty to checking the buffered records count or bytesize, so this controls how often this is polled in order to mitigate the penalty. This value is usually not necessary to set as the default is dynamically adjusted to check reasonably often.
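
As an illustration of the batching knobs above, the fragment below flushes after 200,000 buffered rows or 100 MB of buffered data, rechecking every 5,000 rows received (the values are arbitrary examples, not recommendations):

```json
{
  "max_batch_rows": 200000,
  "max_buffer_size": 104857600,
  "batch_detection_threshold": 5000
}
```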

state_support

Whether the Target should emit STATE messages to stdout for further consumption. In this mode, which is on by default, STATE messages are buffered in memory until all the records that occurred before them are flushed according to the batch flushing schedule the target is configured with.

add_upsert_indexes

Whether the Target should create column indexes on the important columns used during data loading. These indexes will make data loading slightly slower but the deduplication phase much faster. Defaults to on for better baseline performance.

before_run_sql

Raw SQL statement(s) to execute as soon as the connection to Postgres is opened by the target. Useful for setup such as SET ROLE or other per-connection state.

after_run_sql

Raw SQL statement(s) to execute at the end of the run, before the target closes its connection to Postgres. Useful for teardown such as resetting any per-connection state set up by before_run_sql.
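
For instance, a hypothetical pair of statements that assumes a role for the duration of the run and resets it afterwards (assuming before_run_sql runs at connection open and after_run_sql at the end of the run):

```json
{
  "before_run_sql": "SET ROLE loader;",
  "after_run_sql": "RESET ROLE;"
}
```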

Looking for help?

If you're having trouble getting target-postgres to work by itself or with Meltano, look for an existing issue in its repository, file a new issue, or join the Meltano Slack community and ask for help in the #plugins-general channel.

Found an issue on this page?

This page is generated from a YAML file that you can contribute changes to! It is also validated against a JSON Schema used for taps and targets.