BigQuery
Table of Contents
The tap-bigquery Meltano extractor pulls data from BigQuery that can then be sent to a destination using a loader.
Alternative variants #
Multiple variants of tap-bigquery are available. This document describes the default anelendata variant, which is recommended for new users.
Alternative variants are:
Getting Started #
Prerequisites #
If you haven't already, follow the initial steps of the Getting Started guide:
Additionally you should follow the steps in the “Activate the Google BigQuery API” section of the repository’s README.
Installation and configuration #
- Add the tap-bigquery extractor to your project using meltano add:
meltano add extractor tap-bigquery
- Configure the settings below using meltano config.
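After running meltano add, the extractor is registered under the plugins section of your meltano.yml. A minimal sketch of what that entry typically looks like is shown below; the pip_url and variant values are assumptions and should be checked against your own project file:

plugins:
  extractors:
  - name: tap-bigquery
    variant: anelendata
    pip_url: tap-bigquery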
Next steps #
Follow the remaining steps of the Getting Started guide:
- Select entities and attributes to extract
- Add a loader to send data to a destination
- Run a data integration (EL) pipeline
Capabilities #
Settings #
tap-bigquery requires the configuration of the following settings:
The settings for extractor tap-bigquery that are known to Meltano are documented below. To quickly find the setting you're looking for, use the Table of Contents at the top of the page.
Streams (streams) #
- Environment variable: TAP_BIGQUERY_STREAMS
Array of objects with name, table, columns, datetime_key, and filters keys:
- name: The entity name, used by most loaders as the name of the table to be created.
- table: Fully qualified table name in BigQuery, with format `<project>.<dataset>.<table>`. Since backticks have special meaning in YAML, values in meltano.yml should be wrapped in double quotes.
- columns: Array of column names to select. Using ["*"] is not recommended as it can become very expensive for a table with a large number of columns.
- datetime_key: Name of the datetime column to use as the replication key.
- filters: Optional array of WHERE clauses to filter extracted data, e.g. "column='value'".
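To make the structure concrete, here is a rough sketch of how a single stream could be configured under the extractor's config in meltano.yml. The project, dataset, table, column, and filter values are hypothetical placeholders, not values from this documentation:

config:
  streams:
  - name: orders
    table: "`my-project.my_dataset.orders`"
    columns:
    - id
    - amount
    - created_at
    datetime_key: created_at
    filters:
    - "amount > 0"

Note how the table value keeps the backticks but is wrapped in double quotes, as described above.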
How to use #
Manage this setting using meltano config or an environment variable:
meltano config tap-bigquery set streams '[...]'
export TAP_BIGQUERY_STREAMS='[...]'
Credentials Path (credentials_path) #
- Environment variable: TAP_BIGQUERY_CREDENTIALS_PATH
- Default: $MELTANO_PROJECT_ROOT/client_secrets.json
Fully qualified path to client_secrets.json for your service account.
See the “Activate the Google BigQuery API” section of the repository’s README and https://cloud.google.com/docs/authentication/production.
By default, this file is expected to be at the root of your project directory.
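If the file lives somewhere other than the project root, the path can also be pinned in meltano.yml; the location below is a hypothetical example, not a recommended layout:

config:
  credentials_path: /secrets/bigquery/client_secrets.json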
How to use #
Manage this setting using meltano config or an environment variable:
meltano config tap-bigquery set credentials_path <credentials_path>
export TAP_BIGQUERY_CREDENTIALS_PATH=<credentials_path>
Start Datetime (start_datetime) #
- Environment variable: TAP_BIGQUERY_START_DATETIME
Determines how much historical data will be extracted. Please be aware that the larger the time period and amount of data, the longer the initial extraction can be expected to take.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config tap-bigquery set start_datetime YYYY-MM-DDTHH:MM:SSZ
export TAP_BIGQUERY_START_DATETIME=YYYY-MM-DDTHH:MM:SSZ
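For reference, the same setting expressed directly in meltano.yml might look like the sketch below; the timestamp is a placeholder, not a suggested value:

config:
  start_datetime: "2024-01-01T00:00:00Z"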
End Datetime (end_datetime) #
- Environment variable: TAP_BIGQUERY_END_DATETIME
The date up to which historical data will be extracted.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config tap-bigquery set end_datetime YYYY-MM-DDTHH:MM:SSZ
export TAP_BIGQUERY_END_DATETIME=YYYY-MM-DDTHH:MM:SSZ
Limit (limit) #
- Environment variable: TAP_BIGQUERY_LIMIT
Limits the number of records returned in each stream, applied as a limit in the query.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config tap-bigquery set limit 1234
export TAP_BIGQUERY_LIMIT=1234
Start Always Inclusive (start_always_inclusive) #
- Environment variable: TAP_BIGQUERY_START_ALWAYS_INCLUSIVE
- Default: true
When replicating incrementally, disable this setting to select only records whose datetime_key is strictly greater than the maximum value replicated in the last run, excluding records whose timestamps match exactly. This could cause records to be missed if they were created after the last run finished but within the same second, with the same timestamp.
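In meltano.yml this could be expressed as in the sketch below, shown here with the default overridden purely as an illustration:

config:
  start_always_inclusive: false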
How to use #
Manage this setting using meltano config or an environment variable:
meltano config tap-bigquery set start_always_inclusive false
export TAP_BIGQUERY_START_ALWAYS_INCLUSIVE=false
Looking for help? #
If you're having trouble getting the tap-bigquery extractor to work, look for an existing issue in its repository, file a new issue, or join the Meltano Slack community and ask for help in the #plugins-general channel.