Version: 1.0

SnowflakeConfig

Summary

Configuration used to reference a Snowflake table or query.
 
The SnowflakeConfig class is used to create a reference to a Snowflake table. You can also create a reference to a query on one or more tables, which will be registered in Tecton much as a view is registered in other data systems.

This class is used as an input to the batch_config parameter of a BatchSource. Declaring this configuration class alone will not register a Data Source. Instead, declare it as part of a BatchSource that takes this configuration class instance as a parameter.
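For instance, a query-based source can be declared as follows. This is a minimal sketch: the database, schema, warehouse, query, and source names here are hypothetical placeholders, not values from this reference.

```python
from tecton import SnowflakeConfig, BatchSource

# Hypothetical names, for illustration only.
user_events_config = SnowflakeConfig(
    database="ANALYTICS_DB",
    schema="PUBLIC",
    warehouse="COMPUTE_WH",
    # A query is registered in Tecton much like a view in other data systems.
    query="SELECT user_id, event_ts, page_views FROM ANALYTICS_DB.PUBLIC.USER_EVENTS",
)

# Declaring the BatchSource is what actually registers the Data Source.
user_events_ds = BatchSource(
    name="user_events_snowflake_ds",
    batch_config=user_events_config,
)
```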

Attributes

The attributes are the same as the __init__ method parameters. See below.

Methods

Name           Description
__init__(...)  Instantiates a new SnowflakeConfig. Exactly one of table and query must be specified.

__init__(...)

Instantiates a new SnowflakeConfig. Exactly one of table and query must be specified.

For a sample SnowflakeConfig declaration, see the Example section below.

Parameters

  • database (Optional[str]) - The Snowflake database for this Data source. Default: None

  • schema (Optional[str]) - The Snowflake schema for this Data source. Default: None

  • warehouse (Optional[str]) - The Snowflake warehouse for this Data source. This parameter is not supported in Tecton on Snowflake. Default: None

  • url (Optional[str]) - The connection URL to Snowflake, which contains account information (e.g. https://xy12345.eu-west-1.snowflakecomputing.com). This parameter is not supported in Tecton on Snowflake. Default: None

  • role (Optional[str]) - The Snowflake role that should be used for this Data source. This parameter is not supported in Tecton on Snowflake. Default: None

  • table (Optional[str]) - The table for this Data source. Only one of table and query must be specified. If using Rift, this table name cannot include quotation marks. For example, the table name 'foo"bar' is not supported. Default: None

  • query (Optional[str]) - The query for this Data source. Only one of table and query must be specified. Default: None

  • timestamp_field (Optional[str]) - The timestamp column in this data source that should be used by FilteredSource to filter data from this source, before any feature view transformations are applied. Only required if this source is used with FilteredSource. Default: None

  • post_processor (Optional[Callable]) - Python user defined function f(DataFrame) -> DataFrame that takes in raw PySpark data source DataFrame and translates it to the DataFrame to be consumed by the Feature View. This parameter is not supported in Tecton on Snowflake. Default: None

  • data_delay (timedelta) - By default, incremental materialization jobs run immediately at the end of the batch schedule period. This parameter configures how long they wait after the end of the period before starting, typically to ensure that all data has landed. For example, if a feature view has a batch_schedule of 1 day and one of the data source inputs has data_delay=timedelta(hours=1) set, then incremental materialization jobs will run at 01:00 UTC. Default: 0:00:00

  • user (Union[str, Secret, NoneType]) - (Only supported in Rift) The user used to connect to Snowflake. This can be a string or a Secret. If unset, the SNOWFLAKE_USER environment variable is used. Default: None

  • password (Union[str, Secret, NoneType]) - (Only supported in Rift) The password used to connect to Snowflake. This can be a string or a Secret. If unset, the SNOWFLAKE_PASSWORD environment variable is used. Default: None
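The data_delay timing described above can be sketched in plain Python (standard-library datetime arithmetic, not Tecton API), assuming the 1-day batch_schedule and 1-hour data_delay from the example:

```python
from datetime import datetime, timedelta

# Assumed values, matching the data_delay example above.
batch_schedule = timedelta(days=1)
data_delay = timedelta(hours=1)

period_start = datetime(2024, 1, 1, 0, 0)   # start of a batch period (UTC)
period_end = period_start + batch_schedule  # default job start: end of the period
job_start = period_end + data_delay         # delayed start so late data can land

print(job_start)  # 2024-01-02 01:00:00, i.e. 01:00 UTC the next day
```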

Returns

A SnowflakeConfig class instance.

Example

from tecton import SnowflakeConfig, BatchSource

# Declare a SnowflakeConfig instance that can be used as an argument in BatchSource.
# Specify exactly one of `table` and `query`.
snowflake_ds_config = SnowflakeConfig(
    url="https://<your-cluster>.eu-west-1.snowflakecomputing.com/",
    database="CLICK_STREAM_DB",
    schema="CLICK_STREAM_SCHEMA",
    warehouse="COMPUTE_WH",
    table="CLICK_STREAM_FEATURES",
    # Alternatively, use a query instead of a table:
    # query="SELECT timestamp AS ts, created, user_id, clicks, click_rate "
    #       "FROM CLICK_STREAM_DB.CLICK_STREAM_FEATURES",
)

# Use in the BatchSource
snowflake_ds = BatchSource(
    name="click_stream_snowflake_ds",
    batch_config=snowflake_ds_config,
)
