
SnowflakeConfig

Summary

Configuration used to reference a Snowflake table or query.
 
The SnowflakeConfig class is used to create a reference to a Snowflake table. You can also create a reference to a query over one or more tables; the query is registered in Tecton much as a view is registered in other data systems.

This class is used as an input to a BatchSource's batch_config parameter. Declaring this configuration class alone will not register a Data Source. Instead, declare it as part of a BatchSource that takes an instance of this configuration class as a parameter.

Attributes

The attributes are the same as the __init__ method parameters. See below.

Methods

  • __init__(...) — Instantiates a new SnowflakeConfig. Exactly one of table or query should be specified when creating this object.

__init__(...)

Instantiates a new SnowflakeConfig. Exactly one of table or query should be specified when creating this object.
 
For an example of a SnowflakeConfig declaration, see the Example section below.

Parameters

  • database: Optional[str] = None The Snowflake database for this Data source.
  • schema: Optional[str] = None The Snowflake schema for this Data source.
  • warehouse: Optional[str] = None The Snowflake warehouse for this Data source.
  • url: Optional[str] = None The connection URL to Snowflake, which contains account information (e.g. https://xy12345.eu-west-1.snowflakecomputing.com).
  • role: Optional[str] = None The Snowflake role that should be used for this Data source.
  • table: Optional[str] = None The table for this Data source. Exactly one of table or query must be specified. If using Rift, this table name cannot include quotation marks. For example, the table name 'foo"bar' is not supported.
  • query: Optional[str] = None The query for this Data source. Exactly one of table or query must be specified.
  • timestamp_field: Optional[str] = None The timestamp column in this data source that should be used for time-based filtering. Required unless this source is used in Feature Views only with unfiltered().
  • post_processor: Optional[Callable] = None (Only supported in Spark) Python user defined function f(DataFrame) -> DataFrame that takes in raw PySpark data source DataFrame and translates it to the DataFrame to be consumed by the Feature View.
  • data_delay: timedelta = 0:00:00 By default, incremental materialization jobs run immediately at the end of the batch schedule period. This parameter configures how long they wait after the end of the period before starting, typically to ensure that all data has landed. For example, if a feature view has a batch_schedule of 1 day and one of its data source inputs has data_delay=timedelta(hours=1) set, then the schedule period ends at 00:00 UTC and incremental materialization jobs will run at 01:00 UTC.
  • user: Union[str, Secret, NoneType] = None (Only supported in Rift) The user used to connect to Snowflake. This can be a string or a Secret. If unset, SNOWFLAKE_USER from the environment is used.
  • password: Union[str, Secret, NoneType] = None (Only supported in Rift) The password used to connect to Snowflake. This can be a string or a Secret. If unset, SNOWFLAKE_PASSWORD from the environment is used. Deprecated: Use private_key authentication instead.
  • private_key: Union[str, Secret, NoneType] = None (Only supported in Rift) The private key used to connect to Snowflake for key-pair authentication. This can be a string or a Secret. If unset, SNOWFLAKE_PRIVATE_KEY from the environment is used. Recommended over password authentication.
  • private_key_passphrase: Union[str, Secret, NoneType] = None (Only supported in Rift) The passphrase for the private key used to connect to Snowflake. This can be a string or a Secret. If unset, SNOWFLAKE_PRIVATE_KEY_PASSPHRASE from the environment is used. Optional; only needed if the private key is encrypted.
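The data_delay timing described above can be sketched as plain timedelta arithmetic. The helper below is a hypothetical illustration, not part of the Tecton SDK or its scheduler:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical helper (not a Tecton API): given the end of a batch schedule
# period and a data_delay, compute when the incremental materialization job
# for that period starts.
def job_start_time(period_end: datetime, data_delay: timedelta) -> datetime:
    # Jobs run immediately at the end of the period, pushed back by data_delay.
    return period_end + data_delay

# A daily batch_schedule period ending at midnight UTC, combined with
# data_delay=timedelta(hours=1), yields a job start of 01:00 UTC, matching
# the data_delay example above.
period_end = datetime(2024, 1, 2, 0, 0, tzinfo=timezone.utc)
print(job_start_time(period_end, timedelta(hours=1)).isoformat())
# 2024-01-02T01:00:00+00:00
```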

Returns

A SnowflakeConfig class instance.

Example

from tecton import SnowflakeConfig, BatchSource

# Declare a SnowflakeConfig instance that can be used as an argument in a BatchSource.
# Exactly one of `table` or `query` should be set; this example uses `query`.
snowflake_ds_config = SnowflakeConfig(
    url="https://<your-cluster>.eu-west-1.snowflakecomputing.com/",
    database="CLICK_STREAM_DB",
    schema="CLICK_STREAM_SCHEMA",
    warehouse="COMPUTE_WH",
    query="SELECT timestamp AS ts, created, user_id, clicks, click_rate "
          "FROM CLICK_STREAM_DB.CLICK_STREAM_FEATURES",
    # table="CLICK_STREAM_FEATURES",  # alternatively, reference the table directly
)

# Use in the BatchSource
snowflake_ds = BatchSource(
    name="click_stream_snowflake_ds",
    batch_config=snowflake_ds_config,
)
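The Rift-only key-pair parameters (user, private_key, private_key_passphrase) can also be supplied as Secret references rather than plain strings. The sketch below assumes Tecton's Secret class and uses hypothetical scope and key names; replace them with secrets registered in your own workspace:

```python
from tecton import SnowflakeConfig, Secret

# Key-pair authentication (Rift only). The secret scope and key names below
# are hypothetical placeholders, not real values.
snowflake_keypair_config = SnowflakeConfig(
    url="https://<your-cluster>.eu-west-1.snowflakecomputing.com/",
    database="CLICK_STREAM_DB",
    schema="CLICK_STREAM_SCHEMA",
    warehouse="COMPUTE_WH",
    table="CLICK_STREAM_FEATURES",
    user=Secret(scope="snowflake", key="user"),
    private_key=Secret(scope="snowflake", key="private_key"),
    # Only needed if the private key is encrypted:
    private_key_passphrase=Secret(scope="snowflake", key="private_key_passphrase"),
)
```

Key-pair authentication is recommended over the deprecated password parameter; if these parameters are unset, the corresponding SNOWFLAKE_* environment variables are used instead.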
