SnowflakeConfig
Summary
Configuration used to reference a Snowflake table or query.
The SnowflakeConfig class is used to create a reference to a Snowflake table. You can also create a reference to a query over one or more tables, which will be registered in Tecton in a similar way as a view is registered in other data systems.
This class is used as an input to a BatchSource's parameter `batch_config`. Declaring this configuration class alone will not register a Data Source. Instead, declare it as part of a BatchSource that takes this configuration class instance as a parameter.
Attributes
The attributes are the same as the __init__ method parameters. See below.
Methods
| Name | Description |
|---|---|
| `__init__(...)` | Instantiates a new SnowflakeConfig. One of `table` and `query` should be specified when creating this configuration. |
__init__(...)
Instantiates a new SnowflakeConfig. One of `table` and `query` should be specified when creating this configuration. An example of a SnowflakeConfig declaration is shown in the Example section below.
Parameters

- `database` (`Optional[str]`, default `None`): The Snowflake database for this Data Source.
- `schema` (`Optional[str]`, default `None`): The Snowflake schema for this Data Source.
- `warehouse` (`Optional[str]`, default `None`): The Snowflake warehouse for this Data Source.
- `url` (`Optional[str]`, default `None`): The connection URL to Snowflake, which contains account information (e.g. https://xy12345.eu-west-1.snowflakecomputing.com).
- `role` (`Optional[str]`, default `None`): The Snowflake role that should be used for this Data Source.
- `table` (`Optional[str]`, default `None`): The table for this Data Source. Only one of `table` and `query` must be specified. If using Rift, this table name cannot include quotation marks. For example, the table name `'foo"bar'` is not supported.
- `query` (`Optional[str]`, default `None`): The query for this Data Source. Only one of `table` and `query` must be specified.
- `timestamp_field` (`Optional[str]`, default `None`): The timestamp column in this data source that should be used for time-based filtering. Required unless this source is used in Feature Views only with `unfiltered()`.
- `post_processor` (`Optional[Callable]`, default `None`): (Only supported in Spark) Python user-defined function `f(DataFrame) -> DataFrame` that takes in the raw PySpark data source DataFrame and translates it to the DataFrame to be consumed by the Feature View.
- `data_delay` (`timedelta`, default `0:00:00`): By default, incremental materialization jobs run immediately at the end of the batch schedule period. This parameter configures how long they wait after the end of the period before starting, typically to ensure that all data has landed. For example, if a Feature View has a `batch_schedule` of 1 day and one of the data source inputs has `data_delay=timedelta(hours=1)` set, then incremental materialization jobs will run at `01:00` UTC.
- `user` (`Union[str, Secret, NoneType]`, default `None`): (Only supported in Rift) The user used to connect to Snowflake. This can be a string or a `Secret`. If unset, `SNOWFLAKE_USER` from the environment is used.
- `password` (`Union[str, Secret, NoneType]`, default `None`): (Only supported in Rift) The password used to connect to Snowflake. This can be a string or a `Secret`. If unset, `SNOWFLAKE_PASSWORD` from the environment is used. Deprecated: use `private_key` authentication instead.
- `private_key` (`Union[str, Secret, NoneType]`, default `None`): (Only supported in Rift) The private key used to connect to Snowflake for key-pair authentication. This can be a string or a `Secret`. If unset, `SNOWFLAKE_PRIVATE_KEY` from the environment is used. Recommended over password authentication.
- `private_key_passphrase` (`Union[str, Secret, NoneType]`, default `None`): (Only supported in Rift) The passphrase for the private key used to connect to Snowflake. This can be a string or a `Secret`. If unset, `SNOWFLAKE_PRIVATE_KEY_PASSPHRASE` from the environment is used. Optional; only needed if the private key is encrypted.
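The `data_delay` timing described above can be sketched with plain `datetime` arithmetic. This is an illustrative calculation only, using hypothetical schedule values and no Tecton dependency:

```python
from datetime import datetime, timedelta

# Hypothetical values: a 1-day batch_schedule whose period ends at midnight UTC,
# plus a 1-hour data_delay, as in the data_delay description above.
batch_period_end = datetime(2024, 1, 2, 0, 0)  # end of the daily batch period (UTC)
data_delay = timedelta(hours=1)

# Incremental materialization jobs start data_delay after the period ends.
job_start = batch_period_end + data_delay
print(job_start)  # 2024-01-02 01:00:00, i.e. 01:00 UTC
```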
Returns
A SnowflakeConfig class instance.

Example
```python
from tecton import SnowflakeConfig, BatchSource

# Declare a SnowflakeConfig instance object that can be used as an argument in BatchSource.
# Only one of `table` and `query` may be specified; this example uses `query`.
snowflake_ds_config = SnowflakeConfig(
    url="https://<your-cluster>.eu-west-1.snowflakecomputing.com/",
    database="CLICK_STREAM_DB",
    schema="CLICK_STREAM_SCHEMA",
    warehouse="COMPUTE_WH",
    query="SELECT timestamp as ts, created, user_id, clicks, click_rate "
    "FROM CLICK_STREAM_DB.CLICK_STREAM_FEATURES",
)

# Use in the BatchSource
snowflake_ds = BatchSource(
    name="click_stream_snowflake_ds",
    batch_config=snowflake_ds_config,
)
```
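One pitfall in multi-line `query` strings like the one above: Python joins adjacent string literals with no separator, so each fragment needs a trailing space (or a leading space on the next fragment) to keep the SQL valid. A minimal, Tecton-independent illustration:

```python
# Adjacent string literals concatenate with nothing in between.
broken = "SELECT click_rate" "FROM t"   # missing space corrupts the SQL
fixed = "SELECT click_rate " "FROM t"   # trailing space keeps it valid

print(broken)  # SELECT click_rateFROM t
print(fixed)   # SELECT click_rate FROM t
```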