Version: 1.1

1.0 to 1.1 Upgrade Guide

Overview

New features:

  • Replication Constraint in Dynamo Config: Use the new replica_region parameter in the DynamoConfig for a Feature View's online store to specify which regions you want to constrain replication to. This allows you to avoid costly replication charges to your satellite regions that you don't need to serve traffic to.

  • Migrate off of Python 3.7 in EMR: You can now run EMR compute on Python 3.9 regardless of which EMR version you use. This lets users on EMR version 6.X move off of Python 3.7, which reached end of life in June 2023 and no longer receives security updates. Set the python_version parameter in the EMRClusterConfig:

    EMRClusterConfig(emr_version="emr-6.7.0", python_version="python_3_9_13")
note

In SDK 1.2, the Python version will default to "python_3_9_13". We recommend setting python_version explicitly in your 1.1 upgrade to ensure a smooth transition.

  • Improved Unit Testing Support: Using the MockContext class, you can now set custom mock secrets and resources for your queries, enabling faster development and iteration in unit test and notebook environments. View the Unit Testing docs for more details.

    mock_context = MockContext(secrets={"my_secret": "my_secret_value"})

    actual = transaction_amount_is_high.run_transformation(
        input_data={
            "transaction_request": transaction_request,
            "context": mock_context,
        }
    ).to_pandas()

    expected = pandas.DataFrame({"transaction_amount_is_high": [0, 1, 1]})
    pandas.testing.assert_frame_equal(actual, expected)
  • Configure Data Sources with PyArrow: Using the new pyarrow_batch_config, you can define a custom Data Source using PyArrow. pyarrow_batch_config is similar to the pandas_batch_config, but leverages the performance benefits of PyArrow. See our docs for examples of using it to connect to an Iceberg Data Source or a Postgres Data Source.
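As a sketch, the replication constraint described above might be wired up like this. Only the replica_region parameter name comes from this guide; the import path, the value shape (single region vs. list), and the region names are assumptions:

```python
from tecton import DynamoConfig  # import path assumed

# Constrain DynamoDB replication to only the regions that serve traffic,
# avoiding replication charges in unused satellite regions.
# Whether replica_region accepts one region or a list is an assumption here.
online_store = DynamoConfig(
    replica_region=["us-west-2", "us-east-1"],
)
# Pass online_store=online_store in the Feature View definition as usual.
```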
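A minimal sketch of the PyArrow-based Data Source pattern, mirroring how pandas_batch_config is used as a decorator on a function that returns data. The decorator's exact signature, the return type, and the file path below are assumptions:

```python
from tecton import pyarrow_batch_config  # import path assumed
import pyarrow.parquet as pq

@pyarrow_batch_config()  # real signature may require arguments
def events_source():
    # Assumed contract: return a pyarrow.Table, analogous to returning a
    # pandas.DataFrame from a pandas_batch_config function. Path is illustrative.
    return pq.read_table("s3://my-bucket/events.parquet")
```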

New features in Private Preview:

  • API Resources: Seamlessly integrate external APIs into your Feature Views using the new @resource_provider decorator. Available for Rift-based Batch and Realtime Feature Views. See API Resources for examples.
  • Calculations: Speed up the execution of Realtime Feature Views using Calculations. Calculations define features through simple SQL-like expressions, removing the need for Python transformation functions in Realtime Feature Views. This delivers a performance boost in both offline and online serving.
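A heavily hedged sketch of the API Resources pattern: only the @resource_provider decorator name comes from this guide, and the signature, the context object, the secret name, and the use of requests below are all assumptions:

```python
from tecton import resource_provider  # import path assumed

@resource_provider()  # real signature may require arguments
def fx_api_session(context):
    # Hypothetical: build a reusable HTTP client for an external API,
    # authenticated with a Tecton-managed secret (name is illustrative).
    import requests
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {context.secrets['fx_api_key']}"
    return session
```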

Upgrade to 1.1

  1. Ensure the steps in the 0.9 to 1.0 Upgrade Guide are complete and there are no remaining imports from tecton.v09_compat.

  2. Install Tecton 1.1 locally with pip install 'tecton>=1.1.0' (the quotes prevent your shell from interpreting >=).

  3. Update Materialization Runtime & Environment Defaults:

    • Spark-based Feature Views: Update the tecton_materialization_runtime field in your repo.yaml file to 1.1.0. Note that this will restart currently-running jobs for StreamFeatureViews. With Tecton's zero-downtime stream restarts, this will cause minimal disruption.

    • Rift-based Feature Views: Update the environment field in your repo.yaml to tecton-rift-core-1.1.0.

      Here's an example repo.yaml:

    defaults:
      batch_feature_view:
        tecton_materialization_runtime: 1.1.0 # For Spark-based Feature Views
        environment: tecton-rift-core-1.1.0 # For Rift-based Feature Views
      stream_feature_view:
        tecton_materialization_runtime: 1.1.0 # For Spark-based Feature Views
        environment: tecton-rift-core-1.1.0 # For Rift-based Feature Views
      feature_table:
        tecton_materialization_runtime: 1.1.0 # For Spark-based Feature Views
        environment: tecton-rift-core-1.1.0 # For Rift-based Feature Views
  4. Apply the upgrade using tecton apply, or tecton apply --integration-test to run integration tests.
