Feature Services

Feature Services are sets of features which are exposed as an API. This API can be used for batch lookups of feature values (e.g. generating training datasets or feature dataframes for batch prediction), or low-latency requests for individual feature vectors.

Feature Services reference a set of features from Feature Packages. Feature Packages must be created before they can be served by a Feature Service, since Feature Services are only a consumption layer on top of existing features.

It is generally recommended that each model deployed in production have one associated Feature Service deployed, which serves features to the model.

A Feature Service provides:

  • A REST API to access feature values at the time of prediction
  • A one-line method call to rapidly construct training data for user-specified timestamps and labels
  • Observability into the serving endpoint, for monitoring serving throughput, latency, and prediction success rate

Defining a Feature Service

Define a Feature Service using the FeatureService class.


A Feature Service definition includes the following attributes:

  • name: The unique name of the Feature Service
  • features: The features served by the Feature Service
  • online_serving_enabled: Whether online serving is enabled for this Feature Service (defaults to True)
  • Metadata used to organize the FeatureService. Metadata parameters include description, owner, family, and tags.

Example: Defining a Feature Service

The following example defines a Feature Service.

from tecton import FeatureService
from feature_repo.shared.features.ad_ground_truth_ctr_performance_7_days import ad_ground_truth_ctr_performance_7_days
from feature_repo.shared.features.user_total_ad_frequency_counts import user_total_ad_frequency_counts
from feature_repo.shared.features.user_ad_impression_counts import user_ad_impression_counts

ctr_prediction_service = FeatureService(
    name='ctr_prediction_service',
    description='A Feature Service used for supporting a CTR prediction model.',
    features=[
        # add all of the features in a Feature Package
        ad_ground_truth_ctr_performance_7_days,
        user_total_ad_frequency_counts,
        # add a single feature from a Feature Package using double-bracket notation
        user_ad_impression_counts[['ad_impression_count']],  # illustrative feature name
    ],
    tags={'release': 'production'},
)

In this example:

  • The Feature Service uses the ad_ground_truth_ctr_performance_7_days, user_total_ad_frequency_counts, and user_ad_impression_counts Feature Packages.
  • The list of features in the Feature Service is defined in the features argument. When you pass a FeaturePackage in this argument, the Feature Service will contain all the features in the Feature Package. To select a subset of features in a Feature Package, use double-bracket notation (e.g. FeaturePackage[['my_feature', 'other_feature']]).

Using Feature Services

Using the low-latency REST API Interface

To use the FeatureService's REST API endpoint, send a request containing the join keys of the feature vector to be retrieved. Tecton's response contains the full feature vector, including the join keys.

To request a single feature vector from the REST API, use the /get-features endpoint. Pass the Feature Service name and the join keys as parameters. The response is a JSON object.


The request is authenticated with an API key that you create using the tecton create-api-key CLI command. See Access Controls & Secrets for more information.

Example Request

curl -X POST https://[your-cluster]/get-features \
  -H "Authorization: Tecton-key YOUR_API_KEY" \
  -d '{
  "params": {
    "feature_service_name": "ctr_prediction_service",
    "join_key_map": {
      "ad_id": "12345678",
      "user_uuid": "$USER_UUID_VALUE",
      "partner_id": "12345678",
      "ad_group_id": "12345678"
    }
  }
}'

Example Response

{
  "result": {
    "features": [
      "17", "electronics", "headphones", "outdoor", ...
    ]
  }
}

Using the low-latency SDK Interface

To fetch real-time data from a Feature Service using the Python SDK as a client, use the FeatureService.get_feature_vector() method.

import tecton

feature_service = tecton.get_feature_service("price_prediction_feature_service")

# Request features for a single (user, item) tuple
join_keys = { "user_id": "demo_user_123", "item_id": "demo_item_987" }

# Sample code matching the example in the section above.
# Assumes a trained sklearn model object named `model`, and that the order and
# type of the feature values are consistent with what the model expects.
scoring_features = feature_service.get_feature_vector(join_keys=join_keys)
predicted_price = model.predict(scoring_features)
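The comment above matters in practice: an sklearn model expects feature values in the same order used at training time. A minimal, Tecton-free sketch of assembling an ordered model input from a name-to-value mapping (all feature names and values here are hypothetical):

# Hypothetical feature vector as a name -> value mapping
feature_vector = {
    "user_total_ad_frequency_count": 17.0,
    "user_ad_impression_count": 42.0,
    "ad_ctr_7d": 0.031,
}

# The exact column order the model was trained with (hypothetical)
TRAINING_COLUMN_ORDER = [
    "ad_ctr_7d",
    "user_ad_impression_count",
    "user_total_ad_frequency_count",
]

# Assemble a single-row input in training order before calling model.predict(...)
model_input = [[feature_vector[name] for name in TRAINING_COLUMN_ORDER]]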

Using the Offline Interface

Use the offline or batch interface for batch prediction jobs or to generate training datasets. To fetch a dataframe from a Feature Service with the Python SDK as a client, use the FeatureService.get_feature_dataframe() method.

To make a batch request, first create a context consisting of the join keys for prediction and the desired feature timestamps. Then, pass these events to the Feature Service method get_feature_dataframe().

# Load the prediction events (join keys and timestamps) from a Parquet file
events = spark.read.parquet('dbfs:/sample_events.pq')

Events Table
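For illustration, an events table like the one above can be sketched with pandas; the join key and timestamp column names here are hypothetical and must match your Feature Service's join keys:

import pandas as pd

# Hypothetical events spine: one row per prediction event,
# with join key columns and an event timestamp
events_pdf = pd.DataFrame(
    {
        "user_id": ["demo_user_123", "demo_user_456"],
        "item_id": ["demo_item_987", "demo_item_654"],
        "timestamp": pd.to_datetime(["2021-06-01 12:00:00", "2021-06-02 08:30:00"]),
    }
)

# In a Spark environment this could be converted with spark.createDataFrame(events_pdf)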

Tecton then generates the feature values.

import tecton
feature_service = tecton.get_feature_service("price_prediction_feature_service")
result_spark_df = feature_service.get_feature_dataframe(events).to_spark()

Results Table

Using Feature Logging

Feature Services have the ability to continuously log online requests and feature vector responses as Tecton Datasets. These logged feature datasets can be used for auditing, analysis, training dataset generation, and spine creation.

Feature Logging Diagram

To enable feature logging on a FeatureService, simply add a LoggingConfig like in the example below and optionally specify a sample rate. You can also optionally set log_effective_times=True to log the feature timestamps from the Feature Store. As a reminder, Tecton will always serve the latest stored feature values as of the time of the request.

Run tecton apply to apply your changes.

from tecton import FeatureService, LoggingConfig

ctr_prediction_service = FeatureService(
    name='ctr_prediction_service',
    features=[...],
    logging=LoggingConfig(
        sample_rate=0.5,
        log_effective_times=False,
    ),
)

This will create a new Tecton Dataset under the Datasets tab in the Web UI. This dataset will continue having new feature logs appended to it every 30 mins. If the features in the Feature Service change, a new dataset version will be created.

Logged Features

This dataset can be fetched in a notebook using the code snippet below.

import tecton
dataset = tecton.get_dataset('ctr_prediction_service.logged_requests.4')

Logged Features Dataset