Version: 1.1

Billable Usage Logs

Billable Usage Logs contain detailed information about your use of the Tecton platform, and how that use translates into Tecton Credits. By analyzing these logs, you can understand your usage trends, attribute usage within your organization, and optimize costs.

Logs are written as a CSV file into the Cloud Provider Object Storage you have configured during your Tecton deployment.

Accessing Billable Usage Logs

Billable Usage Logs are available in your Cloud Provider's Object Storage. How you access the Billable Usage Logs depends on how your Tecton account is configured.

  • For AWS: s3://tecton-{DEPLOYMENT_NAME}/tecton-billable-usage/{VERSION}/YYYY-MM-DD/yyyy-MM-dd-HH.csv
  • For Google Cloud: gs://tecton-{DEPLOYMENT_NAME}/tecton-billable-usage/{VERSION}/YYYY-MM-DD/yyyy-MM-dd-HH.csv

Billable Usage Logs v1 are available starting July 1st, 2023, and v2 logs are available starting May 1st, 2024.

Logs arrive 24 to 48 hours after the corresponding usage. No files are written for periods with no billable usage. The current VERSION of the metrics schema is v2.
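Given the path layout above, the object key for a particular hour can be built programmatically. A minimal sketch (the deployment name here is a placeholder, not a real value):

```python
from datetime import datetime, timedelta, timezone

DEPLOYMENT_NAME = "acme"  # placeholder; substitute your own deployment name
VERSION = "v2"

def usage_log_key(hour: datetime) -> str:
    """Object key of the log file covering the given UTC hour."""
    return f"tecton-billable-usage/{VERSION}/{hour:%Y-%m-%d}/{hour:%Y-%m-%d-%H}.csv"

# Logs land 24-48 hours after usage, so the newest file that can be
# expected to exist covers roughly two days ago.
latest = datetime.now(timezone.utc) - timedelta(hours=48)
print(f"s3://tecton-{DEPLOYMENT_NAME}/{usage_log_key(latest)}")
```

The same key works for Google Cloud by swapping the `s3://` prefix for `gs://`.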

Usage Metrics CSV Schema

| Name | Description |
| --- | --- |
| accountName | Tecton Account Name |
| timestamp | UTC timestamp at hour granularity (for example: "2023-08-11 00:00:00 UTC"). The file's hour is inclusive of the start time and exclusive of the end time. For example, the file 2023-08-11-02.csv contains usage from "2023-08-11 02:00:00 UTC" through "2023-08-11 02:59:59 UTC". |
| quantity | Number of units used by this Tecton object during the hour |
| unit | Unit of measurement for billable usage (consumption) |
| metadata | Metadata associated with billable usage (e.g. workspace, tecton_object_name) based on the unit |

The metadata field is a serialized JSON string that has a different structure depending on the unit field. Here is a list of the JSON fields for each unit type.

| Consumption Unit (unit field) | Metadata Values (metadata field) |
| --- | --- |
| ONLINE_WRITE_ROWS | tecton_object_name, workspace |
| OFFLINE_WRITE_ROWS | tecton_object_name, workspace |
| OFFLINE_WRITE_VALUES | tecton_object_name, workspace |
| FEATURE_SERVICE_ONLINE_REQUESTS | tecton_object_name, workspace |
| FEATURE_VIEW_ONLINE_READS | tecton_object_name, workspace |
| FEATURE_SERVICE_ONLINE_VECTORS_SERVED | tecton_object_name, workspace |
| TECTON_JOB_COMPUTE_HOURS | tecton_job_id, instance_type, compute_type (one of EMR, DATABRICKS, RIFT), job_type (one of STREAM_MATERIALIZATION, BATCH_MATERIALIZATION, ENTITY_DELETION, FEATURE_PUBLISH, FEATURE_TABLE_INGEST, DATASET_GENERATION), workspace, tecton_object_name, tags |
| FEATURE_SERVER_NODE_HOURS | workspace, tecton_object_name, tecton_object_id |
| SERVER_GROUP_NODE_HOURS | region, instance_type, server_group_name, server_group_type (one of TRANSFORM_SERVER_GROUP, FEATURE_SERVER_GROUP, INGEST_SERVER_GROUP) |

Example File Contents:

"timestamp","accountName","quantity","unit","metadata"
"2023-08-11 02:00:00 UTC","customer-finance","15782","ONLINE_WRITE_ROWS","{""tecton_object_name"":""user_transaction_metrics"",""workspace"":""prod-fraud-detection""}"
"2023-08-11 02:00:00 UTC","customer-finance","4267","FEATURE_SERVICE_ONLINE_REQUESTS","{""tecton_object_name"":""fraud_detection_service"",""workspace"":""prod-fraud-detection""}"
"2023-08-11 02:00:00 UTC","customer-finance","12801","FEATURE_VIEW_ONLINE_READS","{""tecton_object_name"":""user_transaction_metrics"",""workspace"":""prod-fraud-detection""}"
"2023-08-11 02:00:00 UTC","customer-finance","1.5","TECTON_JOB_COMPUTE_HOURS","{""tecton_job_id"":""job-6b27e318-9fc5-4d35-89a3-f12b5ea46c92"",""instance_type"":""c5.2xlarge"",""compute_type"":""RIFT"",""job_type"":""BATCH_MATERIALIZATION"",""workspace"":""prod-fraud-detection"",""tecton_object_name"":""user_transaction_metrics"",""tags"":{""department"":""risk"",""environment"":""production""}}"
"2023-08-11 02:00:00 UTC","customer-finance","2.0","FEATURE_SERVER_NODE_HOURS","{""workspace"":""prod-fraud-detection"",""tecton_object_name"":""fraud_detection_service"",""tecton_object_id"":""fs-9e31f287-d4a5-47c2-a83b-7fe592a0b45d""}"
"2023-08-11 02:00:00 UTC","customer-finance","3.0","SERVER_GROUP_NODE_HOURS","{""region"":""us-west-2"",""instance_type"":""c5.xlarge"",""server_group_name"":""fraud-realtime-serving"",""server_group_type"":""FEATURE_SERVER_GROUP""}"
"2023-08-11 03:00:00 UTC","customer-finance","14593","ONLINE_WRITE_ROWS","{""tecton_object_name"":""user_transaction_metrics"",""workspace"":""prod-fraud-detection""}"
"2023-08-11 03:00:00 UTC","customer-finance","5319","FEATURE_SERVICE_ONLINE_REQUESTS","{""tecton_object_name"":""fraud_detection_service"",""workspace"":""prod-fraud-detection""}"
"2023-08-11 03:00:00 UTC","customer-finance","785","OFFLINE_WRITE_ROWS","{""tecton_object_name"":""user_location_features"",""workspace"":""prod-fraud-detection""}"
"2023-08-11 03:00:00 UTC","customer-finance","1.75","TECTON_JOB_COMPUTE_HOURS","{""tecton_job_id"":""job-7c38f429-0af6-48d2-ba94-e27d5fc31088"",""instance_type"":""c5.4xlarge"",""compute_type"":""EMR"",""job_type"":""STREAM_MATERIALIZATION"",""workspace"":""prod-fraud-detection"",""tecton_object_name"":""realtime_user_metrics"",""tags"":{""department"":""risk"",""environment"":""production""}}"

Analyzing Billable Usage Logs

Once Billable Usage Logs are available in your environment, you can use many types of tools to query and analyze the data.

The following example illustrates how to read and analyze the logs from a standard Jupyter Notebook environment. Note that your environment will need access to the Cloud Storage path containing the logs.

import boto3
import os
import pandas as pd
import json

BUCKET_NAME = "tecton-<DEPLOYMENT_NAME>"
VERSION = "v2"
PREFIX = f"tecton-billable-usage/{VERSION}"
# Assumes AWS credentials & configuration are handled outside Python
# (e.g. in the ~/.aws directory).
s3 = boto3.resource("s3")
local_dir = "<SOME_LOCAL_DIRECTORY>"

# Download every usage log file under the versioned prefix.
bucket = s3.Bucket(BUCKET_NAME)
for obj in bucket.objects.filter(Prefix=PREFIX):
    filename = obj.key.split("/")[-1]
    bucket.download_file(obj.key, os.path.join(local_dir, filename))

# Load all hourly CSV files into a single DataFrame.
df_list = []
for filename in os.listdir(local_dir):
    full_path = os.path.join(local_dir, filename)
    df_list.append(pd.read_csv(full_path))

metrics_df = pd.concat(df_list, ignore_index=True)


def extract_workspace(metadata) -> str:
    metadata_json = json.loads(metadata)
    if "workspace" not in metadata_json:
        return ""
    return metadata_json["workspace"]


# Group usage quantities by unit and workspace.
usage_by_workspace_object = metrics_df.groupby(
    ["unit", metrics_df["metadata"].apply(extract_workspace)]
)["quantity"].sum()
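The same approach extends to cost attribution by tag, for example summing job compute hours per department. A sketch using a small stand-in DataFrame with hypothetical rows (in practice, use the metrics_df loaded above); note that only some units carry tags:

```python
import json
import pandas as pd

# Stand-in for metrics_df, with hypothetical rows.
metrics_df = pd.DataFrame(
    {
        "unit": [
            "TECTON_JOB_COMPUTE_HOURS",
            "TECTON_JOB_COMPUTE_HOURS",
            "ONLINE_WRITE_ROWS",
        ],
        "quantity": [1.5, 1.75, 15782],
        "metadata": [
            '{"tags": {"department": "risk"}}',
            '{"tags": {"department": "growth"}}',
            '{"workspace": "prod-fraud-detection"}',
        ],
    }
)

def extract_department(metadata: str) -> str:
    # Not every unit carries tags, so default to an "untagged" bucket.
    return json.loads(metadata).get("tags", {}).get("department", "untagged")

# Sum compute hours per department tag.
compute_df = metrics_df[metrics_df["unit"] == "TECTON_JOB_COMPUTE_HOURS"]
hours_by_department = compute_df.groupby(
    compute_df["metadata"].apply(extract_department)
)["quantity"].sum()
print(hours_by_department)
```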
