
Billable Usage Logs

Billable Usage Logs contain detailed information about your use of the Tecton platform, and how that use translates into Tecton Credits. By analyzing these logs, you can understand your usage trends, attribute usage within your organization, and optimize costs.

Logs are written as CSV files to the Cloud Provider Object Storage you configured during your Tecton deployment.

Accessing Billable Usage Logs

Billable Usage Logs are available in your Cloud Provider's Object Storage. How you access them depends on how your Tecton account is configured.

  • For AWS: s3://tecton-{DEPLOYMENT_NAME}/tecton-billable-usage/{VERSION}/YYYY-MM-DD/yyyy-MM-dd-HH.csv
  • For Google Cloud: gs://tecton-{DEPLOYMENT_NAME}/tecton-billable-usage/{VERSION}/YYYY-MM-DD/yyyy-MM-dd-HH.csv
  • For Tecton on Snowflake: Open a ticket with Tecton Support to configure Object Storage.
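
For example, on AWS you can list one day's log files with boto3. The following is a minimal sketch; the deployment name and date below are placeholders:

import boto3

# Placeholders: substitute your deployment name and a UTC date.
DEPLOYMENT_NAME = "my-deployment"
DATE = "2024-05-10"

# Assumes AWS credentials are configured outside Python (e.g. in ~/.aws).
s3 = boto3.client("s3")
response = s3.list_objects_v2(
    Bucket=f"tecton-{DEPLOYMENT_NAME}",
    Prefix=f"tecton-billable-usage/v2/{DATE}/",
)
for obj in response.get("Contents", []):
    print(obj["Key"])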

Billable Usage Logs v1 are available from July 1st, 2023, and v2 from May 1st, 2024.

Logs arrive 24 to 48 hours after the usage they record. No files are written for hours with no billable usage. The current VERSION of the metrics schema is v2.

Usage Metrics CSV Schema

The CSV contains the following columns:

  • accountName: Tecton Account Name
  • timestamp: UTC timestamp at hour granularity (for example: "2023-08-11 00:00:00 UTC"). Each hourly file is inclusive of its start time and exclusive of its end time: for example, the file 2023-08-11-02.csv contains usage from "2023-08-11 02:00:00 UTC" up to, but not including, "2023-08-11 03:00:00 UTC".
  • quantity: Number of units used by this Tecton object during the hour
  • unit: Unit of measurement for billable usage
  • metadata: Metadata associated with billable usage (for example: workspace, tectonObjectName)

Example File Contents:

"timestamp","accountName","quantity","unit","metadata"
"2024-05-10T19:00:00Z","sampleAccount",1.0,"FEATURE_SERVER_NODE_HOURS","{""region"":""us-west-2""}"
"2024-05-10T19:32:21.938482416Z","sampleAccount",0.025555555555555557,"TECTON_JOB_COMPUTE_HOURS","{""tectonJobId"":""340e97d623b644e9ad631dd7cec6b515"",""instanceType"":""m6a.2xlarge"",""workspace"":""workspace_1"",""tectonObjectName"":""user_transaction_metrics"",""tags"":{""release"":""production""},""computeType"":""RIFT"",""jobType"":""BATCH_MATERIALIZATION""}"

Analyzing Billable Usage Logs

Once Billable Usage Logs are available in your environment, you can use many types of tools to query and analyze the data.

The following example illustrates how to read and analyze the logs from a standard Jupyter Notebook environment. Note that your environment must be able to access the Cloud Storage path containing the logs.

import boto3
import json
import os

import pandas as pd

BUCKET_NAME = "tecton-<DEPLOYMENT_NAME>"
VERSION = "v2"
PREFIX = f"tecton-billable-usage/{VERSION}"
local_dir = "<SOME_LOCAL_DIRECTORY>"

# Assumes AWS credentials & configuration are handled outside Python
# (e.g. in the ~/.aws directory).
s3 = boto3.resource("s3")

# Download every hourly log file under the billable usage prefix.
os.makedirs(local_dir, exist_ok=True)
bucket = s3.Bucket(BUCKET_NAME)
for obj in bucket.objects.filter(Prefix=PREFIX):
    filename = obj.key.split("/")[-1]
    bucket.download_file(obj.key, os.path.join(local_dir, filename))

# Load all downloaded CSV files into a single DataFrame.
df_list = []
for filename in os.listdir(local_dir):
    full_path = os.path.join(local_dir, filename)
    df_list.append(pd.read_csv(full_path))

metrics_df = pd.concat(df_list, ignore_index=True)


def extract_workspace(metadata: str) -> str:
    """Return the workspace from the JSON metadata column, or "" if absent."""
    return json.loads(metadata).get("workspace", "")


# Group usage quantity by unit and workspace.
usage_by_workspace_object = metrics_df.groupby(
    ["unit", metrics_df["metadata"].apply(extract_workspace)]
)["quantity"].sum()
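
From here, standard pandas operations apply. For example, to surface the largest consumers, a minimal sketch using the grouped result above:

# Show the ten largest unit + workspace usage totals.
print(usage_by_workspace_object.sort_values(ascending=False).head(10))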
