# ModelConfig

## Summary

A Custom Model in Tecton.

### Example
```python
from tecton import ModelConfig

model_config = ModelConfig(
    name="mymodel_text_embedder",
    model_type="pytorch",
    model_file="model_file.py",
    environments=["custom-ml-env-01"],
    artifact_files=["model.safetensors", "tokenizer_config.json", …],
    input_schema=[Field("my_text_col", String), …],
    output_schema=Field("text_embedding"),
)
```
## Attributes
| Name | Data Type | Description |
|---|---|---|
| name | str | Unique name of the model. |
| model_type | str | Type of model (`pytorch` or text embedding). |
| description | Optional[str] | Description of the model. |
| tags | Optional[Dict[str, str]] | Tags associated with this Tecton object (key-value pairs of arbitrary metadata). |
| model_file | str | Path to the file containing the model, relative to where the Tecton object is defined. |
| artifact_files | Optional[List[str]] | Paths of other files needed to load and run the model, relative to where the Tecton object is defined. |
| environments | List[str] | The environments in which this custom model is allowed to run. |
| input_schema | List[Field] | Input schema for the model. Each field in this schema corresponds to one of the `input_columns` specified in the Inference and is mapped by order. |
| output_schema | Field | Output schema of the model. |
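As the `input_schema` row above notes, schema fields are matched to `input_columns` positionally, not by name. The following is a minimal stdlib sketch of that order-based mapping rule; the `Field` class here is a stand-in for Tecton's, and the column names are hypothetical:

```python
# Illustrative sketch of order-based schema mapping (not Tecton internals).
from typing import NamedTuple

class Field(NamedTuple):  # stand-in for tecton.types.Field
    name: str
    dtype: str

# Hypothetical model input schema and inference input_columns.
input_schema = [Field("my_text_col", "String"), Field("my_id_col", "Int64")]
input_columns = ["raw_text", "user_id"]

# Positional mapping: schema field i receives column i, regardless of names.
mapping = {col: field for col, field in zip(input_columns, input_schema)}
print(mapping["raw_text"].name)  # → my_text_col
```

Because the mapping is positional, reordering `input_columns` changes which column feeds each schema field even if the names stay the same.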
## Methods
| Name | Description |
|---|---|
| `__init__(...)` | Initialize a ModelConfig. |
| `register()` | Register a model locally so that it can be used in local Feature Views. |
| `run(...)` | Test-run a model. |