Version: 1.0

ModelConfig

Summary

A Custom Model in Tecton

Example

from tecton import ModelConfig
from tecton.types import Field, String

model_config = ModelConfig(
    name="mymodel_text_embedder",
    model_type="pytorch",
    model_file="model_file.py",
    environments=["custom-ml-env-01"],
    artifact_files=["model.safetensors", "tokenizer_config.json", ...],
    input_schema=[Field("my_text_col", String), ...],
    output_schema=Field("text_embedding")
)

Attributes

  • name (str): Unique name of the model.
  • model_type (str): Type of model (pytorch or text embedding).
  • description (Optional[str]): Description of the model.
  • tags (Optional[Dict[str, str]]): Tags associated with this Tecton object (key-value pairs of arbitrary metadata).
  • model_file (str): Path of the file containing the model, relative to where the Tecton object is defined.
  • artifact_files (Optional[List[str]]): Paths of other files needed to load and run the model, relative to where the Tecton object is defined.
  • environments (List[str]): All the environments this custom model is allowed to run in.
  • input_schema (List[Field]): Input schema of the model. Each field in this schema corresponds to the input_columns specified in the Inference and is mapped by order.
  • output_schema (Field): Output schema of the model.
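Because input_schema is matched to input_columns positionally rather than by name, the i-th field receives the i-th column. A minimal sketch of that positional pairing (the field and column names below are hypothetical, not from the example above):

```python
# input_schema field names, in declaration order (hypothetical)
input_schema_fields = ["my_text_col", "my_lang_col"]

# input_columns as specified on the Inference (hypothetical)
input_columns = ["review_text", "review_language"]

# The pairing is by position, not by name
mapping = dict(zip(input_schema_fields, input_columns))
```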

Methods

  • __init__(...): Initialize a ModelConfig.
  • register(): Register a model locally so that it can be used in Local Feature Views.
  • run(...): Test run a model.

run(...)

Test run a model.

Parameters

  • mock_inputs (Dict[str, List[Any]]): Mock input for each column defined in the input schema.

Returns

DataFrame: pandas.DataFrame with the input and output columns.

Raises

  • TectonValidationError: if the input parameters are invalid.
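The contract above can be illustrated with a small pandas stand-in: run() takes one list of mock values per input-schema column and returns a DataFrame containing those inputs plus the model's output column. This sketch does not call Tecton; the output column name and the fake embedding values are hypothetical.

```python
import pandas as pd

def mock_run(mock_inputs: dict) -> pd.DataFrame:
    """Stand-in for ModelConfig.run(): returns inputs plus an output column."""
    df = pd.DataFrame(mock_inputs)
    # Pretend the model emits one (fake) embedding per input row.
    df["text_embedding"] = [[0.0, 0.0]] * len(df)
    return df

result = mock_run({"my_text_col": ["hello", "world"]})
```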
