Version: 1.0

AgentClient

The client for Tecton agent service. The client should always be created using the static methods from_remote or from_local.

Attributes

| Name | Data Type | Description |
| --- | --- | --- |
| logger | Logger | The current logger of the client. The logger can be controlled using the context manager set_logger. |
| metastore | Dict[str, Any] | The metastore of the client, containing the metadata of the tools, prompts, knowledge and other resources. This attribute should not be used directly. |

Methods

| Name | Description |
| --- | --- |
| __init__() | Initialize AgentClient. |
| from_local(...) | Create a client for a local agent service. This is for testing and development. |
| from_remote(...) | Create a client connecting to a deployed agent service. |
| invoke_agent(...) | Invoke an agent for a specific LLM framework. Simpler than make_agent, with unified string input and output. |
| invoke_feature_view(...) | Invoke a feature view in the service. |
| invoke_prompt(...) | Invoke a prompt in the service. |
| invoke_tool(...) | Invoke a tool in the service. |
| make_agent(...) | Make an agent for a specific LLM framework. |
| search(...) | Search with a search tool in the service. |
| set_context(...) | Set the context for the client. This is a context manager. |
| set_logger(...) | Set the logger for the client. This is a context manager. |

__init__()

Initialize AgentClient. Do not construct clients directly; use from_local or from_remote instead.

from_local(...)

Create a client for a local agent service. This is for testing and development purposes.

Parameters

  • service (AgentService) - The agent service

Returns

AgentClient: The agent client
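
Example

A minimal sketch of creating a local client. The hello tool below is hypothetical and stands in for your service's real tools and prompts.

from tecton_gen_ai.fco import AgentService, AgentClient, tool

@tool
def hello(name: str) -> str:
    '''
    Greet a user by name.

    Args:
        name: The name of the user

    Returns:
        str: The greeting
    '''
    return "hello " + name

# Everything runs in-process; nothing needs to be deployed.
service = AgentService("app", tools=[hello])
client = AgentClient.from_local(service)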

from_remote(...)

Create a client connecting to a deployed agent service.

Parameters

  • url (str) - The Tecton URL

  • workspace (str) - The workspace name

  • service (str) - The service name

  • api_key (str) - The API key

Returns

AgentClient: The agent client
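
Example

A minimal sketch using the documented parameter names as keyword arguments; all four values are placeholders for your own deployment.

from tecton_gen_ai.fco import AgentClient

client = AgentClient.from_remote(
    url="https://example.tecton.ai",  # placeholder Tecton URL
    workspace="my_workspace",  # placeholder workspace name
    service="app",  # name of the deployed agent service
    api_key="my_api_key",  # placeholder API key
)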

invoke_agent(...)

Invoke an agent for a specific LLM framework. Compared to make_agent, this function is simpler, with a unified input (str) and output (str), but it is less flexible than getting the framework-specific agent object and invoking it directly.

Parameters

  • llm (Any) - The language model object from a specific framework (e.g. LangChain or LlamaIndex)

  • message (str) - The message (question)

  • system_prompt (Optional[str]) - The name of the system prompt in the service. Default: None

  • chat_history (Any) - The chat history. Default: None

  • context (Optional[Dict[str,Any]]) - The context for running the agent; this overrides the context set by set_context. Default: None

  • kwargs (Any) - Additional keyword arguments

Returns

str: The response

Example

from langchain_openai import ChatOpenAI

from tecton_gen_ai.fco import prompt, AgentService, AgentClient
from tecton_gen_ai.utils.tecton_utils import make_request_source

llm = ChatOpenAI(model="gpt-4o", temperature=0)

req = make_request_source(age=int)

@prompt(sources=[req])
def sys_prompt(req):
    return "You are talking to a " + str(req["age"]) + " years old person."

service = AgentService(
    "app",
    prompts=[sys_prompt],
)

client = AgentClient.from_local(service)

with client.set_context({"age": 3}):
    print(client.invoke_agent(llm, "why sky is blue", "sys_prompt"))
with client.set_context({"age": 30}):
    print(client.invoke_agent(llm, "why sky is blue", "sys_prompt"))

invoke_feature_view(...)

Invoke a feature view in the service.

Parameters

  • name (str) - The name of the feature view

  • kwargs (Optional[Dict[str,Any]]) - The arguments for the feature view; the keys should match the entity schema of the feature view. Default: None

Returns

Any: The result of the feature view

Example

from tecton_gen_ai.testing import make_local_feature_view, set_dev_mode
from tecton_gen_ai.fco import AgentService, AgentClient

set_dev_mode()

bfv = make_local_feature_view(
    "user_age",
    {"user_id": 1, "age": 30},
    ["user_id"],
    description="The age of a user",
)

service = AgentService(
    "app",
    tools=[bfv],
)

client = AgentClient.from_local(service)
print(client.invoke_feature_view("user_age", {"user_id": 1}))

invoke_prompt(...)

Invoke a prompt in the service.

Parameters

  • name (str) - The name of the prompt

  • kwargs (Optional[Dict[str,Any]]) - The arguments for the prompt; these override the context set by set_context. Default: None

Returns

str: The result of the prompt

Example

from tecton_gen_ai.fco import AgentService, AgentClient, prompt
from tecton_gen_ai.utils.tecton_utils import make_request_source

req = make_request_source(age=int)

@prompt(sources=[req])
def sys_prompt(req):
    return "You are talking to a " + str(req["age"]) + " years old person."

service = AgentService(
    "app",
    prompts=[sys_prompt],
)

client = AgentClient.from_local(service)
print(client.invoke_prompt("sys_prompt", {"age": 3}))

invoke_tool(...)

Invoke a tool in the service.

Parameters

  • name (str) - The name of the tool

  • kwargs (Optional[Dict[str,Any]]) - The arguments for the tool. Default: None

Returns

Any: The result of the tool

Example

from tecton_gen_ai.fco import AgentService, AgentClient, tool

@tool
def get_price_of_fruit(fruit: str) -> int:
    '''
    Get the price of a fruit.

    Args:
        fruit: The name of the fruit

    Returns:
        int: The price of the fruit
    '''
    return 10 if fruit == "apple" else 5

service = AgentService(
    "app",
    tools=[get_price_of_fruit],
)

client = AgentClient.from_local(service)
print(client.invoke_tool("get_price_of_fruit", {"fruit": "apple"}))

make_agent(...)

Make an agent for a specific LLM framework.

Parameters

  • llm (Any) - The language model object from a specific framework (e.g. LangChain or LlamaIndex)

  • system_prompt (Optional[str]) - The name of the system prompt in the service. Default: None

  • kwargs (Any) - Additional keyword arguments

Returns

Any: The agent object

Example

from langchain_openai import ChatOpenAI

from tecton_gen_ai.fco import prompt, AgentService, AgentClient
from tecton_gen_ai.utils.tecton_utils import make_request_source

openai = ChatOpenAI(model="gpt-4o", temperature=0)

req = make_request_source(age=int)

@prompt(sources=[req])
def sys_prompt(req):
    return "You are talking to a " + str(req["age"]) + " years old person."

service = AgentService(
    "app",
    prompts=[sys_prompt],
)

client = AgentClient.from_local(service)
agent = client.make_agent(openai, system_prompt="sys_prompt")

with client.set_context({"age": 3}):
    print(agent.invoke({"input": "why sky is blue"}))
with client.set_context({"age": 30}):
    print(agent.invoke({"input": "why sky is blue"}))
search(...)

Search with a search tool in the service.

Parameters

  • name (str) - The name of the search tool

  • query (str) - The query string

  • top_k (int) - The number of results to return. Default: 5

  • filter (Optional[Dict[str,Any]]) - The filter for the search; None means no filter. Default: None

Returns

List[Dict[str,Any]]: The search results

Example

from tecton_gen_ai.testing import make_local_source
from tecton_gen_ai.testing.utils import make_local_vector_db_config

df = [
    {"zip": "98005", "item_id": 1, "description": "pencil"},
    {"zip": "98005", "item_id": 2, "description": "car"},
    {"zip": "98005", "item_id": 3, "description": "paper"},
    {"zip": "10065", "item_id": 4, "description": "boat"},
    {"zip": "10065", "item_id": 5, "description": "cheese"},
    {"zip": "10065", "item_id": 6, "description": "apple"},
]

src = make_local_source(
    "for_sale",
    df,
    description="Items information",  # required for source_as_knowledge
)

vdb_conf = make_local_vector_db_config()

# Create a knowledge base from the source
from tecton_gen_ai.fco import source_as_knowledge

knowledge = source_as_knowledge(
    src,
    vector_db_config=vdb_conf,
    vectorize_column="description",
    filter=[("zip", str, "the zip code of the item for sale")],
)

# Serve the knowledge base
from tecton_gen_ai.fco import AgentService

service = AgentService(
    "app",
    knowledge=[knowledge],
)

# Test locally
from tecton_gen_ai.fco import AgentClient

client = AgentClient.from_local(service)

# Search without a filter
print(client.search("for_sale", query="fruit"))

# Search with a filter that matches no items (zip 27001 is not in the data)
print(client.search("for_sale", query="fruit", top_k=3, filter={"zip": "27001"}))

# Search with a filter that matches items in zip 10065
print(client.search("for_sale", query="fruit", top_k=3, filter={"zip": "10065"}))

set_context(...)

Set the context for the client. This is a context manager. The context will be used as the arguments for the prompts, tools and knowledge.

Parameters

  • context (Optional[Dict[str,Any]]) - The new context, or None to clear the context


Example

context = {"a": 1, "b": 2}
new_args = {"b": 3, "c": 4}

with client.set_context(context):
    # The context is used as the default arguments of my_tool;
    # new_args overrides the context, so the final arguments
    # for my_tool are {"a": 1, "b": 3, "c": 4}.
    client.invoke_tool("my_tool", new_args)

set_logger(...)

Set the logger for the client. This is a context manager.

Parameters

  • logger (Optional[logging.Logger]) - The new logger, or None to use the no-op logger


Example

with client.set_logger(logger):
    ...  # do something; operations inside the block use the new logger
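
A slightly fuller sketch, assuming a standard library logger; the logger name and the tool invocation are illustrative.

import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("agent_demo")  # illustrative logger name

with client.set_logger(logger):
    # Calls inside the block are logged with `logger`; when the block
    # exits, the previous logger is restored.
    client.invoke_tool("my_tool", {"a": 1})  # hypothetical tool name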
