AgentClient

The client for interacting with an AgentService. An AgentClient should be created using from_remote or from_local.
Attributes

Name | Data Type | Description |
---|---|---|
logger | Logger | The current logger of the client. The logger can be controlled using the context manager set_logger. |
metastore | Dict[str, Any] | The metastore of the client. The metastore contains the metadata of the tools, prompts, knowledge and other resources. This attribute should not be used directly. |
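As a brief illustration of the logger attribute (a minimal sketch, assuming `client` was already created via from_local or from_remote), the active logger inside a set_logger block is the one that was passed in:

```python
import logging

my_logger = logging.getLogger("agent_client_demo")  # hypothetical logger name

with client.set_logger(my_logger):
    # within the block, client.logger returns my_logger
    client.logger.info("tracing an agent call")
```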
Methods

Name | Description |
---|---|
__init__() | Initialize AgentClient. |
from_local(...) | Create a client for a local agent service. This is for testing and development purposes. |
from_remote(...) | Create a client connecting to a deployed agent service. |
invoke_agent(...) | Invoke an agent for a specific LLM framework. Compared to make_agent, this function is simpler but less flexible. |
invoke_feature_view(...) | Invoke a feature view in the service. |
invoke_prompt(...) | Invoke a prompt in the service. |
invoke_tool(...) | Invoke a tool in the service. |
make_agent(...) | Make an agent for a specific LLM framework. |
search(...) | Search a tool in the service. |
set_context(...) | Set the context for the client. This is a context manager. The context will be used as the arguments for the prompts, tools and knowledge. |
set_logger(...) | Set the logger for the client. This is a context manager. |
__init__()

Initialize AgentClient.
from_local(...)

Create a client for a local agent service. This is for testing and development purposes.

Parameters

- service (AgentService) - The agent service

Returns

AgentClient: The agent client
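A minimal sketch of local usage. Here `service` stands in for an AgentService built as in the invoke_prompt or invoke_tool examples below; no deployment is needed.

```python
from tecton_gen_ai.fco import AgentClient

# `service` is an AgentService defined elsewhere (see the examples below)
client = AgentClient.from_local(service)
```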
from_remote(...)

Create a client connecting to a deployed agent service.

Parameters

- url (str) - The Tecton URL
- workspace (str) - The workspace name
- service (str) - The service name
- api_key (str) - The API key

Returns

AgentClient: The agent client
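A hedged sketch of connecting to a deployed service. The URL, workspace name, and environment variable below are placeholders, not values from this documentation.

```python
import os

from tecton_gen_ai.fco import AgentClient

client = AgentClient.from_remote(
    url="https://mycluster.tecton.ai",     # placeholder Tecton URL
    workspace="prod",                      # placeholder workspace name
    service="app",                         # name of the deployed AgentService
    api_key=os.environ["TECTON_API_KEY"],  # assumes the key is stored in this env var
)
```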
invoke_agent(...)

Invoke an agent for a specific LLM framework. Compared to make_agent, this function is simpler with unified input (str) and output (str), but it is less flexible than getting the agent object of the specific framework and invoking it directly.

Parameters

- llm (Any) - The language model object in a specific framework (e.g. LangChain or LlamaIndex)
- message (str) - The message (question)
- system_prompt (Optional[str]) - The name of the system prompt in the service. Default: None
- chat_history (Any) - The chat history. Default: None
- context (Optional[Dict[str, Any]]) - The context to run the agent; this will override the context set by set_context. Default: None
- kwargs (Any)

Returns

str: The response

Example

```python
from langchain_openai import ChatOpenAI

openai = ChatOpenAI(model="gpt-4o", temperature=0)

from tecton_gen_ai.fco import prompt, AgentService, AgentClient
from tecton_gen_ai.utils.tecton_utils import make_request_source

req = make_request_source(age=int)

@prompt(sources=[req])
def sys_prompt(req):
    return "You are talking to a " + str(req["age"]) + " years old person."

service = AgentService(
    "app",
    prompts=[sys_prompt],
)

client = AgentClient.from_local(service)

with client.set_context({"age": 3}):
    print(client.invoke_agent(openai, "why sky is blue", "sys_prompt"))
with client.set_context({"age": 30}):
    print(client.invoke_agent(openai, "why sky is blue", "sys_prompt"))
```
invoke_feature_view(...)

Invoke a feature view in the service.

Parameters

- name (str) - The name of the feature view
- kwargs (Optional[Dict[str, Any]]) - The arguments for the feature view; the keys should match the entity schema of the feature view. Default: None

Returns

Any: The result of the feature view

Example

```python
from tecton_gen_ai.testing import make_local_feature_view, set_dev_mode

set_dev_mode()

bfv = make_local_feature_view(
    "user_age",
    {"user_id": 1, "age": 30},
    ["user_id"],
    description="The age of a user",
)

from tecton_gen_ai.fco import AgentService, AgentClient

service = AgentService(
    "app",
    tools=[bfv],
)

client = AgentClient.from_local(service)
print(client.invoke_feature_view("user_age", {"user_id": 1}))
```
invoke_prompt(...)

Invoke a prompt in the service.

Parameters

- name (str) - The name of the prompt
- kwargs (Optional[Dict[str, Any]]) - The arguments for the prompt; it overrides the context set by set_context. Default: None

Returns

str: The result of the prompt

Example

```python
from tecton_gen_ai.fco import AgentService, AgentClient, prompt
from tecton_gen_ai.utils.tecton_utils import make_request_source

req = make_request_source(age=int)

@prompt(sources=[req])
def sys_prompt(req):
    return "You are talking to a " + str(req["age"]) + " years old person."

service = AgentService(
    "app",
    prompts=[sys_prompt],
)

client = AgentClient.from_local(service)
print(client.invoke_prompt("sys_prompt", {"age": 3}))
```
invoke_tool(...)

Invoke a tool in the service.

Parameters

- name (str) - The name of the tool
- kwargs (Optional[Dict[str, Any]]) - The arguments for the tool. Default: None

Returns

Any: The result of the tool

Example

```python
from tecton_gen_ai.fco import AgentService, AgentClient, tool

@tool
def get_price_of_fruit(fruit: str) -> int:
    """Get the price of a fruit

    Args:
        fruit: The name of the fruit

    Returns:
        int: The price of the fruit
    """
    return 10 if fruit == "apple" else 5

service = AgentService(
    "app",
    tools=[get_price_of_fruit],
)

client = AgentClient.from_local(service)
print(client.invoke_tool("get_price_of_fruit", {"fruit": "apple"}))
```
make_agent(...)

Make an agent for a specific LLM framework.

Parameters

- llm (Any) - The language model object in a specific framework (e.g. LangChain or LlamaIndex)
- system_prompt (Optional[str]) - The name of the system prompt in the service. Default: None
- kwargs (Any)

Returns

Any: The agent object

Example

```python
from langchain_openai import ChatOpenAI

openai = ChatOpenAI(model="gpt-4o", temperature=0)

from tecton_gen_ai.fco import prompt, AgentService, AgentClient
from tecton_gen_ai.utils.tecton_utils import make_request_source

req = make_request_source(age=int)

@prompt(sources=[req])
def sys_prompt(req):
    return "You are talking to a " + str(req["age"]) + " years old person."

service = AgentService(
    "app",
    prompts=[sys_prompt],
)

client = AgentClient.from_local(service)
agent = client.make_agent(openai, system_prompt="sys_prompt")

with client.set_context({"age": 3}):
    print(agent.invoke({"input": "why sky is blue"}))
with client.set_context({"age": 30}):
    print(agent.invoke({"input": "why sky is blue"}))
```
search(...)

Search a tool in the service.

Parameters

- name (str) - The name of the search tool
- query (str) - The query string
- top_k (int) - The number of results to return. Default: 5
- filter (Optional[Dict[str, Any]]) - The filter for the search; None means no filter. Default: None

Returns

List[Dict[str, Any]]: The search results

Example

```python
from tecton_gen_ai.testing import make_local_source
from tecton_gen_ai.testing.utils import make_local_vector_db_config

df = [
    {"zip": "98005", "item_id": 1, "description": "pencil"},
    {"zip": "98005", "item_id": 2, "description": "car"},
    {"zip": "98005", "item_id": 3, "description": "paper"},
    {"zip": "10065", "item_id": 4, "description": "boat"},
    {"zip": "10065", "item_id": 5, "description": "cheese"},
    {"zip": "10065", "item_id": 6, "description": "apple"},
]

src = make_local_source(
    "for_sale",
    df,
    description="Items information",  # required for source_as_knowledge
)

vdb_conf = make_local_vector_db_config()

# Create a knowledge base from the source
from tecton_gen_ai.fco import source_as_knowledge

knowledge = source_as_knowledge(
    src,
    vector_db_config=vdb_conf,
    vectorize_column="description",
    filter=[("zip", str, "the zip code of the item for sale")],
)

# Serve the knowledge base
from tecton_gen_ai.fco import AgentService

service = AgentService(
    "app",
    knowledge=[knowledge],
)

# Test locally
from tecton_gen_ai.fco import AgentClient

client = AgentClient.from_local(service)

# search without filter
print(client.search("for_sale", query="fruit"))

# search with filter
print(client.search("for_sale", query="fruit", top_k=3, filter={"zip": "27001"}))
print(client.search("for_sale", query="fruit", top_k=3, filter={"zip": "10065"}))
```
set_context(...)

Set the context for the client. This is a context manager. The context will be used as the arguments for the prompts, tools and knowledge.

Parameters

- context (Dict[str, Any]) - The new context

Example

```python
context = {"a": 1, "b": 2}
new_args = {"b": 3, "c": 4}

with client.set_context(context):
    # the context will be used as the arguments of my_tool
    # new_args will override the context
    # the final arguments for my_tool will be {"a": 1, "b": 3, "c": 4}
    client.invoke_tool("my_tool", new_args)
```
set_logger(...)

Set the logger for the client. This is a context manager.

Parameters

- logger (Optional[logging.Logger]) - The new logger, or None to use the no-op logger

Example

```python
with client.set_logger(logger):
    # do something
    ...
```