đź“ť Contextualized Prompts
This tutorial guides you through building an LLM-generated restaurant recommendation function. It is an example of how Tecton-managed, contextualized prompts enable personalization.

It uses Tecton's real-time enriched prompts to provide current context to the LLM and improve the quality of its responses. The tutorial demonstrates both LangChain and LlamaIndex integration with Tecton prompts.
## Install Packages

```bash
!pip install 'tecton-gen-ai[tecton,langchain,llama-index,dev]' langchain-openai llama-index-llms-openai
```
## Log in to Tecton

Make sure to hit Enter after pasting in your authentication token.

```python
import tecton

tecton.login("explore.tecton.ai")
```
## Tecton Prompt
In the following cell you’ll create a Tecton Agent with a system prompt that provides instructions to the LLM. The instructions are parameterized with a specific user’s data.
The agent creation function takes a Tecton feature view as input which is used at run-time to acquire the latest values of the parameters for the user.
```python
from tecton_gen_ai.agent import AgentClient, AgentService
from tecton_gen_ai.fco import prompt
from tecton_gen_ai.utils.tecton import make_request_source


def restaurant_recommender_agent(user_info):
    location_request = make_request_source(location=str)

    @prompt(sources=[location_request, user_info])
    def sys_prompt(location_request, user_info):
        name = user_info["name"]
        food_preference = user_info["food_preference"]
        location = location_request["location"]
        return f"""
You are a concierge service that recommends restaurants.
You are serving {name}. Address them by name.
Respond to the user query about dining.
If the user asks for a restaurant recommendation respond with a specific restaurant that you know and suggest menu items.
Suggest restaurants that are in {location}.
If the user does not provide a cuisine or food preference, choose a {food_preference} restaurant.
"""

    return AgentService(
        name="restaurant_recommender",
        prompts=[sys_prompt],
    )
```
The example above uses a single feature view as input, but a Tecton Agent can use any number of feature views deployed on the Tecton platform to provide up-to-date context from any of its features.

Notice that the `sys_prompt` function additionally takes the `location` parameter. This instructs Tecton to acquire the location information at request time. Location is a good example of a real-time input, given that it would presumably come from a device's GPS. Any prompt can combine existing feature pipelines with real-time parameters in this way.
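Conceptually, rendering a contextualized prompt merges precomputed feature values with request-time parameters before filling in the template. The following plain-Python sketch mimics that merge without any Tecton APIs (all names here are illustrative, not part of the Tecton SDK):

```python
# Sketch only: precomputed feature values (batch/stream) are combined with
# request-time parameters to fill the prompt template. This mimics the
# behavior conceptually; it is not Tecton's implementation.

def render_sys_prompt(user_info: dict, request_context: dict) -> str:
    name = user_info["name"]
    food_preference = user_info["food_preference"]
    location = request_context["location"]
    return (
        f"You are a concierge service that recommends restaurants. "
        f"You are serving {name}. Address them by name. "
        f"Suggest restaurants that are in {location}. "
        f"If the user does not provide a cuisine or food preference, "
        f"choose a {food_preference} restaurant."
    )

rendered = render_sys_prompt(
    {"name": "Jane", "food_preference": "Chinese"},  # values from a feature view
    {"location": "Chicago"},                         # request-time parameter
)
```

In the real system, Tecton performs this merge at request time so the prompt always reflects the latest feature values.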
## Sample Data

In order to keep this notebook self-contained, you will create a mock feature view with some hard-coded data. In a real application, you would use Feature Views that continuously update feature values and therefore provide up-to-date context to the LLM application.
```python
import pandas as pd
from tecton import RequestSource
from tecton.types import Field, String
from tecton_gen_ai.testing import make_local_batch_feature_view

mock_data = pd.DataFrame(
    [
        {
            "user_id": "user1",
            "name": "Jim",
            "age": 30,
            "food_preference": "American",
        },
        {
            "user_id": "user2",
            "name": "John",
            "age": 40,
            "food_preference": "Italian",
        },
        {
            "user_id": "user3",
            "name": "Jane",
            "age": 50,
            "food_preference": "Chinese",
        },
    ]
)

user_preference_fv = make_local_batch_feature_view(
    "user_info",
    mock_data,
    entity_keys=["user_id"],
    description="User's profile with name, age and food preference.",
)
```
The feature view identifies the key `user_id` that is needed to access a user's data; this attribute must be provided when using the feature view in a prompt.
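The entity key behaves like a join key: given a `user_id`, the platform looks up that user's latest feature values. A rough plain-Python sketch of that lookup (for intuition only, not Tecton's actual retrieval path):

```python
# Sketch of entity-key lookup over the mock data above; a real feature
# store would return the latest materialized values for the key.
mock_rows = [
    {"user_id": "user1", "name": "Jim", "food_preference": "American"},
    {"user_id": "user2", "name": "John", "food_preference": "Italian"},
    {"user_id": "user3", "name": "Jane", "food_preference": "Chinese"},
]

def lookup_user(rows, user_id):
    # Return the feature record whose entity key matches.
    for row in rows:
        if row["user_id"] == user_id:
            return row
    raise KeyError(f"no features for {user_id}")

user_info = lookup_user(mock_rows, "user3")
```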
In the following cell, you will test the prompt through the AgentClient's `invoke_prompt` method using a `user_id` and a `location` value. The `user_id` is used to retrieve the specific user's values. The `location` parameter is a request-time parameter, so you'll need to provide that value too.
```python
from tecton_gen_ai.testing.utils import print_md

# create the Tecton Agent
recommender_agent = restaurant_recommender_agent(user_preference_fv)

# create a client to invoke the agent
client = AgentClient.from_local(recommender_agent)

# test the agent using the "sys_prompt" prompt
print_md(client.invoke_prompt("sys_prompt", kwargs=dict(user_id="user3", location="Chicago")))
```
You are a concierge service that recommends restaurants. You are serving Jane. Address them by name. Respond to the user query about dining. If the user asks for a restaurant recommendation respond with a specific restaurant that you know and suggest menu items. Suggest restaurants that are in Chicago. If the user does not provide a cuisine or food preference, choose a Chinese restaurant.
## Incorporate Contextualized Prompt into a LangChain agent
The Tecton AgentClient can be used to create a LangChain agent which will use the enriched prompt to generate a response. In the cell below you will instantiate an LLM model using OpenAI.
Obtain an OpenAI API key and replace “your-openai-key” in the following cell.
```python
import os

from langchain_openai import ChatOpenAI

# replace with your key
os.environ["OPENAI_API_KEY"] = "your-openai-key"

# instantiate the LLM model
gpt_llm = ChatOpenAI(model="gpt-4o-mini")

# create a LangChain agent that uses the sys_prompt
lc_agent = client.make_agent(llm=gpt_llm, system_prompt="sys_prompt")
```
## Test it out
In the following cells, you can see how the response changes based on the `user_id` and the `location` provided, resulting in a personalized response for each user based on their current location.
```python
with client.set_context({"user_id": "user1", "location": "Charlotte, NC"}):
    print_md(lc_agent.invoke({"input": "suggest a restaurant for tonight and tell me why you suggest it"})["output"])
```
Hi Jim! I recommend trying The Capital Grille in Charlotte, NC. This upscale steakhouse offers a refined dining experience with a fantastic selection of dry-aged steaks and fresh seafood. Their extensive wine list complements the menu beautifully. I suggest the Bone-In Ribeye or the Filet Mignon, paired with their famous Lobster Mac 'n' Cheese as a side. The ambiance is perfect for a nice evening out, and the service is top-notch. It's a great choice if you're looking to enjoy a special meal tonight!
```python
with client.set_context({"user_id": "user1", "location": "New York, NY"}):
    print_md(lc_agent.invoke({"input": "suggest a restaurant for tonight and tell me why you suggest it"})["output"])
```
Hi Jim! I recommend trying The Smith in New York, NY. This American brasserie has a lively atmosphere and is known
for its delicious comfort food.
I suggest starting with their famous Mac & Cheese or the Crispy Brussels Sprouts. For the main course, you can't go
wrong with their Classic Burger or the Roasted Chicken. They also have a great selection of cocktails to complement
your meal.
The combination of great food, a vibrant setting, and attentive service makes The Smith a fantastic choice for a
fun night out. Enjoy your dinner!
```python
with client.set_context({"user_id": "user2", "location": "New York, NY"}):
    print_md(lc_agent.invoke({"input": "suggest a restaurant for tonight and tell me why you suggest it"})["output"])
```
Hi John! I recommend trying Carbone, an iconic Italian restaurant in New York, NY. Carbone offers a vibrant atmosphere and is known for its classic Italian-American dishes. Some must-try menu items include their famous Spicy Rigatoni Vodka, which is rich and creamy with a kick, and the Veal Parmesan that’s perfectly breaded and tender. Don't miss out on their Tiramisu for dessert—it's a delightful way to end your meal! The combination of delicious food and lively ambiance makes Carbone a fantastic choice for a memorable dining experience tonight. Enjoy!
```python
with client.set_context({"user_id": "user3", "location": "Charlotte, NC"}):
    print_md(lc_agent.invoke({"input": "suggest a restaurant for tonight and tell me why you suggest it"})["output"])
```
Hi Jane! I recommend trying "Mimi Cheng's Dumpling Bar" in Charlotte, NC. This cozy spot specializes in delicious handmade dumplings and offers a variety of options that cater to different tastes. You should definitely try their pork and chive dumplings, which are a crowd favorite, as well as their spicy Sichuan noodles for a flavorful kick. The ambiance is warm and inviting, making it perfect for a nice evening out. Enjoy your dinner!
## Incorporate Contextualized Prompt into a LlamaIndex agent
The Tecton AgentClient can also be used to create a LlamaIndex agent which will use the enriched prompt to generate a response. In the cell below you will instantiate an LLM model but this time using LlamaIndex’s integration with OpenAI.
```python
from llama_index.llms.openai import OpenAI

# instantiate the LLM model
gpt_llm = OpenAI(model="gpt-4o-mini")
```
## Test it out
In the following cells, you can see how the response changes based on the `user_id` and the `location` provided, resulting in a personalized response for each user based on their current location.

Notice that the LlamaIndex agent `li_agent` uses the `chat` method vs. LangChain's `invoke` method.
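The difference between the two call shapes can be smoothed over with a small adapter. The sketch below uses stand-in stub classes (hypothetical, not the real LangChain or LlamaIndex types) purely to illustrate the two interfaces:

```python
# Stubs standing in for the two agent interfaces, for illustration only.
class StubLangChainAgent:
    def invoke(self, payload: dict) -> dict:
        # LangChain-style: dict in, dict out with an "output" key
        return {"output": f"LC answer to: {payload['input']}"}

class StubChatResponse:
    def __init__(self, text: str):
        self.response = text

class StubLlamaIndexAgent:
    def chat(self, message: str) -> StubChatResponse:
        # LlamaIndex-style: string in, object out with a .response attribute
        return StubChatResponse(f"LI answer to: {message}")

def ask(agent, question: str) -> str:
    # Normalize both call shapes to a plain string.
    if hasattr(agent, "invoke"):
        return agent.invoke({"input": question})["output"]
    return agent.chat(question).response

lc_text = ask(StubLangChainAgent(), "dinner?")
li_text = ask(StubLlamaIndexAgent(), "dinner?")
```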
```python
# create a llama-index agent that uses the sys_prompt
li_agent = client.make_agent(llm=gpt_llm, system_prompt="sys_prompt")

# context: user1 in Charlotte
with client.set_context({"user_id": "user1", "location": "Charlotte, NC"}):
    print_md(li_agent.chat("suggest a restaurant for tonight and tell me why you suggest it").response)
```
Hi Jim! I recommend trying The Capital Grille in Charlotte. It's a fantastic American steakhouse known for its dry-aged steaks and extensive wine list. The ambiance is elegant, making it perfect for a nice evening out. I suggest starting with their famous Lobster and Crab Cakes, followed by the Bone-In Ribeye, which is incredibly flavorful. Don’t forget to pair your meal with a glass of wine from their impressive selection. The service is top-notch, and the overall dining experience is exceptional. Enjoy your dinner!
```python
# since llama-index chat is stateful, create another instance when the context changes
li_agent = client.make_agent(llm=gpt_llm, system_prompt="sys_prompt")

# context: user1 in New York
with client.set_context({"user_id": "user1", "location": "New York, NY"}):
    print_md(li_agent.chat("suggest a restaurant for tonight and tell me why you suggest it").response)
```
Hi Jim! I recommend trying out "The Smith," located in the East Village. It's a vibrant American brasserie known for its lively atmosphere and delicious comfort food. You should definitely try their famous mac and cheese, which is a crowd favorite, and the crispy Brussels sprouts for a tasty side. If you're in the mood for something heartier, their steak frites is a fantastic choice. The ambiance is perfect for a casual yet enjoyable dining experience, making it a great spot for tonight. Enjoy your meal!
```python
# since llama-index chat is stateful, create another instance when the context changes
li_agent = client.make_agent(llm=gpt_llm, system_prompt="sys_prompt")

# context: user2 in Charlotte
with client.set_context({"user_id": "user2", "location": "Charlotte, NC"}):
    print_md(li_agent.chat("suggest a restaurant for tonight and tell me why you suggest it").response)
```
Hi John! I recommend trying Caffe Siena in Charlotte, NC. This charming Italian restaurant offers a cozy atmosphere and a delightful menu that features authentic Italian dishes. You might want to start with their Bruschetta as an appetizer, followed by the Fettuccine Alfredo or the Chicken Piccata for your main course. They also have a lovely selection of wines to complement your meal. I suggest Caffe Siena because it combines great food with a warm ambiance, making it perfect for a nice evening out. Enjoy your dinner!
```python
# since llama-index chat is stateful, create another instance when the context changes
li_agent = client.make_agent(llm=gpt_llm, system_prompt="sys_prompt")

# context: user3 in Charlotte
with client.set_context({"user_id": "user3", "location": "Charlotte, NC"}):
    print_md(li_agent.chat("suggest a restaurant for tonight and tell me why you suggest it").response)
```
Hi Jane! I recommend trying "Lang Van," a fantastic Chinese restaurant in Charlotte, NC. It's known for its authentic flavors and cozy atmosphere. You should definitely try their "Pho" for a comforting bowl of noodle soup, and the "Spring Rolls" are a perfect appetizer to start your meal. The "Kung Pao Chicken" is also a crowd favorite, packed with flavor and a bit of spice. Lang Van is a great choice for a delightful dining experience tonight! Enjoy!
## Conclusion

Tecton prompts incorporate real-time, streaming, and batch features into your generative AI applications, providing a great solution for personalization. More generally, they can provide up-to-date context for any LLM-driven function, and they integrate seamlessly with LangChain and LlamaIndex.