Version: Beta 🚧

đź“ť Contextualized Prompts


This tutorial guides you through creating an LLM-generated restaurant recommendation function. It is an example of how Tecton-managed, contextualized prompts enable personalization.

It uses Tecton’s real-time enriched prompts to provide current context to the LLM, improving the quality of its responses. The tutorial demonstrates integration with both LangChain and LlamaIndex.

Install Packages

_ = !pip install 'tecton-gen-ai[tecton,langchain,llama-index,dev]' langchain-openai llama-index-llms-openai

Setup

from tecton_gen_ai.testing import set_dev_mode

set_dev_mode()

Tecton Prompt

In the following cell you’ll create a Tecton Agent with a system prompt that provides instructions to the LLM. The instructions are parameterized with a specific user’s data.

The agent creation function takes a Tecton feature view as input which is used at run-time to acquire the latest values of the parameters for the user.

from tecton_gen_ai.fco import prompt, AgentClient, AgentService
from tecton_gen_ai.utils.tecton_utils import make_request_source


def restaurant_recommender_agent(user_info):
    location_request = make_request_source(location=str)

    @prompt(sources=[location_request, user_info])
    def sys_prompt(location_request, user_info):
        name = user_info["name"]
        food_preference = user_info["food_preference"]
        location = location_request["location"]
        return f"""
You are a concierge service that recommends restaurants.
You are serving {name}. Address them by name.
Respond to the user query about dining.
If the user asks for a restaurant recommendation, respond with a specific restaurant that you know and suggested menu items.
Suggest restaurants that are in {location}.
If the user does not provide a cuisine or food preference, choose a {food_preference} restaurant.
"""

    return AgentService(
        name="restaurant_recommender",
        prompts=[sys_prompt],
    )

The example above uses a single feature view as input. A Tecton Agent can use any number of feature views deployed on the Tecton platform, providing up-to-date context from any of your features.

Notice that the sys_prompt function additionally takes a location parameter. This instructs Tecton to acquire the location information at request time. Location is a good example of a real-time input, since it would typically come from a device’s GPS. Any prompt can combine existing feature pipelines with real-time parameters in this way.
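To make the mechanics concrete, here is a plain-Python sketch (independent of Tecton) of how a parameterized system prompt combines stored user features with a request-time value:

```python
# Illustrative only: a hand-rolled version of what the sys_prompt template does.
def render_sys_prompt(user_info: dict, location: str) -> str:
    """Fill the prompt template with feature values and a request-time location."""
    return (
        f"You are a concierge service that recommends restaurants.\n"
        f"You are serving {user_info['name']}. Address them by name.\n"
        f"Suggest restaurants that are in {location}.\n"
        f"If no cuisine is given, choose a {user_info['food_preference']} restaurant."
    )


prompt_text = render_sys_prompt(
    {"name": "Jane", "food_preference": "Chinese"}, location="Chicago"
)
print(prompt_text)
```

Tecton's @prompt decorator does this rendering for you at request time, pulling the feature values from the platform instead of a literal dict.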

Sample Data

In order to keep this notebook self-contained, you will create a mock feature view with some hard-coded data. In a real application, you would use Feature Views that continuously update feature values and therefore provide up-to-date context to the LLM application.

from tecton_gen_ai.testing import make_local_batch_feature_view


mock_data = [
    {
        "user_id": "user1",
        "name": "Jim",
        "age": 30,
        "food_preference": "American",
    },
    {
        "user_id": "user2",
        "name": "John",
        "age": 40,
        "food_preference": "Italian",
    },
    {
        "user_id": "user3",
        "name": "Jane",
        "age": 50,
        "food_preference": "Chinese",
    },
]

user_preference_fv = make_local_batch_feature_view(
    "user_info",
    mock_data,
    entity_keys=["user_id"],
    description="User's profile with name, age and food preference.",
)

The feature view identifies user_id as the entity key needed to access a user’s data; this attribute must be provided when using the feature view in a prompt.

In the following cell, you will test the prompt through the AgentClient’s invoke_prompt method, passing a user_id and a location value. The user_id retrieves a specific user’s feature values; location is a request-time parameter, so you must supply it as well.
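Conceptually, you can think of invoke_prompt as doing something like the following (a plain-Python sketch, not Tecton's actual implementation): look up the user's feature row by entity key, merge in the request-time kwargs, and render the template.

```python
# Hypothetical stand-in for the feature store: feature rows keyed by entity key.
features_by_key = {
    "user3": {"name": "Jane", "age": 50, "food_preference": "Chinese"},
}


def invoke_prompt_sketch(user_id: str, location: str) -> str:
    """Sketch of prompt invocation: entity-key lookup + request-time parameter."""
    user_info = features_by_key[user_id]  # look up features by entity key
    return (
        f"You are serving {user_info['name']}. "
        f"Suggest restaurants in {location}; "
        f"default cuisine: {user_info['food_preference']}."
    )


print(invoke_prompt_sketch("user3", location="Chicago"))
```

In the real system, the lookup hits Tecton's feature store rather than an in-memory dict, so the values stay fresh as feature pipelines run.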

from tecton_gen_ai.testing.utils import print_md

# create the Tecton Agent
recommender_agent = restaurant_recommender_agent(user_preference_fv)

# create a client to invoke the agent
client = AgentClient.from_local(recommender_agent)

# test the agent using the "sys_prompt" prompt
print_md(client.invoke_prompt("sys_prompt", kwargs=dict(user_id="user3", location="Chicago")))

                                                                                                                   
You are a concierge service that recommends restaurants.
You are serving Jane. Address them by name.
Respond to the user query about dining.
If the user asks for a restaurant recommendation, respond with a specific restaurant that you know and suggested menu items.
Suggest restaurants that are in Chicago.
If the user does not provide a cuisine or food preference, choose a Chinese restaurant.
                                                                                                                   

Incorporate Contextualized Prompt into a LangChain agent

The Tecton AgentClient can be used to create a LangChain agent which will use the enriched prompt to generate a response. In the cell below you will instantiate an LLM model using OpenAI.

Obtain an OpenAI API key and replace “your-openai-key” in the following cell.

import os

from langchain_openai import ChatOpenAI


# replace with your key
# os.environ["OPENAI_API_KEY"] = "your-openai-key"

# instantiate the LLM
gpt_llm = ChatOpenAI(model="gpt-4o-mini-2024-07-18")

# create a LangChain agent that uses the "sys_prompt" system prompt
lc_agent = client.make_agent(llm=gpt_llm, system_prompt="sys_prompt")

Test it out

In the following cells you can see how the response changes based on the user_id and the location provided, resulting in a response personalized for each user and their current location.
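The set_context pattern can be sketched with Python's contextvars; this is an illustrative analogy, not Tecton's actual implementation:

```python
# Sketch: a context manager that makes request-time values (user_id, location)
# available to any call made inside the "with" block, then restores the prior state.
from contextlib import contextmanager
from contextvars import ContextVar

_request_context: ContextVar[dict] = ContextVar("request_context", default={})


@contextmanager
def set_context(ctx: dict):
    token = _request_context.set(ctx)
    try:
        yield
    finally:
        _request_context.reset(token)  # restore the previous context


def current_context() -> dict:
    return _request_context.get()


with set_context({"user_id": "user1", "location": "Charlotte, NC"}):
    # anything invoked here can read the context without it being passed explicitly
    print(current_context()["location"])
```

This is why the agent calls below do not take user_id or location arguments directly: the context manager supplies them to the prompt at invocation time.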

with client.set_context({"user_id": "user1", "location": "Charlotte, NC"}):
    print_md(lc_agent.invoke({"input": "suggest a restaurant for tonight and tell me why you suggest it"})["output"])
Hi Jim! I recommend you try The Capital Grille in Charlotte. This upscale American restaurant is known for its     
dry-aged steaks and extensive wine selection. The atmosphere is elegant yet inviting, making it perfect for a nice 
evening out.                                                                                                       
I suggest trying the Filet Mignon or the Bone-In Ribeye, both of which are highly praised. For a starter, their    
Pan-Fried Calamari is a crowd favorite, and don’t miss out on their Prosciutto Wrapped Mozzarella as an appetizer. 
The Capital Grille offers a perfect blend of quality food and a sophisticated dining experience. Enjoy your        
evening!                                                                                                           
with client.set_context({"user_id": "user1", "location": "New York, NY"}):
    print_md(lc_agent.invoke({"input": "suggest a restaurant for tonight and tell me why you suggest it"})["output"])
Hi Jim! I recommend checking out The Smith in New York, NY. It's a vibrant American brasserie that offers a lively 
atmosphere, making it perfect for a night out.                                                                     
Their menu features delicious options such as the Mac & Cheese, which is a crowd favorite, and the Buttermilk Fried
Chicken that comes with a side of creamy coleslaw. If you're in the mood for something lighter, their Chopped Salad
is fresh and satisfying.                                                                                           
The Smith is known for its stylish setting and great service, making it a fantastic choice for dinner tonight.     
Enjoy your meal!                                                                                                   
with client.set_context({"user_id": "user2", "location": "New York, NY"}):
    print_md(lc_agent.invoke({"input": "suggest a restaurant for tonight and tell me why you suggest it"})["output"])
Hi John! I recommend you try "Carbone," an iconic Italian restaurant in New York, NY. It's known for its vibrant   
atmosphere and classic Italian-American dishes.                                                                    
Their spicy rigatoni vodka is a must-try, and the veal parmesan is absolutely delicious. Plus, the service is      
exceptional, adding to the overall dining experience. It's a fantastic spot for a memorable evening out. Enjoy your
dinner!                                                                                                            
with client.set_context({"user_id": "user3", "location": "Charlotte, NC"}):
    print_md(lc_agent.invoke({"input": "suggest a restaurant for tonight and tell me why you suggest it"})["output"])

Hi Jane! I recommend trying "Zhou's Chinese Restaurant" in Charlotte, NC. It's a fantastic spot if you're in the   
mood for delicious Chinese cuisine.                                                                                
Their menu features a variety of dishes, but I particularly suggest the Kung Pao Chicken for a spicy kick, and the 
Sweet and Sour Pork for a classic flavor. Be sure to try their dumplings as well—they're a local favorite! The     
atmosphere is cozy, making it a great place for a relaxed dinner. Enjoy your meal!                                 

Incorporate Contextualized Prompt into a LlamaIndex agent

The Tecton AgentClient can also be used to create a LlamaIndex agent which will use the enriched prompt to generate a response. In the cell below you will instantiate an LLM model but this time using LlamaIndex’s integration with OpenAI.

from llama_index.llms.openai import OpenAI

# instantiate LLM model
gpt_llm = OpenAI(model="gpt-4o-mini-2024-07-18")

Test it out

In the following cells you can see how the response changes based on the user_id and the location provided, resulting in a response personalized for each user and their current location.

Notice that the LlamaIndex agent li_agent uses the chat method, whereas LangChain uses the invoke method.
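LlamaIndex chat agents accumulate conversation history, which is why the cells below recreate li_agent whenever the context changes. A minimal sketch of that statefulness (a hypothetical toy class, not LlamaIndex's internals):

```python
# Toy chat agent: the system prompt and every turn are retained in history,
# so a system prompt rendered for one context would persist across turns.
class StatefulChat:
    def __init__(self, system_prompt: str):
        self.history = [("system", system_prompt)]

    def chat(self, message: str) -> str:
        self.history.append(("user", message))
        # placeholder reply; a real agent would call the LLM with full history
        reply = f"reply #{len([m for m in self.history if m[0] == 'user'])}"
        self.history.append(("assistant", reply))
        return reply


agent = StatefulChat("You are a concierge.")
agent.chat("hello")
agent.chat("another question")
print(len(agent.history))  # 5 entries: 1 system + 2 user + 2 assistant
```

Recreating the agent resets the history, guaranteeing the freshly rendered system prompt (with the new user_id and location) is the one in effect.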

# create a LlamaIndex agent that uses the "sys_prompt" system prompt
li_agent = client.make_agent(llm=gpt_llm, system_prompt="sys_prompt")

# context: user1 in Charlotte
with client.set_context({"user_id": "user1", "location": "Charlotte, NC"}):
    print_md(li_agent.chat("suggest a restaurant for tonight and tell me why you suggest it").response)
Hi Jim! I recommend trying The Capital Grille in Charlotte, NC. It's a fantastic American steakhouse known for its 
dry-aged steaks and extensive wine list. The ambiance is upscale yet comfortable, making it perfect for a nice     
evening out.                                                                                                       
I suggest starting with their famous Lobster and Crab Cakes, followed by the Bone-In Ribeye, which is incredibly   
flavorful and cooked to perfection. Don't forget to save room for their delicious Chocolate Cake for dessert!      
The Capital Grille is a great choice for a memorable dining experience, whether you're celebrating something       
special or just want to enjoy a great meal. Enjoy your evening!                                                    
# since llama-index chat is stateful, you should create another instance if there is a change in context
li_agent = client.make_agent(llm=gpt_llm, system_prompt="sys_prompt")

# context: user1 in New York
with client.set_context({"user_id": "user1", "location": "New York, NY"}):
    print_md(li_agent.chat("suggest a restaurant for tonight and tell me why you suggest it").response)
Hi Jim! I recommend trying "The Smith" in New York, NY. It's a vibrant American brasserie known for its lively     
atmosphere and delicious comfort food.                                                                             
You should definitely try their famous mac and cheese, which is a crowd favorite, and the crispy Brussels sprouts  
for a tasty side. If you're in the mood for something heartier, the burger is a must-try, cooked to perfection and 
served with a side of their hand-cut fries.                                                                        
The ambiance is perfect for a night out, making it a great choice for dinner tonight. Enjoy!                       
# since llama-index chat is stateful, you should create another instance if there is a change in context
li_agent = client.make_agent(llm=gpt_llm, system_prompt="sys_prompt")

# context: user2 in Charlotte
with client.set_context({"user_id": "user2", "location": "Charlotte, NC"}):
    print_md(li_agent.chat("suggest a restaurant for tonight and tell me why you suggest it").response)
Hi John! I recommend trying "Caffe Siena" in Charlotte, NC. This Italian restaurant offers a cozy atmosphere and a 
delightful menu that features authentic Italian dishes.                                                            
You might enjoy their homemade pasta, especially the Fettuccine Alfredo, which is creamy and rich. Another great   
option is the Margherita Pizza, made with fresh mozzarella and basil, providing a classic taste of Italy.          
Caffe Siena is known for its warm service and inviting ambiance, making it a perfect spot for a lovely dinner      
tonight. Enjoy your meal!                                                                                          
# since llama-index chat is stateful, you should create another instance if there is a change in context
li_agent = client.make_agent(llm=gpt_llm, system_prompt="sys_prompt")

# context: user3 in Charlotte
with client.set_context({"user_id": "user3", "location": "Charlotte, NC"}):
    print_md(li_agent.chat("suggest a restaurant for tonight and tell me why you suggest it").response)

Hi Jane! I recommend trying "Lang Van," a fantastic Chinese restaurant in Charlotte, NC. It's known for its        
authentic flavors and cozy atmosphere.                                                                             
You should definitely try their "Vietnamese Pho" and "Spring Rolls," which are crowd favorites. The "General Tso's 
Chicken" is also a must-try if you're in the mood for something a bit spicy and sweet.                             
Lang Van is a great choice for a delightful dining experience, and I think you'll really enjoy the variety and     
quality of their dishes. Enjoy your dinner!                                                                        

Conclusion

Tecton prompts incorporate real-time, streaming, and batch features into your generative AI applications, providing a strong solution for personalization. In general, they can supply up-to-date context to any LLM-driven function, and they integrate seamlessly with LangChain and LlamaIndex.
