đź“š Using Prompts, Features as Tools and Knowledge
By combining the Tecton GenAI functions, you can provide much better context for your LLM applications.
- The prompt provides instructions for the LLM and is enriched with context from feature pipelines or directly from the real-time context.
- Features as tools give the LLM the ability to retrieve additional data only as needed to respond to the user question.
- The knowledge function provides RAG functionality: it takes unstructured text, calculates embedding vectors for it, and stores them in a vector database that provides similarity search. This gives the LLM access to domain-specific information it can select from to provide responses that are relevant to the business problem you aim to solve with generative AI.
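The retrieval step behind RAG can be sketched in a few lines: embed the documents, embed the query, and return the most similar document. The toy below uses a bag-of-words vector in place of a learned embedding model; real systems (including Tecton's knowledge function) use an embedding model and a vector database.

```python
import math
from collections import Counter

def embed(text):
    """Toy embedding: a bag-of-words count vector (stand-in for a learned embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

docs = [
    "Mama Ricotta's serves homemade italian pasta and pizza",
    "The Capital Grille is an upscale steakhouse with dry-aged steaks",
]
vectors = [embed(d) for d in docs]

def retrieve(query):
    """Return the document most similar to the query."""
    scores = [cosine(embed(query), v) for v in vectors]
    return docs[scores.index(max(scores))]

print(retrieve("where can I get a dry-aged steak"))  # -> the steakhouse entry
```

The same shape appears later in this notebook: `source_as_knowledge` handles the embedding and storage, and the LLM agent issues the retrieval queries.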
If you want to run it yourself, click on the Colab button below to open the notebook in an interactive environment.
In this notebook, you will combine all three tools to create an example restaurant recommendation chatbot.
You will iteratively improve an AI-driven chatbot for a Yelp-like mobile app that provides personalized recommendations based on the user’s preferences and dining activity.
Install Packages
_ = !pip install 'tecton-gen-ai[tecton,langchain,llama-index,dev]' langchain-openai llama-index-llms-openai lancedb -q
Setup
from tecton_gen_ai.testing import set_dev_mode
set_dev_mode()
You will use an OpenAI GPT-4o family model (the code below uses gpt-4o-mini) to run this notebook.
Obtain an OpenAI API key and replace "your-openai-key" with it in the following cell.
import os
# replace with your key
os.environ["OPENAI_API_KEY"] = "your-openai-key"
Example Support Chatbot
The following cell creates the support_chatbot function, which you will use to test drive Tecton's GenAI functionality.
- The first parameter, service, is a Tecton AgentService that serves context as prompts, features as tools, and knowledge.
- The second parameter, user_query, is a string with any arbitrary question for the LLM.
- The third parameter, context, is a dictionary of session parameters provided by a mobile app.
The function creates a LangChain agent from the service, integrating Tecton's context into the LLM interactions.
The service definition shapes the personalization capabilities of the LLM-driven chatbot by providing:
- feature-enriched prompts with specific instructions on how the LLM should respond.
- features as tools that the LLM can use to retrieve data it needs to respond to the user.
- knowledge it has access to when responding to the user.
The context provides the personalization values needed by the service to provide data services to the LLM. These are any entity key columns needed to access the feature views for enriching prompts and for features as tools. If request sources are used to provide real-time values, these are also listed in the context, like location or any fields that are used for real-time transformations.
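Conceptually, the context dictionary plays two roles: entity keys (like user_id) select precomputed feature rows, while request-source fields (like location) pass straight through to the prompt. The sketch below is a hypothetical resolver illustrating that split; feature_store and resolve_context are illustrative, not Tecton APIs.

```python
# Hypothetical in-memory "feature store": (feature view, entity key) -> feature row.
feature_store = {
    ("user_info_fv", "user1"): {"name": "Jim", "food_preference": "American"},
}

def resolve_context(context, entity_key="user_id"):
    """Split a session context into looked-up features and pass-through values."""
    features = feature_store.get(("user_info_fv", context[entity_key]), {})
    request_values = {k: v for k, v in context.items() if k != entity_key}
    return {**features, **request_values}

resolved = resolve_context({"user_id": "user1", "location": "Charlotte, NC"})
print(resolved)  # {'name': 'Jim', 'food_preference': 'American', 'location': 'Charlotte, NC'}
```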
from langchain_openai import ChatOpenAI
from tecton_gen_ai.agent import AgentClient

def support_chatbot(service, user_query, context=None):
    from rich import print
    from rich.markdown import Markdown
    from tecton_gen_ai.testing.utils import make_debug_logger

    logger = make_debug_logger(lambda record: record.getMessage().lower().startswith("invoking tool"))

    # create an agent client that provides context for LLM workflows
    client = AgentClient.from_local(service)
    # instantiate LLM model for LangChain
    langchain_llm = ChatOpenAI(model="gpt-4o-mini")
    # create invocable agent for LangChain
    langchain_agent = client.make_agent(langchain_llm, system_prompt="sys_prompt")
    with client.set_logger(logger):
        with client.set_context(context):
            print(Markdown(langchain_agent.invoke({"input": user_query})["output"]))
Simple Prompt
This first example creates a simple prompt without any enrichment.
An AgentService is how Tecton integrates with the LLM workflow. This example creates an AgentService called service to deliver the prompt to the support_chatbot in the following cell.
Throughout this notebook, you will iterate by adding more context to the service, showing how each GenAI context enhancement improves the behavior of the LLM application.
from tecton_gen_ai.fco import prompt
from tecton_gen_ai.agent import AgentClient, AgentService
# prompt without any user context
@prompt()
def sys_prompt():
    return """
    You are a concierge service that recommends restaurants.
    Always provide an address.
    Always suggest menu items.
    """
# define the Tecton Agent Service specifying the prompts, sources of knowledge and feature tools
service = AgentService(
    name="restaurant_recommender",
    prompts=[sys_prompt],
)
Chatbot with a Simple Prompt
Try running the following cell multiple times to see how the LLM output changes.
support_chatbot(service, "recommend a restaurant for tonight")
I’d love to help! Can you tell me your location or what type of cuisine you’re in the mood for?
At the time of writing this notebook, multiple runs of the cell above produced varied responses. It sometimes hallucinated, making up restaurants and addresses, and sometimes recommended famous New York restaurants, which are nowhere near the user. In some cases it produced slightly better output, where the LLM asks for more information, like this one:
Sure! Can you please provide me with your location or the type of cuisine you're interested in?
The results are quite underwhelming, but they do hint at what is missing: location and cuisine. It is clear that richer context is needed to create a personalized experience for the user.
A Feature Enriched Prompt
The following example incorporates user-specific data into the prompt. To provide better personalization, it uses:
- a sample dataset with basic user information like name, age, and food_preference
- a request source to capture the user's current location from the mobile app
- a prompt that incorporates the user's location and name into the prompt instructions
from tecton_gen_ai.testing import make_local_batch_feature_view
from tecton_gen_ai.utils.tecton_utils import make_request_source
# what if we had more up to date information about the user, like their name, age and cuisine preference
user_info = [
    {"user_id": "user1", "name": "Jim", "age": 30, "food_preference": "American"},
    {"user_id": "user2", "name": "John", "age": 40, "food_preference": "Italian"},
    {"user_id": "user3", "name": "Jane", "age": 50, "food_preference": "Chinese"},
]

user_info_fv = make_local_batch_feature_view(
    "user_info_fv",
    user_info,
    ["user_id"],
    description="User's basic information.",
)
# and what if we knew their current location in order to recommend restaurants that are close to them
location_request = make_request_source(location=str)

# we could create a prompt like this that refers to the user by name and knows the user's location
@prompt(sources=[location_request, user_info_fv])  # specifies sources of data for the prompt
def sys_prompt(location_request, user_info_fv):
    return f"""
    Address the user by their name. Their name is {user_info_fv['name']}.
    You are a concierge service that recommends restaurants.
    Only suggest restaurants that are in or near {location_request['location']}.
    Always provide an address.
    Always suggest menu items.
    """
# define the Tecton Agent Service specifying the prompts, sources of knowledge and feature tools
service = AgentService(
    name="restaurant_recommender",
    prompts=[sys_prompt],
)
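Stripped of the Tecton plumbing, a feature-enriched prompt is essentially a template rendered against the looked-up feature row and the request-source value. A minimal illustration in plain Python (render_prompt is a hypothetical helper, not a Tecton API):

```python
def render_prompt(user_info, location):
    """Render the system prompt from a feature row and a request-source value (illustrative)."""
    return (
        f"Address the user by their name. Their name is {user_info['name']}.\n"
        "You are a concierge service that recommends restaurants.\n"
        f"Only suggest restaurants that are in or near {location}.\n"
        "Always provide an address.\n"
        "Always suggest menu items.\n"
    )

prompt_text = render_prompt({"name": "Jim"}, "Charlotte, NC")
print(prompt_text)
```

Tecton performs this rendering at request time, so the prompt always reflects the freshest feature values for the current user.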
Chatbot with some user context
Try running the following cell a few times, changing the user between "user1", "user2", and "user3" and changing the location.
mobile_app_context={"user_id":"user1", "location":"Charlotte, NC"}
support_chatbot(service, "recommend a restaurant for tonight", mobile_app_context)
Hi Jim! I recommend trying The Capital Grille for a fantastic dining experience. Address: 201 N Tryon St, Charlotte, NC 28202 Suggested Menu Items: • Dry Aged Porterhouse - A 40 oz. steak, perfect for steak lovers. • Lobster Mac 'N' Cheese - A deliciously creamy side that pairs well with any main dish. • Chocolate Cake - Indulge in this rich dessert to finish your meal on a sweet note. Enjoy your dinner!
The results have gotten better: the LLM refers to the user by name, and it provides some decent restaurant options that are actually in or near the location that the mobile app provides in the mobile_app_context variable.
But what happens when trying a more personalized question? The next cell asks the chatbot to take the user's preference into account.
# but now what if the question requires more data
mobile_app_context={"user_id":"user1", "location":"Charlotte, NC"}
support_chatbot(service, "recommend a restaurant for tonight based on my preference", mobile_app_context)
Sure, Jim! Could you please share your preferences? Are you looking for a specific type of cuisine or any dietary restrictions?
Again, the response varies if you run it multiple times. Most of the time it responds with something like:
Sure, Jim! Could you please share your preferences, such as the type of cuisine you enjoy or any specific dietary
restrictions? That way, I can recommend a restaurant that you'll love.
This makes a lot of sense because the user’s food preference is not part of the prompt.
Adding food preference to the prompt would be a good option and you can test that out on your own. But there is an alternative that we explore in the next section, Features as Tools.
Features as Tools
With Features as Tools, the LLM has the option of querying features on demand. The user_info_fv used in the previous example contains the food_preference field, but it isn't available to the LLM. By adding the feature view user_info_fv to the list of tools in the AgentService, the LLM can query all the features in the feature view and use that information to drive a better response.
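Features as tools follow the standard LLM tool-calling loop: the model emits a tool name and arguments, the runtime executes the lookup, and the result is fed back into the conversation. A toy dispatcher sketches the runtime half of that loop (the tool registry and data are illustrative, not Tecton's implementation):

```python
# Illustrative feature data, keyed by the entity key the LLM supplies.
user_info = {"user1": {"name": "Jim", "food_preference": "American"}}

def get_user_info(user_id):
    """Tool: fetch the user's feature row by entity key."""
    return user_info[user_id]

tools = {"get_user_info": get_user_info}

def dispatch(tool_call):
    """Execute a tool call of the form {'name': ..., 'args': {...}}."""
    return tools[tool_call["name"]](**tool_call["args"])

# Simulate the LLM deciding it needs the user's preferences:
result = dispatch({"name": "get_user_info", "args": {"user_id": "user1"}})
print(result["food_preference"])  # American
```

Tecton generates the tool definitions from the feature view metadata (the description fields matter here, since the LLM uses them to decide when to call each tool) and handles the dispatch for you.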
Try it out by running it multiple times with different users and locations:
# define the Tecton Agent Service specifying the prompts and feature tools
service = AgentService(
    name="restaurant_recommender",
    prompts=[sys_prompt],
    tools=[user_info_fv],  # user_info_fv provides the user's food preference
)
# we should get a much better answer
mobile_app_context={"user_id":"user1", "location":"Charlotte, NC"}
support_chatbot(service, "recommend a restaurant for tonight based on my preference", mobile_app_context)
Here’s a restaurant recommendation for you, Jim: The Capital Grille Address: 201 N Tryon St, Charlotte, NC 28202 Recommended Menu Items: • Dry Aged Porterhouse • Pan-Seared Tenderloin • Lobster Mac 'N' Cheese • Grilled Asparagus Enjoy your meal!
The results get a lot better: user1 gets American food recommendations, user2 gets Italian, and user3 gets Chinese. The DEBUG message output shows that the LLM retrieved data from user_info_fv before answering the question.
More features to answer more complex questions
In the following example, we add yet another feature view as a tool.
By adding a list of the restaurants that the user has visited recently, the LLM can add variety to its recommendations.
# user's recent visits
recent_eats = [
    {"user_id": "user1", "last_3_visits": str(["Mama Ricotta's", "The Capital Grille", "Firebirds Wood Fired Grill"])},
    {"user_id": "user2", "last_3_visits": str(["Mama Ricotta's", "Villa Antonio", "Viva Chicken"])},
    {"user_id": "user3", "last_3_visits": str(["Wan Fu", "Wan Fu Quality Chinese Cuisine", "Ru San's"])},
]

recent_eats_fv = make_local_batch_feature_view(
    "recent_eats_fv",
    recent_eats,
    entity_keys=["user_id"],
    description="User's recent restaurant visits.",
)
# define the Tecton Agent Service specifying the prompt and two feature tools
service = AgentService(
    name="restaurant_recommender",
    prompts=[sys_prompt],
    tools=[user_info_fv, recent_eats_fv],  # recent_eats_fv provides recent dining info
)
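The effect of the recent-visits tool can be pictured as a simple exclusion filter over candidate restaurants (fresh_recommendations is a hypothetical helper for illustration, not a Tecton API; the LLM performs this reasoning implicitly from the tool output):

```python
def fresh_recommendations(candidates, last_visits):
    """Drop places the user has visited recently so suggestions stay varied."""
    visited = set(last_visits)
    return [c for c in candidates if c not in visited]

candidates = ["Mama Ricotta's", "The Fig Tree Restaurant", "The Capital Grille"]
print(fresh_recommendations(candidates, ["Mama Ricotta's", "The Capital Grille"]))
# ['The Fig Tree Restaurant']
```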
Chatbot with more tools
Again, try running the following cell multiple times while varying the user id.
# now the app also knows where the user has already been
mobile_app_context = {"user_id":"user1", "location":"Charlotte, NC"}
support_chatbot(service, "I need a new restaurant for tonight", mobile_app_context)
Here are a few restaurant recommendations for you, Jim: 1 The Fig Tree Restaurant • Address: 9601 Johnston Rd, Charlotte, NC 28210 • Suggested Menu Items: Try the Filet Mignon or the Lobster Ravioli. 2 Sullivan's Steakhouse • Address: 1928 South Blvd, Charlotte, NC 28203 • Suggested Menu Items: The Bone-In Ribeye and the Lobster Mac & Cheese are highly recommended. 3 Café Monte • Address: 1616 E 5th St, Charlotte, NC 28204 • Suggested Menu Items: Don't miss their Croque Madame or the Quiche Lorraine. Enjoy your meal tonight!
The results are getting even better: the user is not only getting recommendations for places that fit their cuisine preference, but also places that they have not been to before.
But why these restaurants and not others?
Savvy readers may have noticed by now that the recommendations come from the LLM's general training data.
If you are the developer of this app, you probably want to promote the restaurants that have signed up for your service, not those that haven't. This is where additional prompt instructions and knowledge come in, explored in the next section.
Knowledge
Adding knowledge to an LLM provides domain specific information that extends beyond its base training. With it, the LLM is able to select relevant text that helps respond to user queries and avoid hallucination.
When creating a set of knowledge in Tecton, the platform takes care of calculating embedding vectors and storing the data in a vector search database which the LLM workflow agent can then use to do semantic search.
With Tecton knowledge, other fields can be specified as filters for the dataset narrowing the results prior to semantic search and therefore providing information that is even more relevant given the user’s question and their current context.
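The filter-then-search behavior can be pictured with a few lines of plain Python, where naive word overlap stands in for vector similarity search (the data and the search helper are illustrative, not Tecton APIs):

```python
restaurants = [
    {"name": "Firebirds Wood Fired Grill", "zipcode": "28277",
     "description": "hand-cut steaks cooked over an open wood fire"},
    {"name": "The Capital Grille", "zipcode": "28202",
     "description": "dry-aged steaks and fresh seafood"},
]

def search(query, zipcode):
    """Apply the metadata filter first, then rank the survivors by naive
    word overlap (a stand-in for embedding similarity search)."""
    pool = [r for r in restaurants if r["zipcode"] == zipcode]
    words = set(query.lower().split())
    return max(pool, key=lambda r: len(words & set(r["description"].lower().split())))

print(search("steaks from a wood fire", "28277")["name"])  # Firebirds Wood Fired Grill
```

Because the filter runs before the similarity ranking, a highly similar restaurant in the wrong zip code can never be returned, which is exactly the narrowing behavior demonstrated later in this notebook.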
In the following cell, the prompt is adjusted to instruct the LLM to only use the provided knowledge when selecting restaurants; this narrows the results to just the restaurants that we want the app to recommend. With the restaurant descriptions as its knowledge, the LLM can also select a restaurant based on more qualitative characteristics like atmosphere, or whatever else the descriptions provide.
# change the prompt instructions, adding "Only select restaurants that have signed up on the service."
@prompt(sources=[location_request, user_info_fv])  # specifies sources of data for the prompt
def sys_prompt(location_request, user_info_fv):
    location = location_request["location"]
    name = user_info_fv["name"]
    return f"""
    Address the user by their name. Their name is {name}.
    You are a concierge service that recommends restaurants.
    Only suggest restaurants that are in or near {location}.
    Always provide an address.
    Always suggest menu items.
    Only select restaurants that have signed up on the service.
    """  # notice that we've added an instruction
#example knowledge about restaurants that have signed up
restaurants_signed_up = [
{
"restaurant_name": "Mama Ricotta's",
"zipcode": "28203",
"cuisine": "Italian",
"description": "A Charlotte staple since 1992, Mama Ricotta's offers authentic Italian cuisine in a warm, family-friendly atmosphere. Known for their hand-tossed pizzas, homemade pasta, and signature chicken parmesan."
},
{
"restaurant_name": "The Capital Grille",
"zipcode": "28202",
"cuisine": "American Steakhouse",
"description": "An upscale steakhouse chain featuring dry-aged steaks, fresh seafood, and an extensive wine list. The Charlotte location offers a refined dining experience with impeccable service and a sophisticated ambiance."
},
{
"restaurant_name": "Firebirds Wood Fired Grill",
"zipcode": "28277",
"cuisine": "American",
"description": "A polished casual restaurant known for its classic American cuisine cooked over an open wood fire. Specialties include hand-cut steaks, fresh seafood, and signature cocktails in a warm, contemporary setting."
},
{
"restaurant_name": "Villa Antonio",
"zipcode": "28210",
"cuisine": "Italian",
"description": "An elegant Italian restaurant offering a romantic atmosphere and authentic cuisine. Known for its homemade pasta, extensive wine selection, and attentive service, perfect for special occasions."
},
{
"restaurant_name": "Viva Chicken",
"zipcode": "28203",
"cuisine": "Peruvian",
"description": "A fast-casual eatery specializing in Peruvian-style rotisserie chicken. Offers fresh, flavorful dishes with a modern twist on traditional Peruvian cuisine, including quinoa stuffed avocados and yuca fries."
},
{
"restaurant_name": "Wan Fu",
"zipcode": "28226",
"cuisine": "Chinese",
"description": "A local favorite for Chinese cuisine, Wan Fu offers a wide array of traditional and Americanized Chinese dishes. Known for its generous portions, friendly service, and comfortable dining atmosphere."
},
{
"restaurant_name": "Wan Fu Quality Chinese Cuisine",
"zipcode": "28277",
"cuisine": "Chinese",
"description": "An upscale Chinese restaurant focusing on authentic flavors and high-quality ingredients. Offers a more refined dining experience with a menu featuring both classic and innovative Chinese dishes."
},
{
"restaurant_name": "Ru San's",
"zipcode": "28203",
"cuisine": "Japanese",
"description": "A popular sushi restaurant known for its extensive menu and all-you-can-eat option. Offers a wide variety of sushi rolls, sashimi, and other Japanese dishes in a casual, vibrant atmosphere."
},
{
"restaurant_name": "Le Bernardin",
"zipcode": "10019",
"cuisine": "French Seafood",
"description": "A world-renowned, Michelin three-star restaurant specializing in exquisite seafood. Chef Eric Ripert's menu features innovative preparations of the finest global seafood in an elegant setting."
},
{
"restaurant_name": "Katz's Delicatessen",
"zipcode": "10002",
"cuisine": "Jewish Deli",
"description": "An iconic New York institution since 1888, famous for its hand-carved pastrami and corned beef sandwiches. The bustling, no-frills atmosphere is part of its enduring charm."
},
{
"restaurant_name": "Eleven Madison Park",
"zipcode": "10010",
"cuisine": "Contemporary American",
"description": "A three-Michelin-starred restaurant offering an innovative tasting menu focusing on locally sourced, plant-based ingredients. Known for its impeccable service and artistic presentation."
},
{
"restaurant_name": "Peter Luger Steak House",
"zipcode": "11211",
"cuisine": "Steakhouse",
"description": "A Brooklyn institution since 1887, Peter Luger is famous for its dry-aged steaks and old-school, cash-only policy. The no-frills atmosphere focuses attention on the exceptional quality of the meat."
},
{
"restaurant_name": "Di Fara Pizza",
"zipcode": "11230",
"cuisine": "Pizza",
"description": "A legendary Brooklyn pizzeria, Di Fara is known for its handcrafted pies made by founder Dom DeMarco. Each pizza is a work of art, featuring high-quality ingredients and meticulous preparation."
},
{
"restaurant_name": "Balthazar",
"zipcode": "10012",
"cuisine": "French Brasserie",
"description": "A SoHo classic, Balthazar offers authentic French brasserie fare in a vibrant, bustling atmosphere. Known for its fresh seafood, classic French dishes, and popular weekend brunch."
},
{
"restaurant_name": "Momofuku Ko",
"zipcode": "10003",
"cuisine": "Contemporary Asian",
"description": "Chef David Chang's two-Michelin-starred restaurant offers an ever-changing tasting menu blending Asian and Western influences. The intimate counter seating provides a unique, interactive dining experience."
},
{
"restaurant_name": "The Halal Guys",
"zipcode": "10019",
"cuisine": "Middle Eastern",
"description": "Starting as a food cart, The Halal Guys has become a New York institution. Famous for its chicken and gyro over rice, topped with their legendary white sauce. Now with brick-and-mortar locations."
},
{
"restaurant_name": "Russ & Daughters",
"zipcode": "10002",
"cuisine": "Jewish Appetizing",
"description": "A New York classic since 1914, specializing in traditional Jewish appetizing foods. Famous for its hand-sliced smoked salmon, bagels, and other Jewish delicacies. Now includes a sit-down cafe."
},
{
"restaurant_name": "Lombardi's",
"zipcode": "10012",
"cuisine": "Pizza",
"description": "America's first pizzeria, established in 1905. Lombardi's continues to serve classic New York-style pizza from its coal-fired oven. Known for its simple, high-quality ingredients and historic charm."
},
{
"restaurant_name": "Joe's Shanghai",
"zipcode": "10013",
"cuisine": "Chinese",
"description": "Famous for introducing soup dumplings to New York, Joe's Shanghai offers authentic Shanghai-style cuisine. The bustling, no-frills atmosphere adds to the authentic experience."
}
]
Transform and load the knowledge
The following cell creates a sample knowledge base for the LLM using make_local_source. Notice that the description "Restaurants that signed up for the service" provides the LLM the information it needs to follow the new instruction in the prompt created above.
The source_as_knowledge function that creates restaurant_knowledge calculates embedding vectors and loads the data into a vector search database.
The complete AgentService now contains the prompt, the two features as tools, and the newly created restaurant_knowledge, with the intent that it only recommend restaurants that have signed up for the service.
from tecton_gen_ai.testing import make_local_source
from tecton_gen_ai.testing.utils import make_local_vector_db_config
from tecton_gen_ai.fco import source_as_knowledge
# provide a vector db config
conf = make_local_vector_db_config("/tmp/test.db", remove_if_exists=True)

# create embeddings of the restaurant descriptions in the vector DB
src = make_local_source(
    "restaurants_signed_up",
    restaurants_signed_up,
    auto_timestamp=True,
    description="Restaurants that signed up for the service",
)

restaurant_knowledge = source_as_knowledge(
    src,
    vector_db_config=conf,
    vectorize_column="description",
    filter=[("zipcode", str, "the zip code for the restaurant")],
)
# rebuild the service with restaurant knowledge included
service = AgentService(
    name="restaurant_recommender",
    prompts=[sys_prompt],
    knowledge=[restaurant_knowledge],  # added a source of knowledge
    tools=[user_info_fv, recent_eats_fv],
)
Chatbot with prompt, feature tools & knowledge
The knowledge restaurant_knowledge created above specifies a zipcode-based filter, so now we use zip codes for location to provide this additional filtering capability for knowledge.
Run the next cell multiple times changing the user.
# test the app
mobile_app_context = {"user_id":"user3", "location":"28277"}
support_chatbot(service, "recommend a restaurant with dry-aged steaks", mobile_app_context)
metadata.zipcode = '28277'
DEBUG Invoking tool search_restaurants_signed_up with {'query': 'dry-aged steaks', 'filter': client.py:360 '{"zipcode":"28277"}'}
I recommend Firebirds Wood Fired Grill, located at 1080 Market St, Fort Mill, SC 29708. This polished casual
restaurant is known for its classic American cuisine, including hand-cut dry-aged steaks. You might want to try
their signature Firebirds® Filet or the New York Strip, both cooked over an open wood fire.
Enjoy your meal, Jane!
It only recommends restaurants in the database, and since we are asking for specific items, the user's food preference is not considered. Adding context through prompts, tools, and knowledge can really control the behavior of the LLM and avoid hallucinations.
Try experimenting a bit more. Below is another type of question, where the user wants to return to one of their recent visits but would also like dry-aged steak. Change the user between runs and see how the response adapts to the personalization:
# test for recent visits with item they do not serve
mobile_app_context = {"user_id":"user3", "location":"28277"}
support_chatbot(service, "which of my recent visits that have dry aged steak", mobile_app_context)
metadata.zipcode = '10200'
metadata.zipcode = '10200'
metadata.zipcode = '28277'
metadata.zipcode = '28277'
metadata.zipcode = '28277'
metadata.zipcode = '28277'
DEBUG Invoking tool search_restaurants_signed_up with {'query': 'dry aged steak', 'top_k': 5, client.py:360 'filter': '{"zipcode":"28277"}'}
DEBUG Invoking tool search_restaurants_signed_up with {'query': 'Wan Fu', 'top_k': 5, 'filter': client.py:360 '{"zipcode":"28277"}'}
DEBUG Invoking tool search_restaurants_signed_up with {'query': 'Wan Fu Quality Chinese Cuisine', client.py:360 'top_k': 5, 'filter': '{"zipcode":"28277"}'}
DEBUG Invoking tool search_restaurants_signed_up with {'query': "Ru San's", 'top_k': 5, 'filter': client.py:360 '{"zipcode":"28277"}'}
Hi Jane! Among your recent visits, Wan Fu and Wan Fu Quality Chinese Cuisine do not serve dry aged steak. However, Firebirds Wood Fired Grill, which specializes in classic American cuisine, does offer dry aged steak. Here are the details for Firebirds Wood Fired Grill: Firebirds Wood Fired Grill • Address: 7045 S. Tryon St., Charlotte, NC 28277 • Menu Suggestions: • Dry Aged Ribeye • Hand-Cut Filet Mignon • Fresh Seafood options • Signature Cocktails If you're looking for a great steak experience, Firebirds would be an excellent choice!
Conclusions
Adding context to LLM-based applications is a powerful technique to avoid hallucinations and produce highly relevant responses. Prompt engineering, with the addition of up-to-date features that describe the user's specific situation, enhances personalization. Features as tools give the LLM access to additional data as needed, making it more responsive to the user's individual situation and immediate requests. Knowledge provides domain-relevant information.
Tecton helps deliver a new level of generative AI capabilities while providing the robust, production-grade data pipelines and serving infrastructure that AI-driven applications need in order to serve broad customer-facing functionality.