How to pass custom prompt variables in a chainlit app?

Unlocking Dynamic Conversations with Custom Prompt Variables in Chainlit

Chainlit is a powerful tool for building interactive, conversational applications powered by LLMs. But sometimes, you need to inject specific information into your prompts to create truly dynamic and tailored responses. This is where custom prompt variables come in.

Imagine building a chatbot that helps users find local restaurants. You'd want the chatbot to be able to ask for the user's location and then tailor its recommendations based on that input. Custom prompt variables make this possible!

Let's dive into how to implement this in a Chainlit app:

Scenario: Building a restaurant recommender chatbot

Original Code (without custom prompt variables):

import chainlit as cl
from langchain_openai import OpenAI

llm = OpenAI(temperature=0.7)


@cl.on_message
async def handle_message(message: cl.Message):
    # Forward the raw user message straight to the LLM
    response = await llm.ainvoke(message.content)
    await cl.Message(content=response).send()

# Start the app from the terminal with: chainlit run app.py

The Problem: The chatbot forwards the raw user message straight to the LLM. There's no prompt structure, and no mechanism to capture user-specific information (like a location) and reuse it in a tailored prompt.

Solution: Custom Prompt Variables

  1. Capture User Input:

    • Instead of directly sending the user's message to the LLM, we need to capture it as a variable.
  2. Define a Template:

    • Create a prompt template that includes placeholders for your variables.
  3. Pass the Variables:

    • When calling the LLM, pass the captured user input as the value for the relevant placeholder in the template.

Refined Code:

import chainlit as cl
from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI

llm = OpenAI(temperature=0.7)

template = PromptTemplate(
    input_variables=["location"],
    template="What are the best restaurants in {location}?",
)


@cl.on_message
async def handle_message(message: cl.Message):
    # Capture the user's input as a variable
    user_location = message.content
    # Fill the placeholder to build the complete prompt
    prompt = template.format(location=user_location)
    response = await llm.ainvoke(prompt)
    await cl.Message(content=response).send()

Explanation:

  • PromptTemplate allows us to define a structure for our prompts.
  • input_variables defines the variables that can be passed to the template.
  • template specifies the actual prompt text with placeholders for variables.
  • We capture the user's message text (message.content) as user_location and pass it to the template's format method to build the complete prompt. In a longer conversation you'd typically store such values in the session; see the sketch after this list.
  • The completed prompt is then passed to the LLM, resulting in responses tailored to the user's location.
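
In practice, you usually want to capture a variable once and reuse it across the conversation rather than treating every incoming message as the variable's value. Here's a minimal sketch of that pattern using Chainlit's cl.user_session store and cl.AskUserMessage (assuming a recent Chainlit version; the "location" key and the fallback text are illustrative):

import chainlit as cl
from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI

llm = OpenAI(temperature=0.7)

template = PromptTemplate(
    input_variables=["location"],
    template="What are the best restaurants in {location}?",
)


@cl.on_chat_start
async def start():
    # Ask once at the start of the chat and remember the answer
    reply = await cl.AskUserMessage(content="Which city are you in?").send()
    if reply:
        cl.user_session.set("location", reply["output"])


@cl.on_message
async def handle_message(message: cl.Message):
    # Reuse the stored variable on every turn ("your area" is a fallback)
    location = cl.user_session.get("location") or "your area"
    prompt = template.format(location=location)
    response = await llm.ainvoke(prompt)
    await cl.Message(content=response).send()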

Additional Insights:

  • You can have multiple custom variables within a single prompt (see the sketch after this list).
  • Use descriptive variable names for clarity.
  • Complex scenarios might require a chain of prompts, where each response generates input for the subsequent prompt.
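
As a quick sketch of the first point, here's a template with two placeholders; the cuisine variable and the sample values are purely illustrative:

from langchain_core.prompts import PromptTemplate

# One template, two input variables
template = PromptTemplate(
    input_variables=["location", "cuisine"],
    template="What are the best {cuisine} restaurants in {location}?",
)

prompt = template.format(location="Lisbon", cuisine="seafood")
print(prompt)  # What are the best seafood restaurants in Lisbon?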

Benefits of Custom Prompt Variables:

  • Dynamic and Personalized Responses: Tailor your chatbot's responses based on user-specific information.
  • Improved User Experience: Create more engaging conversations by providing contextually relevant information.
  • Increased Functionality: Enable complex chatbots that can handle tasks requiring user input.

Wrapping Up:

By mastering custom prompt variables, you unlock a new level of power and flexibility in your Chainlit applications, making them more engaging, dynamic, and user-friendly.