Introduction
In this blog post, you will be guided through the Prompt Routing mechanism of AiConfig using the zero-shot technique. Zero-shot classification is one of the most widely used techniques for understanding a user's context and labeling it against a set of known categories, and its range of use cases is vast.
Background
AiConfig is an easy-to-use, configuration-driven application development framework. The beauty of AiConfig is that it supports both JSON- and YAML-based configs.
Here is the zero-shot technique as explained by ChatGPT:
Zero-shot learning is a machine learning technique that allows a model to make predictions on classes or tasks it has never seen during training. In traditional supervised learning, models are trained on a specific set of labeled data, and their performance is evaluated on similar data during testing. However, zero-shot learning expands this capability by enabling a model to generalize and make predictions on classes or tasks that were not part of its training data.
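To make this concrete, here is a small, self-contained sketch of the zero-shot framing (the function name and wording are my own, not from the sample): the model is given the candidate labels and the input, but no labeled examples.

```python
# Minimal sketch of zero-shot classification framing: the prompt lists
# candidate labels and the input, but contains no labeled examples.
def build_zero_shot_prompt(question, labels):
    # Number the candidate labels, mirroring the router prompt used later.
    label_list = "\n".join(f"{i}. {label}" for i, label in enumerate(labels, 1))
    return (
        "You will be given a question. Classify the question as one of the "
        f"following topics:\n{label_list}\nOutput the topic name.\n\n"
        f"Question: {question}"
    )

prompt = build_zero_shot_prompt("What is 2 + 2?", ["Math", "Physics", "General"])
print(prompt)
```

Any chat-capable LLM can complete such a prompt with one of the listed labels, which is exactly what the router prompt in this post does.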
Hands-on
The Prompt Routing concept is demonstrated with the zero-shot technique using AiConfig, an open-source development framework.
For demonstration purposes, the official code sample for Prompt Routing will be used. Here's the reference - Basic-Prompt-Routing
To get the sample running visually, follow these steps:
- Download assistant_aiconfig.json
- Login to LastMileAI
- Navigate to Workbooks and upload the aiconfig file
Here's a brief description of what the teaching assistant does:
The user asks a question. The LLM classifies the topic as math, physics, or general. Based on the topic, the LLM selects a different "assistant" to respond. These assistants have different system prompts and respond with varying introductions and styles of response.
Here's the main code snippet responsible for inferring the user's intent and then running the appropriate prompt based on the topic, i.e., zero-shot contextual understanding. Notice below that the topic is resolved first, and then the appropriate destination prompt is executed with the `student_question` parameter.
```python
# Get assistant response based on user prompt (prompt routing)
from aiconfig import AIConfigRuntime

async def assistant_response(prompt):
    config = AIConfigRuntime.load("assistant_aiconfig.json")
    params = {"student_question": prompt}

    # First, run the router prompt to classify the question's topic
    await config.run("router", params)
    topic = config.get_output_text("router")

    # Then run the destination prompt named after the topic
    dest_prompt = topic.lower()
    await config.run(dest_prompt, params)
    response = config.get_output_text(dest_prompt)
    return response
```
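One thing to watch: `topic.lower()` assumes the model returns exactly the topic name. Models sometimes add whitespace or a trailing period, so a slightly more defensive normalization (a hypothetical helper, not part of the official sample) could look like this:

```python
def resolve_destination(topic_text):
    # Strip surrounding whitespace and any trailing period before
    # lowercasing, so the result matches a prompt name defined in the
    # aiconfig ("math", "physics", or "general").
    return topic_text.strip().rstrip(".").lower()

print(resolve_destination("Math"))        # math
print(resolve_destination(" Physics. "))  # physics
```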
If you are wondering how the topic selection happens, below is the prompt (instruction) responsible for the basic routing. You could use virtually any LLM for this.
```json
{
  "name": "router",
  "input": "{{student_question}}",
  "metadata": {
    "model": {
      "name": "gpt-4",
      "settings": {
        "system_prompt": "\n You will be given a question. Classify the question as one of the following topics: \n 1. Math\n 2. Physics\n 3. General\n Output the topic name.\n "
      }
    },
    "parameters": {}
  }
}
```
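Note that the classifier's entire label set lives in that `system_prompt` string: adding or removing a route is just a matter of editing the prompt (and defining a matching destination prompt). A quick sketch to illustrate, with the prompt text copied from the config above:

```python
import re

# The router's system prompt, copied verbatim from the aiconfig above.
system_prompt = (
    "\n You will be given a question. Classify the question as one of the "
    "following topics: \n 1. Math\n 2. Physics\n 3. General\n "
    "Output the topic name.\n "
)

# The candidate topics are just the numbered lines in the prompt text.
labels = re.findall(r"\d+\.\s*(\w+)", system_prompt)
print(labels)  # ['Math', 'Physics', 'General']
```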
Reference
- AiConfig Web
- AiConfig Open Source
Conclusion
Hope you had fun with AiConfig, Prompt Routing, and the LastMileAI workbook. Creativity is the only limit.