Prompt management plays a crucial role in communicating with and directing the behavior of Large Language Models (LLMs). Prompts serve as inputs or queries that users can provide to elicit specific responses from a model.
Here's a prompt that asks the LLM for name suggestions:
Input:
Give me 3 name suggestions for my pet golden retriever.
Response:
Some possible name suggestions for a pet golden retriever are:
- Bailey
- Sunny
- Cooper
Prompt templates are a simple and powerful way to define and compose AI functions using plain text. You can use them to create natural language prompts, generate responses, extract information, invoke other prompts, or perform any other task that can be expressed with text.
The language supports two basic features that allow you to include variables and call functions.
Simple Example:
Here's an example of a prompt template:
Give me 3 name suggestions for my pet {{ $petName }}.
`$petName` is a variable that is populated at runtime when the template is rendered.
You don't need to write any code or import any external libraries; just use the double curly braces {{...}} to embed expressions in your prompts. Teams AI will parse your template and execute the logic behind it. This way, you can easily integrate AI into your apps with minimal effort and maximum flexibility.
To include a variable value in your text, use the {{$variableName}} syntax. For example, if you have a variable called `name` that holds the user's name, you can write:
Hello {{$name}}, nice to meet you!
This will produce a greeting with the user's name.
Spaces are ignored, so if you find it more readable, you can also write:
Hello {{ $name }}, nice to meet you!
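Conceptually, rendering a template amounts to replacing each {{ $variableName }} token with the matching value from the turn state. The sketch below is illustrative only (it is not the library's actual implementation); it shows how such substitution could work, including the tolerance for spaces inside the braces:

```python
import re

def render(template: str, variables: dict) -> str:
    """Replace each {{ $name }} token with its value from `variables`.

    Spaces inside the braces are ignored, matching the template syntax.
    Unknown variables render as empty strings.
    """
    pattern = re.compile(r"\{\{\s*\$(\w+)\s*\}\}")
    return pattern.sub(lambda m: str(variables.get(m.group(1), "")), template)

print(render("Hello {{ $name }}, nice to meet you!", {"name": "Ada"}))
# → Hello Ada, nice to meet you!
```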
Here's how to define variables in code:
C#
In an action or route handler where the turn state object is available:
state.Temp.Post = "Lorem Ipsum...";
The usage in the prompt:
This is the user's post: {{ $post }}
Note: the `turnState.Temp.Post = ...` assignment updates a dictionary with the `post` key under the hood, as seen in the AI Message Extension sample.
Javascript
// TurnEvents values include "beforeTurn" and "afterTurn"
app.turn("beforeTurn", (context, state) => {
    state.temp.post = "Lorem Ipsum...";
});
Python
@app.before_turn
async def before_turn(context: TurnContext, state: AppTurnState):
    state.temp.post = "Lorem Ipsum..."
    return True
The usage in the prompt:
This is the user's post: {{ $post }}
You can simply add values to the `state.temp` object, and they will be accessible from the prompt template at runtime. The safest place to do this is in the `beforeTurn` activity, because it executes before any activity handler or action.
Default Variables
The following variables are accessible in the prompt template without having to manually configure them. They are pre-defined in the turn state and populated by the library. Users can override them by changing them in the turn state.
| Variable name | Description |
| --- | --- |
| `input` | Input passed from the user to the AI Library. |
| `lastOutput` | Output returned from the last executed action. |
To call an external function and embed the result in your text, use the {{ functionName }} syntax. For example, if you have a function called `diceRoll` that returns a random number between 1 and 6, you can write:
The dice roll has landed on: {{ diceRoll }}
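Function tokens are resolved much like variables, except the renderer looks the name up in a registry of registered functions and splices in the returned value. A hypothetical sketch of that resolution step (the real library's lookup is more involved, and registration per language is shown below):

```python
import random
import re

# Hypothetical function registry; the library's own registration APIs follow.
functions = {"diceRoll": lambda: random.randint(1, 6)}

def render_functions(template: str) -> str:
    """Replace each {{ functionName }} token (no leading $) with the
    registered function's return value."""
    pattern = re.compile(r"\{\{\s*(\w+)\s*\}\}")
    return pattern.sub(lambda m: str(functions[m.group(1)]()), template)

result = render_functions("The dice roll has landed on: {{ diceRoll }}")
```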
C#
In the Application class:
prompts.AddFunction("diceRoll", async (context, memory, functions, tokenizer, args) =>
{
    // Random number between 1 and 6
    int diceRoll = Random.Shared.Next(1, 7);
    return diceRoll;
});
Javascript
prompts.addFunction('diceRoll', async (context, state, functions, tokenizer, args) => {
    // Random number between 1 and 6
    const diceRoll = Math.floor(Math.random() * 6) + 1;
    return diceRoll;
});
Python
import random

@prompts.function("diceRoll")
async def dice_roll(
    context: TurnContext,
    state: MemoryBase,
    functions: PromptFunctions,
    tokenizer: Tokenizer,
    args: List[str]
):
    # Random number between 1 and 6
    dice_roll = random.randint(1, 6)
    return dice_roll
Each prompt template is a folder with two files, `skprompt.txt` and `config.json`. The folder name is the prompt template's name, which can be referred to in your code. The `skprompt.txt` file contains the prompt's text, which can contain natural language or prompt template syntax as defined in the previous section. The `config.json` file specifies the prompt completion configuration.
Here's an example of a prompt template from the Twenty Questions sample.
skprompt.txt
You are the AI in a game of 20 questions.
The goal of the game is for the Human to guess a secret within 20 questions.
The AI should answer questions about the secret.
The AI should assume that every message from the Human is a question about the secret.
GuessCount: {{$conversation.guessCount}}
RemainingGuesses: {{$conversation.remainingGuesses}}
Secret: {{$conversation.secretWord}}
Answer the human's question but do not mention the secret word.
config.json
{
    "schema": 1.1,
    "description": "A bot that plays a game of 20 questions",
    "type": "completion",
    "completion": {
        "completion_type": "chat",
        "include_history": false,
        "include_input": true,
        "max_input_tokens": 2000,
        "max_tokens": 256,
        "temperature": 0.7,
        "top_p": 0.0,
        "presence_penalty": 0.6,
        "frequency_penalty": 0.0
    }
}
Note that the configuration properties in the file do not include all the possible configurations. To learn more about each configuration option and the full set of supported configurations, see the `PromptTemplateConfig` TypeScript interface.
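Loading a template from disk then just means reading both files from the folder. A minimal, hypothetical loader sketch under the folder layout described above (the library's own loading logic does more, such as validating the configuration):

```python
import json
from pathlib import Path

def load_prompt_template(folder: str) -> tuple[str, dict]:
    """Load a prompt template folder: skprompt.txt holds the prompt text,
    config.json holds the completion configuration."""
    root = Path(folder)
    text = (root / "skprompt.txt").read_text(encoding="utf-8")
    config = json.loads((root / "config.json").read_text(encoding="utf-8"))
    return text, config
```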
These files can be found under the `src/prompts/chat/` folder, so this prompt template's name is `chat`. To plug these files into the Action Planner, the prompt manager has to be created with the folder path specified and then passed into the Action Planner constructor:
C#
PromptManager prompts = new PromptManager(new PromptManagerOptions()
{
    PromptFolder = "./prompts"
});
The file path is relative to the source file in which the `PromptManager` is created. In this case, `Program.cs` is in the same folder as the `prompts` folder.
Javascript
const prompts = new PromptManager({
    promptsFolder: path.join(__dirname, "../src/prompts")
});
Python
prompts = PromptManager(PromptManagerOptions(
    prompts_folder=f"{os.getcwd()}/src/prompts"
))