Teams AI library is a Teams-centric interface for integrating GPT-based language models and user intent engines. It simplifies the development process by reducing the need to write and maintain complex conversational bot logic.
You can leverage prebuilt, reusable code snippets that allow you to quickly build intelligent apps. This capabilities-driven approach allows you to focus on business logic rather than learning the intricacies of Microsoft Teams conversational frameworks.
Teams AI library enables your apps to engage users in natural and conversational interactions. These interactions can be guided toward specific app functionalities or tasks, allowing your app to better understand and process user intent.
You can rely on the built-in conversational bot capabilities in Teams (such as Power Virtual Agents or the Bot Framework) to handle the complexities of natural language processing.
You can leverage Teams AI library to:

- Use prebuilt templates to add Teams app capabilities.
- Use techniques like prompt engineering to add ChatGPT-like conversational experiences to your bot, along with built-in safety features, such as moderation, that help ensure your bot always responds in an appropriate manner.
- Use the library's planning engine, which allows the model to identify the user's intent and then maps that intent to actions that you implement.
- Add support for any LLM of your choice without changing the bot logic.
Teams AI library supports both JavaScript and C#. It allows you to harness AI capabilities to build intelligent, user-friendly applications for Microsoft Teams. The library provides the flexibility to create AI-powered experiences using the tools and languages that best suit your project needs and ensures the best possible outcomes for your Teams users.
Teams AI library offers a variety of features that can simplify the development of your custom engine agent.
As a developer, I want to build an intelligent lightbot that controls the lights in response to the user's commands. I'm considering using Teams AI library because its features can make building my custom engine agent a breeze. I want my AI-powered lightbot to improve the user experience and keep users engaged.
How can I use Teams AI library to make sure my custom engine agent runs smoothly and gives users a fun and interactive experience?
The following table lists the updates to Teams AI library:

| Type | Description | .NET | JavaScript | Python |
| --- | --- | --- | --- | --- |
| OpenAIModel | The `OpenAIModel` class allows you to call both OpenAI and Azure OpenAI with one single component. New models can be defined for other model types, such as Llama2. | ✔️ | ✔️ | ✔️ |
| Embeddings | The `OpenAIEmbeddings` class allows you to generate embeddings using either OpenAI or Azure OpenAI. New embeddings can be defined for things like OSS embeddings. | ❌ | ✔️ | ✔️ |
| Prompts | A new object-based prompt system enables better token management and reduces the likelihood of overflowing the model's context window. | ✔️ | ✔️ | ✔️ |
| Augmentation | Augmentations simplify prompt engineering tasks by letting the developer add named augmentations to their prompt. Only `sequence`, `monologue`, and `tools` style augmentations are supported. | ✔️ | ✔️ | ✔️ |
| Data Sources | A new `DataSource` plugin makes it easy to add RAG to any prompt. You can register a named data source with the planner and then specify the names of the data sources you want to use to augment the prompt. | ❌ | ✔️ | ✔️ |
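For example, in the JavaScript library you can attach a named data source to the prompt manager that the planner uses. The following is a minimal sketch that assumes the `TextDataSource` helper and the prompt manager's `addDataSource` method; the data source name and text are placeholders:

```typescript
import path from 'path';
import { PromptManager, TextDataSource } from '@microsoft/teams-ai';

// Prompt manager that loads prompt templates from the prompts folder.
const prompts = new PromptManager({
    promptsFolder: path.join(__dirname, '../src/prompts')
});

// Register a named data source. Prompts that want this extra grounding text
// reference the name 'restaurant-faq' in their configuration.
prompts.addDataSource(
    new TextDataSource('restaurant-faq', 'The Pub is open from 11 AM to 11 PM, seven days a week.')
);
```

You then pass this prompt manager to the planner and reference the data source by name from the prompt's configuration.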
Function calls, implemented within the AI SDK, unlock numerous capabilities and enable the AI model to generate accurate responses seamlessly. They allow the model to connect directly with external tools, making AI even more powerful. These capabilities include performing complex calculations, retrieving important data, creating smoother workflows, and enabling dynamic interactions with users.
To use function calling with the Chat Completions API:
Set up the planner so that the default prompt uses the tools augmentation. Update one of the following files in your bot app:

- For a JavaScript app: Update `index.ts`.
- For a C# bot app: Update `Program.cs`.
- For a Python app: Update `bot.py`.

The following code snippet shows how to set up the planner.
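Here's a minimal sketch for a JavaScript (TypeScript) app using the `OpenAIModel`, `PromptManager`, and `ActionPlanner` classes from the `@microsoft/teams-ai` package. The environment variable, model name, and the `tools` prompt name (a prompt whose `config.json` declares the tools augmentation) are placeholders for your own values:

```typescript
import path from 'path';
import { ActionPlanner, OpenAIModel, PromptManager } from '@microsoft/teams-ai';

// Create the model that the planner calls. For Azure OpenAI, use the
// azureApiKey, azureDefaultDeployment, azureEndpoint, and azureApiVersion
// options instead of apiKey and defaultModel.
const model = new OpenAIModel({
    apiKey: process.env.OPENAI_KEY!,
    defaultModel: 'gpt-3.5-turbo',
    logRequests: true
});

// Load the prompt templates (including their config.json files) from disk.
const prompts = new PromptManager({
    promptsFolder: path.join(__dirname, '../src/prompts')
});

// Create the planner. 'tools' is the name of the default prompt folder,
// whose config.json specifies the tools augmentation.
const planner = new ActionPlanner({
    model,
    prompts,
    defaultPrompt: 'tools'
});
```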
Specify the tools augmentation in the prompt's `config.json` file.
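As a sketch, a prompt `config.json` that enables the tools augmentation might look like the following; the description, model, and completion settings are example values rather than requirements:

```json
{
    "schema": 1.1,
    "description": "A bot that can turn the lights on and off",
    "type": "completion",
    "completion": {
        "model": "gpt-3.5-turbo",
        "completion_type": "chat",
        "include_history": true,
        "include_input": true,
        "max_input_tokens": 2800,
        "max_tokens": 1000,
        "temperature": 0.2,
        "top_p": 0.0,
        "presence_penalty": 0.6,
        "frequency_penalty": 0.0
    },
    "augmentation": {
        "augmentation_type": "tools"
    }
}
```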
Specify all your actions in the `actions.json` file, which is in the `prompts` folder. Ensure that you follow the schema to avoid errors when the LLM calls an action.
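For example, continuing the lightbot scenario described earlier, an `actions.json` file could define the available actions as follows; the action names and the `Pause` parameter are illustrative:

```json
[
    {
        "name": "LightsOn",
        "description": "Turns on the lights"
    },
    {
        "name": "LightsOff",
        "description": "Turns off the lights"
    },
    {
        "name": "Pause",
        "description": "Delays for a period of time",
        "parameters": {
            "type": "object",
            "properties": {
                "time": {
                    "type": "number",
                    "description": "The amount of time to delay in milliseconds"
                }
            },
            "required": ["time"]
        }
    }
]
```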
Register your action handlers in your `Application` class.
Each handler is a callback function that runs when a specific event happens; the function call handler executes code in response to that event. Each function call handler must return a string as the output of the function call. When the model requests to invoke any functions, the requests are mapped to commands within a plan and are invoked by the AI class. The outputs are then returned to the model with tool call IDs to show that the tools were used.
The following code snippet shows how to register the action handlers.
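Here's a JavaScript (TypeScript) sketch that continues the planner setup above and assumes the illustrative `LightsOn`, `LightsOff`, and `Pause` actions from the sample `actions.json`; `MemoryStorage` and `TurnContext` come from the `botbuilder` package:

```typescript
import { MemoryStorage, TurnContext } from 'botbuilder';
import { Application, TurnState } from '@microsoft/teams-ai';

interface ConversationState {
    lightsOn: boolean;
}
type ApplicationTurnState = TurnState<ConversationState>;

// Create the application and attach the planner configured earlier.
const app = new Application<ApplicationTurnState>({
    storage: new MemoryStorage(),
    ai: { planner }
});

// Each handler returns a string, which is sent back to the model as the
// output of the function call.
app.ai.action('LightsOn', async (context: TurnContext, state: ApplicationTurnState) => {
    state.conversation.lightsOn = true;
    await context.sendActivity('[lights on]');
    return 'the lights are now on';
});

app.ai.action('LightsOff', async (context: TurnContext, state: ApplicationTurnState) => {
    state.conversation.lightsOn = false;
    await context.sendActivity('[lights off]');
    return 'the lights are now off';
});

app.ai.action(
    'Pause',
    async (context: TurnContext, state: ApplicationTurnState, parameters: { time: number }) => {
        await context.sendActivity(`[pausing for ${parameters.time / 1000} seconds]`);
        await new Promise((resolve) => setTimeout(resolve, parameters.time));
        return 'done pausing';
    }
);
```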
You can enable the following tool options:
- Enable tool choice: Allow the model to select the function it must call by enabling tool selection. In the `config.json` file:
  - Set `tool_choice` to `required` to mandate that the model always calls at least one function.
  - Set `tool_choice` to a specific function, using its definition, to make the model call that function.
  - Set `tool_choice` to `none` to disable the tool.

  The default value of `tool_choice` is `auto`. It enables the model to select the functions that it must call.

- Toggle parallel tool calls: Executing tools in parallel is faster and reduces the number of back-and-forth calls to the API. In the `config.json` file, you can set `parallel_tool_calls` to `true` or `false`. By default, `parallel_tool_calls` is set to `true`.
The following code snippet shows how to enable tool choice and toggle parallel tool calls.
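As a sketch, the relevant portion of the prompt's `config.json` could look like the following. It assumes that the `tool_choice` and `parallel_tool_calls` settings sit alongside the other completion options and are passed through to the Chat Completions API as described above:

```json
{
    "schema": 1.1,
    "description": "A bot that can turn the lights on and off",
    "type": "completion",
    "completion": {
        "model": "gpt-3.5-turbo",
        "completion_type": "chat",
        "tool_choice": "auto",
        "parallel_tool_calls": true,
        "max_input_tokens": 2800,
        "max_tokens": 1000,
        "temperature": 0.2
    },
    "augmentation": {
        "augmentation_type": "tools"
    }
}
```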
| Sample name | Description | .NET | Node.js | Python |
| --- | --- | --- | --- | --- |
| Echo bot | This sample shows how to incorporate a conversational flow into a Microsoft Teams application using Bot Framework and Teams AI library. | View | View | View |
| Search command message extension | This sample shows how to incorporate a basic message extension app into a Microsoft Teams application using Bot Framework and Teams AI library. | View | View | View |
| Typeahead bot | This sample shows how to incorporate typeahead search functionality in Adaptive Cards into a Microsoft Teams application using Bot Framework and Teams AI library. | View | View | View |
| Conversational bot with AI: Teams chef | This sample shows how to incorporate conversational bot behavior into Microsoft Teams. The bot is built to allow GPT to facilitate the conversation on its behalf, using only a natural language prompt file to guide it. | View | View | |
| Message extensions: GPT-ME | This sample is a message extension for Microsoft Teams that uses the text-davinci-003 model to help users generate and update posts. | View | View | View |
| Light bot | This sample illustrates more complex conversational bot behavior in Microsoft Teams. The bot is built to allow GPT to facilitate the conversation on its behalf, uses manually defined responses, and maps user intents to user-defined actions. | View | View | View |
| List bot | This sample shows how to incorporate conversational bot behavior into Microsoft Teams. The bot harnesses the power of AI to simplify your workflow and bring order to your daily tasks, and showcases action chaining capabilities. | View | View | View |
| DevOps bot | This sample shows how to incorporate conversational bot behavior in Microsoft Teams. The bot uses the gpt-3.5-turbo model to chat with Teams users and perform DevOps actions such as create, update, triage, and summarize work items. | View | View | View |
| Twenty questions | This sample showcases the incredible capabilities of language models and the concept of user intent. Challenge your skills as a human player and try to guess a secret within 20 questions, while the AI-powered bot answers your queries about the secret. | View | View | View |
| Math tutor assistant | This example shows how to create a conversational experience using OpenAI's Assistants APIs. It uses OpenAI's Code Interpreter tool to create an assistant that's an expert on math. | View | View | View |
| Food ordering assistant | This example shows how to create a conversational assistant that uses tools to call actions in your bot's code. It's a food ordering assistant for a fictional restaurant called The Pub and is capable of complex interactions with the user as it takes their order. | View | View | View |
Copilot handoff
Teams AI library FAQs