How to Build AI-First Apps with Microsoft Semantic Kernel Plugins — A Practical To-Do Manager Example
Introduction
Large Language Models (LLMs) like GPT-4 are amazing at answering questions and drafting text — but what if you want them to run real code, query your data, or interact with your systems?
This is where Microsoft Semantic Kernel shines. It lets you build AI-first applications by combining the language skills of an LLM with your own code, data, and services.
In this post, I’ll show you how to use Semantic Kernel’s plugin concept with a simple, practical example: an AI To-Do Manager that understands your natural language instructions and automatically runs Python functions to add, list, and remove tasks.
By the end, you’ll see exactly how to:
- define a plugin as a plain Python class with @kernel_function-decorated methods,
- register it with a Kernel alongside an Azure OpenAI chat service, and
- enable automatic function calling so the LLM invokes your functions on its own.
Let’s get started!
What Are Plugins in Semantic Kernel?
In Semantic Kernel, a plugin is a collection of functions that the AI can use to do things it can’t do on its own — like call your code, fetch data, or update something in the real world.
When a user sends a prompt, the LLM decides:
- whether it can answer directly from its own knowledge, or
- which plugin function to call, and with what arguments.
This is called function calling — the LLM plans what needs to happen, the kernel runs your code, and then the LLM generates a final response with the results.
How This Works
Here’s the basic flow:
1. The user types a natural-language request.
2. The LLM plans which plugin function(s) to call, and with what arguments.
3. The kernel executes your code and collects the results.
4. The LLM turns those results into a final, conversational response.
This pattern is simple, but powerful — because it means your AI agent can do real work, not just chat.
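To make the flow concrete, here is a toy sketch of the plan-execute-respond loop in plain Python. Everything here is illustrative, not Semantic Kernel’s actual API: fake_llm_plan stands in for the LLM’s planning step, and the registry stands in for the kernel.

```python
# A toy sketch of the function-calling loop. The fake_llm_plan function
# stands in for the LLM's planning step; the registry stands in for the
# kernel. All names here are illustrative.
registry = {}

def register(name, fn):
    registry[name] = fn

def fake_llm_plan(prompt):
    # A real LLM returns a structured function call; we fake one here.
    if prompt.startswith("add "):
        return "AddTask", {"task": prompt[4:]}
    return "ListTasks", {}

def handle(prompt):
    name, args = fake_llm_plan(prompt)   # 1. the LLM plans the call
    result = registry[name](**args)      # 2. the kernel runs your code
    return f"Done: {result}"             # 3. the LLM phrases the reply

tasks = []
register("AddTask", lambda task: (tasks.append(task), f'added "{task}"')[1])
register("ListTasks", lambda: ", ".join(tasks) or "no tasks yet")

print(handle("add buy milk"))
print(handle("show my tasks"))
```

The real kernel does the same job with much more machinery (schemas, serialization, retries), but the division of labor is identical: the model plans, your code executes.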
A Practical Example: The AI To-Do Manager
Let’s look at a practical example: a To-Do Manager plugin.
Here’s what it does:
- AddTask: adds a task to an in-memory list
- ListTasks: shows all current tasks, numbered from 1
- RemoveTask: deletes a task by its 1-based index
Below is the plugin code:
from semantic_kernel.functions import kernel_function


class ToDoPlugin:
    def __init__(self):
        self.tasks = []

    @kernel_function(
        name="AddTask",
        description="Add a new task to the to-do list"
    )
    def add_task(self, task: str) -> str:
        self.tasks.append(task)
        return f'Task added: "{task}"'

    @kernel_function(
        name="ListTasks",
        description="List all tasks in the to-do list"
    )
    def list_tasks(self) -> str:
        if not self.tasks:
            return "No tasks yet!"
        return "\n".join(f"{idx+1}. {t}" for idx, t in enumerate(self.tasks))

    @kernel_function(
        name="RemoveTask",
        description="Remove a task by its index (starting at 1)"
    )
    def remove_task(self, index: int) -> str:
        if 0 < index <= len(self.tasks):
            removed = self.tasks.pop(index - 1)
            return f'Removed task: "{removed}"'
        return "Invalid task index!"
Each function is decorated with @kernel_function so Semantic Kernel knows how to expose it to the LLM.
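Because the decorator only attaches metadata, the methods stay ordinary Python and can be exercised without a kernel or an LLM. A minimal sketch (a trimmed copy of the plugin above; the try/except stub is only so the snippet runs even where semantic-kernel isn’t installed):

```python
# Test the plugin logic directly, with no kernel or LLM involved.
try:
    from semantic_kernel.functions import kernel_function
except ImportError:
    # Stub decorator for environments without semantic-kernel installed;
    # the real decorator also just annotates the function with metadata.
    def kernel_function(name=None, description=None):
        return lambda fn: fn

class ToDoPlugin:
    def __init__(self):
        self.tasks = []

    @kernel_function(name="AddTask", description="Add a new task to the to-do list")
    def add_task(self, task: str) -> str:
        self.tasks.append(task)
        return f'Task added: "{task}"'

    @kernel_function(name="ListTasks", description="List all tasks in the to-do list")
    def list_tasks(self) -> str:
        if not self.tasks:
            return "No tasks yet!"
        return "\n".join(f"{i+1}. {t}" for i, t in enumerate(self.tasks))

plugin = ToDoPlugin()
plugin.add_task("Call mom")
print(plugin.list_tasks())
```

This makes plugins easy to unit-test in isolation before wiring them up to a model.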
Putting It All Together
In your main app (agent.py), you:
- create a Kernel and add an Azure OpenAI chat completion service,
- register the ToDoPlugin under the name "ToDo",
- set FunctionChoiceBehavior.Auto() so the LLM may call functions on its own, and
- run a chat loop that feeds each user message (plus the growing history) to the model.
The AI automatically calls your plugin functions as needed:
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.contents.chat_history import ChatHistory
from semantic_kernel.connectors.ai.open_ai.prompt_execution_settings.azure_chat_prompt_execution_settings import AzureChatPromptExecutionSettings
from todo_plugin import ToDoPlugin
import asyncio
import os
from dotenv import load_dotenv


async def main():
    load_dotenv()
    kernel = Kernel()

    # Add Azure OpenAI chat completion
    chat_completion = AzureChatCompletion(
        deployment_name=os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME"),
        api_key=os.getenv("AZURE_OPENAI_API_KEY"),
        base_url=os.getenv("AZURE_OPENAI_BASE_URL"),
    )
    kernel.add_service(chat_completion)

    # Add your plugin
    kernel.add_plugin(ToDoPlugin(), plugin_name="ToDo")

    # Enable planning (automatic function calling)
    execution_settings = AzureChatPromptExecutionSettings()
    execution_settings.function_choice_behavior = FunctionChoiceBehavior.Auto()

    history = ChatHistory()
    print("💡 AI To-Do Manager is ready! Type your tasks or 'exit' to quit.\n")

    while True:
        user_input = input("You > ")
        if user_input.lower() == "exit":
            break
        history.add_user_message(user_input)
        result = await chat_completion.get_chat_message_content(
            chat_history=history,
            settings=execution_settings,
            kernel=kernel,
        )
        print("Assistant >", str(result))
        history.add_message(result)


if __name__ == "__main__":
    asyncio.run(main())
When you run this, you can try:
You > Add 'Finish project proposal'
You > Add 'Call mom'
You > Show my tasks
You > Remove the second task
You > What tasks are left?
The AI calls your plugin functions automatically.
How to Run This Yourself
I’ve packaged this as a starter GitHub repo so you can try it right away.
How to run:
1. Clone the repo.
2. Install the dependencies: pip install semantic-kernel python-dotenv
3. Create a .env file with your AZURE_OPENAI_DEPLOYMENT_NAME, AZURE_OPENAI_API_KEY, and AZURE_OPENAI_BASE_URL values.
4. Start the agent: python agent.py
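The agent reads three environment variables via load_dotenv(); a .env file like this supplies them (the values below are placeholders, substitute your own):

```
AZURE_OPENAI_DEPLOYMENT_NAME=<your-deployment-name>
AZURE_OPENAI_API_KEY=<your-api-key>
AZURE_OPENAI_BASE_URL=<your-endpoint-url>
```

Keep the .env file out of version control, since it contains your API key.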
How This Pattern Helps in the Real World
This simple To-Do Manager shows a real pattern you can use in many scenarios:
- assistants that query or update your own databases and APIs,
- internal tools where the LLM triggers existing business logic, and
- agents that take actions (create tickets, send messages, update records) instead of only chatting.
Plugins make your AI agent truly useful — connecting the LLM’s language skills to your real business logic and data.
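As a sketch of how the pattern generalizes, here is a hypothetical second plugin. OrderPlugin, its method, and the canned data are illustrative (not part of the starter repo); the try/except stub just lets the snippet run where semantic-kernel isn’t installed:

```python
# Hypothetical plugin: the LLM can look up order status on request.
try:
    from semantic_kernel.functions import kernel_function
except ImportError:
    # Stub decorator for environments without semantic-kernel installed.
    def kernel_function(name=None, description=None):
        return lambda fn: fn

class OrderPlugin:
    def __init__(self, orders):
        self.orders = orders  # stands in for your real database or API

    @kernel_function(
        name="GetOrderStatus",
        description="Look up an order's status by its ID"
    )
    def get_order_status(self, order_id: str) -> str:
        return self.orders.get(order_id, "Order not found")

plugin = OrderPlugin({"A123": "shipped", "B456": "processing"})
print(plugin.get_order_status("A123"))
```

Registering it alongside the To-Do plugin is one line: kernel.add_plugin(OrderPlugin(orders), plugin_name="Orders").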
Key Takeaways
- A plugin is just a Python class; @kernel_function exposes its methods to the LLM.
- FunctionChoiceBehavior.Auto() lets the model decide when (and with what arguments) to call them.
- The LLM plans the calls and phrases the responses; your code does the actual work.
- State (here, the in-memory task list) lives in your application, not in the model.
🔗 Try the Repo
👉 Get the Starter Repo. Feel free to fork it, extend it, and make it your own.
📚 Learn More
✨ Happy Building!
I hope this shows you how easy it is to combine AI with your own systems using Semantic Kernel plugins.