Building an AI Agent with Semantic Kernel, C#, OpenAI and MongoDB Atlas

Luce Carter13 min read • Published Nov 28, 2024 • Updated Nov 28, 2024
.NET • AI • C#
GenAI continues to take off, and AI agents are no exception. More and more developers are being asked to build solutions that integrate AI agents into the user experience, helping businesses reduce costs by answering many user questions or requests without the need for human interaction.
With winter on its way, there is no better time to think about comforting food on those cold nights. For this reason, in this tutorial, we are going to create an AI agent using Microsoft’s Semantic Kernel for C#, OpenAI, and MongoDB Atlas to help you decide whether you have the ingredients to cook, or whether you should just head to a recommended cozy restaurant and let someone else do the cooking!
Our agent is going to be made up of a few pieces:
  1. A plugin for retrieving a list of available ingredients from a text file.
  2. A prompt that uses AI to return what the cuisine of the given meal is.
  3. Another plugin for searching a collection in our MongoDB Atlas cluster.
To explain briefly, a plugin is a piece of custom code that is marked as available to the AI and can be called from within the code to achieve some functionality, such as calling an API or interacting with other services.
A prompt is also custom, but instead it builds up text, often with dynamic inputs, that is then sent to the AI so it can carry out a task.
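To make the distinction concrete, here is a toy sketch (not part of this tutorial’s code) of what a plugin looks like; the WeatherPlugin name and its behavior are made up purely for illustration. A prompt, by contrast, is just templated text such as "Suggest a comforting dinner for this weather: {{$weather}}" that gets rendered and sent to the model.
using System.ComponentModel;
using Microsoft.SemanticKernel;

// A plugin: ordinary C# code the AI is told about and can choose to call.
// WeatherPlugin is a made-up example, not part of the tutorial's project.
public class WeatherPlugin
{
    [KernelFunction, Description("Get today's weather for a city")]
    public static string GetWeather([Description("The city to check")] string city)
        => $"It is cold and rainy in {city}."; // a real plugin might call a weather API here
}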
Microsoft Learn has a fantastic course on Semantic Kernel if you want to learn more about both.
If you would like to learn more about integrating retrieval-augmented generation (RAG) with Semantic Kernel, you can learn to build a movie recommendation bot that stores the data and vector embeddings in MongoDB Atlas and uses Atlas Vector Search under the hood, via a Semantic Kernel connector, to search the data.
But before we get distracted by hunger, let’s get started so we have our agent ready in time for dinner!

Prerequisites

You will need a few things in place in order to follow along with this tutorial:
  • Atlas M0 cluster deployed with the full sample dataset loaded.
  • OpenAI account and API key.
  • .NET 9 SDK.
  • The starter code that can be found on the “start” branch on GitHub.

Exploring the starter repo

In order to save time with some of the code and files, the starter repo comes with some things already available out of the box.
List of files available in the project on the start branch of the repo
  • Inside of Data/cupboardinventory.txt is a list of ingredients that you might find in a cupboard or fridge. You can always make changes to this if you wish to add or remove ingredients, and casing doesn't matter. We will use this to simulate what ingredients are available.
  • Restaurants.cs within Models has the properties we care about for a restaurant. The MongoDB connector for Semantic Kernel (which is already added to the project) gives us access to the MongoDB C# Driver under the hood, which means we can represent our documents as simple classes.
  • Program.cs has a method already implemented inside it called GenerateEmbeddingsForCuisine(). We want to generate embeddings for the cuisine field of documents from the sample dataset so they are available to the application later on. We don't need to create embeddings for every document, though; a good sample size is enough, so it is set to fetch 1,000 documents. A rough sketch of what such a method might look like follows this list; if you want to understand it in more detail, the section on adding documents to the memory store in the Semantic Kernel and RAG article goes further.
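You don't need to write this method yourself since it ships with the starter code, but as a rough sketch only (the actual implementation in the repo may differ in its details), it does something along these lines, assuming the Restaurant model exposes Name and Cuisine properties:
// Rough sketch of a GenerateEmbeddingsForCuisine-style method; the real one in the starter repo may differ.
// Requires the MongoDB.Driver, Microsoft.SemanticKernel.Memory, and MongoDB/OpenAI connector usings,
// plus the Restaurant model from the Models folder.
async Task GenerateEmbeddingsForCuisineSketch()
{
    string connectionString = Environment.GetEnvironmentVariable("MONGODB_ATLAS_CONNECTION_STRING");
    string openAIApiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");

    var restaurants = new MongoClient(connectionString)
        .GetDatabase("sample_restaurants")
        .GetCollection<Restaurant>("restaurants");

    // A memory backed by MongoDB Atlas that generates OpenAI embeddings for anything we save.
    var memory = new MemoryBuilder()
        .WithOpenAITextEmbeddingGeneration("text-embedding-3-small", openAIApiKey)
        .WithMemoryStore(new MongoDBMemoryStore(connectionString, "sample_restaurants", "restaurants_index"))
        .Build();

    // Take a sample of 1,000 restaurants and store each cuisine, with its embedding,
    // in the embedded_cuisines collection.
    var sample = await restaurants.Find(FilterDefinition<Restaurant>.Empty).Limit(1000).ToListAsync();

    foreach (var restaurant in sample)
    {
        await memory.SaveInformationAsync(
            collection: "embedded_cuisines",
            text: restaurant.Cuisine,
            id: Guid.NewGuid().ToString(),
            description: restaurant.Name,
            additionalMetadata: restaurant.Cuisine);
    }
}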

First steps

The starter code we are using is a traditional .NET console application using .NET 9. Although we could go ahead and add appsettings files and configure them in Program.cs, that is excessive for a simple demo, so we are going to take advantage of environment variables.
Before we start using our environment variables in the code, let's save them. Run each of the following commands (depending on your operating system) one by one in your command line of choice:
export OPENAI_API_KEY="<REPLACE WITH YOUR OPEN AI API KEY>" # macOS/Linux
set OPENAI_API_KEY="<REPLACE WITH YOUR OPEN AI API KEY>" # Windows

export OPENAI_MODEL_NAME="<REPLACE WITH YOUR MODEL OF CHOICE>" # macOS/Linux
set OPENAI_MODEL_NAME="<REPLACE WITH YOUR MODEL OF CHOICE>" # Windows

export MONGODB_ATLAS_CONNECTION_STRING="<YOUR MONGODB ATLAS CONNECTION STRING>" # macOS/Linux
set MONGODB_ATLAS_CONNECTION_STRING="<YOUR MONGODB ATLAS CONNECTION STRING>" # Windows
Now we can add the following code within Program.cs, below the using statements, but before the method definition, to fetch these environment variables:
string apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new ArgumentNullException("Environment.GetEnvironmentVariable(\"OPENAI_API_KEY\")");
string modelName = Environment.GetEnvironmentVariable("OPENAI_MODEL_NAME") ?? "gpt-4o-mini";
This checks for the presence of those values and either throws an exception or falls back to a default. The gpt-4o-mini model is perfectly acceptable for our use case.
We also want to call the existing method to generate the embeddings for our sample data.
await GenerateEmbeddingsForCuisine();
It can take a few minutes to generate the embeddings so now would be a good time to run the application for the first time:
dotnet run
You can always continue with the tutorial while you wait, or make a coffee. Once the new embedded_cuisines collection has between 900 and 1,000 documents (or around the number you selected, if you changed the limit in the method), you can stop the application and delete or comment out the call to the method.
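If you'd rather check progress from code than from the Atlas UI, a quick document count will tell you when the embeddings are in place. This is optional and just uses the C# driver directly:
// Optional progress check: roughly 1,000 documents means the embedding generation has finished.
using MongoDB.Bson;
using MongoDB.Driver;

var embeddedCuisines = new MongoClient(Environment.GetEnvironmentVariable("MONGODB_ATLAS_CONNECTION_STRING"))
    .GetDatabase("sample_restaurants")
    .GetCollection<BsonDocument>("embedded_cuisines");

Console.WriteLine(await embeddedCuisines.CountDocumentsAsync(FilterDefinition<BsonDocument>.Empty));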

Setting up Semantic Kernel Builder

Now that our sample data with embedded cuisines is available, it is time to set up Semantic Kernel so that, in the later sections, we can start making tools available to the AI agent to achieve our food-related goal.
var builder = Kernel.CreateBuilder();

builder.Services.AddOpenAIChatCompletion(
    modelName,
    apiKey);

var kernel = builder.Build();
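If you want to sanity-check that the kernel can reach OpenAI before going any further, a quick throwaway prompt (not part of the final application) works well:
// Optional sanity check: remove once you've confirmed the kernel can talk to OpenAI.
var hello = await kernel.InvokePromptAsync("Reply with a one-line dinner idea.");
Console.WriteLine(hello);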
Now the kernel instance is configured and ready to go. We can start to build the first tool our AI can opt to use: the ingredients plugin.

Creating the Ingredients Plugin

As mentioned earlier in this tutorial, we have a list of ingredients available in a .txt file inside the Data folder that we can use to simulate fetching the ingredients from an API (if you have a techy smart fridge for example).
So the first plugin we are going to write is one that fetches all the ingredients from that file. The agent can then use that to get all the available ingredients before deciding if they are missing any ingredients required to cook the chosen meal.
  1. In the root of the project, add a new folder called Plugins.
  2. Create a new class inside that folder called IngredientsPlugin.
  3. Paste the following code inside the class:
[KernelFunction, Description("Get a list of available ingredients")]
public static string GetIngredientsFromCupboard()
{
    // Ensures that the file path works across all operating systems.
    string baseDir = AppContext.BaseDirectory;
    string projectRoot = Path.Combine(baseDir, "..", "..", "..");
    string filePath = Path.Combine(projectRoot, "Data", "cupboardinventory.txt");
    return File.ReadAllText(filePath).ToLower();
}
Note: If your text editor doesn’t automatically add using statements, add the following at the top of the file:
using System.ComponentModel;
using Microsoft.SemanticKernel;
Here we have a simple method called GetIngredientsFromCupboard. It is annotated with the KernelFunction attribute and a Description. This tells the AI that the method is available and what it is for, which helps it decide when, and whether, to call it to achieve a task.
The code inside the method is pretty standard C# for reading a file. The value of AppContext.BaseDirectory differs depending on the operating system and where the application is being run from, so most of this code is just there to build the file path in an OS-agnostic way.
What I find clever is that the method simply returns the contents of the file (lowercased for consistency), and that is all the AI needs to access it: basic C# code combined with the knowledge that the function exists!
We now need to make the plugin available for us to use later on when we build a prompt up for what we want to achieve.
So after the last call to var kernel = builder.Build();, add the following to import the plugin:
kernel.ImportPluginFromType<IngredientsPlugin>();

Creating the GetCuisine Prompt

It’s now time to make the GetCuisine prompt. We need a way to get the AI to tell us what the cuisine of the meal is, and this is where creating a prompt comes in.
There are two ways of creating a prompt: via two files (a JSON config file and a prompt .txt file) or with a YAML file. I find YAML easy to get wrong with its indentation, so we are going to use the former approach.
  1. Create a folder called Prompts in the root of the project.
  2. Create a folder inside that folder called GetCuisine.
  3. Create a new file called config.json and input the following JSON:
{
    "schema": 1,
    "type": "completion",
    "description": "Identify the cuisine of the user's requested meal",
    "execution_settings": {
        "default": {
            "max_tokens": 800,
            "temperature": 0
        }
    },
    "input_variables": [
        {
            "name": "cuisine",
            "description": "Text from the user that contains their requested meal",
            "required": true
        }
    ]
}
This is the configuration for our prompt. It describes what the prompt does, states that it is for chat completion, sets zero creativity (temperature: 0) so the output stays specific, and declares an input variable called cuisine which will contain the requested meal. Because input_variables is an array, you can specify multiple input variables here if needed.
Create another file in the folder called skprompt.txt, which is what dynamically builds up the text that tells the AI what is being asked of it. Then add the following:
Return a single word that represents the cuisine of the requested meal that is sent: {{$cuisine}}.

For example, if the meal is mac and cheese, return American. Or for pizza it would be Italian.
Do not return any extra words, just return the single name of the cuisine.
This is an example of generating a prompt statement and making use of good prompt engineering to shape how well the AI understands and responds. The {{$cuisine}} is how you dynamically populate the prompt with values. This has to start with a $ sign, be inside the double curly brackets and match the name of an input variable declared in the array inside the config.json file.
The way to make prompts available to the AI is slightly different compared to code-based plugins.
After the call to import the IngredientsPlugin, add the following:
string baseDir = AppContext.BaseDirectory;
string projectRoot = Path.Combine(baseDir, "..", "..", "..");
var plugins = kernel.CreatePluginFromPromptDirectory(projectRoot + "/Prompts");
Semantic Kernel is clever and can find all the prompts available within the Prompts directory. Each prompt is then accessible by name, named after its folder (in our case, GetCuisine), via the plugins variable.
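We will use it exactly that way when we bring everything together later, but as a quick preview, invoking the prompt on its own looks like this (with "ramen" as an example meal):
// Preview only: this is how the GetCuisine prompt gets invoked later in the tutorial.
var cuisinePreview = await kernel.InvokeAsync(
    plugins["GetCuisine"],
    new() { { "cuisine", "ramen" } });

Console.WriteLine(cuisinePreview); // e.g., "Japanese"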

Creating the Restaurants Plugin

Lastly, we want to create another plugin, this time for restaurants and interacting with our MongoDB Atlas cluster.
  1. Create a class inside the Plugins folder called RestaurantsPlugin.
  2. Add the following using statements and namespace declaration:
using FoodAgentDotNet.Models;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.MongoDB;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using Microsoft.SemanticKernel.Memory;
using MongoDB.Driver;
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace FoodAgentDotNet.Plugins;
Then replace the rest of the class with the following:
#pragma warning disable
public class RestaurantsPlugin
{
    static readonly string mongoDBConnectionString = Environment.GetEnvironmentVariable("MONGODB_ATLAS_CONNECTION_STRING");
    static readonly string openAIApiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");

    [KernelFunction, Description("Find a restaurant to eat at")]
    public static async Task<List<Restaurant>> GetRecommendedRestaurant(
        [Description("The cuisine to find a restaurant for")] string cuisine)
    {
        var mongoDBMemoryStore = new MongoDBMemoryStore(mongoDBConnectionString, "sample_restaurants", "restaurants_index");
        var memoryBuilder = new MemoryBuilder();
        memoryBuilder.WithOpenAITextEmbeddingGeneration(
            "text-embedding-3-small",
            openAIApiKey);
        memoryBuilder.WithMemoryStore(mongoDBMemoryStore);
        var memory = memoryBuilder.Build();

        var restaurants = memory.SearchAsync(
            "embedded_cuisines",
            cuisine,
            limit: 5,
            minRelevanceScore: 0.5
        );

        List<Restaurant> recommendedRestaurants = new();

        await foreach (var restaurant in restaurants)
        {
            recommendedRestaurants.Add(new Restaurant
            {
                Name = restaurant.Metadata.Description,
                // We include the cuisine so that the AI has this information available to it
                Cuisine = restaurant.Metadata.AdditionalMetadata,
            });
        }
        return recommendedRestaurants;
    }
}
There is quite a lot here so let’s take a look at it.
Some of these features are still considered experimental, so warnings are disabled. Just like with the ingredients plugin from earlier, we add the attribute to the method to mark it as a KernelFunction. However, this time the method also takes a cuisine argument, so we add a Description attribute to the parameter to tell the AI what the argument is for.
Next, we build up the memory and configure MongoDB as its memory store. We also set up OpenAI text embedding generation again, as an embedding will need to be generated for the cuisine passed in so it can be used in the vector search.
We then bring those pieces together to search our embedded_cuisines collection for up to five restaurants that might suit the requested cuisine. We build up a list of recommended restaurants, assigning the values we care about, before returning that list so the AI has it available.
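For the curious, the connector is issuing an Atlas Vector Search query on our behalf. The sketch below is purely illustrative of what a comparable raw $vectorSearch aggregation looks like with the C# driver; the helper name and the numCandidates value are arbitrary choices, and nothing in the tutorial depends on this code:
// Illustrative only: a raw Atlas Vector Search query similar in spirit to what the
// memory store runs for us. Needs MongoDB.Bson and MongoDB.Driver usings.
// The queryVector is the embedding generated for the requested cuisine.
static async Task<List<BsonDocument>> VectorSearchCuisinesAsync(
    IMongoCollection<BsonDocument> embeddedCuisines, IEnumerable<double> queryVector)
{
    var vectorSearchStage = new BsonDocument("$vectorSearch", new BsonDocument
    {
        { "index", "restaurants_index" },              // the index we create in the next section
        { "path", "embedding" },                       // field holding the cuisine embeddings
        { "queryVector", new BsonArray(queryVector) },
        { "numCandidates", 100 },                      // arbitrary candidate pool size
        { "limit", 5 }
    });

    return await embeddedCuisines.Aggregate<BsonDocument>(new[] { vectorSearchStage }).ToListAsync();
}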
We now need to return to Program.cs briefly to add our new plugin. After the previous call to add the IngredientsPlugin, add the following to also add the RestaurantsPlugin:
kernel.ImportPluginFromType<RestaurantsPlugin>();

Adding the Vector Search index

When creating the MongoDBMemoryStore object, we passed it an index called restaurants_index but that doesn’t exist yet. So let’s change that!
It’s coming very soon (in version 3.1 of the C# driver), but for now there is no neat and readable way to programmatically create a vector search index in C#. The easiest way to create one is from within the Atlas UI in a browser or via MongoDB Compass.
I won’t go into detail here, as we already have lots of content on how to do it, and the documentation walks through the steps if you need help.
You can use the following JSON to define the vector search index:
{
    "fields": [
        {
            "type": "vector",
            "path": "embedding",
            "numDimensions": 1536,
            "similarity": "cosine"
        }
    ]
}
I recommend calling the index restaurants_index to match the code. If you choose to use something else, be sure to update the code you pasted earlier inside RestaurantsPlugin.
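If you would rather stay in code, the raw createSearchIndexes database command can also create the index, at least on recent Atlas cluster versions. Treat the following as an optional, hedged sketch rather than the recommended path; the Atlas UI or Compass remains the easiest option until the friendlier driver API lands:
// Optional sketch: create the vector search index with a raw database command.
// Requires a recent Atlas cluster version; the Atlas UI/Compass route is simpler.
using MongoDB.Bson;
using MongoDB.Driver;

var database = new MongoClient(Environment.GetEnvironmentVariable("MONGODB_ATLAS_CONNECTION_STRING"))
    .GetDatabase("sample_restaurants");

var createIndexCommand = new BsonDocument
{
    { "createSearchIndexes", "embedded_cuisines" },
    { "indexes", new BsonArray
        {
            new BsonDocument
            {
                { "name", "restaurants_index" },
                { "type", "vectorSearch" },
                { "definition", new BsonDocument
                    {
                        { "fields", new BsonArray
                            {
                                new BsonDocument
                                {
                                    { "type", "vector" },
                                    { "path", "embedding" },
                                    { "numDimensions", 1536 },
                                    { "similarity", "cosine" }
                                }
                            }
                        }
                    }
                }
            }
        }
    }
};

await database.RunCommandAsync<BsonDocument>(createIndexCommand);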

Bringing it all together

Now that we have all the pieces defined, it is time to bring them together. We are going to request user input and then use that response to build up what we want the AI to do.
First, let’s tell the AI that we want it to carry out calls automatically of its own accord. Add the following after the call to create the plugins variable:
OpenAIPromptExecutionSettings settings = new()
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};
Next let’s add the user interaction:
Console.WriteLine("What would you like to make for dinner?");
var input = Console.ReadLine();
We now want to build up a prompt that specifies what we want to achieve and starts to add our first plugin:
string ingredientsPrompt = @"This is a list of ingredients available to the user:
{{IngredientsPlugin.GetIngredientsFromCupboard}}

Based on their requested dish " + input + ", list what ingredients they are missing from their cupboard to make that meal and return just the list of missing ingredients. If they have similar items such as pasta instead of a specific type of pasta, don't consider it missing";
You can see here how the prompt is built up. We let it know that we have ingredients available that it can get by calling the method we pass inside those double curly brackets. It’s in the format PluginName.Method so it may look familiar.
We then give it the user's reply about what they want to eat and ask it to work out which ingredients are missing to make that meal. Again, there is a little bit of prompt engineering happening at the end, to stop it being too fussy and ignoring perfectly valid ingredients.
We can then actually invoke that prompt:
var ingredientsResult = await kernel.InvokePromptAsync(ingredientsPrompt, new(settings));

var missing = ingredientsResult.ToString().ToLower()
    .Split(new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries)
    .Where(line => line.StartsWith("- "))
    .ToList();
The AI has a tendency to return quite a lot of extra explainer text alongside the list of ingredients, so the missing variable just does some basic formatting to grab only the list of missing ingredients, as we need to be very specific.
We now want to have some fun with the user and decide whether they have enough ingredients to make their meal (or something similar), or whether they should just not bother and go to a restaurant instead! But as well as suggesting they go to a restaurant, we will use our custom plugin to recommend some restaurants too!
var cuisineResult = await kernel.InvokeAsync(
    plugins["GetCuisine"],
    new() { { "cuisine", input } }
);

if (missing.Count >= 5)
{
    string restaurantPrompt = @"This is the cuisine that the user requested: " + cuisineResult + @". Based on this cuisine, recommend a restaurant for the user to eat at. Include the name and address
    {{RestaurantsPlugin.GetRecommendedRestaurant}}";

    var kernelArguments = new KernelArguments(settings)
    {
        { "cuisine", cuisineResult }
    };

    var restaurantResult = await kernel.InvokePromptAsync(restaurantPrompt, kernelArguments);

    Console.WriteLine($"You have so many missing ingredients ({missing.Count}!), why bother? {restaurantResult}");
}
else if (missing.Count < 5 && missing.Count > 0)
{
    Console.WriteLine($"You have most of the ingredients to make {input}. You are missing: ");
    foreach (var ingredient in missing)
    {
        Console.WriteLine(ingredient);
    }
    string similarPrompt = @"The user requested to make " + input + @" but is missing some ingredients. Based on what they want to eat, suggest another meal that is similar from the " + cuisineResult + " cuisine they can make and tell them the name of it but do not return a full recipe";
    var similarResult = await kernel.InvokePromptAsync(similarPrompt, new(settings));

    Console.WriteLine(similarResult);
}
else
{
    Console.WriteLine("You have all the ingredients to make " + input + "!");
    string recipePrompt = @"Find a recipe for making " + input;
    var recipeResult = await kernel.InvokePromptAsync(recipePrompt, new(settings));
    Console.WriteLine(recipeResult);
}
Because we fetched all the prompts available from the Prompts directory earlier, the GetCuisine prompt is now available via the plugins variable I mentioned. So we start by finding out what the cuisine is.
The call takes a KernelArguments object containing the values we want to make available, so we create one inline in the call, passing it the name of the input variable (matching the cuisine variable we defined earlier) and the value we want to assign to it.
We then do some basic if/else statements to handle the various conditions, ranging from a lot of missing ingredients, to missing just a few, to missing none at all!
Inside each one, a slightly different prompt is built up and sent to the AI, with the response then output to the user.

Testing

Now we have all the code in place, let’s try it out!
Debug from within your IDE or use the .NET CLI:
dotnet run
You will then see the prompt asking what you want for dinner. Try entering your favorite meal and see how it works!
Depending on the model, it can take a few seconds to run through so if you don’t get an instant response to your meal name, don’t worry!
AI agent interaction recommending Mexican restaurants

Summary

Woo! In this tutorial, we have combined Microsoft’s Semantic Kernel, OpenAI, and MongoDB Atlas to build a powerful AI agent that helps users decide whether to visit the grocery store or go out for dinner!
Why not try playing around with different meals, or with the ingredients in the included text file, and see what recommendations you get?
If you have a smart fridge that has an API to tell you what ingredients you have, you could even try combining them together so that some of the data is genuinely your own!
Enjoy your meal!