Using Microsoft Semantic Kernel with Amazon Bedrock (and Google Gemma)

If you want to use Microsoft's Semantic Kernel SDK with Amazon Bedrock - including third-party models like Google Gemma - there's a neat trick that saves you a lot of pain. This article walks through the full setup, from creating the Bedrock API key to running your first prompt.

Why this combination?

Semantic Kernel is Microsoft's open-source SDK for building AI-powered applications in .NET (and Python/Java). It gives you prompt templating, function orchestration, memory, and a clean abstraction over multiple AI providers.

Amazon Bedrock is AWS's managed service for foundation models. It hosts models from Anthropic, Meta, Mistral, Google, and others - all behind a single AWS API.

The catch: Semantic Kernel ships a dedicated Microsoft.SemanticKernel.Connectors.Amazon package that uses the AWS SDK directly. But at the time of writing it only supports a subset of Bedrock's native model providers - Google models like Gemma will throw:

Unsupported model provider: google        

The solution is Amazon Bedrock's **OpenAI-compatible endpoint** (the "Bedrock Mantle" endpoint). It exposes an OpenAI-shaped API in front of any Bedrock model, which means you can use Semantic Kernel's standard AddOpenAIChatCompletion and point it at Bedrock instead.

Step 1 - Create an AWS Bedrock API key

Bedrock's OpenAI-compatible endpoint uses a long-lived bearer token rather than SigV4 signing.

  1. Open the [AWS Console](https://console.aws.amazon.com/) and navigate to Amazon Bedrock.
  2. In the left menu, go to Settings, then API keys.
  3. Click Create API key, give it a name, and set an expiry.
  4. Copy the generated key - it will look like a base64 string. You will not be able to see it again.
  5. Make sure the IAM user or role associated with the key has the bedrock:InvokeModel permission for the models you want to use.
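For step 5, a minimal policy sketch might look like the following (this is an assumption for illustration, not AWS's recommended policy; adjust the region and model ID to match yours):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["bedrock:InvokeModel"],
      "Resource": "arn:aws:bedrock:eu-west-1::foundation-model/google.gemma-3-27b-it"
    }
  ]
}
```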

Also note the endpoint URL shown on that page - it will be regional, for example:

https://bedrock-mantle.eu-west-1.api.aws/v1        
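Before writing any .NET code, you can sanity-check the key and endpoint with curl. This is a sketch: it assumes the AWS_BEARER_TOKEN_BEDROCK environment variable holds the key from step 1, and uses the region and model ID from this article.

```shell
# Sanity-check the Bedrock OpenAI-compatible endpoint before touching .NET.
# Assumes AWS_BEARER_TOKEN_BEDROCK holds the API key created in step 1.
BASE_URL="https://bedrock-mantle.eu-west-1.api.aws/v1"
if [ -z "$AWS_BEARER_TOKEN_BEDROCK" ]; then
  echo "AWS_BEARER_TOKEN_BEDROCK is not set"
else
  # Standard OpenAI chat completions request, pointed at Bedrock
  curl -s "$BASE_URL/chat/completions" \
    -H "Authorization: Bearer $AWS_BEARER_TOKEN_BEDROCK" \
    -H "Content-Type: application/json" \
    -d '{"model": "google.gemma-3-27b-it", "messages": [{"role": "user", "content": "Say hello"}]}'
fi
```

If this returns a JSON chat completion, the key, region, and model access are all working and any remaining problems are in your .NET setup.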

Step 2 - Enable the model in Bedrock

I didn't need to do this myself, but you might. In the Bedrock console, go to **Model access** and request access to the model you want to use. For Google Gemma 3 27B:

  • Provider: Google
  • Model: Gemma 3 27B Instruct

Access is usually granted immediately for most models.

Step 3 - Set environment variables

Store credentials as user environment variables rather than hardcoding them:

On Windows you can set these via **System Properties > Environment Variables**, or in PowerShell:

[System.Environment]::SetEnvironmentVariable("AWS_BEARER_TOKEN_BEDROCK", "<key>", "User")
[System.Environment]::SetEnvironmentVariable("OPENAI_BASE_URL", "https://bedrock-mantle.eu-west-1.api.aws/v1", "User")
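On macOS or Linux (assuming you keep the same variable names), the equivalent is a pair of exports in your shell profile:

```shell
# Equivalent environment variables in a POSIX shell (e.g. ~/.bashrc or ~/.zshrc).
# Replace <key> with the Bedrock API key from step 1.
export AWS_BEARER_TOKEN_BEDROCK="<key>"
export OPENAI_BASE_URL="https://bedrock-mantle.eu-west-1.api.aws/v1"
```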

Step 4 - Create the .NET project

dotnet new console -n SemanticKernelBedrockDemo
cd SemanticKernelBedrockDemo
dotnet add package Microsoft.SemanticKernel --version 1.74.0        

You do not need Microsoft.SemanticKernel.Connectors.Amazon for this approach.

Step 5 - Write the code

using Microsoft.SemanticKernel;

class Program
{
    static async Task Main()
    {
        try
        {
            // Load the Bedrock API key from environment variables rather than hardcoding it
            var apiKey = Environment.GetEnvironmentVariable("AWS_BEARER_TOKEN_BEDROCK", EnvironmentVariableTarget.User);
            if (string.IsNullOrWhiteSpace(apiKey))
            {
                Console.WriteLine("Please set the AWS_BEARER_TOKEN_BEDROCK environment variable.");
                return;
            }

            var baseUrl = Environment.GetEnvironmentVariable("OPENAI_BASE_URL", EnvironmentVariableTarget.User);
            if (string.IsNullOrWhiteSpace(baseUrl))
            {
                Console.WriteLine("Please set the OPENAI_BASE_URL environment variable.");
                return;
            }

            var builder = Kernel.CreateBuilder();

            builder.AddOpenAIChatCompletion(
                modelId: "google.gemma-3-27b-it",
                apiKey: apiKey,
                endpoint: new Uri(baseUrl)
            );

            var kernel = builder.Build();

            // Define a semantic function (prompt template)
            string promptTemplate = @"
You are a helpful assistant.
Summarize the following text in 3 bullet points:
{{$input}}
";

            var summarizeFunction = kernel.CreateFunctionFromPrompt(
                promptTemplate,
                functionName: "SummarizeText",
                description: "Summarizes text into 3 bullet points"
            );

            // Example input
            string textToSummarize = @"
Semantic Kernel is an SDK that lets you easily integrate AI services into your apps.
It supports prompt templates, memory, and function orchestration.
You can use it with OpenAI, Azure OpenAI, and other connectors.
";

            // Run the semantic function
            var result = await kernel.InvokeAsync(summarizeFunction, new() { ["input"] = textToSummarize });

            Console.WriteLine("Summary:\n" + result);
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error: " + ex.Message);
        }
    }
}        

The key line is AddOpenAIChatCompletion with the endpoint parameter overriding where requests go. The AWS_BEARER_TOKEN_BEDROCK value is sent as the Authorization: Bearer header, which is what the Bedrock Mantle endpoint expects. The modelId is the full Bedrock model ID (`google.gemma-3-27b-it`), passed through as-is.

Running it

dotnet run        

Output:

Summary:
Here's a 3-bullet point summary of the text:
* **AI Integration:** Semantic Kernel is a software development kit (SDK) designed to simplify adding Artificial Intelligence capabilities to your applications.
* **Key Features:** It provides tools for managing prompts, storing information (memory), and coordinating AI functions.
* **Broad Compatibility:** Semantic Kernel works with various AI services like OpenAI and Azure OpenAI, and is expandable with connectors for others.        

Key pitfalls

  • Do not use AddBedrockChatCompletionService for Google models - the Semantic Kernel Amazon connector does not support them and throws at startup.
  • Use AWS_BEARER_TOKEN_BEDROCK, not your OpenAI key - they look superficially similar but are different credentials for different services.
  • The endpoint is regional - make sure the region in your URL matches where you created the API key and enabled model access.
  • Model IDs use dot notation - on the Bedrock OpenAI-compatible endpoint you pass the full Bedrock model ID (`google.gemma-3-27b-it`), not an OpenAI-style name.
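
On the last point: if you're unsure of the exact model ID, the AWS CLI can list what's available per provider (this assumes you have the AWS CLI installed with credentials configured for the right region):

```shell
# List Bedrock foundation-model IDs for a given provider.
# Requires the AWS CLI with credentials configured for your Bedrock region.
QUERY='modelSummaries[].modelId'
if command -v aws >/dev/null 2>&1; then
  aws bedrock list-foundation-models --by-provider google \
    --query "$QUERY" --output text
else
  echo "aws CLI not installed"
fi
```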

The Bedrock OpenAI-compatible endpoint is an underappreciated feature - it means any SDK or tool that speaks the OpenAI API format can instantly talk to Bedrock, with no AWS SDK dependency required. For Semantic Kernel in particular, it opens up the full Bedrock model catalogue without waiting for the Amazon connector to catch up.

More articles by Simon Hughes