AWS Bedrock anthropic claude tool call integration with microsoft semantic kernel

Apr 15, 2025 - 00:59

As of April 2025, the official Microsoft Semantic Kernel connector for Amazon Bedrock, Microsoft.SemanticKernel.Connectors.Amazon, does not natively support tool/function calls. Semantic Kernel is shifting towards an LLM abstraction layer based on Microsoft.Extensions.AI, aiming for a more unified and extensible architecture, but currently only OpenAI and Ollama implementations are available within this new abstraction. An implementation for AWS Bedrock Anthropic Claude based on Microsoft.Extensions.AI will presumably arrive in the future. In the interim, I implemented a custom solution. The approach leverages the existing IChatClient interface, which already supports function calls, so the solution boils down to implementing that interface on top of the AWS Bedrock Runtime SDK.

Implement IChatClient with AWS Bedrock Runtime

The IChatClient interface essentially boils down to two methods: one for standard chat responses and one for streamed responses. The implementation maps these two methods to the IAmazonBedrockRuntime.ConverseAsync and ConverseStreamAsync methods, as demonstrated in the full implementation of the AnthropicChatClient here.
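
To give a feel for the shape of the implementation, here is a minimal, non-streaming sketch. It assumes the IChatClient surface of recent Microsoft.Extensions.AI releases (GetResponseAsync / GetStreamingResponseAsync) and only maps plain text; the full AnthropicChatClient linked above also translates tool-use/tool-result content blocks and implements streaming via ConverseStreamAsync:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Threading;
    using System.Threading.Tasks;
    using Amazon.BedrockRuntime;
    using Amazon.BedrockRuntime.Model;
    using Microsoft.Extensions.AI;
    
    // Simplified sketch: text-only mapping between Microsoft.Extensions.AI and
    // the Bedrock Converse API. Tool-use handling and streaming are omitted.
    public sealed class AnthropicChatClientSketch : IChatClient
    {
        private readonly IAmazonBedrockRuntime _runtime;
        private readonly string _modelId;
    
        public AnthropicChatClientSketch(IAmazonBedrockRuntime runtime, string modelId)
        {
            _runtime = runtime;
            _modelId = modelId;
        }
    
        public async Task<ChatResponse> GetResponseAsync(
            IEnumerable<ChatMessage> messages,
            ChatOptions? options = null,
            CancellationToken cancellationToken = default)
        {
            // Map Microsoft.Extensions.AI messages to the Bedrock Converse request shape.
            var request = new ConverseRequest
            {
                ModelId = _modelId,
                Messages = messages
                    .Where(m => m.Role != ChatRole.System) // system prompts belong in ConverseRequest.System
                    .Select(m => new Message
                    {
                        Role = m.Role == ChatRole.Assistant
                            ? ConversationRole.Assistant
                            : ConversationRole.User,
                        Content = new List<ContentBlock> { new() { Text = m.Text } }
                    })
                    .ToList(),
                InferenceConfig = new InferenceConfiguration
                {
                    Temperature = options?.Temperature ?? 0f,
                    MaxTokens = options?.MaxOutputTokens ?? 1024
                }
                // options?.Tools would be mapped to ConverseRequest.ToolConfig here.
            };
    
            ConverseResponse response = await _runtime.ConverseAsync(request, cancellationToken);
    
            // Map the Bedrock reply back to a Microsoft.Extensions.AI chat message.
            string text = string.Concat(response.Output.Message.Content.Select(c => c.Text));
            return new ChatResponse(new ChatMessage(ChatRole.Assistant, text));
        }
    
        public IAsyncEnumerable<ChatResponseUpdate> GetStreamingResponseAsync(
            IEnumerable<ChatMessage> messages,
            ChatOptions? options = null,
            CancellationToken cancellationToken = default)
            => throw new NotImplementedException("Maps to ConverseStreamAsync in the full implementation.");
    
        public object? GetService(Type serviceType, object? serviceKey = null)
            => serviceType.IsInstanceOfType(this) ? this : null;
    
        public void Dispose() { }
    }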

Setting up Function Calls with Semantic Kernel

Here's how to set up function calls with Semantic Kernel using our custom AnthropicChatClient:

  1. Set up kernel and functions
    This step configures the chat completion service with function invocation capabilities and registers it, together with the plugins the model may call, with the Semantic Kernel (an illustrative MenuPlugin sketch follows after these steps).

    // Set up chat completion service
    IChatClient chatClient = ...;
    IChatCompletionService chatService =
        chatClient
            .AsBuilder()
            .UseFunctionInvocation() // Enables function call functionality
            .Build()
            .AsChatCompletionService();
    
    // Register the Bedrock chat completion service
    var builder = Kernel.CreateBuilder();
    builder.Services.AddKeyedSingleton<IChatCompletionService>("bedrock", chatService);
    // Add plugins/functions
    builder.Plugins.AddFromType<MenuPlugin>();
    // ...
    var kernel = builder.Build();
    
  2. Use automatic tool calls
    This code demonstrates how to use the configured chat completion service to automatically invoke functions based on the user's input.

    // Set up bedrock
    var runtimeClient = new AmazonBedrockRuntimeClient(RegionEndpoint.APSoutheast2);
    IChatClient client = new AnthropicChatClient(runtimeClient, "anthropic.claude-3-5-sonnet-20241022-v2:0");
    
    // Configure the chat client and build the kernel (with its plugins) as shown in step 1.
    IChatCompletionService chatCompletionService = client
        .AsBuilder()
        .UseFunctionInvocation()
        .Build()
        .AsChatCompletionService();
    
    var chatHistory = new ChatHistory();
    chatHistory.AddUserMessage("What is the special soup and its price?");
    
    var promptExecutionSettings = new PromptExecutionSettings
    {
        FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(options: new()
        {
            RetainArgumentTypes = true
        }),
        ExtensionData = new Dictionary<string, object>
        {
            { "temperature", 0 }, 
            { "max_tokens_to_sample", 1024 } // Required parameter for Anthropic models
        }
    };
    
    var messageContent = await chatCompletionService
        .GetChatMessageContentAsync(chatHistory, promptExecutionSettings, kernel);
    Console.WriteLine(messageContent.Content);
    
    // Expected output: Today's special soup is Clam Chowder and it costs $9.99.
    

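The plugins registered in step 1 are ordinary Semantic Kernel plugins. As an illustration only, a hypothetical MenuPlugin matching the conversation above could look like the following; the plugin in the complete sample may differ, but any class with [KernelFunction]-annotated methods is registered the same way:

    using System.ComponentModel;
    using Microsoft.SemanticKernel;
    
    // Hypothetical plugin used for illustration; the model invokes these
    // functions automatically when FunctionChoiceBehavior.Auto() is set.
    public sealed class MenuPlugin
    {
        [KernelFunction, Description("Provides the specials from the menu.")]
        public string GetSpecials() => "Special Soup: Clam Chowder";
    
        [KernelFunction, Description("Provides the price of the requested menu item.")]
        public string GetItemPrice(
            [Description("The name of the menu item.")] string menuItem) => "$9.99";
    }
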
Complete sample code

Please feel free to reach out on Twitter: @roamingcode