Semantic Kernel for Java
Microsoft's open-source SDK for building AI agents and integrating LLMs into your applications. Deep Azure OpenAI integration with enterprise-grade reliability.
Semantic Kernel is Microsoft's answer to LangChain—but designed with enterprise developers in mind. It provides a modular architecture for combining AI prompts, native code functions, and memory into intelligent applications. Originally for C#, it now has official Java support, making it a compelling choice for teams already invested in the Microsoft/Azure ecosystem.
Key differentiators include Planners that automatically orchestrate complex tasks, Plugins for extending capabilities, and seamless Azure OpenAI integration with built-in retries, rate limiting, and enterprise security.
Core Concepts
Kernel
The central orchestrator that manages AI services, plugins, and memory
Plugins
Collections of functions (native code or prompts) that extend capabilities
Planners
AI-powered orchestrators that create execution plans from goals
Memory
Semantic memory for storing and retrieving context and facts
Getting Started
Maven Setup
Add Semantic Kernel dependencies to your project
```xml
<dependency>
    <groupId>com.microsoft.semantic-kernel</groupId>
    <artifactId>semantickernel-api</artifactId>
    <version>1.0.0</version>
</dependency>
<dependency>
    <groupId>com.microsoft.semantic-kernel</groupId>
    <artifactId>semantickernel-aiservices-openai</artifactId>
    <version>1.0.0</version>
</dependency>
```
Configuring the Kernel
Build a Kernel with Azure OpenAI
```java
import com.microsoft.semantickernel.Kernel;
import com.microsoft.semantickernel.aiservices.openai.chatcompletion.OpenAIChatCompletion;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SemanticKernelConfig {

    @Value("${azure.openai.endpoint}")
    private String endpoint;

    @Value("${azure.openai.key}")
    private String apiKey;

    @Value("${azure.openai.deployment}")
    private String deploymentName;

    @Bean
    public Kernel semanticKernel() {
        // Create the chat completion service
        OpenAIChatCompletion chatService = OpenAIChatCompletion.builder()
            .withModelId(deploymentName)
            .withEndpoint(endpoint)
            .withApiKey(apiKey)
            .build();

        // Build the kernel with the service
        return Kernel.builder()
            .withAIService(OpenAIChatCompletion.class, chatService)
            .build();
    }
}
```
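With the Kernel exposed as a bean, any Spring component can inject it and invoke functions. Below is a minimal sketch of a consumer; the GreetingService class and its prompt are illustrative (not part of Semantic Kernel), and the import package for KernelFunctionArguments follows the SK Java 1.x layout and may differ in your version:

```java
import com.microsoft.semantickernel.Kernel;
import com.microsoft.semantickernel.semanticfunctions.KernelFunction;
import com.microsoft.semantickernel.semanticfunctions.KernelFunctionArguments;
import com.microsoft.semantickernel.semanticfunctions.KernelFunctionFromPrompt;
import org.springframework.stereotype.Service;

// Hypothetical consumer of the Kernel bean configured above
@Service
public class GreetingService {

    private final Kernel kernel;

    public GreetingService(Kernel kernel) {
        this.kernel = kernel;
    }

    public String greet(String name) {
        // One-off prompt function, built the same way as the summarize example further down
        KernelFunction<String> greetFn = KernelFunctionFromPrompt.<String>builder()
            .withTemplate("Write a one-sentence greeting for {{$name}}.")
            .build();

        KernelFunctionArguments args = KernelFunctionArguments.builder()
            .withVariable("name", name)
            .build();

        // Blocking call for simplicity; invokeAsync returns a reactive type
        return kernel.invokeAsync(greetFn)
            .withArguments(args)
            .block()
            .getResult();
    }
}
```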
Creating Plugins
Native Function Plugin
Define functions that the AI can invoke
```java
import com.microsoft.semantickernel.semanticfunctions.annotations.DefineKernelFunction;
import com.microsoft.semantickernel.semanticfunctions.annotations.KernelFunctionParameter;

import java.util.List;

public class WeatherPlugin {

    // Your application's own weather client (not part of Semantic Kernel)
    private final WeatherService weatherService = new WeatherService();

    @DefineKernelFunction(
        name = "getWeather",
        description = "Gets the current weather for a specified city")
    public String getWeather(
        @KernelFunctionParameter(
            name = "city",
            description = "The city to get weather for") String city
    ) {
        // Call actual weather API
        return weatherService.getCurrentWeather(city);
    }

    @DefineKernelFunction(
        name = "getForecast",
        description = "Gets the 5-day weather forecast")
    public List<DailyForecast> getForecast(
        @KernelFunctionParameter(name = "city") String city,
        @KernelFunctionParameter(name = "days", defaultValue = "5") int days
    ) {
        return weatherService.getForecast(city, days);
    }
}

// Register the plugin with the kernel
Kernel kernel = Kernel.builder()
    .withAIService(OpenAIChatCompletion.class, chatService)
    .withPlugin(KernelPluginFactory.createFromObject(new WeatherPlugin(), "WeatherPlugin"))
    .build();
```
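Registered plugin functions can also be invoked directly, without asking the model to choose them. A small sketch; it assumes Kernel exposes a getFunction(pluginName, functionName) lookup, which may be named differently in your SK version:

```java
import com.microsoft.semantickernel.semanticfunctions.KernelFunction;
import com.microsoft.semantickernel.semanticfunctions.KernelFunctionArguments;

// Look up the registered function by plugin and function name
// (assumes a getFunction(pluginName, functionName) accessor on Kernel)
KernelFunction<?> getWeather = kernel.getFunction("WeatherPlugin", "getWeather");

KernelFunctionArguments args = KernelFunctionArguments.builder()
    .withVariable("city", "Seattle")
    .build();

// Same invocation pattern used for prompt functions in the next section
Object currentWeather = kernel.invokeAsync(getWeather)
    .withArguments(args)
    .block()
    .getResult();
```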
Semantic Functions (Prompts)
Prompt-Based Functions
Define AI behaviors using natural language prompts
```java
import com.microsoft.semantickernel.orchestration.PromptExecutionSettings;
import com.microsoft.semantickernel.semanticfunctions.KernelFunction;
import com.microsoft.semantickernel.semanticfunctions.KernelFunctionArguments;
import com.microsoft.semantickernel.semanticfunctions.KernelFunctionFromPrompt;

// Create a semantic function from a prompt template
KernelFunction<String> summarize = KernelFunctionFromPrompt.<String>builder()
    .withTemplate("""
        Summarize the following text in {{$style}} style:

        {{$input}}

        Summary:
        """)
    .withDefaultExecutionSettings(PromptExecutionSettings.builder()
        .withMaxTokens(200)
        .withTemperature(0.7)
        .build())
    .build();

// Invoke the function (longDocument is the text you want summarized)
KernelFunctionArguments args = KernelFunctionArguments.builder()
    .withVariable("input", longDocument)
    .withVariable("style", "professional")
    .build();

String summary = kernel.invokeAsync(summarize)
    .withArguments(args)
    .block()
    .getResult();
```
Planners: Automatic Orchestration
Planners are the "magic" of Semantic Kernel. Given a goal, they analyze available plugins and create an execution plan automatically. Think of it as an AI that writes its own code.
Handlebars Planner
Creates a Handlebars template that orchestrates function calls. Good for complex, multi-step workflows.
Function Calling Planner
Uses the model's native function calling. Simpler, but relies on the model's quality for orchestration (see the sketch after the Handlebars example below).
```java
import com.microsoft.semantickernel.planner.handlebars.HandlebarsPlan;
import com.microsoft.semantickernel.planner.handlebars.HandlebarsPlanner;

// Create a planner
HandlebarsPlanner planner = new HandlebarsPlanner();

// Generate a plan from a goal
String goal = "Find the weather in Seattle and send a summary email to the team";
HandlebarsPlan plan = planner.createPlanAsync(kernel, goal).block();

// Review the plan (optional)
System.out.println("Generated Plan:\n" + plan.getPlan());

// Execute the plan
String result = plan.invokeAsync(kernel).block().getResult();
```
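The function-calling alternative mentioned above skips plan generation entirely: you enable tool calls and let the model invoke registered plugin functions on its own. A rough sketch, reusing the chatService and kernel built earlier and assuming the SK Java InvocationContext / ToolCallBehavior / ChatHistory APIs (names and packages may vary across 1.x releases):

```java
import com.microsoft.semantickernel.orchestration.InvocationContext;
import com.microsoft.semantickernel.orchestration.ToolCallBehavior;
import com.microsoft.semantickernel.services.chatcompletion.ChatHistory;
import com.microsoft.semantickernel.services.chatcompletion.ChatMessageContent;

import java.util.List;

// Allow the model to call any registered kernel function and auto-invoke the results
InvocationContext invocationContext = InvocationContext.builder()
    .withToolCallBehavior(ToolCallBehavior.allowAllKernelFunctions(true))
    .build();

ChatHistory chatHistory = new ChatHistory();
chatHistory.addUserMessage("What's the weather in Seattle right now?");

// chatService is the OpenAIChatCompletion built earlier; the kernel supplies the plugins
List<ChatMessageContent<?>> reply = chatService
    .getChatMessageContentsAsync(chatHistory, kernel, invocationContext)
    .block();

System.out.println(reply.get(reply.size() - 1).getContent());
```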
Semantic Kernel vs Spring AI
| Feature | Semantic Kernel | Spring AI |
|---|---|---|
| Primary Focus | Agent orchestration & planning | LLM integration for Spring apps |
| Azure Integration | Excellent | Good |
| Spring Integration | Manual | Native |
| Planners | Built-in | Custom |
| Best For | Complex AI agents, Azure shops | Spring ecosystem, multi-provider |
When to Choose Semantic Kernel
✓ Good Fit
- Heavy investment in the Azure/Microsoft stack
- Building complex, multi-step AI agents
- Need automatic task planning
- Cross-platform teams using SK in both C# and Java
- Enterprise security requirements with Azure
⚠️ Consider Alternatives
- Deep Spring Boot integration needed
- Multi-cloud / multi-provider strategy
- Simple chat or RAG without agents
- Team more familiar with the Spring ecosystem
- Using Ollama or other local models