DevExpress AI-powered Extensions for Blazor
Use the following links for details on how to add AI-powered functionality to DevExpress Blazor components:
#How it Works
DevExpress AI APIs leverage the Microsoft.Extensions.AI libraries for integration and interoperability with a wide range of AI services. These libraries establish a unified C# abstraction layer for standardized interaction with language models.
This architecture decouples your application code from specific AI SDKs. You can seamlessly switch the underlying AI model or provider with minimal code modifications. For example, you can build a prototype with a locally deployed AI model and then quickly transition to an enterprise-grade online LLM provider. These changes only involve adjustments to the app’s startup logic and the installation of necessary NuGet packages.
The IChatClient interface serves as the central mechanism for interaction with language models. Currently supported AI providers include:
- OpenAI (through Microsoft’s reference implementation)
- Azure OpenAI (through Microsoft’s reference implementation)
- Self-hosted Ollama (through the OllamaSharp library)
- Google Gemini, DeepSeek, Claude, and other major AI services through Semantic Kernel AI Connectors
- Custom IChatClient implementation for unsupported providers or private language models
The Microsoft.Extensions.AI framework allows developers to integrate support for AI language models and services without modifying the core library. This means you can leverage third-party libraries for new AI providers or create your own custom implementation for in-house language models.
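For illustration, a custom client for an in-house model could be sketched as follows. This is a minimal, hypothetical example assuming Microsoft.Extensions.AI 9.5+ and .NET implicit usings; the placeholder reply stands in for a call to your model's API:

```csharp
using System.Runtime.CompilerServices;
using Microsoft.Extensions.AI;

// Minimal IChatClient sketch for a hypothetical in-house language model.
public sealed class InHouseChatClient : IChatClient {
    public async Task<ChatResponse> GetResponseAsync(
        IEnumerable<ChatMessage> messages,
        ChatOptions? options = null,
        CancellationToken cancellationToken = default) {
        // Call your model's API here; the reply below is a placeholder.
        string reply = await Task.FromResult("...");
        return new ChatResponse(new ChatMessage(ChatRole.Assistant, reply));
    }

    public async IAsyncEnumerable<ChatResponseUpdate> GetStreamingResponseAsync(
        IEnumerable<ChatMessage> messages,
        ChatOptions? options = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default) {
        // A non-streaming model can be exposed as a single streaming update.
        ChatResponse response = await GetResponseAsync(messages, options, cancellationToken);
        yield return new ChatResponseUpdate(ChatRole.Assistant, response.Text);
    }

    public object? GetService(Type serviceType, object? serviceKey = null) =>
        serviceType.IsInstanceOfType(this) ? this : null;

    public void Dispose() { }
}
```

Such a client would then be registered the same way as the built-in ones: `builder.Services.AddChatClient(new InHouseChatClient());`.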
Note
DevExpress AI-powered extensions operate on a “bring your own key” (BYOK) model. We do not provide a proprietary REST API or bundled language models (LLMs/SLMs).
You can either deploy a self-hosted model or connect to a cloud AI provider and obtain necessary connection parameters (endpoint, API key, language model identifier, and so on). These parameters must be configured at application startup to register an AI client and enable extension functionality.
#Prerequisites
- .NET 8+ SDK / .NET Framework v4.7.2+
- AI language model (choose one of the following):
OpenAI Service
- Create an OpenAI account
- Create a secret key to access the OpenAI API
- Subscribe to the OpenAI API
Azure OpenAI Service
- Create an Azure account
- Create and deploy an Azure OpenAI resource
- Get Azure OpenAI key and endpoint
Ollama (self-hosted models)
- Download and install Ollama
- Pull a model from the Ollama library
- Run the downloaded model with the ollama run <model_name> command
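For example, the Ollama steps above look like this (the model name llama3.1 is only an example; substitute the model you want to use):

```shell
# Pull a model from the Ollama library (model name is an example)
ollama pull llama3.1

# Run the model; Ollama serves its API on http://localhost:11434 by default
ollama run llama3.1
```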
Semantic Kernel
- Search for the available AI connector or implement your custom connector
- Subscribe to the desired AI service if needed
#AI Services Integration
Follow the instructions below to register an AI model and enable DevExpress AI-powered Extensions in your application.
#OpenAI
Install the following NuGet packages to your project:
- DevExpress.Blazor
- DevExpress.AIIntegration.Blazor
- Microsoft.Extensions.AI (version 9.5.0 or later)
- Microsoft.Extensions.AI.OpenAI (version 9.5.0-preview.1.25265.7 or later)
Register the OpenAI model in the project’s entry point class:
```csharp
using Microsoft.Extensions.AI;
using OpenAI;
// ...
string openAiApiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")
    ?? throw new InvalidOperationException("OPENAI_API_KEY environment variable is not set.");
string openAiModel = "OPENAI_MODEL";

OpenAIClient openAIClient = new OpenAIClient(openAiApiKey);
IChatClient openAiChatClient = openAIClient.GetChatClient(openAiModel).AsIChatClient();
builder.Services.AddChatClient(openAiChatClient);
```
- Create an environment variable named OPENAI_API_KEY and set its value to your OpenAI API key. If your application throws an exception that the variable is not found, restart your IDE or terminal to ensure it loads the new variable.

Important

Never hardcode the OpenAI API key in your source code. This is a critical security risk that can lead to unauthorized access and misuse of your account. Follow Microsoft’s guidance on safe secret management in ASP.NET Core apps.

- Set the openAiModel variable to the OpenAI model ID.
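For reference, one way to create such an environment variable is shown below (commands for bash and Windows PowerShell; the key value is a placeholder, and you should restart your IDE afterwards):

```shell
# bash / zsh (add to your shell profile to persist across sessions)
export OPENAI_API_KEY="sk-..."

# Windows (PowerShell) - persists for the current user
setx OPENAI_API_KEY "sk-..."
```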
Register DevExpress Blazor services and DevExpress AI-powered extensions in the project’s entry point class:
```csharp
builder.Services.AddDevExpressBlazor();
builder.Services.AddDevExpressAI();
```
#Azure OpenAI
Install the following NuGet packages to your project:
- DevExpress.Blazor
- DevExpress.AIIntegration.Blazor
- Microsoft.Extensions.AI (version 9.5.0 or later)
- Microsoft.Extensions.AI.OpenAI (version 9.5.0-preview.1.25265.7 or later)
- Azure.AI.OpenAI (version 2.2.0-beta.4 or later)
Register the Azure OpenAI model in the project’s entry point class:
```csharp
using Azure.AI.OpenAI;
using Microsoft.Extensions.AI;
using System.ClientModel;
// ...
string azureOpenAiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY")
    ?? throw new InvalidOperationException("AZURE_OPENAI_KEY environment variable is not set.");
string azureOpenAiEndpoint = "AZURE_OPENAI_ENDPOINT";
string azureOpenAiModel = "AZURE_OPENAI_MODEL";

AzureOpenAIClient azureOpenAIClient = new AzureOpenAIClient(
    new Uri(azureOpenAiEndpoint),
    new ApiKeyCredential(azureOpenAiKey)
);
IChatClient azureOpenAiChatClient = azureOpenAIClient.GetChatClient(azureOpenAiModel).AsIChatClient();
builder.Services.AddChatClient(azureOpenAiChatClient);
```
- Create an environment variable named AZURE_OPENAI_KEY and set its value to your Azure OpenAI key. If your application throws an exception that the variable is not found, restart your IDE or terminal to ensure it loads the new variable.

Important

Never hardcode the Azure OpenAI key in your source code. This is a critical security risk that can lead to unauthorized access and misuse of your account. Follow Microsoft’s guidance on safe secret management in ASP.NET Core apps.

- Set the azureOpenAiEndpoint variable to your Azure OpenAI endpoint.
- Set the azureOpenAiModel variable to the Azure OpenAI model ID.
Register DevExpress Blazor services and DevExpress AI-powered extensions in the project’s entry point class:
```csharp
builder.Services.AddDevExpressBlazor();
builder.Services.AddDevExpressAI();
```
#Ollama
Install the following NuGet packages to your project:
- DevExpress.Blazor
- DevExpress.AIIntegration.Blazor
- Microsoft.Extensions.AI (version 9.5.0 or later)
- OllamaSharp
Register the self-hosted AI model in the project’s entry point class:
```csharp
using Microsoft.Extensions.AI;
using OllamaSharp;
// ...
string aiModel = "MODEL_NAME";
IChatClient chatClient = new OllamaApiClient("http://localhost:11434", aiModel);
builder.Services.AddChatClient(chatClient);
```
- Set the aiModel variable to the name of your Ollama model.

Register DevExpress Blazor services and DevExpress AI-powered extensions in the project’s entry point class:

```csharp
builder.Services.AddDevExpressBlazor();
builder.Services.AddDevExpressAI();
```
#Semantic Kernel
The Semantic Kernel SDK provides a common interface to interact with different AI services. The Kernel communicates with AI services through AI Connectors, which expose multiple AI service types from different providers.
Semantic Kernel works with an ecosystem of ready-to-use connectors which support leading AI models from OpenAI, Google, Anthropic, DeepSeek, Mistral AI, Hugging Face, and more. You can also build custom connectors for any other service, such as your in-house language models.
The following example connects DevExpress AI-powered Extensions for Blazor to Google Gemini through the Semantic Kernel SDK:
Note
The Google chat completion connector is currently experimental. To acknowledge this and use the feature, you must explicitly suppress the compiler warnings with the #pragma warning disable directive.
- Sign in to Google AI Studio.
- Create an API key.
Install the following NuGet packages to your project:
Register the Gemini model in the project’s entry point class:
```csharp
using Microsoft.Extensions.AI;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
// ...
string geminiApiKey = Environment.GetEnvironmentVariable("GEMINI_API_KEY")
    ?? throw new InvalidOperationException("GEMINI_API_KEY environment variable is not set.");
string geminiAiModel = "GEMINI_MODEL";

#pragma warning disable SKEXP0070
var kernelBuilder = Kernel
    .CreateBuilder()
    .AddGoogleAIGeminiChatCompletion(geminiAiModel, geminiApiKey);
Kernel kernel = kernelBuilder.Build();
#pragma warning disable SKEXP0001
IChatClient geminiChatClient = kernel.GetRequiredService<IChatCompletionService>().AsChatClient();
builder.Services.AddChatClient(geminiChatClient);
```
- Create an environment variable named GEMINI_API_KEY and set its value to your Gemini API key. If your application throws an exception that the variable is not found, restart your IDE or terminal to ensure it loads the new variable.

Important

Never hardcode API keys in your source code. This is a critical security risk that can lead to unauthorized access and misuse of your subscriptions. Follow Microsoft’s guidance on safe secret management in ASP.NET Core apps.

- Set the geminiAiModel variable to the Gemini model ID.
Register DevExpress Blazor services and DevExpress AI-powered extensions in the project’s entry point class:
```csharp
builder.Services.AddDevExpressBlazor();
builder.Services.AddDevExpressAI();
```
#Configure Inference Parameters
To control the AI model’s behavior and creativity, set inference parameters through IChatClient options. These parameters are configured once, when you register the IChatClient service in the project’s entry point class. The settings then apply to all DevExpress AI-powered features, ensuring a consistent tone and style across your app.
The following code snippet configures an Azure OpenAI client that is moderately creative, avoids repeating itself, and produces reasonably detailed but not excessively long responses:
```csharp
AzureOpenAIClient azureOpenAIClient = new AzureOpenAIClient(
    new Uri(azureOpenAiEndpoint),
    new ApiKeyCredential(azureOpenAiKey)
);
IChatClient azureOpenAIChatClient = azureOpenAIClient.GetChatClient(azureOpenAiModel).AsIChatClient();
IChatClient chatClient = new ChatClientBuilder(azureOpenAIChatClient)
    .ConfigureOptions(options => {
        options.Temperature = 0.7f;
        options.MaxOutputTokens = 1200;
        options.PresencePenalty = 0.5f;
    })
    .Build();
builder.Services.AddChatClient(chatClient);
```
Note
A specific IChatClient implementation might have its own internal representation of options. It may use a subset of the provided options or ignore them entirely.
#Verify AI Service Connectivity
To verify connectivity with the configured AI service, add the DxAIChat component into your application.
```razor
@using DevExpress.AIIntegration.Blazor.Chat
@page "/"
@rendermode InteractiveServer

<PageTitle>DevExpress Blazor AI Chat</PageTitle>

<DxAIChat />
```
Send a test prompt and confirm a response is received.
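Alternatively, you can verify connectivity programmatically by resolving the registered IChatClient and sending a one-off prompt. The sketch below assumes an ASP.NET Core minimal API host and Microsoft.Extensions.AI 9.5+; the /ai-health route name is an arbitrary choice for this example:

```csharp
using Microsoft.Extensions.AI;

// Hypothetical health-check endpoint: resolves the registered IChatClient
// from DI and forwards a trivial test prompt to the configured AI service.
app.MapGet("/ai-health", async (IChatClient chatClient) => {
    ChatResponse response = await chatClient.GetResponseAsync(
        "Reply with the single word: pong");
    return Results.Ok(response.Text);
});
```

If the endpoint returns a model response, the client registration, credentials, and network path are all working.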
#Troubleshooting
This section describes common AI integration issues and steps you can follow to diagnose and resolve these issues. If the solutions listed here do not help, create a ticket in our Support Center and attach a reproducible sample project.
The AI chat responds with an “Internal Server Error” message.
- Verify that the model name, API key, endpoint, and other AI service registration parameters are correct.
- For cloud AI providers, make sure you are online and that your firewall allows access to the provider’s endpoint.
- Confirm that the self-hosted language model service (for example, Ollama) is active and responsive.
“Environment variable is not set” exception in Visual Studio.
- If you store AI service registration parameters in environment variables, confirm that all necessary environment variables are set.
- Restart Visual Studio to detect the newly created environment variable.
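To quickly confirm that a variable is visible to your shell, print it by name (the variable names below are the ones used in this article):

```shell
# bash / zsh
printenv OPENAI_API_KEY

# Windows (PowerShell)
echo $env:OPENAI_API_KEY
```

An empty result means the variable is not set in that environment, so processes started from it (including your IDE) will not see it either.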