Secret Management for Blazor AI Components


    Never hardcode AI provider access keys, credentials, or API endpoints directly in your source code. Hardcoded secrets are a major security vulnerability: if your code is exposed, they can lead to data breaches and unauthorized service usage.

    To mitigate these risks, adopt a secret management strategy based on your application hosting model.

    Blazor Server

    For Blazor Server applications, all application logic executes on the server and secrets are never exposed in the client's browser. You can manage secrets as follows:

    • For local development, use the .NET Secret Manager tool (user secrets). It stores sensitive data in a secrets.json file outside of your project folder and ensures secrets are not committed to source control (see the sketch after this list).
    • Use environment variables to supply secrets to your application. You can set these variables on the host server or through a CI/CD pipeline.
    • For the highest level of security, use Azure Key Vault or an equivalent from another cloud provider.
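
    The sketch below illustrates this setup in Program.cs. It reads the same AzureOpenAISettings:Key setting used by the proxy example later in this topic; the vault URI and package choices are assumptions you should adjust to your project.

    // Local development: store the key with the .NET Secret Manager tool
    // (run once in the project folder):
    //   dotnet user-secrets init
    //   dotnet user-secrets set "AzureOpenAISettings:Key" "<your-api-key>"
    //
    // Production: supply the same key through an environment variable
    // (the double underscore maps to the ':' section separator):
    //   AzureOpenAISettings__Key=<your-api-key>

    var builder = WebApplication.CreateBuilder(args);

    // Optional: load secrets from Azure Key Vault (requires the
    // Azure.Extensions.AspNetCore.Configuration.Secrets and Azure.Identity packages):
    // builder.Configuration.AddAzureKeyVault(
    //     new Uri("https://<your-vault>.vault.azure.net/"), new DefaultAzureCredential());

    // IConfiguration merges appsettings.json, user secrets (Development only),
    // and environment variables, so application code reads the key the same way
    // regardless of where it is stored.
    var aiKey = builder.Configuration["AzureOpenAISettings:Key"];
    if (string.IsNullOrEmpty(aiKey))
        throw new InvalidOperationException("The 'AzureOpenAISettings:Key' secret is not configured.");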

    Blazor WebAssembly & Hybrid

    Blazor WebAssembly (WASM) apps run on the client side. Once the browser downloads the application code, users can decompile it and extract any embedded secrets. To mitigate this risk, route all requests to the AI provider through a backend API proxy:

    1. Create a separate ASP.NET Core Web API or serverless function to handle all communications with your third-party AI provider. Visual Studio automatically creates this backend if you select the ASP.NET Core hosted option for a new Blazor WebAssembly project.
    2. When implementing the backend API, follow the secret management best practices for Blazor Server applications to ensure access keys/credentials are stored and used securely.
    3. In your Blazor WASM application, call the backend API instead of the external AI provider. The backend sends authenticated requests to the external AI model, so no sensitive information is exposed on the client side. A client-side call sketch follows the server code below.
    using System.Text;
    
    /* ... */
    
    // Fetch Azure OpenAI settings from configuration (user secrets, environment variables, or appsettings.json)
    var aiUri = builder.Configuration.GetSection("AzureOpenAISettings")["Endpoint"];
    var aiKey = builder.Configuration.GetSection("AzureOpenAISettings")["Key"];
    var aiModel = builder.Configuration.GetSection("AzureOpenAISettings")["DeploymentName"];
    if (string.IsNullOrEmpty(aiUri) || string.IsNullOrEmpty(aiKey) || string.IsNullOrEmpty(aiModel))
        throw new InvalidOperationException("Azure OpenAI settings are missing from configuration");
    // Registers the app's chat service
    builder.Services.AddChatClient(aiUri, aiKey, aiModel);
    // Enables API controllers and MVC endpoints alongside Razor components
    builder.Services.AddMvc();
    
    /* ... */
    
    // Declare a server-side proxy so Blazor WASM can call Azure OpenAI without exposing the key in the browser
    app.MapPost("/api/chat/{*path}", async (string path, HttpContext context, CancellationToken ct) => {
        var httpClientFactory = context.RequestServices.GetRequiredService<IHttpClientFactory>();
        var client = httpClientFactory.CreateClient();
        client.BaseAddress = new(aiUri);
        client.DefaultRequestHeaders.Authorization = new("Bearer", aiKey);
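        // Substitute the configured deployment name for the 'proxychat' placeholder in the requested path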
        var newPath = path.Replace("proxychat", aiModel);
        var endpointUri = new Uri(aiUri);
        var uriBuilder = new UriBuilder(endpointUri) {
            Path = $"{endpointUri.AbsolutePath}/{newPath}",
            Query = context.Request.QueryString.Value
        };
        var body = await new StreamReader(context.Request.Body).ReadToEndAsync(ct);
        var response = await client.PostAsync(uriBuilder.Uri,
            new StringContent(body, Encoding.UTF8, "application/json"), ct);
        context.Response.StatusCode = (int)response.StatusCode;
        await response.Content.CopyToAsync(context.Response.Body, ct);
    });
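
    For reference, the client-side call might look like the following sketch. It assumes the standard hosted Blazor WebAssembly setup in which HttpClient targets the host's base address; the route, query string, and payload shape are illustrative placeholders rather than a specific Azure OpenAI contract.

    using System.Net.Http.Json;

    /* Program.cs (Blazor WebAssembly client) */

    // Point HttpClient at the host so every AI request goes through
    // the server-side proxy and never directly to the AI provider.
    builder.Services.AddScoped(sp => new HttpClient {
        BaseAddress = new Uri(builder.HostEnvironment.BaseAddress)
    });

    /* In a component or service */

    // Call the backend proxy endpoint; the server substitutes the deployment
    // name and attaches the API key before forwarding the request.
    var response = await Http.PostAsJsonAsync(
        "api/chat/proxychat/chat/completions?api-version=2024-06-01",
        new { messages = new[] { new { role = "user", content = "Hello!" } } });
    response.EnsureSuccessStatusCode();
    var answer = await response.Content.ReadAsStringAsync();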
    

    Tip

    The DevExpress Template Kit automatically generates the AI proxy when you specify the following ASP.NET Core Blazor application parameters:

    • Add views to the application: select the AI Chat tile
    • Render Mode: select WebAssembly or Auto

    You can also create this setup with the .NET CLI:

    dotnet new dx.blazor --name MyBlazorServerApp --interactivity Auto --add-views AIChat