AIIntegration.ExplainAsync(IAIExtensionsContainer, ExplainRequest, CancellationToken) Method
Transforms complex text into simpler, more understandable terms, making the content more accessible.
Namespace: DevExpress.AIIntegration
Assembly: DevExpress.AIIntegration.v24.2.dll
NuGet Package: DevExpress.AIIntegration
Declaration
public static Task&lt;TextResponse&gt; ExplainAsync(
    this IAIExtensionsContainer container,
    ExplainRequest request,
    CancellationToken cancellationToken = default(CancellationToken)
)
Parameters
Name | Type | Description |
---|---|---|
container | IAIExtensionsContainer | The AI extensions container. |
request | ExplainRequest | The request that contains the text to transform into simpler, more understandable terms. |
Optional Parameters
Name | Type | Default | Description |
---|---|---|---|
cancellationToken | CancellationToken | default(CancellationToken) | A token that cancels the task. |
Returns
Type | Description |
---|---|
Task<TextResponse> | The response that contains AI-generated text. |
Remarks
The following example registers an Azure OpenAI client and uses the AI-powered extension to explain originalText:
using Azure;
using Azure.AI.OpenAI;
using Microsoft.Extensions.AI;
using DevExpress.AIIntegration;
using DevExpress.AIIntegration.Extensions;
SetEnvironmentVariables();
// Register an Azure OpenAI client
AIExtensionsContainerDefault defaultAIExtensionsContainer = RegisterAzureOpenAIClient(
    Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT"),
    Environment.GetEnvironmentVariable("AZURE_OPENAI_APIKEY")
);
string originalText = "Stochastic gradient descent, a variant of gradient descent, updates parameters based on individual training examples.";
var response = await defaultAIExtensionsContainer.ExplainAsync(
    new ExplainRequest(originalText)
);
Console.WriteLine(response);
/* Output:
* Stochastic gradient descent is a method used in machine learning to adjust parameters of a model.
* It's similar to another method called gradient descent. The main difference is that stochastic gradient
* descent updates the parameters of the model after looking at each individual training example, one at a time,
* instead of using the entire dataset at once.
*
* For example, imagine you are trying to teach a robot to recognize apples. In stochastic gradient descent,
* you would show the robot one apple picture at a time, update its understanding based on that single picture,
* then move to the next one. In contrast, traditional gradient descent would look at all of the pictures of apples
* together before making any updates.
*/
AIExtensionsContainerDefault RegisterAzureOpenAIClient(string azureOpenAIEndpoint, string azureOpenAIKey) {
    IChatClient client = new AzureOpenAIClient(
        new Uri(azureOpenAIEndpoint),
        new AzureKeyCredential(azureOpenAIKey))
        .AsChatClient("gpt-4o-mini");
    return AIExtensionsContainerConsole.CreateDefaultAIExtensionContainer(client);
}
void SetEnvironmentVariables() {
    Environment.SetEnvironmentVariable("AZURE_OPENAI_ENDPOINT", {SPECIFY_YOUR_AZURE_ENDPOINT});
    Environment.SetEnvironmentVariable("AZURE_OPENAI_APIKEY", {SPECIFY_YOUR_AZURE_KEY});
}
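The optional cancellationToken parameter allows a pending request to be canceled, for example on a timeout or a user action. The following sketch shows one possible pattern; it assumes a container and originalText set up as in the example above, and the 30-second timeout value is purely illustrative:

```csharp
using System;
using System.Threading;

// Cancel the request automatically if it takes longer than 30 seconds.
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
try {
    var response = await defaultAIExtensionsContainer.ExplainAsync(
        new ExplainRequest(originalText),
        cts.Token
    );
    Console.WriteLine(response);
}
catch (OperationCanceledException) {
    // Reached when the token is canceled before the task completes.
    Console.WriteLine("The explanation request was canceled.");
}
```

Calling cts.Cancel() from another code path (for example, a Cancel button handler) cancels the same request.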
See Also