ContentSafetyModelFactory.AnalyzeTextResult Method
Definition
Important
Some information relates to prerelease product that may be substantially modified before it’s released. Microsoft makes no warranties, express or implied, with respect to the information provided here.
Overloads
| Overload | Description |
| --- | --- |
| AnalyzeTextResult(IEnumerable<TextBlocklistMatch>, IEnumerable<TextCategoriesAnalysis>) | Initializes a new instance of AnalyzeTextResult. |
| AnalyzeTextResult(IEnumerable<TextBlocklistMatchResult>, TextAnalyzeSeverityResult, TextAnalyzeSeverityResult, TextAnalyzeSeverityResult, TextAnalyzeSeverityResult) | Initializes a new instance of AnalyzeTextResult. |
AnalyzeTextResult(IEnumerable<TextBlocklistMatch>, IEnumerable<TextCategoriesAnalysis>)
- Source:
- ContentSafetyModelFactory.cs
Initializes a new instance of AnalyzeTextResult.
C#
public static Azure.AI.ContentSafety.AnalyzeTextResult AnalyzeTextResult (System.Collections.Generic.IEnumerable<Azure.AI.ContentSafety.TextBlocklistMatch> blocklistsMatch = default, System.Collections.Generic.IEnumerable<Azure.AI.ContentSafety.TextCategoriesAnalysis> categoriesAnalysis = default);
F#
static member AnalyzeTextResult : seq<Azure.AI.ContentSafety.TextBlocklistMatch> * seq<Azure.AI.ContentSafety.TextCategoriesAnalysis> -> Azure.AI.ContentSafety.AnalyzeTextResult
Visual Basic
Public Shared Function AnalyzeTextResult (Optional blocklistsMatch As IEnumerable(Of TextBlocklistMatch) = Nothing, Optional categoriesAnalysis As IEnumerable(Of TextCategoriesAnalysis) = Nothing) As AnalyzeTextResult
Parameters
- blocklistsMatch
- IEnumerable<TextBlocklistMatch>
The blocklist match details.
- categoriesAnalysis
- IEnumerable<TextCategoriesAnalysis>
Analysis result for categories.
Returns
A new AnalyzeTextResult instance for mocking.
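Because this method exists to produce results for mocking, a typical use is in a unit test that needs an AnalyzeTextResult without calling the service. A minimal sketch (passing empty collections is an assumption sufficient only for tests that need a non-null result; populated TextBlocklistMatch and TextCategoriesAnalysis values would also be created through ContentSafetyModelFactory, since Azure SDK output models generally do not expose public constructors):

```csharp
using System.Collections.Generic;
using Azure.AI.ContentSafety;

// Build a mock AnalyzeTextResult with no blocklist hits and no
// category analyses; both parameters are optional and default to null.
AnalyzeTextResult mock = ContentSafetyModelFactory.AnalyzeTextResult(
    blocklistsMatch: new List<TextBlocklistMatch>(),
    categoriesAnalysis: new List<TextCategoriesAnalysis>());
```

Such a mock can then be returned from a stubbed client method so that code under test consumes a deterministic result instead of a live service response.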
Applies to
AnalyzeTextResult(IEnumerable<TextBlocklistMatchResult>, TextAnalyzeSeverityResult, TextAnalyzeSeverityResult, TextAnalyzeSeverityResult, TextAnalyzeSeverityResult)
- Source:
- ContentSafetyModelFactory.cs
Initializes a new instance of AnalyzeTextResult.
C#
public static Azure.AI.ContentSafety.AnalyzeTextResult AnalyzeTextResult (System.Collections.Generic.IEnumerable<Azure.AI.ContentSafety.TextBlocklistMatchResult> blocklistsMatchResults = default, Azure.AI.ContentSafety.TextAnalyzeSeverityResult hateResult = default, Azure.AI.ContentSafety.TextAnalyzeSeverityResult selfHarmResult = default, Azure.AI.ContentSafety.TextAnalyzeSeverityResult sexualResult = default, Azure.AI.ContentSafety.TextAnalyzeSeverityResult violenceResult = default);
F#
static member AnalyzeTextResult : seq<Azure.AI.ContentSafety.TextBlocklistMatchResult> * Azure.AI.ContentSafety.TextAnalyzeSeverityResult * Azure.AI.ContentSafety.TextAnalyzeSeverityResult * Azure.AI.ContentSafety.TextAnalyzeSeverityResult * Azure.AI.ContentSafety.TextAnalyzeSeverityResult -> Azure.AI.ContentSafety.AnalyzeTextResult
Visual Basic
Public Shared Function AnalyzeTextResult (Optional blocklistsMatchResults As IEnumerable(Of TextBlocklistMatchResult) = Nothing, Optional hateResult As TextAnalyzeSeverityResult = Nothing, Optional selfHarmResult As TextAnalyzeSeverityResult = Nothing, Optional sexualResult As TextAnalyzeSeverityResult = Nothing, Optional violenceResult As TextAnalyzeSeverityResult = Nothing) As AnalyzeTextResult
Parameters
- blocklistsMatchResults
- IEnumerable<TextBlocklistMatchResult>
The blocklist match details.
- hateResult
- TextAnalyzeSeverityResult
Analysis result for Hate category.
- selfHarmResult
- TextAnalyzeSeverityResult
Analysis result for SelfHarm category.
- sexualResult
- TextAnalyzeSeverityResult
Analysis result for Sexual category.
- violenceResult
- TextAnalyzeSeverityResult
Analysis result for Violence category.
Returns
A new AnalyzeTextResult instance for mocking.
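This overload takes one severity result per harm category. A minimal sketch of mocking with it, leaving every category unset (all parameters are optional, so a test that only needs a non-null result can omit them; the explicit `default` arguments below are shown purely for illustration):

```csharp
using System.Collections.Generic;
using Azure.AI.ContentSafety;

// Mock a result with no blocklist matches and no per-category
// severity results; each omitted/default parameter stays null.
AnalyzeTextResult mock = ContentSafetyModelFactory.AnalyzeTextResult(
    blocklistsMatchResults: new List<TextBlocklistMatchResult>(),
    hateResult: default,
    selfHarmResult: default,
    sexualResult: default,
    violenceResult: default);
```

To mock a non-null severity for a given category, a TextAnalyzeSeverityResult would likewise be obtained from ContentSafetyModelFactory rather than constructed directly, as output models in Azure SDKs generally have no public constructor.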
Applies to
Azure SDK for .NET