Note

Please see Azure Cognitive Services for Speech documentation for the latest supported speech solutions.

SpeechRecognitionEngine.LoadGrammarCompleted Event

Raised when the SpeechRecognitionEngine finishes the asynchronous loading of a Grammar object.

Namespace:  Microsoft.Speech.Recognition
Assembly:  Microsoft.Speech (in Microsoft.Speech.dll)

Syntax

Visual Basic

'Declaration
Public Event LoadGrammarCompleted As EventHandler(Of LoadGrammarCompletedEventArgs)

'Usage
Dim instance As SpeechRecognitionEngine
Dim handler As EventHandler(Of LoadGrammarCompletedEventArgs)

AddHandler instance.LoadGrammarCompleted, handler

C#

public event EventHandler<LoadGrammarCompletedEventArgs> LoadGrammarCompleted

Remarks

The recognizer's LoadGrammarAsync method initiates an asynchronous operation. The SpeechRecognitionEngine raises this event when it completes the operation. To get the Grammar object that the recognizer loaded, use the Grammar property of the associated LoadGrammarCompletedEventArgs. To get all of the Grammar objects that the recognizer currently has loaded, use the recognizer's Grammars property.
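For example, a completion handler can report the grammar from the event data and then enumerate the recognizer's Grammars collection. The following is a minimal sketch of such a handler; it assumes a SpeechRecognitionEngine field named recognizer, as in the example later in this topic.

// Minimal sketch of a LoadGrammarCompleted handler.
// Assumes a SpeechRecognitionEngine field named "recognizer" elsewhere in the class.
static void LoadGrammarCompletedHandler(object sender, LoadGrammarCompletedEventArgs e)
{
  // The Grammar property of the event data holds the grammar that finished loading.
  Console.WriteLine("Finished loading grammar: {0}", e.Grammar.Name);

  // The recognizer's Grammars property lists every grammar it currently has loaded.
  foreach (Grammar grammar in recognizer.Grammars)
  {
    Console.WriteLine("  Loaded grammar: {0}", grammar.Name);
  }
}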

If the recognizer is running, applications must use RequestRecognizerUpdate() to pause the speech recognition engine before loading, unloading, enabling, or disabling a grammar.
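For example, to add a grammar while continuous recognition is in progress, an application can handle the RecognizerUpdateReached event and perform the load when the recognizer pauses at the update point. The following is a minimal sketch of that pattern; the recognizer and newGrammar variables are assumptions standing in for objects created elsewhere in the application.

// Minimal sketch: load a grammar while the recognizer is running.
// Assumes "recognizer" is a running SpeechRecognitionEngine and "newGrammar" is a constructed Grammar.
recognizer.RecognizerUpdateReached +=
  delegate(object sender, RecognizerUpdateReachedEventArgs e)
  {
    // The recognizer has paused at the update point; it is now safe to modify its grammars.
    recognizer.LoadGrammarAsync(newGrammar);
  };

// Request that the recognizer pause and raise the RecognizerUpdateReached event.
recognizer.RequestRecognizerUpdate();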

When you create a LoadGrammarCompleted delegate, you identify the method that will handle the event. To associate the event with your event handler, add an instance of the delegate to the event. The event handler is called whenever the event occurs, unless you remove the delegate. For more information about event-handler delegates, see Events and Delegates.
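In C#, for example, the handler is attached with the += operator and detached with the -= operator. The names below match those used in the example that follows.

// Attach a handler whose signature matches EventHandler<LoadGrammarCompletedEventArgs>.
recognizer.LoadGrammarCompleted += recognizer_LoadGrammarCompleted;

// Detach the handler when the application no longer needs the notification.
recognizer.LoadGrammarCompleted -= recognizer_LoadGrammarCompleted;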

Examples

The following example creates two speech recognition grammars and constructs a Grammar object from each of the completed grammars. It then asynchronously loads the Grammar objects into the SpeechRecognitionEngine instance. The handler for the recognizer's LoadGrammarCompleted event reports the status of each Grammar object when it has been loaded. The handler for the SpeechRecognized event reports the name of the Grammar object that was used to perform a recognition and the text of the recognition result.

using System;
using Microsoft.Speech.Recognition;

namespace SampleRecognition
{
  class Program
  {
    private static SpeechRecognitionEngine recognizer;
    public static void Main(string[] args)
    {

      // Initialize a SpeechRecognitionEngine object and set its input.
      recognizer = new SpeechRecognitionEngine(new System.Globalization.CultureInfo("en-US"));
      recognizer.SetInputToDefaultAudioDevice();

      // Add a handler for the LoadGrammarCompleted event.
      recognizer.LoadGrammarCompleted +=
        new EventHandler<LoadGrammarCompletedEventArgs>(recognizer_LoadGrammarCompleted);

      // Add a handler for the SpeechRecognized event.
      recognizer.SpeechRecognized +=
        new EventHandler<SpeechRecognizedEventArgs>(recognizer_SpeechRecognized);

      // Create the "yesno" grammar and build it into a Grammar object.
      Choices yesChoices = new Choices(new string[] { "yes", "yup", "yeah" });
      SemanticResultValue yesValue =
          new SemanticResultValue(yesChoices, true);
      Choices noChoices = new Choices(new string[] { "no", "nope", "nah" });
      SemanticResultValue noValue =
          new SemanticResultValue(noChoices, false);
      SemanticResultKey yesNoKey =
          new SemanticResultKey("yesno", new Choices(new GrammarBuilder[] { yesValue, noValue }));
      Grammar yesnoGrammar = new Grammar(yesNoKey);
      yesnoGrammar.Name = "yesNo";

      // Create the "done" grammar within the constructor of a Grammar object.
      Grammar doneGrammar =
        new Grammar(new GrammarBuilder(new Choices(new string[] { "done", "exit", "quit", "stop" })));
      doneGrammar.Name = "Done";

      // Load the Grammar objects to the recognizer.
      recognizer.LoadGrammarAsync(yesnoGrammar);
      recognizer.LoadGrammarAsync(doneGrammar);

      // Start asynchronous, continuous recognition.
      recognizer.RecognizeAsync(RecognizeMode.Multiple);

      // Keep the console window open.
      Console.ReadLine();
    }

    // Handle the LoadGrammarCompleted event. 
    static void recognizer_LoadGrammarCompleted(object sender, LoadGrammarCompletedEventArgs e)
    {
      string grammarName = e.Grammar.Name;
      bool grammarLoaded = e.Grammar.Loaded;
      bool grammarEnabled = e.Grammar.Enabled;

      if (e.Error != null)
      {
        Console.WriteLine("LoadGrammar for {0} failed with a {1}.",
        grammarName, e.Error.GetType().Name);

        // Add exception handling code here.
      }

      Console.WriteLine("Grammar {0} {1} loaded and {2} enabled.", grammarName, (grammarLoaded) ? "is" : "is not", (grammarEnabled) ? "is" : "is not");
    }

    // Handle the SpeechRecognized event.
    static void recognizer_SpeechRecognized(object sender, SpeechRecognizedEventArgs e)
    {
      Console.WriteLine("Grammar({0}): {1}", e.Result.Grammar.Name, e.Result.Text);

      // Add event handler code here.
    }
  }
}

See Also

Reference

SpeechRecognitionEngine Class

SpeechRecognitionEngine Members

Microsoft.Speech.Recognition Namespace

RecognizerUpdateReached

UnloadAllGrammars

UnloadGrammar

LoadGrammar

LoadGrammarAsync