

Live streaming with Azure Media Services v3



Warning

Azure Media Services will be retired on June 30, 2024. For more information, see the AMS Retirement Guide.

Azure Media Services enables you to deliver live events to your customers on the Azure cloud. To stream your live events with Media Services, you'll need to set up a live video encoder that converts signals from a camera (or another device, like a laptop) into a contribution feed that is sent to Media Services. The contribution feed can include signals related to advertising, such as SCTE-35 markers. For a list of recommended live streaming encoders, see live streaming encoders.

If you haven't used an on-premises encoder before, try the Create an Azure Media Services live stream with OBS quickstart.

Dynamic packaging and delivery

With Media Services, you can take advantage of dynamic packaging, which allows you to preview and broadcast your live streams in MPEG DASH, HLS, and Smooth Streaming formats from the contribution feed. Your viewers can play back the live stream with any HLS, DASH, or Smooth Streaming compatible players. See the list of tested players and try the Media Services 3rd-party player samples.
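The packaging format is selected by a suffix on the manifest URL, so a single contribution feed yields all three formats. A minimal sketch of deriving per-protocol URLs from a base manifest URL (the format strings below are the ones commonly used by AMS dynamic packaging; confirm them against the paths your streaming locator actually returns):

```python
def manifest_urls(base_manifest_url: str) -> dict:
    """Derive per-protocol playback URLs from an AMS '.ism/manifest' URL."""
    return {
        "SmoothStreaming": base_manifest_url,                 # no suffix needed
        "DASH": base_manifest_url + "(format=mpd-time-csf)",  # MPEG-DASH
        "HLS": base_manifest_url + "(format=m3u8-aapl)",      # Apple HLS
    }

urls = manifest_urls(
    "https://myendpoint.streaming.media.azure.net/abc123/live.ism/manifest"
)
```

Hand the matching URL to a DASH, HLS, or Smooth Streaming compatible player; the endpoint hostname and locator path above are illustrative.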

Live event types

Live events ingest and process live video feeds. A live event can be set to either:

  • pass-through when an on-premises live encoder sends a multiple bitrate stream, or
  • live encoding when an on-premises live encoder sends a single bitrate stream. For details about live outputs, see Live events and live outputs.
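In the v3 API, the choice between these modes comes down to the `encodingType` property on the live event. A sketch of the relevant request fragment (the `LiveEventEncodingType` values shown are from the AMS v3 REST API; verify them against the API version you target):

```python
def live_event_encoding_fragment(mode: str) -> dict:
    """Build the 'encoding' fragment of a live event create request."""
    encoding_types = {
        "pass-through-basic": "PassthroughBasic",        # relay multi-bitrate feed
        "pass-through-standard": "PassthroughStandard",  # relay, higher limits
        "live-encoding-720p": "Standard",                # transcode to a 720p ladder
        "live-encoding-1080p": "Premium1080p",           # transcode to a 1080p ladder
    }
    return {"encoding": {"encodingType": encoding_types[mode]}}
```

The pass-through tiers leave transcoding to your on-premises encoder; the encoding tiers have the cloud produce the multi-bitrate ladder for you.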

Pass-through

When using the pass-through Live Event (basic or standard), you rely on your on-premises live encoder to generate a multiple bitrate video stream and send that as the contribution feed to the Live Event (using RTMP or fragmented-MP4 input protocol). The Live Event then passes the incoming video stream to the dynamic packager (Streaming Endpoint) without any further processing. A pass-through Live Event is optimized for long-running live events or 24x365 linear live streaming.

Diagram: pass-through streaming

Live encoding

To use live encoding, configure your on-premises live encoder to send a single bitrate video (up to 32 Mbps aggregate) to the Live Event (using the RTMP or fragmented-MP4 input protocol). The Live Event transcodes the incoming single bitrate stream into multiple bitrate video streams at varying resolutions. This improves delivery to playback devices that use industry-standard protocols like MPEG-DASH, Apple HTTP Live Streaming (HLS), and Microsoft Smooth Streaming.

Diagram: live encoding streaming

Live event options

Dynamic encryption

Dynamic encryption enables you to dynamically encrypt your live or on-demand content with AES-128 or any of the three major digital rights management (DRM) systems: Microsoft PlayReady, Google Widevine, and Apple FairPlay. Media Services also provides a service for delivering AES keys and DRM (PlayReady, Widevine, and FairPlay) licenses to authorized clients. For more information, see dynamic encryption.

Widevine is a service provided by Google, Inc., and is subject to the Terms of Service and Privacy Policy of Google, Inc.
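Each of these protection options maps to a built-in streaming policy that you reference by name when creating a streaming locator. A sketch of that mapping (the predefined policy names are from the AMS v3 API; confirm them against your SDK version):

```python
def streaming_policy_for(protection: str) -> str:
    """Map a desired protection level to a predefined streaming policy name."""
    policies = {
        "clear": "Predefined_ClearStreamingOnly",        # no encryption
        "aes-128": "Predefined_ClearKey",                # AES-128 envelope (clear key)
        "drm-cenc": "Predefined_MultiDrmCencStreaming",  # PlayReady + Widevine
        "drm-all": "Predefined_MultiDrmStreaming",       # PlayReady + Widevine + FairPlay
    }
    return policies[protection]
```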

Dynamic filtering

Dynamic filtering is used to control the number of tracks, formats, bitrates, and presentation time windows that are sent out to the players. For more information, see filters and dynamic manifests.
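A filter is itself a small resource with a handful of properties. A hypothetical asset-filter body that trims the presentation window and drops low-bitrate renditions (property names follow the AMS v3 filter schema; the timestamp values are illustrative):

```python
def make_filter_body(start_s: float, end_s: float, min_bitrate: int,
                     timescale: int = 10_000_000) -> dict:
    """Build a filter body: trim the timeline and set a quality floor."""
    return {
        "properties": {
            "presentationTimeRange": {
                # Timestamps are expressed in timescale units (default: 100-ns ticks).
                "startTimestamp": int(start_s * timescale),
                "endTimestamp": int(end_s * timescale),
                "timescale": timescale,
            },
            # Players receive no renditions below this bitrate (in bps).
            "firstQuality": {"bitrate": min_bitrate},
        }
    }
```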

Live transcription

Live transcription is a feature you can use with both pass-through and live encoding events. When this feature is enabled, the service uses the Speech-To-Text feature of Cognitive Services to transcribe the spoken words in the incoming audio into text. This text is then made available for delivery along with video and audio in the MPEG-DASH and HLS protocols. For more information, see live transcription.

Important

Use a GOP size of 2 seconds for live events. For pass-through live events with live transcription, you must use a GOP size of 4 seconds or less to get correct transcription data. If you use a higher GOP size, the transcription data might have defects, such as missing content.
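Most encoders take the GOP setting as a keyframe interval, either in seconds or in frames. A small helper to convert, with the transcription limit from the note above as a guard (the 2-second and 4-second figures come from that note; the frame arithmetic is standard):

```python
def keyframe_interval_frames(gop_seconds: float, fps: float,
                             transcription: bool = False) -> int:
    """Convert a GOP duration to a keyframe interval in frames.

    Raises if the GOP is too long for pass-through live transcription.
    """
    if transcription and gop_seconds > 4:
        raise ValueError("live transcription requires a GOP of 4 seconds or less")
    return round(gop_seconds * fps)

# A 2-second GOP at 30 fps is a 60-frame keyframe interval.
```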

Security considerations for closed captions, subtitles, and timed-metadata delivery

The dynamic encryption and DRM features of Azure Media Services have limits to consider when attempting to secure content delivery that includes live transcriptions, captions, subtitles, or timed-metadata. The DRM subsystems, including PlayReady, FairPlay, and Widevine, do not support the encryption and licensing of text tracks. The lack of DRM encryption for text tracks limits your ability to secure the contents of live transcriptions, manually inserted captions, uploaded subtitles, or timed-metadata signals that may be inserted as separate tracks.

To secure your captions, subtitles, or timed-metadata tracks, follow these guidelines:

  1. Use AES-128 Clear Key encryption. When you enable AES-128 clear key encryption, the text tracks can be configured to be encrypted using a full "envelope" encryption technique that follows the same encryption pattern as the audio and video segments. These segments can then be decrypted by a client application after requesting the decryption key from the Media Services Key Delivery service using an authenticated JWT. This method is supported by the Azure Media Player, but may not be supported on all devices and can require some client-side development work to ensure it succeeds on all platforms.
  2. Use CDN token authentication to protect the text (subtitle, captions, metadata) tracks being delivered with short-form tokenized URLs that are restricted by geography, IP address, or other configurable settings in the CDN portal. Enable the CDN security features using Verizon Premium CDN or another 3rd-party CDN configured to connect to your Media Services streaming endpoints.
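For the first guideline, the client must present a JWT that the Key Delivery service can validate (the issuer, audience, expiry, and signing key must match your content key policy). A minimal HS256 JWT sketch using only the standard library (the claim values are illustrative; in production, use a vetted JWT library):

```python
import base64
import hashlib
import hmac
import json
import time

def _b64url(data: bytes) -> str:
    """Base64url-encode without padding, as required by the JWT format."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_key_delivery_jwt(issuer: str, audience: str, signing_key: bytes,
                          lifetime_s: int = 300) -> str:
    """Create an HS256-signed JWT for requesting decryption keys."""
    header = {"alg": "HS256", "typ": "JWT"}
    now = int(time.time())
    claims = {"iss": issuer, "aud": audience, "nbf": now, "exp": now + lifetime_s}
    signing_input = ".".join(
        _b64url(json.dumps(part, separators=(",", ":")).encode())
        for part in (header, claims)
    )
    signature = hmac.new(signing_key, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + _b64url(signature)
```

The client sends this token in the Authorization header when requesting the key; the Key Delivery service rejects requests whose claims do not match the content key policy.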

Warning

If you do not follow one of the guidelines above, your subtitles, captions, or timed-metadata text will be delivered as unencrypted content that could be intercepted or shared outside of your intended client delivery path, resulting in leaked information. If you are concerned about the contents of captions or subtitles being leaked in a secure delivery scenario, reach out to the Media Services support team for more information on securing your content delivery.

Live streaming workflow

To understand the live streaming workflow in Media Services v3, first review and understand concepts such as live events, live outputs, streaming endpoints, streaming locators, and assets.

General steps

  1. In your Media Services account, make sure the streaming endpoint (origin) is running.

  2. Create a live event.
    When creating the event, you can specify whether to autostart it. Alternatively, you can start the event when you are ready to start streaming.
    When autostart is set to true, the Live Event will be started right after creation. The billing starts as soon as the Live Event starts running. You must explicitly call Stop on the live event resource to halt further billing. For more information, see live event states and billing.

  3. Get the ingest URL(s) and configure your on-premises encoder to use the URL to send the contribution feed.
    See recommended live encoders.

  4. Get the preview URL and use it to verify that the input from the encoder is actually being received.

  5. Create a new asset object.

    Each live output is associated with an asset, which it uses to record the video into the associated Azure blob storage container.

  6. Create a live output and use the asset name that you created so that the stream can be archived into the asset.

    Live Outputs start on creation and stop when deleted. When you delete the Live Output, you are not deleting the underlying asset and content in the asset.

  7. Create a streaming locator with the built-in streaming policy types.

    To publish the live output, you must create a streaming locator for the associated asset.

  8. List the paths on the streaming locator to get back the URLs to use (these are deterministic).

  9. Get the hostname for the streaming endpoint (Origin) you wish to stream from.

  10. Combine the URL from step 8 with the hostname in step 9 to get the full URL.

  11. If you wish to stop making your live event viewable, you need to stop streaming the event and delete the streaming locator.

  12. If you are done streaming events and want to clean up the resources provisioned earlier, follow this procedure:

    • Stop pushing the stream from the encoder.
    • Stop the live event. Once the live event is stopped, it will not incur any charges. When you need to start it again, it will have the same ingest URL so you won't need to reconfigure your encoder.
    • You can stop your streaming endpoint, unless you want to continue to provide the archive of your live event as an on-demand stream. If the live event is in the stopped state, it will not incur any charges. However, if the streaming endpoint is still running, it will incur charges.

The asset that the live output is archiving to automatically becomes an on-demand asset when the live output is deleted. You must delete all live outputs before a live event can be stopped. You can use an optional flag, removeOutputsOnStop, to automatically remove live outputs on stop.
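Steps 8 through 10 above are pure string assembly: the locator paths are relative, and you prefix each with the streaming endpoint's hostname. A sketch (the hostname and path values are illustrative; in the SDK, the paths come from a list-paths call on the streaming locator):

```python
def playback_urls(endpoint_hostname: str, relative_paths: list) -> list:
    """Combine a streaming endpoint hostname with streaming locator paths."""
    return [f"https://{endpoint_hostname}{path}" for path in relative_paths]

# Example with illustrative values:
urls = playback_urls(
    "myaccount-usw22.streaming.media.azure.net",
    ["/abc-123/live.ism/manifest(format=m3u8-aapl)"],
)
```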

Tip

See the live streaming tutorial, which examines the code that implements the steps described above.

Other important articles

Live streaming FAQ

See the live streaming questions in the FAQ.

Get help and support

You can contact Media Services with questions or follow our updates by one of the following methods: