IMediaObject::GetInputMaxLatency (Windows CE 5.0)

This method retrieves the maximum latency on a specified input stream.

HRESULT GetInputMaxLatency(
  DWORD dwInputStreamIndex,
  REFERENCE_TIME* prtMaxLatency
);

Parameters

  • dwInputStreamIndex
    Zero-based index of an input stream on the DMO.
  • prtMaxLatency
    [out] Pointer to a variable that receives the maximum latency, in 100-nanosecond units (REFERENCE_TIME).

Return Values

Returns an HRESULT value. Possible values include the following.

Value                      Description
S_OK                       Success
DMO_E_INVALIDSTREAMINDEX   Invalid stream index
E_FAIL                     Failure
E_NOTIMPL                  Not implemented. Assume zero latency.
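The E_NOTIMPL case matters in practice: the caller is expected to assume zero latency when the DMO does not implement this method. The following sketch is illustrative only; the QueryMaxLatency helper and the pDMO pointer are hypothetical, and it assumes a valid IMediaObject pointer obtained elsewhere (for example, through CoCreateInstance on a registered DMO).

#include <dmo.h>

// Hypothetical helper: query the maximum latency on one input stream,
// treating E_NOTIMPL as zero latency as the return-value table describes.
REFERENCE_TIME QueryMaxLatency(IMediaObject *pDMO, DWORD dwInputStreamIndex)
{
    REFERENCE_TIME rtMaxLatency = 0;
    HRESULT hr = pDMO->GetInputMaxLatency(dwInputStreamIndex, &rtMaxLatency);

    if (hr == E_NOTIMPL)
    {
        rtMaxLatency = 0;   // DMO does not report latency; assume zero.
    }
    else if (FAILED(hr))
    {
        rtMaxLatency = 0;   // DMO_E_INVALIDSTREAMINDEX or E_FAIL; handle as needed.
    }
    return rtMaxLatency;
}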

Remarks

The latency is the difference between a time stamp on the input stream and the corresponding time stamp on the output stream. The maximum latency is the largest possible difference in the time stamps. For a DMO, determine the maximum latency as follows:

  • Process input buffers until the DMO can produce output.
  • Process as many output buffers as possible.
  • The maximum latency is the largest delta between input time stamps and output time stamps, taken as an absolute value.

Under this definition, latency does not include the time that it takes to process samples. Nor does it include any latency introduced by the size of the input buffer.

For the special case where a DMO processes exactly one sample at a time, the maximum latency is simply the difference in time stamps.

Latency is defined only when samples have time stamps and the time stamps increase or decrease monotonically. Maximum latency might depend on the media types for the input and output streams.
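As an informal illustration of the measurement described above, the following sketch computes the largest absolute time-stamp delta from two arrays of corresponding input and output time stamps. The arrays and the MaxTimestampDelta name are assumptions made for illustration; they are not part of the DMO API.

#include <dmo.h>

// Hypothetical helper: given corresponding input and output time stamps
// (REFERENCE_TIME values, in 100-nanosecond units) collected while driving
// the DMO as described above, return the largest absolute delta.
REFERENCE_TIME MaxTimestampDelta(const REFERENCE_TIME *prtInput,
                                 const REFERENCE_TIME *prtOutput,
                                 DWORD cSamples)
{
    REFERENCE_TIME rtMax = 0;
    for (DWORD i = 0; i < cSamples; i++)
    {
        REFERENCE_TIME rtDelta = prtInput[i] - prtOutput[i];
        if (rtDelta < 0)
        {
            rtDelta = -rtDelta;   // Take the absolute value.
        }
        if (rtDelta > rtMax)
        {
            rtMax = rtDelta;      // Keep the largest delta seen.
        }
    }
    return rtMax;
}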

Requirements

OS Versions: Windows CE .NET 4.1 and later.
Header: Dmo.h.
Link Library: Dmoguid.lib.

See Also

IMediaObject | IMediaObject::SetInputMaxLatency | DMO_E_INVALIDSTREAMINDEX
