The Video Renderer Connection Process

The video renderer is the last filter in the video pipeline, and it is responsible for displaying the output of the upstream filters. It is essentially a controller for the underlying display driver and does not do any processing on the image samples themselves.

The video renderer operates in two distinct modes:

  • GDI
  • DirectDraw

When the graph is first connected, the video renderer always tries to connect using GDI, and for that it needs a connection with an RGB media type that matches the display format of the primary monitor. Only when it goes into Paused mode does the video renderer try to allocate surfaces using DirectDraw. This dual mode of operation was designed so that there is always a fallback in case DirectDraw surfaces are not available in some circumstances.
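
To see which RGB format the renderer will want for that initial GDI connection, it can help to check the primary display's bit depth. Here is a minimal sketch using standard GDI calls; the note about RGB565 is an assumption about typical 16-bpp Windows CE displays, not something the renderer reports:

    #include <windows.h>

    // Query the primary display's bit depth; a 16-bpp display typically means
    // the renderer will want an RGB565 media type for its GDI connection.
    int GetPrimaryDisplayBitDepth()
    {
        HDC hdc = GetDC(NULL);                     // DC for the primary display
        int bpp = GetDeviceCaps(hdc, BITSPIXEL) *  // bits per pixel, per plane
                  GetDeviceCaps(hdc, PLANES);      // number of colour planes
        ReleaseDC(NULL, hdc);
        return bpp;
    }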

Choosing an Accelerated Media Type

When the video renderer goes into Paused mode, it is time to allocate the DirectDraw surfaces. The video renderer will do so by enumerating all media types of the upstream filter, and then trying to allocate a surface matching that media type. For instance, let's assume the upstream filter is the WMV DMO. It currently supports the following output media types (the preferred media type is the first one):

  • YV12
  • NV12
  • YUY2
  • I420
  • IYUV
  • UYVY
  • YVYU
  • RGB565
  • RGB555
  • RGB32
  • RGB24
  • RGB8

The video renderer will try to allocate flipping overlay surfaces first, then non-flipping surfaces (a code sketch of this selection loop follows the list):

    • For each media type of the upstream filter, in the order dictated by the upstream filter:
      • Try to allocate a flipping surface of that media type.
      • If that succeeds, call QueryAccept on the upstream filter's output pin.
      • If QueryAccept succeeds, use that surface.
    • If the previous step didn't succeed, try to allocate a primary flipping surface (if that surface type is enabled).
      • If that succeeds, call QueryAccept on the upstream filter's output pin.
      • If QueryAccept succeeds, use that surface.
    • If the previous steps didn't succeed, then for each media type of the upstream filter, in the order dictated by the upstream filter:
      • Try to allocate a non-flipping surface of that media type.
      • If that succeeds, call QueryAccept on the upstream filter's output pin.
      • If QueryAccept succeeds, use that surface.
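
Expressed as code, the selection loop looks roughly like the sketch below. Everything in it is hypothetical pseudocode, not the actual quartz.dll implementation; the helper functions stand in for the DirectDraw surface creation calls and for calling IPin::QueryAccept on the upstream output pin.

    struct MediaType;                                  // stand-in for AM_MEDIA_TYPE

    bool TryCreateFlippingSurface(const MediaType&);   // overlay flipping surface
    bool TryCreatePrimaryFlippingSurface();            // primary flipping surface
    bool TryCreateOffscreenSurface(const MediaType&);  // non-flipping surface
    bool UpstreamQueryAccept(const MediaType&);        // IPin::QueryAccept upstream

    bool ChooseAcceleratedSurface(const MediaType *types, int count)
    {
        // Pass 1: flipping (overlay) surfaces, in the upstream filter's order.
        for (int i = 0; i < count; ++i)
            if (TryCreateFlippingSurface(types[i]) && UpstreamQueryAccept(types[i]))
                return true;

        // Pass 2: a primary flipping surface, if that surface type is enabled
        // (QueryAccept is called on the upstream pin here as well).
        if (TryCreatePrimaryFlippingSurface())
            return true;

        // Pass 3: non-flipping surfaces, again in the upstream filter's order.
        for (int i = 0; i < count; ++i)
            if (TryCreateOffscreenSurface(types[i]) && UpstreamQueryAccept(types[i]))
                return true;

        return false;  // no accelerated surface; the renderer stays with GDI
    }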

In this way, an upstream filter that is optimized for certain YUV formats can control the choice of media type. If the display driver can also provide a surface of that type, the accelerated media type is chosen. The whole process is driven by the upstream filter, with the display driver in a passive role.

Dynamic Format Changes from the Video Renderer

Of course, for an optimal pipeline we would like to always have overlay flipping surfaces available. Nevertheless, that may not be the case in some situations. For instance, depending on the display driver's capabilities, the flipping overlay may only be available when the user is watching the video at its original size; when the user stretches or shrinks it, overlays might not be available. This is controlled by the DirectDraw hardware capabilities dwMinOverlayStretch and dwMaxOverlayStretch (see https://msdn2.microsoft.com/en-us/library/aa915204.aspx). So, if the display driver doesn't support overlay stretching and the video renderer is currently using overlays, it will need to switch to GDI (and thus to an RGB format) so that GDI can do the necessary scaling.
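
If you suspect this is what is happening, a quick look at the driver's overlay stretch limits can confirm it. A minimal sketch, assuming you already have an IDirectDraw object at hand (error handling omitted):

    #include <windows.h>
    #include <ddraw.h>

    void DumpOverlayStretchCaps(IDirectDraw *pDD)
    {
        DDCAPS caps;
        memset(&caps, 0, sizeof(caps));
        caps.dwSize = sizeof(caps);

        // The first parameter receives the driver (hardware) caps; the HEL
        // caps pointer may be NULL if we are not interested in them.
        if (SUCCEEDED(pDD->GetCaps(&caps, NULL)))
        {
            // The stretch factors are reported multiplied by 1000,
            // so 1000 means 1:1 and 1300 means 1.3x.
            RETAILMSG(1, (TEXT("MinOverlayStretch=%u MaxOverlayStretch=%u\r\n"),
                          caps.dwMinOverlayStretch, caps.dwMaxOverlayStretch));
        }
    }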

Note that every time the upstream filter requests a new buffer from the video renderer, the video renderer will try to return a DirectDraw buffer. If all the conditions for using a DirectDraw buffer are met (clipping, stretching, video memory, and so on), it will use one; if any of those conditions fails, it will fall back to GDI.

Debugging Video Renderer Connection Problems

We have seen some common connection problems when initially bringing up new decoder filters and/or capture drivers:

  • Color space converter is inserted in the graph
  • Video renderer doesn't connect
  • YUV surfaces are not being used, just GDI

Analyzing the DirectShow logs 

The first step in this case is to turn on the debug zones for the DirectShow DLL, quartz.dll, and observe the connection and video renderer messages. 

Run your test scenario, and save the debug output. Look for the section that says "Filter Graph Dump", and verify which filters got inserted in the graph. Here's an example of a filter graph dump:

Filter graph dump
Filter 1a199a30 'Video Renderer' Iunknown 1a199a20
    Pin 1a199f10 Input (Input) connected to 1a0e1880
Filter 1a0e1200 'WMVideo & MPEG4 Decoder DMO' Iunknown 1a0e11f0
    Pin 1a0e16e0 in0 (Input) connected to 1a0e0600
    Pin 1a0e1880 out0 (PINDIR_OUTPUT) connected to 1a199f10
    Pin 1a0e1a00 ~out1 (PINDIR_OUTPUT) connected to 0
Filter 1a0ecc60 'ASF ICM Handler' Iunknown 1a0ecc50
    Pin 1a0ecd70 In (Input) connected to 1a0aa3a0
    Pin 1a0e0600 Out (PINDIR_OUTPUT) connected to 1a0e16e0
Filter 1a0ec240 'Audio Renderer' Iunknown 1a0ec230
wo: GetPin, 0
    Pin 1a0ec4e0 Audio Input pin (rendered) (Input) connected to 1a0eb880
Filter 1a0eb220 'WMAudio Decoder DMO' Iunknown 1a0eb210
    Pin 1a0eb660 in0 (Input) connected to 1a0ea800
    Pin 1a0eb880 out0 (PINDIR_OUTPUT) connected to 1a0ec4e0
Filter 1a0e9380 'ASF ACM Handler' Iunknown 1a0e9370
    Pin 1a0e9490 In (Input) connected to 1a0aa000
    Pin 1a0ea800 Out (PINDIR_OUTPUT) connected to 1a0eb660
Filter 1a0a2ae0 '\Hard Disk2\clips\wmv\0-1.asf' Iunknown 1a0a2ad0
    Pin 1a0aa000 Stream 1 (PINDIR_OUTPUT) connected to 1a0e9490
    Pin 1a0aa3a0 Stream 2 (PINDIR_OUTPUT) connected to 1a0ecd70
End of filter graph dump

After that, verify which media type the video renderer used when it tried the accelerated mode (and whether it succeeded). Search for "Allocating video resources":

Allocating video resources
Initialising DCI/DirectDraw
Searching for direct format
Entering ReleaseSurfaces
Entering HideOverlaySurface
Enumerated 32315659
Entering FindSurface
Entering GetMediaType
Not a RGB format
Entering CreateYUVFlipping
Entering CheckCreateOverlay
GWES Hook fails surface creation. IDirectDraw::CreateSurface fails.
No surface
Entering ReleaseSurfaces
Entering HideOverlaySurface
Enumerated 3231564e
Entering FindSurface
Entering GetMediaType
Not a RGB format
Entering CreateYUVFlipping
Entering CheckCreateOverlay
GWES Hook fails surface creation. IDirectDraw::CreateSurface fails.
No surface
Entering ReleaseSurfaces
Entering HideOverlaySurface
Enumerated 32595559
Entering FindSurface
Entering GetMediaType
Not a RGB format
Entering CreateYUVFlipping
Entering CheckCreateOverlay
Entering InitOverlaySurface
Entering InitDrawFormat
Entering InitDrawFormat
Entering GetDefaultColorKey
Returning default colour key
Entering InitDefaultColourKey
Entering SetSurfaceSize
Preparing source and destination rectangles
Entering ClipPrepare
Entering InitialiseClipper
Entering InitialiseColourKey
overlay color key on
Colour key
No palette
Found AMDDS_YUVFLP surface
Proposing output type  M type MEDIATYPE_Video  S type MEDIASUBTYPE_YUY2

Note in the above log that the video renderer tried to create surfaces in the order specified by the WMV DMO. With the display driver used for this log, it managed to create a YUY2 surface, the WMV decoder's third choice. The last section in this blog entry has more information about FourCC codes.

Here are some solutions for common connection problems we have faced in the past.

Color Space Converter is inserted in the graph 

The number one problem is that the upstream filter doesn't report any RGB format, only YUV formats. In that case the video renderer can't connect directly to the filter, since it requires a matching RGB format, and the color space converter will usually be inserted into the graph. We don't want this to happen, because it implies a memory copy of every frame buffer, so we want to make sure the upstream filter also provides RGB formats.
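
As an illustration, an output pin that offers its accelerated YUV type first and an RGB fallback second could look like the sketch below. It assumes the DirectShow base classes; CMyDecoderOutputPin, FillYUY2Type, FillRGB565Type, m_lWidth and m_lHeight are all hypothetical names (an RGB565 helper is sketched a bit further down).

    #include <streams.h>   // DirectShow base classes

    HRESULT CMyDecoderOutputPin::GetMediaType(int iPosition, CMediaType *pmt)
    {
        if (iPosition < 0)
            return E_INVALIDARG;

        switch (iPosition)
        {
        case 0:  // preferred, accelerated type
            return FillYUY2Type(pmt, m_lWidth, m_lHeight);
        case 1:  // RGB fallback so the renderer can always connect via GDI
            return FillRGB565Type(pmt, m_lWidth, m_lHeight);
        default:
            return VFW_S_NO_MORE_ITEMS;
        }
    }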

Sometimes the color converter gets inserted into the graph even though the upstream filter does support the needed RGB format. This can happen because the upstream filter requests an alignment other than 1 when the allocator is being negotiated; currently, the video renderer only accepts 1-byte alignment.

Another common reason for the color converter to be inserted into the graph is that the BITMAPINFOHEADER supplied by the upstream filter doesn't have the color bitmasks correctly appended at its end when the output media types are returned. Please make sure that the bitmasks are filled in correctly. For instance, for RGB565 we should have:

    *pdwBitfield++ = 0xF800;  // Red   - 5 bits
    *pdwBitfield++ = 0x07E0;  // Green - 6 bits
    *pdwBitfield   = 0x001F;  // Blue  - 5 bits
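
Putting it together, here is a sketch of building a complete RGB565 output media type with the masks in place. It assumes the DirectShow base classes (streams.h) and the VIDEOINFO structure they define; FillRGB565Type is the hypothetical helper referenced earlier.

    #include <streams.h>   // DirectShow base classes: CMediaType, VIDEOINFO, ...

    HRESULT FillRGB565Type(CMediaType *pmt, long lWidth, long lHeight)
    {
        // VIDEOINFO is a VIDEOINFOHEADER with room for the colour masks
        // (or a palette) immediately after the BITMAPINFOHEADER.
        VIDEOINFO *pvi = (VIDEOINFO *)pmt->AllocFormatBuffer(sizeof(VIDEOINFO));
        if (pvi == NULL)
            return E_OUTOFMEMORY;
        ZeroMemory(pvi, sizeof(VIDEOINFO));

        pvi->bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
        pvi->bmiHeader.biWidth       = lWidth;
        pvi->bmiHeader.biHeight      = lHeight;
        pvi->bmiHeader.biPlanes      = 1;
        pvi->bmiHeader.biBitCount    = 16;
        pvi->bmiHeader.biCompression = BI_BITFIELDS;
        pvi->bmiHeader.biSizeImage   = lWidth * lHeight * 2;

        // The three DWORD masks that follow the BITMAPINFOHEADER.
        pvi->dwBitMasks[0] = 0xF800;  // Red   - 5 bits
        pvi->dwBitMasks[1] = 0x07E0;  // Green - 6 bits
        pvi->dwBitMasks[2] = 0x001F;  // Blue  - 5 bits

        pmt->SetType(&MEDIATYPE_Video);
        pmt->SetSubtype(&MEDIASUBTYPE_RGB565);
        pmt->SetFormatType(&FORMAT_VideoInfo);
        pmt->SetTemporalCompression(FALSE);
        pmt->SetSampleSize(pvi->bmiHeader.biSizeImage);
        return S_OK;
    }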

Graph doesn't connect at all

If the upstream filter only supports a subset of YUV formats, and none of them is recognized by the color space converter, then it won't be possible to connect the video renderer at all. Again, the solution is for the upstream filter to provide RGB formats.

YUV Surfaces are not used, just GDI

Another common occurrence is for the upstream filter to provide its own allocator. If this is the case, the video renderer cannot use DirectDraw, because it can't hand upstream-owned memory buffers to DirectDraw. If we want the optimal overlay flipping path, the video renderer *needs* to be the allocator, so that it can provide DirectDraw surfaces upstream.
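
Here is a sketch of what the allocator negotiation on the upstream output pin could look like when it insists on the downstream (renderer) allocator. It assumes the DirectShow base classes; COutputPin is a hypothetical CBaseOutputPin-derived class and DecideBufferSize is its usual override.

    #include <streams.h>   // DirectShow base classes

    HRESULT COutputPin::DecideAllocator(IMemInputPin *pPin, IMemAllocator **ppAlloc)
    {
        *ppAlloc = NULL;

        // Ask the downstream input pin (the video renderer) for its allocator
        // instead of offering our own, so it can hand out DirectDraw surfaces.
        HRESULT hr = pPin->GetAllocator(ppAlloc);
        if (FAILED(hr))
            return hr;

        ALLOCATOR_PROPERTIES prop;
        ZeroMemory(&prop, sizeof(prop));
        pPin->GetAllocatorRequirements(&prop);   // may fail; defaults stay zero

        hr = DecideBufferSize(*ppAlloc, &prop);
        if (SUCCEEDED(hr))
            hr = pPin->NotifyAllocator(*ppAlloc, FALSE);  // FALSE: samples are writable

        if (FAILED(hr))
        {
            (*ppAlloc)->Release();
            *ppAlloc = NULL;
        }
        return hr;
    }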

Surface Types: Controlling Which Surfaces the Video Renderer Creates

There are ways to control which accelerated surfaces the video renderer is allowed to create. They are useful when debugging the connection process, especially to reduce the number of options and the number of attempts made against the display driver. This is controlled via a registry value (see https://msdn2.microsoft.com/en-us/library/aa930626.aspx):

HKEY_LOCAL_MACHINE\Software\Microsoft\DirectX\DirectShow\Video Renderer\SurfaceTypes

The following table shows the AMDDS values for use with the SurfaceTypes named value.

Flag             Hexadecimal value   Description
AMDDS_NONE       0x00                No support for Device Control Interface (DCI) or DirectDraw.
AMDDS_DCIPS      0x01                Use DCI primary surface.
AMDDS_PS         0x02                Use DirectDraw primary surface.
AMDDS_RGBOVR     0x04                RGB overlay surfaces.
AMDDS_YUVOVR     0x08                YUV overlay surfaces.
AMDDS_RGBOFF     0x10                RGB off-screen surfaces.
AMDDS_YUVOFF     0x20                YUV off-screen surfaces.
AMDDS_RGBFLP     0x40                RGB flipping surfaces.
AMDDS_YUVFLP     0x80                YUV flipping surfaces.
AMDDS_ALL        0xFF                Use all available surfaces.
AMDDS_DEFAULT    0xFF                Use all available surfaces.
AMDDS_YUV        0xA8                (AMDDS_YUVOFF | AMDDS_YUVOVR | AMDDS_YUVFLP)
AMDDS_RGB        0x54                (AMDDS_RGBOFF | AMDDS_RGBOVR | AMDDS_RGBFLP)
AMDDS_PRIMARY    0x03                (AMDDS_DCIPS | AMDDS_PS)

If you just want to enable YUV overlay flipping surfaces for debugging purposes, set the SurfaceTypes registry value to AMDDS_YUVFLP. Remember to turn all surface types back on after you have finished debugging your problem...
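
The value can also be set from code while debugging. A small sketch, assuming SurfaceTypes is a DWORD named value under the key above:

    #include <windows.h>

    // Restrict the video renderer to one surface type while debugging,
    // e.g. 0x80 for AMDDS_YUVFLP. Error handling trimmed for brevity.
    void SetSurfaceTypes(DWORD dwSurfaceTypes)
    {
        HKEY hKey;
        if (RegCreateKeyEx(HKEY_LOCAL_MACHINE,
                TEXT("Software\\Microsoft\\DirectX\\DirectShow\\Video Renderer"),
                0, NULL, REG_OPTION_NON_VOLATILE, KEY_WRITE, NULL,
                &hKey, NULL) == ERROR_SUCCESS)
        {
            RegSetValueEx(hKey, TEXT("SurfaceTypes"), 0, REG_DWORD,
                          (const BYTE *)&dwSurfaceTypes, sizeof(dwSurfaceTypes));
            RegCloseKey(hKey);
        }
    }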

About FourCC codes:

Note that the example log lists the FourCC codes that are being enumerated, for example in the line "Enumerated 32315659". Here's how to map this hex number into a character sequence that identifies the format (a small helper that automates the conversion follows this list):

 Enumerated 32315659

0x32 = '2', 0x31 = '1', 0x56 = 'V', 0x59 = 'Y' ===> 0x32315659 = YV12

 Enumerated 3231564e

0x32 = '2', 0x31 = '1', 0x56 = 'V', 0x4e = 'N' ===> 0x3231564e = NV12

 Enumerated 32595559

0x32 = '2', 0x59 = 'Y', 0x55 = 'U', 0x59 = 'Y' ===> 0x32595559 = YUY2

 Enumerated 56555949

0x56 = 'V', 0x55 = 'U', 0x59 = 'Y', 0x49 = 'I' ===> 0x56555949 = IYUV

 Enumerated 59565955

0x59 = 'Y', 0x56 = 'V', 0x59 = 'Y', 0x55 = 'U' ===> 0x59565955 = UYVY

 Enumerated 55595659

0x55 = 'U', 0x59 = 'Y', 0x56 = 'V', 0x59 = 'Y' ===> 0x55595659 = YVYU
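
Here is a small helper that automates this conversion; the least significant byte of the FourCC value is the first character of the name:

    #include <stdio.h>

    // Print a FourCC value from the log (for example 0x32315659) as its
    // four-character name ("YV12").
    void PrintFourCC(unsigned long fourcc)
    {
        char name[5];
        for (int i = 0; i < 4; ++i)
            name[i] = (char)((fourcc >> (8 * i)) & 0xFF);
        name[4] = '\0';
        printf("0x%08lx = %s\n", fourcc, name);
    }

    int main()
    {
        PrintFourCC(0x32315659);   // YV12
        PrintFourCC(0x3231564e);   // NV12
        PrintFourCC(0x32595559);   // YUY2
        return 0;
    }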

Also, the file public\directx\sdk\inc\uuids.h contains several FourCC media subtype definitions:

    // 32595559-0000-0010-8000-00AA00389B71  'YUY2' == MEDIASUBTYPE_YUY2
    OUR_GUID_ENTRY(MEDIASUBTYPE_YUY2,
        0x32595559, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xaa, 0x00, 0x38, 0x9b, 0x71)

    // 55595659-0000-0010-8000-00AA00389B71  'YVYU' == MEDIASUBTYPE_YVYU
    OUR_GUID_ENTRY(MEDIASUBTYPE_YVYU,
        0x55595659, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xaa, 0x00, 0x38, 0x9b, 0x71)

    // 59565955-0000-0010-8000-00AA00389B71  'UYVY' == MEDIASUBTYPE_UYVY
    OUR_GUID_ENTRY(MEDIASUBTYPE_UYVY,
        0x59565955, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xaa, 0x00, 0x38, 0x9b, 0x71)

    // 31313259-0000-0010-8000-00AA00389B71  'Y211' == MEDIASUBTYPE_Y211
    OUR_GUID_ENTRY(MEDIASUBTYPE_Y211,
        0x31313259, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xaa, 0x00, 0x38, 0x9b, 0x71)

    // 32315659-0000-0010-8000-00AA00389B71  'YV12' == MEDIASUBTYPE_YV12
    OUR_GUID_ENTRY(MEDIASUBTYPE_YV12,
        0x32315659, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xaa, 0x00, 0x38, 0x9b, 0x71)

    // 36315659-0000-0010-8000-00AA00389B71  'YV16' == MEDIASUBTYPE_YV16
    OUR_GUID_ENTRY(MEDIASUBTYPE_YV16,
        0x36315659, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xaa, 0x00, 0x38, 0x9b, 0x71)

    // 56595549-0000-0010-8000-00AA00389B71  'IUYV' == MEDIASUBTYPE_IUYV
    OUR_GUID_ENTRY(MEDIASUBTYPE_IUYV,
        0x56595549, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xaa, 0x00, 0x38, 0x9b, 0x71)

    // 3231564E-0000-0010-8000-00AA00389B71  'NV12' == MEDIASUBTYPE_NV12
    OUR_GUID_ENTRY(MEDIASUBTYPE_NV12,
        0x3231564E, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xaa, 0x00, 0x38, 0x9b, 0x71)

    // 30323449-0000-0010-8000-00AA00389B71  'I420' == MEDIASUBTYPE_I420
    OUR_GUID_ENTRY(MEDIASUBTYPE_I420,
        0x30323449, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xaa, 0x00, 0x38, 0x9b, 0x71)

    // 56555949-0000-0010-8000-00AA00389B71  'IYUV' == MEDIASUBTYPE_IYUV
    OUR_GUID_ENTRY(MEDIASUBTYPE_IYUV,
        0x56555949, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xaa, 0x00, 0x38, 0x9b, 0x71)

Please do leave feedback, and let us know if this has been useful. Thanks,

Lucia

Comments

  • Anonymous
    June 26, 2007
    Thanks for the good article. But I found a very strange bug with the video renderer filter: it fails to work properly in full-screen mode when using Windows Media Player 10 Mobile. I wrote a transform filter to decode an H264 bitstream; let's take the following scenario into account. The dimensions of the H264 bitstream are 320x240, and it displays fine if I never switch to full-screen mode in Windows Media Player Mobile. In full-screen mode it is scaled and displayed at 240x180, laid out in the middle zone of the screen; however, it should be rotated 90 degrees to fit the full screen. Is this a potential bug in the video renderer? If it is not a bug in the video renderer, I need to get a message telling me whether full-screen mode is enabled, so I have a chance to rotate the picture by 90 degrees to fit the full screen. My mail is: dev [at] fastreaming.com. Thanks

  • Anonymous
    July 11, 2007
    I'm looking for a point of clarification. When you wrote "The video renderer will try to allocate flipping overlay surfaces first, then non-flipping surfaces: For each media type of the upstream filter, in the order dictated by the upstream filter, try to allocate a flipping surface of that media type", did you mean "try to allocate an overlay flipping surface"?

  • Anonymous
    July 24, 2007
    Hi, I am desperately trying to find out whether it's possible on the Windows Mobile platform to use DirectX in windowed mode. I suppose this blog is the right place to ask. Please help a poor developer...

  • Anonymous
    January 14, 2009
    Hi! I am trying to enable the debug zones for quartz.dll. Here is my process:

  1. Add a new REG_DWORD value "quartz" and set it to 0xf in HKEY_CURRENT_USER\Pegasus\Zones
  2. Add the flag COMPILE_DEBUG=1 in C:\WINCE600\PUBLIC\DIRECTX\sources.cmn
  3. Rebuild the solution.

    After booting my platform, I cannot find an item for quartz.dll in VS2005's debug zone window. Did I miss something? Thanks~

  • Anonymous
    February 10, 2009
    Hi, is dynamic format change supported in the WinCE 6.0 video renderer? I am trying to change the video size after the graph is built and running, but all my efforts failed. (On Windows XP I tried the same steps and it worked!) So I was wondering whether it is supported in WinCE 6.0? Regards, krt