DirectShow Performance Tests (Compact 2013)

3/26/2014

The DirectShow Performance Tests measure the capabilities of a multimedia device. They can be used in developing a new multimedia device or in gauging the performance of a component, such as a codec, in an end-to-end playback environment. The DirectShow Performance Tests are geared primarily toward testing video performance, since this is typically a primary performance bottleneck for such a device. They also test for audio discontinuities.

To use the DirectShow Performance Tests, you must run the same media clip encoded in varying bitrates and resolutions. By using a single standardized media clip in various bitrates and resolutions, you will gain a more accurate assessment of your Windows Embedded Compact-based device's performance. Your content should include both low-motion and high-motion video clips; the high-motion clips have more key frames and thus tax the system more heavily. Running a range of such clips at low, medium, and high density gives you the best measurement of your device's performance. If you are not targeting a high-performance or HD media device, you do not need to run the higher-resolution clips. The content range you choose to encode and test should reflect the expected media range of the Windows Embedded Compact-based device being tested.

Test Prerequisites

Your device must meet the following requirements before you run this test.

The DirectShow Performance Tests have the following hardware requirements:

- Windows Embedded Compact-based device, with audio and video card, network card and drivers, and storage device: The device to be tested. A storage device is necessary only if you want to measure the performance of running content from the storage device.

- Server with IIS and WM Server installed (or another HTTP server that you want to use in testing): The server platform from which to stream the media content.

- Hub or switch and cables as needed: Required for a private network.

The DirectShow Performance Tests have the following software requirements:

- Audio and video drivers for the Windows Embedded Compact-based device: Hardware-dependent drivers on the device being tested. These drivers affect your performance results, so it is recommended that you use production-quality drivers.

- Network drivers: Required for testing streaming scenarios. These drivers also affect your performance results, so it is recommended that you use production-quality drivers.

- SYSGEN_DDRAW: Needed if you want the video renderer to run in DirectDraw mode. Your video drivers and cards must also support DirectDraw; otherwise there will be a serious performance hit. (See the build note after this list.)

- quartz.dll: Library required for testing.

- tux.exe: Test harness, required for test execution.

- kato.dll: Logging engine, required for test execution.

- dshow_glitchfree.dll: Library that contains the tests.

- playlist.xml: XML file that specifies where the tests get the media under test. This file must be modified with information specific to your server.
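
If your OS image is built from a command-line build environment, one common way to include DirectDraw support is to set the corresponding SYSGEN variable before building. This is only a sketch of one possible workflow; in Platform Builder for Windows Embedded Compact 2013 the same component is typically added by selecting the DirectDraw catalog item in your OS design:

rem Sketch only: enable the DirectDraw SYSGEN for a command-line build (verify against your own build environment)
set SYSGEN_DDRAW=1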

Subtests

The following subtest is included in this test:

- SubTest ID 3001: Gathers performance data for WMS_HTTP playback and local playback of uncompressed A/V in an ASF container. Uses the dshow_glitchfree test DLL, and records CPU and memory data.

Setting Up the Test

1) Modify the playlist.xml file and copy it to the release directory:

The playlist.xml file specifies where the tests get the media under test, so it must be modified with information specific to your server. By default, each test type gets the media from the following locations:

HTTP tests look for media under http://<servername>:18080/dshowperf (HTTP over the WM server)

MMS UDP tests look for media under mmsu://<servername>/dshowperf

MMS TCP tests look for media under mmst://<servername>/dshowperf

The default media server in playlist.xml is 'acedxmedia'; change it to the media server that you are actually using, for example, wcemedia03 or wcemedia02. (An illustrative sketch of a playlist entry follows these steps.)

2) If you are testing with local storage, make sure the media folder is also present on the test device's storage, and copy the media content to the storage device. The default path for content in playlist.xml is \<disk storage>\dshowperf; it must be updated if you copy these files to another location.

3) Ensure that your device has enough memory to run the tests, especially for the high-bit-rate content. If you are using the standard shell, you can go to Control Panel | System | Memory and move the memory slider as far as possible to the left, leaving only about 5% of memory for the object store.
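
The following fragment sketches the kind of per-clip information that playlist.xml carries: a clip id, a URL for each protocol (with an optional DRM URL), and the <QualityControl> criteria described in Troubleshooting the Test. The element and attribute names, the clip name clip_300kbps_qvga, and the file name are assumptions for illustration only; do not author a new schema from this sketch. Instead, edit the playlist.xml that ships with the test, changing only the server names, paths, and clip-specific values.

<Clip id="clip_300kbps_qvga">
  <!-- Illustrative sketch only; element and attribute names are assumed, not the documented schema. -->
  <Url protocol="HTTP">http://wcemedia03:18080/dshowperf/clip_300kbps_qvga.asf</Url>
  <Url protocol="MMSU">mmsu://wcemedia03/dshowperf/clip_300kbps_qvga.asf</Url>
  <Url protocol="MMST">mmst://wcemedia03/dshowperf/clip_300kbps_qvga.asf</Url>
  <!-- Replace "disk storage" with the name of the storage device on your test device. -->
  <Url protocol="Local">\disk storage\dshowperf\clip_300kbps_qvga.asf</Url>
  <QualityControl>
    <!-- Per-clip pass/fail criteria; see Troubleshooting the Test. -->
  </QualityControl>
</Clip>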

Collecting Output Logs

Performance logs are generated only if you specify the /perflog command-line option. In this case, the default log generated is dshow_pb_stats.log. For each test run, this log contains a summary consisting of clip id, pass/fail status, total frames dropped, frames rendered, and average CPU usage. Subsequent test runs append log data to the existing log file.

After the testing is completed, use PerfReporter.exe to parse the log file. First, open dshow_pb_stats.log and save it as an ASCII file. Then run PerfReporter.exe on the desktop computer by executing the following command line:

PerfReporter.exe dshow_pb_stats.log outfile.xls

Note:

Failure to open dshow_pb_stats.log and resave it as an ASCII file will result in the parser throwing an exception.

After this completes, you can open your results file (called outfile.xls in the command line above) in Microsoft Excel. PerfReporter generates statistics by bit rate, clip name, and protocol. It also reports pass/fail based on the criteria provided in the XML file for each clip. A pass is indicated by '100' and a fail by '0'. The overall summaries fail if more than 1% of the runs in a category fail.

If the /status parameter was used, a log will be generated for each repetition. These logs are comma-separated lists with filenames beginning with dshow_pb_cpu_*. Each contains a running time log and an instantaneous CPU or memory load measurement. If dropped frames data is available, each time point will also contain statistics on frames rendered, frames dropped by the decoder, and frames dropped by the renderer. If the IAMNetworkStats interface is present, each time point will contain statistics on network packets lost, network packets recovered, and network packets received.

Running the Test

By default, the DirectShow Performance Tests execute the command line tux -o -d dshow_glitchfree.dll. This runs all the clips listed in playlist.xml over all protocols, with each clip running under each protocol once. A default run reports dropped-frame statistics upon completion but does not generate logs.

You can generate a performance log by including logging options in the command line. The general form is tux -o -d dshow_glitchfree.dll -c "command_specifications". The flags that can be used in the command specifications are listed below. These options are case-sensitive. The default is to execute all protocols.

- /protocol type: Specifies the protocol used to run the test. The protocol URLs must be specified in playlist.xml; see Setting Up the Test for detailed information on modifying playlist.xml. If you specify multiple protocols, the test runs through each specified protocol in turn; separate multiple protocols with commas, for example, HTTP,MMST. Valid options are: HTTP (use HTTP), MMST (use MMS over TCP), MMSU (use MMS over UDP), and Local (run content from the local storage device).

- /clipid list: Specifies the clips listed in playlist.xml that should be used. If multiple clips are specified, each clip is used in turn; separate multiple clips with commas. The default is to use all the clips listed in playlist.xml. Note: The clips specified in this parameter must be listed in playlist.xml, and the clip names are case-sensitive.

- /repeat n: Specifies the number of times to repeat the test for each clip across each protocol. Increasing this number can give you more stable results. The default is 1.

- /perflog . : Specifies that the performance log be generated and saved. Inclusion of this switch causes generation of the log file \Release\dshow_pb_stats.log. Note: This parameter must be used exactly as specified, that is, with the exact syntax "/perflog ." (there is a period after the "/perflog" command, with a space between the command and the period).

- /status type: Specifies the health status type to output. A log is generated for each clip every time it is run. Use of this parameter spawns a background thread to allocate memory and poll the dropped-frames interface, which can adversely affect performance; see the /interval parameter below. Valid options are: CPU (output CPU measurements) and MEM (output memory load measurements).

- /interval n: If the /status flag is used, specifies the interval between samples in milliseconds. Decreasing this interval can negatively impact performance; increasing it can improve performance. If the /status flag is not used, this parameter is ignored. The default is 120.

- /playlist path\xml_media_file: Specifies the name and location of the media content XML file. The default is to look for playlist.xml in the current running directory.

- /drm true|false: Specifies whether to choose the DRM URL from the media XML file. You can specify a URL for the DRM version of a clip in the media file and then choose to run either DRM content or open content. The default is false (that is, use the regular URL for the specified protocol). Note: The true and false options of this parameter are case-sensitive and must be given in lowercase.

- /vrmode mode: Specifies the mode in which the video renderer should run. Valid options are: GDI (the renderer runs in GDI mode) and DDRAW (the renderer runs in DDRAW mode). Note: The DDRAW SYSGEN must be present in the OS for the renderer to run in DDRAW mode.

- /maxbackbuffers n: Specifies the number of back buffers used by the video renderer. The default is 1. Note: This flag updates the corresponding registry value, which has a default value of 1; however, the test will not revert this registry value to its original value after test completion.

The following is an example of a recommended command line that plays all the clips listed in playlist.xml:

tux -o -d dshow_glitchfree.dll -c "/repeat 10 /perflog . /status CPU,MEM /protocol HTTP,Local,MMST,MMSU"
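
To narrow a run to a single protocol or clip, combine the /protocol and /clipid switches documented above. In the following sketch, clip_300kbps_qvga is a placeholder; substitute a clip id that actually appears in your playlist.xml:

tux -o -d dshow_glitchfree.dll -c "/repeat 5 /perflog . /status CPU /interval 500 /protocol Local /clipid clip_300kbps_qvga"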

Verifying the Test

Without any performance logging options set, the test reports the dropped-frame statistics. If performance logging is enabled, it produces a log with the dropped-frame statistics and other CPU and memory statistics. Ensure that the number of dropped frames is within your acceptable threshold.

Troubleshooting the Test

You can often get better performance from your Windows Embedded Compact-based device by increasing the number of back buffers used and by switching the video renderer mode. In addition, be sure that your device's network card supports DMA and that your audio, video, and network drivers and cards are production-quality.
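
For example, the following command line (a sketch assembled from the switches documented above) repeats the clips with the video renderer in DDRAW mode and an extra back buffer, which you can compare against a default GDI run:

tux -o -d dshow_glitchfree.dll -c "/repeat 10 /perflog . /vrmode DDRAW /maxbackbuffers 2 /protocol HTTP,Local"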

This test might not pass 100% of the test runs, nor is it expected to do so. However, you should anticipate a very low failure rate: no more than 1% for all the media that you expect the device to play successfully. If the test fails inordinately or does not run correctly, check the following:

1) Check that the URL works and can be correctly accessed on the device. Microsoft Internet Explorer for Windows Embedded Compact might be useful in determining this. Specifically:

-The test will fail if the media information in the XML file is not updated to point to valid URLs.

-The tests will fail if they cannot access the media server.

-The tests will fail if they cannot access the exact URL on the media server.

2) Check that the correct SYSGENs are all present in your OS build.

3) Check that you can initialize COM.

4) Check that the URL can be rendered and you can achieve an end-to-end, completely connected graph.

5) Check that the media files can be played successfully from beginning to end.

6) Check that there is sufficient memory, especially for the higher bitrates and larger encodings.

7) Make sure the test can find the XML file, and that it parses correctly.

8) The test will also pass or fail based on the criteria specified in the <QualityControl> tag of the XML file. So if the test fails, check that the criteria listed in your XML file are valid.

For example, suppose the <QualityControl> block for a clip specifies 0 dropped renderer frames with a margin of error of 0, 0 dropped decoder frames with a margin of error of 0, and 3180 drawn frames with a margin of error of 5 (a rough sketch of such a block follows this list).

Such a block will cause the test to check each clip to see that:

- the number of frames dropped in the renderer is less than or equal to the total number of frames specified in the tag (0), within the specified margin of error (0);

- the number of frames dropped in the decoder is less than or equal to the total number of frames specified in the tag (0), within the specified margin of error (0);

- the number of frames drawn equals the total number of frames specified in the tag (3180), within the specified margin of error (5).
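
The following sketch shows roughly what such a <QualityControl> block might look like. The child element and attribute names are illustrative assumptions, not the documented schema; edit the criteria in the playlist.xml that ships with the test rather than authoring new elements from this sketch.

<QualityControl>
  <!-- Illustrative sketch only; element and attribute names are assumed. -->
  <!-- No frames may be dropped by the renderer (total 0, margin of error 0). -->
  <DroppedFramesRenderer Total="0" Margin="0" />
  <!-- No frames may be dropped by the decoder (total 0, margin of error 0). -->
  <DroppedFramesDecoder Total="0" Margin="0" />
  <!-- The clip is expected to draw 3180 frames, within a margin of error of 5. -->
  <FramesDrawn Total="3180" Margin="5" />
</QualityControl>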

See Also

Other Resources

Multimedia - Video Tests