DirectShow Playback Tests (Compact 2013)

3/26/2014

The DirectShow Playback and Latency Tests validate the ability to run the DirectShow Performance Tests and provide latency measurements. These tests check for correct playback of the file and its duration, for complete playback of the file, and for end-to-end playback. The tests fall into two categories:

* Build and playback testing: Ensures that you can create a graph for a particular media clip. Before you run the DirectShow Performance Tests, you must run and pass these tests. If you cannot build a graph for your clip and play the file from beginning to end in a reasonable time period, then neither these tests nor the performance tests apply to you.

* Latency testing: Provides latency measurements for a playing clip, from the time a frame leaves the decoder until it is displayed, as well as during state changes (play/pause/stop). These are performance metrics that help you build your device and software. Again, these tests will not work if a clip cannot pass the build and playback tests above.

These tests do not provide media. The XML files use a placeholder filename which must be modified to play the files you provide. You should provide media appropriate to your device and sufficient to thoroughly test it.

Test Prerequisites

Your device must meet the following requirements before you run this test.

The following list shows the hardware requirements for the DirectShow Playback and Latency Tests.

* Windows Embedded Compact powered device with network card and drivers: The test device must have network access.
* Audio and video card, as needed: Depends on the device capabilities.
* Network equipment: Hubs/routers/switches, cables, and so forth.
* Server with content: Provides content for testing.
* Hard disk, if needed for local media playback: May have some other storage device, as well.

The following list shows the software requirements for the DirectShow Playback and Latency Tests.

* Tux.exe: Test harness, required for testing.
* Kato.dll: Logging engine, required for logging data.
* Graphtest.dll: Test library.
* CETK_playback_tests.xml: Sample XML file is provided; modify as needed.
* Media content: Files you wish to play back.
* Audio, video, and network drivers: As needed for the test device.
* Quartz.dll: Library required for DirectShow operation.

The following list shows the SYSGENs that you may need to build into your operating system.

* BSP_NOSHAREETH: Required.
* SYSGEN_DSHOW, SYSGEN_DSHOW_WMT, SYSGEN_DSHOW_HTTPSTREAMER: Required for DirectShow, along with Quartz.dll, as mentioned above.
* SYSGEN_DDRAW: Needed if you want the video renderer to run in DirectDraw; video drivers and cards must also support DirectDraw.
* SYSGEN_ATAPI (or other driver, as appropriate): Hard disk driver, needed for streaming from a hard disk. If you want to stream from another kind of storage device, you must include the SYSGEN with the appropriate drivers.
* SYSGEN_DSHOW_MMS: Needed to run MMS tests.
* SYSGEN_DSHOW_WMV: Needed to test WMV content.
* SYSGEN_DSHOW_WMA: Needed to test WMA content.
* SYSGEN_DSHOW_MP3, SYSGEN_DSHOW_MPEGA, SYSGEN_DSHOW_MPEGV, SYSGEN_DSHOW_MPEGSPLITTER: Needed to test MP3 and MPEG content.

In addition, you must have any SYSGENs needed to include the codecs or other components for running other content of interest.

Subtests

The following list describes the subtests included in this test.

* 0: Graph building - adding a source filter. Verify that the source filter gets added for this particular media.
* 1: Graph building - preferential filter loading. Verify that the correct graph gets built, forcing certain filters to be present.
* 2: Graph building - render pin to a complete graph. Verify that a graph can be rendered from the source filter onwards.
* 3: Graph building - intelligent connect source to sink. Verify that the source filter can be connected to the renderer.
* 4: Graph building - build graph for supported media. Verify that RenderFile builds the correct graph.
* 5: Graph building - build graph for supported media and query interfaces. Verify that the correct interfaces exist in the graph.
* 100: Graph playback - manual playback test. Play back a clip; manually verify and pass/fail the test.
* 101: Graph playback - manual playback test. Copy a clip to the hard disk first and then play back the clip; manually verify and pass/fail the test.
* 102: Graph playback - end-to-end playback test. Verify the duration and that the clip played to the end.
* 200: Graph playback - end-to-end playback test. Verify startup latency.
* 201: Graph playback - end-to-end playback test. Verify startup latency with higher bitrate content.
* 300: Graph playback - state change test. Verify the state change (Play to Pause) and measure latency.
* 301: Graph playback - state change test. Verify random state changes and measure latency.
* 302: Graph playback - state change test. Verify the state change (Play to Stop) and measure latency.
* 303: Graph playback - run, pause, run. Verify the latency of the first sample to come out of the decoder after calling Run.

Setting Up the Test

The following procedure describes how to set up the test environment.

Setting up the test environment for the DirectShow Playback and Latency Tests

1. Set up a network (preferably private, to eliminate noise).

2. Set up a server with your media content and a Windows Media server for MMS and intelligent HTTP streaming, as required for your particular testing needs.

3. Ensure that your device has access to the server and its content.

4. Modify the IP address and server name in the XML file to point to the servers.

5. Modify the <Media></Media> tags in the XML file to point to valid media. By default, these tags point to a dummy location.

Note: The above steps are optional. The media to be tested can be streamed over HTTP or MMSU, or played from the release directory or a hard disk drive. The location of the media to be tested is specified in the XML file using the tags listed below (see the example that follows the tag list). When the media clip is located in the release directory or on a hard disk drive, the setup steps above are not required.

"MediaFile Location Tags in the XML:"

<IIS_HTTP></IIS_HTTP>

<WMS_HTTP></WMS_HTTP>

<MMSU></MMSU>

<release></release>

<Disk></Disk>
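For example, a <Media> entry with its <BaseUrl> locations filled in might look like the following sketch. The server names, share paths, and file name are placeholders only; substitute locations from your own test environment and remove the tags for protocols you do not use.

<Media>
    <Name>MyClip</Name>
    <FileName>myclip.wmv</FileName>
    <Description>Placeholder WMV clip</Description>
    <BaseUrl>
        <IIS_HTTP>http://myiisserver/media/</IIS_HTTP>
        <WMS_HTTP>http://mywmsserver/media/</WMS_HTTP>
        <MMSU>mms://mywmsserver/media/</MMSU>
        <release>\release\</release>
        <Disk>\Hard Disk\media\</Disk>
    </BaseUrl>
</Media>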

Running the Test

"Creating and Modifying the XML File for the DirectShow Playback and Latency Tests"

A media XML file can be found under private\test\multimedia\directx\dshow\playback\tests\XMLSuites\CETK_playbacktests.xml. You can update the file provided with your correct server paths, or create your own and update the command line to point to your XML file; see Command Line Parameters for the DirectShow Playback and Latency Tests.

Add or remove tests from the XML suite to change the test cases that run. You can also update parameters for the tests using the XML file, as described below. It is good practice to change the XML file as part of the source enlistment; then when you rebuild your source directory, the XML file in the source will overwrite the XML file on the device or release directory.

The XML schema is as follows.


<?xml version="1.0" encoding="utf-8" ?>
<TestConfig Name="" Desc="" GenerateId="">
    <MediaList>
        <Media>
            <Name></Name>
            <FileName></FileName>
            <Description></Description>
            <BaseUrl>
                <IIS_HTTP></IIS_HTTP>
                <WMS_HTTP></WMS_HTTP>
                <MMSU></MMSU>
                <release></release>
                <Disk></Disk>
            </BaseUrl>
        </Media>
    </MediaList>
    <TestList>
        <TestGroup Name="" TestIdBase="">
            <Test Name="" Desc="">
                <TestID></TestID>
                <Media></Media>
                <DownloadTo></DownloadTo>
                <FilterList></FilterList>
                <PositionList></PositionList>
                <StateChangeSequence Desc="Type,HowMany,TimeBetween"></StateChangeSequence>
                <Verify>
                    <CorrectGraph></CorrectGraph>
                    <PlaybackDuration></PlaybackDuration>
                    <StartupLatency></StartupLatency>
                    <VerifyStateChangeLatency></VerifyStateChangeLatency>
                    <DecodedVideoLatencyRunToFirstSample></DecodedVideoLatencyRunToFirstSample>
                    <DecodedVideoLatencyPauseToFirstSample></DecodedVideoLatencyPauseToFirstSample>
                </Verify>
            </Test>
        </TestGroup>
    </TestList>
</TestConfig>

The root node of the XML config file contains the TestConfig node. The TestConfig node contains a media list node MediaList and a test list node TestList. The media list can be wholly contained within the config file or can refer to a different file which contains the actual list of media. The test list can be organized into Test and TestGroup nodes. Test nodes contain the name of the test to be run, the media to run on and the parameters for the test. TestGroup nodes serve to logically group together Test nodes. They also serve to specify the test id base for the group of test nodes which then get numbered sequentially following the test id base.
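To make the schema concrete, a minimal config file might look like the following sketch. The suite name, media entry, test ID base, positions, and threshold values are hypothetical, and the protocol token after the media name is assumed to correspond to one of the <BaseUrl> location tags; adjust all of them to your own content and servers.

<?xml version="1.0" encoding="utf-8" ?>
<TestConfig Name="SamplePlaybackSuite" Desc="Sample DirectShow playback suite" GenerateId="true">
    <MediaList>
        <Media>
            <Name>MyClip</Name>
            <FileName>myclip.wmv</FileName>
            <Description>Placeholder WMV clip</Description>
            <BaseUrl>
                <Disk>\Hard Disk\media\</Disk>
            </BaseUrl>
        </Media>
    </MediaList>
    <TestList>
        <TestGroup Name="Playback" TestIdBase="100">
            <Test Name="PlaybackTest" Desc="End-to-end playback of MyClip">
                <Media>MyClip:Disk</Media>
                <PositionList>0,100</PositionList>
                <Verify>
                    <PlaybackDuration>10,2000</PlaybackDuration>
                </Verify>
            </Test>
        </TestGroup>
    </TestList>
</TestConfig>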

The following table lists the XML nodes, starting with the root node, and describes what each node can contain.

XML Node

Schema

TestConfig - the root node of the config file, which contains an optional media list and a test list

<TestConfig optional:GenerateId="true"> Optional: <MediaList>…</MediaList> Mandatory: <TestList>…</TestList> </TestConfig> Attribute: GenerateId="true" enables automatic generation of test IDs. All other attributes are ignored.

MediaList - lists the media to be used or contains a pointer to a separate file containing the media list.

<MediaList Container="filename"><Media>…</Media></MediaList> The Container attribute is optional; if it is not present, at least one <Media> element must be present. Attribute: Container="filepath", the path and filename of the separate file that contains the media list.

Media - gives detail about the media clip

<Media> <Name>friendly name to be referred to in the XML file</Name> <FileName>name of the actual file</FileName> <Description>description of the clip (informative only)</Description> <BaseUrl> <IIS_HTTP>HTTP path of the file on an IIS server</IIS_HTTP> <WMS_HTTP>HTTP path of the file on a WMS server</WMS_HTTP> <MMSU>MMS path of the file</MMSU> <release>release path</release> <Disk>path of the file on storage media</Disk> </BaseUrl> </Media>

TestList - lists the tests to be run.

<TestList>Contains 0 or more TestGroup or Test nodes</TestList>

TestGroup - allows tests to be logically grouped and also allows specifying test base id

<TestGroup Name="groupname" TestIdBase="base">Contains 0 or more Test nodes </TestGroup> Attribute:TestIdBase contains the base id for the test group. This is used if GenerateId is set to "true" in TestConfig and all the tests in the TestGroup node will be numbered sequentially starting from the base id.

Verify - lists the verifiers to be used

<Verify>Contains 0 or more verifier nodes</Verify> These verifier nodes are described in the Verifiers in the Test Binary section later in this topic.

TestID - specifies the test ID to be used if automatic generation is not enabled.

<TestID>DWORD</TestID>

Media - lists the one or more media+protocol combinations to be used

<Media>medianame:protocol,medianame:protocol,…</Media> Medianame is the name given in the <Name> tag of the <Media> object.

DownloadTo - specifies the location where the media will be staged before being used

<DownloadTo>directory path</DownloadTo>

PositionList - lists the positions (as % of duration) to be used

<PositionList>pos1,pos2,…,posN</PositionList> Each posN is a number between 0 and 100 that represents a position in the media as a percentage of the duration. For instance, if the duration is 10 seconds, then pos=30 represents 3 seconds.

FilterList - lists the filter names

<FilterList>NSSOURCE:…:VIDREND</FilterList> The filter names are actually the "wellknownname" of the filter to the test. The only filters accepted are the ones specified in the test binary. Currently accepted values: WMADEC, WMVDEC, MP43DEC, MPGDEC, MP3DEC, MP13DECDMO, MP13DEC, MP2ADEC, MP2VDEC, MP2VDEC, MP2VENC, AVISPLIT, WAVPAR, AVISPLIT, MPGSPLIT, MPEG2DMUX, MPEG2Demux, AUDREND, VIDREND, ICMDISP, ACMDISP, NSSOURCE, ASYNCRDR, URLRDR, CLRCONV, AVIMUX, MPEGMUX, OVMIXER, FILEWR. For a full list of these filters and details, look at the map structure at line 122 of Playback\Framework\FilterDesc.cpp in the test source code.

StateChangeSequence - specifies the state change pattern and number of state changes

<StateChangeSequence>InitialState, Sequence, NumStateChanges, DelayBetweenChangesInMilliseconds</StateChangeSequence> InitialState can be Stopped, Paused, or Running. Sequence can be PlayPause, PlayStop, PauseStop, or RandomSequence.

Test - specifies the actual test to be run and the parameters.

<Test Name="testname" Desc="desc">params</Test>.Attributes:Name specifies the test function to be run. This string is associated with the test function in an internal function table. Desc is concatenated to the generic description of the test when listing out tests. The <Test> tag contains parameters specific to the type of test description expected for this test. These parameters are specified in a table below.

The following table shows the parameters expected for the <Test> tag.

Parameter expected

Description

Playback test parameters

Contains nodes in the following order. Optional nodes can be omitted, but the order must be maintained. Optional: <TestID>DWORD</TestID>. Mandatory: <Media>one or more entries of the form "medianame:protocol", separated by ','</Media>. Optional: <DownloadTo>download path</DownloadTo>. Optional: <PositionList>start position, stop position</PositionList>. Optional: <Verify>…</Verify>.

Build test parameters

Contains nodes in the following order. Optional nodes can be omitted, but the order must be maintained. Optional: <TestID>DWORD</TestID>. Mandatory: <Media>one or more entries of the form "medianame:protocol", separated by ','</Media>. Optional: <DownloadTo>download path</DownloadTo>. Optional: <FilterList>list of filters, separated by ':'</FilterList>. Optional: <Verify>…</Verify>.

State change test parameters

Contains nodes in the following order. Optional nodes can be omitted, but the order must be maintained. Optional: <TestID>DWORD</TestID>. Mandatory: <Media>one or more entries of the form "medianame:protocol", separated by ','</Media>. Optional: <DownloadTo>download path</DownloadTo>. Mandatory: <StateChangeSequence>…</StateChangeSequence>. Optional: <Verify>…</Verify>.
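As an illustration, the following sketches show one <Test> entry of each kind, following the parameter order above; each would sit inside a <TestGroup> or directly in the <TestList>. The test names come from the Tests in the Test Binary table later in this topic, while the media name, protocol token, filter names, positions, state change sequence, and threshold values are placeholders.

<!-- Build test: preload preferred filters, then verify the resulting graph.
     The <CorrectGraph> content format is assumed to mirror <FilterList>. -->
<Test Name="BuildGraphPreLoadFilterTest" Desc="Build graph with preferred filters">
    <Media>MyClip:Disk</Media>
    <FilterList>WMVDEC:VIDREND</FilterList>
    <Verify>
        <CorrectGraph>WMVDEC:VIDREND</CorrectGraph>
    </Verify>
</Test>

<!-- Playback test: play from 0% to 100% and check duration (placeholder thresholds). -->
<Test Name="PlaybackTest" Desc="End-to-end playback">
    <Media>MyClip:Disk</Media>
    <PositionList>0,100</PositionList>
    <Verify>
        <PlaybackDuration>10,2000</PlaybackDuration>
    </Verify>
</Test>

<!-- State change test: start Stopped, run 10 Play/Pause changes 2000 ms apart (placeholder values). -->
<Test Name="StateChangeTest" Desc="Play to Pause latency">
    <Media>MyClip:Disk</Media>
    <StateChangeSequence>Stopped,PlayPause,10,2000</StateChangeSequence>
    <Verify>
        <VerifyStateChangeLatency>2000</VerifyStateChangeLatency>
    </Verify>
</Test>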

"Command Line Parameters for the DirectShow Playback and Latency Tests :"

The following table shows the command line parameter for the DirectShow Playback and Latency Tests.

Command line parameter

Description

/Config filename

Specifies the filename, and optionally the path, of the xml file to be used in running the test. If no path is provided with the filename, the test will first try to open the file from the \Windows folder and then from the current folder the test is running from.
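For example, assuming the test library and sample XML file names given earlier in this topic, a typical Tux invocation might look like: tux -o -d graphtest.dll -c "/Config CETK_playback_tests.xml". If your XML file is not in the \Windows folder or the folder the test runs from, pass its full path after /Config; the exact Tux options depend on how you launch the test harness in your environment.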

"Tests in the Test Binary"

The following table shows the individual tests found in the test binary.

Test

Description

EmptyGraphQueryInterfaceTest

Graph build. Queries an empty filter graph for common interfaces. Fails the test if some of the common interfaces aren't returned. The essential interfaces are considered to be "IMediaControl, IMediaEvent, IMediaSeeking".

AddSourceFilterTest

Graph build. Adds a source filter for media that is supported. Fails the test if unsuccessful.

AddUnsupportedSourceFilterTest

Graph build. Adds a source filter for media that is unsupported. Fails the test if successful.

BuildGraphTest

Graph build. Starts verification, builds a filter graph using RenderFile for the specified media, gets verification results, and fails the test if unsuccessful or if verification fails. Verifiers: BuildGraphLatency is handled within the test; any appropriate verifier can also be used.

BuildGraphMultipleTest

Graph build. Repeatedly builds a filter graph using RenderFile for specified media. The number of repetitions is 10 by default. Fails the test if any of the RenderFile calls is unsuccessful.

BuildGraphQueryInterfaceTest

Graph build. Builds a filter graph using RenderFile for specified media. Fails the test if unsuccessful. Queries the graph for essential interfaces. Fails the test if any of the essential interfaces aren't returned - the essential interfaces are considered to be IMediaControl, IMediaEvent, IMediaSeeking.

BuildGraphUnsupportedMediaTest

Graph build. Builds a filter graph using RenderFile for media that is unsupported. Fails the test if successful.

BuildMultipleGraphTest

Graph build. Builds multiple filter graphs for the list of media specified in the config file for this test. Fails the test if building the graph fails for any of the media specified.

RenderPinTest

Graph build. Adds a source filter for the media that is supported and renders all the unconnected pins of the source filter. The test fails if adding the source filter does not succeed or if rendering the unconnected pins does not succeed.

ConnectSourceFilterToRendererTest

Graph build. Retrieves the source and renderer filters specified in the config file, adds the source and renderer filters, loads the source filter with the specified media URL, and connects the zeroth pins of the source and renderer filters using intelligent connect. The test fails if adding the source or renderer filter fails, if setting the media fails, or if connecting the source and renderer fails.

BuildGraphPreLoadFilterTest

Graph build. Retrieves the list of filters to be preloaded, as specified in the config file, loads the filters into an empty graph, and calls RenderFile for the specified media. Verifies that the preloaded filters have at least one input pin (if any) and one output pin (if any) connected.

ManualPlaybackTest

Graph playback. Builds the graph for the specified media, sets start and stop positions (if any), starts playback, and waits for completion with a timeout of twice the expected playback duration. Asks the tester whether playback was successful with no or few glitches, and returns pass or fail depending on the response.

PlaybackTest

Graph playback. Builds the graph for the specified media and enables the specified verifiers; VerifyPlaybackDuration is handled intrinsically. Sets start and stop positions (if any), starts verification, changes the graph state to Running, waits for completion with a timeout of twice the duration, and gets verification results. Fails the test if the wait for completion fails or if any of the verifiers fail. Verifiers: do not mix VerifyPlaybackDuration and StartupLatency; the test internally uses StartupLatency to determine playback duration.

PlaybackDurationTest

Graph playback. Builds the graph for the specified media, enables the VerifyPlaybackDuration verifier, sets start and stop positions (if any), starts verification, changes the graph state to Running, and waits for completion with a timeout of twice the duration. Queries the verifier for the first sample time and the EOS time, calculates the EC completion latency and the actual playback duration from first sample to EOS, and compares them against a default threshold.

MultiplePlaybackTest

Graph playback. For each of the specified media, instantiates a test graph object and sets the media. Builds graphs for each of the media, gets event handles from each of the test graphs, changes each of the test graphs to the Running state, and waits for completion on all the handles with an infinite timeout. The test succeeds if the wait succeeds.

StateChangeTest

State change. Builds the graph for the specified media; VerifyStateChangeLatency is handled intrinsically. Changes states according to the sequence and count specified. Fails the test if the state change latency was over the threshold or if any of the state changes fail. Verifiers: only VerifyStateChangeLatency.

Run_Pause_Run_Test

State change. Builds the graph for the specified media and enables the specified verifiers. Starts verification, sets the graph to the Running state, waits, sets the graph to the Paused state, waits, starts verification again, sets the graph to the Running state, and gets verification results. Fails the test if any of the state changes fail or if any of the verifiers fail. Verifiers: any appropriate verifier.

Verifying the Test

"Verifiers in the Test Binary"

Most of the verifiers work by inserting a tap filter in the existing DirectShow filter graph. When the verifier is created, it decides where it wants a tap filter inserted and the test graph will add it for the verifier (for example, between the video decoder and the video renderer). The verifier then registers callbacks into the tap filter.

The tap filter does not do anything to the samples and messages passing through it. It acts as a passthrough and passes them as is to its output pin. However, every time it gets a sample or a message, it signals the registered callback into the verifiers. Thus the verifiers know what is happening in the graph, including which sample is currently coming out of the filter, and can verify as needed.

The following table shows the verifiers included in the test binary.

Verifier

Supported verifications

DecoderOutputLatencyVerifier

"DecodedVideoLatencyPauseToFirstSample" . Measures the latency from the next Pause to the first sample received after the pause at the output of the video decoder. "DecodedVideoLatencyRunToFirstSample" . Measures the latency from the next Run to the first sample received after the Run at the output of the video decoder.

GraphBuildVerifier

"CorrectGraph".This verifier checks if the specified filters are in the filter-graph.

PlaybackDurationVerifier

"VerifyPlaybackDuration". This measures the time from the start of the verifier to the first sample time and then to the time that EOS is received. The caller is responsible for taking the measurements and verifying.

SampleDeliveryVerifier

"VerifySampleDelivered". This verifies that a sample is delivered at a specified location in the graph after a specified event.

StartupLatencyVerifier

"StartupLatency". This measures the latency of the first sample to be received after the verifier is started.

Troubleshooting the Test

Check that the XML file is configured as desired. With the default values in the XML file, the tests will usually fail; this is especially true of the latency and duration thresholds, which are set to arbitrary values. Update the XML file to reflect the latency you expect. The test fails if the latency experienced is larger than the threshold specified in the XML file.

In general, the tests may fail if:

* The URL is inaccessible, or the XML file points to an invalid URL.
* The server containing the media is not accessible.
* The exact URL on the server is inaccessible.
* The network is inaccessible, or the proxy or firewall is configured incorrectly.
* The correct SYSGENs needed to pull in the necessary DirectShow components are missing.
* You do not have, or cannot initialize, COM.
* The test device lacks a storage device, or the XML file does not point to the storage device or the media clip.
* The test device has run out of memory.
* The command line points to the wrong XML file.

The following list shows common reasons for failure for each test case.

* 0: DirectShow cannot find an appropriate source filter for the URL.
* 1: A filter in the <FilterList> cannot be loaded, or an end-to-end filter graph cannot be created for the URL.
* 2: DirectShow cannot find an appropriate source filter for the URL, or the unconnected pins cannot be rendered all the way to the renderer.
* 3: The XML file does not specify a source filter, or the source filter is unknown. The XML file does not specify a renderer filter, or the renderer filter is unknown. The source filter or the renderer specified cannot be loaded. The pins of the source filter cannot be connected, through various transform filters if needed, to the renderer specified.
* 4: The URL cannot be rendered and the test cannot create an end-to-end, completely connected graph. Not all of the filters in the <CorrectGraph> are present in the connected graph.
* 5: The URL cannot be rendered and the test cannot create an end-to-end, completely connected graph. Any of the essential interfaces are not returned.
* 100: The URL cannot be rendered and the test cannot create an end-to-end, completely connected graph. The playback positions to play to and from cannot be set. The graph cannot be run successfully. Playback takes longer than expected (calculated based on the clip duration). The tester clicks No on the message box asking to verify playback.
* 101: The clip cannot be copied locally (URLMON is used to do this); the failure could be due to lack of space in the location specified on the device, the location specified in the <DownloadTo> tag not existing, or the clip being too big to fit in the URLMON cache. In addition, any of the reasons listed for test case 100 apply.
* 102: The URL cannot be rendered and the test cannot create an end-to-end, completely connected graph. The playback positions to play to and from cannot be set. The graph cannot be run successfully. Playback takes longer than expected (calculated based on the clip duration). The playback duration is not within the specified threshold. The threshold can be specified in the <PlaybackDuration> tag as a comma-separated list: the first number is the percent deviation allowed on total playback time, and the second number is the deviation allowed in milliseconds. If the difference between expected and actual playback is not covered by either of these two thresholds, the test fails. (See the example after this list.)
* 200: The URL cannot be rendered and the test cannot create an end-to-end, completely connected graph.
* 201: The playback positions to play to and from cannot be set. The graph cannot be run successfully. Playback takes longer than expected (calculated based on the clip duration). The startup latency is not within the specified threshold. You can configure this threshold through the <StartupLatency></StartupLatency> tag; specify a latency in milliseconds. Currently this is set at 300 ms.
* 300, 301, 302: The URL cannot be rendered and the test cannot create an end-to-end, completely connected graph. The graph cannot be run successfully. States cannot be changed. The state change latency is not within the expected threshold. You can modify this value through the <VerifyStateChangeLatency> tag; the value is specified in milliseconds and is currently set at 2000 ms. The test measures the latency for every state change it makes, and also calculates and prints an average state change latency over all the runs.
* 303: The URL cannot be rendered and the test cannot create an end-to-end, completely connected graph. The graph cannot be run successfully. The states cannot be changed successfully. The decoder Run-to-first-sample latency is not within the expected threshold. You can modify this value through the <DecodedVideoLatencyRunToFirstSample> tag; the value is specified in milliseconds and is currently set at 300 ms.
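As referenced in the list above, each of these thresholds lives in the <Verify> node of the test that measures it. The following sketch gathers them in one place only for reference; the values shown are placeholders to adjust for your device, not recommendations.

<Verify>
    <!-- Percent deviation, deviation in ms (test case 102) -->
    <PlaybackDuration>10,2000</PlaybackDuration>
    <!-- Startup latency threshold in ms (test cases 200 and 201) -->
    <StartupLatency>500</StartupLatency>
    <!-- State change latency threshold in ms (test cases 300, 301, 302) -->
    <VerifyStateChangeLatency>3000</VerifyStateChangeLatency>
    <!-- Decoder Run-to-first-sample latency threshold in ms (test case 303) -->
    <DecodedVideoLatencyRunToFirstSample>500</DecodedVideoLatencyRunToFirstSample>
</Verify>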

See Also

Other Resources

Multimedia - Video Tests