All About Load Test Results and the Results DB (Part 1 - How Is The Data Calculated and Stored)

Test results in Visual Studio can be confusing, so I am going to do a very quick breakdown to help explain some of the result values you can see in Visual Studio for load tests. The breakdown below applies to TEST TIME, TRANSACTION TIME, PAGE TIME, etc.

In each entry below, "Performance Counter Data" refers to any values captured for building graphs, while "Detailed Data" refers to the data used to display values in the tables.

More Info
  • Performance Counter Data: https://msdn.microsoft.com/en-us/library/ms184782.aspx
  • Detailed Data: https://msdn.microsoft.com/en-us/library/ms404656.aspx

How is it collected?
  • Performance Counter Data: Agents report a value to the controller (I think it is the average time) for all iterations of a test during the previous sampling interval.
  • Detailed Data: Agents store the data on the agent machine for every iteration during the test.

When is it collected?
  • Performance Counter Data: Every sampling interval.
  • Detailed Data: Only after the test is completed.

Where is it stored?
  • Performance Counter Data: In the LoadTestPerformanceCounterSample table.
  • Detailed Data: In the LoadTest*Detail table(s). (See the query below for a quick way to list them.)
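
Since the exact set of detail tables depends on what the test records, here is a quick way to see which ones exist in a given Results DB. This uses only standard SQL Server metadata, nothing load-test specific:

```sql
-- List the detail tables present in a LoadTest Results DB.
SELECT name
FROM sys.tables
WHERE name LIKE 'LoadTest%Detail%'
ORDER BY name;
```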

Where do the min/max/avg values come from?
  • Performance Counter Data: These are computed against the sampled values in the LoadTestPerformanceCounterSample table. (See the sketch below.)
  • Detailed Data: These are calculated AFTER the results from ALL agents get stored in the DB. There are stored procs that populate the LoadTest*SummaryData tables with the values.
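
To illustrate the graph side, here is a minimal sketch of how min/max/avg could be derived from the sampled rows. The column names (ComputedValue, CounterInstanceId) are my assumptions about the LoadTestPerformanceCounterSample schema, so verify them against your Results DB:

```sql
-- Sketch: deriving graph-side min/max/avg from the sampled values.
-- Column names are assumptions; check the actual table schema.
DECLARE @RunId INT = 1;              -- hypothetical load test run ID
DECLARE @CounterInstanceId INT = 1;  -- hypothetical counter instance

SELECT MIN(ComputedValue) AS MinValue,
       MAX(ComputedValue) AS MaxValue,
       AVG(ComputedValue) AS AvgValue
FROM LoadTestPerformanceCounterSample
WHERE LoadTestRunId = @RunId
  AND CounterInstanceId = @CounterInstanceId;
```

Note that this averages the per-interval samples rather than every iteration, which is exactly why a graph value can differ slightly from the corresponding table value.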

Where is the data displayed in Visual Studio?
  • Performance Counter Data: Only in the GRAPH tab.
  • Detailed Data: Only in the TABLE tab. (This is why there is a slim chance you will see a max or average value differ between a table and a graph: one is statistical info and the other is actual info.)

What is the difference between "statistics only" and "All individual details" in the run settings?
  • Performance Counter Data: Nothing.
  • Detailed Data: All details get loaded into the database REGARDLESS of the setting; the difference is whether the results get deleted from the details tables after the statistics are calculated. The Prc_UpdateSummaryData3 stored procedure is executed either way, but the second argument passed in is a BIT that determines whether to clear the tables after computing the results. (A sketch of the call shape follows below.)
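
For reference, the call shape might look like the sketch below. Only the role of the second (BIT) argument is described above; the first argument being the run ID, and the parameter order, are my assumptions, so check the stored procedure definition before relying on this:

```sql
-- Hypothetical sketch of invoking the summary stored procedure.
-- Per the post, the second argument is a BIT controlling whether the
-- detail tables are cleared after statistics are computed. The first
-- argument being the run ID is an assumption.
DECLARE @LoadTestRunId INT = 1;  -- hypothetical run ID
DECLARE @KeepDetails BIT = 1;    -- which value clears vs. keeps the
                                 -- detail rows should be verified in
                                 -- the proc body

EXEC Prc_UpdateSummaryData3 @LoadTestRunId, @KeepDetails;
```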

How is the percentile data calculated?
  • Performance Counter Data: (not applicable)
  • Detailed Data: Percentile data is calculated as part of the data population in the Prc_UpdateSummaryData3 execution. The full list of timing details is sorted, and the bottom value of the TOP (100-XX)% becomes the XX percentile value. Here's an example for the 90th percentile: I have 100 values stored for the response time of transactionX, and the list is sorted slowest to fastest. I take the TOP 10 PERCENT (a SQL query), which gives me the slowest 10% of the responses (in this case, 10 values). Then I take the smallest of these (meaning the fastest of that ten percent) and make that value my 90% value. In other words, for 100 values sorted fastest to slowest, the 90% value is the 91st entry (the 10th slowest); with 1000 entries, it would be the 901st value. (A query sketch follows below.)
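
To make that concrete, here is a minimal standalone sketch of the TOP-10-percent approach. The table and column names (LoadTestTransactionDetail, ElapsedTime, TransactionId) are assumptions for illustration; the real calculation happens inside Prc_UpdateSummaryData3:

```sql
-- Minimal sketch of the 90th-percentile logic described above.
-- Table/column names are assumptions; the actual work is done by
-- Prc_UpdateSummaryData3.
DECLARE @RunId INT = 1;          -- hypothetical run ID
DECLARE @TransactionId INT = 1;  -- hypothetical transaction ID

SELECT MIN(ElapsedTime) AS Percentile90
FROM (
    -- The slowest 10% of the recorded response times...
    SELECT TOP 10 PERCENT ElapsedTime
    FROM LoadTestTransactionDetail
    WHERE LoadTestRunId = @RunId
      AND TransactionId = @TransactionId
    ORDER BY ElapsedTime DESC
) AS Slowest10Percent;
-- ...and the smallest (fastest) of those becomes the 90% value.
```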

Comments

  • Anonymous
    March 12, 2014
    How is the percentile data calculated? Does this use each transaction, or is it based on your statistics collection interval? Is it generating the 90th percentile of the transaction, or the 90th percentile of a 5-second average of the transaction?


  • Anonymous
    October 14, 2014
    Can you please share the load test SQL queries with which we can get the transaction response times and the status of the transactions (pass/fail)?

  • Anonymous
    October 14, 2014
    Bipin, you can get the transaction summary information from the built-in view "LoadTestTransactionResults2." It retrieves the data from the "LoadTestTransactionSummaryData" table (which gets populated at the end of the run) and adds the name and test/scenario info to the data. There is no pass/fail info stored for transactions. If a transaction completes, it gets recorded; if it does not complete, no record is stored for it. (In other words, the concept of pass/fail does not exist for transactions.) I will be writing a couple more posts in the near future with more information about the DB and some customized queries.
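
For illustration, a starting-point query against that view might look like the sketch below. The LoadTestRunId filter column is an assumption, so inspect the view definition in your Results DB to confirm the exact columns:

```sql
-- Hypothetical starting point: transaction summary rows for one run,
-- via the built-in view mentioned above. Column names are assumptions.
DECLARE @RunId INT = 1;  -- hypothetical run ID

SELECT *
FROM LoadTestTransactionResults2
WHERE LoadTestRunId = @RunId;
```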