Answering Questions

Matt asked a lot of good questions in the comments about yesterday's post on the automation tools I wrote last week. Instead of answering in the comments, I thought I'd just write a whole new post.

First:

Disregarding implementation details, how would you design a way to automate the results of your testing program?
I read your Test Automation Article and the best way to automate the verification would be to use a 'visual comparison tool'.
Say you had such a tool. Forgetting implementation, what would this tool do here? Your Automation article says visual comparison tools just compare against a master image, but is that a plausible verification?

Yeah, since I'm working with graphical drawing output, you'd have to use some kind of tool to determine whether the drawing is right or not. I do have such a tool, but I decided not to use it because the return on investment isn't there in this case. This hits on one of the key problems with automated testing and verification: the actual automated part of the testing isn't very good at finding new bugs. In order to verify the results of the automation you need an expected result, and most of the time the only way to get an expected result is to produce it by hand (or at the very least produce it as you develop the automation).

In this case I would need to run through the automation once, verify that all the output is correct that first time, use those images as masters, and then on future automation runs compare the output against the masters and fail if they don't match. The tricky part is that you need a working, stable system before you can bring that automation online. The system I'm working on now is not that; improvements are made every day that change the output.
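To make that concrete, here's a rough sketch of what that master-comparison step could look like. This isn't the tool I actually have; it just assumes the drawing output lands as PNG files in an output folder, with hand-verified masters in a masters folder, and the directory names and helper functions are made up for illustration.

```python
# Rough sketch of master-image comparison; directory names are hypothetical.
from pathlib import Path
from PIL import Image, ImageChops

MASTERS_DIR = Path("masters")  # hand-verified expected images
OUTPUT_DIR = Path("output")    # images produced by the current automation run

def images_match(master_path: Path, output_path: Path) -> bool:
    """Return True if the two images are pixel-for-pixel identical."""
    with Image.open(master_path) as master, Image.open(output_path) as output:
        if master.size != output.size:
            return False
        diff = ImageChops.difference(master.convert("RGB"), output.convert("RGB"))
        return diff.getbbox() is None  # None means no differing pixels at all

def run_comparison() -> int:
    """Compare every master against the matching output; return the failure count."""
    failures = 0
    for master_path in sorted(MASTERS_DIR.glob("*.png")):
        output_path = OUTPUT_DIR / master_path.name
        if not output_path.exists() or not images_match(master_path, output_path):
            print(f"FAIL: {master_path.name}")
            failures += 1
    return failures

if __name__ == "__main__":
    raise SystemExit(1 if run_comparison() else 0)
```

Notice that every intentional change to the drawing output breaks the comparison and forces someone to re-verify and re-bless the masters by hand. That's exactly why the system has to be stable before this kind of verification pays for itself.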

So here's that key problem with automated testing. Automation is great at finding regressions, but it is not very good at extending your test coverage to find new bugs. Or more accurately, it's hard to design automation so that it saves you time in extending that coverage. It will always cost more to build automation to extend coverage than it does to just test those new things by hand. Where you save time is when you need to test that extended coverage over and over again; automation is great at that.

In this case automation allows me to extend coverage easily and quickly, as long as I do the verification by hand. If I tried to use automated verification, the system would cost more in maintenance time than it would save in testing time. Remember, when and how to build automation is all about return on investment. It's about making you more efficient in your testing, and everything should be measured against that bar.

But wait, there's more:

My guess as to the only proper tool that verifies correctly is one that uses a mix of computer vision and artificial intelligence. The computer vision code will find objects within the image, and the AI code will construct a neural net and back prop from the expected positioning of the objects and calculate the offset of the perceived objects.
FINAL QUESTION:
Do you think computer vision and AI are the proper computer science concepts to use for automated verification?

I think you're on the right track here. Some of the more advanced tools I've seen do things similar to this, but none have been able to remove the need for a master image/expected result. You can take an image, pull elements out of that image, and then compare those elements against elements in your master to reduce some of the problems inherent in straight image comparison, but trying to do it without something to compare against is a whole different world.
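As a sketch of what I mean by pulling elements out and comparing them, here's roughly what that could look like with an off-the-shelf computer vision library. This assumes OpenCV, assumes the drawn elements are dark strokes on a light background, and the threshold and tolerance values are illustrative only; the point is that a master image still sits on one side of the comparison.

```python
# Sketch of element-level comparison instead of raw pixel diffing.
# Assumes OpenCV and dark elements on a light background; values are illustrative.
import cv2

def extract_boxes(image_path: str) -> list[tuple[int, int, int, int]]:
    """Return the bounding boxes (x, y, w, h) of drawn elements in an image."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # Invert-threshold so dark strokes become white blobs on a black background.
    _, binary = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return sorted(cv2.boundingRect(c) for c in contours)

def elements_match(master_path: str, output_path: str, tolerance: int = 3) -> bool:
    """Compare element positions and sizes, allowing a few pixels of drift."""
    master_boxes = extract_boxes(master_path)
    output_boxes = extract_boxes(output_path)
    if len(master_boxes) != len(output_boxes):
        return False
    return all(
        abs(m - o) <= tolerance
        for master_box, output_box in zip(master_boxes, output_boxes)
        for m, o in zip(master_box, output_box)
    )
```

This tolerates things a raw pixel diff would choke on, like minor anti-aliasing differences or a shape nudged a pixel or two, but it still can't tell you whether the drawing is right without a hand-verified master to compare against.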

If I were going off to school to get a PhD in Computer Science, this is the kind of thing I'd be thinking about for my doctoral thesis. And if at the end of that long, grueling process I had something that sort of, kind of worked in controlled circumstances, I'd feel really good about myself.

Needless to say, I view this as an incredibly hard problem. That said, if someone can solve it I'd be wonderfully grateful; they'd be making a huge advance in software testing.

Chris
