DirectX demos still get my juices flowing
On Tuesday I attended Mike Pelton's session about how to choose between WPF and DirectX for rendering 3D graphics. He began by illustrating WPF's strengths, and specifically called out data binding, the similarities between the 2D and 3D APIs, and the ease with which WPF developers can perform common activities like hit testing.
But then he uncorked some DirectX goodness, and thereby underscored, as his slide put it, "what DirectX can do that Avalon can't." (Mike rocks enough that he still calls the Windows Presentation Foundation by its cooler code name, Avalon.)
What DirectX can do that Avalon (WPF) can't:
- Shaders. (and therefore:)
- Advanced lighting models.
Why you should consider WPF:
- The barrier to entry is much higher in DirectX, and it's getting higher still
- The DirectX SDK ships a new code drop every two months, and breaking changes are not at all uncommon (he's dead right on this one)
- The DX documentation is 'distributed'
If you're interested in 3D graphics, you're likely to reach a point where you bump into the visual functionality that WPF v1.0 doesn't expose to the developer. For instance, WPF v1.0 doesn't allow you to use normal maps (or bump maps) on 3D objects - a very common technique for adding detail to a model without increasing its polygon count.
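To make that concrete, here's a minimal sketch in plain C++ (not WPF or DirectX API code) of what a normal map actually does at shading time: each texel is decoded into a tangent-space normal that perturbs the per-pixel lighting, so a flat triangle can look bumpy without any extra geometry. The names and values are illustrative only.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// Decode an RGB normal-map texel (0..255 per channel) into a unit
// tangent-space normal, then compute simple Lambertian diffuse with it.
float DiffuseFromNormalMap(unsigned char r, unsigned char g, unsigned char b,
                           Vec3 lightDirTangentSpace) {
    Vec3 n = { r / 255.0f * 2.0f - 1.0f,
               g / 255.0f * 2.0f - 1.0f,
               b / 255.0f * 2.0f - 1.0f };
    n = normalize(n);
    float d = dot(n, normalize(lightDirTangentSpace));
    return d > 0.0f ? d : 0.0f;   // clamp light arriving from behind the surface
}

int main() {
    // A texel of (128,128,255) is a flat "straight up" normal; anything else
    // fakes surface detail without adding triangles to the mesh.
    std::printf("%f\n", DiffuseFromNormalMap(200, 128, 230, { 0.0f, 0.0f, 1.0f }));
}
```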
If you have feedback on what features you'd like to see in WPF v.Next, send it to someone on the WPF team, like Rob Relyea. They're doing v.Next analysis right now. And I'm not coming down hard on WPF - it's not meant to be a games engine - and the WPF vs. DirectX choice is a subject I've written about before myself.
What DirectX can do:
DirectX can access your graphics card's Programmable Pipeline to do things like:
- Anisotropic lighting (avoids the cost of textures used as look-up tables; there's a minimal sketch of this after the quote below)
- Membrane shaders (shaders for balloons, skin, etc.)
- Kubelka-Munk shaders (which take into account light that penetrates the surface)
- Procedural geometry (compositing modelled meshes with procedural ones, such as spheres, to simulate muscles moving under the skin)
- and of course, whatever else you can dream of. It's a Programmable Pipeline after all! :)
...but, as Mike put it, "it's a brave person who gets into shader programming."
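For a flavour of what shader programming actually involves, here's a rough C++ sketch of one common anisotropic highlight term (a Kajiya-Kay-style formula) - the sort of per-pixel math a shader evaluates directly instead of reading from a baked look-up texture. It isn't HLSL and it isn't from Mike's demos; the names are mine.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// T = surface tangent (e.g. hair or brushed-metal direction), H = half-vector
// between the light and view directions. The highlight stretches along the tangent.
float AnisotropicSpecular(Vec3 tangent, Vec3 halfVector, float shininess) {
    float th = dot(normalize(tangent), normalize(halfVector));
    float sinTH = std::sqrt(std::max(0.0f, 1.0f - th * th));
    return std::pow(sinTH, shininess);
}

int main() {
    Vec3 tangent = { 1.0f, 0.0f, 0.0f };
    Vec3 halfVec = { 0.2f, 0.0f, 0.98f };
    std::printf("specular = %f\n", AnisotropicSpecular(tangent, halfVec, 32.0f));
}
```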
Some of the really cool demos Mike showed us:
The Pre-computed Radiance Transfer engine, which compares pre-computed radiance transfer lighting (computing multiple bounces of light through the scene, from all directions) against the standard lighting equations. You can find it in the DirectX SDK. Look how the light from behind yer man's head glows through his ears, in real time. To put this in perspective, remember them telling us a few years ago how awesome it was when Gollum, in decidedly non-real time, exhibited effects like this?
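If pre-computed radiance transfer is new to you, the core idea is small enough to sketch: the expensive global-illumination work (how light from every direction reaches each vertex, including bounces and shadowing) is baked offline into a per-vertex vector of spherical-harmonic coefficients, and the runtime cost is just a dot product with the light's coefficients. This is a heavily simplified, illustrative C++ sketch, not the SDK sample's code, and the coefficient count is an assumption.

```cpp
#include <cstdio>

constexpr int kNumCoeffs = 9;   // 3rd-order spherical harmonics (assumed)

// transfer[] would come from the offline precompute pass (per vertex);
// light[] is the per-frame projection of the environment light.
float ShadeVertex(const float transfer[kNumCoeffs], const float light[kNumCoeffs]) {
    float result = 0.0f;
    for (int i = 0; i < kNumCoeffs; ++i)
        result += transfer[i] * light[i];   // one dot product per vertex per frame
    return result;
}

int main() {
    float transfer[kNumCoeffs] = { 0.8f, 0.1f, 0.05f, 0.0f, 0.02f, 0.0f, 0.01f, 0.0f, 0.0f };
    float light[kNumCoeffs]    = { 1.0f, 0.3f, 0.2f,  0.0f, 0.1f,  0.0f, 0.0f,  0.0f, 0.0f };
    std::printf("radiance = %f\n", ShadeVertex(transfer, light));
}
```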
The High Dynamic Range lighting demos, using the High Dynamic Range Pipeline demo, also from the DirectX SDK. Check it out - here it is, explained in a forum post by the MVP who built the demo.
The High Dynamic Range Cubemap demos, also from the SDK.
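The key trick in both HDR demos is that lighting values are allowed to go well beyond 1.0 and are only compressed down to displayable range at the end of the frame. Here's a tiny, illustrative C++ sketch of one common tone-mapping step (a Reinhard-style operator); it's my own simplification, not code from the SDK demos.

```cpp
#include <cstdio>

float ToneMap(float hdrLuminance, float exposure) {
    float scaled = hdrLuminance * exposure;   // bright values can be well above 1.0
    return scaled / (1.0f + scaled);          // compress into 0..1 for display
}

int main() {
    // A value of 16.0 might be sunlight glinting off chrome; a low-dynamic-range
    // pipeline would have clamped it to 1.0 long before this point.
    std::printf("%f %f\n", ToneMap(16.0f, 0.5f), ToneMap(0.5f, 0.5f));
}
```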
Skinned animations with deformable meshes, also from the DirectX SDK. I mean, deformable meshes are pretty basic stuff to graphics engine designers, but they're not WPF v1.0 territory.
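For the curious, the deformable-mesh part usually comes down to matrix-palette (linear blend) skinning: each vertex is transformed by a handful of bone transforms and the results are blended by per-vertex weights. The C++ below is an illustrative sketch with the bones reduced to simple translations; real code would use full 4x4 bone matrices.

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

// Bone transforms reduced to translations for brevity. Weights are assumed to sum to 1.0.
Vec3 SkinVertex(Vec3 bindPose, const Vec3 boneOffsets[], const float weights[], int boneCount) {
    Vec3 result = { 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < boneCount; ++i) {
        result.x += weights[i] * (bindPose.x + boneOffsets[i].x);
        result.y += weights[i] * (bindPose.y + boneOffsets[i].y);
        result.z += weights[i] * (bindPose.z + boneOffsets[i].z);
    }
    return result;
}

int main() {
    Vec3 bind = { 1.0f, 0.0f, 0.0f };
    Vec3 bones[2] = { { 0.0f, 0.5f, 0.0f }, { 0.0f, -0.5f, 0.0f } };
    float weights[2] = { 0.75f, 0.25f };
    Vec3 v = SkinVertex(bind, bones, weights, 2);
    std::printf("%f %f %f\n", v.x, v.y, v.z);
}
```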
The Nvidia "Luna" demo, which he ran it real time on his PC. Here is part of it, in a video. But it's only really impressive if you can get it to run on your PC in real-time.
There are days when I'd still give all of this up to go back to somewhere closer to the front lines of graphics and visualization.
Right now I feel as displaced from the front lines as General Sir Anthony Cecil Hogmanay Melchett.
Comments
- Anonymous, November 09, 2006: The comment has been removed.
- Anonymous, November 09, 2006: AQ, that's the spirit :)