Parallel Extensions CTP and the Parallel Computing Developer Center [Judd Hall]
The CLR Team has been working with the Parallel Computing Platform Team for the past year on some innovative ideas in parallel computing. Yesterday, the Parallel Computing Platform Team announced the Parallel Computing Developer Center along with their first Community Technology Preview (CTP) of Parallel Extensions to the .NET Framework. We encourage you to download this early release CTP and provide feedback so that we can grow this technology together.
Parallel Extensions is a managed programming model for data parallelism, task parallelism, and coordination on parallel hardware, unified by a common work scheduler. As such, it makes it easier for you to write programs that scale to take advantage of parallel hardware, providing improved performance as the number of cores and processors increases, without having to deal with many of the complexities of today’s concurrent programming models. It does so via library-based support for introducing concurrency into applications written with any .NET language, including but not limited to C# and Visual Basic.
Two major components of Parallel Extensions are the Task Parallel Library (TPL) and Parallel LINQ (PLINQ), an extension of the Language Integrated Query (LINQ) technology introduced in .NET Framework 3.5. As such, the CTP requires the .NET Framework 3.5.
With TPL, you get the concepts of Tasks, Futures, and parallel loops, for starters. So you can take the following sequential loop:
for (int i = 0; i < 100; i++) {
    a[i] = a[i] * a[i];
}
And make it scalable across all the processors available:
Parallel.For(0, 100, delegate(int i) {
    a[i] = a[i] * a[i];
});
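Parallel loops are only part of TPL; the Tasks and Futures mentioned above can be used directly as well. The snippet below is a rough sketch only: it assumes the Task.Create and Future.Create factory methods and the System.Threading.Tasks namespace as described for the CTP, and the exact API surface may differ between previews.
using System;
using System.Threading.Tasks; // assumed CTP namespace for Task and Future<T>

class TaskAndFutureSketch
{
    static void Main()
    {
        // A Task is a unit of work handed off to the shared work-stealing scheduler.
        Task t = Task.Create(delegate { Console.WriteLine("work item running"); });

        // A Future<T> is a task that computes a value; reading Value waits for
        // the computation to finish if it has not completed yet.
        Future<int> sum = Future.Create(() => 1 + 2 + 3);

        t.Wait();                     // block until the plain task completes
        Console.WriteLine(sum.Value); // prints 6
    }
}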
Similarly, with PLINQ, you get a query execution engine that accepts any LINQ-to-Objects or LINQ-to-XML query and automatically utilizes multiple processors or cores for execution when they are available. As such, you can take a simple LINQ query:
IEnumerable<T> data = ...;
var q = data.Where(x => p(x)).OrderBy(x => k(x)).Select(x => f(x));
foreach (var e in q) a(e);
And scale it:
IEnumerable<T> data = ...;
var q = data.AsParallel().Where(x => p(x)).OrderBy(x => k(x)).Select(x => f(x));
foreach (var e in q) a(e);
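To make that concrete, here is a small self-contained version of the same query shape, with hypothetical stand-ins for the predicate p, key selector k, projection f, and per-element action a (this assumes the CTP exposes its AsParallel() extension method under System.Linq):
using System;
using System.Collections.Generic;
using System.Linq; // AsParallel() is assumed to come from the Parallel Extensions CTP assembly under System.Linq

class PlinqSketch
{
    static void Main()
    {
        IEnumerable<int> data = Enumerable.Range(1, 1000);

        var q = data.AsParallel()
                    .Where(x => x % 7 == 0) // p: keep multiples of 7
                    .OrderBy(x => x)        // k: sort ascending
                    .Select(x => x * x);    // f: square each element

        foreach (var e in q)
            Console.WriteLine(e);           // a: per-element action
    }
}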
And behind it all is a work-stealing task scheduler that reduces thread starvation and interleaves TPL tasks with PLINQ queries on the fly.
There are limitations, of course, mostly related to making sure your parallel operations are independent, and there are known correctness bugs. It's worth checking out the extensive documentation posted on the Parallel Computing Developer Center, where you will find links to articles and videos, and tons of samples.
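To make the independence requirement concrete, here is a hypothetical loop whose body is not independent; every iteration writes the same shared variable, so handing it to Parallel.For as-is introduces a data race and a nondeterministic result:
int[] a = new int[100]; // hypothetical shared input
int sum = 0;

// Not safe to parallelize as-is: concurrent iterations read and write 'sum'
// without synchronization, so updates can be lost.
Parallel.For(0, 100, delegate(int i) {
    sum += a[i];
});
Keeping each iteration to its own elements, as in the earlier squaring example, is what makes a loop safe to hand to Parallel.For.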
Links:
- Parallel Computing Developer Center
- Parallel Extensions to the .NET Framework CTP download
- Announcement on Soma’s Weblog
Comments
Anonymous
December 04, 2007
Great job you guys! I'm going to download this right now since I have this new PC with 4 processors here to see what cool things I can build here...
Anonymous
December 05, 2007
I watched the Channel 9 video on this. The guys I work with and I have been anticipating something like this for a while now, and it sounds like MS has been thinking about it too. So I'm excited to hear about this. However, it sounds like tasks will have to be completely independent, without any type of scheduling based on resource usage. I have always hoped for something that allows tasks to define other "dependent" tasks that must complete first. This would allow the tasks to be scheduled dynamically in such a way as to guarantee some order of operation (and allow the developer to make some assumptions about the state of shared resources), but still allow concurrency. Maybe this is it, but it sounded like shared state was out of the question, and the closing attitude was that tasks will have to be completely independent and developers will just have to change the way they do things. I always think about what I've read about the internals of how database engines perform scheduling, and it just seems like they are way ahead of the game on making concurrency with shared state easy.