
Fight Global Warming with GPU Computing and C++!


News
07-02-11, 07:30 AM
Several weeks ago, Microsoft announced C++ AMP (http://blogs.nvidia.com/2011/06/microsoft-going-all-in-on-gpu-computing/), an extension to C++ for GPU computing. As a C++ enthusiast and GPU architect at NVIDIA, I couldn't be more excited. Visual C++, one of my favorite programming tools, is being updated to work with the parallel processors I help design! I feel this really validates all the hard work we have invested in GPUs over the years.

This week, Microsoft and NVIDIA co-hosted an event where we had the chance to talk to Silicon Valley C++ developers about C++ AMP and CUDA being two sides of the same coin: NVIDIA's CUDA is optimized for high performance while C++ AMP will be optimized for productivity.

We started things off with guest speaker Herb Sutter, the chief native languages architect at Microsoft (pictured above). Herb's a terrific speaker and he made a compelling case for C++ AMP. Earlier in the day, Herb had keenly pointed out that the crushing performance-per-watt advantage of C++ over competing programming languages means that solutions based on C++ are the greenest possible! Hence my headline.

http://blogs.nvidia.com/wp-content/uploads/2011/07/herb-sutter-blog-post-crowd-pic--1024x768.jpg

Now as much as I'll admit that C++ AMP will be sexy, with its elegant syntactical curves, the reality is that the future is here now for CUDA developers. After Herb's talk, a team of NVIDIANs presented the basics of CUDA as well as Thrust (http://code.google.com/p/thrust/), a CUDA library of parallel algorithms that makes programs more concise and human-readable. Eyebrows were raised with interest. People 'got' it. Thrust has this effect on people.
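To give a flavor of why, here's a minimal sketch of the kind of program Thrust keeps concise, closely following the canonical example in Thrust's own documentation: it fills a vector with random integers on the host, copies it to the GPU, and sorts it there with one call. The vector size and the use of rand are just illustrative choices, not something shown at the event.

#include <thrust/host_vector.h>
#include <thrust/device_vector.h>
#include <thrust/generate.h>
#include <thrust/sort.h>
#include <thrust/copy.h>
#include <cstdlib>

int main()
{
    // Generate about a million random integers on the host.
    thrust::host_vector<int> h_vec(1 << 20);
    thrust::generate(h_vec.begin(), h_vec.end(), rand);

    // Assigning to a device_vector transfers the data to the GPU.
    thrust::device_vector<int> d_vec = h_vec;

    // Sort in parallel on the device.
    thrust::sort(d_vec.begin(), d_vec.end());

    // Copy the sorted result back to host memory.
    thrust::copy(d_vec.begin(), d_vec.end(), h_vec.begin());

    return 0;
}

No kernels, no explicit memory management: the device_vector assignment handles the host-to-GPU transfer and thrust::sort dispatches a parallel sort on the device. That conciseness is what raised the eyebrows.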

Developers can't get their hands on C++ AMP quite yet, at least not until Microsoft's BUILD conference this fall, but you can download CUDA and Thrust today at www.nvidia.com/getcuda (http://www.nvidia.com/getcuda).




More... (http://blogs.nvidia.com/2011/07/fight-global-warming-with-gpu-computing-and-c/)