A Resounding Success or an Uncertain Future?
Before I start putting my thoughts down on paper about NVIDIA’s GPU Technology Conference and the direction the industry seems to be heading, allow me to share with you a quote:
“…we expect things like CUDA and CTN will end up in the same interesting footnotes in the history of computing annals – they had great promise and there were a few applications that were able to take advantage of them, but generally an evolutionary compatible computing model, such as we’re proposing with Larrabee, we expect will be the right answer long term.”
Intel’s Pat Gelsinger to CustomPC
July 2nd, 2008
Ironically, Pat isn’t working for Intel anymore and Larrabee seems to be stuck at the glorified tech demo stage, but CUDA continues to march on and is gaining popularity at an almost alarming pace. We have seen applications which utilize the GPU for parallel computing tasks emerging from every corner of what used to be an industry reserved for exorbitantly priced and power-hungry CPU clusters. With more and more companies jumping on board, GPU computing in general seems to be exactly what many industries were looking for. Notably, the statements about this burgeoning popularity and near-limitless potential weren’t made by NVIDIA but rather by representatives we spoke to from the likes of Industrial Light and Magic / Lucasfilm, Autodesk and other industry luminaries. That in itself says a lot about how CUDA has been received. If anything, the GPU Technology Conference proved that no matter what the naysayers have said, CUDA is real, it’s here and it isn’t going anywhere but up in popularity.
One of the most interesting aspects of the GTC was that NVIDIA took a back seat; other than holding a few information sessions on DirectCompute, DX11 and the like, they let the attendees take the wheel and decide where to guide the sessions. This worked out extremely well for them since it portrayed GPU computing as an inclusive medium for programmers, one that is easily accessible to professionals and regular consumers alike. Honestly, before going I had convinced myself that the GTC was doomed to failure, and I don’t even think NVIDIA was prepared for the response they received. They expected around 800 people to attend, not including about 100 attendees from the press, but what they got was 1,620 professionals representing a massive array of companies and government organizations.
While it is amazing to see how far the GPGPU market has come in a mere three years, there are plenty of other technologies NVIDIA has been pushing. With Intel’s Atom making some serious waves in the mobile computing and small form factor markets, NVIDIA’s ION and forthcoming ION2 are poised to ride the Atom’s coattails to serious success. People are looking for efficiency, performance and miniaturization within a PC that they can use on a daily basis and with the Atom / ION combination, they can get exactly that. The Tegra mobile chip is also making some inroads in nearly every area you can think of. Not only will it inspire innovative automotive solutions but we will soon see it in media players (the Zune HD is the first of many), GPS units and a slew of other devices.
As the focus of the industry moves towards Fermi, there are still many questions to be answered, particularly about any GeForce iterations thereof. What NVIDIA needs is a price-competitive solution that is able to outperform ATI’s DX11 cards while offering the most in terms of value. If the benchmarks prove that Fermi is superior to the HD 5800-series, then NVIDIA will once again be on firm footing from a gamer’s point of view.
The majority of people reading Hardware Canucks right now are gamers and enthusiasts who use their graphics cards for playing games. Unfortunately, we sometimes have a narrow view of the GPU world that centers around review graphs and sky-high framerates but nowadays, there is more to the industry than that. We have to remember that even though NVIDIA is thinking outside of its typical gaming markets, the technologies it is developing within CUDA can and will have positive implications for the gaming community as well. In NVIDIA’s world, the role of the GPU hasn’t changed. Rather, it has evolved into a tool that can be used to not only show worlds of fantasy but also help unravel medical mysteries or be used in a studio to visualize a movie's special effects.
The fact of the matter remains that last year NVIDIA lost a small fortune, their stock value plummeted and their core markets are under constant attack from competitors. They need to turn things around in record time, and while the resounding success of the GTC provides a ray of hope for the future, convincing developers to use NVIDIA’s platforms is just the first skirmish in what looks to be a battle for survival. To be honest though, if the technologies and programs associated with CUDA help to save even a single life – be it through quicker, more effective mammograms, a 3D image of the heart or the prediction of a levee break – then in my opinion, all the naysayers can eat their words. Why? Because NVIDIA will have proven that their formula for a GPGPU ecosystem really does work and can be used to help humanity as a whole instead of having GPUs relegated to just rendering pretty games.