Graphics chip maker Nvidia today took the wraps off Cg. Described as “C for graphics,” Cg is a new high-level graphics programming language being promoted not just as a solution for Nvidia products but also for comparable hardware from other companies. It’s fully cross-platform, too. Recently MacCentral spoke with Nvidia about the new technology.
Cg is intended to ease development of applications that leverage underutilized capabilities of Nvidia’s hardware, and while games obviously stand to benefit from Cg, they’re hardly the only application. The news immediately drew positive reactions from prominent game programmers, cinema technologists and many others in fields that could take advantage of the new technology.
“Cg makes it much easier for developers to create cinematic style graphics and special effects,” Jim Black of Nvidia developer marketing told MacCentral.
Nvidia’s current and future graphics technology supports vertex and pixel shaders. When utilized, such technology yields remarkably realistic lighting and shading effects in 3D games and other 3D applications, but at present only a small minority of programmers use them. The barrier to entry is how such hardware must be programmed: in assembly code, an extremely low-level programming language that, while bringing the programmer close to the hardware, isn’t something most game developers feel comfortable using on a daily basis.
“There are a finite number of people on the planet that can operate at that level of programming experience. Not only does Cg expand the base of people who can program shaders, but it offers much easier debugging for developers who already program in assembly,” said Black.
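To give a sense of what that shift looks like, here is a minimal Cg vertex shader sketched in the style of Nvidia’s published examples (the structure and parameter names are illustrative, not taken from the toolkit itself). Where the equivalent assembly would shuffle values between numbered registers, the Cg version reads like ordinary C:

```cg
// Illustrative minimal Cg vertex shader: transforms each vertex by a
// model-view-projection matrix and passes its color through unchanged.

struct appdata {             // per-vertex input from the application
    float4 position : POSITION;
    float4 color    : COLOR0;
};

struct vfconn {              // output handed to the rasterizer
    float4 HPos : POSITION;
    float4 Col0 : COLOR0;
};

vfconn main(appdata IN, uniform float4x4 ModelViewProj)
{
    vfconn OUT;
    OUT.HPos = mul(ModelViewProj, IN.position);  // object space -> clip space
    OUT.Col0 = IN.color;                         // pass vertex color through
    return OUT;
}
```

The semantics after the colons (POSITION, COLOR0) tell the compiler which hardware registers to bind each value to, so the programmer never touches registers directly.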
Black also explained that while Nvidia’s own Cg compiler is heavily optimized for the company’s hardware, Nvidia is offering an open source version that can be optimized to work with other manufacturers’ graphics hardware as well. ATI, for example, produces Radeon graphics chips that sport programmable vertex and pixel shader capabilities.
The Cg Toolkit itself is expected to be released for download from Nvidia’s Web site beginning today. It’ll also be available to European developers attending “The Gathering 2” conference in London today and tomorrow.
The initial release of the toolkit, a beta version, includes the compiler, a prototyping and visualization environment called Cg Browser, the Cg Standard Library, and a set of pre-written shader examples.
Just as Nvidia’s drivers are forward and backward compatible thanks to the company’s Unified Driver Architecture (UDA), Cg employs a Unified Compiler Architecture that makes it possible for programs to run on future or past generations of Cg-compatible graphics processing units (GPUs), optimized at run time for the specific GPU in the system.
This means that developers who use Cg won’t have to manually optimize their code for each generation of GPU that’s released. Nvidia vice president of marketing Dan Vivoli said, “Because we do the heavy lifting with our Cg Compiler, developers can spend more time on the creative side of game development.”
So what’s the significance of this for Mac users and Mac developers?
“Cg is inherently a lingua franca,” explained Black. “The same shader you write will run on a variety of operating systems. It will also compile to different APIs, like DirectX and OpenGL.”
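In practice, that cross-API story means one source file compiled once per target. A rough sketch, assuming the toolkit’s command-line compiler (cgc) and profile names along the lines of those in Nvidia’s documentation:

```
# Compile one Cg shader to DirectX vertex shader assembly...
cgc -profile vs_1_1 simple.cg

# ...or, from the same source, to an OpenGL ARB vertex program.
cgc -profile arbvp1 simple.cg
```

The shader source stays the same; only the profile flag changes to select the target API and hardware generation.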
And while Nvidia makes it clear that it has worked closely with Microsoft in developing the new language to support features in DirectX 9.0, Black added that Nvidia and Apple are working closely together to make sure that Mac OS X benefits from Cg as well.
“When Nvidia first shipped the GeForce2 MX, we committed to supporting the Macintosh with all future GPUs,” said Black. “This announcement extends our commitment to software as well.”