Nvidia announced today the GeForce FX, a new graphics architecture that the company said delivers “cinematic-quality” graphics and special effects to computers in real time.
The 500MHz graphics processing unit can calculate 375 million programmable vertices per second, 4 billion pixels per second, and 16 billion anti-aliased samples per second.
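As a back-of-envelope check of how those throughput figures relate to the 500MHz clock (our own arithmetic, not figures Nvidia published), the claims work out to under one vertex, eight pixels, and four anti-aliased samples per pixel per clock cycle:

```python
clock_hz = 500e6            # 500MHz GPU core clock
vertices_per_s = 375e6      # programmable vertices per second
pixels_per_s = 4e9          # pixel fill rate
aa_samples_per_s = 16e9     # anti-aliased samples per second

print(vertices_per_s / clock_hz)        # 0.75 vertices per clock
print(pixels_per_s / clock_hz)          # 8.0 pixels per clock
print(aa_samples_per_s / pixels_per_s)  # 4.0 AA samples per pixel
```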
As a point of comparison, the GeForce FX is significantly faster than Nvidia’s current top-of-the-line consumer processor, the GeForce4 Ti 4600, which calculates about 136 million vertices per second and has a fill rate of about 4.8 billion AA samples per second.
Nvidia president and CEO Jen-Hsun Huang said that the new processor has the ability to “bring character emotion to life” through a new type of expression that Nvidia calls “cinematic computing.”
The core component in this technology is the CineFX engine, which Nvidia said provides more powerful cinematic visual effects through the use of pixel and vertex shaders. Pixel and vertex shading isn’t new; current and past generations of top-end graphics hardware from Nvidia and competitor ATI have long supported the technology, but it has been complicated to use and largely limited to a select few developers of games and 3D graphics products.
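For readers unfamiliar with the idea, a vertex shader is simply a small program the hardware runs once per vertex of a 3D model. The toy Python sketch below is not real shader code (actual shaders are written in specialized hardware shading languages), but it illustrates the per-vertex programming model:

```python
import math

def vertex_shader(pos, time):
    # Toy "vertex shader": displace each vertex along y with a sine wave,
    # the kind of small per-vertex program the GPU runs in parallel to
    # animate or deform geometry without CPU involvement.
    x, y, z = pos
    return (x, y + 0.1 * math.sin(x * 4.0 + time), z)

# A trivial three-vertex "mesh"; real models have thousands of vertices.
mesh = [(0.0, 0.0, 0.0), (0.5, 0.0, 0.0), (1.0, 0.0, 0.0)]
deformed = [vertex_shader(v, time=0.0) for v in mesh]
```

The point of hardware like the GeForce FX is that programs of this sort run for millions of vertices per frame, directly on the graphics chip.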
Nvidia said that CineFX exposes additional capabilities by eliminating “many programming barriers” associated with the use of these detailed technologies. Nvidia has posted and linked various examples of the CineFX engine at work, displaying advanced effects, more detailed surface properties and lighting, and other capabilities associated with the technology.
Other technical aspects of the GeForce FX include support for 1GHz DDR2 memory (the fastest frame buffer ever designed, according to Nvidia) and complete support for the AGP 8x specification. The GeForce FX sports nearly twice as many transistors as the GeForce4 series, and it is built on a 0.13-micron copper manufacturing process.
Although much of today’s emphasis is on the GeForce FX’s support for Microsoft’s DirectX technology (Nvidia is also a manufacturing partner in Microsoft’s Xbox video game console operations), Nvidia noted that the CineFX engine that powers the GeForce FX implements the OpenGL specifications as well, and that the GeForce FX supports Nvidia’s Unified Driver Architecture (UDA). Both points are significant for Macintosh users, as they pave the way for the GeForce FX to be used on other computing platforms such as the Macintosh. Apple has not announced support for the new hardware at this time, and it generally does not discuss plans for new products prior to an actual announcement.
The new GeForce FX offers true 128-bit color with 32-bit floating point components for red, green, blue and alpha values, according to Nvidia. It also has “Intellisample,” a new form of anti-aliasing technology that provides gamma-adjusted anti-aliasing and adaptive anisotropic filtering. Anti-aliasing technology removes jagged lines from the edges of polygonal objects rendered in real-time 3D environments like games.
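To see why floating-point channels matter, the small Python comparison below (our own illustration, not Nvidia code) shows how traditional 8-bit integer channels collapse nearby shades that 32-bit float channels keep distinct:

```python
import struct

def to_8bit(v):
    # Classic 32-bit color: each of R, G, B, A quantized to an 8-bit integer,
    # giving only 256 distinct levels per channel.
    return int(v * 255 + 0.5)

def to_float32(v):
    # 128-bit color a la GeForce FX: each channel stored as a 32-bit float
    # (round-tripped through IEEE 754 single precision here).
    return struct.unpack('f', struct.pack('f', v))[0]

a, b = 0.3001, 0.3019
print(to_8bit(a), to_8bit(b))          # both quantize to the same 8-bit code
print(to_float32(a) == to_float32(b))  # False: float channels stay distinct
```

Repeated rendering passes compound this: integer channels accumulate visible banding, while floating-point channels preserve subtle gradations, which is central to Nvidia’s “cinematic” pitch.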
Nvidia Corp. said that it’s currently sampling the GeForce FX to its add-in card partners and original equipment manufacturer (OEM) partners. It anticipates that retail graphics cards based on the new chip will hit store shelves in February 2003.