The Unofficial Opie & Anthony Message Board


The Unofficial Opie & Anthony Message Board - GeForce 3.......mmmmmmmm!!


Displaying 1-3 of 3 messages in this thread.
Posted By | Discussion Topic: GeForce 3.......mmmmmmmm!!
Jafa Car Service
G.O.O.F.B.A.H.G.S.
Gyroscope Research Division
posted on 02-28-2001 @ 3:41 PM      
Psychopath
Registered: Oct. 00
I'm drooling over this new card even though it's very expensive. If you have the money and need a new card, here's some info.

To no one's surprise, nVidia continues on its six-month release cycle with the unsurprisingly named GeForce3. That said, the differences between this card and the previous models in the GeForce line are plentiful, although like the last entry, the GeForce2 Ultra, this new card will require a thick wallet. Let's see what $500 (approx.) will buy. There's a lot more here than a speed upgrade.
First, the numbers. The original TNT chip had seven million transistors, the TNT2 had 15 million and the GeForce2 had 25 million. The GeForce3 chip has 57 million transistors capable of pounding through 800 billion operations per second, making it the most complex processor in the world. (The Pentium 4 chip has 42 million transistors.) There's a reason that sounds more like a CPU statistic: The GeForce3 is the first fully programmable GPU.

What does this mean for games? While the traditional ways of transforming, lighting and rendering are all on the chip, game programmers can also use the nfiniteFX engine to literally program their own methods of handling these operations. What's unique about these programs is that they do not use the computer's processor to run -- they are actually processed by the GeForce3, leaving the CPU to take care of housekeeping matters like physics and gameplay. So, for example, a programmer can download a polygonal model onto the GeForce3's memory and then write programs that run on the GPU to transform and light that model in real time.
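To make the idea concrete, here's a rough sketch in plain Python of the kind of per-vertex work described above. (This is purely an illustration of the concept; a real GeForce3 vertex program would be written in DirectX 8 shader assembly and run on the GPU, and the function names here are made up.)

```python
# Illustration: the transform-and-light work a vertex program performs,
# expressed in ordinary Python. On the GeForce3 this per-vertex math is
# what moves off the CPU and onto the GPU.

def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def transform_and_light(vertex, normal, matrix, light_dir):
    """Transform a vertex into place and compute simple diffuse
    lighting from the angle between its normal and the light (N . L)."""
    pos = mat_vec(matrix, vertex)
    n = mat_vec(matrix, normal)
    # Clamp at zero: surfaces facing away from the light stay dark.
    diffuse = max(0.0, sum(n[i] * light_dir[i] for i in range(3)))
    return pos, diffuse

# A vertex on the x-axis, facing a light shining straight down the z-axis:
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
pos, lit = transform_and_light([1.0, 0.0, 0.0], [0.0, 0.0, 1.0],
                               identity, [0.0, 0.0, 1.0])
```

Run once per vertex, per frame, this is exactly the sort of inner loop that used to eat CPU time and now runs as a downloaded program on the GPU.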

The movies we've included below all use this method. The main models are not motion-captured or key-framed; rather, they're manipulated by programs running on the GPU, and their lighting and rendering is all handled in real time solely by the video card. For games, this means that the video card can be utilized in ways that were not possible before. Instead of having to rely on the CPU to calculate animations, shadows, lights, etc., all of that information can be handled in software running on the GeForce GPU. The programs that do this can all be user-created, and nVidia also offers hundreds (if not thousands) of routines that are free to developers. This means that developers can literally drag-and-drop a specific procedure into their own code without having to figure it out on their own. Very cool.

Additional new features include HRAA, or High-Resolution Anti-Aliasing. Through a new multisampling technique, the GeForce3 is able to generate anti-aliasing at four times the speed of the GeForce2 Ultra. (It's actually a new mode of anti-aliasing called Quincunx, a fancy word for the pattern of five dots found on one face of a six-sided die.) What this translates into is a game like Q3A running at 1024x768x32 with 4x anti-aliasing on and getting 70 frames per second.
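The five-dot pattern works like this: each output pixel blends its own center sample with four corner samples it shares with neighboring pixels. Here's a toy sketch of that filtering step (the weights below are my own assumption for illustration; the article doesn't give nVidia's exact filter kernel):

```python
# Toy sketch of Quincunx-style filtering: one center sample plus four
# corner samples shared with neighbors -- the five-dot die pattern.

def quincunx_filter(center, corners, center_weight=0.5):
    """Blend a pixel's center sample with its four corner samples.

    center_weight=0.5, with the remainder split evenly across the
    corners, is an assumed kernel for illustration only.
    """
    corner_weight = (1.0 - center_weight) / 4.0
    return center * center_weight + sum(corners) * corner_weight

# A bright center pixel whose corner samples land on dark neighbors
# gets pulled toward the middle, smoothing the jagged edge:
print(quincunx_filter(1.0, [0.0, 0.0, 0.0, 0.0]))  # -> 0.5
```

Because the corner samples are shared between adjacent pixels, you get five-tap filtering for roughly the memory cost of two samples per pixel, which is where the claimed speed advantage comes from.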

Another important feature is the Crossbar Memory Controller. With this technology, even the pipelines between the GPU and the video card's RAM are programmable, meaning each can separately handle different data in parallel, or they can all be used at once on the same function. For example, in an area with heavy lighting and shading, three pipelines may be dedicated to handling the proper shadows and coloring, while the other pipeline can handle the static backgrounds. This technology also incorporates Z-buffer compression, meaning that scene depth can go further without having to rely on fog to cover pop-up.
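A toy model (my own illustration, not nVidia's actual design) shows why several narrow, independent memory controllers can beat one wide one when requests are small:

```python
# Toy model: one wide memory controller vs. four narrow independent ones.
# Small requests waste most of a wide controller's bandwidth; a crossbar
# of narrow controllers can service several of them in parallel.

import math

def cycles_single_wide(requests, width=32):
    """One 256-bit (32-byte) controller: each request occupies the whole
    bus for ceil(size/width) cycles, however small it is."""
    return sum(math.ceil(size / width) for size in requests)

def cycles_crossbar(requests, width=8, channels=4):
    """Four 64-bit (8-byte) controllers: requests are dealt round-robin
    to the channels; total time is whichever channel is busiest."""
    busy = [0] * channels
    for i, size in enumerate(requests):
        busy[i % channels] += math.ceil(size / width)
    return max(busy)

# Eight small 8-byte requests (think isolated texel or Z-buffer reads):
reqs = [8] * 8
print(cycles_single_wide(reqs))  # 8 cycles, most of each fetch wasted
print(cycles_crossbar(reqs))     # 2 cycles, four requests per cycle
```

The round-robin scheduling here is a stand-in for whatever arbitration the real hardware uses; the point is simply that independent channels keep small transfers from serializing behind one another.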

It's all about programmability with the GeForce3. Since the entire GPU is programmable, programmers are free to come up with any way they want to utilize this technology. Where one programmer may use the standard methods provided by nVidia, another may come up with a completely new, and perhaps more efficient, method of accomplishing the same thing. Additionally, since the GPU processes all the more traditional methods of handling transformation, lighting and rendering, the Detonator drivers are compatible across the board. All that remains to be seen is how programmers take advantage of this new technology.



Some people ask me, "Why don't your sig pics ever have anything to do with your name?!" And to them I say, "Go F yourself, you ass-f tard!"
hornygoatweed23
I've Got A Vagina With Teeth.
G.O.O.F.B.A.H.G.S.
Dragoon Battalion
My friends call me Weed
posted on 02-28-2001 @ 5:10 PM      
O&A Board Regular
Registered: Jan. 01
I think the estimated street price on this bad boy is supposed to start at $600, then drop down to around $400 - $450 - damn, you can buy some machines for that price. I don't think I can even look at this card until I up my CPU (PII 400) - but at the moment, I'm lovin' my Herc GF2MX; so far, it's handled everything I've thrown at it.

I watched a QT clip on the web about Apple introducing this card in its next iMac line (I think), and they previewed Doom 3 running on the GF3 - in a word....SWEEEET.




"Groupsex - it does every-body good"
Proudly Adopted by Doughboy
CrackSweat
posted on 03-04-2001 @ 12:24 AM      
Psychopath
Registered: Oct. 00
Besides this new GeForce3, what's the next best card out there now for about $300 to $400? I need a new card - the card I have sucks. I got Nascar 4 and it's running slow and shitty.



