#11
The real key is in getting decent drivers. This is why nVidia took over the graphics world, not by their hardware. nVidia managed to get fast and *stable* drivers out for all of their products while 3dfx and ATI were floundering with buggy drivers that were missing features and had either very poor or, at best, uneven performance. ATI has since learned from their mistakes and really improved the quality of their drivers, but they are about the only one.

3DFX drivers were excellent - I never had to fiddle with anything to get my Voodoo 3 working with any game. The reason nVidia took over was that the GeForce performance was so far above V3 performance, although there was nothing wrong with the drivers either. 3DFX's attempts to get back on par for performance never went anywhere, while nVidia raised the bar again by a huge amount with the GF2.
#12
#13
On Sun, 28 Sep 2003 19:21:19 +0100, "Andy Cunningham" wrote:

> The real key is in getting decent drivers. This is why nVidia took over
> the graphics world, not by their hardware. nVidia managed to get fast and
> *stable* drivers out for all of their products while 3dfx and ATI were
> floundering with buggy drivers that were missing features and had either
> very poor or, at best, uneven performance. ATI has since learned from
> their mistakes and really improved the quality of their drivers, but they
> are about the only one. 3DFX drivers were excellent - I never had to
> fiddle with anything to get my Voodoo 3 working with any game.

By the time that the Voodoo 3 came out it was rapidly becoming too late for 3DFX. They really blew it with crappy drivers on their Voodoo Rush chipset and then the Voodoo Banshee after that. 3DFX's drivers also always tended to offer quite poor performance unless you happened to be playing a Glide game or one that would work with their "Mini-GL" driver (i.e. their Quake driver). FWIW the "buggy driver" syndrome was more a problem for ATI; 3DFX had more problems with poor driver performance in OpenGL and DirectX, as well as some missing features (though the latter was one part hardware, one part software).

> The reason nVidia took over was that the GeForce performance was so far
> above V3 performance, although there was nothing wrong with the drivers
> either. 3DFX's attempts to get back on par for performance never went
> anywhere, while nVidia raised the bar again by a huge amount with the GF2.

By that point in time their drivers might not have been buggy, but they were often offering rather poor performance compared to what the hardware was theoretically capable of.

-------------
Tony Hill
hilla underscore 20 at yahoo dot ca
#14
wrote in message ...

> In comp.sys.ibm.pc.hardware.video, graphics processing unit writes:
>> Personally, I am most excited about the Volari V8 Duo - the first
>> *consumer* graphics card configuration to sport twin Graphics
>> Processing Units.
>
> The Voodoo5 5500 in my machine has got 2 VSA-100 units... If that isn't a
> GPU, then what is it? Drivers are working properly under any Windows OS
> (right now using Windows 2000 Pro)... There is one thing that nobody will
> beat soon... :) Voodoo5 6000... Or, to put it another way - 4 chips on one
> board... But, shhhhhh... :) I screwed one chip, so it isn't working
> properly... :)) And, the ATI Rage Fury MAXX had 2 Rage128Pro chips
> (IIRC)... But, problematic drivers...

Ok, this post is sort of for you, and for Tony, or anyone who doesn't really draw the line between a rasterizer / 3D accelerator like the 3Dfx Voodoo 1/2/3, Banshee, VSA-100, PowerVR Series 1/2/3, Riva 128, TNT1/2, Rage128, Rage Fury etc., and a full-on 'graphics processor' or GPU or polygon processor chipset (GeForce 1-4, GFFX, all the Radeons, Lockheed Real3D series, 3DLabs GLINT+Delta, Evans & Sutherland RealIMAGE, 3DLabs Wildcat, etc.).

What I am posting below is a very good (IMHO) post from 1996 from a guy who explained the differences (and made a distinction) between Voodoo Graphics or similar consumer 3D accelerators/rasterizers of the time, and full 3D polygon processors (the equivalent of today's GPUs) with geometry engines/processors - like Lockheed's non-consumer Real3D/100, which was a true 'graphics processor' chipset (not the horrible consumer Intel/R3D i740 used in Starfighter cards, which had not yet been revealed in 1996). At that time, there were NO consumer PC 3D chips with geometry processing / T&L. In other words, there were no consumer GPUs in 1996 - not until 1999's GeForce256. This post really points out the differences quite well. Alright, without further rambling on my part, here is the post:

http://groups.google.com/groups?selm...&output=gplain

[quote]
"First, let me start off by saying I am going to be buying a Voodoo card. For low-end consumer-grade flight sims and such, the Voodoo looks like about the best thing available.

Second, I am not necessarily responding to just you, because there seems to be a hell of a lot of confusion about Lockheed Martin's graphics accelerators. I have been seeing posts all over the place confusing the R3D/100 with the AGP/INTEL project that L.M. is working on. The R3D/100 is *NOT* the chipset that is being developed for the AGP/INTEL partnership.

However, since your inference is that the Voodoo is faster than the R3D/100, I have to say that you are totally dead wrong. While the specs say that the Voodoo is *capable* of rendering a higher number of pixels per second, or the same number of polygons per second as the R3D/100, the specs fail to mention that these are not real-world performance figures and you probably will not ever see the kind of performance that 3Dfx claims to be able to achieve. This does *not* mean that the Voodoo is not a good card (it's great, actually), just that the game-based 3D accelerator companies (all of them) don't tell you the whole story.

The Voodoo uses a polygon raster processor. This accelerates line and polygon drawing, rendering, and texture mapping, but does not accelerate geometry processing (i.e. vertex transformation like rotate and scale). Geometry processing is left to the host CPU on the Voodoo, as well as on every other consumer (read: game) grade 3D accelerator.

Because the CPU must handle the geometry transforms and such, you will never see anything near what 3Dfx, Rendition, or any of the other manufacturers claim until CPUs get significantly faster (by at least an order of magnitude). The 3D accelerator actually has to wait for the CPU to finish processing before it can do its thing. I have yet to see any of the manufacturers post what CPU was plugged into their accelerator, and what percentage of CPU bandwidth was being used to produce the numbers that they claim. You can bet that if it was done on a Pentium 200, the only task the CPU was handling was rendering the 3D model that they were benchmarking. For a game, rendering is only part of the CPU load. The CPU has to handle flight modelling, enemy AI, environmental variables, weapons modelling, damage modelling, sound, etc, etc.

The R3D includes both the raster accelerator (see above) and a 100 MFLOP geometry processing engine. Read that last line again. All geometry processing is offloaded from the system CPU onto the R3D floating-point processor, allowing the CPU to handle more important tasks. The Voodoo does not have this, and if it were to add a geometry processor, you would have to more than double the price of the card.

The R3D also allows for up to 8M of texture memory (handled by a separate texture processor), which allows not only 24-bit texture maps (RGB), but also 32-bit maps (RGBA), the additional 8 bits being used for 256-level transparency (alpha). An additional 10M can be used for frame buffer memory, and 5M more for depth buffering.

There are pages and pages of specs on the R3D/100 that show that, in the end, it is a better card than the Voodoo and other consumer accelerator cards, but I guess the correct question is: for what? If the models in your scene are fairly low-detail (as almost all games are - even the real CPU pigs like Back to Baghdad), then the R3D would be of little added benefit over something like the Voodoo. However, when you are doing scenes with 2x+ more polys than your typical 3D game, the R3D really shines. The R3D is and always was designed for mid- to high-end professional applications, where the R3D/1000 (much, much faster than the 100) would be too expensive, or just plain overkill. I've seen the 1000 and I have to say that it rocks! I had to wipe the drool from my chin after seeing it at Siggraph. (We're talking military-grade simulation equipment there, boys, both in performance and price!)

Now then, as I mentioned before, I'm going to be buying the Voodoo for my home system, where I would mostly be playing games. But I am looking at the R3D for use in professional 3D applications. More comparable 3D accelerators would not be the Voodoo or Rendition-based genre, but more along the lines of high-end GLINT-based boards containing Delta geometry accelerator chips (and I don't mean the low-end game-based GLINT chips, or even the Permedia for that matter), or possibly the next line from Symmetric (Glyder series), or Intergraph's new professional accelerator series."
[unquote]

Ahem, I apologize for making a really huge deal out of this. I am not trying to be anal or trying to flame anyone, just pointing out something that is quite significant IMHO, and significant to most people who work with 3D graphics. (I don't myself.)

I feel that person's post is right in line with my thinking as far as making a distinction between rasterizers / 3D accelerators, which only tackle part of the rendering pipeline (leaving the rest for the CPU), and full polygon processors with geometry & lighting onboard, aka 'GPUs'.
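To make the quoted post concrete in code: below is a minimal sketch in C of the per-vertex "geometry processing" stage it describes. This is my own illustration, not anything from the 1996 post or from any real driver API - every name and the 640x480 viewport are made up. The point is that on a Voodoo-class rasterizer the host CPU ran this loop for every vertex of every frame, and only the finished screen-space triangles went to the card; a GeForce-class GPU moved the loop onto the graphics chip.

#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

/* 4x4 row-major transform matrix (a combined model-view-projection). */
typedef struct { float m[4][4]; } Mat4;

/* Transform one object-space vertex to clip space: (x,y,z,w) = M * v. */
static Vec3 transform(const Mat4 *m, Vec3 v, float *w_out)
{
    Vec3 r;
    r.x    = m->m[0][0]*v.x + m->m[0][1]*v.y + m->m[0][2]*v.z + m->m[0][3];
    r.y    = m->m[1][0]*v.x + m->m[1][1]*v.y + m->m[1][2]*v.z + m->m[1][3];
    r.z    = m->m[2][0]*v.x + m->m[2][1]*v.y + m->m[2][2]*v.z + m->m[2][3];
    *w_out = m->m[3][0]*v.x + m->m[3][1]*v.y + m->m[3][2]*v.z + m->m[3][3];
    return r;
}

/* The "geometry processing" stage: transform, perspective divide, and
 * viewport mapping. This is the work a hardware T&L unit (GeForce256
 * onwards) takes off the CPU. */
static void geometry_stage(const Mat4 *mvp, const Vec3 *in, Vec3 *out,
                           int n, int width, int height)
{
    for (int i = 0; i < n; i++) {
        float w;
        Vec3 clip = transform(mvp, in[i], &w);
        out[i].x = (clip.x / w * 0.5f + 0.5f) * (float)width;
        out[i].y = (clip.y / w * 0.5f + 0.5f) * (float)height;
        out[i].z =  clip.z / w;  /* depth value for the card's Z-buffer */
    }
    /* On a 1996-era card, `out` is what the rasterizer was fed (e.g. as
     * screen-space triangles through Glide or a mini-GL driver). */
}

int main(void)
{
    /* Identity MVP just to exercise the loop; a real game would rebuild
     * a perspective * view * model matrix here every frame. */
    Mat4 mvp = {{{1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {0,0,0,1}}};
    Vec3 tri[3] = {{-0.5f,-0.5f,0}, {0.5f,-0.5f,0}, {0,0.5f,0}};
    Vec3 screen[3];

    geometry_stage(&mvp, tri, screen, 3, 640, 480);
    for (int i = 0; i < 3; i++)
        printf("v%d -> (%.1f, %.1f)\n", i, screen[i].x, screen[i].y);
    return 0;
}

Multiply that loop by tens of thousands of vertices per frame, on top of AI, physics and sound, and the 1996 poster's point about CPU bandwidth becomes obvious.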
#15
> Actually I don't just use the term 'GPU' as Nvidia uses it. To myself and
> to many who use graphics processors, something that takes the geometry
> processing load OFF the CPU, putting it on the graphics chip - that's a
> 'graphics processor', or graphics processing unit / GPU as Nvidia coined
> it. The 3Dfx Voodoo chips, including the VSA-100s used in the Voodoo5 5500
> and 6000, did NOT do that at all. Neither did any of the pre-Radeon ATI
> chips, including the dual Rage Fury chips on the MAXX card, nor basically
> any consumer 3D PC chip before the GeForce256. Any graphics chip that
> lacks what used to be called 'geometry processing' - what was commonly
> called T&L in late 1999 when the GeForce came out, and is now called
> vertex shading - is usually considered a 3D accelerator or rasterizer,
> rather than a complete 'graphics processor' or GPU. At least that is the
> way I have understood things for a long time.

Well. By your explanation, a GPU does all processing itself. My old GF2 card didn't support the DX8 hardware shaders, so I guess it stopped being a GPU. Now I have an actual GPU card (GF3), but I guess since it doesn't support DX9 hardware shaders, I can't call it a GPU either.
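The fixed-function-vs-shader quibble above can be put in code too. Here is a small sketch of my own in C (no real graphics API is being imitated; every name is invented) of why "hardware T&L" and "vertex shader" describe the same pipeline slot: a GeForce256-class part ran one hard-wired per-vertex routine, while a DX8-class part lets you supply your own program for that slot.

#include <math.h>
#include <stdio.h>

typedef struct { float pos[3]; float color[3]; } Vertex;

/* The per-vertex pipeline slot: whatever runs here is the "geometry
 * processing" the thread is arguing about. */
typedef Vertex (*VertexStage)(Vertex v);

/* Fixed-function T&L: one built-in transform + one built-in light.
 * (The "light" is a toy directional term, purely illustrative.) */
static Vertex fixed_tnl(Vertex v)
{
    float intensity = 0.2f + 0.8f * fmaxf(0.0f, v.pos[2]);
    for (int i = 0; i < 3; i++)
        v.color[i] *= intensity;
    return v;
}

/* A "vertex shader": same slot, arbitrary program of your choosing. */
static Vertex wobble_shader(Vertex v)
{
    v.pos[1] += 0.1f * sinf(5.0f * v.pos[0]);  /* displace y */
    return fixed_tnl(v);                       /* then light as before */
}

/* On a GeForce-class chip this loop runs on-chip; pre-GeForce cards
 * ran it on the host CPU. */
static void run_pipeline(VertexStage stage, Vertex *verts, int n)
{
    for (int i = 0; i < n; i++)
        verts[i] = stage(verts[i]);
}

int main(void)
{
    Vertex a = {{0.5f, 0.0f, 1.0f}, {1.0f, 1.0f, 1.0f}};
    Vertex b = a;

    run_pipeline(fixed_tnl, &a, 1);      /* GeForce256-style fixed T&L */
    run_pipeline(wobble_shader, &b, 1);  /* DX8-style programmable stage */
    printf("fixed:  y=%.3f r=%.2f\n", a.pos[1], a.color[0]);
    printf("shader: y=%.3f r=%.2f\n", b.pos[1], b.color[0]);
    return 0;
}

On that reading, the GF2/GF3 jab lands differently: both run the per-vertex stage in hardware, so both are GPUs by the geometry-offload definition; DX8 and DX9 just made that stage progressively more programmable.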
#16
> By the time that the Voodoo 3 came out it was rapidly becoming too late
> for 3DFX. They really blew it with crappy drivers on their Voodoo Rush
> chipset and then the Voodoo Banshee after that.

I don't think the Rush and Banshee helped kill 3DFX. The Banshee was understood to be entry-level performance compared to the Voodoo 2. I bought a Banshee myself, and found it to be a nice, cheap upgrade from my previous 8MB Verite 2200 + 4MB Voodoo 1. I could only get about 28fps in Q2 at 640x480 using the Verite 2200, about 34fps with the Voodoo 1, and a whopping 50fps with the Banshee. I was so happy then. Now we complain if we can't get 100fps.
#17
"Larry Roberts" wrote in message ... By the time that the Voodoo 3 came out it was rapidly becoming too late for 3DFX. They really blew it with crappy drivers on their Voodoo Rush chipset and then the Voodoo Banshee after that. I don't think the Rush, and Banshee helped kill 3DFX. The Banshee was understood to be entry level performance when compared to the Voodoo 2. I bought a Banshee myself, and found it to be a nice, cheap upgrade from my previous 8MB Verite 2200 + 4MB Voodoo 1. Could only get about 28fps in Q2 640x480 using Verite 2200, about 34fps with Voodoo 1, and a whopping 50fps with the Banshee. I was so happy then. Now we complain if we can't get 100fps. The Banshee did help kill 3dfx because 3dFX was in such a hurry to release the alll-in-one 3d card they took people away from the Rampage project to work on it. |
#18
Look... GPU stands for Graphics Processing Unit, right? The acronym says it's a unit that processes graphics... So, looking at it that way, all the 2D GPUs back in the time of Hercules, CGA, EGA, VGA, blah blah, up to the newest GPUs are the same thing... Units that have only one thing to do - process graphics... You can now talk about high-perf SGI GPUs, all the stuff you mentioned, and yes, all of these are GPUs, just like all the stuff I mentioned... But if you say 3D GPU only, then it's another thing to discuss... Looking at it that way, the Voodoo 1 and 2 weren't true GPUs, but 3D-only (which they were, in fact)... EOD...

--
Klintona boja drazesan keksu sviru na Infou prekjucer ?
By runf
Damir Lukic, a member of hr.comp.hardver FAQ-team
#19
Radeon350 wrote:
> Ok this post is sort of for you, and for Tony, or anyone who doesn't
> really draw the line between a rasterizer / 3D accelerator like the 3Dfx
> Voodoo 1/2/3, Banshee, VSA-100, PowerVR Series 1/2/3, Riva 128, TNT1/2,
> Rage128, Rage Fury etc., and a full-on 'graphics processor' or GPU or
> polygon processor chipset (GeForce 1-4, GFFX, all the Radeons, Lockheed
> Real3D series, 3DLabs GLINT+Delta, Evans & Sutherland RealIMAGE, 3DLabs
> Wildcat, etc.). What I am posting below is a very good (IMHO) post from
> 1996 from a guy who explained the differences between Voodoo Graphics or
> similar consumer 3D accelerators/rasterizers of the time, and full 3D
> polygon processors (the equivalent of today's GPUs) with geometry
> engines/processors - like Lockheed's non-consumer Real3D/100, which was a
> true 'graphics processor' chipset. At that time, there were NO consumer
> PC 3D chips with geometry processing / T&L. In other words, there were no
> consumer GPUs in 1996 - not until 1999's GeForce256.

Yes, sure, the name GPU was invented back then. It was a 'revolution' in 3D cards. The 'GPU' has more capabilities and hardware support than the previous generations of video cards. *BUT*, there have been many, many more revolutions since, like for instance the pixel shader. The DirectX 8 compliant cards were the first ones capable of doing this. Great. But they didn't come up with a new name, like PSGPU or whatever. It's just that NVidia chose to change the name of the graphics chip to GPU.

For me, it's nonsense to claim that it's a special thing that the Volari is the first dual-GPU video card, since you're really referring to a dual-video-chip card. It happens to be the card with the latest generation of video chips, but since it's the most recent card, there's nothing special in that.

Well, this doesn't really lead anywhere ;-) I think we all know what we all mean, so no point in arguing about names ;-)

Thomas