A computer components & hardware forum. HardwareBanter


Nvidia is in a really bad spot



 
 
  #1  
September 18th 03, 05:23 PM
Folk


Think back several months. Everyone looked forward to leaked Nvidia
drivers, hoping for some additional improvements. If new drivers gave
an extra 100-200 points in 3DMark2001, everyone jumped for joy and
lauded the Nvidia driver development team as geniuses. Usenet and fan
sites were all gaga over the brilliance of the Nvidia software
developers. It was common knowledge that the big difference between Nvidia
and ATI was driver quality, with the hands-down nod given to Nvidia.

Now, any improvements in Nvidia drivers are seen as cheats? I'll be
damned if I can see how Nvidia is going to get out from under this
recent stigma. What was once seen as a brilliant team of developers
is now perceived as a bunch of cheaters. The truth is somewhere in
the middle, but the online press only sees the extremes. Perception
is everything. Nvidia is in deep ****.

I'm not saying that Nvidia didn't bring this upon themselves. What I
am saying is that it is amazing to see the 180-degree swing in perception
of Nvidia's driver development team. From gods to scoundrels so quickly...

  #2  
September 18th 03, 05:27 PM
John Russell


"Folk" wrote in message
...

Think back several months. Everyone looked forward to leaked Nvidia
drivers, hoping for some additional improvements. If new drivers gave
an extra 100-200 points in 3DMark2001, everyone jumped for joy and
lauded the Nvidia driver development team as geniuses. Usenet and fan
sites were all gaga over the brilliance of the Nvidia software
developers. It was common knowledge that the big difference between Nvidia
and ATI was driver quality, with the hands-down nod given to Nvidia.

Now, any improvements in Nvidia drivers are seen as cheats? I'll be
damned if I can see how Nvidia is going to get out from under this
recent stigma. What was once seen as a brilliant team of developers
is now perceived as a bunch of cheaters. The truth is somewhere in
the middle, but the online press only sees the extremes. Perception
is everything. Nvidia is in deep ****.

I'm not saying that Nvidia didn't bring this upon themselves. What I
am saying is that it is amazing to see the 180-degree swing in perception
of Nvidia's driver development team. From gods to scoundrels so quickly...


70% of Americans believe, incorrectly, that Saddam worked with Al Kyeda and
was therefore responsible for 9/11. I guess a similar percentage of PC users
believe Nvidia cheated.


  #3  
September 18th 03, 06:05 PM
Lenny


70% of Americans believe, incorrectly, that Saddam worked with Al Kyeda and
was therefore responsible for 9/11. I guess a similar percentage of PC users
believe Nvidia cheated.


Who's this Al Kyeda you say, and what's he done to make you so upset at him?

Anyway, if you're trying to say Nvidia's "optimized" cheating drivers are
just internet mass hysteria, why don't you disassemble the drivers yourself
and check them out, huh? People have actually done that, you know; that
technique, among others, is how Futuremark determined how the tricks used
in 3DMark03 worked (as detailed in that PDF they released some moons back,
before Nvidia threatened to sue them out of business).


  #4  
September 18th 03, 06:30 PM
Romano Cule


"Folk" wrote in message:
cut


Why are you all so angry at nVidia? ATI cards are simply faster at the
moment, and that's that. Nothing to talk about. The nVidia FX cards are a
total disaster in DX9 because they combine registers with a legacy
architecture, whereas ATI built a totally new architecture and new drivers
for their product, in this case a DX9 architecture.


  #5  
September 18th 03, 08:31 PM
John Russell


"Lenny" wrote in message
...

70% of Americans believe, incorrectly, that Saddam worked with Al Kyeda and
was therefore responsible for 9/11. I guess a similar percentage of PC users
believe Nvidia cheated.


Who's this Al Kyeda you say, and what's he done to make you so upset at him?

Anyway, if you're trying to say Nvidia's "optimized" cheating drivers are
just internet mass hysteria, why don't you disassemble the drivers yourself
and check them out, huh? People have actually done that, you know; that
technique, among others, is how Futuremark determined how the tricks used
in 3DMark03 worked (as detailed in that PDF they released some moons back,
before Nvidia threatened to sue them out of business).



So what you are saying is that optimising drivers to circumvent poor coding
and get the best out of any make of card by its manufacturer is cheating.

Please, can more makers produce drivers that cheat.

There was a time when gamers prayed for drivers with better optimisation.


  #6  
September 18th 03, 10:21 PM
magnulus


"John Russell" wrote in message
...

70% of Americans believe, incorrectly, that Saddam worked with Al Kyeda and
was therefore responsible for 9/11. I guess a similar percentage of PC users
believe Nvidia cheated.


I don't believe computer gaming should be a pecker contest over who has
the biggest scores, but the fact is NVidia's DX9 performance stinks, no
matter how you slice it. The best you can hope for is a game like Halo or
Aquanox 2, where the GeForce FX 5900 won't take a big hit for using full DX9,
but in other games like Half-Life 2, Tomb Raider, and probably many more, it
will just have to run with lower image quality.

My theory is that NVidia built the GeForce FX 5900 primarily for Doom 3,
which is, at this time, mostly a DX 8 feature-set game, and Carmack himself
has said the precision differences won't matter much. It also doesn't sound
like the current build of the game makes as heavy use of pixel shaders as
Half-Life 2 will. It uses big textures, bump mapping, shadows, fully
real-time lighting, and little else, from what I've read. To a lesser
extent, I also think they built the FX to beat ATI at DX 8 in pure score,
and at that they succeeded.


  #7  
September 18th 03, 11:46 PM
Lenny


So what you are saying is that optimising drivers to circumvent poor coding

...As defined by whom? Nvidia's optimizations have largely meant a reduction
of image quality to boost speed; I fail to see how pretty graphics
constitutes "poor coding".

and get the best out of any make of card by its manufacturer is cheating.

Again, how do you define "the best"?

I find it extremely ironic that Nvidia pointed out rather smugly at the
launch of NV30 that it supported 32 bits per color component (for a total of
128 bits per pixel, floating-point) while ATi only did 24 bits per component
(96 bits per pixel, floating-point), arguing that 96 bits wasn't enough
precision for advanced cinematic shaders.

Now Kirk & crew say DX9 shaders can be substituted with DX8.1 shaders (which
are based on just 8 bits per component, or 32 bits per pixel, integer)
without a loss of quality. Huh? Why this sudden and rather drastic change of
attitude? Clearly a case of trying to have their cake and eat it too,
methinks.
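
For anyone double-checking the arithmetic: those per-pixel figures are simply
the four RGBA colour components multiplied by the bits per component. A
trivial Python sketch, purely for illustration:

# Bits per pixel for the formats mentioned above (RGBA = 4 components).
formats = {
    "NV30 FP32": 32,   # 32-bit float per component
    "ATI FP24": 24,    # 24-bit float per component
    "DX8.1 INT8": 8,   # 8-bit integer per component
}
for name, bits_per_component in formats.items():
    print(name, "=", 4 * bits_per_component, "bits per pixel")
# NV30 FP32 = 128 bits per pixel
# ATI FP24 = 96 bits per pixel
# DX8.1 INT8 = 32 bits per pixel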

On the other hand, do you REALLY think it's fair when you have a benchmark
where one manufacturer runs the software as intended and the other replaces
shaders with code that runs faster but gives worse output, reduces texture
resolution, does poorer lighting calculations etc?

Please can more makers produce drivers that cheat.


Well, okay, if you enjoy being bent over and taking it up the backside by
your video card manufacturer, then by all means... run your cheating
drivers, man.


There was a time when gamers prayed for drivers with better optimisation.


Optimisations, yeah. Cheats, no thanks.


  #8  
September 19th 03, 02:02 AM
ho alexandre

Lenny wrote:
On the other hand, do you REALLY think it's fair when you have a benchmark
where one manufacturer runs the software as intended and the other replaces
shaders with code that runs faster but gives worse output, reduces texture
resolution, does poorer lighting calculations etc?


If no one can see the difference without examining each frame to spot
the differences, I think it's fair.
When you watch a movie, you don't pause it to judge whether the director is
good or not. You just watch it.

From what I've seen so far, all benchmarks require static images of
selected frames to compare the renderings. That may be very scientific,
but it is very far from normal use (that is, seeing 40+ images each second,
compared to staring at 1 image for 40 seconds to say "AH!! I see a
difference! Boo!"). This opinion is not specific to today's burning issue;
it is about graphics benchmarking in general.



--
XandreX
/I'm that kind of people your parents warned you about/

  #9  
September 19th 03, 06:41 AM
John Lewis

On Thu, 18 Sep 2003 19:30:36 +0200, "Romano Cule" wrote:


"Folk" wrote in message:
cut


Why are you all so angry at nVidia? ATI cards are simply faster at the
moment, and that's that. Nothing to talk about. The nVidia FX cards are a
total disaster in DX9 because they combine registers with a legacy
architecture, whereas ATI built a totally new architecture and new drivers
for their product, in this case a DX9 architecture.



I have very happily gone out and bought an FX5900 (for $250), in spite
of all the weeping and gnashing of teeth over DX9 water effects...
err... sorry... Pixel Shader 2.0.

I have bought high-performance video cards since my first Voodoo1
(it was also $250, if I recall correctly) and I am distinctly not an
impulse purchaser. I do not have enough loose change to be that
stupid.

I did need to make a purchase since I was building a new machine.

So why did I buy an FX5900?

Well, I have a large collection of DirectX/OpenGL PC games stretching
back quite a few years. It was essential to me to settle on a video
card family and drivers that I knew (from experience with my Ti4400)
would run not only the new games but all my old favorites without any
functional glitches or really objectionable graphical artifacts, besides
being trouble-free with all desktop applications and all my
professional video-editing tools as well.

The FX5900 with the 45.23 driver has satisfied all of my expectations.

Worst-case FPS in Morrowind has shot up from 18 FPS (Ti4400)
to 30, all effects maxed, same CPU. Morrowind has very oddly
coded graphics: unlike virtually every other first-person game,
resolution scaling has little or no effect on FPS.
My other shooters have seen such a quantum leap in FPS
that I haven't bothered to measure it. Also, there has been no need to
tweak driver settings away from the defaults for everything to run
correctly.

Why not ATI?

Ummm... tell me all about ATI driver/hardware performance
with legacy games. Didn't ATI's recent drivers have a little shadow
problem with JK2? Has that been fixed yet? Ever since the introduction of
the 9700 there have been sporadic but continued newsgroup
comments and cited facts regarding ATI's lack of focus
on backward compatibility of their new hardware/drivers.
It seems as if ATI's emphasis is on the new; nVidia's focus
seems to be on the new while also trying very seriously not to break old
software. If so, I like the compromise they made in the FX5900.
I also like the on-chip thermal monitoring. My non-Ultra FX5900
overclocks to 460/920 with no problem. With its very efficient
(and quiet) cooling, the GPU maxes out at about 65 degrees C, with
ambient air near the video board at about 35 degrees C.
---------------------------------------------
Anyway, the subject which started all this foaming over nVidia/DX9,
Half-Life 2, will not be released before 19 November. By that time
we should be only a month or two away from the next round of
video hardware from both companies.

Meanwhile, while waiting for HL2, some time spent examining the,
ummm, benefits (???) Valve's Steam gives you, especially in
single-player and local-LAN mode, should very nicely distract you
from the currently-flogged-to-death HL2/DX9 discussion.
Maybe you can vent your ample spleen on Valve instead and
give nVidia a bit of a break. For PC gamers, the underlying
issues with Valve and Steam are far more fundamentally important
and far-reaching than this transitory DX9 discussion, which will
become history with the next round of video chips, or
maybe even sooner after driver tweaks.

If you are at all confused by the last paragraph, I refer you to
several Steam forums, in particular the Steam FAQ
and related Q&A (read them very carefully indeed), and to the relevant
newsgroup threads in comp.sys.ibm.pc.games.action.

John Lewis



  #10  
September 19th 03, 08:13 AM
Lenny


On the other hand, do you REALLY think it's fair when you have a benchmark
where one manufacturer runs the software as intended and the other replaces
shaders with code that runs faster but gives worse output, reduces texture
resolution, does poorer lighting calculations etc?


If no one can see the difference without examining each frame to spot
the differences, I think it's fair.


But you're not benchmarking the same thing anymore. Benchmarks aren't about
whether things LOOK the same (because in 3D rendering things rarely look
exactly the same between two cards; the specs aren't that tightly defined);
they're about doing the same thing on two cards and then determining the
winner.

One card is running an entirely different workload than the other. That's
hardly fair in my opinion.

Besides, it's been pretty easy to spot the differences so far. 3DMark had
highlights that did not point towards the sun, AquaMark has blurry textures
and washed-out colors, etc.
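
If anyone wants to spot such differences themselves, a per-pixel diff of
captured screenshots is enough to make them obvious. Here is a minimal
sketch in Python using the Pillow imaging library; the filenames are just
placeholders for frames you would capture yourself:

# Compare two captured frames pixel by pixel (hypothetical filenames).
from PIL import Image, ImageChops

frame_a = Image.open("card_a_frame.png").convert("RGB")
frame_b = Image.open("card_b_frame.png").convert("RGB")

# ImageChops.difference gives the per-pixel absolute difference.
diff = ImageChops.difference(frame_a, frame_b)

# getbbox() returns None when every pixel of the difference image is zero.
if diff.getbbox() is None:
    print("Frames are pixel-identical.")
else:
    print("Frames differ inside region", diff.getbbox())
    diff.save("difference.png")  # brighter pixels = bigger difference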


 



