Strontium - September 12th 03, 06:03 AM

Blah, blah, blah.
Drivers, drivers, drivers.....

YADA, YADA, YADA

Dippity doo dah, dippity day...

'Crazy jig, in the middle of the street'


It's already been stated elsewhere (and NO, I'm not going to supply
references, as this ****ing fanboy war has produced enough) that the nVidia
hardware just does NOT cut it.

Talk about drivers all you want. If you want your card to run on drivers
alone... have fun, fanbois.


-
Dave stood up at show-n-tell, in zZb8b.420297$o%2.191281@sccrnsc02, and
said:

I think we'll bother the ATI forum with this as well. You might hate me
now, but you might thank me later...

"Lee Marsh" wrote in message
...
We have a response straight from NVIDIA addressing the recent storm
that


snip.

Let's just schlep the whole statement right into this thread, shall
we? Naturally, with some accompanying pithy (and some not-so-concise)
jabs; I wouldn't be myself without them. This is not a troll, nor a
mission for me to lambaste Nvidia, although I do my share of it. I
would really like to see them succeed, provided they knock off the
BS. This is to point out some inconsistencies I see jumping out of
this statement. I read it four ****ing times, and each time the same
things hit me in the same place (night in the ruts), so here's some
noise from those bruised jimmies. I feel this type of nonsense really
underestimates people's intelligence...


"Over the last 24 hours, there has been quite a bit of controversy
over comments made by Gabe Newell of Valve at ATI's Shader Day.

(Fun's just beginning. A whole can of worms has been opened to the
masses WRT Nvidia's actual DX9 shader performance)

During the entire development of Half Life 2, NVIDIA has had close
technical contact with Valve regarding the game. However, Valve has
not made us aware of the issues Gabe discussed.

(I reiterate: So much for close technical contact. But Brian, being
the nice guy and PR flack he is---I wouldn't expect him to say
anything much different from this.)

We're confused as to why Valve chose to use Release 45 (Rel. 45) -
because up to two weeks prior to the Shader Day we had been working
closely with Valve to ensure that Release 50 (Rel. 50) provides the
best experience possible on NVIDIA hardware.

(Missing fog, perhaps? Or maybe artificial screenshot augmentation? Or
something else the general public is not privy to, that Nvidia might
be exploiting for PR's sake on the basis of the lack of commonly
available info, while Gabe is a little too dignified at the moment to
sling the big mudballs with specifics? Who would put it past Nvidia,
after all is said and done?)

Regarding the Half Life2 performance numbers that were published on
the web, we believe these performance numbers are invalid because
they do not use our Rel. 50 drivers. Engineering efforts on our Rel.
45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's
optimizations for Half Life 2 and other new games are included in our
Rel.50 drivers - which reviewers currently have a beta version of
today. Rel. 50 is the best driver we've ever built - it includes
significant optimizations for the highly-programmable GeForce FX
architecture and includes feature and performance benefits for over
100 million NVIDIA GPU customers.

(So, essentially we should use whichever optimized driver set
provides the best performance with whichever game it was designed to
speed up, regardless of whether they're a) released to the public and
b) WHQL certified? So what if it breaks performance and/or
functionality with other things, or previously implemented
workarounds? And stating that the 50's are the best set yet...sans
fog...is a little ludicrous. Talk is cheap. Release the drivers RFN
and let the people be the judge, if you dare...)

Pending detailed information from Valve, we are only aware of one bug
with Rel. 50 and the version of Half Life 2 that we currently have -
this is the fog issue that Gabe referred to in his presentation. It
is not a cheat or an over optimization. Our current drop of Half Life
2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public
before the game is available. Since we know that obtaining the best
pixel shader performance from the GeForce FX GPUs currently requires
some specialized work, our developer technology team works very
closely with game developers. Part of this is understanding that in
many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9)
provides no image quality benefit. Sometimes this involves converting
32-bit floating point precision shader operations into 16-bit
floating point precision shaders in order to obtain the performance
benefit of this mode with no image quality degradation. Our goal is
to provide our consumers the best experience possible, and that means
games must both look and run great.

(How much time is a developer expected to spend special-case
optimizing engines for hardware that does not fully conform to
reference specs, or implements them in a near-unplayable fashion for
what he's trying to accomplish creatively, regardless of whether
that's the result of any failure in relations between Nvidia and
Microsoft? How much grease is this gonna take? And downplaying the
missing fog bug is, IMHO, a misstep. If the proper
implementation of that fog would skew results unfavorably in the
slightest---mind you, I can't say one way or another: I don't have
these drivers, and I'm not a developer---how does one think they have
ANY leeway whatsoever in their insistence such a driver should be
used as part of a valid performance assessment, let alone providing
the best possible experience? Maybe these drivers should be leaked
and the general public could see for themselves where the numbers
lie---and I chose that word for a reason---in their own evaluations?
Quite frankly, regardless of how true it may be that 16-bit FP
precision and PS 1.4 are more economical, more efficient codepaths in
some instances with no IQ hit, telling a developer---after he's coded
the damn thing around DX9 PS 2.0 reference calls---that he now has to
push back the release date or burn some midnight oil just to make a
wimpy pipeline look better is either inevitable devrel suicide or
expensive. In any case, it's no excuse
for the failure to measure up to all the spewed hype, let alone
reference standards. The latter part of the above paragraph reads
like "Sometimes, using earlier iterations of advanced features that
happen to be part of the spec our product was ostensibly designed and
hyped to the moon to support makes the game run much faster".)
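
A quick aside on that 32-bit-to-16-bit business: here is a rough C++
sketch of my own, not Nvidia's or Valve's code (the s10e5 half layout
and the truncating conversion are my assumptions), showing why dropping
to FP16 is harmless for a plain color value but eats whole units out of
larger values like texture coordinates.

// Rough illustrative sketch of the 32-bit -> 16-bit float conversion
// described above. Assumes the common s10e5 half layout: 1 sign bit,
// 5 exponent bits, 10 mantissa bits. Real drivers do this at the
// shader-instruction level, not in C++.
#include <cstdint>
#include <cstdio>
#include <cstring>

// Convert a 32-bit float to a 16-bit half (truncates the mantissa;
// no rounding, no denormals, infinities/NaNs not handled).
static uint16_t float_to_half(float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);
    uint16_t sign = (bits >> 16) & 0x8000;
    int32_t  exp  = (int32_t)((bits >> 23) & 0xFF) - 127 + 15;  // re-bias exponent
    uint32_t mant = bits & 0x007FFFFF;
    if (exp <= 0)  return sign;                         // flush tiny values to zero
    if (exp >= 31) return (uint16_t)(sign | 0x7C00);    // overflow -> infinity
    return (uint16_t)(sign | (exp << 10) | (mant >> 13)); // drop 13 mantissa bits
}

// Expand a 16-bit half back to a 32-bit float.
static float half_to_float(uint16_t h) {
    uint32_t sign = (uint32_t)(h & 0x8000) << 16;
    uint32_t exp  = (h >> 10) & 0x1F;
    uint32_t mant = h & 0x03FF;
    uint32_t bits = (exp == 0) ? sign
                               : sign | ((exp - 15 + 127) << 23) | (mant << 13);
    float out;
    std::memcpy(&out, &bits, sizeof out);
    return out;
}

int main() {
    // An 8-bit color value (188/255) changes by far less than one display
    // step, so "no image quality degradation" holds here...
    float color = 0.7372549f;
    std::printf("color: %f -> %f\n", color, half_to_float(float_to_half(color)));

    // ...but a large texture coordinate or world-space value loses whole
    // units, which is why the claim only holds case by case.
    float coord = 4097.13f;
    std::printf("coord: %f -> %f\n", coord, half_to_float(float_to_half(coord)));
    return 0;
}

FP16 is fine exactly where it is fine and nowhere else, which is why a
blanket "no image quality degradation" deserves a raised eyebrow.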

The optimal code path for ATI and NVIDIA GPUs is different - so
trying to test them with the same code path will always disadvantage
one or the other. The default settings for each game have been chosen
by both the developers and NVIDIA in order to produce the best
results for our consumers.

(Looking at some preliminary results tends to contradict this latter
assertion... a spoonful of truth, followed by a spoonful of bull****,
perhaps? Nothing new under the sun in PR-speak...)
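
To make the "optimal code path" hand-waving concrete, this is roughly
the kind of special-casing a developer ends up writing. This is my own
sketch, not Valve's actual code; the PCI vendor IDs are the standard
ones, and the path choices are purely illustrative.

// Rough sketch of per-vendor shader path selection, the kind of
// special-casing being asked of developers here.
#include <cstdint>
#include <cstdio>

enum class ShaderPath { DX9_PS20, DX9_PS20_PartialPrecision, DX8_PS14 };

// Hypothetical policy: pick a render path from the GPU's PCI vendor ID.
// 0x10DE = NVIDIA, 0x1002 = ATI (real IDs); the choices below are illustrative.
static ShaderPath choose_path(uint32_t pci_vendor_id, bool supports_ps20) {
    if (!supports_ps20)
        return ShaderPath::DX8_PS14;                    // DX8-class hardware
    if (pci_vendor_id == 0x10DE)
        return ShaderPath::DX9_PS20_PartialPrecision;   // GeForce FX: FP16 hints, hand-tuned shaders
    return ShaderPath::DX9_PS20;                        // ATI R3xx and everything else: straight DX9 path
}

int main() {
    std::printf("NVIDIA FX card -> path %d\n", (int)choose_path(0x10DE, true));
    std::printf("ATI 9700/9800  -> path %d\n", (int)choose_path(0x1002, true));
    return 0;
}

Every branch in there is time and money somebody has to spend, which is
the real subtext of "the optimal code path for ATI and NVIDIA GPUs is
different".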

In addition to the developer efforts, our driver team has developed a
next-generation automatic shader optimizer that vastly improves
GeForce FX pixel shader performance across the board. The fruits of
these efforts will be seen in our Rel.50 driver release. Many other
improvements have also been included in Rel.50, and these were all
created either in response to, or in anticipation of the first wave
of shipping DirectX 9 titles, such as Half Life 2.

(Read this: "We're painfully aware our DX9 shader performance sucks
bricks thru a straw compared to ATI's, although you won't EVER hear
this from us, mind you, so we're adding the overhead of a translation
layer to Cg function calls, thereby circumventing reference
functionality thru our own brand of emulation." Now, who doesn't think
this translates to a) reduced precision and/or b) broken functionality
with later DX requirements? The former might not matter more than a
****hole in a snowbank in many instances; the latter...who wants to
spend $200-400+ on a piece of hardware that is not even immediately
future-proof? ****! Come on, Brian! Perhaps if the hardware supported
the API a little better upon inception, this last-minute, knees-bent
running around looking for leaves big enough to cover your asses
wouldn't be necessary. "I did it my way" worked for Sinatra. Indeed,
we shall see how well this works for Nvidia.)

We are committed to working with Gabe to fully understand his
concerns and with Valve to ensure that 100+ million NVIDIA consumers
get the best possible experience with Half Life 2 on NVIDIA hardware."

(Calling Gabe's evaluation invalid *ESPECIALLY when fog doesn't work*
is hardly a step in the right direction. It's laughable. There are no
doubt good reasons in Gabe's mind why he chose not to use the Det
50's. The real question is, if the public were to see chapter and
verse of these reasons, how do YOU think Nvidia would look in the
eyes of the community, Brian? The million-dollar question is: "Did
Valve optimize for ATI architecture at the expense of Nvidia?" If so,
it's not like this sort of thing wasn't funded by Nvidia in the past;
one's own medicine always tastes foul, it seems. But really, if
Valve's dev team was just using reference API calls, and this works
better with ATI than with Nvidia---in fact it does, and this is
supported by several benchmarks---and Nvidia hardware is just not
measuring up, then perhaps Nvidia should throw some more time and
money at Gabe et al to help them obtain more favorable results using
proprietary codepaths, or STFU and go back to driver cheating, which
apparently is what they are prioritizing.)

Brian Burke
NVIDIA Corp.

(Pixar on a single chip...so what if it takes as long to render a
scene...;-) Maybe "something hallucinogenic to smoke" should be
bundled with Nvidia cards...that way people could see the Emperor's
New Clothes clear as day, right next to the pink elephants..."if you
can't dazzle 'em with brilliance, baffle 'em with bull****" should be
the corporate mantra of the millennium)


--
Strontium

"It's no surprise, to me. I am my own worst enemy. `Cause every
now, and then, I kick the livin' **** `outta me." - Lit