Response by Nvidia concerning HL2 *warning, lengthy post, strong opinion content, some bad langwidge* NC-17 rating administered...



 
 
September 12th 03, 05:30 AM
Dave

I think we'll bother the ATI forum with this as well. You might hate me now, but
you might thank me later...

"Lee Marsh" wrote in message
...
We have a response straight from NVIDIA addressing the recent storm that


snip.

Let's just schlep the whole statement right into this thread, shall we?
Naturally, with some accompanying jabs, some pithy and some not-so-concise
(wouldn't be myself without them...). This is not a troll, nor a mission for me to
lambaste Nvidia, although I do my share of it. I would really like to see
them succeed, provided they knock off the BS. This is to point out some
inconsistencies I see just jumping out of this statement. I read it four
****ing times, and each time, the same things hit me in the same place:
night in the ruts, so here's some noise from those bruised jimmies, as I
feel that this type of nonsense really underestimates people's
intelligence...


"Over the last 24 hours, there has been quite a bit of controversy over
comments made by Gabe Newell of Valve at ATI's Shader Day.

(Fun's just beginning. A whole can of worms has been opened to the masses
WRT Nvidia's actual DX9 shader performance.)

During the entire development of Half Life 2, NVIDIA has had close technical
contact with Valve regarding the game. However, Valve has not made us aware
of the issues Gabe discussed.

(I reiterate: So much for close technical contact. But Brian, being the nice
guy and PR flack he is---I wouldn't expect him to say anything much different
from all this.)

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because
up to two weeks prior to the Shader Day we had been working closely with
Valve to ensure that Release 50 (Rel. 50) provides the best experience
possible on NVIDIA hardware.

(Missing fog, perhaps? Or maybe screenshot artificial augmentation? Or
something else the general public is not privy to, that Nvidia might be
exploiting for PR's sake on the basis of lack of commonly available info,
and Gabe is a little too dignified at the moment to sling the big mudballs
with specifics? Who would put it past Nvidia, after all that's been said and done?)

Regarding the Half Life2 performance numbers that were published on the web,
we believe these performance numbers are invalid because they do not use our
Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months
ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and
other new games are included in our Rel.50 drivers - which reviewers
currently have a beta version of today. Rel. 50 is the best driver we've
ever built - it includes significant optimizations for the
highly-programmable GeForce FX architecture and includes feature and
performance benefits for over 100 million NVIDIA GPU customers.

(So, essentially we should use whichever optimized driver set provides the
best performance with whichever game it was designed to speed up, regardless of
whether they're a: released to the public and b: WHQL certified? So what if
it breaks performance and/or functionality with other things, or previously
implemented workarounds? And stating that the 50's are the best set
yet...sans fog...is a little ludicrous. Talk is cheap. Release the drivers
RFN and let the people be the judge, if you dare...)

Pending detailed information from Valve, we are only aware of one bug with Rel.
50 and the version of Half Life 2 that we currently have - this is the fog
issue that Gabe referred to in his presentation. It is not a cheat or an
over optimization. Our current drop of Half Life 2 is more than 2 weeks old.
NVIDIA's Rel. 50 driver will be public before the game is available. Since
we know that obtaining the best pixel shader performance from the GeForce FX
GPUs currently requires some specialized work, our developer technology team
works very closely with game developers. Part of this is understanding that
in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no
image quality benefit. Sometimes this involves converting 32-bit floating
point precision shader operations into 16-bit floating point precision
shaders in order to obtain the performance benefit of this mode with no
image quality degradation. Our goal is to provide our consumers the best
experience possible, and that means games must both look and run great.

(How much time is a developer expected to spend special-case optimizing
engines for hardware that does not fully conform to reference specs, or
implements them in a near-unplayable fashion given what is being attempted
from a creative standpoint? And that's regardless of whether it's the result of
any failure in relations between Nvidia and Microsoft. How much grease is
this gonna take? And downplaying the missing fog bug is, IMHO, a misstep. If
the proper implementation of that fog would skew results unfavorably in the
slightest---mind you, I can't say one way or another: I don't have these
drivers, and I'm not a developer---how does one think they have ANY leeway
whatsoever in their insistence such a driver should be used as part of a
valid performance assessment, let alone providing the best possible
experience? Maybe these drivers should be leaked and the general public
could see for themselves where the numbers lie---and I chose that word for a
reason---in their own evaluations? Quite frankly, I feel that regardless of
how true it may be that 16-bit FP precision and PS 1.4 are more economical,
efficient codepaths in some instances with no performance or IQ hit, telling
a developer who has already coded the damn thing around DX9 PS 2.0 reference
calls that he now has to push back the release date or burn some midnight oil
just to make a wimpy pipeline look better is either inevitable devrel suicide or
expensive. In any case, it's no excuse for the failure to measure up to all
the spewed hype, let alone reference standards. The latter part of the above
paragraph reads like "Sometimes, using earlier iterations of advanced
features that happen to be part of the spec our product was ostensibly
designed and hyped to the moon to support makes the game run much faster".)
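
(For anyone who wants to see what that 32-bit-versus-16-bit trade actually
amounts to, here's a rough, back-of-the-envelope sketch in plain C++. It isn't
anybody's shader code, Nvidia's or Valve's, and the function name is made up;
it just rounds a float to roughly the 10-bit mantissa an FP16 value carries,
ignoring FP16's narrower exponent range and denormals, to show where the
precision loss is invisible and where it starts to matter.)

#include <cmath>
#include <cstdio>

// Round a float to roughly FP16 precision: keep ~11 significant bits of
// mantissa (1 implicit + 10 stored). This deliberately ignores FP16's
// narrower exponent range and denormals; it only models the rounding step.
float to_fp16ish(float x)
{
    int exp;
    float mant = std::frexp(x, &exp);             // x = mant * 2^exp, |mant| in [0.5, 1)
    mant = std::round(mant * 2048.0f) / 2048.0f;  // snap the mantissa to 11 bits
    return std::ldexp(mant, exp);
}

int main()
{
    // A colour component in [0,1]: the FP16 rounding error (~0.0002 here) is
    // well below one 8-bit display step (1/255 ~= 0.0039), so nobody sees it.
    float colour = 0.7312f;
    std::printf("colour   %f -> %f (err %g)\n",
                colour, to_fp16ish(colour), colour - to_fp16ish(colour));

    // A large, un-normalized texture coordinate: between 512 and 1024 an FP16
    // value can only step in increments of 0.5, so the error here is ~0.12,
    // the kind of place where "no image quality degradation" stops being automatic.
    float texcoord = 517.379f;
    std::printf("texcoord %f -> %f (err %g)\n",
                texcoord, to_fp16ish(texcoord), texcoord - to_fp16ish(texcoord));
    return 0;
}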

The optimal code path for ATI and NVIDIA GPUs is different - so trying to
test them with the same code path will always disadvantage one or the other.
The default settings for each game have been chosen by both the developers
and NVIDIA in order to produce the best results for our consumers.

(A look at some preliminary results tends to contradict this latter
assertion...a spoonful of truth, followed by a spoonful of bull****, perhaps?
Nothing new under the sun in PR-speak...)
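
(To make "code path" a little less abstract: below is a minimal, purely
hypothetical sketch of how an engine might pick a default rendering path based
on what the installed GPU can do. It is not Valve's code, and the names and
capability flags are invented for illustration; the point is simply that a
hardware-specific path is real, ongoing developer work, not a checkbox.)

#include <string>

// Purely illustrative: a per-capability "code path" switch of the kind the
// statement above alludes to. Names are invented.

enum class ShaderPath {
    DX9_PS20_FP32,   // full-precision DirectX 9 reference path
    DX9_MixedFP16,   // DX9 path with partial-precision (FP16) shaders
    DX8_PS14         // fallback to DirectX 8.1-level pixel shaders
};

struct GpuCaps {
    bool supports_ps20;   // can it run PS 2.0 at all?
    bool fast_fp32;       // does full-precision PS 2.0 run at playable speed?
};

// Use the reference DX9 path where it is fast, drop to partial precision or
// DX8-level shaders where it is not.
ShaderPath ChooseDefaultPath(const GpuCaps& caps)
{
    if (!caps.supports_ps20)
        return ShaderPath::DX8_PS14;
    if (caps.fast_fp32)
        return ShaderPath::DX9_PS20_FP32;
    return ShaderPath::DX9_MixedFP16;
}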

In addition to the developer efforts, our driver team has developed a
next-generation automatic shader optimizer that vastly improves GeForce FX
pixel shader performance across the board. The fruits of these efforts will
be seen in our Rel.50 driver release. Many other improvements have also been
included in Rel.50, and these were all created either in response to, or in
anticipation of the first wave of shipping DirectX 9 titles, such as Half
Life 2.

(Read this: "We're painfully aware our DX9 shader performance sucks bricks
thru a straw compared to ATI's, although you won't EVER hear this from us,
mind you, so we're adding the overhead of a translation layer to Cg function
calls, thereby circumventing reference functionality thru our own brand of
emulation." Now who doesn't think this translates to: a: reduced precision
b: broken functionality with later DX requirements? The former might not
matter more than a ****hole in a snowbank in many instances, the
latter...who wants to spend $200-400+ on a piece of hardware that is not
even immediately future-proof? ****! Come on, Brian! Perhaps if the hardware
supported the API a little better upon inception, this last-minute,
knees-bent running around looking for leaves big enough to cover your asses
wouldn't be necessary. "I did it my way" worked for Sinatra. Indeed, we
shall see how well this works for Nvidia.)
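
(Since "automatic shader optimizer" is doing an awful lot of work in that
paragraph, here's a purely hypothetical sketch, again in C++, of what per-game
shader substitution inside a driver could look like: fingerprint the shader the
game submits and quietly hand back a hand-tuned replacement. To be absolutely
clear, this is not Nvidia's code or design, and every name in it is invented;
it exists only to make the concept concrete.)

#include <cstdint>
#include <string>
#include <unordered_map>

// Hypothetical illustration of driver-side shader substitution. Nothing here
// is NVIDIA's actual code or architecture.

using ShaderHash = std::uint64_t;

// Toy FNV-1a hash over the shader bytecode, standing in for whatever
// fingerprinting a real driver might do.
ShaderHash HashBytecode(const std::string& bytecode)
{
    ShaderHash h = 0xcbf29ce484222325ull;         // FNV-1a offset basis
    for (unsigned char c : bytecode) {
        h ^= c;
        h *= 0x100000001b3ull;                    // FNV-1a prime
    }
    return h;
}

// Known application shaders mapped to hand-tuned replacements (for example,
// an FP16 partial-precision rewrite of a game's FP32 PS 2.0 shader).
static std::unordered_map<ShaderHash, std::string> g_replacements;

// Called where the driver would otherwise just compile what the app sent.
const std::string& SelectShader(const std::string& app_bytecode)
{
    auto it = g_replacements.find(HashBytecode(app_bytecode));
    if (it != g_replacements.end())
        return it->second;        // quietly substitute the tuned variant
    return app_bytecode;          // otherwise run what the game actually asked for
}

(Whether Rel. 50's optimizer works anything like this, or is a genuine
recompiler pass, is exactly the sort of thing nobody outside Nvidia can verify
right now, which is rather the point.)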

We are committed to working with Gabe to fully understand his concerns and
with Valve to ensure that 100+ million NVIDIA consumers get the best
possible experience with Half Life 2 on NVIDIA hardware."

(Calling Gabe's evaluation invalid *ESPECIALLY when fog doesn't work* is
hardly a step in the right direction. It's laughable. There are no doubt
good reasons in Gabe's mind why he chose not to use the Det 50's. The real
question is, if the public were to see chapter and verse of these reasons,
how do YOU think Nvidia would look in the eyes of the community, Brian? The
million-dollar question is: "Did Valve optimize for ATI architecture at the
expense of Nvidia?" If so, it's not like this sort of thing wasn't funded by
Nvidia in the past; one's own medicine always tastes foul, it seems. But
really, if Valve's dev team was just using reference API calls, and this
works better with ATI than with Nvidia---in fact it does, and this is
supported by several benchmarks---and Nvidia hardware is just not measuring
up, then perhaps Nvidia should throw some more time and money at Gabe et al
to help them obtain more favorable results using proprietary codepaths, or
STFU and go back to driver cheating, which apparently is what they are
prioritizing.)

Brian Burke
NVIDIA Corp.

(Pixar on a single chip...so what if it takes as long to render a
scene...;-) Maybe "something hallucinogenic to smoke" should be bundled with
Nvidia cards...that way people could see the Emperor's New Clothes clear as
day, right next to the pink elephants..."if you can't dazzle 'em with
brilliance, baffle 'em with bull****" should be the corporate mantra of the
millennium)


 



