HardwareBanter - A computer components & hardware forum


Response by Nvidia concerning HL2 *warning, lengthy post, strong opinion content, some bad langwidge* NC-17 rating administered...



 
 
  #1  
Old September 12th 03, 05:30 AM
Dave
external usenet poster
 
Posts: n/a
Default Response by Nvidia concerning HL2 *warning, lengthy post, strong opinion content, some bad langwidge* NC-17 rating administered...

I think we'll bother the ATI forum with this as well. Might hate me now, but
you might thank me later...

"Lee Marsh" wrote in message
...
We have a response straight from NVIDIA addressing the recent storm that


snip.

Let's just schlep the whole statement right into this thread, shall we?
Naturally, with some accompanying jabs, both pithy and not-so-concise (wouldn't
be myself without them...). This is not a troll, nor a mission for me to
lambaste Nvidia, although I do my share of it. I would really like to see
them succeed, provided they knock off the BS. This is to point out some
inconsistencies I see just jumping out of this statement. I read it four
****ing times, and each time, the same things hit me in the same place:
night in the ruts, so here's some noise from those bruised jimmies, as I
feel that this type of nonsense really underestimates people's
intelligence...


"Over the last 24 hours, there has been quite a bit of controversy over
comments made by Gabe Newell of Valve at ATI's Shader Day.

(Fun's just beginning. A whole can of worms has been opened to the masses
WRT Nvidia's actual DX9 shader performance.)

During the entire development of Half Life 2, NVIDIA has had close technical
contact with Valve regarding the game. However, Valve has not made us aware
of the issues Gabe discussed.

(I reiterate: So much for close technical contact. But Brian, being the nice
guy and PR flack he is---I wouldn't expect him to say much differently than
all this.)

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because
up to two weeks prior to the Shader Day we had been working closely with
Valve to ensure that Release 50 (Rel. 50) provides the best experience
possible on NVIDIA hardware.

(Missing fog, perhaps? Or maybe artificial screenshot augmentation? Or
something else the general public is not privy to, which Nvidia might be
exploiting for PR's sake on the basis of a lack of commonly available info,
while Gabe is a little too dignified at the moment to sling the big mudballs
with specifics? Who would put it past Nvidia, after all is said and done?)

Regarding the Half Life 2 performance numbers that were published on the web,
we believe these performance numbers are invalid because they do not use our
Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months
ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and
other new games are included in our Rel.50 drivers - which reviewers
currently have a beta version of today. Rel. 50 is the best driver we've
ever built - it includes significant optimizations for the
highly-programmable GeForce FX architecture and includes feature and
performance benefits for over 100 million NVIDIA GPU customers.

(So, essentially we should use whichever optimized driver set provides the
best performance with whichever game it was designed to speed up, regardless
of whether they're a) released to the public and b) WHQL certified? So what if
it breaks performance and/or functionality with other things, or previously
implemented workarounds? And stating that the 50's are the best set
yet...sans fog...is a little ludicrous. Talk is cheap. Release the drivers
RFN and let the people be the judge, if you dare...)

Pending detailed information from Valve, we are only aware of one bug with Rel.
50 and the version of Half Life 2 that we currently have - this is the fog
issue that Gabe referred to in his presentation. It is not a cheat or an
over optimization. Our current drop of Half Life 2 is more than 2 weeks old.
NVIDIA's Rel. 50 driver will be public before the game is available. Since
we know that obtaining the best pixel shader performance from the GeForce FX
GPUs currently requires some specialized work, our developer technology team
works very closely with game developers. Part of this is understanding that
in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no
image quality benefit. Sometimes this involves converting 32-bit floating
point precision shader operations into 16-bit floating point precision
shaders in order to obtain the performance benefit of this mode with no
image quality degradation. Our goal is to provide our consumers the best
experience possible, and that means games must both look and run great.

(How much time is a developer expected to spend special-case optimizing
engines for hardware that does not fully conform to reference specs, or
implements them in a near-unplayable fashion for what is trying to be
accomplished from a creative standpoint, regardless of whether it's the
result of any failure in relations between Nvidia and Microsoft? How much
grease is this gonna take? And downplaying the missing fog bug is, IMHO, a
misstep. If
the proper implementation of that fog would skew results unfavorably in the
slightest---mind you, I can't say one way or another: I don't have these
drivers, and I'm not a developer---how does one think they have ANY leeway
whatsoever in their insistence such a driver should be used as part of a
valid performance assessment, let alone providing the best possible
experience? Maybe these drivers should be leaked and the general public
could see for themselves where the numbers lie---and I chose that word for a
reason---in their own evaluations? Quite frankly, I feel that however true it
may be that 16-bit FP precision and PS 1.4 are more economical, more efficient
codepaths in some instances without a performance or IQ hit, telling a
developer who has already coded the damn thing around DX9 PS 2.0 reference
calls that he now has to push back the release date or burn some midnight oil
just to make a wimpy pipeline look better is either inevitable devrel suicide
or expensive. In any case, it's no excuse for the failure to measure up to all
the spewed hype, let alone reference standards. The latter part of the above
paragraph reads like "Sometimes, using earlier iterations of advanced
features that happen to be part of the spec our product was ostensibly
designed and hyped to the moon to support makes the game run much faster".)
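
(For the curious, here's what that 32-bit vs. 16-bit business looks like in plain numbers. A toy sketch in Python with NumPy, using float32/float16 as stand-ins for the shader precisions; this is nobody's actual shader or driver code, just an illustration of how much precision fp16 gives up, and why the difference is often invisible in colour math.)

import numpy as np

# Toy "pixel shader" math: clamp(N.L)^spec_power, the sort of per-pixel
# lighting a DX9 PS 2.0 shader runs, evaluated at both precisions.
def lighting(n_dot_l, spec_power, dtype):
    x = np.clip(np.asarray(n_dot_l, dtype=dtype), 0, 1)
    return x ** dtype(spec_power)

samples = np.linspace(0.0, 1.0, 9)
fp32 = lighting(samples, 32.0, np.float32)   # full precision
fp16 = lighting(samples, 32.0, np.float16)   # partial precision

print("fp32:", fp32)
print("fp16:", fp16.astype(np.float32))
print("max abs difference:", np.max(np.abs(fp32 - fp16.astype(np.float32))))
# fp16 carries roughly 3 decimal digits against fp32's 7; for colour values
# quantized to 8 bits on output, that error frequently never shows up.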

The optimal code path for ATI and NVIDIA GPUs is different - so trying to
test them with the same code path will always disadvantage one or the other.
The default settings for each game have been chosen by both the developers
and NVIDIA in order to produce the best results for our consumers.

(Looking at some preliminary results would tend to provide some
contradictions to this latter assertion...a spoonful of truth, followed by a
spoonful of bull****, perhaps? Nothing new under the sun in PR-speak...)
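
(And for anyone wondering what "different code paths" actually means from the developer's chair, it boils down to something like the sketch below. Python pseudocode of my own with made-up names, not Valve's Source engine code; just an illustration of an engine picking a shader path per vendor and capability, which is exactly the special-case work being haggled over.)

# Hypothetical sketch (invented names) of per-vendor shader path selection.
def pick_shader_path(vendor: str, supports_ps20: bool) -> str:
    if not supports_ps20:
        return "dx8_ps14"          # DX8-class parts fall back to PS 1.4
    if vendor.upper() == "NVIDIA":
        return "dx9_mixed_fp16"    # mixed/partial-precision path for GeForce FX
    return "dx9_full_fp32"         # generic full-precision DX9 path (Radeon 9x00 and friends)

print(pick_shader_path("NVIDIA", True))   # -> dx9_mixed_fp16
print(pick_shader_path("ATI", True))      # -> dx9_full_fp32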

In addition to the developer efforts, our driver team has developed a
next-generation automatic shader optimizer that vastly improves GeForce FX
pixel shader performance across the board. The fruits of these efforts will
be seen in our Rel.50 driver release. Many other improvements have also been
included in Rel.50, and these were all created either in response to, or in
anticipation of the first wave of shipping DirectX 9 titles, such as Half
Life 2.

(Read this: "We're painfully aware our DX9 shader performance sucks bricks
thru a straw compared to ATI's, although you won't EVER hear this from us,
mind you, so we're adding the overhead of a translation layer to Cg function
calls, thereby circumventing reference functionality thru our own brand of
emulation." Now who doesn't think this translates to: a: reduced precision
b: broken functionality with later DX requirements? The former might not
matter more than a ****hole in a snowbank in many instances, the
latter...who wants to spend $200-400+ on a piece of hardware that is not
even immediately future-proof? ****! Come on, Brian! Perhaps if the hardware
supported the API a little better upon inception, this last-minute,
knees-bent running around looking for leaves big enough to cover your asses
wouldn't be necessary. "I did it my way" worked for Sinatra. Indeed, we
shall see how well this works for Nvidia.)
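
(Nobody outside Nvidia knows what that "automatic shader optimizer" really does, so treat the following as pure guesswork on my part: a toy Python sketch of the general idea, a driver pass that walks a shader's instruction stream and demotes ops to partial precision wherever a crude heuristic figures the image won't visibly change. The opcode names and the safe/unsafe split are invented for illustration only.)

# Pure guesswork at a driver-side shader rewriter: demote selected ops to a
# partial-precision variant ("_pp") and leave everything else at full precision.
SAFE_TO_DEMOTE = {"mul", "mad", "dp3"}   # colour-ish math: fp16 usually looks identical
# texture fetches, rcp/rsq and the like stay at fp32 to avoid visible artifacts

def optimize(instructions):
    rewritten = []
    for op, args in instructions:
        if op in SAFE_TO_DEMOTE:
            rewritten.append((op + "_pp", args))   # partial-precision hint
        else:
            rewritten.append((op, args))
    return rewritten

shader = [("dp3", "r0, n, l"), ("texld", "r1, t0, s0"), ("mul", "r2, r0, r1")]
print(optimize(shader))
# -> [('dp3_pp', 'r0, n, l'), ('texld', 'r1, t0, s0'), ('mul_pp', 'r2, r0, r1')]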

We are committed to working with Gabe to fully understand his concerns and
with Valve to ensure that 100+ million NVIDIA consumers get the best
possible experience with Half Life 2 on NVIDIA hardware."

(Calling Gabe's evaluation invalid *ESPECIALLY when fog doesn't work* is
hardly a step in the right direction. It's laughable. There are no doubt
good reasons in Gabe's mind why he chose not to use the Det 50's. The real
question is, if the public were to see chapter and verse of these reasons,
how do YOU think Nvidia would look in the eyes of the community, Brian? The
million-dollar question is: "Did Valve optimize for ATI architecture at the
expense of Nvidia?" If so, it's not like this sort of thing wasn't funded by
Nvidia in the past, one's own medicine always tastes foul it seems. But
really, if Valve's dev team was just using reference API calls, and this
works better with ATI than with Nvidia---in fact it does, and this is
supported by several benchmarks---and Nvidia hardware is just not measuring
up, then perhaps Nvidia should throw some more time and money at Gabe et al
to help them obtain more favorable results using proprietary codepaths, or
STFU and go back to driver cheating which apparently is what they are
prioritizing.)

Brian Burke
NVIDIA Corp.

(Pixar on a single chip...so what if it takes as long to render a
scene...;-) Maybe "something hallucinogenic to smoke" should be bundled with
Nvidia cards...that way people could see the Emperor's New Clothes clear as
day, right next to the pink elephants..."if you can't dazzle 'em with
brilliance, baffle 'em with bull****" should be the corporate mantra of the
millennium)


  #2  
Old September 12th 03, 05:40 AM
JAD
external usenet poster
 
Posts: n/a
Default

I am certainly not offended by your post, but I have to say that if you put this kind of conviction in everything you do then I
suspect you will be very successful. However, if you find this taking up every minute of your day, worrying about Nvidia and ATI
then, damn, dude, turn off the juice and get out more. Wouldn't it be weird if the picture appeared of both CEOs playing golf together
again and Bill Gates was making it a threesome?

"Dave" wrote in message news:zZb8b.420297$o%2.191281@sccrnsc02...
I think we'll bother the ATI forum with this as well. Might hate me now, but
you might thank me later...

"Lee Marsh" wrote in message
...
We have a response straight from NVIDIA addressing the recent storm that


snip.

Let's just schlep the whole statement right into this thread, shall we?
Naturally, with some accompanying jabs, both pithy and not-so-concise (wouldn't
be myself without them...). This is not a troll, nor a mission for me to
lambaste Nvidia, although I do my share of it. I would really like to see
them succeed, provided they knock off the BS. This is to point out some
inconsistencies I see just jumping out of this statement. I read it four
****ing times, and each time, the same things hit me in the same place:
night in the ruts, so here's some noise from those bruised jimmies, as I
feel that this type of nonsense really underestimates people's
intelligence...





  #3  
Old September 12th 03, 06:03 AM
Strontium
external usenet poster
 
Posts: n/a
Default

Blah, blah, blah.
Drivers, drivers, drivers.....

YADA, YADA, YADA

Dippity doo dah, dippity day...

'Crazy jig, in the middle of the street'


It's already been stated, elsewhere (and NO I'm not going to supply
references as this ****ing fanboy war has produced enough), that the nVidia
hardware just does NOT cut it.

Talk about drivers, all you want. If you want your card to run on drivers,
alone.....have fun fanbois


-
Dave stood up at show-n-tell, in zZb8b.420297$o%2.191281@sccrnsc02, and
said:

I think we'll bother the ATI forum with this as well. Might hate me
now, but you might thank me later...

"Lee Marsh" wrote in message
...
We have a response straight from NVIDIA addressing the recent storm
that


snip.

Let's just schlep the whole statement right into this thread, shall
we? Naturally, some accompanying pithy and more not-so-concise jabs
(wouldn't be myself without them...). This is not a troll, nor a
mission for me to lambaste Nvidia, although I do my share of it. I
would really like to see them succeed, provided they knock off the
BS. This is to point out some inconsistencies I see just jumping out
of this statement. I read it four ****ing times, and each time, the
same things hit me in the same place: night in the ruts, so here's
some noise from those bruised jimmies, as I feel that this type of
nonsense really underestimates people's intelligence...


"Over the last 24 hours, there has been quite a bit of controversy
over comments made by Gabe Newell of Valve at ATIs Shader Day.

(Fun's just beginning. A whole can of worms has been opened to the
masses WRT Nvidia actual DX9 shader performance)

During the entire development of Half Life 2, NVIDIA has had close
technical contact with Valve regarding the game. However, Valve has
not made us aware of the issues Gabe discussed.

(I reiterate: So much for close technical contact. But Brian, being
the nice guy and PR flack he is---I wouldn't expect him to say much
differently than all this.)

We're confused as to why Valve chose to use Release. 45 (Rel. 45) -
because up to two weeks prior to the Shader Day we had been working
closely with Valve to ensure that Release 50 (Rel. 50) provides the
best experience possible on NVIDIA hardware.

(Missing fog, perhaps? Or maybe screenshot artificial augmentation? Or
something else the general public is not privy to, that Nvidia might
be exploiting for PR's sake on the basis of lack of commonly
available info, and Gabe is a little more dignified at the moment to
sling the big mudballs with specifics? Who would put it past Nvidia
after all said and done?)

Regarding the Half Life2 performance numbers that were published on
the web, we believe these performance numbers are invalid because
they do not use our Rel. 50 drivers. Engineering efforts on our Rel.
45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's
optimizations for Half Life 2 and other new games are included in our
Rel.50 drivers - which reviewers currently have a beta version of
today. Rel. 50 is the best driver we've ever built - it includes
significant optimizations for the highly-programmable GeForce FX
architecture and includes feature and performance benefits for over
100 million NVIDIA GPU customers.

(So, essentially we should use whichever optimized driver set
provides the best performance with whichever game it was designed to
speed up, regardless whether they' a: released to the public and
b: WHQL certified? So what if it breaks performance and/or
functionality with other things, or previously implemented
workarounds? And stating that the 50's are the best set yet...sans
fog...is a little ludicrous. Talk is cheap. Release the drivers RFN
and let the people be the judge, if you dare...)

Pending detailed information from Valve, we are only aware one bug
with Rel. 50 and the version of Half Life 2 that we currently have -
this is the fog issue that Gabe referred to in his presentation. It
is not a cheat or an over optimization. Our current drop of Half Life
2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public
before the game is available. Since we know that obtaining the best
pixel shader performance from the GeForce FX GPUs currently requires
some specialized work, our developer technology team works very
closely with game developers. Part of this is understanding that in
many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9)
provides no image quality benefit. Sometimes this involves converting
32-bit floating point precision shader operations into 16-bit
floating point precision shaders in order to obtain the performance
benefit of this mode with no image quality degradation. Our goal is
to provide our consumers the best experience possible, and that means
games must both look and run great.

(How much time is a developer expected to spend special-case
optimizing engines for hardware that does not fully conform to
reference specs, or implements them in a near-unplayable fashion with
what is trying to be accomplished from a creative standpoint?
Regardless if it's the result of any failure in relations between
Nvidia and Microsoft. How much grease is this gonna take? And
downplaying the missing fog bug IMHO is a misstep. If the proper
implementation of that fog would skew results unfavorably in the
slightest---mind you, I can't say one way or another: I don't have
these drivers, and I'm not a developer---how does one think they have
ANY leeway whatsoever in their insistence such a driver should be
used as part of a valid performance assessment, let alone providing
the best possible experience? Maybe these drivers should be leaked
and the general public could see for themselves where the numbers
lie---and I chose that word for a reason---in their own evaluations?
Quite frankly I feel that regardless of how true it may be that
16-bit FP precision and PS 1.4 are more economical, efficient
codepaths in some instances without performance or IQ hit, telling a
developer that after he's coded the damn thing around DX9 PS 2.0
reference calls and now has to push up the release date or burn some
midnight oil just to make a wimpy pipeline look better is either
inevitable devrel suicide or expensive. In any case, it's no excuse
for the failure to measure up to all the spewed hype, let alone
reference standards. The latter part of the above paragraph reads
like "Sometimes, using earlier iterations of advanced features that
happen to be part of the spec our product was ostensibly designed and
hyped to the moon to support makes the game run much faster".)

The optimal code path for ATI and NVIDIA GPUs is different - so
trying to test them with the same code path will always disadvantage
one or the other. The default settings for each game have been chosen
by both the developers and NVIDIA in order to produce the best
results for our consumers.

(Looking at some preliminary results would tend to provide some
contradictions to this latter assertion...a spoonful of truth,
followed by a spoonful of bull****, perhaps? Nothing new under the
sun in PR-speak...)

In addition to the developer efforts, our driver team has developed a
next-generation automatic shader optimizer that vastly improves
GeForce FX pixel shader performance across the board. The fruits of
these efforts will be seen in our Rel.50 driver release. Many other
improvements have also been included in Rel.50, and these were all
created either in response to, or in anticipation of the first wave
of shipping DirectX 9 titles, such as Half Life 2.

(Read this: "We're painfully aware our DX9 shader performance sucks
bricks thru a straw compared to ATI's, although you won't EVER hear
this from us, mind you, so we're adding the overhead of a translation
layer to Cg function calls, thereby circumventing reference
functionality thru our own brand of emulation." Now who doesn't think
this translates to: a: reduced precision b: broken functionality with
later DX requirements? The former might not matter more than a
****hole in a snowbank in many instances, the latter...who wants to
spend $200-400+ on a piece of hardware that is not even immediately
future-proof? ****! Come on, Brian! Perhaps if the hardware supported
the API a little better upon inception, this last-minute, knees-bent
running around looking for leaves big enough to cover your asses
wouldn't be necessary. "I did it my way" worked for Sinatra. Indeed,
we shall see how well this works for Nvidia.)

We are committed to working with Gabe to fully understand his
concerns and with Valve to ensure that 100+ million NVIDIA consumers
get the best possible experience with Half Life 2 on NVIDIA hardware."

(Calling Gabe's evaluation invalid *ESPECIALLY when fog doesn't work*
is hardly a step in the right direction. It's laughable. There are no
doubt good reasons in Gabe's mind why he chose not to use the Det
50's. The real question is, if the public were to see chapter and
verse of these reasons, how do YOU think Nvidia would look in the
eyes of the community, Brian? The million-dollar question is: "Did
Valve optimize for ATI architecture at the expense of Nvidia?" If so,
it's not like this sort of thing wasn't funded by Nvidia in the past,
one's own medicine always tastes foul it seems. But really, if
Valve's dev team was just using reference API calls, and this works
better with ATI than with Nvidia---in fact it does, and this is
supported by several benchmarks---and Nvidia hardware is just not
measuring up, then perhaps Nvidia should throw some more time and
money at Gabe et al to help them obtain more favorable results using
proprietary codepaths, or STFU and go back to driver cheating which
apparently is what they are prioritizing.)

Brian Burke
NVIDIA Corp.

(Pixar on a single chip...so what if it takes as long to render a
scene...;-) Maybe "something hallucinogenic to smoke" should be
bundled with Nvidia cards...that way people could see the Emperor's
New Clothes clear as day, right next to the pink elephants..."if you
can't dazzle 'em with brilliance, baffle 'em with bull****" should be
the corporate mantra of the millennium)


--
Strontium

"It's no surprise, to me. I am my own worst enemy. `Cause every
now, and then, I kick the livin' **** `outta me." - Lit


  #4  
Old September 12th 03, 06:13 AM
Dave
external usenet poster
 
Posts: n/a
Default


"JAD" wrote in message
nk.net...
I am certainly not offended by your post, but I have to say that if you put
this kind of conviction in everything you do then I suspect you will be very
successful.


Actually, I am, in spite of myself...please don't tell anyone, wouldn't
wanna spoil my carefully contrived bad image. ;-)

However, if you find this taking up every minute of your day, worrying about
Nvidia and ATI then, damn, dude, turn off the juice and get out more.


I came, I saw, I read the statement, I gave my opinion, took me about five
minutes to react and think, ten minutes to type, that's pretty much "all she
wrote". I don't worry about much more than making a compelling argument
here. Nvidia's future won't affect mine all too much, neither will ATI's. I
dumped my Nvidia stock some time ago. Really, I don't worry very much about
anything, despite appearances to the contrary. There are certainly far
larger priorities to me than a video card company to worry about in any
case. I may say a lot of the same things in the same places, but I'm not
obsessive at all...just somewhat consistent.

Wouldn't it be weird if the picture appeared of both CEOs playing golf together
again and Bill Gates was making it a threesome?


I might be tempted to make it into a dartboard and market it...;-)


snip.


  #5  
Old September 12th 03, 06:20 AM
Mark Leuck
external usenet poster
 
Posts: n/a
Default


"Strontium" wrote in message
...
Blah, blah, blah.
Drivers, drivers, drivers.....

YADA, YADA, YADA

Dippity doo dah, dippity day...

'Crazy jig, in the middle of the street'


It's already been stated, elsewhere (and NO I'm not going to supply
references as this ****ing fanboy war has produced enough), that the nVidia
hardware just does NOT cut it.

Talk about drivers, all you want. If you want your card to run on drivers,
alone.....have fun fanbois


I'd think ANY card wouldn't run worth crap without drivers


  #6  
Old September 12th 03, 06:26 AM
Strontium
external usenet poster
 
Posts: n/a
Default

-
Dave stood up at show-n-tell, in TBc8b.421235$Ho3.66910@sccrnsc03, and said:

"JAD" wrote in message
nk.net...
I am certainly not offended by your post, but I have to say that if you
put this kind of conviction in everything you do then I suspect you will
be very successful.


Actually, I am, in spite of myself...please don't tell anyone,
wouldn't wanna spoil my carefully contrived bad image. ;-)

However, if you find this taking up every minute of your day,
worrying about Nvidia and ATI then, damn, dude, turn off the juice and
get out more.


I came, I saw, I read the statement, I gave my opinion, took me about
five minutes to react and think, ten minutes to type, that's pretty
much "all she wrote". I don't worry about much more than making a
compelling argument here.


Or, starting a flamewar? Pffffff! If it was so 'blasé' to you, why in
the Hell did you decide to x-post propaganda? The only reason possible is
that you get off on starting arguments. As knowledgeable as you put
yourself forth to be about both cards, it would seem you've read both of
these groups and knew that such a post would incite a flamewar.
So, drop the 'innocent' act. The fact that you knew this group pretty much
nails that.


Nvidia's future won't affect mine all too
much, neither will ATI's. I dumped my Nvidia stock some time ago.
Really, I don't worry very much about anything, despite appearances
to the contrary. There are certainly far larger priorities to me than
a video card company to worry about in any case. I may say a lot of
the same things in the same places, but I'm not obsessive at
all...just somewhat consistent.


Then, why make a point to x-post something that is sure to start a flamewar?



Wouldn't it be weird if the picture appeared of both CEOs playing golf
together again and Bill Gates was making it a threesome?


I might be tempted to make it into a dartboard and market it...;-)


You seem practiced in the art of 'controversy'. Maybe, one day, someone will
be throwing darts at 'your' head



snip.


--
Strontium

"It's no surprise, to me. I am my own worst enemy. `Cause every
now, and then, I kick the livin' **** `outta me." - Lit


  #7  
Old September 12th 03, 06:28 AM
Strontium
external usenet poster
 
Posts: n/a
Default


-
Mark Leuck stood up at show-n-tell, in JIc8b.322583$Oz4.112906@rwcrnsc54,
and said:

"Strontium" wrote in message
...
Blah, blah, blah.
Drivers, drivers, drivers.....

YADA, YADA, YADA

Dippity doo dah, dippity day...

'Crazy jig, in the middle of the street'


It's already been stated, elsewhere (and NO I'm not going to supply
references as this ****ing fanboy war has produced enough), that the
nVidia hardware just does NOT cut it.

Talk about drivers, all you want. If you want your card to run on
drivers, alone.....have fun fanbois


I'd think ANY card wouldn't run worth crap without drivers


Cute joke, but not very good. I said 'alone' (i.e. minus hardware, for the
illiterate).


--
Strontium

"It's no surprise, to me. I am my own worst enemy. `Cause every
now, and then, I kick the livin' **** `outta me." - Lit


  #8  
Old September 12th 03, 06:38 AM
Dave
external usenet poster
 
Posts: n/a
Default


"Strontium" wrote in message
...
Blah, blah, blah.
Drivers, drivers, drivers.....

YADA, YADA, YADA

Dippity doo dah, dippity day...

'Crazy jig, in the middle of the street'


Thank you, drive thru...I suppose this is your copout for not having
anything meaningful to add? And now, predictably enough, you'll become
petulant and insulting, right? Come on. You know you want to. Humor me.
Beyatch. ;-)

It's already been stated, elsewhere (and NO I'm not going to supply
references as this ****ing fanboy war has produced enough), that the nVidia
hardware just does NOT cut it.


Perhaps not in HL2, but let's take a look at the OpenGL picture, shall we?
My rant was specifically targeted, not representative of my general
overview. And it was not intended to start a fanboi flamefest, contrary to
the abject ignorance of some folks.

Talk about drivers, all you want. If you want your card to run on drivers,
alone.....have fun fanbois


Seems it went right over your head. Your hair ain't parted in the middle
from all those near misses, perchance, is it? If you think I'm a fanboi, I
think you need to get a life and a clue, not necessarily in that order. You
might find them in Aisle 6 in Wal-Mart. For the record, I'm playing a common
'tater, with agi 'tater tendencies. Oh, and BTW, as far as drivers are
concerned, seems cards won't run very well without them, now imagine that!
What else do they run on? Fossil fuels? A pile of ammonium dichromate with a
magnesium ribbon as a wick? Maybe some of yer C17 H21 NO4? Heh! ;-) If that
were the case, maybe HL2 would run faster...


  #9  
Old September 12th 03, 07:30 AM
Dave
external usenet poster
 
Posts: n/a
Default


"Strontium" wrote in message
...
-
Dave stood up at show-n-tell, in TBc8b.421235$Ho3.66910@sccrnsc03, and said:


snip

I came, I saw, I read the statement, I gave my opinion, took me about
five minutes to react and think, ten minutes to type, that's pretty
much "all she wrote". I don't worry about much more than making a
compelling argument here.


Or, starting a flamewar? Pffffff!


Nope. Bzzt, wrong. Thank you for playing. Next contestant, please. Who was
the first to hop on the bandwagon and flame me so far? You. Says a lot for
you, eh? ;-) If that was indeed my intent, you played oh so conveniently
right into my scheme in a very timely fashion no less. That would make you a
fish.
I dangled my johnson, and you jumped right into the car and stuck your head
immediately in my lap. Sucker!

I repeat: if that was my intent.

If it was so 'blasé' to you, why in
the Hell did you decide to x-post propaganda?


Propaganda? ROTFL. Perhaps the truth as seen from the eyes of a consummate
cynic, but I'd really like to see you prove conclusively that anything I've
stated wasn't just my opinion of how Nvidia's official response appeared to
me, based on their track record and established fact. I'm not terribly
concerned about who it influences. There goes your propaganda schtick.


The only reason possible is
that you get off on starting arguments.


So why, pray tell, are you sitting there taking knee-jerk potshots at me,
other than to provide me with amusement? Please, do tell...oh, let me guess,
you reflexively felt the urge to flame me, is that it? Silly wabbit...

"Your five minutes is up"
"No it isn't"
"Yes it is"
"Then why are you still arguing with me?"
"I am not"
"Yes you are..."

Your deductive reasoning could use a little work, my friend...

As knowledgeable as you put
yourself forth to be about both cards,


I don't claim to be. I offer opinion, I offer advice, every now and then
someone gets from me what they're begging for, but not without (my own brand
of perhaps demented) humor. Constructive criticism can take many forms.
Sometimes it comes down to "where there's a whip, there's a way". But if you
see something that's flat-out wrong, perhaps you can correct me in a nice,
constructive fashion that is conducive to rational discussion, that is, if
you're at all capable of something more than trite hypocrisy. Feel free.
Surprise me. Elevate thy stature a little beyond the mundane and
all-too-boringly predictable. If you continue to wallow in banality by
insisting upon reducing a potential discussion into another typical, silly
flamefest, I'll just ignore you: I really don't feel like wasting my time
with such a monochromatic viewpoint.

it would seem you've read both of these groups and knew that such a post
would incite a flamewar. So, drop the 'innocent' act. The fact that you knew
this group pretty much nails that.


I've been in and out of here for many years and seen the best and the worst.
If my opinion incites a flamewar, there's nothing I can do to compensate for
people's lack of understanding, now is there? If that's the way you choose
to view things, then I challenge you to step outside of the context of your
skewed opinion and take a good, long objective look at what's really going
on behind the scenes. Or don't. Not my problem in either case. I can live
quite comfortably with my opinion. Question is, can you?

Nvidia's future won't affect mine all too
much, neither will ATI's. I dumped my Nvidia stock some time ago.
Really, I don't worry very much about anything, despite appearances
to the contrary. There are certainly far larger priorities to me than
a video card company to worry about in any case. I may say a lot of
the same things in the same places, but I'm not obsessive at
all...just somewhat consistent.


Then, why make a point to x-post something that is sure to start a flamewar?

It is an issue that is germane to both groups. The foibles of human nature
are not my concern here.

Wouldn't it be weird if the picture appeared of both CEOs playing golf
together again and Bill Gates was making it a threesome?


I might be tempted to make it into a dartboard and market it...;-)


You seem practiced in the art of 'controversy'.


That was not my intent. My intent was to promote discussion. If it takes the
shape of a flamefest, it's just typical Usenet for ya. If you react this way
to strong opinions as a matter of course, perhaps you shouldn't read things
that provoke you thusly...or maybe *you* just like milking your anxiety
gradient for kicks, I dunno...

Maybe, one day, someone will
be throwing darts at 'your' head


The search for fame and recognition is not exactly on my "to do" list this
year. However, if someone would like to throw darts at my picture, any
publicity is good publicity I suppose...;-)



  #10  
Old September 12th 03, 07:44 AM
Dave
external usenet poster
 
Posts: n/a
Default


"Strontium" wrote in message
...



Cute joke, but not very good. I said 'alone' (i.e. minus hardware, for the
illiterate).


Nice try, but the phrase "If you want your card to run on drivers, alone"
presumes the existence of a card running on something, right?
Not all your fault, really. You were a chem major, not an English major, I'm
assuming...your concept of literacy appears rather vague here.


 



