A computer components & hardware forum. HardwareBanter



Response by Nvidia concerning HL2 *warning, lengthy post, strong opinion content, some bad langwidge* NC-17 rating administered...



 
 
#11 - September 12th 03, 07:52 AM - Strontium

-
Dave stood up at show-n-tell, in LZc8b.423934$uu5.76802@sccrnsc04, and said:

"Strontium" wrote in message
...
Blah, blah, blah.
Drivers, drivers, drivers.....

YADA, YADA, YADA

Dippity doo dah, dippity day...

'Crayz jig, in the middle of the street'


Thank you, drive thru...I suppose this is your copout for not having
anything meaningful to add? And now, predictably enough, you'll become
petulant and insulting, right? Come on. You know you want to. Humor
me. Beyatch. ;-)



Umm, and your statement was more meaningful, how? Oh...wait, you are
perpetuating the flamewar......


It's already been stated, elsewhere (and NO I'm not going to supply
references as this ****ing fanboy war has produced enough), that the
nVidia hardware just does NOT cut it.


Perhaps not in HL2, but let's take a look at the OpenGL picture,
shall we? My rant was specifically targeted, not representative of my
general overview. And it was not intended to start a fanboi
flamefest, contrary to the abject ignorance of some folks.


I've seen no difference, OpenGL-wise, between any cards. But, then, OpenGL
seems to be picking up lately.



Talk about drivers, all you want. If you want your card to run on
drivers, alone.....have fun fanbois


Seems it went right over your head. Your hair ain't parted in the
middle from all those near misses, perchance, is it? If you think I'm
a fanboi, I think you need to get a life and a clue, not necessarily
in that order. You might find them in Aisle 6 in Wal-Mart.


Well, to put it BLUNTLY, you x-posted this CRAP to two well-known flame
beds. So, if anyone has a question, it is me: Are you just a moron? Or a
****ing flamewar-wanting dickhead? I gotta ask, because you seem to be
'egging' it on. Regardless, you've made my killfile, Mr. 'I'm just here to
have intelligent discussion, while I throw controversial **** out there and
hope for flamewars'. **** off. SEE YA, MR. I'M JUST WANTING CONVERSATION
WHILE CROSSPOSTING CRAP TO TWO WELL-KNOWN FLAMING GROUPS. Yeah, I believe
you just wanted meaningful, intelligent discussion. NOT.

See ya, sparky.



For the
record, I'm playing a common 'tater, with agi 'tater tendencies.



No, you are being a troll. At least, that's what it's called these days.


snip



--
Strontium

"It's no surprise, to me. I am my own worst enemy. `Cause every
now, and then, I kick the livin' **** `outta me." - Lit


#12 - September 12th 03, 07:53 AM - Strontium


-
Dave stood up at show-n-tell, in IJd8b.424186$uu5.76736@sccrnsc04, and said:

"Strontium" wrote in message
...
-
Dave stood up at show-n-tell, in TBc8b.421235$Ho3.66910@sccrnsc03,
and said:


snip


Your very reply proves my point. Go argue with your fist for not jerking
your small snub today. Today, I've killfiled you. Sorry, no herky-jerky
here. CYA, **** FACE.




Strontium

"It's no surprise, to me. I am my own worst enemy. `Cause every
now, and then, I kick the livin' **** `outta me." - Lit


#13 - September 12th 03, 01:17 PM - SST

LOL!! I love the street jig visual.

You're sort of right; I keep reading about optimized drivers, drivers, and
more drivers. Maybe nVidia will release a driver for every game made so they
can optimize them to death. That might work. They could even charge, let's
say, $10 or $20 for 'special optimized' driver packs on their web site.
Hehehe...



BTW: We hate Dave too, but be careful - he's crazy and strangely obsessive. I
don't respond to him; actually, he's in my kill file, so I only get dribs and
drabs of his drool.


"Strontium" wrote in message
...
Blah, blah, blah.
Drivers, drivers, drivers.....

YADA, YADA, YADA

Dippity doo dah, dippity day...

'Crayz jig, in the middle of the street'


It's already been stated, elsewhere (and NO I'm not going to supply
references as this ****ing fanboy war has produced enough), that the nVidia
hardware just does NOT cut it.

Talk about drivers, all you want. If you want your card to run on drivers,
alone.....have fun fanbois


-
Dave stood up at show-n-tell, in zZb8b.420297$o%2.191281@sccrnsc02, and
said:

I think we'll bother the ATI forum with this as well. Might hate me
now, but you might thank me later...

"Lee Marsh" wrote in message
...
We have a response straight from NVIDIA addressing the recent storm
that


snip.

Let's just schlep the whole statement right into this thread, shall
we? Naturally, some accompanying pithy and more not-so-concise jabs
(wouldn't be myself without them...). This is not a troll, nor a
mission for me to lambaste Nvidia, although I do my share of it. I
would really like to see them succeed, provided they knock off the
BS. This is to point out some inconsistencies I see just jumping out
of this statement. I read it four ****ing times, and each time, the
same things hit me in the same place: night in the ruts, so here's
some noise from those bruised jimmies, as I feel that this type of
nonsense really underestimates people's intelligence...


"Over the last 24 hours, there has been quite a bit of controversy
over comments made by Gabe Newell of Valve at ATI's Shader Day.

(Fun's just beginning. A whole can of worms has been opened to the
masses WRT Nvidia actual DX9 shader performance)

During the entire development of Half Life 2, NVIDIA has had close
technical contact with Valve regarding the game. However, Valve has
not made us aware of the issues Gabe discussed.

(I reiterate: So much for close technical contact. But Brian, being
the nice guy and PR flack he is---I wouldn't expect him to say much
differently than all this.)

We're confused as to why Valve chose to use Release 45 (Rel. 45) -
because up to two weeks prior to the Shader Day we had been working
closely with Valve to ensure that Release 50 (Rel. 50) provides the
best experience possible on NVIDIA hardware.

(Missing fog, perhaps? Or maybe screenshot artificial augmentation? Or
something else the general public is not privy to, that Nvidia might
be exploiting for PR's sake on the basis of lack of commonly
available info, and Gabe is a little more dignified at the moment to
sling the big mudballs with specifics? Who would put it past Nvidia
after all said and done?)

Regarding the Half Life 2 performance numbers that were published on
the web, we believe these performance numbers are invalid because
they do not use our Rel. 50 drivers. Engineering efforts on our Rel.
45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's
optimizations for Half Life 2 and other new games are included in our
Rel.50 drivers - which reviewers currently have a beta version of
today. Rel. 50 is the best driver we've ever built - it includes
significant optimizations for the highly-programmable GeForce FX
architecture and includes feature and performance benefits for over
100 million NVIDIA GPU customers.

(So, essentially we should use whichever optimized driver set
provides the best performance with whichever game it was designed to
speed up, regardless of whether they're a: released to the public and
b: WHQL certified? So what if it breaks performance and/or
functionality with other things, or previously implemented
workarounds? And stating that the 50's are the best set yet...sans
fog...is a little ludicrous. Talk is cheap. Release the drivers RFN
and let the people be the judge, if you dare...)

Pending detailed information from Valve, we are only aware of one bug
with Rel. 50 and the version of Half Life 2 that we currently have -
this is the fog issue that Gabe referred to in his presentation. It
is not a cheat or an over optimization. Our current drop of Half Life
2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public
before the game is available. Since we know that obtaining the best
pixel shader performance from the GeForce FX GPUs currently requires
some specialized work, our developer technology team works very
closely with game developers. Part of this is understanding that in
many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9)
provides no image quality benefit. Sometimes this involves converting
32-bit floating point precision shader operations into 16-bit
floating point precision shaders in order to obtain the performance
benefit of this mode with no image quality degradation. Our goal is
to provide our consumers the best experience possible, and that means
games must both look and run great.

(How much time is a developer expected to spend special-case
optimizing engines for hardware that does not fully conform to
reference specs, or implements them in a near-unplayable fashion with
what is trying to be accomplished from a creative standpoint?
Regardless of whether it's the result of any failure in relations between
Nvidia and Microsoft. How much grease is this gonna take? And
downplaying the missing fog bug IMHO is a misstep. If the proper
implementation of that fog would skew results unfavorably in the
slightest---mind you, I can't say one way or another: I don't have
these drivers, and I'm not a developer---how does one think they have
ANY leeway whatsoever in their insistence such a driver should be
used as part of a valid performance assessment, let alone providing
the best possible experience? Maybe these drivers should be leaked
and the general public could see for themselves where the numbers
lie---and I chose that word for a reason---in their own evaluations?
Quite frankly I feel that regardless of how true it may be that
16-bit FP precision and PS 1.4 are more economical, efficient
codepaths in some instances without performance or IQ hit, telling a
developer that after he's coded the damn thing around DX9 PS 2.0
reference calls and now has to push up the release date or burn some
midnight oil just to make a wimpy pipeline look better is either
inevitable devrel suicide or expensive. In any case, it's no excuse
for the failure to measure up to all the spewed hype, let alone
reference standards. The latter part of the above paragraph reads
like "Sometimes, using earlier iterations of advanced features that
happen to be part of the spec our product was ostensibly designed and
hyped to the moon to support makes the game run much faster".)
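
To put a number on that 16-bit vs. 32-bit point, here is a rough,
self-contained C++ sketch (illustrative only: the function name and sample
values are made up, and it ignores half-float range limits and proper
rounding) of how much mantissa survives when a full-precision float is
squeezed into half-precision storage:

#include <cstdint>
#include <cstdio>
#include <cstring>

// Truncate a 32-bit float's 23-bit mantissa down to the 10 bits a half keeps.
static float truncate_to_half_precision(float value) {
    uint32_t bits;
    std::memcpy(&bits, &value, sizeof(bits));
    bits &= 0xFFFFE000u;            // clear the low 13 mantissa bits
    std::memcpy(&value, &bits, sizeof(value));
    return value;
}

int main() {
    // Shader-ish quantities: a texture coordinate, a value near 1.0,
    // and two larger world-space distances.
    const float samples[] = { 0.123456789f, 1.000061f, 4096.37f, 60000.77f };
    for (float s : samples) {
        const float h = truncate_to_half_precision(s);
        std::printf("fp32 %.7f  ->  ~fp16 %.7f  (error %.7f)\n", s, h, s - h);
    }
    return 0;
}

The small values barely move, while the bigger ones drift by a noticeable
fraction of a unit, which is why blanket precision demotion is a per-shader
judgment call rather than a free lunch.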

The optimal code path for ATI and NVIDIA GPUs is different - so
trying to test them with the same code path will always disadvantage
one or the other. The default settings for each game have been chosen
by both the developers and NVIDIA in order to produce the best
results for our consumers.

(Looking at some preliminary results would tend to provide some
contradictions to this latter assertion...a spoonful of truth,
followed by a spoonful of bull****, perhaps? Nothing new under the
sun in PR-speak...)
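
For what it's worth, the "different code path per GPU" idea itself is not
mysterious; something like the following C++ sketch is all it takes
(hypothetical enum and policy, not Valve's actual logic; a real DirectX 9
application would read the vendor ID via IDirect3D9::GetAdapterIdentifier()
and check the pixel shader caps rather than hard-coding anything):

#include <cstdint>
#include <cstdio>

enum class ShaderPath { DX9_PS20_Full, DX9_PS20_Partial, DX81_PS14 };

constexpr uint32_t kVendorNvidia = 0x10DE;  // PCI vendor IDs
constexpr uint32_t kVendorAti    = 0x1002;

// Hypothetical policy: full-precision PS 2.0 on ATI, mixed/partial precision
// on GeForce FX, and a DirectX 8.1 fallback for anything without PS 2.0.
ShaderPath choose_shader_path(uint32_t vendor_id, bool supports_ps20) {
    if (!supports_ps20)             return ShaderPath::DX81_PS14;
    if (vendor_id == kVendorAti)    return ShaderPath::DX9_PS20_Full;
    if (vendor_id == kVendorNvidia) return ShaderPath::DX9_PS20_Partial;
    return ShaderPath::DX9_PS20_Full;  // unknown vendor: trust the reference spec
}

int main() {
    std::printf("ATI (PS 2.0)    -> path %d\n",
                static_cast<int>(choose_shader_path(kVendorAti, true)));
    std::printf("NVIDIA (PS 2.0) -> path %d\n",
                static_cast<int>(choose_shader_path(kVendorNvidia, true)));
    std::printf("DX8-class card  -> path %d\n",
                static_cast<int>(choose_shader_path(0x8086, false)));
    return 0;
}

The fight is over what that policy table should say and who pays to fill it
in, not over whether such a switch can exist.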

In addition to the developer efforts, our driver team has developed a
next-generation automatic shader optimizer that vastly improves
GeForce FX pixel shader performance across the board. The fruits of
these efforts will be seen in our Rel.50 driver release. Many other
improvements have also been included in Rel.50, and these were all
created either in response to, or in anticipation of the first wave
of shipping DirectX 9 titles, such as Half Life 2.

(Read this: "We're painfully aware our DX9 shader performance sucks
bricks thru a straw compared to ATI's, although you won't EVER hear
this from us, mind you, so we're adding the overhead of a translation
layer to Cg function calls, thereby circumventing reference
functionality thru our own brand of emulation." Now who doesn't think
this translates to: a: reduced precision b: broken functionality with
later DX requirements? The former might not matter more than a
****hole in a snowbank in many instances, the latter...who wants to
spend $200-400+ on a piece of hardware that is not even immediately
future-proof? ****! Come on, Brian! Perhaps if the hardware supported
the API a little better upon inception, this last-minute, knees-bent
running around looking for leaves big enough to cover your asses
wouldn't be necessary. "I did it my way" worked for Sinatra. Indeed,
we shall see how well this works for Nvidia.)
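
And the much-advertised "automatic shader optimizer" is, conceptually, just a
pass over the shader's instruction list. Here is a deliberately toy C++ sketch
(invented instruction struct and analysis flag, nothing to do with NVIDIA's
real driver) of one such pass, demoting full-precision arithmetic to partial
precision when its result only ever reaches the color output:

#include <cstdio>
#include <string>
#include <vector>

enum class Precision { Full32, Partial16 };

struct Instruction {
    std::string op;          // e.g. "dp3", "mad", "mul"
    std::string dest;        // destination register
    bool feeds_color_only;   // toy analysis: value only reaches the color output
    Precision precision = Precision::Full32;
};

// The "optimizer": demote precision wherever the toy analysis says it is safe.
static int demote_precision(std::vector<Instruction>& program) {
    int demoted = 0;
    for (Instruction& inst : program) {
        if (inst.feeds_color_only && inst.precision == Precision::Full32) {
            inst.precision = Precision::Partial16;
            ++demoted;
        }
    }
    return demoted;
}

int main() {
    std::vector<Instruction> pixel_shader = {
        {"dp3", "r0", true},   // lighting term used only for the final color
        {"mad", "r1", false},  // feeds a texture coordinate: keep full precision
        {"mul", "r2", true},
    };
    const int demoted = demote_precision(pixel_shader);
    std::printf("demoted %d of %zu instructions to partial precision\n",
                demoted, pixel_shader.size());
    return 0;
}

Whether a real driver's version of that analysis is conservative enough to
count as an optimization rather than a cheat is the crux of the argument
above.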

We are committed to working with Gabe to fully understand his
concerns and with Valve to ensure that 100+ million NVIDIA consumers
get the best possible experience with Half Life 2 on NVIDIA hardware."

(Calling Gabe's evaluation invalid *ESPECIALLY when fog doesn't work*
is hardly a step in the right direction. It's laughable. There are no
doubt good reasons in Gabe's mind why he chose not to use the Det
50's. The real question is, if the public were to see chapter and
verse of these reasons, how do YOU think Nvidia would look in the
eyes of the community, Brian? The million-dollar question is: "Did
Valve optimize for ATI architecture at the expense of Nvidia?" If so,
it's not like this sort of thing wasn't funded by Nvidia in the past,
one's own medicine always tastes foul it seems. But really, if
Valve's dev team was just using reference API calls, and this works
better with ATI than with Nvidia---in fact it does, and this is
supported by several benchmarks---and Nvidia hardware is just not
measuring up, then perhaps Nvidia should throw some more time and
money at Gabe et al to help them obtain more favorable results using
proprietary codepaths, or STFU and go back to driver cheating which
apparently is what they are prioritizing.)

Brian Burke
NVIDIA Corp.

(Pixar on a single chip...so what if it takes as long to render a
scene...;-) Maybe "something hallucinogenic to smoke" should be
bundled with Nvidia cards...that way people could see the Emperor's
New Clothes clear as day, right next to the pink elephants..."if you
can't dazzle 'em with brilliance, baffle 'em with bull****" should be
the corporate mantra of the millennium)


--
Strontium

"It's no surprise, to me. I am my own worst enemy. `Cause every
now, and then, I kick the livin' **** `outta me." - Lit




#14 - September 12th 03, 03:26 PM - JAD

They could even charge, let's say, $10 or $20 for 'special optimized' driver
packs on their web site.
Hehehe...

Careful, ATI has already done this...
to some extent...

"SST" wrote in message news
snip

#15 - September 12th 03, 05:11 PM - Jean



BTW: We hate Dave too, but be careful - he's crazy and strangely obsessive. I
don't respond to him; actually, he's in my kill file, so I only get dribs and
drabs of his drool.



Well, in fact he seems to be the only one with an objective
mind here. Good post.


#16 - September 12th 03, 05:16 PM - Pluvious

On Fri, 12 Sep 2003 00:26:50 -0500, "Strontium" wrote:

-


Or, starting a flamewar? Pffffff! If it was so 'blasé' to you, why in
the Hell did you decide to x-post propaganda? The only reason possible is
that you get off on starting arguments. As knowledgeable as you put
yourself forth to be about both cards, it would seem you've read both of
these groups and, in so knowing, knew that such a post would incite a
flamewar. So, drop the 'innocent' act. The fact that you knew this group
pretty much nails that.


OH please. He was discussing both ATI and Nvidia and posted as such. He's not
trying to start a flame war; he's trying to discuss this issue with the people
that are involved. Seems that you are the only one who is 'flaming' him, and I
think you're just an instigator. STFU.

Pluvious


#18 - September 12th 03, 08:29 PM - Tim Miser

"Strontium" wrote in message
...

Your very reply proves my point. Go argue with your fist for not jerking
your small snub today. Today, I've killfiled you. Sorry, no herky-jerky
here. CYA, **** FACE.


Ah yes, so predictable. Dave gets the best of you so the only thing left
for you to do to "save face" is name calling. Brilliant!



#20 - September 12th 03, 09:24 PM - ST

He's a freak!


"Jean" wrote in message
.. .


BTW: We hate Dave too, but be careful - he's crazy and strangely obsessive. I
don't respond to him; actually, he's in my kill file, so I only get dribs and
drabs of his drool.



Well, in fact he seems to be the only one with an objective
mind here. Good post.





 



