A computer components & hardware forum. HardwareBanter



Kyle Bennett (HardOCP) blasts NVIDIA

  #1  
Old August 12th 03, 07:26 PM
Radeon350
external usenet poster
 
Posts: n/a
Default Kyle Bennett (HardOCP) blasts NVIDIA

quote:
"The last year has been an incredible year when it comes to video
cards. No wait, it has not. It has been an incredible year when it
comes to video card companies that develop the technology. As far as
the actual hardware goes, the Radeon R3XX series still remains the
most impressive VPU introduction in recent memory as it truly
revolutionized the way we play games and continues to do so. On the
other end of the scale we have the GeForceFX 5800 series that without
a doubt turned into the biggest joke in our community since Matrox
brought us the Parhelia. That is not what I want to talk about
however. I want to talk about the way NVIDIA is treating the people
that have bought its products recently and every enthusiast that keeps
up with the goings on in the world of VPU/GPUs.

ATI has been building targets for NVIDIA to aim at for the last year
and NVIDIA has done little more than shoot themselves in the foot
every chance they have gotten. Let's recap a couple of the highlights,
shall we?

On the topic of 3DMark03, everyone but NVIDIA, Futuremark, and their
lawyers will tell you that NVIDIA cheated in that benchmark to get a
better score. No matter what the press releases say about
optimizations, I think many of us know very well that it was cheating.
You can characterize NVIDIA's actions as "optimizations" and argue
that point well, but to accept that take on the issue you have to
throw out everything you know about past experience when it comes to
benchmarking. The way that NVIDIA went about optimizing for 3DMark03
recently can only be looked at as an attack on the enthusiast
community and everything we have built and believe in. NVIDIA's
actions and optimizations for 3DMark03 tear at the very fabric of our
community and violate some unwritten laws. I personally put no real
value in 3DMark03 scores, but many folks did, and certainly it is used
by large OEMs. NVIDIA got caught with their hand in the cookie jar and
when they pulled it out, they gave everyone the bird.

I cannot be sure of their motives behind those actions, but NVIDIA has
assured us that they have taken measures so that it will not happen
again. Sadly, the person they need to be talking to is you. The person
they need to be apologizing to is you. What NVIDIA has done in the
world of benchmarks is inexcusable and they simply owe us all an
apology for dumping on every person that ever supported them.

We took NVIDIA to task earlier this year on the topic of image quality
and they got down to brass tacks and fixed some things that were
wrong. We were impressed, but they seem to be backsliding. We recently
addressed the issue of NVIDIA and their Trilinear Filtering that they
use in UT2K3. I still stand behind our results, but that article
raised a wealth of other questions. Why is NVIDIA "decreasing"
Trilinear Filtering samples? Why do they force the optimization in
this one game? Why do I not have the control over image quality that I
think I should have?

It seems to me that NVIDIA has implemented a legitimate
optimization with UT2K3 in order to win what is a very widely used
benchmark. I personally do not have an issue with that. The
optimizations present in the benchmark are also applied in the
game, which is widely played, and many people might find them useful
during gameplay. The issue with this "quasi-Trilinear Filtering"
optimization is that you cannot turn it off should you wish to. If you
go into the UT2K3 video setup GUI, there is a checkbox there for
"Trilinear Filtering", so it seems obvious to me that the game
developer thought Trilinear Filtering would enhance the game. Some
folks are upset that they do not get to enable this feature on
NVIDIA GFFX cards as it is commonly understood to work.

Quite frankly, I don't like having that ability taken from me
either. I think when we spend $400 or $500 on a video card, we should
be able to turn on true Trilinear Filtering in the game if that is
what we want, and NVIDIA has taken this away from us. We spoke to
NVIDIA about this months ago, and we thought we had negotiated an
answer that would make everyone happy. We have publicly referred to
this multiple times. We expected a control in their new drivers that
would allow the application to set influences that affect image
quality.

As of last night, using NVIDIA's new driver set, we were not able to
turn on true Trilinear Filtering in UT2K3. We went to NVIDIA and asked
about this. They explained that we had never been told what we thought
we had been told. It seemed to turn into a semantics game; one I did
not feel like playing. Why NVIDIA is refusing to give this option to
the enthusiasts is simply beyond me. The fact of the matter is that
this option is not our birthright or anything close. NVIDIA has a
fiduciary duty to its stockholders to make them money and apparently
they think this decision on forcing their optimizations on the end
user is one that will go their way and help them be profitable.

Now we are hearing rumors of NVIDIA once again joining Futuremark's
3DMark Beta program. As of this morning NVIDIA PR neither confirmed
nor denied the rumor. If this is true, it simply leaves me numb after
all the efforts that NVIDIA has expended to discredit Futuremark. We
have already seen Futuremark roll over at NVIDIA's command here
recently. What integrity Futuremark had was spent this year. I am not
sure how anyone can actually use their benchmarks now and think that
there is some semblance of objectivity. Futuremark takes direct
payments from the companies that profit from the hardware it
benchmarks and that is unacceptable as it is a glaring conflict of
interest. We are talking about payments adding up to millions of
dollars. At this point, if NVIDIA does in fact climb back into the
Futuremark Beta program I can only think of the move as being
laughable.

I am sorry, but I have now had enough. NVIDIA needs to fess up to
their actions publicly. NVIDIA needs to apologize for their actions
publicly. NVIDIA needs to spell out the corrections they are going to
make publicly. NVIDIA needs to make the community aware of the
optimizations they will make that affect benchmark scores in games and
synthetics. NVIDIA needs to treat the community with the respect they
deserve. If they do not, the enthusiast community needs to go spend
their money with the competition and urge all the people that ask them
for buying advice to do the same.

The bottom line is this. NVIDIA has broken a sacred trust between
themselves and the community and unless they get their issues together
very quickly and address them, I have a feeling that many more of you
will not be buying their products. NVIDIA's current line of cards is
very strong and they look to be good products, but as a consumer I
would personally have a hard time giving them my money right now. The
only real power we consumers have is to vote with our wallets.

ATI and NVIDIA need to pull out of the Futuremark Beta Program and
recommit themselves to focusing on their customers' gaming experience.
But, this is all just my opinion.

Feel free to steal, rip, copy, fax, email, and post this article in
whole or in part, as your own words, or as mine."
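For readers unfamiliar with the filtering being argued over: trilinear filtering blends bilinear samples from the two nearest mipmap levels by the fractional level-of-detail, while the "quasi-trilinear" shortcut narrows that blend so most pixels fall back to cheaper single-level sampling. A minimal sketch of the difference, under a toy model (the function names and the `band` parameter are illustrative only, not any vendor's actual driver logic):

```python
# Toy sketch of trilinear vs. reduced-sample ("quasi-trilinear") filtering.
# Hypothetical simplification for illustration; not NVIDIA's or ATI's code.

def trilinear(sample_mip, lod):
    """Blend the two nearest mip levels by the fractional LOD."""
    lo = int(lod)        # nearer (more detailed) mip level
    frac = lod - lo      # fractional distance toward the next level
    return (1 - frac) * sample_mip(lo) + frac * sample_mip(lo + 1)

def quasi_trilinear(sample_mip, lod, band=0.25):
    """Only blend inside a narrow band around each mip transition;
    elsewhere snap to a single level (cheaper, closer to bilinear)."""
    lo = int(lod)
    frac = lod - lo
    if frac < 0.5 - band:
        return sample_mip(lo)        # snap to the nearer level
    if frac > 0.5 + band:
        return sample_mip(lo + 1)    # snap to the farther level
    # remap frac across the narrow band to a 0..1 blend weight
    w = (frac - (0.5 - band)) / (2 * band)
    return (1 - w) * sample_mip(lo) + w * sample_mip(lo + 1)

# Toy "texture": mip level n just returns n, so the blend is visible.
mip = lambda level: float(level)
print(trilinear(mip, 1.5))       # 1.5: halfway blend of levels 1 and 2
print(quasi_trilinear(mip, 1.1)) # 1.0: inside the snap region
```

The snapped regions are where the shortcut saves texture samples; it is also where a visible "mip band" can reappear, which is what the image-quality complaints above are about.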
  #2  
Old August 12th 03, 08:58 PM
Mario Kadastik

B wrote:

Interesting you should mention Futuremark because as of today NVIDIA has
rejoined Futuremark's testing.

http://news.com.com/2100-1040_3-5062667.html?tag=fd_top

I also noticed that you didn't say much about how ATI also cheated on some
of their drivers some time ago.


If you look again at his posting subject and the first word, "quote:",
then you understand that it's a copy-paste from a HardOCP news article.
It was just meant as news.

Mario

PS! Not that your points would have been wrong, just emphasizing that
the words are not his.

  #3  
Old August 13th 03, 01:02 AM
K


"Radeon350" wrote in message
om...
quote:


Kyle Bennett is a complete ****wit at the best of times. Given his past
record on his childish rants his opinions count for nothing.

K


  #4  
Old August 13th 03, 01:52 AM
jack

Hi There

Kyle is right. Nvidia has scared me away after a long time; I've owned
more than 6 different cards in a row.
The performance of NV cards is very unbalanced lately, but they might
improve in the future.
Meanwhile I'm sticking with ATI; it's my pleasure, and they seem
friendly to the enthusiast community.

BYE

Jack


  #5  
Old August 13th 03, 02:53 AM
phobos

K wrote:

"Radeon350" wrote in message
om...

quote:



Kyle Bennett is a complete ****wit at the best of times. Given his past
record on his childish rants his opinions count for nothing.

K


Hence the name "editorial" instead of news. Kyle puts up with a
lot of **** from companies he deals with, and sooner or later he's gonna
let it out.

Just take the hypocrisy that reviewers must face, for example: they must
use an objective benchmark to compare performance between hardware, but
the benchmark itself is crap - yet the hardware vendors still insist on
beating each other in it (completely ignoring any of the [H]'s
often-suggested benchmarking and performance tuning guidelines).

  #6  
Old August 13th 03, 03:42 AM
who be dat?


"K" wrote in message
...

"Radeon350" wrote in message
om...
quote:


Kyle Bennett is a complete ****wit at the best of times. Given his past
record on his childish rants his opinions count for nothing.


Translation: he slammed Nvidia so he doesn't know what he's talking about,
once he praises Nvidia again he'll be a clear/logical thinker who deserves
praise.

Fanboys, gotta love 'em!

Chris Smith


  #7  
Old August 13th 03, 10:16 AM
K


"who be dat?" wrote in message
...

"K" wrote in message
...

"Radeon350" wrote in message
om...
quote:


Kyle Bennett is a complete ****wit at the best of times. Given his past
record on his childish rants his opinions count for nothing.


Translation: he slammed Nvidia so he doesn't know what he's talking about,
once he praises Nvidia again he'll be a clear/logical thinker who deserves
praise.

Fanboys, gotta love 'em!


No fanboy here, my next card is probably going to be a Radeon unless Nvidia
come out with something better. I totally agree with Kyle about 3Dmark03,
it's a completely worthless benchmark but I realised this long before his
tiny brain thought of it.

My point is that because of Mr Bennett's past record of spewing forth utter
drivel, his opinions should be taken with a sack of salt.

K


  #8  
Old August 13th 03, 01:53 PM
chainbreaker

No fanboy here, my next card is probably going to be a Radeon unless
Nvidia come out with something better. I totally agree with Kyle
about 3Dmark03, it's a completely worthless benchmark but I realised
this long before his tiny brain thought of it.

My point is that because of Mr Bennett's past record of spewing forth
utter drivel, his opinions should be taken with a sack of salt.

K


Every card I've had since a Diamond Viper V550 has been an nVidia card, up
to the Ti4200 I now have, but it's looking right now like the next one is
going to be a Radeon 9800 or better.

--
chainbreaker


  #10  
Old August 13th 03, 05:48 PM
chainbreaker

OverKlocker wrote:
one point is 'yes', ATI has done this as well. another would be that
NVIDIA seems to be doing it a lot with their new product (5900 Ultra).
i don't have my new issue of MaximumPC here, but some of you probably
read the article which basically shows that by removing the
'optimizations' the NVIDIA card takes a LOT bigger hit on performance
than the ATI card. that kinda struck me as funny, since MaximumPC is
now showing both cards as their 'high end' choices.


From what I can tell, there seems to be little difference between the two
very top end cards. Right below that, though, the various Radeon offerings
seem to offer more for the money.

--
chainbreaker


 




Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.