A computer components & hardware forum. HardwareBanter

Nvidia panics, scraps NV50 ?????



 
 
  #1  
Old December 5th 04, 11:33 AM
Sham B


If true, it can only mean good things for us. If Nvidia can bring out an ATI
killer, ATI will kick back! Or could it be that ATI is doing to Nvidia what
Nvidia did to 3DFX? Only time will tell!



nVidia *is* 3DFX when it comes to its design staff - it absorbed a lot of 3DFX
people during the takeover. I'm sure nVidia's problems have the same root as
3DFX's: the large amount of heat their designs generate, and the increasing
cost that imposes. At the moment, high-end ATI and nVidia cards are well
matched, and the only real comparison for non-fanboys is price. My opinion is
that nVidia can compete on performance but not on cost per unit. This is
exactly what happened to 3DFX - they could create cards that were as fast, but
they were huge, had high chip counts, and ran hot.

It looks like nVidia know this and are going back to the drawing board to
design out the heat bottleneck. IMO, you won't get faster versions of nVidia
chips because of this, just a more cost-effective base design.

S



  #2  
Old December 5th 04, 12:17 PM
Kokoro

In alt.comp.periphs.videocards.nvidia, Sham B ordered an army of
hamsters to type:



It looks like nVidia know this and are going back to the drawing board to
design out the heat bottleneck. IMO, you won't get faster versions of nVidia
chips because of this, just a more cost-effective base design.




If Nvidia have scrapped NV50, then they are doing more than going back to the
drawing board. It must mean another design already exists that they are going
to switch to. And who is to say a new design won't be faster and cooler at the
same time?


  #3  
Old December 5th 04, 02:17 PM
J. Clarke

Kokoro wrote:


If Nvidia have scrapped NV50, then they are doing more than going back to the
drawing board. It must mean another design already exists that they are going
to switch to.


Either that or the design is so badly hosed at this point that they've given
up trying to fix it.

And who is to say a new design won't be faster and cooler at the
same time?


--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
  #4  
Old December 5th 04, 04:30 PM
NightSky 421

"J. Clarke" wrote in message
...

I think you're going to find that you not only pay full boat for the video
boards but also pay a premium for a motherboard with the necessary slots.
And the performance doesn't look to be anything like 2X.



I saw the benchmarks of GeForce 6800 Ultras in SLI mode compared to single
video card solutions, and I was certainly expecting better results than what
I saw. In my opinion, dual video cards (today) are a waste of time and money
considering how little you gain for the price you're paying. Check out the
following article - they give benchmark results. It's a 13-page article, but
there's an option at the bottom of each page that lets you skip to whatever
page you want.
http://www.techreport.com/reviews/20...t/index.x?pg=1


  #5  
Old December 5th 04, 05:03 PM
Nicholas Buenk


"Sham B" wrote in message
t.net...

My opinion is that nVidia can compete on performance but not on cost per
unit. [snip] IMO, you won't get faster versions of nVidia chips because of
this, just a more cost-effective base design.


The Nvidia 6800 is currently outselling the ATI X800 by a fair margin. They
are making a good profit on their 6800 work, which they can put into a new
core. However, ATI is outselling Nvidia in the low-end and mid-range, as the
9700 Pro family of cards is still paying off big compared to the GeForce FX.
But I think Nvidia made a big comeback with the 6800, and their willingness
to abandon a core shows their determination to beat ATI and deliver a
superior product.


  #6  
Old December 5th 04, 10:17 PM
Bass

On Mon, 6 Dec 2004 04:03:53 +1100, "Nicholas Buenk"
wrote:


But I think Nvidia made a big comeback with the 6800, and their willingness
to abandon a core shows their determination to beat ATI and deliver a
superior product.

Superior? Nothing beats the X800XT Platinum.
  #7  
Old December 6th 04, 02:34 AM
Scotter

Depends on what game/benchmark you are using.
In this Tom's Hardware article, I see a few benchmarks where the 6800 Ultra
and sometimes even the 6800 GT beat the X800 XT PE - Call of Duty, Doom 3,
FarCry, Sims 2, Flight Simulator 2004 - and many other games where all three
of these cards are neck and neck or the X800 XT PE is BARELY fastest.

Call of Duty
http://graphics.tomshardware.com/gra...rmance-13.html

Doom 3
http://graphics.tomshardware.com/gra...rmance-15.html
http://graphics.tomshardware.com/gra...rmance-16.html

FarCry
http://graphics.tomshardware.com/gra...rmance-17.html

Sims2
http://graphics.tomshardware.com/gra...rmance-19.html

Flight Simulator 2004
http://graphics.tomshardware.com/gra...rmance-21.html



"Bass" wrote in message
...
On Mon, 6 Dec 2004 04:03:53 +1100, "Nicholas Buenk"
wrote:


But I think Nvidia made a big comeback with the 6800 [snip]

Superior? Nothing beats the X800XT Platinum.



  #8  
Old December 6th 04, 03:04 AM
Scotter

Great review! Thanks for posting.
Hmmm, I'm actually noticing very nice performance from the SLI nVidia cards,
and even the single 6800 Ultra, on these pages:

Here, nearly double the framerate on some tests:
http://www.techreport.com/reviews/20...t/index.x?pg=3

Here, on the Doom 3 bench, what I'm perceiving is a CPU limitation (a rough
model of this is sketched below):
http://www.techreport.com/reviews/20...t/index.x?pg=4

And we all know Valve optimized Half Life 2 for ATi cards:
http://www.techreport.com/reviews/20...t/index.x?pg=6

I was surprised, though, to see ATi's X850 do so well against nVidia's best
in the FarCry benchmark. I *guess* the lack of Shader 3.0 support just means
the game doesn't look quite as hot on ATi's card, but FPS doesn't suffer:
http://www.techreport.com/reviews/20...t/index.x?pg=9
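To make the CPU-limitation point concrete, here's a rough model of SLI
scaling (a sketch in Python; the frame rates and the SLI efficiency factor
are made-up illustrative numbers, not figures from the article): the
delivered frame rate can never exceed the rate at which the CPU prepares
frames, so a second GPU only helps while the GPUs are the limiting factor.

    # Delivered fps is capped by whichever side is slower.
    # All numbers below are hypothetical, for illustration only.
    def delivered_fps(cpu_fps, gpu_fps, n_gpus=1, sli_efficiency=0.85):
        # SLI never scales perfectly, hence the efficiency factor.
        gpu_side = gpu_fps * (1 + (n_gpus - 1) * sli_efficiency)
        return min(cpu_fps, gpu_side)

    # GPU-bound case (high res + AA): SLI nearly doubles the frame rate.
    print(delivered_fps(cpu_fps=200, gpu_fps=45))            # 45
    print(delivered_fps(cpu_fps=200, gpu_fps=45, n_gpus=2))  # ~83

    # CPU-bound case (like the Doom 3 numbers look): a second GPU adds
    # almost nothing, because the CPU caps the frame rate.
    print(delivered_fps(cpu_fps=70, gpu_fps=60))             # 60
    print(delivered_fps(cpu_fps=70, gpu_fps=60, n_gpus=2))   # 70, CPU cap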

I see that, under load, the X850 XT PE is a big-time power hog.
FAST card, though. I'm impressed, especially that it could sometimes beat two
6800 GTs and even Ultras in SLI mode. Of course, when that happens I suspect
a CPU limitation or driver issue is involved.
But the price is steep, and I still see this card as an ultra-souped-up
9800XT.

When the games I'm playing (right now Half-Life 2, FarCry, UT 2004, and Dawn
of War 40K) drop below 40fps with everything turned on, I'll upgrade away
from my 6800 GT OC, but I really don't see that happening any time soon. And
by then nVidia's next card will be out, which I'm sure will leapfrog ATi's
X850 - assuming ATi doesn't come out with something like an X900 by then.
Ah... gotta love competition spurring on such great technology



"NightSky 421" wrote in message
...
"J. Clarke" wrote in message
...

I think you're going to find that you not only pay full boat for the
video
boards but also pay a premium for a motherboard with the necessary slots.
And the performance doesn't look to be anything like 2X.



I saw the benchmarks of GeForce 6800 Ultras in SLI mode compared to single
video card solutions and I certainly was expecting better results than
what I saw. In my opinion, dual video cards (today) are a waste of time
and money considering how little you gain for the price you're paying.
Check out the following article. They give benchmark results. It's a
13-page article, but there's an option at the bottom of the page which
allows you to skip forward to whatever page you want.
http://www.techreport.com/reviews/20...t/index.x?pg=1



  #9  
Old December 9th 04, 10:11 AM
assaarpa

these cards are neck and neck or the X800 XT PE is BARELY fastest.
[benchmark links snipped]


Pretty sad if it is barely 20% faster (if that!) at over twice the price.
I'd get a single 6800 GT or X800 - both break 60 frames per second with ease
in these benchmarks. Beyond that, it's not going to bring any value for the
investment as far as these games are concerned!

And to rub salt in the wounds, in the game with the lowest performance
(FS 2004) a single X800 is faster than the twice-the-price SLI solution
from NV.
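Put rough numbers on that value argument (hypothetical street prices and
frame rates, purely for illustration - not figures from the benchmarks
above):

    # Cost per frame/s, single card vs SLI. All numbers are hypothetical.
    single_price, single_fps = 400.0, 60.0   # one 6800 GT
    sli_price, sli_fps = 800.0, 72.0         # two cards, ~20% more fps
                                             # (ignores the pricier SLI board)
    print(single_price / single_fps)   # ~6.7 dollars per frame/s
    print(sli_price / sli_fps)         # ~11.1 dollars per frame/s

And once everything is past 60 fps anyway, the extra frames buy almost
nothing in these games.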


... That said, I got a 6800 GT myself because of a programming fetish for 3.0
shaders. The card is:

- noisy
- big (the card in fact covers two SATA connectors on the K8V Deluxe - that's
right, only 2 usable SATA ports left, the ones on the Promise controller)
- large YUV overlays don't seem to work (try those high-definition DVD demo
videos from microsoft.com at 1080i - playback is completely broken, while the
720p ones play smoothly)

Despite these small grievances, guess what, I still wouldn't switch! Simple
reason: 3.0 shaders... argh...




  #10  
Old December 10th 04, 01:57 PM
Nicholas Buenk


"assaarpa" wrote in message
...



... That said, I got a 6800 GT myself because of a programming fetish for 3.0
shaders. The card is:


I have one myself.

- noisy


You can buy a heatsink with a quieter fan separately.

- big (the card in fact covers two SATA connectors on the K8V Deluxe - that's
right, only 2 usable SATA ports left, the ones on the Promise controller)


Blame your motherboard. My Gigabyte K8NS Pro does not have that problem.

- large YUV overlays don't seem to work (try those high-definition DVD demo
videos from microsoft.com at 1080i - playback is completely broken, while the
720p ones play smoothly)


I have noticed no problems with them myself, except that the card doesn't do
decode acceleration the way the specs say it does.
Perhaps that is your problem: you don't have the CPU power to make up for the
lack of decode acceleration - the 6800 actually accelerates less than an
FX 5900 does. What CPU do you have? You really need at least a 3400+ to watch
1080p with this card.
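Counting the pixels a software decoder has to produce per second shows why
the CPU requirement jumps so sharply from 720p to 1080 material (a quick
sketch; the frame/field rates are the nominal ones for these formats - the
actual WMV HD demo clips may differ):

    # Pixels per second a software decoder must produce for each format.
    formats = {
        "720p  (1280x720 at 60 fps)":  1280 * 720 * 60,
        "1080i (1920x1080 at 30 fps)": 1920 * 1080 * 30,  # 60 fields/s
        "1080p (1920x1080 at 60 fps)": 1920 * 1080 * 60,
    }
    for name, px in formats.items():
        print(name, round(px / 1e6), "Mpixel/s")
    # 720p ~55, 1080i ~62, 1080p ~124: with no decode acceleration,
    # all of that work lands on the CPU.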

Despite these small grievances, guess what, I still wouldn't switch!
Simple reason: 3.0 shaders... argh...


They are no big deal really - they just give a speed boost. The genuinely
valuable feature is HDR, which offers far more realistic lighting and is
perhaps the biggest change to lighting effects since Pixel Shader 2.0. But it
slows down rendering by about 50%, so it has limited usefulness...
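As a concrete illustration of that trade-off (the baseline frame rate is a
made-up example; only the ~50% figure comes from the post above):

    # What a ~50% rendering-speed hit for HDR does to frame times.
    baseline_fps = 70.0            # hypothetical non-HDR frame rate
    hdr_fps = baseline_fps * 0.5   # ~50% slower with HDR enabled
    print(1000 / baseline_fps)  # ~14.3 ms per frame without HDR
    print(1000 / hdr_fps)       # ~28.6 ms per frame with HDR -> 35 fps

So a card that is comfortably fast without HDR can drop to marginal frame
rates with it switched on.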


 







