A computer components & hardware forum. HardwareBanter


NV40 ~ GeForce 6800 specs

#21 · April 14th 04, 08:04 PM · teqguy

chrisv wrote:

"teqguy" wrote:

NightSky 421 wrote:

Regardless of whether someone wants the new high-end nVidia or ATI
product, I've read that a person had better have a monster power
supply and excellent case cooling before even considering such
cards. I also wonder how loud the fans on these new cards are
going to need to be. It'd be interesting to see what they can do
with regard to cooling and power consumption on future video
cards too - I see this as getting to be more and more of a problem
with time.

The power consumption should stay below 15v.

The Geforce FX does NOT use the 12v rail, for anyone wondering.

All 4 pins are connected for potential usage, but the overall
consumption never rises above 5.5v so 17v is not necessary.


Surely you can't believe that we can take the advice of someone who
thinks that power "consumption" is measured in Volts. What you wrote
is complete drivel, sorry.








Uh, voltage is part of consumption... along with amperage....


The amount of current this card will draw is currently between 32
and 35 amps.

The FX series ranges from 28 amps down to 23, going from the 5950 to the 5200.
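To turn those amp figures into actual power draw, wattage is just volts times amps. Here's a minimal Python sketch using the currents quoted in this thread; which rail actually carries the load is an assumption for illustration, not a measured spec.

def watts(volts: float, amps: float) -> float:
    """Power in watts for a given rail voltage and current draw."""
    return volts * amps

# ~32-35 A claimed for the NV40, ~23-28 A for the GeForce FX line (as quoted)
for amps in (23, 28, 32, 35):
    print(f"{amps:2d} A on the 5 V rail  -> {watts(5.0, amps):5.1f} W")
    print(f"{amps:2d} A on the 12 V rail -> {watts(12.0, amps):5.1f} W")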






If you have nothing to contribute, shut up. What's drivel is your
obsessive need to critique everything anyone ever says.
#22 · April 14th 04, 08:07 PM · teqguy

joe smith wrote:

As far as the reference Nvidia cards go... I'm pretty sure we'll
start out with the dustbuster again... at least until someone can
decide on a more effective method.


That kind of sucks with the K8V, if anyone has taken notice of where the
firewire connector is on the motherboard.. it might fit perfectly
tho, you never know before you try..

I heard that NV40 boards will have two power connectors...? When
RADEONs came with just one, I thought that was already one too many,
LOL, but since it's inside the case, who cares at the end of the day.
But two? Huh! 200+ million transistors sure suck some power.. but
surely a 350 watt supply with only 5 IDE devices connected should be
enough? ;-)

It would suck to find out suddenly (from the smoke coming from the
PSU) that oh ****, looks like 450-500 watts would be required
anyway... though I find that amazingly unlikely; but since someone
else in this thread was concerned about whether his PSU was sufficient,
I had to ask. The NV40 would rock for programming, because it's the only
way for quite a while to try out VS 3.0 and PS 3.0, if I am not mistaken?
I read from this NG that ATI wouldn't have these in their new chip -
why the hell not!?

Peace.





The two power connectors will eventually come down to one... right now,
testing shows that stability is better achieved using four rails
instead of two.
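If the arithmetic helps, here's a hypothetical Python sketch of why spreading the same draw over more connectors (or rails) eases the load on each one; the 34 A total is just an example figure pulled from the range quoted earlier, not a spec.

def amps_per_path(total_amps: float, paths: int) -> float:
    """Current per connector/rail if the load divides evenly (idealized)."""
    return total_amps / paths

TOTAL_AMPS = 34.0  # assumed example figure, roughly the range quoted above
for paths in (1, 2, 4):
    print(f"{paths} path(s): {amps_per_path(TOTAL_AMPS, paths):.1f} A each")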






#23 · April 14th 04, 08:08 PM · teqguy

chrisv wrote:

"teqguy" wrote:

The best possible optimization that could ever be made would be to
start manufacturing motherboards with sockets for a GPU and either
sockets or slots for video memory.


This would allow motherboards to potentially shrink in size
while increasing in performance and upgradability.


The price would increase, but it would be worth it.


No it wouldn't.





You're a moron.
#24 · April 14th 04, 08:13 PM · teqguy

G wrote:

K wrote in message
...

I have a gut feeling that PCI Express will do very little for
performance, just like AGP before it. Nothing can substitute for lots
of fast RAM on the video card to prevent shipping textures across to
the much slower system RAM. You could have the fastest interface
imaginable for your vid card; it would do little to make up for the
bottleneck that is your main memory.




But what about for things that don't have textures at all?

PCI Express is not only bi-directional but full duplex as well. The
NV40 might even use this to great effect, with its built-in
hardware-accelerated MPEG encoding/decoding plus "HDTV support" (which
I assume means it natively supports 1920x1080 and 1280x720 without
having to use PowerStrip). The lower-cost version should be sweet for
Shuttle-sized media PCs that will finally be able to "TiVo" HDTV.

I can also see the 16X slot being used in servers for other things
besides graphics. Maybe in a server you'd want your $20k SCSI RAID
controller in it. Or, in a cluster box, a 10-gigabit NIC.

There's more to performance than just gaming. And there's more to PCI
Express than just the 16X slot, which is what graphics cards will use
initially. AGP was a hack, and (as others have said) it hit the wall
at "4X". PCI Express is a VERY well-thought-out bus that should be
a lot better than PCI, PCI-X, and AGP... not to mention things bolted
directly to the Northbridge. If it helps games a little in the
process, it's just gravy.
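To put rough numbers on that, here's a back-of-the-envelope Python sketch of first-generation PCI Express x16 versus AGP 8x bandwidth; the per-lane figure follows from the 2.5 GT/s signalling with 8b/10b encoding, and the AGP number is the usual 66 MHz x 8 x 32-bit calculation.

# First-generation PCI Express: 2.5 GT/s per lane with 8b/10b encoding,
# i.e. roughly 250 MB/s of payload per lane, per direction, full duplex.
LANE_MB_S = 2.5e9 * (8 / 10) / 8 / 1e6   # ~250 MB/s per lane, per direction
AGP_8X_MB_S = 66.66e6 * 8 * 4 / 1e6      # ~2133 MB/s, and not full duplex

def pcie_mb_s(lanes: int) -> float:
    """One-direction bandwidth of a first-generation PCIe link."""
    return lanes * LANE_MB_S

print(f"PCIe x16, one direction:   {pcie_mb_s(16):7.0f} MB/s")
print(f"PCIe x16, both directions: {2 * pcie_mb_s(16):7.0f} MB/s")
print(f"AGP 8x (half duplex):      {AGP_8X_MB_S:7.0f} MB/s")

Either way, the x16 link has more headroom in each direction than AGP 8x has in total, on top of being full duplex.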






Most MPEG encoding is processor-dependent... I wish developers would
start making applications that let the graphics card do video encoding
instead of dumping the work on the processor.




The bandwidth of AGP 2X can carry a high-definition signal... so I
don't understand how you expect PCI Express to do it any better.





Last time I checked, an HD signal operates at 8 Mb/s... DVD at
2.5 Mb/s... VCR at 250 Kb/s.

PCI Express can potentially carry up to 4.3 Gb/s... so do the math.





SCSI only operates at 320 MB/s.

In a RAID 0 stripe, it's roughly 460 MB/s.


So again... a lot more bandwidth than required.
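Since we're doing the math anyway, here's a quick Python sanity check comparing the stream bitrates quoted above against the quoted link speeds; all the figures are the ones cited in this thread, taken at face value rather than measured.

# Compare the quoted stream bitrates against the quoted link speeds,
# everything converted to Mb/s. Figures are the ones cited in the posts.
streams_mbit = {"HDTV": 8.0, "DVD": 2.5, "VCR": 0.25}   # Mb/s, as quoted
links_mbit = {
    "PCI Express (quoted 4.3 Gb/s)": 4.3 * 1000,
    "Ultra320 SCSI (320 MB/s)": 320 * 8,
    "SCSI RAID 0 (~460 MB/s)": 460 * 8,
}

for link, capacity in links_mbit.items():
    for stream, rate in streams_mbit.items():
        print(f"{link}: ~{capacity / rate:,.0f}x the {stream} bitrate")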



And definitely a lot more expensive than using onboard SCSI.
#25 · April 14th 04, 08:30 PM · joe smith

pfft. You don't even know what the ATI offering is as yet, much less
are you able to buy a 6800 until well into next month.


No, I do not. I wrote that the rumor is that ATI wouldn't have 3.0-level
shaders.. I was commenting on a rumor; if that isn't true then the situation
is naturally entirely different. The confidentiality / NDA ends on the 19th
of this month, so soon after that we should begin to see cards trickling
onto the shelves like always (just a trend I've noticed in the past 5-7
years, could be wrong, but I wouldn't die if I had to wait even 2 months..
or 7.. or 3 years.. the stuff will get here sooner or later.. unless the
world explodes before that
=

Relax dude, you don't have to pfff; obviously any intelligent person knows
what you're saying.. I wasn't commenting on that, or claiming that the cards
will be here TOMORROW!!!! Or that ATI will definitely NOT have 3.0-spec
shaders. Now, if you want to argue that point, look up the person who posted
the RUMOR about it, then PFFFF his ass! Pfff... - now that would be for a
valid reason... heh


#26 · April 14th 04, 08:40 PM · Tony DiMarzio

I'd have to agree. It looks like this guy is trying to masquerade anti-ATI
sentiment as nonchalance and "no-brainer" NVIDIA superiority. Sorry, but
your weak psychology is definitely not fooling me.

--
Tony DiMarzio



"rms" wrote in message
news
No, really, I can't believe my eyes that after a two-year trip to the ATI
side, I would again consider NV even a candidate, not to mention the #1
choice as a gfx card upgrade. Must suck to base your choices on brand name,
aka the Fanboy's Choice. Looking at the offerings, this is a no-brainer
for me.



pfft. You don't even know what the ATI offering is as yet, much less
are you able to buy a 6800 until well into next month.

rms




#27 · April 14th 04, 09:22 PM · chrisv

"teqguy" wrote:

chrisv wrote:

"teqguy" wrote:


The power consumption should stay below 15v.

The Geforce FX does NOT use the 12v rail, for anyone wondering.

All 4 pins are connected for potential usage, but the overall
consumption never rises above 5.5v so 17v is not necessary.


Surely you can't believe that we can take the advice of someone who
thinks that power "consumption" is measured in Volts. What you wrote
is complete drivel, sorry.


Uh, voltage is part of consumption... along with amperage....


That doesn't make what you said above sensible. It was senseless
drivel. Deal with it.

The amount of current this card will draw is currently between 32
and 35 amps.

The FX series ranges from 28 amps down to 23, going from the 5950 to the 5200.


Better late than never, I guess.

If you have nothing to contribute, shut up.


If you're just going to post drivel, shut up.

What's drivel is your
obsessive need to critique everything anyone ever says.


Wrong again.

#28 · April 14th 04, 09:23 PM · chrisv

"teqguy" wrote:

chrisv wrote:

"teqguy" wrote:

The best possible optimization that could ever be made would be to
start manufacturing motherboards with sockets for a GPU and either
sockets or slots for video memory.


This would allow motherboards to potentially shrink in size
while increasing in performance and upgradability.


The price would increase, but it would be worth it.


No it wouldn't.


You're a moron.


My irony meter is off the scale. You're obviously clueless if you
think what you proposed above is a good idea.

#29 · April 14th 04, 11:47 PM · JLC


"joe smith" wrote in message
...

Ah,

http://frankenstein.evilgeniuslabs.c...nv40/news.html

I see from the pictures (assuming they're not fakes) that the card should fit
reasonably into a "single" AGP (8x) slot, more or less.. that's nice, but the
best part about this debacle is the two DVI ports. That is the part I like
the most; I'm currently using DVI + DB25 to drive two TFTs.


It says right in that same article that the new cards will take two slots,
but it is possible for vendors to come out with single-slot cards.
I find it amazing that Nvidia recommended that their testers use at least
a 480W PS. That's going to be a very expensive upgrade for a lot of people,
and a lot of guys who think they have a 480+ W PS will find that their
cheap PS is not up to the task.
So the Ultra is gonna start at $499, plus say another $100 for a quality PS.
Wow, $599 just to play games that probably don't need a fraction of the
power the new card can deliver. Let's hope that Doom 3 runs great on this
card. Of course, by the time the game finally comes out, this card will
probably cost $150. JLC
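For what it's worth, here's a hypothetical power-budget sketch in Python showing why the 480W recommendation is really about headroom; every wattage in it is an illustrative guess, not a figure from the article.

budget_w = {            # illustrative guesses only, not measured figures
    "CPU": 90,
    "motherboard": 30,
    "NV40 card": 110,
    "drives and fans": 60,
    "everything else": 30,
}
total = sum(budget_w.values())

for psu_rating in (350, 480):
    usable = psu_rating * 0.7   # assume a cheap PSU only sustains ~70% of its label
    verdict = "probably fine" if usable >= total else "likely short"
    print(f"{psu_rating} W PSU: ~{usable:.0f} W sustained vs ~{total} W load -> {verdict}")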


#30 · April 14th 04, 11:49 PM · DaveL

I think Nvidia learned their lesson about that from the 5800U debacle. It
was ATI that stayed with the old standard and took the lead in performance.
Meanwhile, Nvidia was struggling with fab problems.

DaveL


"Ar Q" wrote in message
link.net...

Isn't it time for NVidia to use a 0.09 um process? How could they put so
many features in if they're still using a 0.13 um process?



 



