#1
NVidia 6800 Ultra power requirements
Nvidia 6800 Ultra: 110 watts maximum
Power supply: 480 watts minimum!!! Preferably 4 "disk-drive" cables available from the power supply. Two must be available EXCLUSIVELY for the 6800 Ultra (minor exception --- auxiliary fans are allowed to co-use these cable feeds). Power supplies with only 3 cables will need some $2 splitters on the third cable for the various disk drives --- not a real problem, since their power consumption is very low.

Also, from the Tom's Hardware Guide review:

"We can also extrapolate the power requirements of the remaining cards (FX5950, ATI 9800XT) from these numbers, assuming that NVIDIA's quoted maximum power draw of 110 watts for the 6800 Ultra is correct. Let's also assume that we reached that worst-case scenario during our tests. That would mean that the Radeon 9800XT has a maximum power requirement of about 91.5 watts, while the FX 5950 needs 93.5 watts."

Ever wondered why your 5950 and 9800 got so hot? That's about the same as a P4 3.4GHz CPU. However, the heat is spread over the video board, since the above power consumption includes that of the memory. My guess for the 5950/9800XT GPUs themselves is around 70 watts. Since GDDR3 memory takes less power than DDR1, and assuming 256MB will be the 6800 Ultra default, the actual power in the NV40 (16-pipe) is probably around 95 watts, around 25 watts more than the existing GPUs, so the nice big 2-slot cooler on the 6800 Ultra should be more than adequate for the new GPU.

BTW, the 6800 non-Ultra (12-pipe) will have only 1 power connector (!) and a single-slot cooler, so those thinking of cheesily upgrading the 6800 non-Ultra 12-pipe to the 16-pipe via a BIOS/hardware hack will probably have to think again.
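Just to make the arithmetic explicit, here's a quick back-of-envelope script for the figures above. The memory-power numbers (22 watts for the DDR1 boards, 15 watts for the Ultra's GDDR3) are my own guesses for illustration, not anything nVidia publishes:

```python
# Back-of-envelope GPU-core power: board power minus guessed memory power.
# Memory figures below are illustrative assumptions, not vendor specs.

def gpu_core_estimate(board_watts, mem_watts):
    """Rough GPU core power = total board power - memory subsystem power."""
    return board_watts - mem_watts

# Older DDR1 boards: assume ~22 W for the memory subsystem.
r9800xt_core = gpu_core_estimate(91.5, 22)   # ~70 W, matching the guess above
fx5950_core  = gpu_core_estimate(93.5, 22)   # ~71 W

# 6800 Ultra with 256 MB of GDDR3: assume a leaner ~15 W.
nv40_core = gpu_core_estimate(110, 15)       # ~95 W

print(r9800xt_core, fx5950_core, nv40_core)
```

Crude, but it lands right on the ~70 W and ~95 W core estimates quoted above.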
Either the NV40 12-pipe is a totally different mask, or (more likely) the power supply to the core is divided into groups of 4 pipes, and the 12-pipe parts are those that fail (silicon blemishes) to get all 16 pipes working to spec, with the failures concentrated in one group; the power is then either hardware-enabled external to the NV40 or bonded internally to avoid that group. If externally hardware-enabled, any attempt to modify the power-enabling arrangement of the 12-pipe NV40 to enable all 16 pipes is highly likely to burn out the power regulators or sag the voltage too much, since regulator power-sharing is obviously not available on the single-connector board.

John Lewis
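If the groups-of-4 theory is right (pure speculation on my part, same as the paragraph above), the binning logic amounts to something like this toy model, with a hypothetical per-group enable mask:

```python
# Toy model of the speculated pipe binning: 16 pipes as 4 groups of 4,
# each group gated by one bit of a hypothetical 4-bit enable mask.
# This is an illustration of the idea, not actual NV40 behavior.

PIPES_PER_GROUP = 4

def working_pipes(enable_mask):
    """Count usable pipes for a given 4-bit group-enable mask."""
    return bin(enable_mask & 0b1111).count("1") * PIPES_PER_GROUP

print(working_pipes(0b1111))  # all four groups good: a 16-pipe Ultra
print(working_pipes(0b0111))  # one blemished group fused off: a 12-pipe part
```

On that model a die only ships as a 12-pipe part when the defects happen to sit inside a single group, which fits the "failures concentrated in one group" guess.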
#4
John Lewis wrote:
> Nvidia 6800 Ultra: 110 watts maximum
> [rest of quoted post snipped]

Keep in mind that the reviewer samples are only A1-revision engineering reference boards, straight out of the bin. It's likely that come production time, each manufacturer will choose things such as the cooler, the MOSFETs, caps, vregs, etc. So by the time you get your hands on one (a GF6 6800U), most card manufacturers will likely go with a _strict_ reference design (including the HSF and RAM sinks), but you might find some that just barely fit in a single slot with a different cooler.

Also, the voltage requirement right now is a "clean 12v", so depending on what components are certified by NVidia, some manufacturers /JUST MIGHT/ be able to get away with a single plug from a good PSU (like a highly rated 350w). NVidia's Q.C. program, enacted since the FX series launch, is likely going to be the limiting factor in this.

What I'm really excited about is the next card revision. In the same way the NV35 completed the promise of the NV30, the NV45 (or whatever core revision gets stuck on the PCB in 6 months) will likely be a much more efficient and slimmed-down fighter: most likely even better memory bandwidth, some tweaked RGMSAA modes, probably a few Doom III tricks, and hopefully much more efficient power usage. As for now, it looks like brute force is doing pretty well.
#5
|
|||
|
|||
On Wed, 14 Apr 2004 15:59:38 -0500, duralisis wrote:

> Keep in mind that the reviewer samples are only A1-revision engineering
> reference boards, straight out of the bin. It's likely that come
> production time, each manufacturer will choose things such as the
> cooler, the MOSFETs, caps, vregs, etc.

The 110 watts is nVidia's max spec, according to the reviews.

> [snip] ... some manufacturers /JUST MIGHT/ be able to get away with a
> single plug from a good PSU (like a highly rated 350w). NVidia's Q.C.
> program ... is likely going to be the limiting factor in this.

I agree. But I will go with the 2-plug version if it is available. The lower the PSU impedance at the board, the less noisy the power on the board.

> What I'm really excited about is the next card revision. In the same
> way the NV35 completed the promise of the NV30, the NV45 (or whatever
> core revision gets stuck on the PCB in 6 months) will likely be a much
> more efficient and slimmed-down fighter.

Not likely for quite a while. Manipulation within the current 0.13u process will yield very little power savings without cutting functionality, and the masking cost is horrendous: probably at least $1 million for a chip this size, assuming no first-pass errors. I would expect the next real iteration to be a 0.09u or 0.065u shrink (IBM is working with AMD on 0.065u), and that sure won't happen in six months. The NV30 to NV35 iteration was a significant DESIGN improvement on the SAME process, and the design of the NV40 seems near-perfect for the current graphics state of the art.

The only benefit would be a mask shrink, which would potentially improve yield (assuming a stable process) and significantly raise the number of die per wafer --- thus doubly reducing production costs.

John Lewis
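For anyone curious about the "doubly reducing production costs" bit: a linear shrink from 0.13u to 0.09u cuts die area to roughly half, which more than doubles gross die per wafer. Here's a rough sketch using the standard dies-per-wafer approximation; the 290 mm^2 die size is my guess for an NV40-class chip, not a published figure:

```python
import math

# Standard dies-per-wafer approximation:
#   dies ~= pi*r^2/A - pi*d/sqrt(2*A),  A = die area, d = wafer diameter.
# The second term accounts for partial dies lost at the wafer edge.
def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

area_130nm = 290.0                              # assumed NV40-class die, mm^2
area_90nm  = area_130nm * (0.09 / 0.13) ** 2    # linear shrink: ~48% the area

print(dies_per_wafer(300, area_130nm))  # gross die on a 300 mm wafer at 0.13u
print(dies_per_wafer(300, area_90nm))   # more than double at 0.09u
```

More (and smaller) die per wafer, plus the yield bump from a smaller defect cross-section per die: that's the "doubly" part.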