A computer components & hardware forum. HardwareBanter


DirectX10 GPUs (R600, G80) to Consume up to 300W



 
 
  #1  
Old June 7th 06, 09:20 PM posted to comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.games.action
external usenet poster
 
Posts: n/a
Default DirectX10 GPUs (R600, G80) to Consume up to 300W


http://www.anandtech.com/tradeshows/showdoc.aspx?i=2770

_________________________________________________________________


It's almost like we say this every year, but Computex hasn't even
officially started yet and we already have a lot to talk about.
Everything from the power requirements of next year's GPUs from ATI
and NVIDIA to the excitement surrounding Intel's Conroe launch is
going to be covered in today's pre-show coverage, so we'll skip the
long-winded introduction and get right down to business.

We will first address some of the overall trends we've seen while
speaking to many of the Taiwanese manufacturers and then dive into
product specific items. We'll start with the most shocking news
we've run into thus far - the power consumption of the
next-generation GPUs due out early next year.

DirectX 10 GPUs to Consume up to 300W

ATI and NVIDIA have been briefing power supply manufacturers in Taiwan
recently about what to expect for next year's R600 and G80 GPUs.
Both GPUs will be introduced in late 2006 or early 2007, and while we
don't know the specifications of the new cores we do know that they
will be extremely power hungry. The new GPUs will range in power
consumption from 130W up to 300W per card. ATI and NVIDIA won't confirm
or deny our findings, and we are receiving conflicting information as to
the exact specifications of these new GPUs, but one thing is for sure:
the power requirements are steep.

Power supply makers are being briefed now in order to make sure that
the power supplies they are shipping by the end of this year are up to
par with the high end GPU requirements for late 2006/early 2007. You
will see both higher wattage PSUs (1000 - 1200W) as well as secondary
units specifically for graphics cards. One configuration we've seen
is a standard PSU mounted in your case for your motherboard, CPU and
drives, running alongside a secondary PSU installed in a 5.25" drive
bay. The secondary PSU would then be used to power your graphics
cards.
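Those numbers are easy to sanity-check. A back-of-the-envelope budget (every wattage below is an illustrative assumption except the 300W per-card worst case from the article) shows why a dual-card system pushes PSU ratings into that 1000 - 1200W range:

```python
# Rough system power budget. Only the 300W per-card figure comes from the
# article; the other wattages are illustrative assumptions.
SYSTEM_LOADS_W = {
    "cpu": 130,
    "motherboard_and_ram": 60,
    "drives_and_fans": 50,
    "gpu_per_card": 300,  # worst case cited for R600/G80-class cards
}

def psu_recommendation(num_gpus, headroom=0.8):
    """Total draw, plus the rule of thumb of loading a PSU to ~80% of its rating."""
    base = (SYSTEM_LOADS_W["cpu"]
            + SYSTEM_LOADS_W["motherboard_and_ram"]
            + SYSTEM_LOADS_W["drives_and_fans"])
    total = base + num_gpus * SYSTEM_LOADS_W["gpu_per_card"]
    return total, total / headroom

draw, rating = psu_recommendation(num_gpus=2)
print(f"Estimated draw: {draw} W -> recommended PSU rating: {rating:.0f} W")
# Estimated draw: 840 W -> recommended PSU rating: 1050 W
```

Two 300W cards alone match the full output of a 300W-per-card secondary supply, which is presumably why dedicated drive-bay PSUs for graphics are on the table at all.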


http://images.anandtech.com/reviews/...l/PICT0239.jpg

OCZ had one such GPU power supply at the show for us to look at. As you
can see above, the 300W power supply can fit into a 5.25" drive bay and
receives power from a cable passed through to it on the inside of your
PC's case.


http://images.anandtech.com/reviews/...l/PICT0242.jpg

OCZ is even working on a model that could interface with its
Powerstream power supplies, so you would simply plug this PSU into your
OCZ PSU without having to run any extra cables through your case.


http://images.anandtech.com/reviews/...l/PICT0240.jpg

To deal with the increased power consumption of this next generation
of DirectX 10 GPUs, manufacturers are apparently considering water
cooling to keep noise and heat to a minimum.

As depressing as this news is, there is a small light at the end of the
tunnel. Our sources tell us that after this next generation of GPUs we
won't see an increase in power consumption, rather a decrease for the
following generation. It seems as if in their intense competition with
one another, ATI and NVIDIA have let power consumption get out of hand
and will begin reeling it back in starting in the second half of next
year.

In the more immediate future, there are some GPUs from ATI that will be
making their debut, including the R580+, RV570 and RV560. The R580+ is
a faster version of the current R580, designed to outperform NVIDIA's
GeForce 7900 GTX. The RV570 is designed to be an upper mid-range
competitor to the 7900GT, possibly carrying the X1700 moniker. The
only information we've received about RV570 is that it may be an 80nm
GPU with 12 pipes. The RV560 may end up being the new successor to the
X1600 series, but we haven't received any indication of
specifications.

http://images.anandtech.com/reviews/...l/PICT0082.jpg


After the HDMI/HDCP fiasco that both ATI and NVIDIA faced earlier this
year, we're finally seeing video cards equipped with HDMI outputs and
full HDCP support. The HDCP solution of choice appears to be a TMDS
transmitter by Silicon Image that has found its way onto almost all of
the HDMI equipped video cards we've seen.

http://images.anandtech.com/reviews/...l/PICT0085.jpg
http://images.anandtech.com/reviews/...l/PICT0084.jpg


While some of the HDMI equipped graphics cards simply use the HDMI
output as a glorified DVI connector, other companies have outfitted
their designs with a SPDIF header to allow for digital audio
passthrough over HDMI as well. Remember that the HDMI connector can
carry both audio and video data, and by outfitting cards with a header
for internal audio passthrough (from your soundcard/motherboard to the
graphics card) you take advantage of that feature of the HDMI
specification.

http://images.anandtech.com/reviews/...l/PICT0182.JPG

Alongside HDMI support, passively cooled GPUs are "in" these days,
as we've seen a number of fanless graphics cards since we've been
here. The combination of HDMI output and a passive design is a
definite winner for the HTPC community, who are most likely to be after
a video card equipped with an HDMI output at this point.

  #2  
Old June 8th 06, 03:38 AM posted to comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.games.action


"AirRaid" wrote:
As depressing as this news is, there is a small light at the end of the
tunnel. Our sources tell us that after this next generation of GPUs we
won't see an increase in power consumption, rather a decrease for the
following generation. It seems as if in their intense competition with
one another, ATI and NVIDIA have let power consumption get out of hand
and will begin reeling it back in starting in the second half of next
year.


Yea, and I won't be buying any first gen DX10 card either. This is a win-win
situation for me. I sit out Vista and DX10 cards until most of the bugs and
foibles are fixed and I save money too. That's cool with me.


  #3  
Old June 8th 06, 03:57 AM posted to comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.games.action

On Thu, 08 Jun 2006 02:38:07 GMT, "Gank" wrote:


"AirRaid" wrote:
As depressing as this news is, there is a small light at the end of the
tunnel. Our sources tell us that after this next generation of GPUs we
won't see an increase in power consumption, rather a decrease for the
following generation. It seems as if in their intense competition with
one another, ATI and NVIDIA have let power consumption get out of hand
and will begin reeling it back in starting in the second half of next
year.


Yea, and I won't be buying any first gen DX10 card either. This is a win-win
situation for me. I sit out Vista and DX10 cards until most of the bugs and
foibles are fixed and I save money too. That's cool with me.


Yeah... much cooler (physically, financially and metaphorically) than
being the suckers that buy the first bleeding-edge DX10 cards.

John Lewis
  #4  
Old June 8th 06, 07:37 AM posted to comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.games.action

Your links don't work whoever you are LoL



"AirRaid" wrote:

http://www.anandtech.com/tradeshows/showdoc.aspx?i=2770

[full AnandTech article quoted; snipped]

  #5  
Old June 8th 06, 03:59 PM posted to comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.games.action

Gank wrote:

"AirRaid" wrote:
As depressing as this news is, there is a small light at the end of the
tunnel. Our sources tell us that after this next generation of GPUs we
won't see an increase in power consumption, rather a decrease for the
following generation. It seems as if in their intense competition with
one another, ATI and NVIDIA have let power consumption get out of hand
and will begin reeling it back in starting in the second half of next
year.


Yea, and I won't be buying any first gen DX10 card either. This is a win-win
situation for me. I sit out Vista and DX10 cards until most of the bugs and
foibles are fixed and I save money too. That's cool with me.


Indeed. I'm planning on building a new PC in a month or two,
featuring an Intel Core 2 and a 7900GT. OS will be dual-boot Linux
and XP.

  #6  
Old June 8th 06, 05:53 PM posted to comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.games.action


"Chingy" wrote:
Your links don't work whoever you are LoL

Well, the first link worked for me, but not the rest.

"AirRaid" wrote:

http://www.anandtech.com/tradeshows/showdoc.aspx?i=2770

[full AnandTech article quoted; snipped]

  #7  
Old June 8th 06, 06:41 PM posted to comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.games.action

Mickey Skuczas wrote:

"Chingy" wrote:
Your links don't work whoever you are LoL

Well the first link worked for me, but the rest no.


Good thing you stupid top-posters kept the 100+ lines hanging
uselessly below...

  #8  
Old June 9th 06, 12:45 AM posted to comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.games.action


"AirRaid" wrote:

http://www.anandtech.com/tradeshows/showdoc.aspx?i=2770


Holy ****!

That kinda sucks. I don't get it though. A GPU card is basically just a
CPU plus a motherboard, and no stinking motherboard/CPU combo I know
of sucks down 300W alone.


  #9  
Old June 9th 06, 06:22 AM posted to comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.games.action


"HockeyTownUSA" <magma at comcast dot net> wrote:

"AirRaid" wrote:

http://www.anandtech.com/tradeshows/showdoc.aspx?i=2770


Holy ****!

That kinda sucks. I don't get it though. A GPU card is nothing more than
like a CPU with a motherboard, and no stinking motherboard/CPU combo I
know of sucks down 300W alone.


Current graphics cards actually have more transistors than current CPUs,
though I can't remember the exact figures off the top of my head. The reason
they are able to have more transistors is that graphics processors are
naturally a lot more parallel, so designers work out a circuit and then just
repeat it many times. CPUs, on the other hand, are much more flexible, but
that flexibility makes them much more complicated to design: a CPU with
half as many transistors can take many more man-hours to design.
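The classic CMOS dynamic-power relation, P ≈ activity · C · V² · f per switching node, makes that concrete: a GPU's huge transistor count at a moderate clock and a CPU's smaller count at a high clock can land in the same power ballpark. A minimal sketch, where every number is an illustrative assumption rather than a real R600/G80/Conroe figure:

```python
def dynamic_power_watts(num_nodes, c_farads, v_volts, f_hz, activity):
    """CMOS dynamic power: P = activity * C * V^2 * f, summed over switching nodes."""
    return num_nodes * activity * c_farads * v_volts**2 * f_hz

# Hypothetical GPU: many parallel units, moderate clock, high switching activity.
gpu = dynamic_power_watts(num_nodes=500e6, c_farads=1e-15, v_volts=1.2,
                          f_hz=600e6, activity=0.15)

# Hypothetical CPU: fewer transistors, much higher clock, lower average activity.
cpu = dynamic_power_watts(num_nodes=300e6, c_farads=1e-15, v_volts=1.3,
                          f_hz=3e9, activity=0.05)

print(f"GPU estimate: {gpu:.0f} W, CPU estimate: {cpu:.0f} W")
# GPU estimate: 65 W, CPU estimate: 76 W
```

The point isn't the exact wattages but the scaling: replicated parallel circuits let GPUs pile on transistors, and power then grows with transistor count, voltage squared, and clock speed.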


  #10  
Old June 9th 06, 05:34 PM posted to comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.games.action

Gank wrote:
"AirRaid" wrote:
As depressing as this news is, there is a small light at the end of the
tunnel. Our sources tell us that after this next generation of GPUs we
won't see an increase in power consumption, rather a decrease for the
following generation. It seems as if in their intense competition with
one another, ATI and NVIDIA have let power consumption get out of hand
and will begin reeling it back in starting in the second half of next
year.


Yea, and I won't be buying any first gen DX10 card either. This is a win-win
situation for me. I sit out Vista and DX10 cards until most of the bugs and
foibles are fixed and I save money too. That's cool with me.



I'm still on NT4 SP3. I just don't trust anything beyond that just yet. ;-)
 









Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.