A computer components & hardware forum. HardwareBanter


NV40 ~ GeForce 6800 specs



 
 
  #1  
Old April 13th 04, 06:51 PM
NV55
external usenet poster
 
Posts: n/a
Default NV40 ~ GeForce 6800 specs

the following is ALL quote:


http://frankenstein.evilgeniuslabs.c...nv40/news.html


Tuesday, April 13, 2004

NVIDIA GeForce 6800 GPU family officially announced — Cormac @ 17:00
It's time to officially introduce the new GPU generation from NVIDIA
and shed light on its architecture and features.

So, the GeForce 6800 GPU family, codenamed NV40, officially entered
distribution today. Initially it includes two chips, the GeForce 6800
Ultra and the GeForce 6800, sharing the same architecture.


These are the key innovations in NVIDIA's new GPUs:

* 16-pipeline superscalar architecture with 6 vertex units, GDDR3
support and true 32-bit FP pipelines
* PCI Express x16, AGP 8x support
* 222 million transistors
* 400MHz core clock
* Chips made by IBM
* 0.13µm process


40x40mm FCBGA (flip-chip ball grid array) package
ForceWare 60+ series
Supports 256-bit GDDR3 with over 550MHz (1.1GHz DDR) clock rates
NVIDIA CineFX 3.0 supporting Pixel Shader 3.0, Vertex Shader 3.0;
real-time Displacement Mapping and Tone Mapping; up to 16
textures/pass, 16-bit and 32-bit FP formats, sRGB textures, DirectX
and S3TC compression; 32bpp, 64bpp and 128bpp rendering; lots of new
visual effects
NVIDIA HPDR (High-Precision Dynamic-Range) on OpenEXR technology
supporting FP filtering, texturing, blending and AA
Intellisample 3.0 for extended 16xAA, improved compression
performance; HCT (High-resolution compression), new lossless
compression algorithms for colors, textures and Z buffer in all modes,
including hi-res high-frequency, fast Z buffer clear
NVIDIA UltraShadow II for up to 4 times the performance in heavily
shadowed games (e.g. Doom III) compared to older GPUs


Extended temperature monitoring and management features
Extended display and video output features, including int.
videoprocessor, hardware MPEG decoder, WMV9 accelerator, adaptive
deinterlacing, video signal scaling and filtering, int. NTSC/PAL
decoder (up to 1024x768), Macrovision copy protection; DVD/HDTV to
MPEG2 decoding at up to 1920x1080i; dual int. 400MHz RAMDAC for up to
2048x1536 @ 85Hz; 2 x DVO for external TMDS transmitters and TV
decoders; Microsoft Video Mixing Renderer (VMR); VIP 1.1 (video
input); NVIDIA nView
NVIDIA Digital Vibrance Control (DVC) 3.0 for color and image clarity
management
Supports Windows XP/ME/2000/9X; MacOS, Linux
Supports the latest DirectX 9.0, OpenGL 1.5
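Two of the quoted figures can be sanity-checked with quick arithmetic: a 256-bit bus at 1.1GHz effective moves 256/8 × 1.1e9 bytes per second, and 2048x1536 @ 85Hz needs a pixel clock comfortably under the quoted 400MHz RAMDAC. A sketch (the ~25% blanking overhead is a rough assumption, not from the announcement):

```python
# Peak memory bandwidth: bus width (bits) / 8 bits-per-byte x effective clock (Hz)
bus_bits = 256
effective_clock_hz = 1.1e9  # 550 MHz GDDR3, double data rate
bandwidth_gbs = bus_bits / 8 * effective_clock_hz / 1e9
print(f"peak memory bandwidth: {bandwidth_gbs:.1f} GB/s")  # 35.2 GB/s

# Pixel clock for 2048x1536 @ 85 Hz; the ~25% blanking overhead is an assumption
pixel_clock_mhz = 2048 * 1536 * 85 * 1.25 / 1e6
print(f"required pixel clock: ~{pixel_clock_mhz:.0f} MHz vs 400 MHz RAMDAC")
```

So the 400MHz RAMDAC has headroom even at the maximum listed mode.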


http://frankenstein.evilgeniuslabs.c..._files/nv3.png



We have only just received a GeForce 6800 sample, so it's still too
early to speak of GPU power consumption. Still, a giant core with 222
million transistors implies a healthy appetite for power: NVIDIA
recommends that testers use 480W or higher power supplies. By the way,
GeForce 6800 Ultra reference cards will occupy two standard slots.
However, this is not obligatory for all vendors, so we might see
single-slot models as well.

Well, having seen the GPU, we now have to wait a bit for its test
results. Please be patient; we will publish the corresponding article
in the near future.

To close this news item, I'll mention the NVIDIA partners that will
support the new release with products based on it: Albatron, AOpen,
ASUSTeK Computer, Chaintech, Gainward, Leadtek Research, MSI, Palit
Microsystems, PNY Technologies, Prolink Computer, Shuttle and XFX
Technologies.


http://frankenstein.evilgeniuslabs.c...nv40/news.html






quote:

"8 shader units per pipeline and 16 pipelines..."

http://www.beyond3d.com/forum/viewtopic.php?t=11484
  #2  
Old April 13th 04, 08:56 PM
Shep©
external usenet poster
 
Posts: n/a
Default

On 13 Apr 2004 10:51:07 -0700 As truth resonates honesty
(NV55) wrote :

[snip]
*PCI Express x16, AGP 8x support


Looks like new mother boards required?


[snip]




--
Free Windows/PC help,
http://www.geocities.com/sheppola/trouble.html
email shepATpartyheld.de
Free songs to download and,"BURN" :O)
http://www.soundclick.com/bands/8/nomessiahsmusic.htm
  #3  
Old April 14th 04, 12:28 AM
teqguy
external usenet poster
 
Posts: n/a
Default

K wrote:

On Tue, 13 Apr 2004 20:56:48 +0100, Shep© wrote:



*PCI Express x16, AGP 8x support


Looks like new mother boards required?



If there is AGP 8x support, why would you need a new motherboard?

K







Because most well-known manufacturers will eventually stop carrying AGP
cards altogether.



If you're into business at all, you know that it's more cost effective
to produce one version of a product than two... unless you're Microsoft
or Donald Trump (aka God).




The voltages on PCI-E and AGP are entirely different, so different
components (such as resistors) must be used.


In order to avoid confusion between the two, you'd have to run two
different production lines, have twice as many labs, and pay for two
types of packaging, manuals, etc.




Having one version of a product cuts down on confusion and returns,
which helps both consumers and retail sales.
  #4  
Old April 14th 04, 01:10 AM
K
external usenet poster
 
Posts: n/a
Default

On Tue, 13 Apr 2004 20:56:48 +0100, Shep© wrote:



*PCI Express x16, AGP 8x support


Looks like new mother boards required?



If there is AGP 8x support, why would you need a new motherboard?

K
  #5  
Old April 14th 04, 01:47 AM
Mr. Grinch
external usenet poster
 
Posts: n/a
Default

"teqguy" wrote in news:uI_ec.26606$F9.15486
@nwrddc01.gnilink.net:

K wrote:

On Tue, 13 Apr 2004 20:56:48 +0100, Shep© wrote:

If there is AGP 8x support, why would you need a new motherboard?

K

Because most well known manufacturers will eventually stop carrying AGP
cards all together.


The thing is, we've had AGP slots for over 5 years now, and yet you
still find vendors making PCI video cards. So I wouldn't be too worried
about any lack of AGP video cards for some time to come. They'll be
around long enough to follow any of the current motherboards into
obsolescence, at which point you wouldn't want to be buying a video
card or any other upgrade for them anyway.



  #6  
Old April 14th 04, 02:14 AM
K
external usenet poster
 
Posts: n/a
Default

On Tue, 13 Apr 2004 23:28:26 +0000, teqguy wrote:



Because most well known manufacturers will eventually stop carrying AGP
cards all together.


Eventually, yes, but AGP will be with us well into next year. DDR2 will
replace DDR1, Socket 939 will replace Socket 940, Socket T will replace
Socket 478, BTX will eventually replace ATX; the list goes on in the
never-ending upgrade cycle.



Having one version of a product cuts down on confusion and returns,
which helps both consumers and retail sales.


Absolutely, and I'm sure that the likes of ATI and Nvidia as
well as the motherboard makers will push us to PCI Express as soon as they
can. But it would be suicide for one of them to bring out a new card and
only cater for those who are prepared to buy new motherboards. It's just
the poster I replied to implied that there would be an immediate need to
replace your motherboard, which is clearly not the case.

I have a gut feeling that PCI Express will do very little for performance,
just like AGP before it. Nothing can substitute lots of fast RAM on the
videocard to prevent shipping textures across to the much
slower system RAM. You could have the fastest interface imaginable for
your vid card; it would do little to make up for the bottleneck that
is your main memory.
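This point can be put in rough numbers. Peak theoretical figures: AGP 8x is about 2.1 GB/s, the original PCI Express x16 spec about 4 GB/s per direction, while the 6800 Ultra's own 256-bit GDDR3 delivers roughly 35 GB/s — an order of magnitude more than either bus. A quick comparison:

```python
# Peak theoretical bandwidths in GB/s (decimal units)
agp_8x = 66.67e6 * 8 * 4 / 1e9              # 66.67 MHz base, 8x strobing, 32-bit (4-byte) bus
pcie_x16 = 16 * 2.5e9 * (8 / 10) / 8 / 1e9  # 16 lanes, 2.5 GT/s, 8b/10b line coding
local_gddr3 = 256 / 8 * 1.1e9 / 1e9         # 256-bit GDDR3 at 1.1 GHz effective

print(f"AGP 8x:      {agp_8x:.1f} GB/s")       # ~2.1
print(f"PCIe x16:    {pcie_x16:.1f} GB/s")     # 4.0, per direction
print(f"local GDDR3: {local_gddr3:.1f} GB/s")  # 35.2
```

Which is exactly why spilling textures over the bus hurts no matter which bus it is.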


K
  #7  
Old April 14th 04, 04:37 AM
Shep©
external usenet poster
 
Posts: n/a
Default

On Wed, 14 Apr 2004 00:10:26 +0000 As truth resonates honesty K
wrote :

On Tue, 13 Apr 2004 20:56:48 +0100, Shep© wrote:



*PCI Express x16, AGP 8x support


Looks like new mother boards required?



If there is AGP 8x support, why would you need a new motherboard?

K


Because it's my understanding that although the new cards are listed
with AGP 8x support, this is merely a data-rate comparison, and the new
cards will only fit a "PCI Express" slot, not an AGP one.
http://www.pcstats.com/articleview.cfm?articleID=1087

HTH




  #8  
Old April 14th 04, 05:41 AM
NightSky 421
external usenet poster
 
Posts: n/a
Default

"NV55" wrote in message
m...

the following is ALL quote:



Regardless of whether someone wants the new high-end nVidia or ATI
product, I've read that a person had better have a monster power supply
and excellent case cooling before even considering such cards. I also
wonder how loud the fans on these new cards are going to need to be.
It'd be interesting to see what they can do about cooling and power
consumption on future video cards too - I see this becoming more and
more of a problem with time.


  #9  
Old April 14th 04, 07:16 AM
teqguy
external usenet poster
 
Posts: n/a
Default

NightSky 421 wrote:

"NV55" wrote in message
m...

the following is ALL quote:



Regardless of if someone wants the new high-end nVidia or ATI product,
I've read that a person better have a monster power supply and
excellent case cooling before even considering such cards. I also
wonder how loud the fans on these new cards are going to need to be.
It'd be interesting to see what they can do with regards to cooling
and power consumption on future video cards too - I see this as
getting to be more and more of a problem with time.






The power consumption should stay below 15v.

The GeForce FX does NOT use the 12v rail, for anyone wondering.

All 4 pins are connected for potential usage, but the overall
consumption never rises above 5.5v, so 17v is not necessary.
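For what it's worth, a card's draw is normally quoted in watts, not volts: power is rail voltage times current (P = V × I), summed over the rails the card uses. A quick sketch of the arithmetic — the amp figures below are purely illustrative assumptions, not measurements of any card:

```python
# Power per rail is P = V x I (watts = volts x amps).
# The amp values here are illustrative assumptions, not measured data.
rails_volts_amps = {
    12.0: 4.0,  # 12V rail
    5.0: 6.0,   # 5V rail
    3.3: 2.0,   # 3.3V rail
}
total_watts = sum(v * a for v, a in rails_volts_amps.items())
print(f"total draw: {total_watts:.1f} W")  # 84.6 W
```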




Most companies are starting to push for water cooling. Gainward is one
of them that announced they are going to start shipping a version of
their cards that have a waterblock in place of a conventional heatsink
and fan.



As far as the reference Nvidia cards go... I'm pretty sure we'll start
out with the dustbuster again... at least until someone can decide on a
more effective method.


Solid silver heatsink anyone? =P




  #10  
Old April 14th 04, 07:22 AM
teqguy
external usenet poster
 
Posts: n/a
Default

K wrote:

On Tue, 13 Apr 2004 23:28:26 +0000, teqguy wrote:

[snip]

I have a gut feeling that PCI Express will do very little for
performance, just like AGP before it.

K







Current high-end graphics cards barely stress an AGP 4x bus, let alone
an 8x bus.



The best possible optimization would be to start manufacturing
motherboards with a socket for the GPU and either sockets or slots for
video memory.


This would allow motherboards to shrink while gaining performance and
upgradability.


The price would increase, but it would be worth it.


 










Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2022, Jelsoft Enterprises Ltd.
Copyright ©2004-2022 HardwareBanter.
The comments are property of their posters.