A computer components & hardware forum. HardwareBanter


John Carmack's official comments on NV40 (GeForce 6800 family)



 
 
#1
April 15th 04, 07:59 PM
John Lewis

From the nVidia news release:-
----------------------------------------------------------------------------------------------------------------
NVIDIA Corporation ( NASDAQ: NVDA), the worldwide leader in visual
processing solutions, introduced today the NVIDIA(R) GeForce(TM) 6800
models of graphics processing units (GPUs) for high-performance
desktop computers. The NVIDIA GeForce 6 Series, which includes the
flagship GeForce 6800 Ultra and GeForce 6800, is designed to deliver:

-- Industry-leading 3D performance -- new superscalar 16-pipe
architecture delivers more than twice that of current industry leading
NVIDIA GPUs

-- New features, including Microsoft DirectX(R) 9.0 Shader Model 3.0
feature set -- for ultra-realistic cinematic effects

-- Unprecedented on-chip video processing engine -- enabling high-
definition video and DVD playback

"This is the biggest generation-to-generation performance leap that we
have ever seen with a new GPU," said Jen-Hsun Huang, president and CEO
of NVIDIA. "In addition to the raw performance increase, we had two
fundamental strategies with the 6800 models. First was to take
programmability to the next level with the industry's only GPU with
Shader Model 3.0. Second was to extend the reach of GPUs to the
consumer electronics market with a powerful and fully programmable
video processor capable of multiple video formats and 'prosumer' level
image processing."

"As DOOM 3 development winds to a close, my work has turned to
development of the next generation rendering technology. The NV40 is
my platform of choice due to its support of very long fragment
programs, generalized floating point blending and filtering, and the
extremely high performance," said John Carmack, president and
technical director of id Software.
-------------------------------------------------------------------------------------------------------

Still have to hear from Gabe@Valve. All quiet from him so far...
Must be busy tweaking the HL2 code for Shader Model 3.0?
Shader Model 2.0 must now be just a little passé... Far Cry's V1.1
implementation of Shader Model 3.0 is apparently a little rough at the
moment, but Crytek says they are working on it. No doubt
it will be in a polished patch by the time the NV40 is
available at retail.

For me personally, the 6800 is as exciting a step forward in PC
peripherals as the Voodoo1 was when it first emerged. Not only for
the 6800's enormous graphical power, but also for its potential
contribution to PC-based video production and editing, which is an
active business for me. The very powerful integrated video processor
is as important to me as the graphics capability, particularly the
MPEG-2 encoding hardware elements. Adobe After Effects has
already declared support for the NV40, and no doubt other video
toolmakers like Pinnacle are looking hard at its capability. Now if
Intel would only reduce the price of the P4 EE to the retail
list price of the 6800 Ultra, or less, instead of fleecing potential
customers at $999 a pop, then I would be very happy indeed with my
video production/editing hardware after those two were installed.

John Lewis
#2
April 15th 04, 10:46 PM
Skippy

$7 million, actually...


"K" wrote:
On Thu, 15 Apr 2004 18:59:37 +0000, John Lewis wrote:


Gabe's still counting his $5,000,000 in change he got for selling ATI
worthless pieces of paper to bundle in with their cards

K



#3
April 15th 04, 10:55 PM
K

On Thu, 15 Apr 2004 18:59:37 +0000, John Lewis wrote:


Still have to hear from Gabe@Valve. All quiet from him, so
far......... Must be busy tweaking the HL2 code for Shaders 3.0 ?
Shaders 2.0 must now be just a little passe..... Far Cry V1.1
implementation of Shader 3.0 is apparently little rough at the
moment, but Crytek says that they are working on it. No doubt
it will be in a polished patch by the time the NV40 is retail
available.



Gabe's still counting his $5,000,000 in change he got for selling ATI
worthless pieces of paper to bundle in with their cards.

K
#4
April 15th 04, 11:59 PM
John Reynolds

"John Lewis" wrote in message
...
<snip>


Your post would be... hmmm, what's the word... more legit if you weren't
coming off as an nVidia fanboy flaming away at Valve, John. Newell simply
voiced what every developer knew about the FX parts: they sucked at running
DX9 code at floating-point precision. Hell, these NV40 previews show that
more clearly than anything else. And what do Carmack and Newell have in
common? Their companies' new engines both required special code paths to
get good performance out of FX boards. Think about that, John. Oh, and as
for Far Cry, whether those new screenshots require SM 3.0 support is still
up in the air. I've heard they're created using offset mapping, not vertex
texturing; this was written by Democoder, the guy who got that Unreal 3
engine movie and some Far Cry shots from yesterday (he's a regular poster at
B3D).

Anyways, the 6800U looks like a very impressive part. The only real
negatives are the power consumption/heat and the fact that both AA and AF
could be better. It'll be interesting to see if the R420 from ATI can
compete. Anthony "Reverend" Tan just quoted Tim Sweeney on B3D's board, and
Tim said of the R420: "It rocks!" This next generation is definitely
going to be much more interesting than last year's, that's for sure.

John "fanboys suck" Reynolds


#5
April 16th 04, 12:00 AM
John Reynolds


"Skippy" wrote in message
...
$7 million, actually...


"K" wrote:
On Thu, 15 Apr 2004 18:59:37 +0000, John Lewis wrote:


Gabe's still counting his $5,000,000 in change he got for selling ATI
worthless pieces of paper to bundle in with their cards

K


And how much did nVidia pay Activision/id for the Doom 3 deal? I've heard
4-5 million. Gwarsch! Carmack has sold out to nVidia!! He's deh devil!!

John


#6
April 16th 04, 12:16 AM
Destroy

And how much did nVidia pay Activison/id for the Doom 3 deal? I've heard
4-5 million. Gwarsch! Carmack has sold out to nVidia!! He's deh devil!!


He also seems to be in bed with Intel. His engines always run better on
non-AMD systems.

#7
April 16th 04, 01:22 AM
John Lewis

On Thu, 15 Apr 2004 22:59:53 GMT, "John Reynolds"
wrote:

Your post would be. . .hmmm, what's the word. . .more legit if you weren't
coming off as a nVidia fanboy flaming away at Valve, John. Newell simply
voiced what every developer knew about the FX parts: they sucked at running
DX9 code at floating point precision. Hell, these NV40 previews show that
more clearly than anything else. And what do Carmack and Newell have in
common? Their companies' new engines both required special code paths to
get good performance out of FX boards? Think about that, John. Oh, and for
Far Cry whether those new screenshots require SM 3.0 support is still up in
the air. I've heard they're created using offset mapping, not vertex
texturing; this was written by Democoder, the guy who got that Unreal 3
engine movie and some Far Cry shots from yesterday (he's a regular poster at
B3D).

Anyways, the 6800U looks like a very impressive part. The only real
negatives are the power consumption/heat


The NV40 GPU consumes ~25 watts more than the NV35 or R350.
The whole board consumes a max of 110 watts. Compare the
Prescott 3.4 CPU @ 103 watts max (Northwood 3.4: 89 watts).


and the fact that both AA and AF
could be better.


In what way... please be specific...

It'll be interesting to see if the R420 from ATI can
compete


Let's hope that they have a VPU on board that is competitive with
that in the NV40. For professional video applications, that feature
is almost as important as the graphics-engine features.


Anthony "Reverend" Tan just quoted Tim Sweeney on B3D's board and
Tim said, about R420, that "It rocks!". This next generation is definitely
going to be much more interesting than last year's, that's for sure.

John "fanboys suck" Reynolds


aka John "ATI fanboy now, past 3dfx and nVidia fanboy" Reynolds.




John Lewis

#8
April 16th 04, 01:37 AM
Derek Baker

"John Reynolds" wrote in message
...
<snip>

Anyways, the 6800U looks like a very impressive part. The only real
negatives are the power consumption/heat and the fact that both AA and AF
could be better. It'll be interesting to see if the R420 from ATI can
compete. Anthony "Reverend" Tan just quoted Tim Sweeney on B3D's board and
Tim said, about R420, that "It rocks!". This next generation is definitely
going to be much more interesting than last year's, that's for sure.

John "fanboys suck" Reynolds



Got a link for that Tan comment?

--
Derek


#9
April 16th 04, 01:38 AM
John Reynolds

"John Lewis" wrote in message
...
On Thu, 15 Apr 2004 22:59:53 GMT, "John Reynolds"
wrote:

<snip>

Anyways, the 6800U looks like a very impressive part. The only real
negatives are the power consumption/heat


The NV40 GPU consumes ~25 watts more than the NV35 or R350.
The whole board consumes a max of 110 watts. Compare the
Prescott 3.4 CPU @ 103 watts max (Northwood 3.4: 89 watts).


I read so many previews yesterday that I don't remember which one, but one
of them did show the 6800U hitting around 200 watts under load.

and the fact that both AA and AF
could be better.


In what way... please be specific...


The AA is limited to 4x for multi-sampling, and it lacks gamma correction
and programmable sample patterns. It's an improvement over previous nVidia
parts, but it still lags behind ATI's AA. And the AF is now angle-dependent
like ATI's, which is an intentional step down in quality.


It'll be interesting to see if the R420 from ATI can
compete


Let's hope that they have a VPU on board that is competitive with
that in the NV40. For professional video applications, that feature
is almost as important as the graphics-engine features.


Anthony "Reverend" Tan just quoted Tim Sweeney on B3D's board and
Tim said, about R420, that "It rocks!". This next generation is

definitely
going to be much more interesting than last year's, that's for sure.

John "fanboys suck" Reynolds


aka John "Ati fanboy now, past- 3dfx and nVidia fan-boy" Reynolds.


I used to be a 3dfx fanboy years ago. I'll never make that mistake again.
Though I've been running ATI hardware since Sept. of '02 when the 9700 Pro
came out, I'll switch to a 6800U in a heartbeat if I think it's the better
part (gotta wait for those R420 previews).

John


#10
April 16th 04, 01:45 AM
John Reynolds


"Derek Baker" wrote in message
...
Anyways, the 6800U looks like a very impressive part. The only real
negatives are the power consumption/heat and the fact that both AA and AF
could be better. It'll be interesting to see if the R420 from ATI can
compete. Anthony "Reverend" Tan just quoted Tim Sweeney on B3D's board and
Tim said, about R420, that "It rocks!". This next generation is definitely
going to be much more interesting than last year's, that's for sure.

John "fanboys suck" Reynolds


Got a link for that Tan comment?


http://www.beyond3d.com/forum/viewto...asc&start=76


 








Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.