Nvidia, ATI/AMD talk about GPU architectures for future consoles



 
 
#1
October 13th 07, 08:51 AM
Posted to: alt.comp.periphs.videocards.ati, alt.comp.periphs.videocards.nvidia,
alt.games.video.xbox, alt.games.video.sony-playstation3,
rec.games.video.nintendo.wii
R600 (external usenet poster, 12 posts)

Nvidia, ATI/AMD look beyond GPUs toward unified gaming engines

A roundtable discussion in San Francisco this morning provided a
quick glimpse into a very possible future for console gaming
hardware: an evolution beyond the Xbox 360 and the PlayStation 3 to
a future that changes the entire role of graphics processing units
(GPUs). The discussion started with the observation that both Nvidia
and ATI, before the latter's absorption into AMD, have been actively
exploring general-purpose computing applications for the highly
parallel shading engines in their GPUs.

Jonah Alben, vice president of GPU engineering at Nvidia, said that
this thread began when, in response to game developers' requests for
more ability to differentiate, the GPU architects made the shading
engines on their chips programmable. This not only allowed game
developers to put their own shading algorithms, which have a
significant impact on a game's appearance, onto the GPU hardware; it
also incidentally created a very large array of somewhat-general
little parallel processing units, each with its own local memory,
ALU, and instruction set.
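
To make that concrete: in a GPGPU toolkit such as Nvidia's CUDA
(released around the time of this discussion), each of those little
units runs the same small program over one element of a data set.
The sketch below is purely illustrative, not code from Nvidia; the
toy "shader" and all names are invented for the example.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each thread applies the same small "shader" to one element,
// mirroring how a GPU runs one shading program across many pixels.
__global__ void toyShader(const float* in, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        // A toy lighting-style computation: scale, bias, clamp to [0,1].
        float v = in[i];
        out[i] = fminf(fmaxf(v * 0.8f + 0.1f, 0.0f), 1.0f);
    }
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);
    float* h_in = (float*)malloc(bytes);
    float* h_out = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) h_in[i] = i / (float)n;

    float *d_in, *d_out;
    cudaMalloc(&d_in, bytes);
    cudaMalloc(&d_out, bytes);
    cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

    // One thread per element, 256 threads per block.
    toyShader<<<(n + 255) / 256, 256>>>(d_in, d_out, n);
    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);

    printf("out[0] = %f\n", h_out[0]);
    cudaFree(d_in); cudaFree(d_out);
    free(h_in); free(h_out);
    return 0;
}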

It didn't take long for developers in other fields to latch onto
that fact. Today, application developers have programmed GPUs to
analyze financial instruments, to reduce geological data, and to do
the heavy lifting in a variety of other applications. IBM fellow James
Kahle made a similar remark about the arguably more general, if less
parallel, IBM Cell processor: Cell-based blades are being used today
for financial analysis, geological exploration, and medical imaging,
he said. Alben added that despite the somewhat limited instruction
sets of the GPU shading engines, the only real criterion for porting
an application to them seemed to be that it be parallelizable.
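
"Parallelizable" usually means the work splits into many independent
pieces plus a cheap combining step. A parallel sum reduction, the
core of many data-reduction jobs like the geological example above,
is the classic pattern. A minimal CUDA sketch follows; it is not code
from any of the applications mentioned, and all names are invented.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each block sums its slice of the input in shared memory;
// the host finishes by summing the per-block partial results.
__global__ void sumReduce(const float* in, float* blockSums, int n) {
    extern __shared__ float s[];
    int tid = threadIdx.x;
    int i = blockIdx.x * blockDim.x + tid;
    s[tid] = (i < n) ? in[i] : 0.0f;
    __syncthreads();
    // Tree reduction: halve the active threads each step.
    for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
        if (tid < stride) s[tid] += s[tid + stride];
        __syncthreads();
    }
    if (tid == 0) blockSums[blockIdx.x] = s[0];
}

int main() {
    const int n = 1 << 20, threads = 256;
    const int blocks = (n + threads - 1) / threads;
    float* h_in = (float*)malloc(n * sizeof(float));
    for (int i = 0; i < n; ++i) h_in[i] = 1.0f;   // known answer: n

    float *d_in, *d_part;
    cudaMalloc(&d_in, n * sizeof(float));
    cudaMalloc(&d_part, blocks * sizeof(float));
    cudaMemcpy(d_in, h_in, n * sizeof(float), cudaMemcpyHostToDevice);

    sumReduce<<<blocks, threads, threads * sizeof(float)>>>(d_in, d_part, n);

    float* h_part = (float*)malloc(blocks * sizeof(float));
    cudaMemcpy(h_part, d_part, blocks * sizeof(float),
               cudaMemcpyDeviceToHost);
    double total = 0.0;
    for (int b = 0; b < blocks; ++b) total += h_part[b];
    printf("sum = %.0f (expected %d)\n", total, n);

    cudaFree(d_in); cudaFree(d_part);
    free(h_in); free(h_part);
    return 0;
}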

But then the discussion turned back to the world of gaming consoles.
Many of the intense, non-graphics tasks that go into an immersive game
are also at least moderately parallelizable: game physics and probably
the artificial-intelligence engines that run game sequence are
examples. Could these tasks also be moved to the GPU, perhaps with a
little more general-purpose tweak to the shading-engine hardware? The
consensus was that yes, there were important opportunities there.
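
Game physics is perhaps the clearest case: the same update applies
independently to every particle each frame. Below is a hedged sketch
of one such step in CUDA; the naive Euler integrator, gravity-only
forces, and all names are assumptions for illustration, not anything
the panelists described.

#include <cstdio>
#include <cuda_runtime.h>

// One physics frame: every particle advances independently, so one
// GPU thread per particle is a natural mapping. A real engine would
// add collision detection and constraint solving.
__global__ void integrate(float3* pos, float3* vel, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    vel[i].y -= 9.81f * dt;        // gravity only, for illustration
    pos[i].x += vel[i].x * dt;     // simple Euler position update
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

int main() {
    const int n = 65536;
    float3 *pos, *vel;
    cudaMalloc(&pos, n * sizeof(float3));
    cudaMalloc(&vel, n * sizeof(float3));
    cudaMemset(pos, 0, n * sizeof(float3));
    cudaMemset(vel, 0, n * sizeof(float3));

    for (int frame = 0; frame < 60; ++frame)   // one second at 60 Hz
        integrate<<<(n + 255) / 256, 256>>>(pos, vel, n, 1.0f / 60.0f);

    float3 first;
    cudaMemcpy(&first, pos, sizeof(float3), cudaMemcpyDeviceToHost);
    printf("particle 0 fell to y = %f\n", first.y);
    cudaFree(pos); cudaFree(vel);
    return 0;
}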

This in turn prompted two very interesting observations. One, from
Cadence CTO for design systems Ted Vucurevich, was that the shading
engines really need 64-bit datapaths to exploit these opportunities.
But since 64-bit precision was already being discussed simply to
upgrade graphics rendering, this could well be within the GPU
vendors' roadmaps. Vucurevich also pointed out, parenthetically, that
Cadence is investigating using GPUs to do the complex calculations in
the parallelizable portions of EDA tools.
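
The 64-bit point is easy to demonstrate: single-precision
accumulation silently stalls once the running total dwarfs each
increment, which matters for the numerical codes Vucurevich had in
mind. A toy sketch follows, assuming a GPU with double-precision
support, which in 2007 was only just arriving; the single-thread
kernel is purely to isolate the precision effect.

#include <cstdio>
#include <cuda_runtime.h>

// Add 1e-8 a hundred million times. The true sum is 1.0; the float
// total stalls near 0.25 because each increment falls below half a
// unit in the last place, while the double total stays accurate.
__global__ void accumulate(float* fsum, double* dsum, int n) {
    float f = 0.0f;
    double d = 0.0;
    for (int i = 0; i < n; ++i) {
        f += 1e-8f;
        d += 1e-8;
    }
    *fsum = f;
    *dsum = d;
}

int main() {
    float* d_f; double* d_d;
    cudaMalloc(&d_f, sizeof(float));
    cudaMalloc(&d_d, sizeof(double));

    accumulate<<<1, 1>>>(d_f, d_d, 100000000);  // may take a second

    float f; double d;
    cudaMemcpy(&f, d_f, sizeof(float), cudaMemcpyDeviceToHost);
    cudaMemcpy(&d, d_d, sizeof(double), cudaMemcpyDeviceToHost);
    printf("float sum: %g   double sum: %g\n", f, d);
    cudaFree(d_f); cudaFree(d_d);
    return 0;
}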

The other comment, by AMD vice president of engineering Robert
Feldstein, was that the computing power of GPUs could be harnessed for
processing graphic input, as well as for rendering. For example, he
suggested, a camera tracking the console user could provide a video
stream. The GPU could analyze this video to extract gestures, motion,
and even facial expressions from the user, providing an input to the
game system even more natural and immersive than that offered by
controllers on the Nintendo Wii.
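
Feldstein's input-processing idea also decomposes into per-pixel
parallel work. The simplest building block is frame differencing,
which flags pixels that changed between camera frames; gesture and
motion extraction build on maps like this. A minimal CUDA sketch
(the threshold, frame layout, and names are all invented for
illustration):

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Mark every pixel whose brightness changed noticeably between two
// grayscale camera frames; one thread per pixel.
__global__ void frameDiff(const unsigned char* prev,
                          const unsigned char* cur,
                          unsigned char* motion, int numPixels) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < numPixels) {
        int d = cur[i] - prev[i];
        if (d < 0) d = -d;
        motion[i] = (d > 25) ? 255 : 0;   // threshold chosen arbitrarily
    }
}

int main() {
    const int w = 640, h = 480, n = w * h;
    unsigned char* h_prev = (unsigned char*)calloc(n, 1);
    unsigned char* h_cur  = (unsigned char*)calloc(n, 1);
    for (int i = 0; i < n / 2; ++i) h_cur[i] = 200;  // fake "movement"

    unsigned char *d_prev, *d_cur, *d_motion;
    cudaMalloc(&d_prev, n); cudaMalloc(&d_cur, n); cudaMalloc(&d_motion, n);
    cudaMemcpy(d_prev, h_prev, n, cudaMemcpyHostToDevice);
    cudaMemcpy(d_cur, h_cur, n, cudaMemcpyHostToDevice);

    frameDiff<<<(n + 255) / 256, 256>>>(d_prev, d_cur, d_motion, n);

    unsigned char* h_motion = (unsigned char*)malloc(n);
    cudaMemcpy(h_motion, d_motion, n, cudaMemcpyDeviceToHost);
    int moving = 0;
    for (int i = 0; i < n; ++i) if (h_motion[i]) ++moving;
    printf("%d of %d pixels flagged as motion\n", moving, n);

    cudaFree(d_prev); cudaFree(d_cur); cudaFree(d_motion);
    free(h_prev); free(h_cur); free(h_motion);
    return 0;
}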

The idea that the GPU, once regarded as a non-programmable,
fixed-function device, could emerge as the real computing heart of
the game system, taking major tasks away from the CPU, is
fascinating. But the rapid spread of GPU-based computing in other
areas suggests that this is a very plausible future for gaming SoCs.
And, as we have seen repeatedly in the past, if the console gaming
industry makes something inexpensive enough, architects will figure
out how to use it in embedded systems as well.


http://www.edn.com/blog/1690000169/post/1110015711.html

 



