#1
holy smokes; AMD-ATI R600 ( Radeon X2900XTX ) GPU Technology Demo; "Ruby 4"
decent quality video: http://www.hexus.tv/show.php?show=66
low quality video: http://www.youtube.com/watch?v=EJ6aMxPh6k0

Looks better than Nvidia's "Adrianne" tech demo for the G80 / GeForce 8800 GTX. R600 / Radeon X2900XTX is a friggin' 512-bit MONSTER that will provide bandwidth of over 100 GB/s even with lower-end GDDR3 memory. According to Wikipedia, R600's bandwidth ranges from 115.2 GB/s to 160 GB/s depending on the memory options.
http://en.wikipedia.org/wiki/Radeon_R600

Some of the technical details of the Ruby 4 demo:
___________________________________________________________________________
"AMD Ruby 4 (R600) demo is impressive
CeBIT 07: Ruby learns to ski with nice triangle budget
By Theo Valich in Hanover: Friday 16 March 2007, 07:20

YOU CAN FIND R600 boards in hidden places at CeBIT, but we expected that. AMD is faithfully guarding against the possible leakage of pictures by marking all of the boards with the name of the partner each was intended for; the only difference was the colour of the markings. We do not see the reason for this paranoid behaviour, since all of the important R600 pictures ended up on websites a long time ago. We have seen UFO GDDR3 boards floating around with a "subject-to-change" GF8800-looking triple-heatpipe cooler in red, but we'll leave it at that. The theme of this story is Ruby, after all.

We have seen the demo running a couple of times, and this poor hack has to say that it looks quite impressive, especially when it comes to the snow itself and the fur on Ruby's winter outfit. There is also the matter of the realistic blend of textures on her face and the skeletal animation. You could easily imagine that Ruby is a real person, judging by the insane amount of detail that went into the creation of this demo.

The main render target resolution is not full HD, 1920x1080 (or 1200) pixels, but rather a baseline 1280x720p with HDR in FP16 format and MSAA turned on to 4X. Anisotropic filtering should be set high in all cases, but this resolution left us a bit confused. It seems that ATI will push the CrossFire two-board package for full-HD resolution. This is mainly due to the texture memory budget, since the current demo is eating 680 MB. Since ATI only has 1GB boards around, you can easily calculate that the remaining 320MB would not be sufficient for the 1080p frame buffer and decent framerates. However, do not think that there are issues with the GPUs themselves, since virtual memory addressing works just fine on both G80 and R600 chips.

Scenes in the demo have between two and two and a half million triangles, depending on scene complexity. Ruby herself is around 200K triangles and uses 128 morph targets for facial animation and around 200 bones for skinning animation. When it comes to the face of ATI's bride, the animation was done by filming the face of the real-world Ruby (an actress) with a high-definition camera. After the filming, the plain vanilla video was analysed and processed step by step, using highly complex facial recognition software, in order to extract facial animation data. After this video session, Ruby's face alone got layered with 15 different textures and placed in a scene with procedurally generated snow.

The snow simulation is processed entirely on the GPU and can be dynamically melted or amplified to increase the snow cover (and the snow effect while Ruby is doing a snowboard scene in Janica Kostelic style). As for the fur on Ruby's collar, this is no longer simple vertex-generated fur with predefined movement, but fur simulated with a physics model, also done on the GPU."
http://www.theinquirer.net/default.aspx?article=38361
___________________________________________________________________________
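The bandwidth and memory figures above can be sanity-checked with some quick arithmetic. A rough Python sketch; the effective memory data rates below are illustrative assumptions chosen to reproduce the quoted numbers, not confirmed R600 specs:

```python
# Peak bandwidth = bus width (bytes per transfer) * effective data rate.
BUS_WIDTH_BITS = 512
BYTES_PER_TRANSFER = BUS_WIDTH_BITS // 8          # 64 bytes per transfer

def bandwidth_gbs(effective_mts):
    """Peak memory bandwidth in GB/s for an effective data rate in MT/s."""
    return BYTES_PER_TRANSFER * effective_mts * 1e6 / 1e9

print(bandwidth_gbs(1800))   # 115.2 -> 900 MHz GDDR3 (1800 MT/s effective)
print(bandwidth_gbs(2500))   # 160.0 -> 1250 MT/s effective, e.g. faster GDDR4

def framebuffer_mb(width, height, bytes_per_pixel, msaa_samples):
    """Approximate size of one multisampled render target in MiB."""
    return width * height * bytes_per_pixel * msaa_samples / 2**20

# 1080p with FP16 RGBA colour (8 B/px) plus 32-bit depth/stencil (4 B/px)
# at 4x MSAA already costs ~95 MiB for a single surface:
print(round(framebuffer_mb(1920, 1080, 8 + 4, 4)))   # 95
```

With double-buffering and other render targets on top of a 680 MB texture budget, it is easy to see why the article doubts a 1 GB board could hold 1080p comfortably (its "320MB remaining" figure evidently treats 1 GB as 1000 MB).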
#3
holy smokes; AMD-ATI R600 ( Radeon X2900XTX ) GPU Technology Demo; "Ruby 4"
On 21 Mar 2007 13:29:23 -0700, "Radeon350" wrote:
> decent quality video http://www.hexus.tv/show.php?show=66
> low quality video http://www.youtube.com/watch?v=EJ6aMxPh6k0

Holy SMOKES is right. The part is in redesign again on 65nm, and the 65nm yield is pitiful. Rumor from CeBIT says that fewer than 20,000 total of the 'current design' will be made available for shipment starting in May ... it is not clear whether that is the 80nm heat-monster or the poor-yield 65nm version. And the date for full, unfettered production is unknown. Shades of the X1800-series fiasco all over again. I'm sure that nVidia must be rather amused by this hiccup. The nV partners have now shipped 500,000 8800-series graphics cards.

John Lewis
#4
holy smokes; AMD-ATI R600 ( Radeon X2900XTX ) GPU Technology Demo; "Ruby 4"
http://www.vr-zone.com/?i=4830
Radeon R600XTX [sic] retail board (the 9.5" version) photos and "benchmarks".

--
"War is the continuation of politics by other means. It can therefore be said that politics is war without bloodshed while war is politics with bloodshed."

"Radeon350" wrote in message oups.com...
> decent quality video http://www.hexus.tv/show.php?show=66
> low quality video http://www.youtube.com/watch?v=EJ6aMxPh6k0
[snip - full post quoted in #1 above]
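For anyone curious what "128 morph targets for facial animation and around 200 bones for skinning" actually means, here is a minimal NumPy sketch of those two standard techniques. Everything here (names, shapes) is illustrative, not taken from the actual demo:

```python
import numpy as np

def apply_morph_targets(base, targets, weights):
    """Blend-shape pass: v = base + sum_i w_i * (target_i - base).

    base:    (V, 3) neutral-face vertex positions
    targets: list of (V, 3) morph-target positions (e.g. 128 of them)
    weights: per-target blend weights, typically in [0, 1]
    """
    out = base.astype(float)
    for target, w in zip(targets, weights):
        out += w * (target - base)
    return out

def linear_blend_skin(verts, bone_mats, bone_weights):
    """Linear blend skinning: v'_n = sum_j w_nj * (M_j @ v_n).

    verts:        (N, 3) rest-pose positions
    bone_mats:    (B, 4, 4) bone transforms (e.g. ~200 bones)
    bone_weights: (N, B) per-vertex weights, each row summing to 1
    """
    homo = np.hstack([verts, np.ones((len(verts), 1))])        # (N, 4)
    skinned = np.einsum('nb,bij,nj->ni', bone_weights, bone_mats, homo)
    return skinned[:, :3]
```

A GPU implementation evaluates the same sums per vertex in a shader; the per-frame inputs are just the 128 morph weights and ~200 bone matrices, which is why this kind of animation is cheap to drive from captured facial data.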
#5
holy smokes; AMD-ATI R600 ( Radeon X2900XTX ) GPU Technology Demo; "Ruby 4"
"Radeon350" wrote in message oups.com...
> some of the technical details of the Ruby 4 demo:
[snip]

GREAT! I can see it now: a 1TB HDD required, and an HD DVD drive to handle all the damn high-resolution textures. Not to mention needing a super-fast HDD to cache the textures. I am assuming the bottom-end card will come with 1GB of RAM?
#6
holy smokes; AMD-ATI R600 ( Radeon X2900XTX ) GPU Technology Demo; "Ruby 4"
"John Lewis" wrote in message
> [...] X1800-series fiasco all over again. I'm sure that nVidia must be rather amused by this hiccup. The nV partners have now shipped 500,000 8800-series graphics cards.

Just as a matter of interest, where did that figure come from?
#7
holy smokes; AMD-ATI R600 ( Radeon X2900XTX ) GPU Technology Demo; "Ruby 4"
It is indeed impressive. The graphics look on par with the movie Final Fantasy. Still, NVidia has the better hardware and products, and I doubt that ATI will change this. ATI lost a lot of momentum with the X1900 flaws. NVidia's only real issue is that their drivers aren't so great for Vista; they've fallen behind in the driver race. Still, it is leaps better than what ATI was doing a few years ago.
#8
holy smokes; AMD-ATI R600 ( Radeon X2900XTX ) GPU Technology Demo; "Ruby 4"
On Mar 22, 1:44 am, "Magnulus" wrote:
> It is indeed impressive. The graphics look on par with the movie Final Fantasy.
[snip]

While the demo does look impressive, in no way does it look even close to the movie Final Fantasy: The Spirits Within. If GPUs were capable of FF:TSW in realtime, there wouldn't be much point in developing ever more powerful GPUs. That level of graphics won't happen in realtime until probably the middle to later part of the next decade, if that soon.
#9
holy smokes; AMD-ATI R600 ( Radeon X2900XTX ) GPU Technology Demo; "Ruby 4"
Magnulus wrote:
> It is indeed impressive. The graphics look on par with the movie Final Fantasy.
[snip]

Not to mention that AMD is getting their asses handed to them by Intel. I like the competition, so I hope AMD and ATI can get better. Drivers for Vista seem to be a huge problem with all cards. Microsoft deserves a few lumps here.