#1
Intel Larrabee - Rasterisation Focus Confirmed - It'll run DX11 games - Not just for Raytracing
Larrabee's Rasterisation Focus Confirmed
Wednesday 23rd April 2008, 08:33:00 PM, written by TeamB3D

For many months, researchers and marketing fanatics at Intel have been heralding the upcoming 'raytracing revolution', claiming rasterisation has run out of steam. So it is refreshing to hear someone actually working on Larrabee flatly denying that raytracing will be the chip's main focus.

Tom Forsyth is currently a software engineer working for Intel on Larrabee. He previously worked at RAD Game Tools on Pixomatic (a software rasterizer) and Granny3D, as well as MicroProse, 3Dlabs, and most notably Muckyfoot Productions (RIP). He is well respected throughout the industry for the high-quality insight on graphics programming techniques he posts on his blog. Last Friday, though, his post's subject was quite different:

"I've been trying to keep quiet, but I need to get one thing very clear. Larrabee is going to render DirectX and OpenGL games through rasterisation, not through raytracing. I'm not sure how the message got so muddled. I think in our quest to just keep our heads down and get on with it, we've possibly been a bit too quiet. So some comments about exciting new rendering tech got misinterpreted as our one and only plan. [...] That has been the goal for the Larrabee team from day one, and it continues to be the primary focus of the hardware and software teams. [...] There's no doubt Larrabee is going to be the world's most awesome raytracer. It's going to be the world's most awesome chip at a lot of heavy computing tasks - that's the joy of total programmability combined with serious number-crunching power. But that is cool stuff for those that want to play with wacky tech. We're not assuming everybody in the world will do this, we're not forcing anyone to do so, and we certainly can't just do it behind their backs and expect things to work - that would be absurd."

So, what does this actually mean for Larrabee, both technically and strategically?
Look at it this way: Larrabee is a DX11 GPU with a design team that took both raytracing and GPGPU into consideration from the very start, while not forgetting that performance in DX10+-class games that assume a rasteriser would be the most important factor determining the architecture's mainstream success or failure.

There's a reason for our choice of phrasing: the exact same sentence would be just as accurate for NVIDIA and AMD's architectures. Case in point: NVIDIA's Analyst Day 2008 dedicated a huge amount of time to GPGPU, and they clearly indicated their dedication to non-rasterised rendering in the 2009-2010 timeframe. We suspect the same is true for AMD. The frequent implicit assumption that DX11 GPUs will basically be DX10 GPUs with a couple of quick changes and exposed tessellation is weak. Even if the programming model itself wasn't significantly changing (it is, with the IHVs providing significant input into its direction), all current indications are that the architectures themselves will be significantly different from current offerings regardless, as the IHVs tackle the problem in front of them in the best way they know how, as they've always done.

The industry gains new ideas and thinking, and algorithms and innovation on the software side mean target workloads change; there's nothing magical about reinventing yourself every couple of years. That's the way the industry has always worked, and those which have failed to do so are long gone. Intel is certainly coming up with an unusual architecture with Larrabee by exploiting the x86 instruction set for MIMD processing on the same core as the SIMD vector unit. And trying to achieve leading performance with barely any fixed-function units is certainly ambitious. But fundamentally, the design principles and goals really aren't that different from those of the chips it will be competing with.
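That MIMD-plus-SIMD point is worth unpacking: each core runs its own independent x86 instruction stream (MIMD across cores), while a wide vector unit on each core applies one instruction to many data elements at once (SIMD within a core). Here's a purely illustrative sketch of the split - the 16-lane width matches the commonly rumoured Larrabee figure but is an assumption, and real vector hardware does each step in a single instruction, not a Python loop:

```python
# Illustrative sketch of the MIMD/SIMD split - not real Larrabee code.
# Assumption: 16 float lanes per vector unit (a rumoured figure only).
VECTOR_WIDTH = 16

def vmadd(a, b, c):
    # One "vector instruction": multiply-add applied to every lane at once.
    return [ai * bi + ci for ai, bi, ci in zip(a, b, c)]

def core_shade(colours, light):
    # One core's scalar x86 code driving its vector unit: each vmadd call
    # processes up to VECTOR_WIDTH pixels in a single step.
    out = []
    for i in range(0, len(colours), VECTOR_WIDTH):
        lane = colours[i:i + VECTOR_WIDTH]
        out.extend(vmadd(lane, [light] * len(lane), [0.0] * len(lane)))
    return out

# MIMD: while this core shades pixels, other cores could simultaneously
# run entirely different routines (rasterisation, physics, a raytracer)
# on their own data - that's the programmability the article refers to.
```

The design trade-off the article hints at: this flexibility replaces fixed-function hardware with software, which is exactly why hitting leading game performance with it is ambitious.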
It will likely be slightly more flexible than the NVIDIA and AMD alternatives, not least by making approaches such as logarithmic rasterisation acceleration possible, but it should be clearly understood that the differences may in fact not be quite as substantial as many are currently predicting.

The point is that it's not about rasterisation versus raytracing, or even x86 versus proprietary ISAs. It never was in the first place. The raytracing focus of early messaging was merely a distraction for the curious, so Intel could make some noise. Direct3D is the juggernaut, not the hardware.

"First, graphics that we have all come to know and love today, I have news for you. It's coming to an end. Our multi-decade old 3D graphics rendering architecture that's based on a rasterization approach is no longer scalable and suitable for the demands of the future."

That's why the message got so muddled, Tom. And no offence, Pat, but history will prove you quite wrong.

http://www.beyond3d.com/content/news/631
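For anyone lost in the rasterisation-versus-raytracing back and forth: both techniques ultimately answer the same question - "which pixels does this triangle cover?" - just from opposite directions. A rasteriser walks the pixels near a triangle and tests each against its edges (object to pixels); a raytracer fires one ray per pixel and asks what it hits (pixel to scene). A toy sketch, nothing like a real pipeline (the triangle, resolution, and orthographic camera are all made up for illustration):

```python
# Toy comparison: for a single flat screen-space triangle with an
# orthographic camera, both approaches produce identical coverage.

def edge(ax, ay, bx, by, px, py):
    # Signed area: positive when (px, py) lies to the left of edge a->b.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def inside(tri, px, py):
    (ax, ay), (bx, by), (cx, cy) = tri  # counter-clockwise winding assumed
    return (edge(ax, ay, bx, by, px, py) >= 0 and
            edge(bx, by, cx, cy, px, py) >= 0 and
            edge(cx, cy, ax, ay, px, py) >= 0)

def rasterise(tri, w, h):
    # Triangle-centric: only visit pixel centres in the bounding box.
    xs = [p[0] for p in tri]
    ys = [p[1] for p in tri]
    x0, x1 = max(0, int(min(xs))), min(w, int(max(xs)) + 1)
    y0, y1 = max(0, int(min(ys))), min(h, int(max(ys)) + 1)
    return {(x, y) for y in range(y0, y1) for x in range(x0, x1)
            if inside(tri, x + 0.5, y + 0.5)}

def raytrace(tri, w, h):
    # Pixel-centric: one orthographic ray through every pixel centre; the
    # ray/triangle hit test collapses to the same point-in-triangle check.
    hits = set()
    for y in range(h):
        for x in range(w):
            if inside(tri, x + 0.5, y + 0.5):  # ray travels along +z
                hits.add((x, y))
    return hits

tri = ((1.0, 1.0), (8.0, 1.0), (4.0, 7.0))
assert rasterise(tri, 10, 10) == raytrace(tri, 10, 10)
```

The interesting differences only appear at scale - secondary rays, scene traversal structures, coherence - which is why a fully programmable chip can do either, and why the either/or framing was always a false choice.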
#2
Intel Larrabee - Rasterisation Focus Confirmed - It'll run DX11 games - Not just for Raytracing
On Apr 23, 9:29 pm, AirRaid wrote:
[quoted article snipped - same Beyond3D text as above]

They should worry more about fixing drivers that cause BSODs (945 and 965 chipsets) and file corruption (965 chipset) before trying to get games working at 5fps on DX10/DX11.
#3
Intel Larrabee - Rasterisation Focus Confirmed - It'll run DX11 games - Not just for Raytracing
Ohhh! Brain Hurts ...sooo many big words ...So is the DX10 card I have but have never run a DX10 title on ('cause there ain't no point) now obsolete???

(\__/)
(='.'=)
(")_(") mouse

(where is John Lewis to explain things when we need him??)