#41
On Thu, 15 Apr 2004 20:55:42 +1000, Ricardo Delazy wrote:
On Wed, 14 Apr 2004 19:13:30 GMT, "teqguy" wrote:

Most MPEG encoding is processor dependent... I wish developers would start making applications that let the graphics card do video encoding, instead of dumping the work on the processor.

I'm pretty sure I read somewhere that the (new & improved) Prescott processor has been given a special hard-wired instruction set dedicated to encoding video, so that should speed things up somewhat. I remember reading an article over a year ago in which Intel gave a demo of a future-release CPU that was apparently running three full-screen HD videos simultaneously, rotating on a 3D cube. The prototype processor was not specified, but it may have been a Tejas, as it was rated at 5GHz.

SSE3 won't make Intel CPUs as fast as dedicated DSPs for video encoding. It can be an improvement over SSE and SSE2, but it's still not fast enough. They should have embedded a full DSP (or more than one) inside the CPU to achieve the same performance. The SSE subsets are still too tightly tied to the general-purpose x86 architecture, and their efficiency is poor compared to dedicated DSPs. A $40-50 floating-point DSP can be 3x faster than any SSE3-capable CPU at MPEG2/MPEG4 encoding.

If it's true that Nvidia has designed the NV40 as a full DSP, then it's just a matter of time and SDK availability before programmers can access the NV40's DSP through DirectX or other dedicated APIs, and before known codecs such as DivX can take advantage of GPU power. The only problem is that Nvidia needs a mainstream set of GPUs derived from this one, with MPEG encoding/decoding, on the market ASAP to set a standard before ATI releases its own DSP GPUs with MPEG encoding/decoding capability.
If the MPEG encoding/decoding in the NV40 were hardwired, it would be a pretty low-quality implementation, so I really hope the claims that the GPU is a full DSP are true. Then programmers with DSP experience could upload their own filter code onto the GPU's DSP to perform their own MPEG video encoding. I also hope the SDK for accessing the DSP features and reprogramming the MPEG video encoding will be free, so that even non-commercial, freeware encoders could become available in the future to further exploit the GPU's capabilities.
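To see why the posters above care so much about DSPs and SIMD for encoding: the hot inner loop of MPEG motion estimation is a sum-of-absolute-differences (SAD) comparison between pixel blocks, evaluated thousands of times per macroblock. Here is a minimal sketch of that loop (the function name and pure-Python form are mine, for illustration only; real encoders run this in SIMD such as SSE2's PSADBW, or on a DSP):

```python
def block_sad(cur, ref):
    """Sum of absolute differences between two 8x8 luma blocks.

    An encoder evaluates this for many candidate motion vectors per
    macroblock, which is exactly the workload that dedicated DSPs and
    SIMD instruction sets accelerate far better than scalar x86 code.
    """
    return sum(abs(c - r)
               for cur_row, ref_row in zip(cur, ref)
               for c, r in zip(cur_row, ref_row))

# Example: two flat 8x8 blocks differing by 2 per pixel -> SAD of 128.
cur = [[10] * 8 for _ in range(8)]
ref = [[8] * 8 for _ in range(8)]
print(block_sad(cur, ref))
```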
#42
"Mark Leuck" wrote:
"chrisv" wrote in message .. . "teqguy" wrote: The FX series is 28 to 23, ranging from the 5950 to the 5200. Better late then never, I guess. If you have nothing to contribute, shut up. If you're just going to post drivel, shut up. What's drivel is your obsessant need to critique everything anyone ever says. Wrong again. If I recall you are the same chrisv who stated over and over in alt.computer.storage how IBM never had a problem with it's last batch of Deathstar hard drives Too stupid to figure out that I was parodying the "great" Ron Reaugh with those posts. The google record proves that I in fact was very aware of the "Deathstar's" problems. Ignore the troll folks, he knows not what he says. It's true, you don't have a clue. |
#43
SCSI only operates at 320Mb/s.
320MB/s. But you need a lot of drives to saturate that.

Any single IDE drive could easily do 320Mb/s.

That's only 40MB/s.

Eric
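The whole disagreement in this post is lowercase-b megabits versus uppercase-B megabytes: divide by 8 to convert. A one-line sketch of the arithmetic (function name is mine):

```python
def mbit_to_mbyte(mbit_per_s):
    """Convert a rate quoted in megabits/s to megabytes/s (8 bits/byte).

    Ultra320 SCSI's 320 MB/s bus is 2560 Mb/s, whereas a drive pushing
    320 Mb/s is only moving 40 MB/s -- the correction Eric makes above.
    """
    return mbit_per_s / 8

print(mbit_to_mbyte(320))   # the "320Mb/s" claim, in MB/s
print(320 * 8)              # Ultra320 SCSI's 320 MB/s, in Mb/s
```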
#45
DaveL wrote:
I think Nvidia learned their lesson about that from the 5800U debacle. It was ATI that stayed with the old process and took the lead in performance. Meanwhile, Nvidia was struggling with fab problems.

DaveL

"Ar Q" wrote in message link.net: Isn't it time for NVidia to use a 0.09µm process? How could they put so many features in if they're still using a 0.13µm process?

Heat generation is still too much of a risk for moving to 90nm. If AMD moved to 0.09... I'd have a new toaster. Say goodbye to overclocking at that point.

The "features" can expand as much as they like... right now they aren't even using the entire wafer for such optimizations, only a small section. A lot of those optimizations are software-based too... the GPU just has to support the relative ballpark of them.
#46
"teqguy" wrote in message ... DaveL wrote: I think Nvidia learned their lesson about that from the 5800U debacle. It was ATI that stayed with the old standard and took the lead in performance. Meanwhile, Nvidia was struggling with fab problems. DaveL "Ar Q" wrote in message link.net... Isn't it time for NVidia to use 0.09um process? How could they put some many features if still using 0.13 um process? Heat generation is still too much of a risk for moving to 90-nm. If AMD moved to .09...... I'd have a new toaster. Say goodbye to overclocking at that point. The "features" can be expandable as much as they like.... right now they aren't even using the entire wafer for such optimizations, only a small section. That's going to be some big honked chip when they use the whole wafer. Jim M A lot of those optimizations are software based too... the GPU just has to be able to support the relative ballpark of them. |
#48
On Wed, 14 Apr 2004 17:17:07 GMT, "Ar Q" wrote:

"NV55" wrote in message om... the following is ALL quote: http://frankenstein.evilgeniuslabs.c...nv40/news.html

Tuesday, April 13, 2004. NVIDIA GeForce 6800 GPU family officially announced - Cormac @ 17:00. It's time to officially introduce the new GPU generation from NVIDIA and shed light on its architecture and features. So, the GeForce 6800 GPU family, codenamed NV40, today officially entered the distribution stage. Initially it will include two chips, the GeForce 6800 Ultra and the GeForce 6800, with the same architecture. These are the key innovations introduced in NVIDIA's novelties:

* 16-pipeline superscalar architecture with 6 vertex modules, DDR3 support
* Real 32-bit pipelines
* PCI Express x16, AGP 8x support
* 222 million transistors
* 400MHz core clock
* Chips made by IBM
* 0.13µm process

Isn't it time for NVidia to use a 0.09µm process? How could they put so many features in if still using a 0.13µm process?

The NV40 die is 0.75 inches square, and all the features are in there. The part will have been stress-tested by a vector-test program to completely exercise all of its functions before it is ever supplied to a third party for incorporation into a 6800 video card. Future generations of this GPU will be on a smaller process. The current NV40 chip is made by IBM. IBM is working on a 65nm (0.065µm) process that AMD will use when it is sufficiently mature; no doubt nVidia will be among the first users of that process as well. It should shrink the existing die area by a factor of 4 and also drop the power by about a factor of 6. It will probably take a couple of years to get there... nVidia will not make the mistake of using an immature process again.

John Lewis
#49
"joe smith" wrote in message ... pfft. You don't even know what the ATI offering is as yet, much less are you able to buy a 6800 until well into next month. No, I do not. I wrote that the rumor is that ATI wouldn't have 3.0 level shaders.. I was commenting on a rumor, if that isn't true then the situation is naturally entirely different. The confidentially / NDA ends 19th this month so soon after that we should begin to see cards dripping to the shelves like always (just noticed a trend in past 5-7 years, could be wrong but I wouldn't die if had to wait even 2 months.. or 7.. or 3 years.. the stuff will get here sooner or later.. unless the world explodes before that = [Snipped] 19th? Where did you get that date from? -- Derek |
#50
19th? Where did you get that date from?
"Confidential until April 19th 2004" stamped over slides, etc. material you find from here and there. |