Radeon 9800 Pro, XP 2400, 266 MHz, 1 GB PC2100 RAM: is this a good setup?



 
 
#11 - May 25th 04, 04:53 AM - David Maynard

Phil Weldon wrote:

> When images on a strip of film are projected onto a screen, the images are
> discrete, perhaps separated by a small blank interval, depending on the
> projector (one type has continuous film advance rather than intermittent,
> and uses a rotating prism synchronized with the film movement to project a
> single frame until the next moves into place.)


I am aware of how a movie projector works. You missed the point.

> As for frame rate on a computer monitor, there is absolutely no way for
> information to reach the screen faster than the frame rate of the monitor.


No one said it could.

> If frame synch is turned off, and the frame generation rate allowed to
> exceed the monitor frame rate, then GPU and CPU power is just being wasted
> because the extra will never reach the screen (to display at twice the
> monitor frame rate would mean that half the information is never displayed,
> and the GPU and CPU processing power would be better spent on increased
> image quality.) And then there would be the displacement with moving objects
> or a panning motion at some point in the displayed composite frame.
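
In whole-frame terms, the waste described there is simple arithmetic. A
minimal Python sketch, assuming a 60 Hz monitor and a hypothetical uncapped
120 fps renderer; note it counts whole frames only and ignores the
partial-frame "tearing" taken up later in the thread:

    # Sketch: with vsync off, whole frames finished between refreshes never
    # reach the screen. The 60 Hz display and 120 fps renderer are assumptions.
    REFRESH_HZ, RENDER_FPS, SECONDS = 60, 120, 1

    rendered = RENDER_FPS * SECONDS
    displayed = min(rendered, REFRESH_HZ * SECONDS)  # scanout caps what is shown
    wasted = rendered - displayed
    print(f"{rendered} rendered, {displayed} displayed, {wasted} never shown")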

The "frozen in time" effect is just as present in film as in CGI. After
all, the exposure time can be varied in cinemaphotography to freeze any
motion (the downside is that more sensitive film, a faster lens, increased
scene illumination, and or "pushed" development must be used.) And what
about CGI use in filmed movies ("Hell Boy", "Von Helsing", "Shreck II"?)


You mean they weren't real?

On the other hand, I doubt they were generated in real time on a PC.

> Those who report seeing a difference with computer display images when the
> computer frame rate is higher than the monitor display rate are either
> perceiving the "image tearing" you mention as meaningful screen action
> (indicating really short attention spans, really short)


It indicates no such thing. Just as perceiving 'purple' from three
phosphors, none of which are 'purple', doesn't 'indicate' you're damn fast
at Fourier calculations.

> or reacting to frame
> rate hype for 3-D accelerated adapter cards and games. Or maybe it is the
> aura of an impending seizure.


I told you I had doubts about it but for you to just whimsically dismiss
it, unless you have done the appropriate experiments, is a bit cavalier.

You are looking solely at the 'mechanics' of the 'device' and, using that
kind of analysis, it's also obvious that color television can't work
because there isn't enough bandwidth for the color information, by an order
of magnitude, and you simply can't reproduce the visible spectrum with 3
fixed wavelength phosphors. But it does work due to the peculiarities of
the human eye and human perception.

But I'm not going to 'argue' it with you because it isn't my theory and I'm
not an expert on it. I simply note that there ARE people who say it makes a
difference, based on experiments they've done, and they have a theory as to
why.

#12 - May 25th 04, 05:34 AM - Phil Weldon

I didn't explain to you how movie projectors work; I just responded to your
description of the 'theory' and the errors in that 'theory' as you describe
it. Evidently the expounders of that theory don't understand how movie
projectors work, nor how the cameras work either.

How the CGI composite images were generated has nothing to do with how they
are currently displayed in cinemas. The point is that they are quite
satisfying at 24 frames per second.

And of course color television isn't impossible, and WHAT bandwidth? USA
broadcast channels? Video amplifier bandwidth in television receivers?
Red bandwidth? Green bandwidth? Blue bandwidth? Video bandwidth? In NTSC
encoding, GREEN bandwidth is more than three times that of BLUE, something
like 1.7 MHz to 0.5 MHz, with RED bandwidth falling somewhere in between.

I don't whimsically dismiss theory; this one is bogus and I make fun of it,
as it deserves. NTSC television, on the other hand, depends on valid theories
that are confirmed, and depends on information that reaches the screen. The
video game rate "theory" evidently depends on information that does not
reach the screen. THAT is why I make fun of it.
--
Phil Weldon, pweldonatmindjumpdotcom
For communication,
replace "at" with the 'at sign'
replace "mindjump" with "mindspring."
replace "dot" with "."

"David Maynard" wrote in message
...
Phil Weldon wrote:

When images on a strip of film are projected onto a screen, the images

are
discrete, perhaps separated by a small blank interval, depending on the
projector (one type has continuous film advance rather than

intermittent,
and uses a rotating prism syncronized with the film movement to project

a
single frame until the next moves into place.)


I am aware of how a movie projector works. You missed the point.

As for frame rate on a comuter monitor, there is absolutely no way for
information to reach the screen faster than the frame rate of the

monitor.

No one said it could.

If frame synch is turned off, and the frame generation rate allowed to
exceed the monitor frame rate then GPU and CPU power is just being

wasted
because the extra will never reach the screen (to display at twice the
monitor frame rate would mean that half the information is never

displayed,
and the GPU and CPU processing power would be better spent on increased
image quality.) And then there would be the displacement with moving

objects
or a panning motion at some point in the displayed composite frame.

The "frozen in time" effect is just as present in film as in CGI.

After
all, the exposure time can be varied in cinemaphotography to freeze any
motion (the downside is that more sensitive film, a faster lens,

increased
scene illumination, and or "pushed" development must be used.) And what
about CGI use in filmed movies ("Hell Boy", "Von Helsing", "Shreck II"?)


You mean they weren't real?

On the other hand I doubt they were generated real time on a PC.

Those who report seeing a difference with computer display images when

the
computer frame rate is higher than the monitor display rate are either
perceiving the "image tearing" you mention as meaningful screen action
(indicating really short attention spans, really short)


It indicates no such thing. Just as perceiving 'purple' from three
phosphors, none of which are 'purple', doesn't 'indicate' you're damn fast
at fourier calculations.

or reacting to frame
rate hype for 3-D accelerated adapter cards and games. Or maybe it is

the
aura of an impending seizure.


I told you I had doubts about it but for you to just whimsically dismiss
it, unless you have done the appropriate experiments, is a bit cavalier.

You are looking solely at the 'mechanics' of the 'device' and, using that
kind of analysis, it's also obvious that color television can't work
because there isn't enough bandwidth for the color information, by an

order
of magnitude, and you simply can't reproduce the visible spectrum with 3
fixed wavelength phosphors. But it does work due to the peculiarities of
the human eye and human perception.

But I'm not going to 'argue' it with you because it isn't my theory and

I'm
not an expert on it. I simply note that there ARE people who say it makes

a
difference, based on experiments they've done, and they have a theory as

to
why.



#13 - May 25th 04, 10:53 AM - David Maynard

Phil Weldon wrote:

> I didn't explain to you how movie projectors work,



I didn't say you did. I said I know how they work, which includes your
"rotating prism" explanation and the rest.

> I just responded to your
> description of the 'theory' and the errors in that 'theory' as you describe
> it. Evidently the expounders of that theory don't understand how movie
> projectors work, nor how the cameras work either.


They understand it just fine.

> How the CGI composite images were generated has nothing to do with how
> they are currently displayed in cinemas.


More appropriately, "how they are currently displayed" has nothing to do
with how they are generated, which is the POINT of their theory: what
happens BEFORE it's sent to the display.

> The point is that they are quite satisfying
> at 24 frames per second.


Which is irrelevant to the idea they proposed.

> And of course color television isn't impossible,


Yes, of course. And I gave the reason why.

> and WHAT bandwidth?


The color information, as I said. Makes no difference 'where' in the whole
schlemiel we look; there is a LIMIT to how much color information can
POSSIBLY be there because of how it's encoded.

> USA
> broadcast channels? Video amplifier bandwidth in television receivers?
> Red bandwidth? Green bandwidth? Blue bandwidth? Video bandwidth? In NTSC
> encoding, GREEN bandwidth is more than three times that of BLUE, something
> like 1.7 MHz to 0.5 MHz, with RED bandwidth falling somewhere in between.


NTSC. The entire video bandwidth for luminance is 4.2 MHz. All of that is
available for 'B&W'. For color, the chroma subcarrier is modulated on top
of it at about 3.58 MHz and is comprised of two color information signals, I
and Q (since we can use those with the luminance to recreate 3 primary
color signals). The I signal is bandwidth limited to about 1.5 MHz with the
Q limited to about 0.6 MHz. That's the 'best' you could get without the
attendant phase and amplitude distortions resulting from broadcast
transmission.

(If you care, the I and Q are derived as follows:

I = 0.74 (R'-Y) - 0.27 (B'-Y) = 0.60 R' - 0.28 G' - 0.32 B'
Q = 0.48 (R'-Y) + 0.41 (B'-Y) = 0.21 R' - 0.52 G' + 0.31 B'

)
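
A minimal Python sketch of that arithmetic, using the RGB coefficients just
quoted; the 0..1 gamma-corrected inputs and the luminance weights are
assumptions (the Y weights are the standard NTSC ones, not stated above):

    # Sketch: NTSC luminance plus I/Q chroma from gamma-corrected R'G'B'
    # in 0..1, using the coefficients quoted above. The Y weights are the
    # standard NTSC values; they are an assumption, not from the post.
    def ntsc_yiq(r, g, b):
        y = 0.299 * r + 0.587 * g + 0.114 * b  # luminance, full 4.2 MHz
        i = 0.60 * r - 0.28 * g - 0.32 * b     # I: limited to ~1.5 MHz
        q = 0.21 * r - 0.52 * g + 0.31 * b     # Q: limited to ~0.6 MHz
        return y, i, q

    print(ntsc_yiq(1.0, 0.0, 0.0))  # saturated red -> (0.299, 0.60, 0.21)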

Now, subtract a 0.6 MHz bandwidth signal from a 4.2 MHz bandwidth signal and
the useful resulting signal is not going to contain any more resolution
than the lower of the two bandwidths: 0.6 MHz (plus uncorrected high
frequency luminance components, unless they're filtered out.)

The result is that NTSC color resolution STINKS. Which is one reason why
NTSC sets make incredibly lousy PC monitors.
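
To put a rough number on "stinks", a back-of-the-envelope sketch; the
~52.6 microsecond active line time is assumed from standard NTSC timing,
not stated in this thread:

    # Rough horizontal resolution from bandwidth: Nyquist gives two
    # resolvable samples per cycle across the active part of a scanline.
    ACTIVE_LINE_US = 52.6  # assumed NTSC active line time, microseconds

    def samples_per_line(bw_mhz):
        return 2 * bw_mhz * ACTIVE_LINE_US

    print(samples_per_line(4.2))  # luminance: ~441 samples per line
    print(samples_per_line(0.6))  # Q chroma:  ~63 samples per line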

But, as it turns out, the human eye is more sensitive to luminance
information than it is to color, so your mind's eye just doesn't give much
of a tinker's dam about how positively dismal the color resolution is when
viewing 'natural scenes' (as opposed to graphics/text) on a TV.
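
Digital video exploits exactly the same quirk through chroma subsampling,
which is a swapped-in modern analogue rather than anything named in this
thread; a toy sketch with made-up sample values:

    # Toy sketch of chroma subsampling: luma kept at full resolution,
    # one (I, Q) chroma pair stored per 4 pixels, repeated on display.
    luma = [10, 12, 50, 52, 90, 92, 30, 32]  # full-resolution Y samples
    chroma = [(0.5, 0.1), (-0.2, 0.3)]       # one pair per 4 pixels

    upsampled = [c for c in chroma for _ in range(4)]
    for y, (i, q) in zip(luma, upsampled):
        print(y, i, q)  # detail survives in Y; color is much coarser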


> I don't whimsically dismiss theory; this one is bogus and I make fun of it,
> as it deserves.


That's exactly what they said about Goddard's stupid notion that rockets
would work in the vacuum of space.

> NTSC television, on the other hand, depends on valid theories
> that are confirmed, and depends on information that reaches the screen. The
> video game rate "theory" evidently depends on information that does not
> reach the screen. THAT is why I make fun of it.


Your 'humor' about it is based on a false premise, then.

#15 - May 25th 04, 04:21 PM - Phil Weldon

Working from your description of this "theory" of the effect of frame rates
higher than the monitor refresh rate, I repeat my criticism: it depends on
information that does not reach the display. Color television is a completely
different matter. It depends on our perception of what DOES reach the
display, rather than what DOES NOT reach the display, an important
difference, and an example of what separates science from mysticism. I
will look through my back issues of the SMPTE journal, however, for mention
of something that bears on this "theory."

Anyway, the limitation of NTSC compared to PAL/SECAM is not so much
resolution, but control of color distortion in the broadcast path. The
vertical resolution increase that PAL/SECAM gains over NTSC is at the
expense of lower temporal resolution. The horizontal resolution increase
is at the expense (for broadcast) of increased spectrum cost.
Compare the number of television broadcast stations in the PAL/SECAM world
with the number of television broadcast stations in the NTSC world. Now
that decoding of HDTV-type broadcasts is possible in television receivers,
everything changes, and, like Richard Nixon, we won't have Never The Same
Color to kick around any more (well, he did make a comeback... hopefully
we'll be luckier with NTSC.)
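
The vertical-versus-temporal tradeoff falls straight out of the numbers:
both systems ended up near the same scanline rate, so extra lines per frame
must come out of fields per second. A minimal sketch (the line and field
counts are standard figures, not quoted in this thread):

    # Both systems settle near the same line rate (~15.7 kHz), so more
    # lines per frame necessarily costs field rate.
    systems = {
        "NTSC": {"lines": 525, "fields_per_sec": 59.94},
        "PAL":  {"lines": 625, "fields_per_sec": 50.0},
    }
    for name, s in systems.items():
        line_rate = s["lines"] * s["fields_per_sec"] / 2  # 2 fields/frame
        print(f"{name}: {line_rate:.0f} lines/s")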


"David Maynard" wrote in message
...
Phil Weldon wrote:

I didn't explain to you how movie projectors work,



I didn't say you did. I said I know how they work, which includes your
"rotating prism" explanation and the rest.

I just responded to your
description of the 'theory' and the errors in that 'theory' as you

describe
it. Evidently the expounders of that theory don't understand how movie
projectors work, nor how the cameras work either.


They understand it just fine.

How the CGI composite images were generated has nothing with how they

are
currently displayed in cinemas.


More appropriately, "how they are currently displayed" has nothing to do
with how they are generated, which is the POINT of their theory: what
happens BEFORE it's sent to display.

The point is that they are quite satisfying
at 24 frames per seconds.


Which is irrelevant to the idea they proposed.

And of course color television isn't impossible,


Yes, of course. And I gave the reason why.

and WHAT bandwidth?


The color information, as I said. Makes no difference 'where' in the whole
schlemiel we look, there is a LIMIT to how much color information can
POSSIBLY be there because of how it's encoded.

USA
broadcast channels? Video amplifier bandwidth in television receivers?
Red bandwidth? Green bandwidth? Blue bandwidth? Video bandwidth? In

NTSC
encoding, GREEN bandwidth is more than three times that of BLUE,

something
like 1.7 MHz to .5 MHz, with RED bandwidth falling somewhere in

between.

NTSC. The entire video bandwidth for luminance is 4.2 MHz. All of that is
available for 'B&W'. For color, the chroma subcarrier is modulated on top
of it at about 3.58Mhz and is comprised of two color information signals,

I
and Q, (since we can use those with the luminance to recreate 3 primary
color signals). The I signal is bandwidth limited to about 1.5 MHz with

the
Q limited to about .6 Mhz. That's the 'best' you could get without the
attendant phase and amplitude distortions resulting from broadcast
transmission.

(If you care, the I and Q are derived as follows"

I = 0.74 (R'-Y) - 0.27 (B'-Y) = 0.60 R' - 0.28 G' - 0.32 B'
Q = 0.48 (R'-Y) + 0.41 (B'-Y) = 0.21 R' - 0.52 G' + 0.31 B'

)

Now, subtract a .6Mhz bandwidth signal from a 4.2 MHz bandwidth signal and
the useful resulting signal is not going to contain any more resolution
than the lower of the two bandwidths: .6 MHz (plus uncorrected high
frequency luminance components, unless they're filtered out.)

The result is that NTSC color resolution STINKS. Which is one reason why
they make incredibly lousy PC monitors.

But, as it turns out, the human eye is more sensitive to luminance
information than it is to color so your mind's eye just doesn't give much
of a tinker's dam about how positively dismal the color resolution is when
viewing 'natural scenes' (as opposed to graphics/text) on a TV.


I don't whimiscally dismiss theory, it is bogus and I make fun of it, as

it
deserves.


That's exactly what they said about Goddard's stupid notion that rockets
would work in the vacuum of space.

NTSC television, on the other hand, depends on valid theories
that are confirmed, and depends on information that reaches the screen.

The
video game rate "theory" evidently depends on information that does not
reach the screen. THAT is why I make fun of it.


Your 'humor' of it is based on a false premise then.



#16 - May 25th 04, 11:33 PM - David Maynard

Phil Weldon wrote:

> Working from your description of this "theory" of the effect of frame rates
> higher than the monitor refresh rate, I repeat my criticism: it depends on
> information that does not reach the display.


No, it doesn't.

> Color television is a completely
> different matter. It depends on our perception of what DOES reach the
> display, rather than what DOES NOT reach the display, an important
> difference, and an example of what separates science from mysticism. I
> will look through my back issues of the SMPTE journal, however, for mention
> of something that bears on this "theory."


I would agree if your assumption were correct, but it's not.

> Anyway, the limitation of NTSC compared to PAL/SECAM is not so much
> resolution, but control of color distortion in the broadcast path.


If the point had been a comparison of NTSC vs PAL/SECAM you'd be correct,
but the point wasn't a comparison. The point was the inherent poor
resolution of the color information.

PAL/SECAM also transmit low bandwidth color information but the modulation
techniques compensate for broadcast anomalies better, at the expense of
more complicated circuitry and more expensive TV sets. However, they depend
on the same 'tricks of the eye' to work.

> The
> vertical resolution increase that PAL/SECAM gains over NTSC is at the
> expense of lower temporal resolution. The horizontal resolution increase
> is at the expense (for broadcast) of increased spectrum cost.
> Compare the number of television broadcast stations in the PAL/SECAM world
> with the number of television broadcast stations in the NTSC world. Now
> that decoding of HDTV-type broadcasts is possible in television receivers,
> everything changes, and, like Richard Nixon, we won't have Never The Same
> Color to kick around any more (well, he did make a comeback... hopefully
> we'll be luckier with NTSC.)


The point had nothing to do with comparing various TV formats. The point
was that the human eye 'compensates' for the sketchy pictorial information
and 'interprets' it into a 'reasonable' representation. That pictorial
information does not have to be 'technically correct', or even 'good',
because the human eye does peculiar things of its own in turning it into
"what you (think you) see."


#17 - May 26th 04, 12:18 AM - Phil Weldon

What, then, do any of your points have to do with sending more frames to
the display device than can be displayed, which, I thought, was the
contention of this "theory"? I agree, NTSC, PAL, and SECAM have nothing to
do with the contention EXCEPT for the fact that broadcast television depends
on perception of information that DOES reach the screen, NOT on information
that DOES NOT reach the screen, as your explanation of the "theory" of
excess frame rate indicates. Or maybe I have misinterpreted your
explanation, in which case more discussion is fruitless unless you can
clarify.

I hope the "theory" is not just that the CAPABILITY of frame rates far
beyond the display presentation frame rate indicates excess capacity to
handle peak graphics processing requirements well above the average graphics
processing requirements. If that is the contention, then I don't think
ANYONE would differ.



"David Maynard" wrote in message
...
Phil Weldon wrote:

Working from your description of this "theory" of the effect of frame

rates
higher than the monitor refresh rate, I repeat my criticism. It depends

on
information that does not reach the display.


No, it doesn't.

Color television is completely
different matter. It depends on our perception of what DOES reach the
display, rather than what DOES NOT reach the display, an important
difference, and an example of what separates science from mysticism. I
will though my back issues of the SMPT magazine, however, for mention of
something that bears on this "theory."


I would agree if your assumption were correct, but it's not.

Anyway, the limitation of NTSC compared to PAL/SCAM is not so much
resolution, but control of color distortion in the broadcast path.


If the point had been a comparison of NTSC vs PAL/SECAM you'd be correct
but the point wasn't a comparison. The point was the inherent poor
resolution of the color information.

PAL/SECAM also transmit low bandwidth color information but the modulation
techniques compensate for broadcast anomalies better, at the expense of
more complicated circuitry and more expensive TV sets. However, they

depend
on the same 'tricks of the eye' to work.

The
vertical resolution increase that PAL/SECAM gains over NTSC is at the
expense of lower temporal resolution. The horizontal resolution

increase
is at the is at the expense (for broadcast) of increased spectrum

cost.
Compare the number of television broadcast stations in the PAL/SECAM

world
with the number of television broadcast stations in the NTSC world. Now
that decoding of HDTV type broadcasts is possible in television

receivers,
everything changes, and, like Richard Nixon, we won't have Never The

Same
Color to kick around any more (well, he did make a comeback... hopefully
we'll be luckier with NTSC.)


The point had nothing to do with comparing various TV formats. The point
was that the human eye 'compensates' for the sketchy pictorial information
and 'interprets' it into a 'reasonable' representation. That pictorial
information does not have to be 'technically correct', or even 'good',
because the human eye does peculiar things of it's own in turning it into
"what you (think you) see."




#18 - May 26th 04, 01:55 AM - David Maynard

Phil Weldon wrote:

> What, then, do any of your points have to do with sending more frames to
> the display device than can be displayed, which, I thought, was the
> contention of this "theory"?


No, and it's obvious to even the most casual observer that you can't 'send
more frames than can be displayed', but you're so obsessed with insisting
that is 'the theory' that you won't give it 2 seconds of thought.

One idea we had talked about, before you jumped on this 'things that are
never seen' bandwagon, was a frame consisting partly of one image and partly
of the next, caused by the generated frame rate being faster than the display
frame rate. And while you seem to be absolutely convinced the observer is
demented, I can imagine the eye 'integrating' the effect just as it does
other 'fragmented' information in conventional TV images.

Whether it does, or not, I don't know as I've never done any experiments
with it.
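
A minimal sketch of where such a composite frame would split, assuming a
60 Hz scanout, a hypothetical 90 fps generation rate, and a 1024-line
screen (all three figures are assumptions for illustration):

    # Sketch: the scanline at which a newly generated frame replaces the
    # previous one mid-scanout (the 'tear' in the composite frame).
    REFRESH_HZ, GEN_FPS, LINES = 60.0, 90.0, 1024

    scan_period = 1.0 / REFRESH_HZ  # time to paint one full screen
    gen_period = 1.0 / GEN_FPS      # time between finished frames

    for n in range(1, 4):
        t = n * gen_period          # moment frame n becomes current
        tear_line = int((t % scan_period) / scan_period * LINES)
        print(f"frame {n} swaps in at scanline {tear_line}")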


> I agree, NTSC, PAL, and SECAM have nothing to
> do with the contention EXCEPT for the fact that broadcast television depends
> on perception of information that DOES reach the screen, NOT on information
> that DOES NOT reach the screen, as your explanation of the "theory" of
> excess frame rate indicates.


No, it doesn't, regardless of how many times you repeat it and I repeat
that it doesn't.

> Or maybe I have misinterpreted your
> explanation, in which case more discussion is fruitless unless you can
> clarify.
>
> I hope the "theory" is not just that the CAPABILITY of frame rates far
> beyond the display presentation frame rate indicates excess capacity to
> handle peak graphics processing requirements well above the average graphics
> processing requirements. If that is the contention, then I don't think
> ANYONE would differ.


#19 - May 26th 04, 04:01 AM - Phil Weldon

Well, I am neither obsessed nor on a bandwagon. Why don't you restate the
"theory" (which you made clear is not yours) to clarify the discussion?
We've been talking past each other. Perception of video images, both analog
and digital, has been exhaustively studied; there really isn't anything "new"
about video games that hasn't been studied in developing digital compression
and display of moving images.



"David Maynard" wrote in message
...
Phil Weldon wrote:

What, then, do any of your points have to do with sending more frames

to
the display device than can be displayed, which, I thought, was the
contention of this "theory".


No, and it's obvious to even the most casual observer that you can't 'send
more frames than can be displayed' but you're so obsessed with insisting
that is 'the theory' that you won't give it 2 seconds of thought.

One idea we had talked about before you jumped on this 'things that are
never seen' bandwagon was a frame consisting partly of one and partly of
the next, caused by the generated frame rate being faster than the display
frame rate. And while you seem to be absolutely convinced the observer is
demented I can imagine the eye 'integrating' the effect just as it does
other 'fragmented' information in conventional TV images.

Whether it does, or not, I don't know as I've never done any experiments
with it.


I agree, NTSC, PAL, and SECAM have nothing to
do with the contention EXCEPT for the fact that broadcast television

depends
on perception of information that DOES reach the screen, NOT on

information
that DOES NOT reach the screen, as your explanation of the "theory" of
excess frame rate indicates.


No, it doesn't, regardless of how many times you repeat it and I repeat
that it doesn't.

Or maybe I have misinterpreted your
explantation, in which case more discussion is fruitless unless you can
clarify.

I hope the "theory" is not just that the CAPABILITY of frame rates far
beyond the display presentation frame rate indicate excess capacity to
handle peak graphics processing requirements well above the average

graphics
processing requirements. If that is the contention, then I don't think
ANYONE would differ.




 



