Phil Weldon, May 25th 04, 03:19 AM
When images on a strip of film are projected onto a screen, the images are
discrete, perhaps separated by a small blank interval, depending on the
projector (one type has continuous film advance rather than intermittent,
and uses a rotating prism synchronized with the film movement to project
each frame until the next moves into place.)

As for frame rate on a computer monitor, there is absolutely no way for
information to reach the screen faster than the frame rate of the monitor.
If frame sync is turned off and the frame generation rate is allowed to
exceed the monitor frame rate, then GPU and CPU power is just being wasted,
because the extra frames never reach the screen (to display at twice the
monitor frame rate would mean that half the information is never displayed,
and the GPU and CPU processing power would be better spent on increased
image quality.) And then there is the displacement of moving objects, or of
a panning motion, at some point in the displayed composite frame.
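
To put a number on how much of that generated output can ever reach the
screen, here is a rough sketch (illustrative Python; the 60 Hz refresh and
120 FPS render rate are assumed example figures, not measurements):

# Illustrative only: counts how many rendered frames can actually reach
# a 60 Hz screen when the GPU renders faster than the refresh rate.
REFRESH_HZ = 60           # assumed monitor refresh rate
RENDER_FPS = 120          # assumed uncapped frame generation rate

refresh_interval = 1.0 / REFRESH_HZ
render_interval = 1.0 / RENDER_FPS

displayed = 0
wasted = 0
next_refresh = refresh_interval
frames_since_refresh = 0
t = 0.0

# Simulate one second of rendering: only the most recent frame at each
# refresh is scanned out; everything rendered in between is never seen.
while t < 1.0:
    t += render_interval
    frames_since_refresh += 1
    if t >= next_refresh:
        displayed += 1                      # last finished frame is shown
        wasted += frames_since_refresh - 1  # the rest never reach the screen
        frames_since_refresh = 0
        next_refresh += refresh_interval

print(f"displayed per second: {displayed}, wasted per second: {wasted}")
# At 120 FPS into a 60 Hz monitor, roughly half the rendered frames are wasted.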

The "frozen in time" effect is just as present in film as in CGI. After
all, the exposure time can be varied in cinemaphotography to freeze any
motion (the downside is that more sensitive film, a faster lens, increased
scene illumination, and or "pushed" development must be used.) And what
about CGI use in filmed movies ("Hell Boy", "Von Helsing", "Shreck II"?)
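
For a concrete sense of that trade-off, a back-of-the-envelope sketch
(illustrative Python; the shutter angles, frame width, and pan speed are
assumed example values):

# Exposure time vs. motion blur at 24 fps with a rotary shutter.
# All numbers here are assumed for the example.
FRAME_RATE = 24.0             # frames per second

def exposure_time(shutter_angle_deg: float) -> float:
    """Exposure per frame: (shutter angle / 360) * frame period."""
    return (shutter_angle_deg / 360.0) * (1.0 / FRAME_RATE)

def blur_width(speed_px_per_s: float, shutter_angle_deg: float) -> float:
    """How far an object smears, in pixels, during one exposure."""
    return speed_px_per_s * exposure_time(shutter_angle_deg)

# An object crossing a 2048-pixel-wide frame in one second:
speed = 2048.0
for angle in (180.0, 90.0, 45.0):
    print(f"{angle:5.0f} deg shutter: exposure {exposure_time(angle)*1000:.1f} ms, "
          f"blur ~{blur_width(speed, angle):.0f} px")
# Shorter exposures freeze the motion, but each halving costs a stop of
# light, hence the faster film, faster lens, more light, or pushed
# development mentioned above.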

Those who report seeing a difference in computer-displayed images when the
computer frame rate is higher than the monitor display rate are either
perceiving the "image tearing" you mention as meaningful screen action
(indicating really short attention spans, really short) or reacting to frame
rate hype for 3-D accelerated adapter cards and games. Or maybe it is the
aura of an impending seizure.

--
Phil Weldon, pweldonatmindjumpdotcom
For communication,
replace "at" with the 'at sign'
replace "mindjump" with "mindspring."
replace "dot" with "."

"David Maynard" wrote in message
...
Phil Weldon wrote:

I am just indicating that rating performance by frames per second may allow
comparisons, but is it a USEFUL comparison, especially since we all seem
perfectly satisfied by movies at 24 frames per second and television
displays at 30, 50, or 60 frames per second (I'm sorry, but PAL and SECAM at
25 frames per second gives me a headache.)


Yeah. PAL and SECAM 25 FPS (50Hz refresh) DOES flicker, their claims to the
contrary notwithstanding. I notice it too.

While I'm not sure I buy the whole theory, partly because it's not
something I've spent a lot of time on, there IS research which shows a
perceptible difference with frame rates 'too high to see', i.e. faster
than the monitor refresh interval.

The postulated reason is that, with movies and TV, you're taking a
'snapshot' of real-life movement, not a 'frozen in time' stagnant image
(things don't stop moving for your snapshot to take place), so there is
'smearing' of it over the frame interval and, they think, this provides
additional cues to the eye.

With computer generated frames, however, they ARE simply one stagnant
image after another, computer shifted the 'right amount', 'full frame' at a
time, to the next image to simulate movement. The idea is that
faster-than-the-refresh-rate frame generation creates a more lifelike
'moving picture' that the refresh rate is then taking the 'snapshot' of,
kind of like how 'real life' is moving all the time as the frame is taken.
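
One way to picture that postulated effect is to blend the sub-frames that
fall inside one refresh interval, the way a film exposure integrates real
motion. A rough sketch with assumed rates and a made-up pan speed (a real
GPU would simply show the most recent frame, not a blend):

# Illustrative sketch: blend sub-frames rendered between two refreshes,
# approximating the 'smear' a film exposure would record.
# All numbers below are assumed for the example.
REFRESH_HZ = 60
SUBFRAMES_PER_REFRESH = 4      # e.g. rendering at 240 FPS into a 60 Hz display

def object_position(t: float) -> float:
    """Position of an object panning across the screen, in pixels."""
    return 1200.0 * t          # assumed 1200 px/s pan

def blended_position(refresh_index: int) -> float:
    """Average the positions of the sub-frames inside one refresh interval."""
    dt = 1.0 / (REFRESH_HZ * SUBFRAMES_PER_REFRESH)
    start = refresh_index / REFRESH_HZ
    samples = [object_position(start + k * dt)
               for k in range(SUBFRAMES_PER_REFRESH)]
    return sum(samples) / len(samples)

# A single 60 Hz snapshot shows one frozen position; blending sub-frames
# places the object where it spent the interval, much as a film frame does.
print("frozen:", object_position(0.0), "blended:", blended_position(0))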

Seems to me that, if it were 'perceptible', it would appear more like
tearing, since it isn't as if the entire image were moving; only 'part' of
the frame would be in the 'new position'. But then I haven't run actual
human tests, so I would be speculating, whereas others claim to have
observed it. Also, by perceptible they don't mean consciously observable,
just that the observers seem to feel that the 'too fast' frame rates are
'more realistic'. Maybe the eye compensates for the 'partial' smear just as
it does for flicker and in recreating full color from 3 primaries. That
would make me think there is some minimum multiple (maybe an odd multiple
so it cycles through the image) before the effect would be effective,
again, like a minimum rate to remove flicker.
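
A rough illustration of where that 'partial' update would sit on screen
(illustrative Python; the 60 Hz refresh and 1080-line height are assumed,
and the blanking interval is ignored):

# Estimate which scanline a tear appears at when a new frame becomes
# ready partway through a 60 Hz scanout. Assumed numbers throughout.
REFRESH_HZ = 60
SCREEN_HEIGHT = 1080                 # lines scanned out top to bottom

scanout_time = 1.0 / REFRESH_HZ      # time to draw one frame top to bottom

def tear_line(frame_ready_offset_s: float) -> int:
    """Scanline at which the buffer swap lands during the scanout."""
    fraction = (frame_ready_offset_s % scanout_time) / scanout_time
    return int(fraction * SCREEN_HEIGHT)

# Rendering at twice the refresh rate: a new frame is ready halfway through
# each scanout, so the tear sits roughly mid-screen, with the lower half of
# the image in the 'new position' and the upper half in the old one.
print(tear_line(scanout_time / 2))   # ~540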


As for my personal use, I like the price on display adapters two
generations behind the bleeding edge. Paying $400 US or $500 US for
performance that is only helpful for a handful of 3-D game programs is a
very expensive performance boost for a very limited use. Money is better
spent boosting the performance of your entire system for a wide range of
uses.