From: Mr.E Solved!
Subject: How much of a speed increase?
Date: October 19th 07, 07:55 PM
Newsgroups: alt.comp.periphs.videocards.nvidia

Phil Weldon wrote:

> Well, the FPS rating is not exactly linear as a comparison benchmark.
> After all, who can SEE 1000 frames per second (or 100)?


The value of increased FPS has nothing to do with "seeing" state changes
on the monitor that last 1/1000th of a second (or any similarly small fraction).

Phosphors in displays have a medium-fast decay time that negates hyper-fast
visual changes. If you want to switch a pixel from purple to green and back
in 1/500th of a second, repeatedly, you need a different display technology.

The desirability of high FPS has everything to do with accurately
reproducing and synchronizing your client's gaming environment with the
Master Server's environment. Add to that burdensome task maintaining a
steady, bright and vivid image on your display, refreshing those phosphors
as intended. Then add to that the I/O of transferring those state changes
in and out of your PC.

Battlefield 2 servers are digital worlds that update 100 times a second;
they live in a 100 Hz universe. If your client PC can sustain 100 FPS and
redraw the display 100 times a second, then you have achieved perfect
game-state harmony and the most accurate arena possible (barring latency
spikes).
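To picture what "keeping up with the server" means, here is a rough Python
sketch of a fixed-timestep client loop locked to that 100 Hz tick. The
100 Hz figure is from the paragraph above; the function names, loop
structure and 10 ms budget are just mine for illustration, nothing to do
with BF2's actual netcode:

    import time

    SERVER_TICK_HZ = 100            # the server's claimed simulation rate
    TICK = 1.0 / SERVER_TICK_HZ     # 10 ms between world-state updates

    def client_loop(apply_server_state, render_frame, seconds=5.0):
        """Toy client loop: try to render once per server tick."""
        next_tick = time.perf_counter()
        deadline = next_tick + seconds
        while time.perf_counter() < deadline:
            apply_server_state()    # pull the latest world state (stub)
            render_frame()          # draw it (stub)
            next_tick += TICK
            sleep_for = next_tick - time.perf_counter()
            if sleep_for > 0:
                time.sleep(sleep_for)   # keeping up: in step with the server
            else:
                # fell behind: resync the clock, server states went unrendered
                next_tick = time.perf_counter()

    # e.g. client_loop(lambda: None, lambda: None, seconds=1.0)

The else branch is the whole point: the moment a frame takes longer than
10 ms, the loop cannot resynchronize without letting server states go by
unrendered.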

If I sustain 100 FPS and you sustain 50 FPS, I have twice as many
opportunities as you do and, all else being equal, the score will reflect
that. To answer your question, then: everyone can see the difference when
it is shown to them, but not everyone can explain it.
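Back-of-the-envelope, in the same toy Python, using the 100 Hz figure from
above and my own simplifying assumption that a client only ever shows the
server updates that land on one of its frames:

    SERVER_UPDATES_PER_SEC = 100   # the 100 Hz server world

    for client_fps in (100, 50):
        frame_time_ms = 1000 / client_fps
        updates_shown = min(client_fps, SERVER_UPDATES_PER_SEC)
        print(f"{client_fps:3d} FPS: a frame every {frame_time_ms:.0f} ms, "
              f"shows {updates_shown} of {SERVER_UPDATES_PER_SEC} server updates/s")

    # 100 FPS: a frame every 10 ms, shows 100 of 100 server updates/s
    #  50 FPS: a frame every 20 ms, shows  50 of 100 server updates/s
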

HTH.