One day I got to thinking about all this FPS (frames per second) business. There is so much talk about what's right, what's wrong, and what can and can't be seen. Considering I knew next to nothing aside from personal experience, I thought I'd do a little research and sort out a little fact from fiction. Many people argue about what makes a difference and what doesn't, specifically what the human eye can perceive. Some claim 24fps, others 30, some 60, some 200, some even upwards of 2000 and above; feel free to add any numbers in between. The truth of the matter is, every one of these people is both right and wrong in their own respect. Why? Because that's not how the brain works. Try as I might to make the information "flow", it's difficult because everything is intertwined, so I've broken things down into a few categories with things to consider.
Motion Blur vs. Sharpness
Here's something interesting you can try right now. Take your mouse and shake it slowly. Now shake it really fast, until you can't make out the outlines of the buttons. What's the frame rate? Is it low because it's blurry and you can't make out the features? Or is it high because it doesn't look choppy (don't you think it would be really freaky if fast-moving objects appeared at one point and then at another, with nothing in between)?
Let me answer that for you: it's neither. Simply put, as far as our brains are concerned, frames per second don't exist. Hypothetically speaking, if you could make out the individual outlines while shaking the mouse really fast, that would mean your eye was taking more "still shots", and you'd have to shake it even faster before the motion looked smooth again. You can see where the catch-22 comes in.
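If you'd like to see that catch-22 in rough numbers, here's a little toy sketch in Python (the speeds, widths, and rates are completely made up, purely for illustration). It just checks whether a moving object travels more than its own width between two "still shots"; if it does, a perfectly sharp, blur-free display would show it popping from one spot to the next.

```python
# Toy illustration (my own numbers, not measurements): does an object
# overlap its previous position between samples, or does it "teleport"?

def snapshots_overlap(speed_px_per_s: float, object_width_px: float,
                      sample_rate_hz: float) -> bool:
    """True if the object moves less than its own width between samples."""
    distance_per_frame = speed_px_per_s / sample_rate_hz
    return distance_per_frame < object_width_px

# A 20-pixel-wide cursor being shaken across the screen at 6000 px/s:
for rate in (30, 60, 240, 1000):
    smooth = snapshots_overlap(speed_px_per_s=6000, object_width_px=20,
                               sample_rate_hz=rate)
    verdict = "snapshots overlap" if smooth else "gaps between snapshots"
    print(f"{rate:5d} Hz -> {verdict}")
```

The exact numbers don't matter; the point is that without blur, the sampling rate you'd need keeps climbing with the speed of the motion, which is exactly the trap described above.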
Television
The most common "frame rate" on a television set is 24, with a range that goes from roughly 18 to 30 (these are approximations). I use quotes because TVs don't work the same way as computer screens. A television set doesn't render crisp individual frames; each image captures a span of time, so everything appears fluid because everything is blurred. Notice how still shots of action scenes are never the crispest images in the world? Ever wonder why? But then how come you can make out static details on the screen? Why can you see the same cracks in the stones watching The Matrix as you can playing Q3A? See below.
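Here's a tiny sketch of that blur idea, for what it's worth (my own illustration with made-up numbers, not how any real TV circuit works): blend several moments of time into one image. A detail that doesn't move survives the blend untouched, while a moving detail gets smeared, which is why the cracks in the stones stay crisp while the action does not.

```python
# Toy illustration (assumed numbers): averaging sub-frames over one
# "exposure" blurs moving detail but leaves static detail sharp.
import numpy as np

width = 16
static = np.zeros(width)
static[8] = 1.0          # one bright "crack" that never moves
subframes = 8            # how many moments get blended into a single image

blended_static = np.mean([static for _ in range(subframes)], axis=0)
blended_moving = np.mean([np.roll(static, shift) for shift in range(subframes)], axis=0)

print("static detail:", np.round(blended_static, 2))  # still a single crisp spike
print("moving detail:", np.round(blended_moving, 2))  # smeared across 8 pixels
```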
Brightness vs. Darkness
Because of the way the world is set up, light is much easier to recognize than the absence of it. To put it bluntly, it's a lot easier to notice a flash of light in a dark room than a flash of darkness in a bright room. The difference isn't apparent until the event time drops to hundredths of a second. On a flicker-free TV set you can't see the black between refreshes even though the refresh rate is only 100Hz (100 times per second), whereas tests on Air Force pilots have shown they could not merely notice but identify the type of aircraft when shown an image for only 1/220th of a second. Furthermore, eye sensitivity varies across the field of view (you can't notice the flicker of a 60Hz monitor looking at it head-on, but it's quite obvious when you catch it in your peripheral vision), which pushes these rates even further. Just please don't get me started on the whole subliminal message trip.
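For a bit of back-of-the-envelope context (my own arithmetic, not figures from the tests themselves), here is how long a single refresh or flash lasts at the rates mentioned above:

```python
# Rough arithmetic only: duration of one cycle/flash at each rate.
for label, rate_hz in (("60Hz monitor refresh", 60),
                       ("100Hz 'flicker-free' TV refresh", 100),
                       ("1/220th of a second flash", 220)):
    print(f"{label:32s} -> {1000.0 / rate_hz:5.1f} ms")
```

So the aircraft image lasted less than half of one 100Hz refresh period, and the pilots still identified the plane; what you can see clearly depends on more than raw duration.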
Computers and Monitor Refresh Rates
A lot of people despise vertical sync, since it caps your maximum frame rate at your monitor's refresh rate (assuming we're talking about a CRT). Some things to consider: is running a game below your CRT's refresh rate smooth? Does it look better synchronized to the refresh rate? Is it best to run above it? What about multiples (i.e. 170fps with an 85Hz refresh rate)? Personally, I haven't noticed any differences; just don't confuse that with variations in FPS. Current LCDs don't have refresh rates higher than 30-40Hz, but they're progressive, so things may look blurry, but they won't be choppy. And if you want to crank up all the features, you won't be getting 100+ fps anyway. Just a thought.
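To make the vertical sync point a bit more concrete, here's a rough sketch (again my own, and it assumes a simple double-buffered setup that always waits for the next vertical blank, which not every driver actually does): with vsync on, a finished frame can only be shown on a refresh boundary, so your effective frame rate snaps to the refresh rate divided by a whole number.

```python
# Sketch under an assumed simple double-buffered vsync model:
# a frame waits for the next refresh, so frame time rounds up to a
# whole number of refresh intervals.
import math

def displayed_fps(render_time_ms: float, refresh_hz: float) -> float:
    refresh_interval_ms = 1000.0 / refresh_hz
    intervals_needed = math.ceil(render_time_ms / refresh_interval_ms)
    return 1000.0 / (intervals_needed * refresh_interval_ms)

# An 85Hz CRT: a frame that takes just a bit too long halves your frame rate.
for render_ms in (5.0, 11.0, 12.0, 24.0):
    print(f"render {render_ms:4.1f} ms -> {displayed_fps(render_ms, 85):5.1f} fps shown")
```

That's where the "multiples" intuition comes from: on an 85Hz CRT you tend to sit at 85, 42.5, or 28.3 fps rather than drifting smoothly in between.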