Frame rate has become a controversial topic recently, so here’s my explanation of why almost everything developers say about frame rate is a lie.
What Is Frame Rate?
For anyone who doesn’t know, the term “frame rate” basically covers how many updates you see per second. Because celluloid film was just a collection of static images, it was worked out in the early days of cinema that you could establish a convincing impression of movement using around 24 frames per second (24fps).
Why Isn’t 24fps Good For Gaming?
There’s too much old tech chat to cover here, but the similar numbers used by television (25fps for PAL, roughly 30fps for NTSC) date back to old TV hardware. You can see some of that in this NTSC vs PAL article.
But games aren’t a series of static images – the things being rendered are moving in real time. That means you don’t just need the illusion of movement, you need responsiveness. The time between rendered frames actually matters more. At 30fps, when you hit a button, the game may not even register the input until the next update, up to 0.033 seconds later. At 60fps that worst case drops to about 0.017 seconds. It doesn’t sound like much, but once you take reaction speed and TV refresh rates into account, it all adds up. More frames means less delay.
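Those numbers are just the reciprocal of the frame rate. A quick sketch of the arithmetic (these are worst-case waits before the next update only – real input latency also stacks engine, driver and display delays on top):

```python
# Worst-case wait before the next update at common frame rates.
# This is only the frame interval; total input latency adds engine,
# driver and display delays on top.
for fps in (30, 60, 120):
    frame_time = 1 / fps  # seconds between updates
    print(f"{fps}fps: up to {frame_time * 1000:.1f} ms between updates")
```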
But 30fps Is More Cinematic! The Hobbit Looked Weird At 48fps!
You want to know why The Hobbit looked weird at a higher frame rate? I’ll explain. With a typical Hollywood frame rate, you get motion blurring when the camera and actors move. This hides a lot of fine details, even in high definition. By increasing the frame rate you’re lowering the amount of blur, meaning that any prosthetics and props can’t hide their lack of detail. Did you think the CGI looked “weird” in The Hobbit too? Probably not, because CGI doesn’t suffer the same effects. And games are, by definition, CGI.
And please don’t use the word “cinematic” to describe games. I’ll let Jim Sterling explain why.
But Developer X Said It’s Better!
They’re lying, and I’ll tell you why. There is no denying that movement is more fluid at higher frame rates. Whether or not you can tell the difference is much more subjective. However, one thing that isn’t subjective is how much detail can be added to the graphics. Higher texture quality. More lighting and particle effects. Those are not subjective changes. All of the things that make a static frame look better also eat into the horsepower available to push those frames out. So they’re sacrificing smoothness that not everyone notices in order to add detail that everyone will notice.
When It All Goes Wrong
Frame rate problems aren’t limited to gameplay feel – there are other ways things can go wrong. Artificially capping the frame rate is something that happens to PC games a lot, and there’s no good reason for it. If your hardware supports more, why aren’t they letting you run it that way? PC gaming is supposed to be diverse and backward compatible. Forcing high-end gamers to run at lower levels is stupid and pointless. But there’s a second mistake that comes into play here…
Tying your physics to your frame rate. This happens a lot, and it’s even more stupid. Some developers run their physics engines at 30fps as well (which is too low, but that’s another issue), so the simulation assumes every update covers the same fixed slice of time – which means that if you mod a game to run faster, the physics go to hell. There’s no reason to do this at all, so please don’t.
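The standard fix is a fixed-timestep loop: the renderer runs as fast as it likes, while an accumulator doles out physics updates in constant increments. A minimal sketch – the function name and the 120Hz step are my own illustrative choices, not any particular engine’s API:

```python
def step_physics(frame_dt, accumulator, dt=1 / 120):
    """Advance the simulation in fixed dt steps, however long the
    rendered frame took. Returns (physics_steps, leftover_time).
    Hypothetical sketch: names and the 120Hz default are illustrative."""
    accumulator += frame_dt
    steps = 0
    while accumulator >= dt:
        # Every physics update sees the same dt, so the simulation
        # behaves identically whether you render at 30fps or 500fps.
        accumulator -= dt
        steps += 1
    return steps, accumulator
```

Whether the renderer delivers a frame in 1/30s or 1/240s, the physics only ever advances in 1/120s increments; a fast frame just banks its time in the accumulator for later.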
Options. Let high-end PC gamers run at 500fps if they have the hardware to do it. PC games don’t force medium quality textures on everyone just because it’s “console-friendly” or whatever, so why not do the same with frame rate? Give people the option to choose their target frame rate along with their graphical detail. Problem solved!
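Exposing that option isn’t much code, either. A hypothetical sketch of a player-selectable frame cap (the function and its parameters are illustrative, not from any real engine):

```python
import time

def make_frame_limiter(target_fps=None):
    """Return an end-of-frame callback that enforces an optional cap.
    target_fps=None means uncapped - let the high-end crowd have their
    500fps. Hypothetical sketch, not any real engine's API."""
    budget = None if target_fps is None else 1.0 / target_fps
    last = time.perf_counter()

    def end_of_frame():
        nonlocal last
        if budget is not None:
            elapsed = time.perf_counter() - last
            if elapsed < budget:
                # Sleep off the leftover frame budget to hold the cap.
                time.sleep(budget - elapsed)
        last = time.perf_counter()

    return end_of_frame
```

Wire that up to the same menu that already offers texture and shadow quality, and the target frame rate becomes just another slider.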