The next generation of gaming has promised consumers a new level of technical fidelity for the games they play. In the early days of the PS4 and Xbox One, this meant not only beautiful textures and dynamic lighting presented in 1080p, but rock-steady 60fps (frames per second) gameplay as well, the current holy grail of framerate enthusiasts.
Things looked auspicious in the beginning. Games like MGSV: Ground Zeroes and Assassin's Creed IV: Black Flag hit these benchmarks admirably (at least on PS4). But if you've been paying attention to the latest gaming news, you might have noticed that several of the big upcoming releases, including The Order: 1886 and Bloodborne, while holding firm to 1080p, are targeting a not-so-impressive 30fps. Other games, such as Ubisoft's Assassin's Creed Unity, while still reportedly "aiming" for 60fps, are showing signs in early testing that this might be wishful thinking.
So why is this happening? Why would developers slide back on their promise to deliver us butter-smooth framerates for our games? Is it because the new consoles simply can't handle 60fps? No, they obviously can and have. So what is the reason?
The simple truth is that the handful of 60fps games that came out at the beginning of the PS4/Xbox One console generation belong to a brief framerate golden age that was destined to pass away quickly. It's not that developers couldn't keep making games that run at 60fps, because they certainly could. It's that while developers are convinced that great graphics help sell their products, they don't believe high framerates have the same positive effect.
Mike Acton of Insomniac Games sums up the mentality succinctly in his company's report on the importance of framerate for sales:
[...]during development, there are hard choices to be made between higher quality graphics and framerate. And we want to make the right choices that reflect our commitment to providing you with the best looking games out there. To that end, our community team did some research into the question of framerate. The results perhaps confirmed what I've known for a long time, but found it difficult to accept without evidence. They found that:
- A higher framerate does not significantly affect sales of a game.
- A higher framerate does not significantly affect the reviews of a game.
What this means is that developers have a clear incentive to push the graphics of their games to the upper limit of what their hardware can handle, because better graphics mean better sales. Developers do not, on the other hand, have any clear incentive to deliver higher framerates, as higher framerates do not tangibly influence sales. Thus, when it comes time for making hard decisions, developers will almost always give preference to graphics over framerates.
It's important to realize that it will always come down to making these hard choices. Console hardware is inherently limited in ways that PCs are not. Developers can tweak and streamline software to better take advantage of a console's resources, but they can't add more RAM, for example, should they decide they need it. A console's material resources are, for all intents and purposes, finite and fixed. Every developer has to choose how they are going to allocate those resources. They can choose to give priority to higher framerate or to graphics, but choose they must. And the fact is, the choice will always be to prioritize the graphics of a game.
Some might wonder why game developers can't have their cake and eat it too. The key here is to understand that increasing the graphical fidelity of a game dramatically increases the processing power needed to sustain a high framerate. With cross-gen games like Ground Zeroes and Assassin's Creed IV, the graphics could only be pushed so far because the games had to be feasible on the older consoles. That left plenty of room for higher framerates. The newer, next-gen-only games, however, don't have that limitation imposed on them. Instead, developers are free to push the graphics to the system's limits. Will developers hold back on the graphics in this situation to ensure they have resources for running the game at 60fps? Absolutely not. And as the graphics arms race continues over the next several years, the chance of anyone getting a game to run at 60fps will be approximately nil.
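The trade-off is easiest to see in frame-time terms: at 60fps a frame must finish in about 16.7 milliseconds, while 30fps allows about 33.3. A minimal sketch, with purely hypothetical per-task costs (not measured from any real engine), shows how a graphically ambitious frame can fit one budget but not the other:

```python
# Frame budgets: milliseconds available per frame at each target framerate.
BUDGET_60FPS_MS = 1000 / 60  # ~16.7 ms
BUDGET_30FPS_MS = 1000 / 30  # ~33.3 ms

# Hypothetical per-frame costs in milliseconds (illustrative assumptions only).
frame_costs = {
    "game logic / AI": 5.0,
    "physics": 4.0,
    "dynamic lighting": 12.0,
    "post-processing": 6.0,
}

total = sum(frame_costs.values())  # 27.0 ms

print(f"Total frame cost: {total:.1f} ms")
print(f"Fits 30fps budget: {total <= BUDGET_30FPS_MS}")  # True
print(f"Fits 60fps budget: {total <= BUDGET_60FPS_MS}")  # False
```

Under these made-up numbers, hitting 60fps would mean cutting roughly 10ms of rendering work, which is exactly the "graphics versus framerate" choice developers keep resolving in favor of graphics.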
Just take a look at what Dana Jan, game director for The Order, has to say on the issue:
I don't know of any other games that are gonna look like our game[...]with all the stuff that's going on lighting-wise, and run at 60. I think that's probably the thing that most people underestimate is [that] to make a game look like this—the way that they're lit, the number of directional lights that we have… We don't have a game where you're just outside in sunlight, so there's one light. We have candles flickering, fires, then characters have lights on them. So [to make] all those lights [work] with this fidelity means, I think, until the end of this system most people won't have any clue how to make that run 60 and look like this
Not until the end of this console generation's life does Jan think we will see games that look as good as The Order running at 60fps. This doesn't bode well for the future of 60fps console gameplay. Because you can bet your life that developers looking at The Order today aren't thinking to themselves, "How can I get a game that looks as good as The Order to run at 60fps?" No, they're thinking, "How can I make a game that looks even better than The Order run at a passable framerate?"