Near what appears to be the end of the current console generation, Sony and Microsoft released enhanced versions of their consoles that emphasized 4K play. The PS4 Pro and Xbox One X don't demand a 4K display, but a big part of justifying their premium price tags depends on whether you own a 4K display and how good that display is.
Many have reasonably assumed that the PS4 Pro and Xbox One X are a sneak peek at what Sony and Microsoft have planned for the next generation of consoles. In many ways, that will probably prove true. However, there are real questions about whether those next consoles will effectively require 4K displays.
To be clear, nobody expects a next-generation console to refuse to run without a 4K display. But if you think it's absurd to even suggest that consoles could effectively demand one, that's probably because you weren't around for – or don't remember – the launch of the Xbox 360. The Xbox 360 was released at a time when homes everywhere were embracing HD displays. While the console didn't technically require an HD display, owners of standard-definition sets soon discovered that HD was more than a recommendation. Elements like in-game text were optimized for HD to such a degree that many games were technologically hostile to anyone playing over a composite connection.
There are some who think that the next generation of consoles will “soft-require” 4K displays in the same way. There’s some logic to that idea. 4K displays are becoming more popular, the PS4 Pro and Xbox One X showed us the clear benefits of emphasizing 4K, and there’s something to be said for encouraging – or demanding – the use of a technology that is strictly better than what came before.
However, that logic really only works if you believe that the HD revolution is the same as the “4K revolution.” There are several reasons why that isn’t the case.
The idea of 4K being strictly better than high-end HD isn't exactly true. Yes, a 4K display looks better than an HD display in a side-by-side comparison, but video games have to account for a lot more than raw image quality. Frame rate is the prime example. Just as it sounds, frame rate describes how many frames per second a screen displays. At present, 60 FPS is the baseline for high-end play: while some high-end gaming machines can go higher, 60 FPS is the ideal target for consoles.
For many, 60 FPS is a more important metric than 4K. The difference between a game running at 30 FPS and one running at 60 FPS is immediately noticeable to most gamers, while it takes a more discerning eye to spot the difference between 4K and 1440p. More importantly, a higher frame rate delivers tangible benefits during interactive gameplay, whereas 4K often matters more for movies, where there is no user input. That's why, at the moment, 60 FPS is the more immediate gaming technology goal than 4K.
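The trade-off above comes down to simple arithmetic: a higher frame rate shrinks the time budget per frame, while a higher resolution multiplies the pixels that must be drawn inside that budget. A minimal sketch (the resolutions are standard pixel counts; the "workload" figure is just raw pixels per second, ignoring everything else a GPU does):

```python
# Back-of-the-envelope comparison of rendering workloads at common
# resolutions and frame rates.

RESOLUTIONS = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K":    3840 * 2160,
}

def frame_time_ms(fps):
    """Time budget per frame, in milliseconds, at a given frame rate."""
    return 1000 / fps

def pixels_per_second(resolution, fps):
    """Raw pixel throughput the GPU must sustain (pixels drawn per second)."""
    return RESOLUTIONS[resolution] * fps

# At 60 FPS a frame must be finished in about 16.7 ms; at 30 FPS, 33.3 ms.
print(f"60 FPS frame budget: {frame_time_ms(60):.1f} ms")
print(f"30 FPS frame budget: {frame_time_ms(30):.1f} ms")

# 4K/60 pushes 2.25x the pixels of 1440p/60 and 4x the pixels of 1080p/60.
ratio = pixels_per_second("4K", 60) / pixels_per_second("1440p", 60)
print(f"4K vs 1440p workload at 60 FPS: {ratio:.2f}x")
```

In other words, chasing 4K at 60 FPS more than doubles the pixel workload of 1440p at the same frame rate, which is exactly why developers so often trade resolution for frame rate rather than the reverse.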
The reason that matters is that very few consumer graphics cards on the market can run every major video game at 4K and 60 FPS. With the design of next-gen consoles already underway, you can rest assured that neither the PS5 nor the Xbox 2 will feature a GPU that can consistently hit that mark in all current and future titles. Because of that, you can also feel safe buying a next-gen console without upgrading your TV.