Ask the Experts: Gaming PC Hardware in 2020 and Beyond

Highlights:

  • Virtual reality (VR).

  • Upcoming PC hardware.

  • Games in 2020.

  • Display technology.

The last decade brought huge changes to the way PC games are played, with dramatic developments across every sector of PC hardware.

To gain some insights on what’s coming next, we asked some of the hardware and technology experts at Intel about how they see PC gaming continuing to evolve over the next few years.

We kicked things off by speaking with Bryn Pilney, one of Intel’s PC gaming specialists.

Is 2020 the year VR hits the mainstream?

For VR, it really depends on what you consider “mainstream”. Over the last few years we've seen a lot of positive movement in the VR space: for example, lower-priced PC-based VR options like Windows* Mixed Reality, and inside-out tracking that doesn't require external cameras to operate.

We're also starting to see the software ecosystem develop. When premium head-mounted displays like the Vive and Oculus initially came out, one of the common complaints was the lack of AAA experiences, but it takes time to create experiences tailored specifically for this new hardware. Now that developers have had a few years to get comfortable with developing software designed for newer headsets, we're starting to see ambitious VR titles like Half-Life*: Alyx coming out this year.

Based on Steam sales, we've already seen that a solid AAA-developed title can move headsets. I think it's going to be a huge motivator for people to get into VR. Hopefully this is the first of many.

Why do you think this change is happening now, when these headsets have been on the market for a few years?

One of the biggest developments is that headsets with the solid feature sets needed for full-fledged experiences, like integrated tracking, became more affordable.

We're also starting to see some expansion at the top end of the stack as well. For people who want a more premium experience, newer headsets are working to eliminate some of the gripes from the first-generation headsets, like the “screen door” effect, by utilizing different panel technologies along with higher resolutions and refresh rates.

Whether or not people have been paying attention, there has been a lot of positive momentum in this space. That said, it's going to take more than one software title to make VR hit the “mainstream”.

What PC hardware trends are you most excited about in the next few years?

It's hard to say what will become the next trend, but it's encouraging to see relatively narrow communities continue to form and grow. It speaks to how passionate the PC community is as a whole.

The mechanical keyboard community is a great example of one that started as a relatively small group of enthusiasts but evolved into a mainstay of PC gaming. The number of people who previously never thought about their keyboard but now have a distinct preference demonstrates that people are becoming more particular about not just their computer, but how they interact with it.

While keyboards are an obvious example, similar traction can be found among mouse enthusiasts as well. Communities are embracing just about every peripheral, from mousepads to audio equipment. More established communities even have specific offshoots, like display fanatics who can't get enough of ultrawides.

Of course, I can't ignore PC hardware enthusiasts. An excellent example of a hardware-based community that continues to grow is the SFF (small form factor) case community. People frequently create their own cases, sometimes just for personal use, but many also bring them to market.

What games are you most excited to play in 2020?

I'm still playing World of Warcraft Classic*, leading a raid group and getting ready for the release of Blackwing Lair* in Phase 3, so I anticipate 2020 will still be a strong WoW year for me.

I also recently bought a Valve Index*, so I'm playing through Half-Life* again in anticipation of Half-Life*: Alyx.

Generally speaking, I'm hoping 2020 will be a great year for VR. Gaming can be a great social activity, but many of the best VR titles are designed to be experienced alone. As more people get their hands on HMDs, and tools like Valve's Hammer* are updated to include VR-specific tools and components, I'm hoping to see a rise in multiplayer experiences that encourage people to keep putting on their headsets.

Next we asked Roland Wooster, a Principal Engineer and Display Technologist with Intel, about the future of display technologies.

What new display technologies are you most excited about?

That would have to be High Dynamic Range (HDR). There is no other new display technology with such an immediately obvious benefit. Unlike resolution increases, where the benefit is somewhat dependent on how good your vision is, how big the screen is, and what distance you’re viewing the screen from, HDR provides a significant benefit at any distance, at any resolution, and isn’t dependent upon the quality of your vision.

That said, it’s not quite a slam dunk, as there are many variables that affect the quality of HDR and how significant the benefit will be.

It’s easy to slap an “HDR” sticker on a device, but a sticker alone doesn’t make for a better product. A sticker just saying “HDR” or “HDR10” doesn’t prove anything, and has no numerical basis. To really illustrate this point, some HDR devices provide the bare minimum of functionality to call themselves HDR10. At the lowest end of capability, just being able to process HDR10 input signals and output something to the screen counts as HDR10.

Some displays simply do this with last year’s Standard Dynamic Range (SDR) panels, but are now sold as HDR because they can interpret the HDR10 input signal. However, the output to the screen will barely be any different than a traditional SDR display. In fact, if done badly, it might even be worse.

VESA answered this challenge by developing a robust, fully open, public standard and logo program with a numerical basis in HDR display performance: the DisplayHDR standard. The specs, test routines, automated test tool, and test templates are all available for free from www.DisplayHDR.org, and were updated in September 2019 to version 1.1 with more stringent performance tests.

Currently VESA has seven separate performance levels for HDR. At a high level they focus on the peak luminance of the display, and this number is indicated in the logo: 400, 500, 600, 1000, and 1400, plus a parallel set of specs at the 400 and 500 levels for emissive-pixel displays, which can achieve perfect black. Beyond simply measuring peak luminance, the VESA specs place requirements on dimming capability, color gamut, and a number of other critical features that help ensure a quality HDR display.

The DisplayHDR program was established two years ago. At least 18 of the top display vendors use the standard, and well over 125 products have been certified. If you’re shopping for an HDR display, make sure it is VESA DisplayHDR certified.

What are some reasons HDR is an advantage over SDR?

One of the key differences between HDR and SDR, beyond the increased luminance, is the ability to perform local dimming. This enables some segments of the screen to be very bright while other segments are very dark, which increases the contrast across the screen and achieves more life-like images.
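To put rough numbers on that benefit, here is a quick sketch in Python. The luminance values are illustrative assumptions, not measurements of any particular display.

```python
# Rough illustration of why local dimming raises on-screen contrast.
# All luminance values (in cd/m2, i.e. nits) are illustrative assumptions.

def contrast_ratio(bright_nits: float, dark_nits: float) -> float:
    """Simultaneous contrast between the brightest and darkest regions."""
    return bright_nits / dark_nits

# A typical SDR panel with a single global backlight: the backlight
# level is one compromise value, so blacks stay somewhat elevated.
sdr = contrast_ratio(bright_nits=300, dark_nits=0.3)     # 1,000:1

# An HDR panel with local dimming: backlight zones behind dark content
# can be dimmed nearly off while zones behind highlights stay bright.
hdr = contrast_ratio(bright_nits=1000, dark_nits=0.05)   # 20,000:1

print(f"Global backlight (SDR-class): {sdr:,.0f}:1")
print(f"Local dimming (HDR-class):    {hdr:,.0f}:1")
```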

Of the VESA DisplayHDR tiers, only the 400 level does not require local dimming and wide color gamut; the True Black tiers and the classic tiers at 500 and above all require both. The 400 level is definitely a step up from SDR, but the difference versus a good SDR display may not be all that visible. When shopping for HDR, I would recommend the 500 level or above, and the more you can afford, the better it will be.

There’s also a misconception that the higher levels, like 1000 or 1400, are “too bright”, which is a misunderstanding of the fundamentals of HDR displays. Having a 1000 cd/m2 display doesn’t mean you’re going to type an email at 1000 cd/m2, as this would be a punishing experience and immediately cause eyestrain.

Even on a 1000 cd/m2 display, the luminance level of “paper white” in your SDR applications like Outlook, web browsers, Word, Excel, and most other applications is set by the end user to a comfortable level based on their preferences for the ambient lighting conditions. I personally have my SDR “paper white” level set to around 130 nits, which I find optimal for my ambient lighting.

It’s only HDR applications that have access to luminance levels above the SDR paper white. That means only games, movies, and content creation applications that support HDR have access to the luminance range from 130 to 1000 cd/m2 on my monitor. These applications don’t necessarily increase the average luminance level of the scene much at all above a traditional SDR monitor; it’s only the specular highlights that use the increased luminance range.
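As a quick back-of-the-envelope illustration, the highlight headroom this leaves above a 130-nit paper white can be expressed in photographic stops (each stop doubles the luminance); this is simple arithmetic, not a formal HDR metric.

```python
import math

# Highlight headroom above SDR "paper white" on a 1000 cd/m2 HDR
# display, in photographic stops (each stop doubles luminance).
paper_white_nits = 130.0   # the SDR reference level mentioned above
peak_nits = 1000.0         # peak luminance of the monitor in question

headroom_stops = math.log2(peak_nits / paper_white_nits)
print(f"Highlight headroom: ~{headroom_stops:.1f} stops")   # ~2.9 stops
```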

I have been using 1000 cd/m2 HDR displays at home and work for ~2 years, often 10+ hours a day with no eyestrain concerns, and I can’t imagine going back to SDR displays!

Do you think HDMI and DisplayPort will continue to be the dominant display connections for PC displays over the next few years?

It’s interesting that we still have these two high-bandwidth digital interfaces in the ecosystem. I think everyone would love to see this consolidate to only one. However, there are two distinct camps, neither of which would ever want to cede to the other, so we will likely see these two interfaces remain for a long time.

HDMI is the de facto standard for everything related to the TV. TVs typically only have HDMI inputs. TV-focused devices such as Blu-ray players, cable boxes, streaming boxes, and even game consoles typically only have HDMI outputs. Even consumer video cameras (if they have an output interface at all) will normally be HDMI.

DisplayPort is the de facto standard for everything related to the PC ecosystem. DisplayPort has generally been ahead in the bandwidth race for most of the last decade.

To provide some context for how far these display technologies have come, it helps to know a little of the technical history between the two standards.

DisplayPort 1.2 supported 4K/60 before HDMI 2.0 became available in the PC ecosystem, so 4K PC monitors and 4K PCs defaulted to DisplayPort. Then, when we transitioned to 4K/60/10-bit for HDR (up from 8-bit SDR), DisplayPort 1.2 already supported this bandwidth, but HDMI 2.0 didn’t.
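A rough link-budget sketch shows the arithmetic behind this. It assumes CVT-R2 reduced-blanking timing, which is typical for PC monitors over DisplayPort; the effective payload rates are the raw link rates minus 8b/10b line-coding overhead.

```python
# Back-of-the-envelope link budget for 4K/60 at 10 bits per channel.
# Assumes CVT-R2 reduced-blanking timing for 3840x2160 @ 60 Hz,
# which gives a pixel clock of roughly 533 MHz.
PIXEL_CLOCK_HZ = 533.25e6
BITS_PER_PIXEL = 3 * 10          # RGB, 10 bits per channel (HDR)

required_gbps = PIXEL_CLOCK_HZ * BITS_PER_PIXEL / 1e9    # ~16.0 Gbps

# Effective payload bandwidth after 8b/10b line coding (80% efficiency):
dp12_gbps = 4 * 5.4 * 0.8        # DP 1.2 HBR2, 4 lanes  -> 17.28 Gbps
hdmi20_gbps = 3 * 6.0 * 0.8      # HDMI 2.0, 3 channels  -> 14.40 Gbps

print(f"4K/60/10-bit needs ~{required_gbps:.1f} Gbps")
for name, link in [("DP 1.2", dp12_gbps), ("HDMI 2.0", hdmi20_gbps)]:
    verdict = "fits" if link >= required_gbps else "does not fit"
    print(f"{name}: {link:.2f} Gbps payload -> {verdict}")
```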

All of the HDR monitors were DP-based, because HDMI 2.0 was a compromise for 4K/60/HDR. Then DP moved even further ahead with DP 1.3 to support 5K resolution or high-frame-rate 4K. This was something HDMI 2.0 had no solution for.

Then DP moved ahead again when DP 1.4 added a compression technique superior to the solution adopted by HDMI 2.0a. To support 4K/60/10-bit, HDMI 2.0a used 4:2:2 or 4:2:0 color subsampling. This is perhaps acceptable for video, but absolutely horrible for text, so a very poor solution for PC usage. It also achieves a compression ratio of only 3:2 or 2:1. Meanwhile, DisplayPort 1.4 provides a different compression solution, VESA Display Stream Compression (DSC), that achieves extremely good 3:1 compression, and I’ve even seen impressive 6:1 compression. Using DP 1.4 with DSC provides a massive amount of compressed bandwidth, supporting high frame rates, high bit depths, 4K HDR, and even 8K/60.
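The ratios quoted here fall out of simple sample counting, as a short sketch makes clear (the ~16 Gbps figure carries over from the link-budget sketch above):

```python
# Where the quoted compression ratios come from. Chroma subsampling
# keeps full-resolution luma but discards chroma samples, so the
# ratio is just a count of samples per pixel:
samples_444 = 3.0     # full luma + full chroma (equivalent to RGB)
samples_422 = 2.0     # chroma halved horizontally
samples_420 = 1.5     # chroma halved horizontally and vertically

print(f"4:2:2 -> {samples_444 / samples_422}:1 (i.e. 3:2)")
print(f"4:2:0 -> {samples_444 / samples_420}:1")

# DSC instead compresses all channels at full resolution, so text
# stays sharp; 3:1 is its typical visually lossless ratio.
required_gbps = 16.0  # 4K/60/10-bit, from the sketch above
print(f"With 3:1 DSC: ~{required_gbps / 3.0:.1f} Gbps on the wire")
```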

HDMI then launched version 2.1 with a giant leap in bandwidth, and what I assume is a dramatically improved compression method (I haven’t personally seen compression tests for HDMI 2.1 so can’t comment). While this has made it to market in a few devices, the coverage at this point in time is limited to 8K TVs. More recently, DisplayPort released version 2.0 with an even bigger leap in bandwidth, and very impressive compression ratios.
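For scale, the raw link rates of the interface versions discussed compare roughly as follows; the effective payload rates are lower and depend on each link's line coding, as noted in the comments.

```python
# Raw link rates (Gbps) of the interface versions discussed above.
# Effective payload is lower: DP 1.x and HDMI 2.0 use 8b/10b coding
# (80% efficient), HDMI 2.1 uses 16b/18b (~89%), and DP 2.0 uses
# 128b/132b (~97%).
raw_gbps = {
    "HDMI 2.0": 18.0,   # 3 channels x 6 Gbps (TMDS)
    "DP 1.4":   32.4,   # 4 lanes x 8.1 Gbps (HBR3)
    "HDMI 2.1": 48.0,   # 4 lanes x 12 Gbps (FRL)
    "DP 2.0":   80.0,   # 4 lanes x 20 Gbps (UHBR20)
}
for link, gbps in raw_gbps.items():
    print(f"{link:9s} {gbps:5.1f} Gbps raw")
```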

At the upper levels there’s so much bandwidth that it’s well beyond any display technology of today, so it provides a lot of future-proofing.

Bringing it back to the original question of display interfaces for the PC, there’s a dramatic trend toward USB-C based display interfaces. USB4 actually includes DisplayPort, so in my mind this will very clearly become the default port and default display interface on all PCs, and I suspect the PC industry will transition quite quickly. I also suspect that PC-to-HDMI connections will increasingly be supported by USB-C to HDMI 2.1 dongles, and less and less by GPU vendors, motherboard vendors, and OEM laptops.