Hacker News

HDMI 2.1 Display Stream Compression (DSC) Ready for Amdgpu Linux Driver

65 points by WithinReason ago | 40 comments

NooneAtAll3 |next [-]

Every time I hear about HDMI and the legal problems surrounding it, my response is "you can just use DisplayPort"

tapoxi |root |parent |next [-]

Most TVs don't have DisplayPort

preisschild |root |parent |next [-]

In my case it's an issue because I have a monitor with only a single DP port and I need to switch between my tower and my laptop. I have to use HDMI for the laptop-to-monitor connection.

chocochunks |root |parent [-]

You can get DisplayPort KVMs. As a nice bonus the KVM will let you share a single mouse and keyboard set between them.

preisschild |root |parent [-]

My monitor (Samsung Odyssey Neo G9) has a USB KVM built in, so I can already do that.

Plus I haven't really seen an external DP 2.1 KVM switch yet, and I'm sure that if they exist they are expensive.

cassianoleal |root |parent |previous [-]

An adaptor costs £7.

AnthonBerg |root |parent |next [-]

They are not equivalent.

Conversion requires fulfilling a very intricate spec over an incredibly high-bandwidth signal.

I did the deep dive; the adapters are not sufficient.

whazor |root |parent |next [-]

Do you know whether HDMI CEC adapters impact the signal?

perching_aix |root |parent |previous [-]

In what way are they not sufficient?

charleslmunger |root |parent [-]

The Synaptics VMM7100-based adapters only support VRR on older firmware versions with bugs.

The Chrontel CH7218 is the most reliable but still also suffers blackouts during VRR.

ParadeTech PS196 adapters advertise VRR support but their DPCD does not correctly communicate that it is supported. So even if you add the chip to the VRR PCON list in the amdgpu driver, it still won't see VRR as supported.
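For context, that DPCD advertisement comes down to a single capability bit. A minimal sketch of the check, assuming the DisplayPort-spec MSA_TIMING_PAR_IGNORED bit (DPCD offset 0x007, bit 6) is the field the driver keys off; the exact driver-side plumbing here is an assumption, not amdgpu's actual code:

```python
# Hedged sketch: does a sink's raw DPCD dump advertise the
# "ignore MSA timing parameters" capability that VRR depends on?
# Per the DisplayPort spec, DPCD offset 0x007 (DOWN_STREAM_PORT_COUNT)
# carries MSA_TIMING_PAR_IGNORED in bit 6.

DPCD_DOWN_STREAM_PORT_COUNT = 0x007
MSA_TIMING_PAR_IGNORED = 1 << 6

def sink_reports_vrr_capable(dpcd: bytes) -> bool:
    """Return True if the DPCD dump advertises the VRR-related bit."""
    return bool(dpcd[DPCD_DOWN_STREAM_PORT_COUNT] & MSA_TIMING_PAR_IGNORED)

# A PS196-style adapter: VRR may work in hardware, but the bit is
# unset, so the driver never sees the sink as VRR-capable.
broken_dpcd = bytes(16)  # all zeros
print(sink_reports_vrr_capable(broken_dpcd))  # False
```

This is why whitelisting the chip in the driver doesn't help: the capability check still reads the (wrong) DPCD bits from the adapter itself.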

And while some of these advertise themselves as DisplayPort 2.0, all of them only support 25.96 Gbps of bandwidth on the DisplayPort side, requiring DSC for 4K 120Hz 10-bit color, even though they support 48 Gbps on the HDMI output.
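A quick back-of-the-envelope check of that claim, counting only raw active-pixel data (blanking intervals and encoding overhead push the real link requirement even higher):

```python
# Raw pixel data rate for 4K 120Hz at 10 bits per channel (4:4:4).
width, height, refresh = 3840, 2160, 120
bits_per_pixel = 3 * 10  # RGB, 10 bits per channel

raw_gbps = width * height * refresh * bits_per_pixel / 1e9
print(f"{raw_gbps:.2f} Gbps")  # 29.86 Gbps

# Already above the ~25.96 Gbps these adapters offer on the DP side,
# before blanking is even counted -- hence DSC is mandatory.
print(raw_gbps > 25.96)  # True
```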

tapoxi |root |parent |next |previous [-]

For HDMI 2.0. For HDMI 2.1 and 4K/120Hz you're looking at north of $25, and you don't get VRR support.

cassianoleal |root |parent [-]

Fair enough about VRR, but £13 for 4k/240Hz - https://www.amazon.co.uk/dp/B0FQCF62CD

elabajaba |root |parent |next [-]

I've owned 2 of these (returned and reordered thinking the first might've just been bad) and neither worked properly on Linux with an AMD 9070xt and an LG CX. They'd have black screen dropouts every few minutes, and occasionally full screen color corruption.

redeeman |root |parent |previous [-]

with lossy compression

amlib |root |parent |next |previous [-]

They have limitations, especially when driven to the limits of the specification.

When doing 4k@120fps 4:4:4 chroma you might have to deal with longer handshakes and sometimes even no handshake at all. Or random dropouts. Or HDR not activating properly.

happyPersonR |root |parent [-]

I thought handshakes only happened when you were setting up a connection, no?

Random dropouts sound bad, though… and with high-speed signaling involved, it also sounds like a pain to debug.

eliaspro |root |parent |next |previous [-]

But wouldn't this break the HDCP chain and therefore render many use-cases (playback of DRM-protected streams) broken?

cassianoleal |root |parent [-]

Is that a problem for most uses of DP?

preisschild |root |parent |previous [-]

There are only a few adapters that support the 2.1 features (HDR + VRR + high resolution + high refresh rate, no lossy DSC). I even had to flash custom firmware for most of those features to work (VRR still doesn't).

saghm |root |parent |next |previous [-]

Can DisplayPort be used for audio? My recollection is that the main advantage of HDMI is transmitting both video and audio.

expedition32 |root |parent |previous [-]

You can just use Nvidia.

NooneAtAll3 |root |parent [-]

no, Nvidia is the HDMI of gpus

WithinReason |next |previous [-]

This was previously blocked from inclusion in SteamOS by the HDMI Forum. It would help the Steam Machine reach 4K 120Hz over HDMI.

paol |root |parent [-]

It was blocked from inclusion in the AMD GPU drivers; it's nothing specific to Steam or the Steam Machine.

The HDMI Forum apparently forbids any open source implementation of HDMI 2.1. I don't know if they ever offered an official justification, but for a group that exists to promote HDMI adoption, they're clearly morons.

account42 |root |parent |next [-]

It's a group that exists to make sure that the standard works for all the members, including media companies that think they can control the flow of information. They don't need to promote HDMI adoption since their members already control pretty much all the TV production.

saghm |root |parent [-]

It's still not clear why the standard wouldn't work by AMD having an open source implementation. I try to give the benefit of the doubt, but in this case, it's hard not to agree with the parent comment: whoever came up with this rule and the people who agreed with it are morons.

Aissen |root |parent |next |previous [-]

were*

As the article says, they most likely changed their mind, probably following quite a bit of background discussions and industry influence.

mort96 |root |parent |next |previous [-]

Wait "forbids", present tense? How does that track with this announcement that it's coming to the open source AMDGPU?

happyPersonR |root |parent [-]

This is actually a good question… wonder if it's just API calls to a binary blob haha

preisschild |root |parent |next |previous [-]

Valve also contributes to the amdgpu driver

expedition32 |root |parent |previous [-]

If memory serves HDMI includes DRM which they don't want people to reverse engineer.

tosti |previous [-]

> 4K @ 240Hz

WHY!?

cassianoleal |root |parent |next [-]

Would you rather they explicitly blocked that even though the technology allows for it?

nottorp |root |parent |next [-]

It's too easy for display manufacturers to compete on moar pixels, moar fps, moar refresh. You just try to embiggen your numbers compared to your competitor.

Meanwhile, features where you can't compete on numbers but can ruin the experience are ignored.

Night_Thastus |root |parent [-]

What are these features that can't be measured?

Plenty of people who test monitors also compare things like color coverage, brightness, latency, contrast, viewing angles, etc, etc, etc. If you mean the entire monitor, they generally also cover things like how the display swivels/mounts among other things.

tosti |root |parent |previous [-]

No, I just don't think such a high refresh rate accomplishes anything. Not even bragging rights. 120Hz, possibly. But 240? Are you going to introduce a telly into a slow-motion studio, on the set?

mort96 |root |parent |next [-]

The difficult thing for these standards is the data rate. 4K 240Hz is the same data rate as 8K 60Hz (since 8K has 4x as many pixels as 4K).

If you want to support 8K 60Hz, the only reason you wouldn't also support 4K 240Hz is that you actively chose to disallow it. That seems like a bad idea.
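The equivalence is easy to verify from the raw pixel rates:

```python
# 8K has exactly 4x the pixels of 4K, so 8K@60 and 4K@240 push
# the same number of pixels per second.
px_4k = 3840 * 2160
px_8k = 7680 * 4320

rate_4k_240 = px_4k * 240
rate_8k_60 = px_8k * 60

print(px_8k // px_4k)             # 4
print(rate_4k_240 == rate_8k_60)  # True
```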

Night_Thastus |root |parent |previous [-]

It's diminishing returns, but for those who want it and have the hardware to support it, why not?

I have a 360Hz monitor, and when I play something I can actually get a full 360fps out of, it's wonderful! Though honestly I'm perfectly fine with anything over 100.

preisschild |root |parent |previous [-]

I have a Samsung G95NC (DP 2.1, 7680x2160, 240Hz) and you definitely notice the difference between 120Hz and 240Hz. Personally, though, I wouldn't pay a cent more for an even higher refresh rate, since the difference is much less noticeable than 60 vs 120Hz, and I expect 240Hz vs 420 to make even less of a difference.