I don't have (all of) those services, but they're not like-for-like comparable for any service using Widevine, e.g. Amazon, Netflix and Disney. A Linux install will give you access to L3 content using the browser (usually limited to 720p, but you can get a 1080p stream out of Netflix using a browser extension). I think the poster was having a bit of a joke, though.
I'm really surprised that no one has bothered (or managed) to create an open-source TEE emulator. I mean, there are tons of different ARM CPUs out there, and it's all ARM under the hood, so it should be possible to create a virtual TEE/TPM and expose it transparently to a VM?
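For the TPM half of that, at least, something close already exists: QEMU can attach a software TPM to a guest via swtpm, and the guest sees it as an ordinary hardware TPM. A rough sketch (paths are placeholders, and the exact device flag depends on the guest architecture):

```shell
# Start a software TPM 2.0 emulator with its state in a scratch directory
mkdir -p /tmp/mytpm
swtpm socket --tpm2 \
    --tpmstate dir=/tmp/mytpm \
    --ctrl type=unixio,path=/tmp/mytpm/swtpm-sock &

# Attach it to a VM; the guest OS just sees a TPM device
qemu-system-x86_64 \
    -chardev socket,id=chrtpm,path=/tmp/mytpm/swtpm-sock \
    -tpmdev emulator,id=tpm0,chardev=chrtpm \
    -device tpm-tis,tpmdev=tpm0 \
    ... # rest of the usual VM options
```

The catch for DRM purposes is attestation: an emulated TPM/TEE has no manufacturer-signed keys, so a remote service can tell it isn't genuine hardware, which is presumably why nobody bothers.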
My guess is that Widevine is pretty universally supported across streaming platforms (i.e. it won the DRM format war, even for audio) and 720p is good enough for most users. L1 also requires hardware decryption keys, which are unique per device AFAIK (assuming Android - I know browsers use a shared L3 key, but I'm not sure how L1 is implemented on PC platforms) and easily blacklisted.
For Linux users wanting a home theatre experience, I feel they are probably more inclined to go the Blu-ray/piracy route. Personally, I use my Android TV to stream in 4K with surround sound, but I'm satisfied that I can still play everything using Linux, albeit at a lower resolution.
Yes, given the 4K WEB-DL releases, it seems a few groups/individuals possess the knowledge to decrypt L1 content. Occasionally an L1 key will be leaked, but it is burnt quickly. I think the requirement for "unique" keys stops any interest in a public effort to make L1 content work under Linux, especially as there is already support for L3 content. Even under Windows and macOS, end users must be using Microsoft Edge or Safari (platform-specific). Given Chrome's huge market share, I imagine lots of non-Linux users are consuming exclusively L3 content on desktop platforms, so I'm not sure there's much demand for full/ultra-HD content - although many users may simply be unaware that their browser choice affects the quality of the content they are streaming.
Nah, it's HDCP that's cracked. You can grab HDCP strippers (or rather, "splitters"/"signal multiplexers" that happen to strip HDCP) for cheap on Alibaba and the like. From there, all you need is an HDMI frame grabber or, if you want the best possible quality, an SDI converter and a BlackMagic Decklink SDI capture card. That's a few hundred bucks.
What I'm more surprised by is that, since most of the decoding and stuff is done on the GPU anyway, and the GPU has to be HDCP compliant, why can't they just send the stream directly to the GPU as-is, tell it where to display it, and be done with it?
Well, then you couldn't have a play button, or pause button, or video text over the top of the video; unless there was some way of turning the GPU into a hardware-level compositor... which is generally not recognized as a good idea.
> unless there was some way of turning the GPU into a hardware-level compositor... which is generally not recognized as a good idea
I admit this isn't a domain I'm particularly fluent in. As a matter of fact, I thought that was already the case with all the talk about "GPU acceleration" of desktops. I think there was an HN post about Windows a few days ago, with people commenting on how all this changed between Windows 95 and 8/10.
It's not that it can't be done, just that it's an arms race of reverse engineering to try to keep it working against a motivated and financed adversary.