Apple Silicon M1 supports “billions of colors” a.k.a. HDR 10-bit output

  • > The prominent use of color no doubt stems from Steve Jobs obsessing over the smallest details.

    Actually, color in Apple's history stems from Woz, not Jobs: it was Woz who added color to the Apple II. Jobs in fact reverted to black and white on the Macintosh and the original NeXT and spent the compute/memory resources on other things (e.g. higher resolution).

  • So, I'll admit to being a Luddite on the whole HDR thing, but I suppose it's worth figuring out in case it becomes commonplace and not another fad like 3D TV.

    How does HDR affect colors in software? For example, on a 10-bit display, does my "legacy" website showing #FFF display the true brightest white, or would I need a special definition to achieve it? That would be unfortunate, since I'm sure the hex value for a 10-bit 3-tuple isn't quite as "perfect" as #000000-#FFFFFF, e.g. something like `color: rgbhdr(#0x3FF3FF3FF)`. (I've put a sketch of what I'm guessing at the bottom of this comment.)

    If I use a non-HDR-aware color-picker app (or take a screenshot), and pick HDR content versus normal content, is there a translation layer that scales down the value to RGB? Or does it clamp and "overexpose"?

    It's a whole new world. I googled "HDR in CSS" and got this[0] which is not quite "stubborn programmer who thinks this is a gimmick" friendly...

    Anyone have a good resource that explains how one would use HDR colors in practice? And ideally one that touches on considerations like the interactions between HDR-aware and non-HDR-aware applications.

    [0]: https://w3c.github.io/ColorWeb-CG/
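
    To make my own question concrete, here is roughly what I imagine happens, as a toy Python sketch (made-up function names and naive math on my part, definitely not a description of what macOS or any browser actually does):

      # Guess 1: a legacy 8-bit value is simply rescaled into the 10-bit range,
      # so #FFF stays "SDR white" rather than becoming HDR peak white.
      def srgb8_to_10bit(c8):
          return round(c8 * 1023 / 255)       # 255 -> 1023

      # Guess 2: a non-HDR-aware screenshot or color picker clamps anything
      # brighter than SDR white, i.e. HDR highlights "overexpose" to #FFF.
      def hdr_to_sdr_capture(nits, sdr_white_nits=100):
          return round(min(nits / sdr_white_nits, 1.0) * 255)

      print(srgb8_to_10bit(255))       # 1023
      print(hdr_to_sdr_capture(400))   # 255 (clipped to white)
      print(hdr_to_sdr_capture(50))    # 128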

  • So I’m not convinced this is actually “billions of colours”. Technically, it means having a 10-bit colour encoding over the wire, such that you can express over a billion colours. The distinction between 8-bit and 10-bit is not actually sRGB vs HDR; it's about how many colours you can express, in the same way that dithering a GIF down to a maximum of 256 colours still lets you display the sRGB colour space but limits you to only 256 of those colours. https://helpx.adobe.com/photoshop-elements/using/dithering-w...

    Similarly, you can turn on that little High Dynamic Range checkbox and get HDR but still only have 16.7 million colours at your disposal, because the output is 8 bits per colour rather than 10 bits per colour.

    And it’s really hard to tell the difference sometimes between 8-bit HDR and 10-bit HDR. Like really hard. Like usually only visible when doing colour grading such that you need every possible nuance of data to more accurately shade and re-colour your pixels. https://youtu.be/MyaGXdnlD6M

    Of course I imagine there’s also good vs bad dithering, and the output to the attached laptop screen is probably better than the multiple cables and adapters required to reach TVs and external displays, but... the easiest way to tell whether something supports billions of colours is to go into the monitor preferences and look for 10-bit or 4:2:2 or 4:4:4. If you see 4:2:0 or 8-bit, you might technically still have HDR, but you don’t have “billions of colours”.
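
    To put rough numbers on the millions-vs-billions part (trivial arithmetic, nothing Apple-specific; the HDR flag itself is about dynamic range and the transfer function, not the count of colours):

      # Levels per channel and total RGB combinations; this depends only on
      # bit depth, not on whether the signal is SDR or HDR.
      for name, bits in [("8-bit", 8), ("10-bit", 10)]:
          levels = 2 ** bits           # code values per channel
          colours = levels ** 3        # possible RGB triplets
          print(f"{name}: {levels} levels/channel, {colours:,} colours")

      # 8-bit:  256 levels/channel,  16,777,216 colours  ("millions")
      # 10-bit: 1024 levels/channel, 1,073,741,824 colours ("billions")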

  • One of my very first jobs was testing VGA cards. They were 4 MB and 8 MB. Yes, you read that right, that is "M" as in megabytes. My setup was a 25 MHz open-case motherboard (a few of these, actually). It was part of the last quality-control station, and I would pop each VGA card into the PCI slot and boot it up.

    I would basically run a macro which cycled through the color depths: b/w, 4 colors, 16, 256, ... and up to a million or so. I think there is a name for this, but it basically meant watching a prism of rainbow colors. At a million-plus colors the tones are very smooth and you don't see any outlines between shades.

    I could not imagine being able to differentiate a billion colors from the previous step up. At that point I would stamp the card and it would go off to shipping for packaging, and on to the customers.

  • “For as long as we’ve had TVs, color has been an important metric to judge the display’s quality.”

    Um, no, we had TVs for quite a while before they got any color; in the beginning it was all grayscale.

  • I'm confused why there was any question of whether it supported HDR or not. Apple has had it listed for a while: https://support.apple.com/en-us/HT210980

  • This isn't surprising, and isn't that big a deal. Most modern high-end laptops have HDR support and/or 10-bit output. Similarly, most modern CPUs and GPUs can accelerate 10-bit video decoding in hardware.

  • Fun fact: In human testing, subjects cannot tell the difference between 8-bit RGB and 10-bit RGB except for greyscale gradients.

    (I wish I could link the study, but I found it during the peak of my 2012-era graphics career. Suffice it to say, more devs should study the science of perceptual testing.)
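
    If anyone wants to see why greyscale gradients are the giveaway, here is a rough sketch of that kind of test stimulus (plain Python, no display pipeline involved, so it only counts the bands each depth produces across a ramp):

      # Quantize a smooth horizontal grey ramp to 8-bit and 10-bit code values
      # and count how many distinct bands land across a 3840-pixel-wide image.
      WIDTH = 3840
      for bits in (8, 10):
          max_code = 2 ** bits - 1
          codes = [round(x / (WIDTH - 1) * max_code) for x in range(WIDTH)]
          bands = len(set(codes))
          print(f"{bits}-bit: {bands} bands, ~{WIDTH / bands:.0f} px each")

      # 8-bit:  256 bands, ~15 px wide each -> steps can show up on a clean ramp
      # 10-bit: 1024 bands, ~4 px wide each -> much harder to spot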

  • Is this a high enough bit depth to eliminate the visual banding of gradients? If not, then what is? Because that would seem to be approximately the depth beyond which there would never be any benefit to further increasing it.
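
    A back-of-the-envelope attempt at an answer (big assumptions on my part: a plain gamma-2.2 grey ramp, and a very rough rule of thumb that brightness steps under roughly half a percent stop being visible on a clean gradient; real thresholds depend on the viewer, the display, and the luminance level):

      # Relative luminance step (Weber fraction) between adjacent code values
      # of a gamma-2.2 encoded grey ramp, sampled at mid-grey.
      GAMMA = 2.2
      for bits in (8, 10, 12):
          c = (2 ** bits - 1) // 2               # mid-grey code value
          step = ((c + 1) / c) ** GAMMA - 1      # fractional jump to the next code
          print(f"{bits}-bit: ~{100 * step:.2f}% luminance step at mid-grey")

      # ~1.74% (8-bit), ~0.43% (10-bit), ~0.11% (12-bit). By this crude measure
      # 10-bit sits around the visibility threshold and 12-bit is well under it.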

  • A little bit of Apple hagiography in there, claiming Apple is usually first. Discrete video cards have consistently been at the forefront of features like this, leading Apple by years. The difference is that the consumer must decide to build a system with these features: if I want an HDR system, I buy an HDR-capable discrete card, an HDR-capable monitor, etc., and put it all together.

    What Apple does is assemble it all in a base model, like a console, so that every model has the feature set as standard. And as other commenters have pointed out, PCs with HDR have been shipping for far longer, especially "workstation" machines.

  • Very strange. With an M1 MacBook Air, the Costa Rica video plays in Safari using rec.709 (non-HDR) and in Chrome with bt.2020 (HDR). The article uses an M1 Mac mini connected to external monitors, so it's a different test, but it showed Safari playing bt.2020 (it doesn't say anything about Chrome).

    Seems like it will take time for this all to shake out.

  • “Usually, this would list a value of 30-bit here (i.e. 10-bit per RGB channel)”

    Anyone else remember the early days when you could actually choose 32-bit colour in system preferences?

    Drives me crazy that it’s taken this long to get back to unbanded gradients.

  • I just got a 16GB M1 MBP, and I can see the HDR option on YouTube even on the built-in display. I didn't know the built-in display supported HDR too; I thought HDR was going to be reserved for the future mini-LED displays.

  • The industry is going to have to get its nomenclature straightened out, because nobody really knows what's going on in consumer land. HD was something people understood; 4K, maybe. Beyond that, not so much.

  • While we're on the subject - has anyone connected a 5K monitor to the MacBook Air M1 using a DisplayPort 1.4 port yet?

    In particular the ViewSonic XB2779QQS-S1...

  • 2^30 is just above 1 billion. Isn’t “billions of colors” at least imprecise, if not false advertising?
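
    For what it's worth, the arithmetic (taking "billions" as an actual plural rather than marketing shorthand):

      import math

      print(f"{2 ** 30:,}")             # 1,073,741,824 -> "a billion", barely
      print(math.ceil(math.log2(2e9)))  # 31 total bits needed to reach 2 billion,
                                        # i.e. the next per-channel step is 11 bits
                                        # (2^33 ≈ 8.6e9), not 10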

  • Big deal. So does the three-year-old Walmart laptop I keep in my garage.

    I find the reaction to the M1 pretty funny. People are impressed by the silliest things.

  • Currently, this article has only 14 points and two comments suggesting it is not a big deal. Why did it rise to position three on Hacker News?