Hacker News

>many modern screens still are 12bit

Most LCDs are 8 bits per color; only the really cheap ones use 6 bits + 2 bits of FRC. 12 bits total (presumably 4 bits per color) seems insanely low, because it would mean only 16 possible shades per primary color.
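For reference, the number of distinct shades per primary is just 2^bits, which is why 4 bits per channel would be so coarse (a rough illustration, not tied to any particular panel):

```python
# Distinct shades per primary channel at common panel bit depths.
for bits in (4, 6, 8, 10, 12):
    print(f"{bits:2d} bits per channel -> {2**bits:5d} shades per primary")
# 4 bits gives only 2**4 = 16 shades per primary, hence visible steps;
# a true 8-bit panel gives 256, and 6-bit + FRC approximates 256 by
# temporally dithering between adjacent 6-bit levels.
```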

>meanwhile silicon graphics had a CRT screen in 1995 that was 48bit

Source for this? 10 bit color support in graphics cards only became available around 2015. I find it hard to believe that there was 48 bit (presumably 16 bits per color) back in 1995, when 256-color displays were common. Moreover, is there even a point in using 16 bits per color? We've only switched to 10 bit because of HDR.

>7. Those things are crazy resilient, I still have some working screens from 80286 era (granted, the colors are getting weird now with aging phosphors), while some of my new flatpanels failed within 2 years with no repair possible.

This sounds like standard bathtub curve/survivor bias to me.



I am talking about SGI workstations. Indeed, the 1995 ones didn't support 48bit (without modification); instead it was "only" 12bit per channel, 3 channels, thus 36bit.

Here is a photo of John Carmack using such workstation: https://external-preview.redd.it/EnhEls7GJgm9UxR8FE9Dc3FfH4X...

Then in 1997 they launched the "Octane" workstation line, which could output 4 channels of 12bit, thus reaching 48bit.

https://hardware.majix.org/computers/sgi.octane/

One of the purposes of these machines (among other things) was making HDR images.

Sadly, for THAT I don't have time to track down sources now; I am busy with something else.

As for a monitor that could support this stuff, one is the Sony GDM90W11, which could do 1900x1200.


> 10 bit color support in graphics cards only became available around 2015.

That's off by a decade.

> 256 color monitors

Is that a thing that exists?

> We've only switched to 10 bit because of HDR.

You can get clear banding on an 8 bit output, and 10 bit displays are used at the high end. 10-bit HDR isn't immune to banding, since most of the increased coding space goes into expanding the range. There's a good reason for 12 bit HDR to exist.
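A rough sketch of why the extra bits matter for banding, assuming ideal uniform quantization (ignoring gamma/PQ encoding): quantizing a smooth gradient to 8 bits collapses it to at most 256 levels, while 10 bits gives 1024, i.e. 4x finer steps.

```python
# Quantize a smooth 0..1 gradient to n-bit codes and count distinct levels.
def distinct_levels(samples, bits):
    q = 2**bits - 1
    return len({round(x * q) for x in samples})

grad = [i / 9999 for i in range(10000)]  # smooth gradient, 10k samples
print(distinct_levels(grad, 8))   # 256 levels -> visible steps on gradients
print(distinct_levels(grad, 10))  # 1024 levels -> 4x finer steps
```

With HDR, much of that 10-bit coding space is spent covering the wider luminance range rather than finer steps, which is the argument above for 12-bit HDR.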


>That's off by a decade.

Mainstream support, at least. 10 bit HDR support for AMD cards was introduced with Fiji (2015), and for Nvidia with Maxwell (2014).


2002. I remember this. Matrox was one of the major GPU players at the time. https://en.wikipedia.org/wiki/Matrox_Parhelia


So it looks like it supported 10 bit frame buffers, but not necessarily 10 bit output. A quick search suggests it only supported DVI, which didn't support 10 bit output. In the context of talking about monitors, this essentially means that 10 bit wasn't supported at the time. Otherwise you could claim you have 192 bit color by software rendering at 64 bits per color, but outputting at 8 bits.
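The framebuffer-vs-output distinction can be sketched simply (an illustration, not any particular card's pipeline): however precise the internal values, quantizing to 8 bits on the link yields at most 256 codes per channel.

```python
# Internal precision doesn't survive the output stage: values computed at
# high precision still collapse to one of 256 codes once quantized to
# 8 bits for the display link.
def to_output(value, out_bits=8):
    q = 2**out_bits - 1
    return round(value * q)  # final per-channel output code

hi_res = [i / 65535 for i in range(65536)]       # 16-bit-like internal values
codes = {to_output(v) for v in hi_res}
print(len(codes))  # 256 -> the output depth, not the framebuffer, is the limit
```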



