Unless you're using a really cheap screen, most LCDs are 8 bits per color; the cheap ones use 6 bits + 2 bits of FRC. 12 bits total (presumably 4 bits per color) seems insanely low, because it would mean only 16 possible shades per primary color.
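For anyone unfamiliar, FRC (frame rate control) is temporal dithering: the panel flickers between the two nearest 6-bit levels so the time average approximates the 8-bit target. A minimal sketch, assuming a simple 4-frame cycle (the function name and frame count here are illustrative, not any panel's actual algorithm):

```python
# Toy model of 2-bit FRC: a 6-bit panel approximates an 8-bit level by
# alternating between the two nearest 6-bit levels over a 4-frame cycle.
def frc_levels(target_8bit: int, frames: int = 4) -> list[int]:
    """6-bit level to show on each frame so the average matches target_8bit."""
    base = target_8bit >> 2      # nearest-below 6-bit level (0..63)
    frac = target_8bit & 0b11    # leftover 2 bits: how many frames get bumped
    return [min(base + 1, 63) if f < frac else base for f in range(frames)]

print(frc_levels(130))  # [33, 33, 32, 32]; average 32.5 = 130/4,
                        # which the eye integrates into the intended shade
```

The visible cost is low-level flicker, which is part of why FRC panels are the cheap option.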
>meanwhile silicon graphics had a CRT screen in 1995 that was 48bit
Source for this? 10-bit color support in graphics cards only became available around 2015. I find it hard to believe that there was 48-bit color (presumably 16 bits per color) back in 1995, when 256-color monitors were common. Moreover, is there even a point to using 16 bits per color? We've only switched to 10-bit because of HDR.
>7. Those things are crazy resilient, I still have some working screens from 80286 era (granted, the colors are getting weird now with aging phosphors), while some of my new flatpanels failed within 2 years with no repair possible.
This sounds like standard bathtub curve/survivor bias to me.
I am talking about SGI workstations. Indeed, the 1995 ones didn't support 48-bit without modification; it was "only" 12 bits per channel across 3 channels, thus 36-bit.
> 10 bit color support in graphics cards only became available around 2015.
That's off by a decade.
> 256-color monitors
Is that a thing that exists?
> We've only switched to 10 bit because of HDR.
You can get clear banding on an 8-bit output, and 10-bit displays are used at the high end. Even 10-bit HDR isn't immune to banding, since most of the increased coding space goes into expanding the luminance range rather than finer gradation. There's a good reason for 12-bit HDR to exist.
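To make the banding point concrete, here's a quick sketch that quantizes one smooth black-to-white ramp at 8 and 10 bits and measures how wide each flat band ends up (the 3840 px width is just an example value):

```python
from itertools import groupby

WIDTH = 3840  # one 4K-wide black-to-white ramp

def quantize(x: float, bits: int) -> int:
    """Map x in [0, 1] to the nearest code at the given bit depth."""
    return round(x * ((1 << bits) - 1))

for bits in (8, 10):
    codes = [quantize(px / (WIDTH - 1), bits) for px in range(WIDTH)]
    # Group consecutive identical codes: each run is one flat "band".
    bands = [len(list(run)) for _, run in groupby(codes)]
    print(f"{bits}-bit: {len(set(codes))} levels, widest band {max(bands)} px")
```

At 8 bits each band is roughly 15 px wide on a 4K ramp, which is easily visible on a clean gradient; at 10 bits it drops to about 4 px. And as noted above, under an HDR transfer function those same codes are stretched over a far larger luminance range, so 10 bits can still band.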
So it looks like it supported 10-bit frame buffers, but not necessarily 10-bit output. A quick search suggests it only supported DVI, which didn't support 10-bit output. In the context of a discussion about monitors, that essentially means 10-bit wasn't supported at the time; otherwise you could claim to have 192-bit color by software rendering at 64 bits per channel while outputting at 8 bits.
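The framebuffer-vs-output distinction is easy to show in miniature. Below is a toy sketch, assuming a 10-bit-per-channel packed pixel along the lines of the common X2R10G10B10 layout and an 8-bit link that keeps only the top 8 bits of each channel; the field layout and names are illustrative:

```python
def pack_rgb10(r: int, g: int, b: int) -> int:
    """Pack three 10-bit channels (0..1023) into one 32-bit word."""
    assert all(0 <= c <= 1023 for c in (r, g, b))
    return (r << 20) | (g << 10) | b

def scanout_8bit(pixel: int) -> tuple[int, int, int]:
    """What an 8-bit link sees: the top 8 bits of each 10-bit channel."""
    r = (pixel >> 20) & 0x3FF
    g = (pixel >> 10) & 0x3FF
    b = pixel & 0x3FF
    return (r >> 2, g >> 2, b >> 2)

px = pack_rgb10(513, 514, 515)  # three adjacent 10-bit shades of grey
print(scanout_8bit(px))         # (128, 128, 128) -- one 8-bit code
```

Three adjacent 10-bit greys collapse into a single 8-bit code at scanout, so the extra framebuffer precision never reaches the monitor.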