July 29, 2019 – The graphics card section below is no longer relevant, as Nvidia has updated its GeForce drivers to offer 30-bit (10 bits per channel) color in OpenGL programs, matching what was previously available only in its Quadro graphics card line.
Nvidia article on the topic: https://www.nvidia.com/en-us/geforce/news/studio-driver/?ncid=so-twit-88958#cid=organicSocial_en-us_Twitter_NVIDIA_Studio
How to enable 30-bit in Photoshop
To set up a 30-bit workflow in Photoshop, choose Edit > Preferences > Performance. In the "Graphics Processor Settings" section, click the "Advanced Settings…" button to bring up the Advanced Graphics Processor Settings window. There, select the "30 Bit Display" checkbox, then click "OK".
You will also need a workstation class graphics card (AMD Radeon Pro / Nvidia Quadro) in your system, connected to a 10-bit monitor over a DisplayPort cable.
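To verify the setting took effect, a smooth gradient is the easiest test: on an 8-bit pipeline a wide 16-bit ramp shows visible banding, while a working 30-bit pipeline renders it noticeably smoother. Below is a minimal Python sketch (assuming the numpy and Pillow packages are installed) that writes a 16-bit grayscale test ramp you can open in Photoshop:

```python
# Generate a 16-bit grayscale ramp to test 30-bit display output in Photoshop.
# Assumes numpy and Pillow are installed (pip install numpy pillow).
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 4096, 512

# One horizontal ramp from black to white across the full 16-bit range.
ramp = np.linspace(0, 65535, WIDTH, dtype=np.uint16)
img = np.tile(ramp, (HEIGHT, 1))

# Pillow maps a 2-D uint16 array to its 16-bit grayscale mode ("I;16")
# and can save it as a 16-bit PNG.
Image.fromarray(img).save("gradient_16bit_test.png")
print("Wrote gradient_16bit_test.png - open it in Photoshop as a 16-bit image.")
```

Open the file in Photoshop and compare it with the "30 Bit Display" checkbox on and off; the banding in the ramp should be visibly reduced when 30-bit output is working end to end.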
How to enable 30-bit in Lightroom
Unfortunately, Adobe Lightroom does not offer full 10-bit per channel workflow support at this time. However, you can vote for the feature request with Adobe at this link to have it included in the future:
https://feedback.photoshop.com/photoshop_family/topics/add_10_bit_support_to_lightroom
How to enable 30-bit in After Effects
Adobe After Effects can only output 10-bit per channel through an I/O monitoring card (such as a Blackmagic Decklink or similar). Surprisingly, After Effects does not support 10-bit per channel output through a workstation class (AMD Radeon Pro / Nvidia Quadro) graphics card.
How to enable 30-bit in Premiere Pro
The good news is that there is nothing to enable in Premiere Pro; 10-bit per channel output should always be on by default. Adobe Premiere Pro supports 10-bit per channel output using either an I/O card (such as a Blackmagic Decklink), which offers 10-bit per channel through HDMI / SDI, or a workstation class graphics card (Nvidia Quadro / AMD Radeon Pro). With workstation class graphics cards, 10-bit per channel is only available over DisplayPort, not HDMI.
How to enable 30-bit in Illustrator
Unfortunately, Adobe Illustrator does not offer full 10-bit per channel workflow support at this time.
Which graphics cards offer 30-bit color in Photoshop?
As noted in an earlier article about setting up graphics card software for 10 bpc output, both workstation class graphics cards (AMD Radeon Pro, Nvidia Quadro) and consumer class graphics cards (AMD Radeon, Nvidia GeForce) let you set 10 bpc (10 bits each for the R, G, and B channels) for full screen DirectX programs through their driver software, allowing a greater number of colors to be displayed in programs that utilize DirectX – again, provided you are connected to a 10-bit display.
However, professional programs like Adobe Photoshop tend to use OpenGL for 10-bit per channel color, and currently only workstation class Nvidia Quadro or AMD Radeon Pro graphics cards offer 10-bit per channel color through OpenGL.
As Nvidia itself notes regarding 10 bpc output:
"NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI."
10-bit per channel work-around
If you have other programs that would benefit from a consumer class AMD Radeon or Nvidia GeForce graphics card, or simply want the price-to-performance these cards offer in certain situations, there is a work-around to get 30-bit color in professional programs while using one of these cards: add a 10-bit I/O card, such as a Blackmagic Decklink, which provides a 10 bpc signal to a monitor. It is therefore technically possible to have an 8 bpc GeForce or Radeon graphics card in your system for general use, and also a 10-bit I/O card supplying 10 bpc to a 10-bit screen. Be aware, however, that the screen attached to the 10-bit I/O card only shows the photo or video you are actively editing; when you are not editing, that screen displays nothing else. So after factoring in the cost of the additional 10-bit I/O card, and the fact that it only outputs the content being edited, it may or may not be beneficial to go this route over simply getting a workstation class graphics card that can output all content to all screens at 10 bpc.
Is this a 10-bit or 30-bit monitor?
First, it is worth noting that monitor manufacturers list their 30-bit monitors as "10-bit". Yes, this is confusing! The "10-bit" nomenclature really refers to 10 bits per channel: 10-bit red, 10-bit green, and 10-bit blue channels, which equates to 30 bits in total (10+10+10 bits for the R, G, B channels) – the resulting number of color values can be seen in the chart on this page. Likewise, an "8-bit" monitor is really 24-bit in total, referring to 8-bit red, 8-bit green, and 8-bit blue channels: 8+8+8 = "24-bit".
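The jump in total displayable colors from those extra bits is easy to compute: each channel doubles its levels with every extra bit, and the totals multiply across the three channels. A quick Python check:

```python
# Colors available at 8 vs 10 bits per channel.
for bits_per_channel in (8, 10):
    levels = 2 ** bits_per_channel      # shades per R, G, or B channel
    total = levels ** 3                 # every possible R,G,B combination
    print(f"{bits_per_channel}-bit/channel ({bits_per_channel * 3}-bit total): "
          f"{levels} levels per channel, {total:,} colors")
```

That works out to roughly 16.7 million colors for a 24-bit ("8-bit") monitor versus about 1.07 billion for a 30-bit ("10-bit") monitor.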
There are 10-bit monitors with multiple inputs, and not all of those inputs will necessarily support 10-bit per channel. So please check your monitor manufacturer's specifications to see which input ports support 10-bit.
It is also worth mentioning that some monitors advertised as offering 10-bit color output are not true 10-bit, but rather 8-bit+FRC. 8-bit+FRC (Frame Rate Control) monitors are 8-bit monitors that essentially fake the output of a 10-bit monitor by rapidly alternating between two colors to give the illusion of the color a true 10-bit monitor would display. For example, if the color that should be displayed is number 101 in the Look Up Table (LUT), and the 8-bit panel can only display color number 100 or 104, an 8-bit+FRC monitor flashes between numbers 100 and 104 quickly enough that, in theory, you should not notice the flashing. Its goal is to trick the human eye into seeing color number 101. To do this, the monitor shows color number 100 for 75% of the time and color 104 for 25% of the time, giving the illusion of color number 101 – similar to how still shots displayed in rapid succession give the illusion of motion. If color 102 needed to be displayed, an 8-bit+FRC monitor would show color number 100 for 50% of the time and color number 104 for 50% of the time, whereas a true 10-bit monitor would simply display color number 102 from its LUT.
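The duty-cycle arithmetic behind FRC is straightforward to check. Here is a small Python sketch (the frc_duty_cycle helper is purely illustrative, not part of any monitor API) reproducing the 75% / 25% split described above:

```python
# Sketch of the duty-cycle math behind 8-bit+FRC temporal dithering.
# The panel alternates between the two nearest displayable levels
# (e.g. 100 and 104) so their time-average lands on the target level.

def frc_duty_cycle(target, low, high):
    """Fraction of frames to show `high` so the average equals `target`."""
    return (target - low) / (high - low)

for target in (101, 102, 103):
    share_high = frc_duty_cycle(target, 100, 104)
    share_low = 1 - share_high
    average = share_low * 100 + share_high * 104
    print(f"target {target}: show 100 for {share_low:.0%} of frames, "
          f"104 for {share_high:.0%} -> average {average:.0f}")
```

Running it confirms the article's numbers: level 101 needs a 75% / 25% split, and level 102 a 50% / 50% split.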
I hope this helps!