TL;DR: NVIDIA GeForce RTX 3080 & 3090 performance in Lightroom Classic
Adobe has been steadily adding GPU support into Lightroom Classic over the last few versions, but for the tasks we currently test, there is little advantage to using a powerful GPU like the new GeForce RTX 3080 10GB or 3090 24GB. In fact, there is almost no appreciable difference between the fastest GPU we tested and having no GPU at all.
Introduction
On September 1st, NVIDIA launched the new GeForce RTX 30 Series, touting major advancements in performance and efficiency. While gaming is almost always a major focus during these launches, professional applications like Lightroom Classic are becoming more and more important for NVIDIA's GeForce line of cards. Due to the significant improvements Adobe has made around GPU acceleration in Lightroom Classic, this is the first time we will be doing GPU-focused testing for this application. Because of this, we are very interested to see what kind of performance delta there is between the various cards we will be testing.
If you want to see the full specs for the new GeForce RTX 3070, 3080, and 3090 cards, we recommend checking out NVIDIA's page for the new 30 Series cards. But at a glance, here is what we consider to be the most important specs:
| | VRAM | CUDA Cores | Boost Clock | Power | MSRP |
|---|---|---|---|---|---|
| RTX 2070S | 8GB | 2,560 | 1.77 GHz | 215W | $499 |
| RTX 3070 | 8GB | 5,888 | 1.70 GHz | 220W | $499 |
| RTX 2080 Ti | 11GB | 4,352 | 1.55 GHz | 250W | $1,199 |
| RTX 3080 | 10GB | 8,704 | 1.71 GHz | 320W | $699 |
| Titan RTX | 24GB | 4,608 | 1.77 GHz | 280W | $2,499 |
| RTX 3090 | 24GB | 10,496 | 1.73 GHz | 350W | $1,499 |
While specs rarely line up with real-world performance, it is a great sign that NVIDIA has doubled the number of CUDA cores compared to the comparable RTX 20 series cards with only a small drop in the boost clock. At the same time, the RTX 3080 and 3090 are also $500-1000 less expensive than the previous generation depending on which models you are comparing them to.
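To put some numbers on that generational jump, the core counts and prices from the table above can be compared directly. This is just a quick illustrative sketch; the `uplift` helper is our own for this article, not part of any benchmark tooling:

```python
# Spec table from above: model -> (CUDA cores, boost GHz, MSRP in USD)
specs = {
    "RTX 2070S":   (2560, 1.77, 499),
    "RTX 3070":    (5888, 1.70, 499),
    "RTX 2080 Ti": (4352, 1.55, 1199),
    "RTX 3080":    (8704, 1.71, 699),
    "Titan RTX":   (4608, 1.77, 2499),
    "RTX 3090":    (10496, 1.73, 1499),
}

def uplift(new, old):
    """Percent increase in CUDA cores and the MSRP change going from old to new."""
    cores_pct = (specs[new][0] / specs[old][0] - 1) * 100
    price_delta = specs[new][2] - specs[old][2]
    return round(cores_pct, 1), price_delta

print(uplift("RTX 3080", "RTX 2080 Ti"))  # (100.0, -500)
print(uplift("RTX 3090", "Titan RTX"))    # (127.8, -1000)
```

The 3080 doubles the 2080 Ti's CUDA core count while dropping $500, and the 3090 more than doubles the Titan RTX's while dropping $1,000, which is where the "$500-1000 less expensive" figure comes from.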
With the launch of the RTX 3090, we can update our previous Adobe Lightroom Classic – NVIDIA GeForce RTX 3080 Performance article with results for the 3090, but since the RTX 3070 is not launching until sometime in October, we cannot include it at this time. However, we are very interested in how the RTX 3070 will perform, and when we are able to test that card, we will post another follow-up article with the results.
Puget Systems offers a range of powerful and reliable systems that are tailor-made for your unique workflow.
Test Setup
Listed below are the specifications of the system we will be using for our testing:
| Test Platform | |
|---|---|
| CPU | Intel Core i9 10900K 10 Core |
| CPU Cooler | Noctua NH-U12S |
| Motherboard | Gigabyte Z490 Vision D |
| RAM | 4x DDR4-2933 16GB (64GB total) |
| Video Card | NVIDIA GeForce RTX 3090 24GB<br>Gigabyte GeForce RTX 3080 OC 10GB<br>NVIDIA Titan RTX 24GB<br>NVIDIA GeForce RTX 2080 Ti 11GB<br>NVIDIA GeForce RTX 2080 SUPER 8GB<br>NVIDIA GeForce RTX 2070 SUPER 8GB<br>NVIDIA GeForce RTX 2060 SUPER 8GB<br>AMD Radeon RX 5700 XT 8GB<br>AMD Radeon RX Vega 64 8GB<br>Intel UHD 630 (10900K Integrated) |
| Hard Drive | Samsung 960 Pro 1TB |
| Software | Windows 10 Pro 64-bit (Ver. 2004)<br>Lightroom Classic 2020 (Ver. 9.4)<br>PugetBench for Lightroom Classic (Ver. 0.92) |
*All the latest drivers, OS updates, BIOS, and firmware applied as of September 7th, 2020
A big thank you to Gigabyte for providing the GeForce RTX™ 3080 GAMING OC 10G used in our testing!
To test each GPU, we will be using the fastest platform currently available for the "Active Tasks" in Lightroom Classic – most notably the Intel Core i9 10900K. Overall, we recommend AMD Ryzen or Threadripper CPUs for Lightroom Classic due to their significantly higher performance for "Passive Tasks" like generating previews and exporting, but since the GPU is not significantly used for any of those tasks, we decided to use the 10900K to minimize the impact of the processor for the tasks that do use the GPU. Even with this, however, be aware that there typically isn't much variation in performance between different video cards in Lightroom Classic.
We will also include results for the integrated graphics built into the Intel Core i9 10900K and with GPU acceleration disabled to see how much the recently added GPU acceleration features improve performance.
For the testing itself, we will be using our PugetBench for Lightroom Classic benchmark. This benchmark covers a range of tasks in Lightroom Classic, including importing, exporting, and tests simulating culling. If you wish to run our benchmark yourself, you can download the benchmark and compare your results to thousands of user-submitted results in our PugetBench database.
Raw Benchmark Results
While we are going to go through our analysis of the testing in the next section, we always like to provide the raw results for those that want to dig into the details. If there is a specific task that tends to be a bottleneck in your workflow, examining the raw results is going to be much more applicable than our more general analysis.
Lightroom Classic Performance Analysis
Since this is the first time we are specifically testing GPU performance in Lightroom Classic, we do not yet have a specific "GPU Score" built into our benchmark. In fact, there are several tasks that we hope to include in the future (such as slider and brush lag) that should be an even better indicator of GPU performance than the tasks we currently test.
However, we should be able to see at least some indication of relative GPU performance with our current tests.
Overall, we didn't see much of a difference between the various GPUs we tested, or even the test using just Intel integrated graphics and GPU acceleration disabled entirely. NVIDIA is definitely a hair faster than AMD (which oddly was slower than having no GPU acceleration at all), but the performance between each NVIDIA GPU is close enough to be within the margin of error. In fact, Lightroom Classic tends to have a larger margin of error than our other benchmarks, and anything within ~5% we would consider to be effectively the same.
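That "within ~5%" comparison is simple relative arithmetic. Here is a minimal sketch with made-up scores; `effectively_equal` is a hypothetical helper for illustration, not part of PugetBench:

```python
def effectively_equal(score_a, score_b, margin=0.05):
    """True if two benchmark scores are within `margin` (relative) of each
    other, i.e. inside Lightroom Classic's typical run-to-run variation."""
    return abs(score_a - score_b) / max(score_a, score_b) <= margin

# Example with made-up overall scores:
print(effectively_equal(750, 775))  # True  (~3.2% apart -> same within noise)
print(effectively_equal(750, 850))  # False (~11.8% apart -> a real difference)
```

By this standard, nearly every NVIDIA card we tested lands in the "same within noise" bucket relative to the others.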
We could go into our results in more detail, but what we are taking from this is that for what we are testing, the GPU has almost no impact. As we mentioned earlier in this post, we do hope to include a number of other tests that should be a better indicator of GPU performance, but this simply reinforces that your GPU is a very low priority relative to your CPU, RAM, and storage.
How well does the NVIDIA GeForce RTX 3080 & 3090 perform in Lightroom Classic?
Adobe has been steadily adding GPU support into Lightroom Classic over the last few versions, but the different cards we tested all performed roughly the same. NVIDIA has a small lead over AMD, but the fact that having GPU acceleration disabled was also faster than the AMD cards tells us that GPU acceleration in Lightroom Classic is still very early in its development.
As we mentioned earlier in the article, there are a number of tasks that we currently do not test that should leverage the GPU a bit more (such as slider/brush lag) that we hope to add to our benchmark in the future. We are unfortunately limited to what is possible via the Lightroom Classic API, but with luck, we will be able to improve our GPU-specific testing over time. However, without significant changes to Lightroom Classic itself, we don't expect there to be any reason to invest in a high-end GPU any time in the near future.
As always, keep in mind that these results are strictly for Lightroom Classic. If you have performance concerns for other applications in your workflow, we highly recommend checking out our Hardware Articles (you can filter by "Video Card") for the latest information on how a range of applications perform with the new RTX 3080 and 3090 GPUs, as well as with different CPUs and other hardware.