TL;DR: NVIDIA GeForce RTX 3080 performance in Lightroom Classic
Adobe has been steadily adding GPU support into Lightroom Classic over the last few years, but just like Photoshop, the most important thing is to simply have a supported card. There is some difference between a low-end GPU and one like the new GeForce RTX 3080, but between cards that are roughly in the same ballpark, you will be hard-pressed to notice a difference.
At most, there is only a few percent difference between the RTX 2060 SUPER and the RTX 3080, and effectively no difference between the RTX 2080 Ti and RTX 3080.
Introduction
On September 1st, NVIDIA launched the new GeForce RTX 30 Series, touting major advancements in performance and efficiency. While gaming is almost always a major focus during these launches, professional applications like Lightroom Classic are becoming more and more important for NVIDIA's GeForce line of cards. Due to the significant improvements Adobe has made around GPU acceleration in Lightroom Classic, this is the first time we will be doing GPU-focused testing for Lightroom Classic. Because of this, we are very interested to see what kind of performance delta we will see between the various cards we will be testing.
If you want to see the full specs for the new GeForce RTX 3070, 3080, and 3090 cards, we recommend checking out NVIDIA's page for the new 30 Series cards. But at a glance, here are what we consider to be the most important specs:
| Card | VRAM | CUDA Cores | Boost Clock | Power | MSRP |
|---|---|---|---|---|---|
| RTX 2070 SUPER | 8GB | 2,560 | 1.77 GHz | 215W | $499 |
| RTX 3070 | 8GB | 5,888 | 1.70 GHz | 220W | $499 |
| RTX 2080 Ti | 11GB | 4,352 | 1.55 GHz | 250W | $1,199 |
| RTX 3080 | 10GB | 8,704 | 1.71 GHz | 320W | $699 |
| Titan RTX | 24GB | 4,608 | 1.77 GHz | 280W | $2,499 |
| RTX 3090 | 24GB | 10,496 | 1.73 GHz | 350W | $1,499 |
While specs rarely line up with real-world performance, it is a great sign that NVIDIA has doubled the number of CUDA cores compared to the equivalent RTX 20 Series cards with only a small drop in boost clock. At the same time, the RTX 3080 and 3090 are also $500-1,000 less expensive than the previous generation, depending on which models you compare them to.
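The generational deltas mentioned above can be verified directly from the spec table. A quick sketch (the numbers below are simply the values from the table):

```python
# Spec values taken from the table above.
specs = {
    "RTX 2080 Ti": {"cuda_cores": 4352, "boost_ghz": 1.55, "msrp": 1199},
    "RTX 3080":    {"cuda_cores": 8704, "boost_ghz": 1.71, "msrp": 699},
}

old, new = specs["RTX 2080 Ti"], specs["RTX 3080"]

# Exactly 2x the CUDA cores, generation over generation
cuda_ratio = new["cuda_cores"] / old["cuda_cores"]

# $500 lower MSRP than the card it is most comparable to
price_drop = old["msrp"] - new["msrp"]

print(f"CUDA core ratio: {cuda_ratio:.1f}x")   # 2.0x
print(f"MSRP difference: ${price_drop}")        # $500
```

Of course, as noted above, raw core counts rarely translate 1:1 into application performance, which is exactly why benchmarking is needed.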
Since only the RTX 3080 has fully launched at this point (the 3090 is set to launch on Sept 24th, and the 3070 sometime in October), we unfortunately will only be able to examine the 3080 at this time. However, we are very interested in how the RTX 3070 and 3090 will perform, and once we are able to test those cards, we will post follow-up articles with the results.
Puget Systems offers a range of powerful and reliable systems that are tailor-made for your unique workflow.
Test Setup
Listed below are the specifications of the system we will be using for our testing:
| Test Platform | |
|---|---|
| CPU | Intel Core i9 10900K 10 Core |
| CPU Cooler | Noctua NH-U12S |
| Motherboard | Gigabyte Z490 Vision D |
| RAM | 4x DDR4-2933 16GB (64GB total) |
| Video Card | Gigabyte GeForce RTX 3080 OC 10GB<br>NVIDIA Titan RTX 24GB<br>NVIDIA GeForce RTX 2080 Ti 11GB<br>NVIDIA GeForce RTX 2080 SUPER 8GB<br>NVIDIA GeForce RTX 2070 SUPER 8GB<br>NVIDIA GeForce RTX 2060 SUPER 8GB<br>AMD Radeon RX 5700 XT 8GB<br>AMD Radeon RX Vega 64 8GB |
| Hard Drive | Samsung 960 Pro 1TB |
| Software | Windows 10 Pro 64-bit (Ver. 2004)<br>Lightroom Classic 2020 (Ver. 9.4)<br>PugetBench for Lightroom Classic (Ver. 0.92) |
*All the latest drivers, OS updates, BIOS, and firmware applied as of September 7th, 2020
Big thank you to Gigabyte for providing the GeForce RTX™ 3080 GAMING OC 10G used in our testing!
To test each GPU, we will be using the fastest platform currently available for the "Active Tasks" in Lightroom Classic – most notably the Intel Core i9 10900K. We overall recommend AMD Ryzen or Threadripper CPUs for Lightroom Classic due to their significantly higher performance for "Passive Tasks" like generating previews and exporting, but since the GPU is not significantly used for any of those tasks, we decided to use the 10900K to minimize the impact of the processor for the tasks that use the GPU. Even with this, however, be aware that there typically isn't much variation in performance between different video cards in Lightroom Classic.
For the testing itself, we will be using our PugetBench for Lightroom Classic benchmark. This tests a range of tasks in Lightroom Classic including importing, exporting, and simulated culling. If you wish to run our benchmark yourself, you can download the benchmark and compare your results to thousands of user-submitted results in our PugetBench database.
Raw Benchmark Results
While we are going to go through our analysis of the testing in the next section, we always like to provide the raw results for those that want to dig into the details. If there is a specific task that tends to be a bottleneck in your workflow, examining the raw results is going to be much more applicable than our more general analysis.
Lightroom Classic Performance Analysis
Since this is the first time we are specifically testing GPU performance in Lightroom Classic, we do not yet have a specific "GPU Score" built into our benchmark. In fact, there are several tasks that we hope to include in the future (such as slider and brush lag) that should be an even better indicator of GPU performance than the tasks we currently test.
However, we should be able to see at least some indication of relative GPU performance with our current tests.
Overall, we didn't see much of a difference between the various GPUs we tested. NVIDIA is definitely a hair faster than AMD, but the performance between each NVIDIA GPU is close enough to be within the margin of error. In fact, Lightroom Classic tends to have a larger margin of error than our other benchmarks, and anything within ~5% we would consider to be effectively the same.
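To make the ~5% margin-of-error rule concrete, here is a minimal sketch of how two overall scores can be compared (a hypothetical helper with made-up example scores, not part of PugetBench):

```python
def effectively_equal(score_a: float, score_b: float, margin: float = 0.05) -> bool:
    """Treat two benchmark scores as tied if their relative difference is
    under the given margin (~5% for Lightroom Classic in our experience)."""
    return abs(score_a - score_b) / max(score_a, score_b) < margin

# Hypothetical overall scores for illustration:
print(effectively_equal(1000, 970))  # True: a 3% gap is within run-to-run noise
print(effectively_equal(1000, 900))  # False: a 10% gap is a real difference
```

This is why the ordering of individual NVIDIA cards in the charts should not be over-interpreted: differences smaller than the benchmark's own variance do not establish that one card is actually faster.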
If you scroll to the second chart for the "Active Score", this is where we expected to see the largest variation in performance between each GPU model. Here, the new RTX 3080 technically took second place to the previous generation RTX 2080 Ti, although given how close the results are, the RTX 3080, 2080 Ti, and even the older GTX 1080 Ti are all effectively identical.
Realistically, however, only the AMD Radeon RX 5700 XT and (oddly) the NVIDIA Titan RTX showed low enough results that you might be able to notice a difference in your day-to-day work.
How well does the NVIDIA GeForce RTX 3080 perform in Lightroom Classic?
Adobe has been steadily adding GPU support into Lightroom Classic over the last few years, but just like Photoshop, the most important thing is to simply have a supported card. There is some difference between a low-end GPU and one like the new GeForce RTX 3080, but between cards that are roughly in the same ballpark, you will be hard-pressed to notice a difference.
As we mentioned earlier in the article, there are a number of tasks that we currently do not test (such as slider/brush lag) that should leverage the GPU a bit more, and we hope to add them to our benchmark in the future. We are unfortunately limited to what is possible via the Lightroom Classic API, but with luck, we will be able to improve our GPU-specific testing over time. However, without significant changes to Lightroom Classic itself, we don't expect there to be any reason to invest in a high-end GPU any time in the near future.
As always, keep in mind that these results are strictly for Lightroom Classic. If you have performance concerns for other applications in your workflow, we highly recommend checking out our Hardware Articles (you can filter by "Video Card") for the latest information on how a range of applications perform with the new RTX 3080 GPU, as well as with different CPUs and other hardware.