The Best GPU for DaVinci Resolve | Nvidia Quadro vs GeForce GTX

In the battle of Nvidia Quadro vs GeForce GTX for DaVinci Resolve, which GPU should win your hard-earned cash?


There’s one question that seems to be on everyone’s mind when it comes to building a workstation to run DaVinci Resolve… Nvidia Quadro vs GeForce GTX.

The workstation purists will tell you that a high-end Nvidia Quadro card will give you the best performance. I used to think so too, but now I’m here to tell you that’s probably not the case. Read on to find out why I believe the best value GPU for DaVinci Resolve is the Nvidia GeForce GTX line.

A GPU is a fast, powerful, and massively parallel number cruncher. It manipulates image data in memory incredibly quickly and is highly efficient in any application where processing of large blocks of data happens in parallel. This can be any computer graphics application as well as various engineering and scientific processing applications.

However, not all GPU intensive applications have the same requirements. Some require more mathematical precision than others.

Nvidia Quadro vs GeForce GTX

What’s the difference between a high-end workstation GPU and a high-end gaming GPU?

There are a number of differences, but the most significant one has to do with what type of math the GPU cores are optimised for.

Let’s head to Wikipedia for some definitions:

Single Precision Floating Point

Single-precision floating-point format is a computer number format that occupies 4 bytes (32 bits) in computer memory and represents a wide dynamic range of values by using a floating radix point.

Floating Point

The term floating point refers to the fact that a number’s radix point (decimal point, or, more commonly in computers, binary point) can “float”; that is, it can be placed anywhere relative to the significant digits of the number.

Double Precision Floating Point

Double-precision floating-point format is a computer number format that occupies 8 bytes (64 bits) in computer memory and represents a wide dynamic range of values by using a floating radix point.

You can dig into it a lot deeper on Wikipedia if you want; I can get lost in there for hours.
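To make the difference concrete, here’s a minimal Python sketch (my own illustration, using NumPy, nothing to do with Resolve itself) showing how much precision each format actually holds:

    import numpy as np

    # Store the same value at single (FP32) and double (FP64) precision.
    value = 1.0 / 3.0
    fp32 = np.float32(value)   # 32 bits: roughly 7 significant decimal digits
    fp64 = np.float64(value)   # 64 bits: roughly 15-16 significant decimal digits

    print(f"FP32: {fp32:.20f}")                        # ~0.33333334326744079590
    print(f"FP64: {fp64:.20f}")                        # ~0.33333333333333331483
    print(f"difference: {abs(float(fp32) - fp64):.1e}")

Single precision still gives you around seven significant decimal digits, which, as we’ll see further down, is plenty for image data.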

Single-precision operations are referred to simply as FP32, and double-precision operations as FP64. GPU cores are optimised for either FP32 or FP64 operations, and while a GPU can compute both, running the type it isn’t optimised for comes with a performance hit, as explained in the article “Explaining FP64 Performance on GPUs”, the best of only a few articles I could find that lays this out in a readable manner.

The bottom line is Nvidia GeForce GTX cards have very good FP32 performance and poor FP64 performance.
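If you want to see that gap on your own card, here’s a rough sketch that times an FP32 and an FP64 matrix multiply on the GPU. It assumes you have an Nvidia card and the CuPy library installed, and it’s purely illustrative; it’s nothing like how Resolve actually schedules its work:

    import time
    import cupy as cp

    def time_matmul(dtype, n=4096, repeats=10):
        # Rough average time for an n x n matrix multiply at the given precision.
        a = cp.random.random((n, n)).astype(dtype)
        b = cp.random.random((n, n)).astype(dtype)
        cp.matmul(a, b)                      # warm-up run
        cp.cuda.Device().synchronize()
        start = time.perf_counter()
        for _ in range(repeats):
            cp.matmul(a, b)
        cp.cuda.Device().synchronize()       # wait for the GPU before stopping the clock
        return (time.perf_counter() - start) / repeats

    fp32 = time_matmul(cp.float32)
    fp64 = time_matmul(cp.float64)
    print(f"FP32: {fp32 * 1000:.1f} ms   FP64: {fp64 * 1000:.1f} ms   ratio: {fp64 / fp32:.1f}x")

On a GeForce GTX card, expect the FP64 run to come out many times slower; on an FP64-optimised card like the Quadro K6000 the gap is far smaller.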

Not All Nvidia Quadro Cards Are Equal

Many people incorrectly assume that all Nvidia Quadro GPUs are designed for high FP64 performance. That is not necessarily the case; it depends on the GPU architecture. I’ve heard plenty of people make the blanket argument that double-precision floating-point capability is the primary justification for any Quadro card’s price, and the reason it is better suited to DaVinci Resolve. It’s not that simple.

As mentioned in “Explaining FP64 Performance on GPUs”, the Maxwell-based Nvidia Quadro M6000 actually has only minimal FP64 performance compared to the Kepler-based Nvidia Quadro K6000 or Tesla K40. The difference lies in the GPU architecture, not the branding: the simple fact that a card is a Quadro does not mean it’s geared to FP64 over FP32 operations.
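If you’re not sure which generation a given card belongs to, its compute capability will tell you. A quick sketch (again assuming an Nvidia GPU with CuPy installed, purely for illustration, not anything Resolve itself does):

    import cupy as cp

    # Ask the CUDA runtime for the current GPU's name and compute capability.
    props = cp.cuda.runtime.getDeviceProperties(0)
    major, minor = props["major"], props["minor"]
    name = props["name"].decode() if isinstance(props["name"], bytes) else props["name"]

    # Compute capability 3.x is Kepler, 5.x is Maxwell, 6.x is Pascal.
    arch = {3: "Kepler", 5: "Maxwell", 6: "Pascal"}.get(major, "other")
    print(f"{name}: compute capability {major}.{minor} ({arch})")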

This is not the real flaw in the argument, however.

DaVinci Resolve Is Coded For FP32 Only

The real kicker here is that DaVinci Resolve utilises only FP32 operations and makes no use of FP64 operations at all. So any argument that an FP64 workstation-class GPU is better suited to DaVinci Resolve due to its “better” level of precision is completely unfounded.
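It’s worth spelling out why FP32 is enough. A 32-bit float carries a 24-bit significand, so every integer up to 2^24 is stored exactly, and 10-, 12- and even 16-bit-per-channel code values survive the trip into single-precision floats and back untouched. A small sketch of that, using NumPy purely for illustration:

    import numpy as np

    # 10-, 12- and 16-bit code values all round-trip exactly through FP32,
    # because a 32-bit float stores every integer up to 2**24 exactly.
    for bits in (10, 12, 16):
        codes = np.arange(2**bits, dtype=np.uint32)            # every possible code value
        floats = codes.astype(np.float32) / (2**bits - 1)      # normalise to 0.0-1.0 in FP32
        recovered = np.rint(floats * (2**bits - 1)).astype(np.uint32)
        print(f"{bits}-bit round trip exact: {np.array_equal(codes, recovered)}")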

Nvidia GeForce GTX for DaVinci Resolve

There are some other differences between a Quadro or Tesla GPU and a GeForce GTX GPU, such as ECC memory and more rigorous quality assurance and testing procedures. Even so, a top-of-the-line Nvidia GeForce GTX GPU offers incredibly high performance and far better value for money in most DaVinci Resolve setups, where there is little, if anything, to be gained by spending more money on an Nvidia Quadro GPU.

In conclusion:

  • There is definitely no advantage to having a GPU optimised for FP64 performance when running Resolve, especially since it comes at the cost of FP32 performance, which is far more important.
  • Nvidia have in any case included only minimal FP64 performance in the Maxwell generation of Quadro GPUs, so the distinction is largely moot: the latest Maxwell-generation Quadro and GeForce GTX cards are both FP32 GPUs.
  • If you want to spend the extra money, there is nothing wrong with choosing an Nvidia Quadro GPU, but you’re not compromising performance in any way by choosing a top-of-the-line GeForce GTX instead.

I would love to hear from anyone who has run Resolve on both types of GPU with real-world comparisons to share. Please comment and get in touch if you have anything to add.

Further Reading

Building a new workstation? Read more about: DaVinci Resolve System Requirements | A Reality Check

Planning on working with XAVC, XAVC-S, AVCHD or any H.264 format? Read more about: XAVC / XAVC-S and DaVinci Resolve | Why You Need to Transcode


18 thoughts on “The Best GPU for DaVinci Resolve | Nvidia Quadro vs GeForce GTX”

  1. Can you recommend a low-cost graphics card for learning DaVinci Resolve? I don’t want to edit 6K or 4K videos. I already have a PC with an i7 and an Nvidia 610 graphics card, but Resolve is not working (I know it is a low config). A Quadro card is out of the question for me, so please give me a suggestion for selecting a GPU.

    1. Hi Jithu, what are the rest of your specs? You can look at an Nvidia GTX 960 with 2GB of GDDR5; you should be able to find one for around $180. You could also spend less than that; the main thing is to make sure it has at least 2GB of GPU memory.

  2. Hello, great explanation! It helped me a lot!!! But I still have a doubt: what about the 10-bit per color channel feature (on a 10-bit monitor, obviously), which is the prerogative of Quadro/FirePro cards?
    I’m wondering if it would be possible to have two cards, one gaming and one pro, in the same system. Do you think it would be possible to have a powerful GTX for GPU computing, like the 1070, and a simple (and relatively cheap) Quadro for 10-bit GUI compositing…
    Any info on this topic would be much appreciated, since I’m getting lost in the vastness of the web looking for answers 🙂
    Thank you, Lorenzo

  3. I’m interested in the 4K, DCI-P3 LG 31mu94 monitor for color correction. I haven’t built a PC yet, but I’m leaning towards the Intel 6850k, unless I don’t need the extra PCIe lanes, in which case I’ll downgrade to the 6800k. I plan on at least 16GB RAM, ideally 32GB.

    I am leaning towards the Quadro M4000. I’ll probably start shooting with a used BMPCC, but I want to eventually invest in a 4K+ camera.

    I am under the impression that I would need a Quadro because the 31mu97 monitor is 10-bit, which is only utilized by Quadro cards, but a poster on Reddit told me 10-bit is just marketing. I’ve tried to research this, but I keep getting conflicting info.

    The PC I’m planning on building will be for creative projects, from 3D modeling and programming, to writing, to A/V editing (including color correction). I was thinking of investing in the flat 2K 21:9 LG 34um95 as well, for the extra workspace for text-centric tasks, and I know running a 2K widescreen and a large 4K monitor at the same time would require a beefy card anyway. But if I could save money here, I could invest it elsewhere.

    Thanks for the post, it was helpful!

  4. Hi,

    Many thanks for your article; very insightful. The main argument I’ve heard thus far for going Quadro over GeForce has to do with 8-bit vs 10-bit color support.

    I am about to invest in a new 4K workstation that should last me about 3-4 years.

    Much like Drew and Lorenzo, regarding the GPU, I too am puzzled by the 10-bit color capability on the Quadro cards vs GeForce cards:
    – If you grade in DaVinci Resolve, will you see a benefit from using 10-bit enabled Quadro cards on your 10-bit computer monitor (e.g. the LG 31mu94 or HP z32x)?
    – In the past, you could tweak GeForce cards to get them to display 10-bit color. Is that still possible with the 10xx lineup of cards?
    – Or should you get a separate 10-bit grading monitor (Eizo or Flanders) and use a Blackmagic Decklink to create a 10-bit color video output to connect to that monitor?

    Also, do you know if Adobe CC video apps are coded for FP32 or FP64?

    Look forward to hearing from you.

    1. Hi Richard, as far as I know no color correction apps use FP64, and to do so would make very little sense at all. In a pro setup you aren’t relying on your desktop GUI for color accuracy anyway, so it really doesn’t matter whether your GUI GPU can drive a display in 8-bit or 10-bit color. Often the GPU used to drive the desktop GUI is not the most powerful GPU in the system. Resolve recommends two GPUs, one dedicated to your GUI and the other dedicated to processing, and the color bit-depth output of either doesn’t really matter. What matters is that you have a dedicated 10-bit video output to a calibrated reference display, using something like a Blackmagic Decklink Mini Monitor. There are numerous reasons beyond color bit depth not to rely on a GUI desktop display output for color: a dedicated video output ensures proper color management and that you are monitoring in the correct color space, which will often be different to the color space your desktop GUI uses.

  5. Hi Richard,

    Thanks for your reply.

    So this would be a good reason to choose a CPU with a built-in GPU to drive the desktop GUI, and use a dedicated GeForce 1080 to do the calculations for your full-screen grading monitor, which is then connected via a separate 10-bit Decklink?

    This is stuff most computer builders don’t know…

    1. Yes, an on-board GPU for the GUI would probably do it, but I’m not sure of the specific specs for that. The best thing is to download the official configuration guide and take a look at the recommended configs. And yes, your 1080 would be a number-crunching card, exactly, and it doesn’t need to be connected to any display directly to do its job. Your video output would come from the Decklink Mini Monitor; it’s an inexpensive card and gives you a clean 10-bit output, which you can then configure correctly in Resolve’s color management settings.

  6. Hi,

    Thank you for your nice article ! I have a question :

    We have a 2013 Mac Pro (8 cores / 64GB / D700) at the office, combined with a Blackmagic UltraStudio 4K over Thunderbolt connected to an Eizo monitor to get a true 10-bit output, as we work with Apple ProRes 4:2:2 10-bit files. We sometimes work with 4:4:4:4 files, but not always.

    We built a PC system with a GTX 1070 card that cost way less than our Mac Pro, just to run some tests in the first place, and now we realize we could migrate from Mac to PC, as the PC offers way more options for upgrading and handles our work the same way as the Mac Pro.

    Our main question here is whether we will need to buy a Blackmagic PCIe card for a 10-bit 4:2:2 output to our Eizo monitor, or whether we can get it directly from the GTX card?

    My kind regards,

    JD

    1. Hi JD, for monitoring you’ll definitely need a Decklink card. It’s the only way to get a dedicated clean and true 10-bit signal out for monitoring. The GPU is only used for number crunching, and running your desktop GUI, but that’s not going to help you when it comes to proper monitoring. The Decklink PCIe card is the way to go. Thankfully all you really need is the Decklink Mini Monitor, and that doesn’t cost much either.

  7. Hi, I just read your article, which is turning around in my head. What about 10-bit video monitoring? The Titan doesn’t output that quality as far as I know, and to get the full spectrum on a pro monitor you need 10-bit video.

    1. Hi Milton, in Resolve the GPU output actually has nothing to do with proper monitoring and is often not even driving a display at all. The GPU is being used for processing only and even if it is driving your desktop display, you can’t really rely on your OS and typical desktop display for grading. A lot of new home colorists or hobbyists do it, as well as when grading on a laptop of course, but it’s not ideal and would not be the way a professional grading suite is set up. To get a 10-bit dedicated video output for monitoring on a calibrated 10-bit display you’ll want a separate video card such as the Blackmagic Design Decklink.
