Scientific computing (also known as computational science) is about as niche a PC category as you’ll find. Its requirements are very precise, and while some are similar to other types of computing — encoding video and using computer-aided design (CAD) applications, for example — much of what scientific computing requires for best performance is application-specific.

So, what is scientific computing all about? Fundamentally, it’s the use of computers to solve a vast array of scientific problems in fields ranging from physics to chemistry to biology and beyond. Practically speaking, more than anything it’s about executing exceptionally large mathematical models, utilizing complex algorithms, running complicated simulations, and developing involved computational models. These can all be very computationally intense.

Some of the areas where scientific computing is utilized include:

  • Economic modeling
  • Biological systems
  • Engineering solutions
  • Urban planning
  • Machine learning/deep learning

So, what computer components are important for a well-performing and efficient PC for scientific computing? Frankly, all of them. You need a fast CPU, loads of RAM, fast storage devices with copious space, and a speedy GPU. Consider: the ultimate scientific computing machine is a supercomputer, and increasingly organizations are building multi-computer systems to keep up with their needs. Accordingly, putting a PC on your desktop for scientific computing isn’t quite as simple as selecting a productivity, creativity, or even gaming machine.

The following are a few PC configurations you can consider, but the range of viable options is narrower than in other categories. A low-end PC just won't cut it if you want to get real work done.

CPU | Video Card | Storage | Memory
Intel Core i9-10980HK | NVIDIA RTX 3080 | 1x 2TB SSD | 16 GB DDR4
Intel Xeon Silver 4210 | NVIDIA Quadro RTX 4000 | 1x 4TB SSD | 32 GB ECC DDR4
2x Intel Xeon Silver 4214 | NVIDIA Quadro RTX 6000 | 2x 4TB SSD | 64 GB ECC DDR4


Scientific computing software can be rather obscure, but chances are if you're reading this you're involved in the field and well aware of the most appropriate tools. Some familiar titles will likely include MATLAB by MathWorks, Wolfram Mathematica, ParaView, and Caffe. The first three applications are used to perform complex data analysis and visualization, while the fourth is a deep learning tool.

If you look at the basic requirements for these applications, they're not that extensive. MATLAB, for example, recommends just an Intel or AMD CPU with at least four logical cores and AVX2 instruction-set support (included with almost every modern Intel and AMD CPU), 3.5 GB of solid-state drive (SSD) storage space, and 8 GB of RAM. A GPU can be used to accelerate the application, and NVIDIA's CUDA parallel computing platform is supported, meaning you can use off-the-shelf NVIDIA GPUs.
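If you want to sanity-check a machine against minimums like these, a short stdlib-only Python sketch can do it. The four-core and 3.5 GB figures below are the MATLAB recommendations quoted above; substitute whatever your own package specifies. (RAM isn't checked here because the Python standard library has no portable way to query it.)

```python
import os
import shutil

# Assumed thresholds, taken from the MATLAB recommendation above.
MIN_LOGICAL_CORES = 4
MIN_FREE_DISK_GB = 3.5

def meets_minimums(path=None):
    """Return pass/fail checks against the minimums above."""
    path = path or os.getcwd()
    cores = os.cpu_count() or 1                       # logical cores
    free_gb = shutil.disk_usage(path).free / 1e9      # free disk, GB
    return {
        "cpu_cores": cores >= MIN_LOGICAL_CORES,
        "disk_space": free_gb >= MIN_FREE_DISK_GB,
    }

print(meets_minimums())
```

This checks only the floor; as the next paragraphs note, real workloads benefit from far more than the minimum.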

At the same time, the faster the CPU, the better performance you’re going to get. If you’re running particularly large simulations, for example, or working with huge datasets, then you’ll want all the CPU power you can get your hands on. Starting with Intel’s top-end 10th-gen Core i9 processor will give you plenty of headroom for all but the most demanding processes.
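The reason core count matters is that many simulations split naturally into independent chunks of work. As a toy illustration, here is a Monte Carlo estimate of pi with the samples divided across worker chunks. Note the hedge: Python threads don't actually run CPU-bound chunks in parallel (the interpreter's GIL serializes them), so a real workload would hand the same chunks to processes or to a compiled library; the structure of the split is the point.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def count_hits(args):
    """Count random points in the unit quarter-circle (one chunk)."""
    seed, n = args
    rng = random.Random(seed)  # per-chunk RNG keeps chunks independent
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
               for _ in range(n))

def estimate_pi(total=400_000, chunks=4):
    """Split `total` samples across `chunks` independent workers."""
    n = total // chunks
    with ThreadPoolExecutor(max_workers=chunks) as pool:
        hits = sum(pool.map(count_hits, [(seed, n) for seed in range(chunks)]))
    return 4 * hits / (n * chunks)

print(estimate_pi())
```

More cores let you run more such chunks truly in parallel, which is exactly where a top-end CPU earns its price.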

If you're delving into machine learning, you'll want to consider a single- or dual-socket Intel Xeon configuration. These CPUs are designed for the most complex tasks, including scientific computing applications. Check out our tool for buying a motherboard to make sure you're prepared to take on that kind of power.


Many scientific computing applications can use NVIDIA’s CUDA parallel computing platform to accelerate a variety of tasks. Therefore, choosing an NVIDIA GPU is likely a safe choice. Another popular option is OpenCL, and if that’s what your applications utilize, then you can consider AMD GPUs as well.
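In practice, many Python-based scientific stacks handle this choice with a fallback chain: use a GPU-backed array library when one is installed, and drop down to a CPU library otherwise. The sketch below assumes CuPy as the CUDA option and NumPy as the CPU option (with the stdlib `math` module as a last resort so it runs anywhere); an OpenCL stack would slot in a library like pyopencl the same way. The preference order is an illustration, not a standard.

```python
import importlib

def pick_array_backend():
    """Return (name, module) for the best available numeric backend.

    Assumed preference order: CuPy (CUDA GPU) > NumPy (CPU).
    Falls back to the stdlib math module if neither is installed.
    """
    for name in ("cupy", "numpy"):
        try:
            return name, importlib.import_module(name)
        except ImportError:
            continue
    import math
    return "math", math

name, backend = pick_array_backend()
print(f"Using backend: {name}")
```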

Your biggest choice will be between a consumer, gaming-oriented GPU like an NVIDIA RTX or an AMD Radeon, and a commercial GPU like NVIDIA's Quadro and AMD's Radeon Pro. As with CPUs, if you can afford a commercial GPU then that's the way you'll want to go. You'll get better stability through more developed drivers and perhaps additional functionality — for example, NVIDIA's professional GPU architectures are specifically called out in some scientific application specifications.


How much storage you need will depend entirely on the size of your datasets and how many of them you need to manage. Most likely, we’re talking about multiple terabytes of storage, with 2TB being a minimum and the more the better.

But no matter how much storage you need, you'll want to use SSDs rather than slower spinning hard disk drives (HDDs). You'll want to get data to and from your CPU as quickly as you can, and SSDs are the only way to go.

Fortunately, you can buy larger SSDs today for reasonable amounts of money, so you don't need to break the bank to make sure your storage keeps up with your modeling.


As with storage, you’ll want to get as much RAM as you can afford. Starting with 16GB is likely a safe bet, but buying a motherboard that can support lots of RAM is a great idea. Scientific computing tasks like modeling very complex systems and machine learning can use tons of RAM, and 64GB or more wouldn’t be unreasonable.
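A quick back-of-the-envelope calculation shows why. A single dense matrix of double-precision (8-byte) values grows with the square of its dimension, so a 50,000 x 50,000 matrix alone needs 20 GB — more than a 16 GB machine can hold before the OS and application take their share. The matrix size here is purely illustrative.

```python
def dense_matrix_bytes(n, dtype_bytes=8):
    """Memory for an n x n dense matrix of dtype_bytes-sized values
    (8 bytes = float64 / double precision)."""
    return n * n * dtype_bytes

# 50,000 x 50,000 doubles: 50_000 * 50_000 * 8 bytes = 20 GB
gb = dense_matrix_bytes(50_000) / 1e9
print(f"{gb:.1f} GB")
```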

As with storage, you’ll also want to get the fastest RAM that you can. Again, your motherboard will come into play here in terms of the RAM speed that it will support. Make sure to select a model that will max out RAM speed as well as the amount of RAM that can be installed.


Some computing tasks, such as gaming and video editing, have straightforward, easy-to-specify requirements. Scientific computing is a little different, in that each application can have its own precise requirements for best performance. This guide is just a cursory look at what kind of PC can best meet your scientific computing needs in general.

Mark Coppock

A technology and aspiring science fiction writer from just outside Los Angeles, CA.