Display technologies have come a long way, advancing faster in recent years than ever before. But better technology brings new problems to solve: as frame rates and refresh rates climbed, screen tearing quickly became an issue that needed to be addressed. That’s where G-Sync comes in.

Nvidia is a leader in the computer graphics industry, and over the years it has implemented and refined its own solution for reducing screen tearing and other artifacts. This is what we know as G-Sync. Let’s take a deeper look at how G-Sync works, and whether you should be using it.

See also: Nvidia GPU guide: All Nvidia GPUs explained, and the best Nvidia GPU for you

V-Sync, and the road to G-Sync

To begin, frame rate is the number of frames the GPU renders per second, while refresh rate is the number of times your monitor redraws the screen every second. When the two are out of sync, you get screen artifacts like tearing, stuttering, and juddering. Synchronization technologies are thus needed to keep these issues from happening.
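To see why a mismatch causes tearing, consider a toy example (this is an illustrative sketch, not anything from Nvidia): a GPU producing 90 frames per second into a fixed 60Hz monitor. Any frame that arrives partway through a scanout ends up split across two refreshes.

```python
# Illustrative sketch: count how many frames land mid-refresh when a
# 90fps GPU drives a fixed 60Hz monitor. Exact fractions avoid
# floating-point noise in the timing math.
from fractions import Fraction

REFRESH_PERIOD = Fraction(1, 60)  # monitor redraws every 1/60 s
FRAME_PERIOD = Fraction(1, 90)    # GPU finishes a frame every 1/90 s

tears = 0
for i in range(90):  # one second of frames
    arrival = i * FRAME_PERIOD
    # A frame arriving partway through a scanout cycle causes a tear:
    # the top of the screen shows the old frame, the bottom the new one.
    if arrival % REFRESH_PERIOD != 0:
        tears += 1

print(f"{tears} of 90 frames arrive mid-refresh")  # 60 of 90
```

Only every third frame here happens to line up with a refresh boundary; the other two-thirds would tear, which is exactly the problem synchronization technologies set out to fix.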

Before understanding G-Sync, we need to look at V-Sync. V-Sync is a software-based solution that makes the GPU hold completed frames in its buffer until the monitor is ready to refresh. On paper, this works fine and solves the screen tearing issue.

However, these artifacts tend to occur when frame rates and refresh rates are high, and holding frames back comes at a cost. Because every finished frame has to wait for the next refresh before it reaches the screen, V-Sync introduces another unacceptable issue: input lag.

Nvidia first tried its hand at a software solution of its own. Adaptive V-Sync was a driver-based technology that locked the frame rate to the display refresh rate when performance was sufficient, and unlocked it when performance dipped.

Nvidia didn’t stop there, and later introduced G-Sync in 2013.

What is G-Sync?

In simple terms, G-Sync is Nvidia’s display synchronization technology, found in monitors, laptops, and some TVs. It actively reduces display artifacts that work against smoothness, like screen tearing, stuttering, and juddering. Using it requires both a supported monitor and a supported Nvidia GPU.

G-Sync is based on VESA Adaptive-Sync, a technology that enables variable refresh rates on your monitor. G-Sync takes the opposite approach to Nvidia’s earlier effort: instead of the GPU holding back frames, the monitor itself varies its refresh rate to match the frame rate the GPU is churning out.

This minimizes input lag, as the processing occurs on the monitor itself, close to the final display output. However, this implementation requires dedicated hardware. Nvidia developed a board that replaces the monitor’s scaler board and handles the monitor-side processing. Nvidia’s board comes with 768MB of DDR3 memory to serve as a buffer for frame comparison.

This board gives the Nvidia driver greater control over the monitor. It acts as an extension of the GPU, communicating with it to ensure the frame rate and the refresh rate are on the same page. It has full control over the vertical blanking interval (VBI), which is the time between the monitor displaying the current frame and starting on the next one. The display works in tandem with the GPU, adapting its refresh rate to the GPU’s frame rate, with Nvidia’s drivers at the wheel.
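The behavior described above can be sketched as a simple timing model. This is my simplified assumption of how a variable-refresh panel paces itself, not Nvidia's firmware logic: the monitor stretches the blanking interval and refreshes as soon as a new frame arrives, clamped to the panel's minimum and maximum refresh periods.

```python
# Simplified model of variable refresh rate pacing (an assumption for
# illustration, not Nvidia's implementation). The panel refreshes as
# soon as a frame is ready, within its physical refresh-rate limits.
MIN_HZ, MAX_HZ = 30, 144
MIN_PERIOD_MS = 1000 / MAX_HZ  # can't refresh faster than the panel allows
MAX_PERIOD_MS = 1000 / MIN_HZ  # must refresh before the image degrades

def vrr_display_time(frame_ready_ms: float, last_refresh_ms: float) -> float:
    """When a VRR panel shows the next frame: as soon as it is ready,
    clamped between the panel's shortest and longest refresh periods."""
    earliest = last_refresh_ms + MIN_PERIOD_MS
    latest = last_refresh_ms + MAX_PERIOD_MS
    return min(max(frame_ready_ms, earliest), latest)

# A frame ready 12ms after the last refresh is shown right away at 12ms:
# the refresh is aligned to the frame, so there is no tear and no
# V-Sync-style wait for a fixed boundary.
print(vrr_display_time(frame_ready_ms=12.0, last_refresh_ms=0.0))
```

Because the refresh is driven by frame delivery rather than a fixed clock, the tearing and added lag from the earlier examples both disappear, as long as the frame rate stays within the panel's supported range.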

See also: GPU vs CPU: What’s the difference?

G-Sync vs G-Sync Ultimate vs G-Sync Compatible monitors and TVs

Being proprietary and hardware-based, G-Sync monitors and TVs need official certification from Nvidia. The certification comes in three tiers: G-Sync, G-Sync Ultimate, and G-Sync Compatible.

G-Sync Compatible is the most basic of the three tiers, covering display sizes between 24 and 88 inches. Displays in this tier don’t include the Nvidia board, but are validated by the company to show no display artifacts.

G-Sync is the middle tier, for displays between 24 and 38 inches. In addition to that validation, displays in this tier include the Nvidia hardware and pass over 300 tests for display artifacts.

G-Sync Ultimate is the highest tier of this technology, spanning displays between 27 and 65 inches. It includes the Nvidia board, along with the validation and certification across 300+ tests. These displays also get “lifelike” HDR, which simply means they support true HDR with over 1,000 nits of brightness.

TVs are so far only available in the G-Sync Compatible tier. Nvidia began offering this certification to certain flagship LG OLED TVs in 2019. The 2019 LG B9, C9, and E9, the 2020 LG BX, CX, GX, and ZX, and the B1, C1, G1, and Z1 series of TVs are officially supported to date.

See also: The best G-Sync monitors for Nvidia-powered PC gaming

G-Sync system requirements

NVIDIA G-Sync board visualized (Image: Nvidia)

G-Sync doesn’t just need a supported display to work; it also needs a supported Nvidia GPU. Supported operating systems are Windows 7, 8.1, and 10, and DisplayPort 1.2 support directly from the GPU is required. Here are the other requirements:

  • Desktop PC connected to a G-Sync monitor: Nvidia GeForce GTX 650 Ti BOOST GPU or higher, Nvidia driver R340.52 or higher
  • Laptop connected to a G-Sync monitor: Nvidia GeForce GTX 980M, GTX 970M, or GTX 965M GPU or higher, Nvidia driver R340.52 or higher
  • Laptop with a G-Sync-supported laptop display: Nvidia GeForce GTX 980M, GTX 970M, or GTX 965M GPU or higher (SLI supported), Nvidia driver R352.06 or higher

G-Sync HDR (i.e. G-Sync Ultimate) sets a slightly higher bar for system requirements. It only works with Windows 10 and needs DisplayPort 1.4 support directly from the GPU. Additionally, PCs and laptops connected to G-Sync HDR monitors need an Nvidia GeForce GTX 1050 GPU or higher, and Nvidia driver R396 GA2 or higher.

See also: Nvidia GeForce RTX 30 series: Everything you need to know

FreeSync, and the disadvantages of G-Sync

AMD FreeSync (Image: AMD)

FreeSync is to AMD what G-Sync is to Nvidia. The major difference between the two is that FreeSync doesn’t use proprietary hardware. It is also based on VESA’s Adaptive-Sync, but works with the regular scaler board found in monitors. This cuts the requirements for using FreeSync down to little more than an AMD GPU.

The main issue with G-Sync is that its dedicated hardware adds a price premium to supported monitors. FreeSync carries no such premium, which makes monitor support an easier process and FreeSync the significantly cheaper solution of the two. This can be a deal-breaker for those choosing between the two solutions for a brand-new PC.

At the end of the day, your pick of syncing technology will mostly depend on your choice of GPU, unless you already own a monitor that supports one of the two and are shopping for a new GPU. If you want to cover your bases, you could also go for a monitor that supports both G-Sync and FreeSync.