What does Vsync actually do? You’ve probably seen it in almost every video game you’ve played, but should you turn it on?
You also probably have heard about G-Sync and FreeSync. Which is the best option of them all?
Vsync, which stands for Vertical Synchronization, limits the number of frames your graphics card is outputting to match the refresh rate of your gaming monitor.
Its purpose is to stop the misalignment known as screen tearing, and it does exactly what its name says: it synchronizes vertically.
The method throttles your graphics card so that it only outputs a frame each time the monitor refreshes.
Good in theory, but V-Sync is not bulletproof.
For example, if your frame rate drops even one frame below your monitor’s refresh rate, the output frame rate is temporarily halved: the graphics card must wait for the next refresh cycle before a new frame can be shown, so a game on a 60 Hz monitor drops to an effective 30 FPS.
If the graphics card hasn’t finished generating a new frame in time, the previous one is shown again, resulting in stuttering.
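This halving effect can be illustrated with a small simulation. This is an illustrative sketch, not a real graphics API: it assumes a 60 Hz monitor and models the V-Sync rule that a finished frame can only be shown on a refresh boundary.

```python
import math

# Sketch: with V-Sync on, a frame is displayed only on a refresh boundary,
# so each frame occupies a whole number of refresh intervals. If the GPU
# takes even slightly longer than one interval, every frame waits for the
# refresh after next, halving the effective frame rate.

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ  # ~16.67 ms per refresh

def effective_fps(frame_time):
    """Effective frame rate when frames may only appear on refreshes."""
    refreshes_per_frame = math.ceil(frame_time / REFRESH_INTERVAL)
    return REFRESH_HZ / refreshes_per_frame

print(effective_fps(1 / 70))  # GPU faster than 60 Hz -> 60.0 FPS
print(effective_fps(1 / 55))  # GPU just under 60 Hz  -> 30.0 FPS
print(effective_fps(1 / 25))  # very slow GPU         -> 20.0 FPS
```

Note how a GPU rendering at 55 FPS is punished all the way down to 30 FPS, which is exactly the drop described above.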
Frames Per Second and Refresh Rate
A few definitions need to be clarified before truly understanding adaptive sync technology and the differences between G-Sync, FreeSync, and Vsync.
First, Frames Per Second (FPS) refers to the number of frames or images that your graphics card can create every second. The more frames that are output per second the smoother the image and the game being played will be.
The Refresh Rate is tied to the monitor and refers to how many frames your monitor can refresh every second. The refresh rate of a monitor is measured in hertz (Hz).
In general, if the amount of FPS your graphics card is producing is equal to or slightly greater than your monitor’s refresh rate then you will have a smooth gameplay experience.
For example, if you have a 120 Hz monitor and your graphics card is outputting a constant 120 FPS, then you’re seeing the smoothest image that your gear can possibly produce.
Most monitors are still 60 Hz, while many gaming monitors nowadays are 120 Hz or higher.
Simply put, the higher the refresh rate, the higher the maximum number of frames per second that can be displayed.
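As a quick sanity check of those numbers, this small sketch converts a refresh rate into the time budget the graphics card has for each frame:

```python
# Sketch: relationship between refresh rate (Hz) and per-frame time budget.
# A monitor refreshing at R Hz can display at most R frames per second,
# which gives the graphics card 1000 / R milliseconds to finish each frame.

def frame_budget_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> at most {hz} FPS shown, "
          f"{frame_budget_ms(hz):.2f} ms per frame")
# 60 Hz  -> 16.67 ms per frame
# 120 Hz -> 8.33 ms per frame
# 144 Hz -> 6.94 ms per frame
```

This is why high refresh rates demand fast hardware: a 144 Hz monitor cuts the graphics card’s time budget to less than half of what a 60 Hz monitor allows.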
The Frame Buffer is basically a temporary storage that receives frames that your graphics card is generating during a game. These frames are then sent to your monitor where they are then displayed.
Typically two buffered images are present at any moment.
A primary buffer, which is the one being shown on your screen, and a secondary buffer, which holds the image to be displayed next.
When the buffers are swapped to display the next frame, the secondary buffer becomes the new primary buffer, and a newly generated frame takes its place as the secondary buffer.
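The swap described above can be sketched in a few lines. The class and frame names here are hypothetical, purely for illustration; real double buffering lives in the graphics driver.

```python
# Illustrative double-buffering sketch (hypothetical names, not a real
# graphics API). The primary buffer is what the monitor shows; the GPU
# draws into the secondary buffer; a swap exchanges their roles.

class FrameBuffers:
    def __init__(self):
        self.primary = "frame 0"    # currently on screen
        self.secondary = "frame 1"  # finished, waiting to be shown
        self.next_id = 2

    def swap(self):
        # The secondary buffer becomes the image on screen...
        self.primary = self.secondary
        # ...and its slot is refilled with a newly generated frame.
        self.secondary = f"frame {self.next_id}"
        self.next_id += 1

bufs = FrameBuffers()
bufs.swap()
print(bufs.primary, "|", bufs.secondary)  # frame 1 | frame 2
```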
Screen Tearing (shown above) usually occurs when you have a powerful graphics card or are running a game that isn’t graphically intensive. It happens because your graphics card is producing more frames per second than your gaming monitor can refresh.
If a frame buffer swap occurs in the middle of a monitor refresh, then your display will be sent parts of two or more images.
Because information from multiple frames reaches the screen within a single refresh, a screen tearing misalignment like in the example above can happen.
G-Sync vs FreeSync
G-Sync is NVIDIA’s patented implementation which requires monitors to include proprietary hardware and costs more.
You are limited to using an NVIDIA graphics card with the monitor if you wish to use G-Sync.
It works under the same principle as FreeSync, but in comparison the current G-Sync monitors fare better at lower frame rates.
Nowadays due to AMD implementing low framerate compensation the two are mostly equal in terms of performance.
The largest drawback for G-Sync is its increased cost which stems from licensing fees.
Something streamers appreciate is a nifty feature exclusive to G-Sync that enables it to work in “borderless windowed” mode, which allows you to quickly Alt+Tab between your game and other programs.
With AMD’s FreeSync technology the monitor automatically changes the refresh rate that it is operating at to match the frame rate of the graphics card.
In order for a monitor to be officially labeled under the FreeSync brand it must live up to AMD’s FreeSync certification standards.
If you want a FreeSync monitor and want to take advantage of its adaptive sync you’ll need an AMD card.
In 2016, AMD announced that FreeSync would be able to work over HDMI, and not just DisplayPort like G-Sync.
The following table summarizes the differences between the two competing technologies:
| Feature | FreeSync Monitor | G-Sync Monitor |
| --- | --- | --- |
| Proprietary module required | No | Yes |
| Can be used with an AMD card | Yes | No |
| Can be used with an NVIDIA card | Yes | Yes |