By Benjy Boxer
VSync is a crucial setting in our game streaming application, and adjusting it is one of the easiest ways to lower latency in your games. Although it’s a really common setting in PC games, many people aren’t aware of what’s happening in the background to produce a smoother video, or why you’ll see screen tearing when it’s off.
What’s Happening When Your Screen Refreshes?
Monitors have a refresh rate that is typically between 60 and 144 Hz. One Hz, or hertz, means one cycle per second, so 60 Hz is 60 cycles per second. During each refresh cycle, the image is drawn from left to right, and top to bottom.
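To get a feel for the numbers, here’s a tiny Python sketch (the `frame_time_ms` helper is just for illustration, not part of any real API) that converts a refresh rate into the time the monitor spends on one cycle:

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Milliseconds the monitor spends on one full refresh cycle."""
    return 1000.0 / refresh_hz

print(round(frame_time_ms(60), 2))   # a 60 Hz monitor refreshes every ~16.67 ms
print(round(frame_time_ms(144), 2))  # a 144 Hz monitor refreshes every ~6.94 ms
```

That ~16.67 ms window at 60 Hz is the budget everything below — rendering, buffer swaps, and VSync waits — has to fit into.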
During every cycle, your monitor has an image drawn to it really quickly. Internally, each draw is a swap of a back buffer to the front buffer. Your GPU typically has 1–3 buffers. Two buffers is called “double buffering” and is most common. Three buffers is aptly named “triple buffering” and is used in some situations (most of the time on Macs). For simplicity, the rest of this article assumes you’re running a GPU with double buffering.

With double buffering, your GPU has a back buffer and a front buffer. The front buffer is what you see; the back buffer is what the GPU is writing to. So, the program (in this case, a game) sends data to the GPU, which renders it into the back buffer. The buffers are then swapped, and the rendered frame in the front buffer is sent to the display.

It gets really confusing, however, when you have multiple GPUs in your machine, which most computers do because many CPUs have an integrated GPU. If you plug your monitor into the GPU doing the rendering, writing to the back buffer and displaying via the front buffer both happen on the same GPU. If you plug your monitor into a port on your motherboard, you’ll be rendering with your discrete graphics card (Nvidia or AMD typically) and displaying with your Intel or AMD integrated GPU. You should try to avoid this because passing the frame around the machine adds latency.
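The double-buffering dance can be sketched in a few lines of Python. This is a toy model (the `DoubleBufferedGPU` class and its method names are invented for illustration), but it captures the key idea: the swap is a pointer exchange, not a copy, so a finished frame becomes visible instantly while the old front buffer is recycled for rendering.

```python
class DoubleBufferedGPU:
    """Toy model of double buffering: render into the back buffer,
    then swap so the finished frame becomes the front buffer."""

    def __init__(self):
        self.front = None  # what the display scans out
        self.back = None   # what the game renders into

    def render(self, frame):
        self.back = frame  # GPU writes the new frame off-screen

    def swap(self):
        # Pointer swap, not a copy: back becomes front and vice versa.
        self.front, self.back = self.back, self.front

gpu = DoubleBufferedGPU()
gpu.render("frame 1")
gpu.swap()
print(gpu.front)  # "frame 1" is now visible; the back buffer is free for frame 2
```

Everything VSync does boils down to *when* that `swap()` is allowed to happen relative to the monitor’s scan.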
If you want to see the refresh in action, you can record your computer monitor with a camera. When you’re watching a video of your computer monitor, you can sometimes see lines flowing down the screen. This happens when the camera’s capture rate doesn’t match the monitor’s refresh rate, for example when recording a 60 Hz screen with a camera capturing at 50 frames per second.
How This Impacts PC Gaming
In PC gaming, if VSync is off, you’ll see tearing because you’re seeing the split between two images. The buffers are swapped while the display is still mid-scan, so the monitor finishes drawing the frame from the new front buffer. Since video is just a series of still images, part of the screen shows the next image while the rest still shows the previous one. That tear is your brain interpreting two slightly different images. This reduces lag, but it can be annoying for gamers who find tearing distracting.
VSync Times The Buffer Swapping
VSync is a timing mechanism that uses the clock on the GPU to time when the next frame should be drawn on the monitor. If VSync is off, the GPU swaps and sends the front buffer to the monitor regardless of where the monitor is in its refresh cycle. If VSync is on, it times the swap of back to front so the front buffer is always fully written before it’s sent to the display. You’re basically locking the back buffer until the front buffer is done drawing. Although every monitor is different, the time it takes to draw an image on the screen is typically a few milliseconds. When the GPU is told the monitor has fully completed its drawing, it swaps the front and back buffers: back becomes front and vice versa. This results in a much smoother video, but it definitely adds latency, because the GPU is purposefully waiting to tell the monitor to draw the image.
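Here’s a minimal sketch of that timing trade-off, assuming a fixed 60 Hz monitor (the `display_time` function is hypothetical, just to make the latency concrete). With VSync off, a finished frame is shown immediately; with VSync on, it waits for the next refresh boundary:

```python
import math

REFRESH_MS = 1000.0 / 60  # ~16.67 ms per refresh on an assumed 60 Hz monitor

def display_time(render_done_ms: float, vsync: bool) -> float:
    """When a frame that finished rendering at render_done_ms starts
    being scanned out. With VSync, the swap waits for the next refresh
    boundary; without it, the swap happens at once (risking a tear)."""
    if not vsync:
        return render_done_ms
    return math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS

# A frame that finishes 5 ms into a refresh cycle:
print(round(display_time(5.0, vsync=False), 2))  # 5.0   -> shown immediately
print(round(display_time(5.0, vsync=True), 2))   # 16.67 -> waits ~11.67 ms
```

That wait of up to one full refresh interval is exactly the latency VSync trades for tear-free video.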
At this point, we should also mention what’s happening with GSync and FreeSync. These are alternatives to VSync sponsored by Nvidia and AMD. Rather than the GPU waiting for the monitor’s refresh cycle to swap buffers, the GPU actually tells the monitor to change its refresh rate whenever it gets a fully rendered frame, giving you the VSync benefit of no tearing plus the lower latency you would get with VSync off. This technology is included in newer Nvidia and AMD GPUs, and you’ll also need a monitor that supports it.
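Continuing the toy 60 Hz model from above (again, these helper names are invented for illustration), the difference shows up as the wait a finished frame incurs before it can be displayed:

```python
REFRESH_MS = 1000.0 / 60  # fixed 60 Hz refresh interval (assumed)

def vsync_wait_ms(render_done_ms: float) -> float:
    """Extra time a finished frame waits for the next fixed refresh tick."""
    return (-render_done_ms) % REFRESH_MS

def vrr_wait_ms(render_done_ms: float) -> float:
    """With G-Sync/FreeSync, the monitor refreshes on demand when the
    frame arrives, so the frame never waits for a fixed tick."""
    return 0.0

# A frame that finishes 5 ms into a 60 Hz cycle waits ~11.67 ms under VSync:
print(round(vsync_wait_ms(5.0), 2))  # 11.67
print(vrr_wait_ms(5.0))              # 0.0
```

In practice, real variable-refresh monitors have a supported refresh range rather than truly arbitrary timing, but the zero-wait intuition is the core benefit.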
How Does VSync Impact Parsec?
Just like a game, the Parsec window is being drawn to your screen. We use OpenGL to draw frames and manage frame buffers. We run into the same frame buffer and drawing issues that you would get if your monitor were plugged directly into your gaming PC, and we manage the timing of drawing the frame via mechanisms similar to what would happen directly on the GPU without Parsec. With VSync on in the Parsec application, we’re timing your frames to make sure they appear smooth on the screen. With VSync off, there’s no timing added, and you’ll see the same tearing you’d expect on a gaming PC directly connected to a monitor with VSync off. There’s a caveat here, however: some Intel integrated GPUs apply VSync no matter what, so if VSync is on in Parsec, the client may actually VSync twice.
VSync In The Game Versus VSync In Parsec
Yeah…Things get a little confusing here. Parsec gets the frame from the host when it’s finished being rendered to the back buffer(s). VSync on the host in theory shouldn’t be needed, but many drivers, games, and other factors can impact how gracefully the system handles providing the fully-rendered frame to Parsec. The short answer: It’s a subjective setting for each different system and game, but most of the time, turning VSync off on the host is a snappier experience. If you’re optimizing for lowering lag, we recommend turning VSync off in the game and in Parsec. If you’re optimizing for smoothness, we recommend turning VSync on in both. If you’re up for experimenting, you can try messing around with the combination of the two.