There are things that are not just difficult to write about, but that you have to see for yourself once rather than hear about a hundred times or read about on the Internet. For example, it is impossible to describe natural wonders such as the majestic Grand Canyon or the snow-capped Altai Mountains. You can look at beautiful pictures of them a hundred times and admire videos, but none of it can replace a live impression.

The topic of smooth frame output on a monitor using Nvidia G-Sync technology is one of them: text descriptions make the changes seem unremarkable, but within the first minutes of playing a 3D game on a system with an Nvidia Geforce video card driving a G-Sync monitor, it becomes clear how big the qualitative leap is. And although more than a year has passed since the technology was announced, it has not lost its relevance: it still has no competitors among solutions that have actually reached the market, and the corresponding monitors continue to be produced.

Nvidia has been working for quite some time on improving the visual experience of Geforce GPU users in modern games by making rendering smoother. Recall the adaptive synchronization technology Adaptive V-Sync, a hybrid of the modes with vertical sync enabled and disabled (V-Sync On and V-Sync Off, respectively): when the GPU renders at a frame rate below the monitor refresh rate, synchronization is disabled, and when FPS exceeds the refresh rate, it is enabled.

Adaptive V-Sync didn't solve all the smoothness issues, but it was an important step in the right direction. But why was it necessary to create special synchronization modes at all, let alone release combined software and hardware solutions? What's wrong with technologies that have been around for decades? Today we'll explain how Nvidia G-Sync technology eliminates all the familiar display artifacts: image tearing, stuttering, and increased lag.

Looking ahead, we can say that G-Sync allows smooth frame delivery with the highest possible performance and comfort, and this is very noticeable when playing on such a monitor: the difference is apparent even to the average home user, and for avid gamers it can mean better reaction times and, with them, better in-game results.

Today, most PC gamers use monitors with a refresh rate of 60 Hz, the typical and currently most popular LCD screens. Accordingly, both with synchronization enabled (V-Sync On) and with it disabled, there are always shortcomings stemming from the basic problems of these old technologies, which we will discuss below: high latency and FPS jerks with V-Sync on, and unpleasant image tearing with it off.

And since latency and uneven frame delivery are more disruptive and annoying in a game, few players turn synchronization on at all. Even the monitors with 120 and 144 Hz refresh rates that have appeared on the market cannot eliminate the problems completely; they merely make them somewhat less noticeable by updating the screen twice as often, but the same artifacts remain: lag and the absence of that same comfortable smoothness.

And since monitors with G-Sync, paired with an appropriate Nvidia Geforce video card, can provide not only a high refresh rate but also eliminate all these shortcomings, purchasing such a solution can be considered even more important than upgrading to a more powerful GPU. But let's first figure out why it was necessary to do something different from the long-known solutions - what is the problem, exactly?

Problems with existing video output methods

Technologies for displaying images on a screen with a fixed refresh rate date back to the era of cathode ray tube (CRT) monitors. Most readers should remember them - pot-bellied, just like old televisions. These technologies were originally developed to display television images at a fixed frame rate, but in the case of devices displaying 3D images dynamically computed on a PC, this approach raises major problems that have still not been resolved.

Even the most modern LCD monitors have a fixed refresh rate, although technologically nothing prevents changing the picture on them at any time, at any frequency (within reasonable limits, of course). Yet since the days of CRT monitors, PC gamers have been forced to put up with a decidedly imperfect solution to the problem of synchronizing the 3D rendering frame rate with the monitor refresh rate. Until now there have been only two output options, and both of them have drawbacks.

The root of all the problems is that with a fixed monitor refresh rate, the video card renders each frame in a different amount of time - a consequence of the constantly changing scene complexity and GPU load. The rendering time of each frame is not constant; it changes from frame to frame. It is no wonder that synchronization problems arise when trying to display a stream of such frames, since some of them take much longer to render than others. As a result, each frame has a different preparation time: say, 10 ms for one and 25 ms for another. And monitors that existed before the advent of G-Sync could only display frames at fixed intervals - no earlier, no later.
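To make the mismatch concrete, here is a minimal sketch (our illustration with made-up frame times, not anything from Nvidia) showing how variable render times fall against a fixed 60 Hz refresh grid:

```python
# Our illustration with hypothetical frame times - not vendor code.
REFRESH_MS = 1000 / 60  # fixed monitor refresh period at 60 Hz, ~16.7 ms

frame_times_ms = [10, 25, 14, 30, 12, 18]  # hypothetical GPU render times

t = 0.0
for i, ft in enumerate(frame_times_ms, start=1):
    t += ft  # moment the GPU finishes this frame
    next_refresh = -(-t // REFRESH_MS) * REFRESH_MS  # ceiling to the next refresh
    print(f"frame {i}: ready at {t:6.1f} ms, "
          f"next refresh at {next_refresh:6.1f} ms (idle wait {next_refresh - t:4.1f} ms)")
```

Every frame that misses a refresh boundary must either wait for the next one or be shown partially - exactly the two options discussed below.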

The matter is further complicated by the wealth of software and hardware configurations of gaming PCs, combined with very different loads depending on the game, quality settings, video driver settings, and so on. As a result, it is impossible to tune each gaming system so that rendering proceeds with constant, or at least similar, frame times in all 3D applications and conditions - something that is possible on game consoles with their single hardware configuration.

Naturally, unlike consoles with their predictable frame rendering times, PC players are still seriously limited in their ability to achieve a smooth gaming experience without noticeable drops and lag. In an ideal case (read: impossible in reality), the image on the monitor would be updated strictly after the graphics processor finishes calculating and preparing each frame:

As you can see, in this hypothetical example the GPU always finishes drawing a frame before it needs to be transferred to the monitor: the frame time is always slightly less than the interval between display updates, and in between the GPU rests a little. In reality, everything is completely different - frame rendering times vary widely. Imagine the GPU failing to render a frame in the allotted time: then the frame must either be displayed later, skipping one image update on the monitor (vertical synchronization enabled - V-Sync On), or frames must be output in pieces with synchronization disabled, in which case the monitor will simultaneously show fragments of several adjacent frames.

Most users turn V-Sync off to get lower latency and faster frame delivery to the screen, but this solution introduces visible artifacts in the form of image tearing. With synchronization enabled there is no tearing, since frames are displayed exclusively in their entirety, but the delay between the player's action and the image update on the screen increases, and the frame delivery rate is very uneven, since the GPU never draws frames in strict accordance with the monitor's update times.

This problem has existed for many years and clearly interferes with the comfort of viewing 3D rendering, yet for a long time nobody bothered to solve it. And the solution, in theory, is quite simple: display information on the screen strictly when the GPU finishes working on the next frame. But first, let's take a closer look at how the existing image output technologies work, and then at the solution Nvidia offers in its G-Sync technology.

Disadvantages of output when synchronization is disabled

As we have already mentioned, the vast majority of players prefer to keep synchronization turned off (V-Sync Off) so that the frames drawn by the GPU reach the monitor as quickly as possible, with minimal delay between the player's actions (key presses, mouse commands) and their display. For serious players this is necessary for victories, and for ordinary players the experience simply feels more responsive. This is how working with V-Sync disabled looks schematically:

Frames are output without any waiting or added delay. But although disabled vertical synchronization solves the lag problem as far as possible, providing minimal latency, artifacts appear in the image - picture tearing, when the image on the screen consists of several pieces of adjacent frames drawn by the GPU. The lack of smoothness is also noticeable, due to the uneven delivery of frames from the GPU to the screen - the image breaks in different places.

This image tearing occurs when the image on screen is composed of two or more frames rendered by the GPU during a single monitor refresh cycle: of several frames when the frame rate exceeds the monitor refresh rate, and of two when it roughly matches it. Look at the diagram above: if the contents of the frame buffer are updated midway between the moments when information is output to the monitor, the final image will be distorted - part of the information belongs to the previous frame, and the rest to the current one being drawn.

With synchronization disabled, frames are transmitted to the monitor with no regard whatsoever to its refresh rate or timing, and therefore never coincide with it. In other words, with V-Sync disabled, a monitor without G-Sync support will always exhibit such tearing.
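Where exactly the tear line appears follows from the moment of the buffer swap relative to scan-out; here is a rough back-of-the-envelope sketch (our simplification, assuming scan-out progresses linearly from top to bottom):

```python
# Rough illustration (not vendor code): if the frame buffer is swapped while
# the monitor is still scanning out, the tear appears at the scan position
# reached at the moment of the swap.
REFRESH_MS = 1000 / 60   # one scan-out takes ~16.7 ms at 60 Hz
SCREEN_ROWS = 1080       # vertical resolution

def tear_row(swap_time_into_scan_ms):
    """Approximate screen row where tearing appears for a mid-scan swap."""
    fraction = swap_time_into_scan_ms / REFRESH_MS
    return int(fraction * SCREEN_ROWS)

# A swap 8 ms into the ~16.7 ms scan puts the tear near mid-screen:
print(tear_row(8.0))  # -> 518
```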

The point is not only that it is unpleasant for the player to see stripes twitching all over the screen, but also that the simultaneous display of parts of different frames can mislead the brain, which is especially noticeable with dynamic objects in the frame - the player sees parts of objects shifted relative to each other. You put up with this only because disabling V-Sync currently provides minimal output delay, though far from ideal dynamic image quality, as you can see in the following examples (click the frames for full resolution):

In the examples above, captured with the FCAT software and hardware suite, you can see that the real image on screen can be composed of pieces of several adjacent frames - sometimes unevenly, when a narrow strip is taken from one frame and the neighboring ones occupy the remaining (noticeably larger) part of the screen.

Problems with image tearing are even more visible in motion (if your system and/or browser does not support playing MP4/H.264 video at a resolution of 1920x1080 pixels and 60 FPS, you will have to download the files and view them locally with a media player that has the corresponding capabilities):

As you can see, even in motion the unpleasant artifacts in the form of picture breaks are easily noticeable. Let's see how this looks schematically, in a diagram showing the output method with synchronization disabled. Here, frames arrive at the monitor as soon as the GPU finishes rendering them, and the image is sent to the display even if scan-out of the current frame has not yet completed - the remaining part of the buffer falls on the next screen update. That is why each frame in our example displayed on the monitor consists of two frames drawn on the GPU, with an image break at the place marked in red.

In this example, the first frame (Draw 1) is drawn by the GPU into the screen buffer faster than the 16.7 ms refresh interval, and before the image is fully transferred to the monitor (Scan 0/1). The GPU immediately starts working on the next frame (Draw 2), which breaks the picture on the monitor, the other half of which still contains the previous frame.

As a result, in many cases a clearly visible stripe appears on the image - the boundary between the partial display of adjacent frames. Further on, this process repeats, since the GPU works on each frame for a different amount of time and, without synchronizing the process, the frames from the GPU and those displayed on the monitor never match.

Pros and cons of V-Sync

When traditional vertical synchronization (V-Sync On) is enabled, the information on the monitor is updated only when the GPU has completely finished its work on a frame, which eliminates tearing, because frames are displayed exclusively in their entirety. But since the monitor updates its contents only at fixed intervals (depending on the characteristics of the output device), this binding brings other problems.

Most modern LCD monitors update information at a rate of 60 Hz, that is, 60 times per second - approximately every 16.7 milliseconds. And with synchronization enabled, the image output time is strictly tied to the monitor's refresh rate. But as we know, the GPU rendering rate is always variable, and the time it takes to render each frame varies with the constantly changing complexity of the 3D scene and the quality settings.

That time will not always equal 16.7 ms; it will be either less or more. With synchronization enabled, the GPU's work on each frame again finishes either earlier or later than the screen refresh. If the frame was rendered faster, there is no particular problem: the visual information simply waits for the monitor update so the whole frame can be shown on screen, while the GPU idles. But if the frame is not rendered in the allotted time, it has to wait for the next image update cycle on the monitor, which increases the delay between the player's actions and their visual display on screen. Meanwhile, the image of the previous, "old" frame is shown on the screen once again.

Although all this happens quite quickly, the increase in latency is easy to notice visually, and not only by professional players. And since frame rendering time is always variable, tying output to the monitor's refresh rate causes jerks when displaying a dynamic image, because frames are displayed either at full speed (matching the monitor's refresh rate) or two, three, or four times slower. Let's look at a schematic example of such operation:

The illustration shows how frames are displayed on the monitor with vertical synchronization on (V-Sync On). The first frame (Draw 1) is rendered by the GPU in less than 16.7 ms, so the GPU does not immediately move on to drawing the next frame - which would tear the image, as happens with V-Sync Off - but waits for the first frame to be output to the monitor in full. Only after that does it start drawing the next frame (Draw 2).

But work on the second frame (Draw 2) takes longer than 16.7 ms, so when that interval expires, the visual information from the previous frame is displayed on screen, and it stays there for another 16.7 ms. And even after the GPU finishes the next frame, it is not shown immediately, since the monitor has a fixed refresh rate. In total, the second frame takes 33.3 ms to be output, and all of this time is added to the delay between the player's action and the frame appearing on the monitor.
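In other words, with V-Sync On the effective output time of a frame is always rounded up to a whole number of refresh periods. A few lines of arithmetic (our illustration) reproduce the 16.7 ms and 33.3 ms figures from this example:

```python
import math

REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh at 60 Hz

def vsync_display_time(render_ms):
    """With V-Sync On, output waits for the next whole refresh period."""
    periods = math.ceil(render_ms / REFRESH_MS)
    return periods * REFRESH_MS

print(round(vsync_display_time(15.0), 1))  # 16.7 -> shown at the next refresh
print(round(vsync_display_time(18.0), 1))  # 33.3 -> a whole extra 16.7 ms added
```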

Added to the problem of lag is a loss of smoothness in the video sequence, noticeable as jerkiness in the 3D animation. The problem is shown very clearly in a short video:

But even the most powerful graphics processors in demanding modern games cannot always provide a frame rate high enough to exceed the typical monitor refresh rate of 60 Hz. Accordingly, they will not let you play comfortably with synchronization on and without problems like picture tearing - especially in games such as the online shooter Battlefield 4, or the very demanding Far Cry 4 and Assassin's Creed Unity at high resolutions and maximum settings.

In other words, the modern player has little choice: either accept a lack of smoothness and increased delays, or be content with imperfect picture quality and broken pieces of frames. Of course, in reality everything doesn't look quite so bad - after all, we somehow played all this time, right? But in an era when people strive for the ideal in both quality and comfort, you want more. Moreover, LCD displays are fundamentally capable of outputting frames whenever the graphics processor tells them to. All that remained was to connect the GPU and monitor, and such a solution now exists: Nvidia G-Sync technology.

G-Sync technology - Nvidia's solution to problems

So, most modern games cause picture tearing with synchronization off, and uneven frame delivery and increased delays with it on. Even high refresh rates do not let traditional monitors escape these problems. Nvidia's engineers were apparently so fed up with the years-long choice between two less-than-ideal ways of displaying frames in 3D applications that they decided to get rid of the problems by giving players a fundamentally new approach to updating information on the display.

The difference between G-Sync and existing display methods is that, in Nvidia's variant, the timing and rate of frame output are determined by the Geforce GPU, and the rate is dynamically variable rather than fixed, as before. In other words, the GPU takes full control of frame output: as soon as it finishes working on the next frame, the frame is shown on the monitor, with no delays and no image tearing.

Using such a link between the GPU and specially adapted monitor hardware gives players a better output method - simply ideal in terms of quality, free of all the problems mentioned above. G-Sync ensures perfectly smooth frame changes on the monitor, without any delays, jerks, or artifacts caused by the way visual information is brought to the screen.

Naturally, G-Sync does not work by magic: on the monitor side, the technology requires additional hardware logic in the form of a small board supplied by Nvidia.

The company works with monitor manufacturers to include G-Sync boards in their gaming display models. For some models there is even a do-it-yourself upgrade option, but it costs more and makes little sense, since it is easier to buy a G-Sync monitor outright. On the PC side, it is enough to have any modern Nvidia Geforce video card and a G-Sync-optimized video driver installed - any recent version will do.

When Nvidia G-Sync technology is enabled, the Geforce graphics processor, after finishing each frame of the 3D scene, sends a special signal to the G-Sync controller board built into the monitor, which tells the monitor when to update the image on screen. This achieves simply perfect smoothness and responsiveness when playing on a PC - you can verify this by watching a short video (necessarily at 60 frames per second!):

Let's see what the configuration looks like with G-Sync technology enabled, according to our diagram:

As you can see, everything is very simple. Enabling G-Sync ties the monitor's refresh to the end of each frame's rendering on the GPU. The GPU fully controls the process: as soon as it finishes rendering a frame, the image is immediately shown on a G-Sync-compatible monitor, so the display refresh rate is not fixed but variable, exactly matching the GPU's frame rate. This eliminates image tearing (the screen always contains information from a single frame), minimizes stuttering (the monitor never waits longer than it takes the GPU to physically prepare a frame), and reduces output lag compared with V-Sync enabled.
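The timing difference can be sketched in a few lines of code. This is our simplified model (it ignores buffering details and the GPU stalling while it waits), not Nvidia's implementation, but it captures when each method lets a frame reach the screen:

```python
import math

REFRESH_MS = 1000 / 60  # classic 60 Hz monitor

def display_times(frame_times_ms, mode):
    """When each frame reaches the screen (simplified: no buffering effects)."""
    shown, ready = [], 0.0
    for ft in frame_times_ms:
        ready += ft  # the GPU finishes the frame at this moment
        if mode == "gsync":
            shown.append(round(ready, 1))  # shown as soon as it is ready
        else:  # classic V-Sync On: wait for the next fixed refresh
            shown.append(round(math.ceil(ready / REFRESH_MS) * REFRESH_MS, 1))
    return shown

frames = [14, 20, 25, 15]  # hypothetical render times, ms
print("G-Sync:", display_times(frames, "gsync"))  # [14, 34, 59, 74]
print("V-Sync:", display_times(frames, "vsync"))  # [16.7, 50.0, 66.7, 83.3]
```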

It must be said that players were clearly starved for such a solution. The new method of synchronizing the GPU and monitor, Nvidia G-Sync, really has a very strong effect on the comfort of playing on a PC: that almost perfect smoothness appears which was not there before - in our age of super-powerful video cards! Since the announcement of G-Sync, the old methods have instantly come to feel like anachronisms, and upgrading to a G-Sync monitor capable of a variable refresh rate up to 144 Hz looks like a very attractive option that finally lets you get rid of the problems, lags, and artifacts.

Does G-Sync have disadvantages? Of course, like any technology. For example, G-Sync has an unpleasant limitation: it provides smooth frame output on screen only at 30 FPS and above. And the refresh rate selected for the monitor in G-Sync mode sets the upper limit on how fast the screen contents can be refreshed. That is, with the refresh rate set to 60 Hz, maximum smoothness is provided at 30-60 FPS, and at 144 Hz at 30-144 FPS, but not below the lower limit. With a variable frame rate below that (say, from 20 to 40 FPS), the result will no longer be ideal, although it is noticeably better than traditional V-Sync.
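Schematically, the behavior across the FPS range looks like this (a simplified model based on the limits just described; the rates and limits are parameters for illustration, not Nvidia's spec):

```python
def gsync_behavior(fps, max_refresh_hz=144, min_refresh_hz=30):
    """Simplified model of what drives the refresh in each FPS region."""
    if fps > max_refresh_hz:
        return f"capped at {max_refresh_hz} Hz (monitor's upper limit)"
    if fps >= min_refresh_hz:
        return f"refresh follows the GPU: {fps} Hz, perfectly smooth"
    return "below the G-Sync floor: smoothness degrades"

for fps in (200, 90, 45, 20):
    print(fps, "->", gsync_behavior(fps))
```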

But the main disadvantage of G-Sync is that it is Nvidia's proprietary technology, which competitors have no access to. That is why, at the beginning of this year, AMD announced the similar FreeSync technology, which likewise dynamically changes the monitor's refresh rate to match frame preparation on the GPU. An important difference is that AMD's development is open and does not require additional hardware in the form of specialized monitors: FreeSync was transformed into Adaptive-Sync, which became an optional part of the DisplayPort 1.2a standard from the well-known organization VESA (Video Electronics Standards Association). It turns out AMD will skillfully turn the theme developed by its competitor to its own advantage - without the advent and popularization of G-Sync, we suspect, there would be no FreeSync.

Interestingly, Adaptive-Sync technology is also part of VESA's embedded DisplayPort (eDP) standard and is already used in many display components that use eDP for signal transmission. Another difference from G-Sync is that VESA members can use Adaptive-Sync without paying anything. Nevertheless, it is very likely that Nvidia will also support Adaptive-Sync as part of DisplayPort 1.2a in the future, since such support would not require much effort. But the company will not give up G-Sync either, as it considers its own solutions a priority.

The first monitors with Adaptive-Sync support should appear in the first quarter of 2015. They will not only have DisplayPort 1.2a ports but also dedicated Adaptive-Sync support (not every monitor with DisplayPort 1.2a will be able to boast of it). Thus, in March 2015 Samsung plans to launch the UD590 (23.6 and 28 inches) and UE850 (23.6, 27, and 31.5 inches) monitor lines with UltraHD resolution and Adaptive-Sync support. AMD claims that monitors with this technology will be up to $100 cheaper than comparable G-Sync devices, but it is hard to compare them, since all monitors are different and come out at different times. Besides, there are already G-Sync models on the market that are not all that expensive.

Visual difference and subjective impressions

We described the theory above; now it is time to show everything in practice and share our impressions. We tested Nvidia G-Sync in several 3D applications using an Inno3D iChill Geforce GTX 780 HerculeZ X3 Ultra graphics card and an Asus PG278Q monitor with G-Sync support. Several G-Sync monitor models from different manufacturers are on the market: Asus, Acer, BenQ, AOC, and others, and for the Asus VG248QE you can even buy a kit to upgrade it to G-Sync support yourself.

The lowest-end video card supporting G-Sync is the Geforce GTX 650 Ti Boost, with the essential requirement of a DisplayPort connector on board. Other system requirements include Microsoft Windows 7 or later and a good DisplayPort 1.2 cable; a high-quality mouse with high sensitivity and polling rate is also recommended. G-Sync works with all full-screen 3D applications using the OpenGL and Direct3D graphics APIs on Windows 7 and 8.1.

Any recent driver will do: G-Sync has been supported by all of the company's drivers for more than a year. If you have all the required components, you only need to enable G-Sync in the drivers, if that has not already been done, and the technology will work in all full-screen applications - and only in them, which follows from the very principle of the technology.

To enable G-Sync for full-screen applications and get the best experience, enable the 144 Hz refresh rate in the Nvidia Control Panel or the operating system's display settings. Then make sure the technology is allowed on the corresponding "G-Sync Setup" page...

And also select the appropriate item in the "Vertical Sync" setting on the "Manage 3D Settings" page under global 3D settings. There you can also disable G-Sync for testing purposes or if any problems arise (looking ahead: we found none during our testing).

G-Sync works at all resolutions the monitors support, up to UltraHD, but in our case we used the native resolution of 2560x1440 pixels at 144 Hz. In our comparisons with the current state of affairs, we used a 60 Hz mode with G-Sync disabled to emulate the behavior of the typical non-G-Sync monitors that most gamers have - most of them use Full HD monitors capable of at most 60 Hz.

It is definitely worth mentioning that although with G-Sync enabled the screen is refreshed at the ideal moment - when the GPU "wants" it - the optimal mode is still rendering at about 40-60 FPS. This is the most suitable frame rate for modern games: not so low as to hit the lower limit of 30 FPS, yet not requiring the settings to be lowered. Incidentally, this is the frame rate that Nvidia's Geforce Experience program aims for, providing appropriate settings for popular games in the software of the same name supplied with the drivers.

In addition to games, we also tried a specialized test application from Nvidia - the Pendulum Demo. It shows a 3D pendulum scene convenient for assessing smoothness and quality, lets you simulate different frame rates, and lets you select the display mode: V-Sync Off/On and G-Sync. With this test software it is very easy to show the difference between the synchronization modes - for example, between V-Sync On and G-Sync:

The Pendulum Demo application lets you test different synchronization methods under different conditions. It can simulate an exact frame rate of 60 FPS to compare V-Sync and G-Sync under ideal conditions for the outdated method - in this mode there should simply be no difference between them. But the 40-50 FPS mode puts V-Sync On in an awkward position: delays and unsmooth frame changes are visible to the naked eye, since frame rendering time exceeds the refresh period at 60 Hz. With G-Sync on, everything becomes perfect.

As for comparing V-Sync Off and G-Sync, the Nvidia application also helps to see the difference here: at frame rates between 40 and 60 FPS, image tearing is clearly visible, although there is less lag than with V-Sync On. And the video sequence even looks less smooth than in G-Sync mode, although in theory it should not - perhaps this is how the brain perceives the "broken" frames.

With G-Sync enabled, any mode of the test application (constant or variable frame rate - it doesn't matter) always produces the smoothest possible video. In games, all the problems of the traditional approach to updating a fixed-refresh monitor are sometimes even more noticeable; here you can clearly evaluate the difference between all three modes using the example of StarCraft II (playback of a previously saved replay):

If your system and browser can play MP4/H.264 video at 60 FPS, you will clearly see obvious picture tearing in the disabled-synchronization mode, and jerks and stutter with V-Sync On. All of this disappears when Nvidia G-Sync is on: no image artifacts, no added delays, no "ragged" frame rate.

Of course, G-Sync is not a magic wand, and the technology will not rid you of delays and slowdowns whose causes lie outside the process of outputting frames to a fixed-refresh monitor. If the game itself has problems with frame delivery - big FPS jerks caused by texture loading, CPU-side processing, suboptimal video memory management, lack of code optimization, and so on - they will remain in place. Moreover, they become even more noticeable, since the output of the remaining frames is perfectly smooth. In practice, however, such problems are not too frequent on powerful systems, and G-Sync really does improve the perception of dynamic video.

Since Nvidia's new output technology affects the entire output pipeline, it could theoretically cause artifacts and uneven frame rates, especially if the game artificially caps FPS at some point. Such cases, if they exist, are probably so rare that we did not notice a single one. But we did note a clear improvement in gaming comfort: playing on a monitor with G-Sync enabled, you get the impression that the PC has become so much more powerful that it can sustain a constant frame rate of at least 60 FPS without any drops.

The feeling you get playing on a G-Sync monitor is very difficult to put into words. The difference is especially noticeable at 40-60 FPS, a frame rate very common in demanding modern games. The difference compared with conventional monitors is simply amazing, and we will try not only to describe it in words and show it in video examples, but also to show frame rate graphs obtained under the different display modes.

In genres such as real-time strategy and the like - StarCraft II, League of Legends, DotA 2, and so on - the advantages of G-Sync are clearly visible, as the video above demonstrates. Such games demand fast-paced action and do not tolerate delays or unsmooth frame rates, and smooth scrolling plays an important role in comfort, which is badly hurt by picture tearing with V-Sync Off and by delays and lags with V-Sync On. So G-Sync is ideal for games of this type.

First-person shooters like Crysis 3 and Far Cry 4 are even more common; they are also very demanding on computing resources, and at high quality settings players often get frame rates of just 30-60 FPS - ideal conditions for G-Sync, which genuinely improves comfort in such circumstances. The traditional vertical sync method will very often force frames out at only 30 FPS, adding lag and jerks.

The same goes for third-person games like the Batman, Assassin's Creed, and Tomb Raider series. These games also use the latest graphics technology and require fairly powerful GPUs to achieve high frame rates. At maximum settings with V-Sync disabled, frame rates in these games often fall in the 30-90 FPS range, which causes unpleasant tearing. Enabling V-Sync helps only in some scenes with lower resource requirements, and the frame rate jumps in steps between 30 and 60 FPS, which causes slowdowns and jerks. Turning on G-Sync solves all of these problems, and this is clearly noticeable in practice.

Practical test results

In this section we'll look at the impact of G-Sync and V-Sync on frame rates - the performance graphs give a clear idea of how the different technologies behave. During testing we tried several games, but not all of them are convenient for showing the difference between V-Sync and G-Sync: some gaming benchmarks do not allow forcing V-Sync, other games lack a convenient means of replaying the exact same game sequence (most modern games, unfortunately), and still others run on our test system either too fast or within too narrow a frame rate range.

So we settled on Just Cause 2 at maximum settings, as well as a couple of benchmarks: Unigine Heaven and Unigine Valley, also at maximum quality. Frame rates in these applications vary quite widely, which suits our purpose of showing what happens to frame output under different conditions.

Unfortunately, we do not currently have the FCAT software and hardware system at our disposal, so we cannot show graphs of real FPS alongside videos recorded in the different modes. Instead, we measured per-second average and instantaneous frame rates with a well-known utility, at monitor refresh rates of 60 and 120 Hz with V-Sync On, V-Sync Off, and Adaptive V-Sync, and with G-Sync at 144 Hz, to show the clear difference between the new technology and today's 60 Hz monitors with traditional vertical synchronization.

G-Sync vs V-Sync On

We begin our study by comparing vertical synchronization enabled (V-Sync On) against G-Sync - the most revealing comparison, since both methods are free of the image tearing flaw. First, the Heaven test application at maximum quality settings at 2560x1440 (clicking a thumbnail opens the graph in full resolution):

As can be seen in the graph, the frame rate with G-Sync and with synchronization disabled is almost the same, apart from the region above 60 FPS. But the FPS in the mode with vertical synchronization enabled is noticeably different, because with V-Sync the frame rate can only be 60 FPS divided by a whole number (1, 2, 3, 4, 5, 6, ...), since the monitor must sometimes show the same frame over several refresh periods (two, three, four, and so on). The possible "steps" of the frame rate with V-Sync On and 60 Hz are therefore 60, 30, 20, 15, 12, 10, ... FPS.

This gradation is clearly visible in the red line of the graph: during this test run the frame rate was often 20 or 30 FPS, and much more rarely 60 FPS. With G-Sync and V-Sync Off (No Sync), by contrast, it often fell in a wider 35-50 FPS range. With V-Sync enabled such output rates are impossible, so in those cases the monitor always shows 30 FPS, limiting performance and adding lag to the overall output time.
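These "steps" are simply the refresh rate divided by successive whole numbers, which also yields the 120 Hz values used later in this article; a quick illustration:

```python
def vsync_steps(refresh_hz, count=8):
    """Possible average frame rates with V-Sync On: refresh / 1, 2, 3, ..."""
    return [round(refresh_hz / k, 1) for k in range(1, count + 1)]

print(vsync_steps(60))   # [60.0, 30.0, 20.0, 15.0, 12.0, 10.0, 8.6, 7.5]
print(vsync_steps(120))  # [120.0, 60.0, 40.0, 30.0, 24.0, 20.0, 17.1, 15.0]
```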

Note that the graph above shows not instantaneous frame rates but one-second averages; in reality FPS can "jump" far more - almost every frame - causing unpleasant instability and lag. To see this clearly, here are a couple of graphs of instantaneous FPS - more precisely, of per-frame rendering times in milliseconds. First example (the lines are slightly offset from each other; only the approximate behavior in each mode is shown):

As you can see, in this example the frame rate with G-Sync changes more or less smoothly, while with V-Sync On it changes stepwise (single spikes in rendering time occur in both cases - that is normal). With V-Sync enabled, frame rendering and output times can only be 16.7 ms, 33.3 ms, 50 ms, and so on, as seen on the graph; in FPS terms this corresponds to 60, 30, and 20 frames per second. Beyond that, there is no particular difference between the two lines; both have peaks. Let's look at another representative stretch of time:

Here there are obvious fluctuations in frame rendering time, and with them in FPS when vertical synchronization is enabled. Look: with V-Sync On the frame time jumps abruptly from 16.7 ms (60 FPS) to 33.3 ms (30 FPS) and back - in practice this causes exactly that uncomfortable unsmoothness and clearly visible jerks in the video sequence. The smoothness of frame changes with G-Sync is much higher, and playing in this mode is noticeably more comfortable.

Let's look at the FPS graph in the second test application - Unigine Valley:

In this benchmark we see roughly the same thing as in Heaven. The frame rates in G-Sync and V-Sync Off modes are almost identical (apart from a peak above 60 FPS), while enabling V-Sync produces a clearly stepped FPS line, most often showing 30 FPS, sometimes dropping to 20 FPS or rising to 60 FPS - the typical behavior of this method, causing lags, jerks, and unsmooth video.

In this subsection it remains to look at a segment from the built-in benchmark of the game Just Cause 2:

This game perfectly demonstrates the inadequacy of the outdated V-Sync On method! With the frame rate varying from 40 to 60-70 FPS, the G-Sync and V-Sync Off lines almost coincide, while V-Sync On reaches 60 FPS only in short bursts. That is, with the GPU genuinely capable of 40-55 FPS, the player has to be content with only 30.

Moreover, in the section of the graph where the red line jumps between 30 and 40 FPS, the viewed image is in reality clearly unsmooth - the frame rate alternates between 60 and 30 almost every frame, which clearly does not add smoothness or comfort. But perhaps vertical sync copes better with a 120 Hz refresh rate?

G-Sync vs V-Sync 60/120 Hz

Let's look at vertical synchronization enabled (V-Sync On) at 60 and 120 Hz refresh rates, comparing them with V-Sync Off (as established earlier, that line is almost identical to G-Sync). At 120 Hz, more values are added to the FPS "steps" we already know: 120, 40, 24, 17 FPS, and so on, which can make the graph less stepped. Let's look at the frame rate in the Heaven benchmark:

Noticeably, the 120 Hz refresh rate helps V-Sync On achieve better performance and smoother frame rates. Where the 60 Hz graph shows 20 FPS, the 120 Hz mode gives an intermediate value of at least 24 FPS, and 40 FPS instead of 30 is clearly visible on the graph. But there are no fewer steps - if anything, more - so the frame rate with a 120 Hz update, although it changes by smaller amounts, does so more often, which also hurts overall smoothness.

There are fewer changes in the Valley benchmark, since the average frame rate is closest to the 30 FPS level - a step available at both 60 and 120 Hz. Sync Off provides smoother frame delivery but with visual artifacts, and the V-Sync On modes again show jagged lines. In this subsection it remains to look at Just Cause 2.

And again we clearly see how poorly vertical synchronization provides a smooth change of frames. Even switching to a 120 Hz refresh rate gives V-Sync On just a few extra "steps" of FPS - the jumps back and forth from one step to another have not gone anywhere, and all this is very unpleasant to watch in animated 3D scenes; you can take our word for it or rewatch the sample videos above.

Impact of output method on average frame rate

What happens to the average frame rate with all these synchronization modes enabled, and how do V-Sync and G-Sync affect average performance? You can roughly estimate the speed loss from the FPS graphs above, but we will also give the average frame rates we measured during testing. First, Unigine Heaven again:

Performance in the Adaptive V-Sync and V-Sync Off modes is almost the same - after all, the speed rarely rises above 60 FPS. Logically, enabling V-Sync lowers the average frame rate as well, because of its stepped FPS behavior. At 60 Hz, the drop in average frame rate was more than a quarter, and switching to 120 Hz won back only half of that loss.

The most interesting question for us is how much the average frame rate drops in G-Sync mode. For some reason, speeds above 60 FPS are cut even though the monitor was set to 144 Hz, so with G-Sync on the speed was slightly lower than with synchronization disabled. Overall, we can say there are practically no losses, and they certainly bear no comparison with the speed deficit under V-Sync On. Let's consider the second benchmark, Valley.

In this case, the drop in average rendering speed in the V-Sync modes is smaller, since the frame rate throughout the test was close to 30 FPS - one of the frequency "steps" for V-Sync at both 60 and 120 Hz. And for obvious reasons the losses in the latter case were slightly lower.

With G-Sync on, the average frame rate was again lower than with synchronization disabled, for the same reason: enabling G-Sync "killed" FPS values above 60. But the difference is small, and Nvidia's new mode is noticeably faster than with V-Sync enabled. Let's look at the last chart, the average frame rate in Just Cause 2:

In this game, the V-Sync On mode suffered significantly more than in the Unigine-engine test applications. The average frame rate in this mode at 60 Hz is more than one and a half times lower than with synchronization disabled altogether! Enabling the 120 Hz refresh rate greatly improves the situation, but G-Sync still achieves noticeably better results even in average FPS numbers - to say nothing of in-game comfort, which cannot be judged by the numbers alone; you have to see it with your own eyes.

So, in this section we established that G-Sync provides frame rates close to the synchronization-disabled mode, and that enabling it has almost no effect on performance. V-Sync, in contrast, changes the frame rate in steps, often jumping from one step to another, which causes unsmooth motion when outputting an animated sequence of frames and has a detrimental effect on comfort in 3D games.

In other words, both our subjective impressions and the test results suggest that Nvidia's G-Sync technology really does change the visual comfort of 3D games for the better. The new method is free of graphic artifacts such as tearing of a picture composed of several adjacent frames, as seen with V-Sync off, and there are no problems with the smoothness of frame delivery to the monitor or with increased output delays, as with V-Sync On.

Conclusion

Given all the difficulties of objectively measuring the smoothness of video output, we would first like to offer a subjective assessment. We were quite impressed with the gaming experience on an Nvidia Geforce card and the G-Sync-enabled monitor from Asus. Even a one-off "live" demonstration of G-Sync makes a strong impression with the smoothness of its frame changes, and after a long trial of the technology it becomes quite dreary to keep playing on a monitor with the old methods of putting images on screen.

G-Sync can perhaps be considered the biggest change in the process of displaying visual information on screen in a long time. We have finally seen something genuinely new in the link between displays and GPUs, something that directly affects the comfort of viewing 3D graphics - and noticeably so. Before Nvidia announced G-Sync, we were for many years tied to outdated image output standards rooted in the requirements of the TV and film industries.

Of course, we would have liked such capabilities even earlier, but now is not a bad time for their introduction, since in many demanding 3D games at maximum settings, today's top video cards deliver frame rates at which the benefits of G-Sync are at their greatest. And before the arrival of Nvidia's technology, the realism achieved in games was simply "killed" by far-from-ideal methods of updating the image on the monitor, which caused tearing, increased delays, and jerks in the frame rate. G-Sync gets rid of these problems by equating the frame rate on screen with the rendering rate of the graphics processor (albeit with some limitations) - a process now managed by the GPU itself.

We have not met a single person who tried G-Sync in action and remained dissatisfied with the technology. The reviews from the very first lucky ones who tested it at an Nvidia event last autumn were entirely enthusiastic. Journalists from the trade press and game developers (John Carmack, Tim Sweeney, and Johan Andersson) also endorsed the new output method. We now join them: after several days of using a monitor with G-Sync, we do not want to go back to old devices with their long-outdated synchronization methods. Ah, if only there were a wider choice of G-Sync monitors, and if they weren't equipped almost exclusively with TN panels...

Well, among the disadvantages of Nvidia's technology we can note that it operates only from 30 FPS upward, which can be considered an annoying drawback - it would be better if the image were displayed cleanly right after being prepared on the GPU even at 20-25 FPS. But the main disadvantage is that G-Sync is the company's proprietary solution, which other GPU makers, AMD and Intel, do not use. Nvidia can be understood: it spent resources developing and implementing the technology and negotiated with monitor manufacturers with the intention of making money. Really, the company once again acted as an engine of technical progress, despite its alleged greed for profit. Let us reveal a big "secret": profit is the main goal of any commercial company, and Nvidia is no exception.

And yet the future more likely belongs to more universal open standards similar in essence to G-Sync, like Adaptive-Sync, an optional feature of DisplayPort 1.2a. The appearance and spread of monitors with such support will take some more time - probably until the middle of next year - whereas G-Sync monitors from various companies (Asus, Acer, BenQ, AOC, and others) have already been on sale for several months, albeit not too cheaply. Nothing prevents Nvidia from supporting Adaptive-Sync in the future, although it has not officially commented on the topic. Let's hope Geforce fans not only have a working solution now in the form of G-Sync, but will in the future also be able to use dynamic refresh rates within a generally accepted standard.

Among the other downsides of Nvidia G-Sync for users, we note that its support on the monitor side costs the manufacturer a certain amount, which translates into a higher retail price relative to standard monitors. However, G-Sync monitors come at different prices, including some that are not too expensive. The main thing is that they are already on sale, so every player can get maximum comfort right now - for the time being, only with Nvidia Geforce video cards, but the company stands behind this technology.

Do you have a G-SYNC capable monitor and an Nvidia graphics card? Let's look at what G-SYNC is, how to enable it, and how to configure it correctly to make full use of the potential and capabilities of this technology. Keep in mind that simply turning it on is not everything.

Every gamer knows what vertical synchronization (V-Sync) is. This function synchronizes image frames in a way that eliminates screen tearing. If you disable vertical synchronization on a regular monitor, you reduce input lag and notice that the game responds better to your commands, but frames are no longer properly synchronized, and you end up with screen tearing.

V-Sync eliminates screen tearing but increases the delay between the controls and the image output, making the game less comfortable. Every time you move the mouse, the movement seems to happen with a slight delay. This is where the G-SYNC function comes to the rescue, eliminating both of these shortcomings.

What is G-SYNC?

A rather expensive but effective solution for Nvidia Geforce video cards is G-SYNC technology, which eliminates screen tearing without introducing additional input lag. To implement it you need a monitor with a G-SYNC module. The module adjusts the screen refresh rate to the number of frames per second, so there is no additional delay, and the screen tearing effect is eliminated.

Many users, after purchasing such a monitor, simply enable NVIDIA G-SYNC in the Nvidia Control Panel settings, believing that this is all they need to do. In theory, yes - G-SYNC will work - but if you want to make the most of the technology, you need to enable a number of additional functions related to correctly configuring classic V-Sync and limiting in-game FPS to just below the monitor's maximum refresh rate. Why? You will learn all this from the recommendations below.

Enabling G-SYNC in the NVIDIA Control Panel

Let's start with the simplest, basic step: enabling the G-SYNC module. This can be done from the Nvidia Control Panel. Right-click on the desktop and select NVIDIA Control Panel.

Then go to the Display tab - Set up G-SYNC. Here you can enable the technology via the "Enable G-SYNC" field. Check it.

You can then specify whether the technology will work only in full-screen mode, or also in games running in windowed mode or a borderless full-screen window.

If you select the "Enable G-SYNC for full screen mode" option, the function will work only in games running in full-screen mode (this option can be changed in each game's settings). Games in windowed or borderless full-screen mode will not use the technology.

If you want windowed games to use G-SYNC as well, enable the "Enable G-SYNC for windowed and full screen mode" option. With this option selected, the function intercepts the currently active window and applies itself to it, enabling its support for modified screen refreshing. You may need to restart your computer for this option to take effect.

How do you check that the technology is enabled? Open the Display menu at the top of the window and check the "G-SYNC Indicator" field. This will inform you that G-SYNC is on when you launch a game.

Then go to the Manage 3D Settings tab in the side menu. In the "Global Settings" section, find the "Preferred refresh rate" field.

Set this to "Highest available". Some games may impose their own refresh rate, with the result that G-SYNC is not fully utilized. Thanks to this parameter, such game settings will be ignored and the maximum monitor refresh rate will always be allowed, which on G-SYNC devices is most often 144 Hz.

In general, this is the basic setup needed to enable G-SYNC. But if you want to use the full potential of your equipment, read the following instructions.

What should I do with V-SYNC if I have G-SYNC? Leave it on or turn it off?

This is the most common dilemma of G-SYNC monitor owners. It is commonly assumed that this technology completely replaces classic V-SYNC, which can therefore be completely disabled in the Nvidia Control Panel or simply ignored.

First you need to understand the difference between them. The task of both functions is theoretically the same - to overcome screen tearing - but the method of action is significantly different.

V-SYNC synchronizes frames to the monitor's constant refresh rate. The function acts as an intermediary, capturing frames and holding their display so as to adapt them to a constant frame rate, thereby preventing tearing. The result can be input lag, because V-SYNC must first "capture and queue" the image and only then display it on screen.

G-SYNC works exactly the other way round: it adjusts not the image but the monitor's refresh rate to the number of frames being displayed. Everything is done in hardware by the G-SYNC module built into the monitor, so there is no additional display delay, as there is with vertical synchronization. This is its main advantage.

The whole problem is that G-SYNC only works well while FPS stays within the supported refresh-rate range, which spans from 30 Hz up to the monitor's maximum (60 Hz or 144 Hz). That is, the technology works to its full potential as long as FPS does not drop below 30 and does not exceed 60 or 144 frames per second, depending on the maximum supported refresh rate. The infographic below, created by BlurBusters, illustrates this well.

What happens if FPS goes outside this range? G-SYNC cannot adjust the screen refresh, so outside the range it stops working. You then face exactly the same problems as on a regular monitor without G-SYNC, and classic vertical sync takes over: if it is off, screen tearing occurs; if it is on, you will not see tearing, but input lag appears.

Therefore, it is in your interest to stay within the G-SYNC range, from a minimum of 30 Hz up to whatever the monitor maxes out at (144 Hz is most common, but 60 Hz displays exist too). How? With appropriate vertical synchronization settings and by limiting the maximum FPS.

What is the conclusion, then? In situations where the frame rate drops below 30 FPS, leave vertical sync enabled. Such cases are rare, but if one occurs, V-SYNC guarantees there will be no tearing. As for the upper limit, it is simple: limit the maximum frames per second so you never approach the ceiling at which V-SYNC would kick in, thereby ensuring G-SYNC runs continuously.

Therefore, with a 144 Hz monitor, enable an FPS cap at 142 to avoid getting too close to the upper limit; with a 60 Hz monitor, set the limit to 58. Even if the computer can produce more FPS, it will not do so. Then V-SYNC never engages, and only G-SYNC stays active.

Enabling V-SYNC in NVIDIA Settings

Open the NVIDIA Control Panel and go to the "Manage 3D Settings" tab. In the Global Settings section, find the "Vertical sync" option and set it to "On".

Thanks to this, vertical synchronization will always be ready to engage if FPS drops below 30 - the range where a monitor with G-SYNC technology cannot cope on its own.

Limiting FPS below the maximum screen refresh rate

The best way to limit frames per second is with the RTSS (RivaTuner Statistics Server) program. Of course, an even better solution is the limiter built into the game itself, but not every game has one.

Download and run the program, then in the list of games on the left, select the Global entry. Here you can set a common limiter for all applications. On the right, find the "Framerate limit" field and set the limit: 142 FPS for 144 Hz monitors, and 58 FPS for 60 Hz devices.

With the limit set, there will be no lag from classic vertical synchronization engaging, and playing becomes much more comfortable.
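Under the hood, any such limiter follows the same principle: it sleeps away the unused remainder of each frame's time budget. Here is a bare-bones sketch of the idea (our illustration of the principle, not RTSS code):

```python
import time

FPS_CAP = 142                 # for a 144 Hz G-SYNC monitor
FRAME_BUDGET = 1.0 / FPS_CAP  # seconds each frame is allowed to take

def limited_loop(render_one_frame, num_frames):
    """Render frames, sleeping out the rest of each frame's time budget."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render_one_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # keep FPS at or below the cap

# Example: a trivial stand-in for real rendering work
limited_loop(lambda: None, num_frames=5)
```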

Best monitors for gaming | Models supporting Nvidia G-Sync technology

Variable, or adaptive, refresh rate technology comes in two varieties: AMD FreeSync and Nvidia G-Sync. They perform the same function - matching the refresh rate of the source (the video card) and the display to prevent annoying frame tearing during fast movement in games. FreeSync is part of the DisplayPort specification, while G-Sync requires additional hardware licensed from Nvidia. Implementing G-Sync adds about $200 to the price of a monitor. If you already have a modern Geforce graphics card, the choice is obvious. If you are still undecided, you should know that G-Sync has one advantage: when the frame rate drops below the G-Sync threshold of 40 fps, frames are duplicated to prevent image tearing. FreeSync has no such feature.


Summary table


Model AOC G2460PG Asus RoG PG248Q Dell S2417DG Asus ROG SWIFT PG279Q
Category FHD FHD QHD QHD
Best price in Russia, rub. 24300 28990 31000 58100
Panel/backlight type TN/W-LED TN/W-LED edge array TN/W-LED edge array AHVA/W-LED edge array
24" / 16:9 24" / 16:9 24" / 16:9 27" / 16:9
Curvature radius No No No No
Maximum resolution/refresh rate 1920x1080 @ 144 Hz 1920x1080 @ 144 Hz, 180 Hz overclocked 2560x1440 @ 144 Hz, 165 Hz overclocked 2560x1440 @ 165 Hz
FreeSync operating range No No No No
Color depth/color gamut 8-bit (6-bit with FRC) / sRGB 8-bit/sRGB 8-bit/sRGB 8-bit/sRGB
Response time (GTG), ms 1 1 1 4
Brightness, cd/m2 350 350 350 350
Speakers No No No (2) 2 W
Video inputs (1) DisplayPort (1) DisplayPort v1.2, (1) HDMI v1.4 (1) DisplayPort v1.2, (1) HDMI v1.4 (1) DisplayPort v1.2, (1) HDMI v1.4
Audio connectors No (1) 3.5mm headphone output (1) 3.5 mm Stereo in, (1) 3.5 mm headphone output (1) 3.5mm headphone output
USB v3.0: (1) input, (2) outputs; v2.0: (2) outputs v3.0: (1) input, (2) outputs v3.0: (1) input, (4) outputs v3.0: (1) input, (2) outputs
Energy consumption, W 40 typical 65 max. 33 typical 90 max., 0.5 expected
Dimensions LxHxW (with base), mm 559x391-517x237 562x418-538x238 541x363x180 620x553x238
Panel thickness, mm 50 70 52 66
Frame width, mm 16-26 11 top/side: 6, bottom: 15 8-12
Weight, kg 6.5 6.6 5.8 7
Warranty 3 years 3 years 3 years 3 years

Model Acer Predator XB271HK Acer Predator XB321HK Asus ROG PG348Q Acer Predator Z301CT
Category UHD UHD WQHD QHD
Best price in Russia, rub. 43900 62000 102000 58000
Panel/backlight type AHVA/W-LED edge array IPS/W-LED edge array AH-IPS/W-LED edge array AMVA/W-LED edge array
Screen diagonal/aspect ratio 27" / 16:9 32" / 16:9 34" / 21:9 30" / 21:9
Curvature radius No No 3800 mm 1800 mm
Maximum resolution/refresh rate 3840x2160 @ 60 Hz 3840x2160 @ 60 Hz 3440x1440 @ 75 Hz, 100 Hz overclocked 2560x1080 @ 144 Hz, 200 Hz overclocked
FreeSync operating range No No No No
Color depth/color gamut 10-bit/sRGB 10-bit/sRGB 10-bit/sRGB 8-bit/sRGB
Response time (GTG), ms 4 4 5 4
Brightness, cd/m2 300 350 300 300
Speakers (2) 2 W, DTS (2) 2 W, DTS (2) 2 W (2) 3W, DTS
Video inputs (1) DisplayPort v1.2, (1) HDMI v1.4 (1) DisplayPort, (1) HDMI (1) DisplayPort v1.2, (1) HDMI v1.4 (1) DisplayPort v1.2, (1) HDMI v1.4
Audio connectors (1) 3.5mm headphone output (1) 3.5mm headphone output (1) 3.5mm headphone output (1) 3.5mm headphone output
USB v3.0: (1) input, (4) output v3.0: (1) input, (4) output v3.0: (1) input, (4) output v3.0: (1) input, (3) output
Energy consumption, W 71.5 typical 56 typical 100 max. 34 W at 200 nits
Dimensions LxHxW (with base), mm 614x401-551x268 737x452-579x297 829x558x297 714x384-508x315
Panel thickness, mm 63 62 73 118
Frame width, mm top/side: 8, bottom: 22 top/side: 13, bottom: 20 top/side: 12, bottom: 24 top/side: 12, bottom: 20
Weight, kg 7 11.5 11.2 9.7
Warranty 3 years 3 years 3 years 3 years

AOC G2460PG – FHD 24 inches


  • Best price in Russia: 24,300 rub.

ADVANTAGES

  • Excellent implementation of G-Sync
  • Screen refresh rate 144 Hz
  • ULMB Motion Blur Suppression
  • High build quality
  • Very high color and grayscale quality

FLAWS

  • Non-standard gamma
  • Insufficient brightness for optimal ULMB performance
  • Not IPS

VERDICT

Although G-Sync remains a premium and expensive option, the AOC G2460PG is the first monitor in this segment that is aimed at the budget buyer. It costs about half the price of the Asus ROG Swift, so you can save a little money, or install two monitors on your desk at once.

Asus RoG PG248Q – FHD 24 inches


  • Best price in Russia: 28,990 rub.

ADVANTAGES

  • G-Sync
  • 180 Hz
  • Low latency
  • Responsiveness
  • Color accuracy with calibration
  • Sleek appearance
  • Build quality

FLAWS

  • To achieve the best picture, adjustments are required
  • Contrast
  • Expensive

VERDICT

The PG248Q is like an exotic sports car: expensive and impractical to run. But dial in the right settings during setup and you get an excellent gaming experience. In terms of smoothness and responsiveness, this monitor is perhaps the best we've tested to date. It is worth the money and the time spent. Highly recommended.

Dell S2417DG – QHD 24 inches


  • Best price in Russia: 31,000 rub.

ADVANTAGES

  • Superior motion processing quality
  • Color accuracy at factory settings
  • QHD resolution
  • 165 Hz refresh rate
  • Gaming features
  • 6 mm frame

FLAWS

  • Contrast
  • Gamma curve accuracy
  • ULMB reduces light output and contrast
  • Viewing angles

VERDICT

If Dell had fixed the gamma issues we encountered in testing, the S2417DG would have earned our Editor's Choice award. The monitor conveys movement incredibly smoothly, with absolutely no ghosting, stuttering or tearing: you can't take your eyes off it. The benefit of the ULMB function is minor, but it is there nonetheless. It's not the cheapest 24-inch gaming monitor, but it beats out its more expensive competitors and deserves a spot on the list.

Asus RoG Swift PG279Q – QHD 27 inches


  • Best price in Russia: 58,100 rub.

ADVANTAGES

  • Stable operation at 165 Hz
  • G-Sync
  • Vivid and sharp images
  • Saturated color
  • GamePlus
  • Joystick for OSD menu
  • Stylish appearance
  • High build quality

FLAWS

  • Significant reduction in luminous flux in ULMB mode
  • Calibration is required to achieve the best image quality
  • Expensive

VERDICT

Asus' new addition to the ROG lineup isn't perfect, but it's definitely worth a look. The PG279Q has everything an enthusiast needs, including a crisp and bright IPS panel, 165 Hz refresh rate, and G-Sync. This monitor isn't cheap, but we haven't heard of users regretting the purchase yet. We enjoyed playing on this monitor, and you'll probably enjoy it too.

Acer Predator XB271HK – UHD 27 inches


  • Best price in Russia: 43,900 rub.

ADVANTAGES

  • Rich colors
  • Image accuracy at factory settings
  • G-Sync
  • Ultra HD resolution
  • Viewing angles
  • Build quality

FLAWS

  • Expensive

G-Sync Technology Overview | A Brief History of Fixed Refresh Rate

Once upon a time, monitors were bulky and contained cathode ray tubes and electron guns. Electron guns bombard the screen with electrons, lighting up the colored phosphor dots we call pixels. The beam draws each "scan" line from left to right, working from the top of the screen to the bottom. Varying the speed of the electron gun from one full refresh to the next was not very practical, and there was no particular need for it before the advent of 3D games. So CRTs, and the analog video standards built around them, were designed with a fixed refresh rate.

LCD monitors gradually replaced CRTs, and digital connectors (DVI, HDMI and DisplayPort) replaced analog ones (VGA). But the associations responsible for standardizing video signals (led by VESA) did not move away from fixed refresh rates. Film and television still rely on an input signal with a constant frame rate, so, once again, switching to a variable refresh rate did not seem all that necessary.

Variable frame rates and fixed refresh rates are not the same

Before the advent of modern 3D graphics, fixed refresh rates were not a problem for displays. The problem surfaced with the arrival of powerful GPUs: the rate at which a GPU renders individual frames (the frame rate, usually expressed in FPS, frames per second) is not constant. It changes over time: in graphically heavy scenes the card may deliver 30 FPS, and when looking at an empty sky, 60 FPS.

Disabling synchronization causes tearing

It turns out that the GPU's variable frame rate and the LCD panel's fixed refresh rate do not work well together. In this configuration we encounter a graphical artifact called "tearing." It appears when two or more partial frames are output during the same monitor refresh cycle. They are usually displaced relative to each other, which gives a very unpleasant effect in motion.
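
A toy model helps make this concrete. The sketch below is our own simplification, not a rendering-pipeline API: it assumes the panel scans its lines out at a constant pace and computes the scanline where the tear appears when a buffer swap lands partway through a refresh cycle.

```python
# A toy model of tearing with V-sync off: if the GPU swaps buffers
# partway through the panel's scanout, the screen shows the old frame
# above the swap point and the new frame below it.

def tear_line(swap_time_ms: float, refresh_hz: float = 60.0,
              lines: int = 1080) -> int:
    """Scanline where the tear appears for a buffer swap at the given
    time offset (in ms) into the refresh cycle."""
    cycle_ms = 1000.0 / refresh_hz              # ~16.7 ms at 60 Hz
    progress = (swap_time_ms % cycle_ms) / cycle_ms
    return int(progress * lines)

# A swap 8 ms into a 60 Hz scanout tears just below mid-screen.
print(tear_line(8.0))   # -> 518
```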

The image above shows two well-known artifacts that are common but difficult to capture. Because they are display artifacts, you won't see them in regular game screenshots, but our images show what you actually see while playing. Capturing them requires a camera with a high-speed shooting mode, or, with a card that supports video capture, you can record an uncompressed video stream from the DVI port and clearly see the transition from one frame to the next (this is the method we use for FCAT tests). However, the described effect is best observed with your own eyes.

The tearing effect is visible in both images. The top one was taken with a camera, the bottom one through the video capture function. The bottom picture is "cut" horizontally and looks displaced. In the top pair, the left photo was taken on a Sharp screen with a 60 Hz refresh rate, the right one on an Asus display at 120 Hz. Tearing on the 120 Hz display is less pronounced, since the refresh rate is twice as high, but the effect is still visible and appears in the same way as in the left image. This type of artifact is a clear sign that the images were taken with vertical sync (V-sync) turned off.

Battlefield 4 on GeForce GTX 770 with V-sync disabled

The second effect, visible in the BioShock: Infinite images, is called ghosting. It is especially noticeable at the bottom left of the photo and is caused by slow pixel response. In short, individual pixels do not change color quickly enough, which produces this kind of trailing glow. A single frame cannot convey the effect ghosting has on the game itself. A panel with an 8 ms gray-to-gray response time, such as the Sharp, will produce a blurry image with any movement on the screen. This is why such displays are generally not recommended for first-person shooters.

V-sync: trading one problem for another

Vertical sync, or V-sync, is a very old solution to the tearing problem. When it is activated, the graphics card tries to match the screen's refresh rate, eliminating tearing completely. The problem is that if your graphics card cannot keep the frame rate above 60 FPS (on a 60 Hz display), the effective frame rate will jump between multiples of the screen refresh rate (60, 30, 20, 15 FPS, and so on), which in turn leads to noticeable stuttering.
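
Those jumps between multiples of the refresh rate follow from frames having to wait for the next refresh boundary. A short Python sketch of the arithmetic (our simplification; real frame times vary from frame to frame):

```python
# With V-sync on, a frame that misses a refresh must wait for the next
# one, so the delivered rate snaps to refresh_hz / n for a whole n.

import math

def vsync_effective_fps(gpu_fps: float, refresh_hz: float = 60.0) -> float:
    render_time = 1.0 / gpu_fps
    refresh_interval = 1.0 / refresh_hz
    # Each displayed frame occupies a whole number of refresh intervals.
    intervals = math.ceil(render_time / refresh_interval)
    return refresh_hz / intervals

for fps in (75, 55, 35, 25):
    print(fps, "->", vsync_effective_fps(fps))
# 75 -> 60.0, 55 -> 30.0, 35 -> 30.0, 25 -> 20.0
```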

When the frame rate drops below the refresh rate with V-sync active, you will experience stuttering

Moreover, since V-sync makes the graphics card wait, and sometimes relies on an invisible back buffer, it can introduce additional input lag into the render chain. Thus V-sync can be both a blessing and a curse, solving some problems while causing other drawbacks. An informal survey of our staff found that gamers tend to disable V-sync, turning it on only when tearing becomes unbearable.

Get Creative: Nvidia Unveils G-Sync

With the launch of the GeForce GTX 680, Nvidia included a driver mode called Adaptive V-sync, which attempts to mitigate these problems by enabling V-sync when the frame rate is above the monitor's refresh rate and quickly disabling it when performance drops sharply below it. Although the technology did its job well, it was a workaround that did not eliminate tearing when the frame rate was below the monitor's refresh rate.

The implementation of G-Sync is much more interesting. Generally speaking, Nvidia is showing that instead of forcing graphics cards to run at a fixed display frequency, we can make new monitors run at a variable frequency.

The GPU's frame rate determines the monitor's refresh rate, removing the artifacts associated with enabling and disabling V-sync

The packet-based data transfer mechanism of the DisplayPort connector opened up new possibilities. By using variable blanking intervals in the DisplayPort video signal, and by replacing the monitor's scaler with a module that understands variable blanking signals, the LCD panel can operate at a variable refresh rate tied to the frame rate the video card is outputting (within the monitor's supported refresh range). In practice, Nvidia made creative use of these features of the DisplayPort interface and killed two birds with one stone.
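
Conceptually, the module stretches the blanking interval until the next frame arrives, within the panel's limits. Here is a simplified model of that timing logic; this is an assumption for illustration, not the module's actual firmware.

```python
# A simplified model of variable blanking: the panel waits for the next
# frame and then refreshes, clamping the interval to its supported range.

def next_refresh_interval_ms(frame_time_ms: float,
                             min_hz: float = 30.0,
                             max_hz: float = 144.0) -> float:
    shortest = 1000.0 / max_hz   # ~6.9 ms: the panel's fastest refresh
    longest = 1000.0 / min_hz    # ~33.3 ms: the panel must refresh by then
    return min(max(frame_time_ms, shortest), longest)

# A frame rendered in 11 ms is scanned out 11 ms after the previous one,
# so the panel momentarily runs at about 91 Hz (1000 / 11).
print(next_refresh_interval_ms(11.0))   # -> 11.0
```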

Even before testing begins, I would like to commend the team for their creative approach to solving a real problem affecting PC gaming. This is innovation at its finest. But how does G-Sync perform in practice? Let's find out.

Nvidia sent us an engineering sample of the Asus VG248QE monitor, in which the scaler is replaced by a G-Sync module. We are already familiar with this display: we reviewed it in "Asus VG248QE Review: 24-Inch 144Hz Gaming Monitor for $400", where it earned Tom's Hardware's Smart Buy award. Now it's time to find out how Nvidia's new technology affects the most popular games.

G-Sync Technology Overview | 3D LightBoost, Built-in Memory, Standards and 4K

As we reviewed Nvidia's press materials, we asked ourselves many questions, both about the technology's place in the present and its role in the future. During a recent trip to the company's headquarters in Santa Clara, our US colleagues received some answers.

G-Sync and 3D LightBoost

The first thing we noticed was that Nvidia sent an Asus VG248QE monitor modified to support G-Sync. This monitor also supports Nvidia's 3D LightBoost technology, which was originally designed to boost the brightness of 3D displays but has long been used unofficially in 2D mode, pulsing the panel backlight to reduce ghosting (motion blur). Naturally, we wondered whether this technology is used in G-Sync.

Nvidia gave a negative answer. While using both technologies at the same time would be ideal, today strobing the backlight at a variable refresh rate leads to flicker and brightness problems. Solving them is incredibly difficult, since you have to adjust the brightness and track the pulses. As a result, for now the choice is between the two technologies, although the company is trying to find a way to use them simultaneously in the future.

The G-Sync module's built-in memory

As we already know, G-Sync eliminates the incremental input lag associated with V-sync, since there is no longer a need to wait for the panel scan to complete. However, we noticed that the G-Sync module has built-in memory. Can the module buffer frames on its own? If so, how long does a frame take to travel through the new pipeline?

According to Nvidia, frames are not buffered in the module's memory. As data arrives, it is displayed on the screen, and the memory performs some other functions. Moreover, the processing time for G-Sync is noticeably less than one millisecond. In fact, this is almost the same delay we encounter with V-sync turned off, and it is associated with the characteristics of the game, the video driver, the mouse, and so on.

Will G-Sync be standardized?

This question came up in a recent interview with AMD, when a reader wanted to know the company's reaction to G-Sync. However, we wanted to ask the developer directly and find out whether Nvidia plans to bring the technology to an industry standard. In theory, the company could offer G-Sync as an upgrade to the DisplayPort standard, providing variable refresh rates; after all, Nvidia is a member of the VESA association.

However, no new specifications are planned for DisplayPort, HDMI or DVI. G-Sync already works over DisplayPort 1.2, that is, the standard does not need to be changed.

As noted, Nvidia is working on G-Sync's compatibility with the technology currently called 3D LightBoost (which will soon have a different name). In addition, the company is looking for a way to reduce the cost of the G-Sync modules and make them more accessible.

G-Sync at Ultra HD resolutions

Nvidia promises monitors with G-Sync support at resolutions up to 3840x2160 pixels. However, the Asus model we are looking at today only supports 1920x1080. Ultra HD monitors currently use the STMicro Athena controller, which has two scalers to create a tiled display. We wondered: will the G-Sync module support an MST configuration?

In truth, 4K displays with variable frame rates will have to wait. There is no standalone scaler chip that supports 4K resolution yet; the nearest one should appear in the first quarter of 2014, and monitors equipped with it not until the second quarter. Since the G-Sync module replaces the scaler, compatible panels will begin to appear after that point. Fortunately, the module natively supports Ultra HD.

NVIDIA G-SYNC monitors feature revolutionary NVIDIA technology that eliminates screen tearing and the input lag associated with VSync, and enhances the capabilities of modern monitors, resulting in the smoothest, most responsive gaming experience you have ever seen.

As a result, game scenes appear instantly, objects look sharper, and gameplay becomes smoother.

HOW DOES NVIDIA G-SYNC WORK?

NVIDIA® G-SYNC™ is an innovative solution that breaks the old mold to create the most advanced and responsive computer displays in history. The NVIDIA G-SYNC module can be installed on its own or purchased pre-installed in the latest gaming monitors. It lets you forget about screen tearing, input lag, and the eye-straining judder caused by older technologies that migrated from analog television to modern monitors.

PROBLEM: OLD MONITOR TECHNOLOGY

When televisions were invented, they used cathode ray tubes, which work by scanning a stream of electrons across the surface of a phosphor-coated tube. The beam makes the phosphor glow, and when enough dots are lit quickly enough, the cathode ray tube creates the impression of full-motion video. Believe it or not, these early TVs operated at a 60 Hz refresh rate, since the commercial AC frequency in the United States is 60 Hz. Matching the TV's refresh rate to the AC line frequency made early electronics easier to build and reduced screen noise.

By the time personal computers were invented in the early 1980s, CRT technology was firmly established as the simplest and most cost-effective way to build computer monitors. 60 Hz and fixed refresh rates became the standard, and system builders learned to make the most of a less-than-ideal situation. Over the past three decades, while CRT technology evolved into LCD and LED displays, no major company dared to challenge this convention, and synchronizing the GPU to the monitor's refresh rate remains standard industry practice to this day.

The problem is that video cards do not render images at a constant rate. In fact, GPU frame rates vary significantly even within the same scene of the same game, depending on the current GPU load. And if monitors have a fixed refresh rate, how do you transfer images from the GPU to the screen? The first way is to simply ignore the monitor's refresh rate and update the image mid-cycle. We call this VSync disabled mode, and it is how most gamers play by default. The downside is that when one monitor refresh cycle includes two images, a very noticeable "break line" appears, commonly referred to as screen tearing.

A well-known way to combat screen tearing is to enable VSync, which makes the GPU delay screen updates until the start of a new monitor refresh cycle. This causes image judder when the GPU frame rate is lower than the display refresh rate. It also increases latency, which leads to input lag: a noticeable delay between pressing a key and the result appearing on the screen.

To make matters worse, many players suffer from eye strain due to image judder, and others develop headaches and migraines. This led us to develop Adaptive VSync technology, an effective solution that was well received by critics. Despite its creation, the problem of input lag remains, and this is unacceptable for many enthusiasts and completely unsuitable for professional (eSports) gamers, who configure their video cards, monitors, keyboards and mice themselves to minimize the delay between action and reaction.

SOLUTION: NVIDIA G-SYNC

Meet NVIDIA G-SYNC technology, which eliminates screen tearing, VSync display lag, and image judder. To achieve this revolutionary capability, we created the G-SYNC module for monitors, which lets the monitor synchronize to the GPU's frame rate, rather than the other way around, resulting in faster, smoother, tear-free images that take gaming to the next level.

Industry gurus John Carmack, Tim Sweeney, Johan Andersson and Mark Rein were impressed by NVIDIA G-SYNC technology. eSports players and leagues are lining up to use NVIDIA G-SYNC, which unleashes their true skill by allowing even faster reactions, thanks to an imperceptible delay between on-screen action and keyboard commands. During internal testing, avid gamers spent their lunch hours playing LAN matches on G-SYNC-enabled monitors.

If you have a monitor that supports NVIDIA G-SYNC, you will have an undeniable advantage over other players in online games, provided you also have a low ping.

A REVOLUTIONARY SOLUTION IS HERE

In these times of technological wonders, few advances can be called truly "innovative" or "revolutionary." NVIDIA G-SYNC, however, is one of those few breakthroughs, revolutionizing outdated monitor technology with an approach that has never been tried before.

G-SYNC eliminates input lag, tearing, and judder for a stunning visual experience on any G-SYNC-enabled monitor; the result is so good that you will never want to use a regular monitor again. Beyond the visual revolution, multiplayer gamers will benefit from the combination of G-SYNC, fast GeForce GTX graphics, and low-latency input devices. This will definitely interest fans of shooters. For eSports athletes, NVIDIA G-SYNC is a significant improvement: because it eliminates input lag, success or failure now rests with the players, helping to separate the pros from the amateurs.

If you, like eSports athletes, want the sharpest, smoothest, most responsive gaming experience possible, then NVIDIA G-SYNC enabled monitors are a breakthrough you won't find anywhere else. A true innovation in an era of incremental improvements, NVIDIA G-SYNC will revolutionize the way you play.


