[G-Sync monitors] Worth it?

DocHolliday

Contributor
Veteran XX
New place so I am building out my office. I have a good CPU, mobo, RAM, SSD, etc. Adding on a GTX 970 and triple monitors. Don't necessarily see myself gaming with multiple though, so the single 970 should be plenty of horsepower. Can go SLI if I wanna add on later for Star Citizen.

Anyway, gsync sounds appealing. What is TW's experience?
 
Never heard of them.

If Kotz sees this he will tell you to import a Korean brand, so be ready for that. :p:
 
4K would need insane horsepower for gaming... soooo many pixels. Might be a few years yet. 120/144Hz is what I'm looking at... the ASUS ROG 24" G-Sync is what I'm thinking will be my next monitor in the nearish future.

edit: 27 inch, not 24... wot Haggis sed
 
The gist of it is that it syncs every refresh of the monitor to a frame from the GPU, so tearing is a thing of the past. Games are supposed to look sharper and cleaner during fast-moving scenes.

I want to know if it's all that in the real world. I trust the people here more than the review sites.
 
Sounds exactly like existing VSYNC technology.

Nvidia said:
Nothing is more distracting when gaming than frame rate stuttering and screen tearing. Stuttering occurs when frame rates fall below the VSync frame rate cap, which is typically 60 frames per second, matching the 60Hz refresh rate of most monitors and screens. When frame rates dip below the cap VSync locks the frame rate to the nearest level, such as 45 or 30 frames per second. As performance improves the frame rate returns to 60.

In performance-intensive games this dramatic change in frame rate can occur several times per second, resulting in clearly noticeable stuttering as the frame rate jumps around, often causing eye strain and headaches.

G-Sync eliminates those issues by syncing every frame, and it isn't capped at 60Hz.
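
Rough back-of-the-napkin sketch of the difference (my own toy numbers, assuming plain double-buffered VSync on a 60Hz panel, nothing official from Nvidia): with VSync a finished frame has to wait for the next refresh, so the effective rate snaps to steps like 60/30/20fps, while G-Sync just shows the frame whenever it's done.

Code:
import math

REFRESH_HZ = 60.0
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms between refreshes

def vsync_fps(gpu_frame_ms):
    """Effective fps when every frame has to land on a 60Hz refresh boundary."""
    refreshes_waited = math.ceil(gpu_frame_ms / REFRESH_INTERVAL_MS)
    return 1000.0 / (refreshes_waited * REFRESH_INTERVAL_MS)

def gsync_fps(gpu_frame_ms):
    """Effective fps when the panel refreshes as soon as the frame is ready."""
    return 1000.0 / gpu_frame_ms

for frame_ms in (15.0, 18.0, 22.0, 30.0, 40.0):
    print(f"GPU frame time {frame_ms:4.1f} ms -> "
          f"VSync {vsync_fps(frame_ms):5.1f} fps, "
          f"G-Sync {gsync_fps(frame_ms):5.1f} fps")

The exact steps depend on buffering (the 45fps in the quote is, I think, an average from frames alternating between one and two refresh waits, which this simple model doesn't capture), but the snapping is the point: G-Sync never rounds up to the next refresh.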
 
It really depends on if you are a hardcore gamer or not. If you do nothing but game on the PC tower, and your gear can support it, then GSync/Lightmotion/etc is definitely worth it. If gaming is not your #1 for your tower then I would say it is a waste of money.
 
It is pretty much the only reason I use the tower. Rest of the time like right now I am on the couch using my tablet.

Money isn't really an issue... but I don't like being frivolous either, so I do my research. Starting to wonder if I do triple, maybe I'll do the ROG in the middle with two cheaper ASUS monitors on the outside for browsing, email, IRC, voice comms, etc.

Will have to look up lightmotion. Never heard of that.
 
Will have to look up lightmotion. Never heard of that.

LightBoost HOWTO | Blur Busters

However, I believe monitors with G-Sync can't use the LightBoost feature while G-Sync is turned on.
 
Sounds exactly like existing VSYNC technology.

It's actually the opposite. VSync syncs the number of frames your GPU draws to your monitor's refresh rate.

Whereas G-Sync syncs your monitor's refresh rate to however many frames your GPU can draw at any given time.

Haven't seen it in person, but it makes perfect sense that you should get a much smoother picture at, say, 40fps than you would on a fixed 60Hz monitor, and no tearing at, say, 100fps either, as long as you stay inside the panel's refresh range. The monitor is always displaying exactly as many frames per second as the GPU can output.
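
To put some toy numbers on the 40fps case (my own rough model, assuming the GPU keeps producing a frame every 25 ms and VSync holds each one until the next 60Hz boundary, roughly what you get with triple buffering; plain double-buffered VSync would just lock to 30fps instead):

Code:
import math

REFRESH_MS = 1000.0 / 60.0   # fixed 60Hz refresh interval
GPU_FRAME_MS = 25.0          # ~40 fps coming out of the GPU

def presentation_times(n_frames):
    """When each frame actually reaches the screen, VSync vs G-Sync."""
    vsync, gsync = [], []
    for i in range(1, n_frames + 1):
        ready = i * GPU_FRAME_MS                                  # frame finished rendering
        vsync.append(math.ceil(ready / REFRESH_MS) * REFRESH_MS)  # wait for next vblank
        gsync.append(ready)                                       # panel refreshes right away
    return vsync, gsync

vsync, gsync = presentation_times(6)
print("time each frame stays on screen (ms):")
print("  VSync :", [round(b - a, 1) for a, b in zip(vsync, vsync[1:])])
print("  G-Sync:", [round(b - a, 1) for a, b in zip(gsync, gsync[1:])])

The VSync column comes out alternating 16.7/33.3 ms per frame (that's the judder), while the G-Sync column is a steady 25 ms per frame.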

I don't know if 4k is mature enough yet

It's not.
 
The monitor is over-priced.

1440p, 144Hz, single DisplayPort input.

G-Sync is, in theory, a great idea that is long overdue, but only if you can drive a 1440p monitor faster than its native refresh rate all the time.

The idea of G-Sync is overdue... the industry has for a long time been designing display devices backwards, syncing (or not syncing, as the case may be) the frame output of the video card to the fixed refresh rate of the monitor, which is what causes the artifacting and tearing you see in games.

That standard was originally driven by the days of old when CRTs ruled as the native desktop display device, but it was made obsolete by the advent of flat-panel LCDs and modern GPU advances. It's archaic and long overdue to be replaced by a new standard driven by modern developments in display technology.


Enter G-Sync and FreeSync. The new concept is to sync the refresh rate of the monitor dynamically to the fluctuating output frame rate of the GPU. The monitor now waits for the GPU before drawing a new frame to the display, and vice versa, which eliminates the visual artifacts that stem from the two being out of sync.
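
A minimal sketch of that hand-off, assuming a made-up 30-144Hz variable-refresh window (real panels have their own limits): the panel holds the current image until the GPU says a new frame is ready, and if nothing arrives before the slowest allowed refresh it just shows the old frame again.

Code:
MIN_HZ, MAX_HZ = 30.0, 144.0      # assumed variable-refresh window, illustration only
MAX_HOLD_MS = 1000.0 / MIN_HZ     # longest the panel may hold one image (~33.3 ms)
MIN_HOLD_MS = 1000.0 / MAX_HZ     # fastest it can redraw (~6.9 ms)

def panel_behaviour(frame_gaps_ms):
    """For each gap between GPU frames, report what an adaptive-sync panel does."""
    for gap in frame_gaps_ms:
        if gap < MIN_HOLD_MS:
            print(f"next frame after {gap:5.1f} ms -> faster than {MAX_HZ:.0f}Hz, "
                  f"panel waits for its {MIN_HOLD_MS:.1f} ms minimum")
        elif gap <= MAX_HOLD_MS:
            print(f"next frame after {gap:5.1f} ms -> panel refreshes exactly then "
                  f"({1000.0 / gap:5.1f}Hz)")
        else:
            print(f"next frame after {gap:5.1f} ms -> too slow, panel repeats the "
                  f"old frame while it keeps waiting")

panel_behaviour([5.0, 12.0, 25.0, 40.0])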

The question is: does it work as intended? Is it worth paying almost a thousand dollars for a monitor?

From my experience with this monitor, the jury is still out on it... it's new technology that is not fully adopted or evolved yet. It's hard to justify a $900 1440p monitor.
 