LG 32GK850G QHD G-Sync Gaming Monitor Reviewed

When it comes to gaming monitors I’ll admit to having little experience and perhaps even little concern for what window I view my gaming through. I don’t play many fast-paced action games, so that competitive edge is meaningless to me. For the most part, in fact, I’m writing words or writing code.

But despite being a haughty sceptic of 144Hz gaming, or any refresh rate above 60Hz for that matter, I found that the 32GK850G had a surprisingly meaningful impact on my game.

Whether it was the G-Sync – a technology that synchronises the monitor’s refresh with the GPU’s frame output, so the game isn’t forced to render at a fixed rate – or the monitor’s ability to display 144 frames per second, I don’t fully know. But the important thing is that it felt… different.

LG’s 32GK850G is one of the more aesthetically pleasing gaming monitors. While it’s very obviously gamer at a glance, it isn’t quite so outrageously gaudy as some of its contemporaries and has a strong focus on clean lines and refined presentation that is sorely lacking in the gamer market.

Like the awesome LG 38UC99-W UltraWide monitor, this monitor has a 4-way navigator underneath the front edge that serves as the sole input for the OSD. I *love* these little joysticks, since they feel much, much more natural than the arbitrarily positioned straight line of buttons you find on so many other monitors. It’s always clear at a touch just where the directions are, and which button will select what you’re currently focused on. By contrast, the inputs on other monitors exploit no such intuition and are a complete pain to use.

The 32GK850G includes HDMI, DisplayPort, 3.5mm audio and, thankfully, a USB hub for routing your mouse and keyboard (if you’re a weirdo like me who still staunchly uses wired peripherals) neatly across your desk.

Getting the blasted thing working…

I’m still rocking my Razer Blade laptop. While I have a love-hate relationship with it, I’m forced to admit that it’s served me well and is still running solidly as it approaches two years of service. Well, not quite two years, if you count that the first one had to be replaced!

One key problem with my Razer Blade, however, is that it’s incapable of driving a G-Sync monitor. This is, insofar as I can determine, because of the Optimus technology used to switch between the NVidia GTX1060 discrete GPU and the Intel HD Graphics 530. Optimus has no hardware multiplexing: rather than outputting a video signal directly from the discrete GPU, the Optimus software decides whether an application’s rendering calls should be passed to it, and the resulting frames are output via the Intel GPU, which is physically connected to the display.

This means all video from the GTX1060 has to pass through the HD Graphics 530 on its way to the outputs, and this signal path prevents G-Sync from working.

The lack of G-Sync support internally isn’t a problem for eGPUs though. Connecting another NVidia GTX1060 to my Razer Blade via Thunderbolt 3 grants the system a GPU that *is* physically connected to its accompanying display. The result? G-Sync is available, and I could experiment with 144Hz magic.

Firing up a game…

NVidia’s drivers make it relatively simple to manage an external GPU and generally speaking I could hot plug it, launch a game and have the right GPU and display fire up. This is, of course, academic if you’re connecting this display directly to a desktop computer.

My game of choice in this case is, as always, Natural Selection 2. It’s my go-to game for testing anything, and it’s also a twitch shooter with frustrating netcode and a heavy demand on keen reaction times.

It’s also a game that very dramatically changes detail levels from area to area: at one point you might be in an empty corridor moving from A to B, and at another you might be in an active base with a dozen other players and structures all doing their thing. This means it also dramatically changes framerates, and this is where G-Sync really came into its own for me.

Previously, with just my fixed 60Hz laptop display, this led to tearing and stuttering as the framerate and refresh rate fell in and out of sync with each other. It wasn’t great. With the G-Sync display the framerate was free to dip, and rise, as and when the game demanded, and I *never* saw any tearing or stuttering. At least no stuttering that wasn’t caused by the game dropping to slideshow framerates.
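A toy simulation makes the difference concrete. This is not how any driver actually works, just a sketch of the arithmetic: with vsync on a fixed 60Hz panel, a finished frame has to wait for the next scanout boundary, so a frame that takes even slightly longer than ~16.7ms is held on screen for two whole intervals, while an adaptive-sync panel simply refreshes when the frame is ready.

```python
import math

# Hypothetical render times (in ms) for a run of frames, the kind of
# variation you get moving between a quiet corridor and a busy base.
frame_times_ms = [14.0, 15.5, 18.0, 22.0, 16.0, 15.0]

REFRESH_MS = 1000 / 60  # one scanout interval at 60 Hz, ~16.7 ms

def fixed_refresh_display_times(frame_times):
    """With vsync at a fixed 60 Hz, each frame stays on screen until the
    next scanout boundary after it finishes rendering, so an 18 ms frame
    occupies the panel for two intervals (33.3 ms): visible stutter."""
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in frame_times]

def adaptive_sync_display_times(frame_times):
    """With G-Sync-style adaptive sync, the panel refreshes when the frame
    is ready, so on-screen time simply tracks render time (within the
    panel's supported refresh range)."""
    return frame_times[:]

fixed = fixed_refresh_display_times(frame_times_ms)
adaptive = adaptive_sync_display_times(frame_times_ms)
print([round(t, 1) for t in fixed])     # the 33.3 ms entries are the stutter
print([round(t, 1) for t in adaptive])  # smooth: tracks the game's own pacing
```

The 18ms and 22ms frames both land as 33.3ms holds on the fixed-refresh panel – an instantaneous drop to 30fps – which is exactly the judder that vanished with G-Sync.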

This was, by far, the biggest impact the display had on my gaming experience. Additionally, I’d swear the ability to crank out over 60Hz afforded a minor improvement to my play. It’s not a game I’m great at when it comes to coordination and skill, but I definitely noticed myself playing better.

Desktop use

While not 4K, as some sources may suggest (seriously, LG’s own product page title reads “32 inch UHD 4K Monitor | LG 32GK850G | LG UK”… QHD and 4K are not the same!), the 32GK850G’s QHD 2560×1440 resolution – exactly four times the pixels of 720p, what the TV industry used to call “HD Ready” – is spacious enough for handling productivity and SRS BZNS alongside gaming. My mainstay workstation monitors are the same resolution, and I find it’s a pretty good sweet spot at the moment.
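Since the naming gets muddled so often, the pixel maths is worth spelling out:

```python
# Sanity-checking the resolution names thrown around for this panel.
qhd = 2560 * 1440   # what the 32GK850G actually is
hd  = 1280 * 720    # 720p, the old "HD Ready"
uhd = 3840 * 2160   # 4K UHD, which this monitor is not

print(qhd // hd)    # QHD is exactly 4x the pixels of 720p
print(uhd / qhd)    # 4K would be another 2.25x on top of QHD
```

So calling it “4K” doesn’t just round up; it overstates the pixel count by more than double.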

It, of course, works perfectly fine in Linux too. It’s a monitor. Why wouldn’t it?

Backlighting

I’m in two minds about the backlight feature of this monitor. While LG proudly touts it as “Sphere Lighting”, it’s ultimately just a ring of LEDs arranged around the back of the monitor. Indeed, the LEDs are so completely unintegrated that they can’t even be adjusted from the OSD; they have their own separate controls for changing mode. This feels like a missed opportunity to me.

What I’d have liked to see – although I appreciate Philips may be a somewhat difficult beast to wrangle with on this one – is some form of Ambilight integration or, at the *very* least, support for controlling the lighting from the connected computer so that one of the various existing dynamic ambient-lighting applications could drive it. In its current state, LG’s “Sphere Lighting” is no different from buying a cheap colour-changing light and standing it behind your LCD. Yes, it’s nicely packaged into the back, looks good and works well, but it feels ultimately pointless, since the majority of the time it will never match what’s on your screen and can’t be hooked into, for example, Razer’s Chroma API.

What really drives home the frustration of this miss is that the 32GK850G has a USB hub built right in. It would have been trivial to develop, or appropriate, a USB LED controller and hook it up to the hub internally. How do I know this? We’ve done it with the Mote USB lighting stick. If a little Sheffield-based company with a handful of engineers can produce a USB-connected LED lighting product, then LG have no excuse. Rest assured, if I’d owned this monitor I’d have hacked it open in a flash and modified it to be USB-controlled.
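The software side of what I’m asking for isn’t exotic either. Here’s a hypothetical sketch – every name here is mine, none of it is an API the monitor actually exposes – of the core of a screen-matching ambient light: average the border pixels of a frame, then pack one RGB triple per LED as the kind of flat payload a simple USB-serial LED controller might accept. The frame below is a synthetic 4×4 image; a real implementation would grab the screen with a capture library and write the bytes to the serial device.

```python
# Hypothetical ambient-lighting core: none of these functions correspond
# to any real API on the 32GK850G, which offers no such control.

def border_pixels(frame):
    """Yield the (r, g, b) pixels along the outer edge of a 2D frame."""
    h, w = len(frame), len(frame[0])
    for y in range(h):
        for x in range(w):
            if y in (0, h - 1) or x in (0, w - 1):
                yield frame[y][x]

def average_colour(pixels):
    """Average a sequence of (r, g, b) tuples channel by channel."""
    pixels = list(pixels)
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) // n for c in range(3))

def led_payload(colour, led_count):
    """Repeat one colour per LED as a flat byte string, the sort of
    payload a simple serial LED controller might accept."""
    return bytes(colour) * led_count

frame = [[(255, 0, 0)] * 4 for _ in range(4)]   # an all-red test frame
colour = average_colour(border_pixels(frame))
print(colour)                         # (255, 0, 0)
print(len(led_payload(colour, 24)))   # 72 bytes for a 24-LED ring
```

That’s a few dozen lines; sampling regions of the border per LED instead of one global average is an easy extension. The hard part LG already shipped: the LEDs, the diffuser and the USB hub.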

I’m flabbergasted that LG missed this trick! Instead the best they’ve got is “Mode 3. Dynamic Lighting. Funky and dynamic!” I love your monitors, LG, I really do. But cycling a hue around your lighting ring, while it might look good for a laugh, barely qualifies as “dynamic” and is completely useless, if not outright distracting, in any normal usage scenario.

Overall

This is yet another display that I can say I’d love to own. Since I’ve had no basis for comparison with other G-Sync displays, however, I can’t really place it in the wider market. What I can say, though, is that I enjoyed gaming on this monitor so much that I forgot to get around to actually writing this review! Better late than never, eh?

If you’re looking at it for the dynamic lighting, you’ll be disappointed by how little control is afforded to you, although, to be fair to LG, the lighting is purposeful: it can help counter the eye fatigue otherwise caused by staring at a bright rectangle in a dark room.

If you’re looking just to up your game with G-Sync, then this does the trick, but – presumably due to the addition of the lighting – it isn’t priced quite so keenly as its competition.