It's been almost 5 years since I made any major changes to my main rig. I'm still rocking that Ryzen 1800X, and that surround sound system. I upgraded from 32 GB to 64 GB RAM soon after building it, so it's ready for serious server things, and playing with RAM drives is fun sometimes. I recently got a breathtaking MSI RX 6800 Gaming X Trio 16GB. My old Acer monitors were still going fine, but 4K is the hot new stuff. I have the mindset that if I'm computing but not looking at a monitor, I'm computing wrong, so monitors are important. (That's why I despise RGB LEDs on literally anything, why I use black monolithic computer cases, and why I use unlabeled, unlit keyboards.) So should I go 4K, or go high refresh rate? Why not both?
To my relief, 4K monitors are affordable now. Curious, I looked for options above 100 Hz. A bit pricey, but I recalled that they were even more expensive not too long ago. There are several 28 inch options, and a few 32 inch ones. I was leaning towards the larger monitors when some research revealed that the 28 inch panels were faster and had less ghosting; i.e., better. It turns out that the Innolux M280DCA-E7B panel they are based on is excellent. So 28 inches it is.
Since I've never knowingly seen a high refresh rate monitor in person, and since watching one on YouTube doesn't work, I went to my local halls of hell. Their website had a particular model on sale, and I wondered if there were any in store. There weren't, but there were other high refresh rate monitors on display, and I could tell that motion was indeed smoother. So what's better than one of those? Two! Satisfied that I could tell a difference, I purchased two Samsung Odyssey G7 (model LS28AG700NNXZA) monitors.
I knew these would have tiny pixels like every smartphone and laptop, but none of those have a display area this large. Ever since using a 17 inch 1280×1024 monitor in 2003, I've been desktop computing with PPI in the mid 90s. After setting these up and turning them on, the increased pixel density (157 PPI) blew me away. At 144 Hz, everything looks, moves, and scrolls so smoothly. I swear that these monitors have more saturated colors, too.
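Those PPI figures fall out of a quick diagonal calculation. A minimal sketch, using the panel sizes and resolutions mentioned in this post:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: the diagonal pixel count divided by the diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# 28 inch 4K panel, like the Odyssey G7s in this post
print(round(ppi(3840, 2160, 28)))   # → 157

# the 17 inch 1280×1024 monitor from 2003
print(round(ppi(1280, 1024, 17)))   # → 96
```

That lines up with the mid-90s PPI I've been staring at for nearly two decades, versus 157 PPI now.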
The big concern with increasing resolution is having lower framerates in games. Since Halo is old, it went to 144 FPS at 4K and stayed there, even though it has no low/medium/high/ultra/extreme/WTFOMGBBQ graphical options. I turned down Borderlands 3's settings a bit to get it running (and looking) to my satisfaction in 4K. I poked my head into Cyberpunk, and it does OK in 4K if I turn off ray tracing. I'm happy that my RX 6800 has sufficient power for running modern games at 4K. I'll be capturing screenshots in 4K for my future game "review" posts. Note that images on this blog will still be 1080p max (probably), but full uncompressed versions will be in the backup.
Along with the higher refresh rates, these monitors support variable refresh rates. (I've never seen a 100+ Hz LCD monitor that didn't.) Usually, a monitor finishes scanning out an image before switching to a new one. If the computer can't finish a new image before the scan completes, the monitor redraws the old image. If the computer finishes a new image partway through a scan, that image waits until the scan finishes. If the monitor switched mid-scan instead, you would see part of the old image and part of the new one at the same time: tearing. (I don't like that, but some gamers are fine with it.) With a variable refresh rate, the monitor instead delays the start of its next scan until a new image is available, rather than blindly redrawing the old one. Thanks to that feature, motion looks smooth down to the mid 40s FPS before it starts to feel choppy. But since I have a Steam backlog of Humble Bundles (old games mostly), I'll be stuck at 144 FPS most of the time.
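The fixed-versus-variable scan timing above can be sketched as a toy model. The 144 Hz and 45 FPS numbers match this post; everything else is illustrative, not a measurement of these monitors:

```python
import math
from fractions import Fraction

SCAN = Fraction(1, 144)  # fastest scan interval the panel supports (144 Hz)

def present_fixed(ready):
    """Vsync on a fixed-rate monitor: each finished frame waits
    for the next scan boundary before it is shown."""
    return [math.ceil(t / SCAN) * SCAN for t in ready]

def present_vrr(ready):
    """Variable refresh: the monitor starts a scan as soon as a frame
    is ready, provided the previous scan has finished."""
    shown, last = [], -SCAN
    for t in ready:
        last = max(t, last + SCAN)
        shown.append(last)
    return shown

# a game rendering steadily at 45 FPS, inside the variable refresh range
frames = [Fraction(i, 45) for i in range(1, 7)]

fixed_gaps = {b - a for a, b in zip(present_fixed(frames), present_fixed(frames)[1:])}
vrr_gaps = {b - a for a, b in zip(present_vrr(frames), present_vrr(frames)[1:])}

print(sorted(fixed_gaps))  # two gap lengths: frames land 3 or 4 scans apart (judder)
print(sorted(vrr_gaps))    # one gap length: every frame exactly 1/45 s apart (smooth)
```

At a steady 45 FPS, the fixed-rate monitor shows frames at uneven intervals because 45 doesn't divide 144, while the variable-rate monitor paces every frame evenly, which is why motion stays smooth down into the mid 40s.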
Back in the good old days, my freaky dreaming friend and his brother clung to their CRTs long after everyone else got LCD monitors. (I wonder if "fullscreen" to one of them means "4:3 aspect ratio" instead of "fills the entire screen", like it does for a normal person. If he's still stubborn, he should be eaten by that cow.) Like some other SERIOUS GAMERS at the time, they cited better response times, refresh rates, and resolutions as reasons for keeping their CRTs. I assume that they went LCD around the time I started this blog. I assert that LCDs have finally met or surpassed CRTs in every objective metric that matters.
In my last monitor post, I mused about getting a third. I did, as illustrated by screenshots in my Borderlands articles, but I never really got into using a third monitor. Even though I've replaced the Acers, I still have and use them. When the world changed forever and we all started working from home, I took two 1080p monitors from the office. I've replaced those with two of my old Acers, rotated into portrait mode. (I've asked if they want the old ones back.) While observing International Backup Awareness Day on theandrewbailey.com, the USB cable felt funny when I disconnected the external drive. I figured out that the vibrating sensation was AC power! Something was shorting to ground (through my hands, down to my bare feet, and out the concrete basement floor), and I narrowed it down to the UPS or the CRT monitor. I took those out and put the other monitor down there. USB cables don't vibrate anymore.
Make no mistake, I like these monitors. The only thing that I hate about them is that they have external power bricks. I thought monitors lost those 15 years ago. I'm not sure if these monitors are 100% worth the price, not only in monetary terms, but in processing power, too. If they cost half as much, it would be a no-brainer upgrade. Maybe that's why 1440p exists. I wonder how long it will take me to expect this smoothness everywhere, and consider 1080p60 outdated, like an old 17 inch 1280×1024 TN LCD with huge bezels.