Inside the LG G5’s shocking last-place finish at the 2025 TV Shootout


The 2025 TV Shootout went down over the weekend, and the results are shocking: yes, the Sony Bravia 8 II won the overall competition and my personal award for silliest name, but the LG G5 came in last place by a huge margin. I was one of the judges, and I think I have a sense of what’s going on.

If you’re not familiar, the TV Shootout is an annual event hosted by Value Electronics, a boutique and high-end home theater store started by Robert and Wendy Zohn in 1998. They’ve been holding the event for 21 years now, and Robert proudly begins the occasion by holding up his framed registered trademarks for “TV Shootout” and “King of TV,” which is the title bestowed on the winner. I’ve been following the results for years, so it was a real thrill when Robert asked me to judge last year and equally exciting when he asked me back again this year.

Value Electronics President Robert Zohn.
Photo: Nilay Patel / The Verge

(As Vergecast and Decoder listeners know, I’m out on parental leave for a few months, but Value Electronics is 15 minutes away from my house and staring at TVs in a dark room for several hours with other display nerds is my personal heaven, so I made a tiny exception.)

The event is pretty straightforward: the flagship 65-inch OLED TVs from Sony, LG, Panasonic, and Samsung were each professionally calibrated as closely as possible to reference standards by Dwayne Davis, a professional ISF calibrator familiar to AV forum nerds as D-Nice. The TVs (and MSRP) this year were:

  • LG OLED65G5WUA: $3,399.99
  • Panasonic TV65Z95BP: $3,199.99
  • Samsung QN65S95FAFXZA: $3,299.99
  • Sony K-65XR80M2: $3,499.99

Robert had asked many more manufacturers to participate, and most declined, knowing they could not compete. He also excluded mini LED TVs this year after they didn’t stack up to the OLEDs last year; he plans to have a separate shootout for those later.

The Shootout judges were all professional display experts who work in and around the film industry. Many of them have been judging the Shootout for years now. They were:

  • Ilya Akiyoshi, a cinematographer who’s worked on The White Lotus and Captain America: Civil War
  • Todd Anderson, a certified THX calibrator and host of the Home Theater News Review podcast
  • Chris Boylan, an ISF-certified calibrator and editor-at-large of eCoustics
  • Jason Dustal, an ISF calibration instructor and co-chair of the CEDIA standards committee
  • Jeffrey Hagerman, a cinematographer and colorist
  • Cecil Meade, an ISF-certified calibrator known as ClassyTech on the AV forums
  • John Reformato, an ISF-certified calibrator
  • Mike Renna, an ISF-certified calibrator
  • Richard Drutman, a filmmaker
  • David Mackenzie, CEO of Fidelity in Motion, a compression and mastering company
  • And, of course, me

The rest of the room was filled with engineers and marketing folks from Sony, LG, and Samsung, several YouTubers, and various other display nerds, all paying close attention to the judging and the differences between the displays.

The judges were asked to objectively evaluate how closely the images on each set matched a pair of $43,000 Sony BVM-HX3110 professional reference monitors across a number of categories in a very dark room, using both test patterns and real content delivered from a Panasonic Blu-ray player, a Kaleidescape streaming box, and an Apple TV, all switched by an AVPro Edge 8×8 HDMI matrix and delivered over Bullet Train optical HDMI cables.

The closer the image was to those BVM reference displays, the higher the score, and the further from the reference, the lower the score. There were categories in which some TVs might have looked subjectively better than the reference displays, particularly in dark scenes where all the TVs tended to boost shadow detail to be more visible. But the judges were instructed to give lower scores for deviating from the reference in either direction. We were also instructed not to compare the TVs to one another, only to the reference monitors.

It was only the final category, “bright room out of the box,” that was totally subjective, and in which we were allowed to compare the TVs to each other. As the name suggests, the shades were opened in the room, and the TVs were set to uncalibrated filmmaker modes with energy-saving features turned off. More on this in a moment.

As ever, this means the Shootout ultimately delivers a very specific kind of winner: the TV that can be most closely calibrated to match an expensive professional reference display when viewed in a dark room. We didn’t look at anything else at all: not gaming features, number of HDMI inputs, operating systems, or even Dolby Vision support (which the Samsung does not have). This whole thing was about the limits of picture quality and picture quality alone. There are a lot of reasons you might pick any of these TVs that have nothing to do with how closely they can be calibrated to match a reference display, but that’s not what the Shootout is about.

It’s a big upgrade year for OLED TVs: Panasonic is back in the US market with the Z95B, and there are new panel technologies in the mix. LG and Panasonic are using tandem OLED panels for the first time, while Sony and Samsung are using new, brighter QD-OLED panels. (You can pretty easily surmise that Samsung is providing the QD-OLEDs and LG is behind the tandems, but none of the manufacturers will confirm anything.)

The underlying commonality of the panels means the Shootout really stresses the image processing differences between the manufacturers, and the results were fascinating. Panasonic had an incredibly strong showing, coming in first on the HDR tests and third overall by only a hair. Sony won the King of TV title for the seventh year in a row, which will do nothing to quell critics who say that measuring how close everything can come to a Sony reference display means Sony will always win. But the Samsung was a very close second, and to my eye, it only really fell behind because Samsung cannot help itself when it comes to colors — everything was generally a little more saturated and vibrant than the reference display.

SDR Voting Categories

| Manufacturer | Contrast / Grayscale | Color | Processing | Bright Living Room | Overall Average |
| --- | --- | --- | --- | --- | --- |
| LG OLED65G5WUA | 3.69 | 3.84 | 3.31 | 4.06 | 3.68 |
| Panasonic TV65Z95BP | 3.84 | 3.97 | 3.78 | 4.25 | 3.92 |
| Samsung QN65S95FAFXZA | 4.38 | 3.88 | 3.66 | 4.19 | 4.00 |
| Sony K-65XR80M2 | 4.41 | 3.84 | 4.22 | 4.19 | 4.16 |

Best SDR TV is: Sony’s K-65XR80M2
Data: Value Electronics

HDR Voting Categories

| Manufacturer | Dynamic Range / EOTF Accuracy | Color | Processing | Bright Living Room | Overall Average |
| --- | --- | --- | --- | --- | --- |
| LG OLED65G5WUA | 3.41 | 2.84 | 3.34 | 3.94 | 3.30 |
| Panasonic TV65Z95BP | 4.03 | 4.00 | 3.97 | 3.88 | 3.98 |
| Samsung QN65S95FAFXZA | 3.88 | 4.13 | 3.72 | 4.38 | 3.97 |
| Sony K-65XR80M2 | 3.94 | 4.03 | 3.53 | 4.19 | 3.88 |

Best HDR TV is: Panasonic’s TV65Z95BP
Data: Value Electronics

The shocker was the dismal showing by the LG G5, a hotly anticipated set because of that new tandem OLED panel. There’s no other way to say it: the G5 basically failed several of the tests, showing the wrong colors on some of the linearity test patterns, big posterization artifacts in dark scenes, a slight green cast that kept reappearing, and an overall tendency to push color and brightness in dark scenes in ways you didn’t need to be a display nerd to see. The LG made Sansa Stark look like she had a blocky red rash during a particularly dim Game of Thrones scene that the Sony and Samsung handled nearly perfectly. “There are lots of problems with the LG this year,” said judge Cecil Meade. I heard other judges say, “Have you seen what the LG is doing?” more than once. Indeed, the G5 was so far off on some of the test patterns that Dwayne reminded the judges that the lowest possible score was 1, not 0. This is generally a bad sign.

If I had to explain why the LG did so poorly while the Panasonic did so well using the same panel, I’d put it down to confidence, bordering on cockiness. The test patterns tended to reveal that Panasonic’s image processing is strictly by the book — the new kid in school playing exactly by the rules, while the other manufacturers have all learned where they want to push things or make their own choices.

The testing lineup.
Photo: Nilay Patel / The Verge

A simple example is HDR detail: the Panasonic dutifully accepts the metadata of the HDR content it’s presented and doesn’t display any detail beyond the listed brightness. All the other manufacturers have learned that HDR metadata is often inaccurate, so they analyze the content directly to figure out how best to display it, which often resulted in additional detail being shown. That might earn a lower technical Shootout score, since it’s a deviation from the strict reference image, but TV makers all do it because they’ve learned that consumers will reliably complain about losing detail in the highlights and shadows, not about having too much.

These little tricks and tactics are both the result of experience building these displays and what feels like obvious attempts to differentiate in the market. Sony prides itself on reference-level restraint, and it tends to get that result, while Samsung uses the same panel to deliver punched-up Samsung-style colors. And I would say, based on LG’s third-place showing in the Shootout last year, that LG has learned a vivid, contrast-y OLED look sells way more TVs than the ability to calibrate closely to a reference display.

Everything came to a head in the “bright room out of box” test, which was fairly controversial in the room. It’s a totally subjective test with no real standard to measure against, and all the manufacturers spend almost all their engineering time making sure they look great this way because, well, most people put their TVs in a bright room and never change the settings. There’s no way to really rate TVs of this caliber against each other on this test — it really comes down to personal preference. “They’re all fives — they’re all bright, they’re all colorful. What else is there to say?” said David Mackenzie, a judge on the panel who also helped author the UHD specifications. You can see it in the scores, where the LG managed to pull itself back into contention and the saturated colors of the Samsung pushed it into a commanding lead in the HDR test. I would go so far as to argue the bright room scores are important but should be taken out of the averages that determine the winners, because they’re essentially a wild card.

Here’s my scorecard.
Photo: Nilay Patel / The Verge

And it’s true: the fine differences between these sets take a dark room and a lot of time and calibration to see. Anyone just putting one on the wall will undoubtedly be happy with their purchase, especially after factoring things like HDMI ports and Dolby Vision into the decision. I have both Sony and LG OLED TVs that reliably wow everyone who looks at them, and a lot of people love the contrast-y LG OLED look — and LG’s cheaper price tags.

But if you’re chasing reference-level image perfection, it’s another year for Sony, while it feels like LG has all but abandoned this particular game. And I’d guess Panasonic is going to put up an even bigger fight next time around.
