The Study Nobody Was Ready For
German tech publication ComputerBase ran a blind test that gaming Twitter is still arguing about. They showed 6,747 readers side-by-side video clips of six real, popular games rendered three ways: DLSS 4.5 upscaling in Quality mode, AMD FSR, and native 4K rendering, all output at the same resolution. Respondents picked their favorite, or could mark "equivalent" if they couldn't tell the difference. The results were not close [1].
48% Don't Care About Native Anymore
DLSS 4.5 won in every single game. Overall: 48% of votes went to DLSS. Native rendering came second at 24%. FSR landed at 15%. About 13% marked all three as equivalent — one in eight gamers who genuinely couldn't tell [1].

The games weren't synthetic benchmarks. They were titles people are actually playing right now: Arc Raiders, Cyberpunk 2077, Horizon Forbidden West, Satisfactory, The Last of Us Part Two, and one other. These are showpiece games — visually demanding, built with real artistry. The fact that DLSS beat native in these games specifically isn't something you can wave away by saying the test used unoptimized software [1].

What the test used was Nvidia's iCat player, showing clips side by side. You could argue about the format — it's video, not live gameplay — but Linus made a point worth sitting with: if anything, DLSS should be harder to prefer in a static video comparison, because the latency advantage you'd get in real gameplay doesn't exist in a clip. DLSS upscaling (not frame gen) typically improves frame rates because you're rendering at a lower resolution before the ML upscale step [1].
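That last point is easy to quantify. A minimal sketch, assuming the commonly cited ~2/3 per-axis render scale for Quality mode (the exact ratio is Nvidia's, not stated in the article): shading cost scales roughly with pixel count, so the GPU does well under half the per-frame pixel work before the upscaler runs.

```python
# Sketch of why upscaling raises frame rates: at DLSS Quality's
# internal resolution, the GPU shades far fewer pixels than at
# native 4K, then the ML model upscales to the output resolution.
# The 2/3 per-axis scale is an assumption (commonly cited figure).

NATIVE_4K = (3840, 2160)
QUALITY_SCALE = 2 / 3  # assumed per-axis render scale for Quality mode

def internal_resolution(output, scale):
    """Internal render resolution before the ML upscale step."""
    w, h = output
    return round(w * scale), round(h * scale)

w, h = internal_resolution(NATIVE_4K, QUALITY_SCALE)
pixel_ratio = (w * h) / (NATIVE_4K[0] * NATIVE_4K[1])
print(f"{w}x{h} internal, {pixel_ratio:.0%} of native pixel work")
# → 2560x1440 internal, 44% of native pixel work
```

In other words, at these assumed ratios a 4K Quality-mode frame is shaded at 1440p — roughly 44% of the pixels — which is where the frame-rate headroom comes from.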



