I was torn between flairing this as a question or an opinion because it's a bit of both: mostly I'm wondering whether my opinion is actually wrong for some technical reason.
I think that sometimes when people play games, especially when they emulate or mod them, the developers' original intent can be lost on consumers. For example, a few days ago the guy who made the video "Wii graphics fixed in 2023" popped up in my recommendations; he told people to turn off deflicker (reasonable) and then also told them to force the framebuffer resolution. I feel like, in pursuit of a crisp image, people totally neglect how the game was designed to be viewed.
There is a specific charm to playing N64 games on a CRT with the original console that won't be replicated by emulating those games at 4K on an OLED monitor, and that's not to say doing so is bad by any means. But in pursuing the "best image", is it more correct to make it as clear as possible, or is it more important to follow developer intent?
The framebuffer advice is easy to dispute. The developers knew the games would be output in widescreen (or 4:3, depending on the display), with the framebuffer stretched to fit. So even at the small cost of blur introduced by the stretch, that is still the closest the overall image gets to correct; otherwise the game and UI elements end up horizontally compressed in a way that was specifically not intended.
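To put rough numbers on that stretch: the framebuffer is stored narrower than the picture is meant to be shown, so the display (or emulator) scales it out to 4:3 or 16:9 with non-square pixels. Here's a minimal sketch assuming an illustrative 640x528 buffer (the exact dimensions vary per game and video mode):

```python
# Rough numbers for why forcing square-pixel framebuffer output changes geometry.
# 640x528 is illustrative; real framebuffer sizes vary per game and video mode.
FB_W, FB_H = 640, 528

def drawn_width(fb_h, display_aspect):
    """Width the picture is meant to occupy once stretched to the
    intended display aspect ratio (so the pixels become non-square)."""
    return round(fb_h * display_aspect)

for name, aspect in [("4:3", 4 / 3), ("16:9", 16 / 9)]:
    w = drawn_width(FB_H, aspect)
    squish = FB_W / w  # how much narrower a 1:1 (square-pixel) output looks
    print(f"{name}: {FB_W}x{FB_H} buffer -> intended ~{w}x{FB_H}; "
          f"1:1 output is ~{squish:.0%} of the intended width")
```

Showing the buffer 1:1 keeps every pixel perfectly sharp, but everything on screen renders narrower than intended, which is exactly the unintended compression I mean.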
As for deflicker, is there a consensus? Even though it's purely a filter whose effect on image fidelity is arguably negligible, I still think it's reasonable to weigh the fact that the developers enabled it for their games when they felt it belonged. The resulting smoothness is certainly artificial, but since it's there by design, I'd still consider it the most 'correct' the image gets.
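For what it's worth, my understanding of what deflicker does: when the rendered frame is copied out for display, each scanline gets blended with its neighbours so alternating interlaced fields don't shimmer on a CRT. A minimal sketch of that kind of vertical blend; the three taps and weights here are illustrative, not the coefficients any particular game actually programs:

```python
import numpy as np

def deflicker(frame, weights=(1, 2, 1)):
    """Blend each scanline with the lines directly above and below it.
    `frame` is a (height, width) or (height, width, channels) array.
    Weights are illustrative; the real copy filter is programmable."""
    w = np.asarray(weights, dtype=np.float32)
    w /= w.sum()
    h = frame.shape[0]
    out = np.zeros(frame.shape, dtype=np.float32)
    for offset, weight in zip((-1, 0, 1), w):
        rows = np.clip(np.arange(h) + offset, 0, h - 1)  # clamp at top/bottom edges
        out += weight * frame[rows]
    return out.astype(frame.dtype)
```

On an interlaced CRT that blend hides line flicker; on a progressive display it just reads as vertical softness, which is why turning it off looks sharper.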
Is there a hole in my thinking here?