I use a 1080p monitor, and I’ve noticed that once creators start uploading 4K content, the 1080p version I watch in fullscreen has more artifacting than when they only uploaded in 1080p.

Did you notice that as well?

Watching in 1440p on a 1080p monitor results in a much better image, at the cost of what is theoretically a less sharp picture and much higher CPU usage.

  • DdCno1@beehaw.org
    17 days ago

    There’s something else that hasn’t been mentioned yet: Video games in particular have been so detailed since the eighth generation (XB1/PS4) that 1080p on YouTube, with its significant compression artifacts, swallows too many of those fine moving details: foliage, sharp textures, lots of moving elements (like particles), and full-screen effects that modify nearly every pixel of every frame.

    And no, you will not get a less sharp image by downsampling 1440p or even 4K to 1080p - quite the contrary. I would recommend you take a few comparison screenshots and see for yourself. I have a 1440p monitor and prefer 4K content: it definitely looks sharper, even down to fine-grained detail. I did the same when I had a 1200p screen, preferring 1440p content back then (at least as soon as it was available - the early years were rough).
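
    A quick way to do that comparison (just a sketch, assuming ffmpeg is installed and you have downloaded both a 1080p and a 4K copy of the same video - the file names and timestamp below are placeholders):

    ```python
    # Grab the same frame from a 1080p and a 4K download of the same video,
    # downscale the 4K frame to 1080p, and save both as PNGs for a
    # side-by-side comparison. Assumes ffmpeg is on PATH.
    import subprocess

    def grab_frame(src, out_png, timestamp="00:01:00", scale_to_1080p=False):
        """Extract one frame; optionally downscale it to 1920x1080 (Lanczos)."""
        cmd = ["ffmpeg", "-y", "-ss", timestamp, "-i", src]
        if scale_to_1080p:
            cmd += ["-vf", "scale=1920:1080:flags=lanczos"]
        cmd += ["-frames:v", "1", out_png]
        subprocess.run(cmd, check=True)

    grab_frame("video_1080p.webm", "native_1080p.png")
    grab_frame("video_2160p.webm", "downscaled_from_4k.png", scale_to_1080p=True)
    ```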

    If you are noticing high CPU usage at higher video resolutions, it’s possible that your GPU is too old to handle the latest codecs - or that your operating system (since you’re on Linux, based on your comment history) doesn’t have the right drivers to take advantage of the GPU’s decoding capability and/or is struggling with certain codecs. Under normal circumstances, there should be no noticeable increase in CPU usage at higher video resolutions.
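
    One way to check what the driver actually exposes (a rough sketch, Linux-only, assuming a VA-API driver and the vainfo tool from libva-utils; NVIDIA cards go through NVDEC/VDPAU instead, so this won’t cover them):

    ```python
    # Ask vainfo which codec profiles the GPU driver exposes for hardware
    # decoding (decode support shows up as a VAEntrypointVLD entrypoint).
    import subprocess

    result = subprocess.run(["vainfo"], capture_output=True, text=True)
    output = result.stdout + result.stderr  # libva prints some info to stderr

    for codec in ("H264", "HEVC", "VP9", "AV1"):
        supported = any(
            codec in line and "VAEntrypointVLD" in line
            for line in output.splitlines()
        )
        print(f"{codec:5s} hardware decode: {'yes' if supported else 'no'}")
    ```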

    • kevincox@lemmy.ml
      17 days ago

      It may be worth right-clicking the video and choosing “Stats for Nerds”; this will show you the video codec being used. For me, 1080p is typically VP9 while 4K is usually AV1. Since AV1 is a newer codec, it is quite likely that you don’t have hardware decoding support for it.
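
      If you’d rather script it, something like this lists the codec offered at each resolution (a minimal sketch using yt-dlp’s Python API - assumes pip install yt-dlp, and the URL is a placeholder; running yt-dlp -F <url> on the command line gives the same information):

      ```python
      # List which video codec YouTube offers at each resolution for one video.
      from yt_dlp import YoutubeDL

      URL = "https://www.youtube.com/watch?v=VIDEO_ID"  # placeholder

      with YoutubeDL({"quiet": True}) as ydl:
          info = ydl.extract_info(URL, download=False)

      for f in info["formats"]:
          if f.get("vcodec") not in (None, "none"):  # skip audio-only formats
              print(f"{f.get('height')}p  ->  {f['vcodec']}")
      ```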