I use a 1080p monitor, and I’ve noticed that once creators start uploading 4K content, the 1080p version I watch in fullscreen has more artifacting than when they only uploaded in 1080p.

Did you notice that as well?

Watching in 1440p on a 1080p monitor results in a much better image, at the cost of a theoretically less sharp image and much higher CPU usage.

  • MrSoup@lemmy.zip · 6 days ago

    YouTube automatically generates lower-resolution versions of each uploaded video.
    So when you watch a 4K video and switch to 1080p, you are no longer watching the original upload but one re-encoded by YouTube itself, which can have more artifacts since it’s resized and compressed.

    I dunno the exact specs (like bit rate, etc.), someone will probably add them in another reply.

    • Maxy@lemmy.blahaj.zone · 6 days ago

      I believe YouTube always re-encodes the video, so the video will contain (extra) compression artefacts even if you’re watching at the original resolution. However, I also believe YouTube’s exact compression parameters aren’t public, so I don’t believe anyone outside of YouTube itself knows for sure which videos are compressed in which ways.

      What I do know is that different content also compresses in different ways, simply because some video is easier or harder to compress. IIRC, shows like Last Week Tonight (mostly a static camera pointed at a host) are way easier to compress than faster-paced content, which (depending on the previously mentioned unknown parameters) could have a large impact on the amount of artefacts. This makes it more difficult to compare different videos uploaded at different resolutions.

      • MrSoup@lemmy.zip · 6 days ago

        YouTube always re-encodes the video

        You are right. For example, you can upload an AVI to YouTube, but they will never host and stream an AVI.

        • DdCno1@beehaw.org · 6 days ago

          AVI is a container, not a codec. An AVI container can contain video encoded with any kind of codec (unlike some other container formats, which are more restrictive). If you wanted to, you could put e.g. a VP9 or AV1 video stream (i.e. the ones YouTube is using) into an AVI container. In theory at least, if you uploaded an AVI file containing VP9 video, YouTube could just extract it from the container and stream it as is, but they’ll still re-encode it. Before you think that all of this talk of modern codecs in AVI containers is purely theoretical, AVI is used as a standard for archiving by some institutions, so it’s more relevant than you might think.

          However, you are partially right in that AVI cannot be used for streaming, not just by YouTube but in general, since this requirement obviously wasn’t taken into account when the format was introduced in 1992 and thus was never incorporated into the standard.

      • kevincox@lemmy.ml · 6 days ago

        Just to be clear, it is probably a good thing that YouTube re-encodes all videos. Video is a highly complex format and decoders are prone to security vulnerabilities. By transcoding everything (in a controlled sandbox), YouTube takes on most of this risk and makes it highly unlikely that the resulting video they serve to the general public is able to exploit any bugs in decoders.

        Plus YouTube serves videos in a variety of formats and resolutions (and now different bitrates within a resolution). So even if they did try to preserve the original encoding where possible you wouldn’t get it most of the time because there is a better match for your device.

        • Maxy@lemmy.blahaj.zone · 5 days ago

          I agree that, theoretically speaking, YouTube might be protecting some end users from this type of attack. However, the main reason YouTube re-encodes video is to reduce (their) bandwidth usage. I think it’s very kind towards YouTube to view this as a free service to the general public, when it’s mostly a cost-cutting measure.

  • kevincox@lemmy.ml · 6 days ago

    I’m pretty sure that YouTube has been compressing videos harder in general. This loosely correlates with their release of the “1080p Enhanced Bitrate” option. But even 4k videos seem to have gotten worse to my eyes.

    Watching at a higher resolution is definitely a valid strategy. Optimal video compression is very complicated, and while compressing at the native resolution is more efficient, you can only go so far with fewer bits. Since the higher-resolution versions have higher bitrates, they simply have more data available and will give an overall better picture. If you are worried about possible fuzziness, you can try 4K rather than 1440p, as it is a clean doubling of 1080p, so you won’t lose any crisp edges.
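
    For what it’s worth, the “clean doubling” point is just arithmetic on standard 16:9 frame sizes; a tiny sketch:

    ```python
    # Why 4K -> 1080p downscales more cleanly than 1440p -> 1080p:
    # a 2.0x ratio means every output pixel maps to a whole 2x2 block of source pixels,
    # while a fractional ratio means source pixels straddle output pixels.
    resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
    target_w, _ = resolutions["1080p"]

    for name, (w, h) in resolutions.items():
        ratio = w / target_w
        kind = "integer scale" if ratio.is_integer() else "fractional scale"
        print(f"{name}: {w}x{h}, downscale ratio {ratio:.2f} ({kind})")
    ```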

  • chunkystyles@sopuli.xyz · 6 days ago

    YouTube compresses the shit out of 1080p content. Any video that has a lot of movement will look like trash at 1080p. Even if you’re on a lower resolution monitor, the higher bit rate of higher resolution videos will look better. It’s all very stupid on our end, but I assume it saves them a ton on bandwidth.

  • Ace! _SL/S@ani.social · 6 days ago (edited)

    That’s because YouTube, for example, uses a bitrate of roughly 4-7 Mbps for 1080p. 1440p gets around 13 Mbps and 4K something like 46 Mbps, IIRC.

    Other media providers are similarly bad with their bitrates.
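
    Taking those (from-memory, unofficial) numbers at face value and assuming ~30 fps, a rough calculation shows how thinly the 1080p stream is spread across the frame:

    ```python
    # Bits available per pixel per frame for the bitrates quoted above (assumed
    # values, not official YouTube specs); lower means the encoder has less to work with.
    quoted = {
        "1080p": (1920, 1080, 5_000_000),   # middle of the quoted 4-7 Mbps range
        "1440p": (2560, 1440, 13_000_000),
        "4K":    (3840, 2160, 46_000_000),
    }
    fps = 30

    for name, (w, h, bitrate) in quoted.items():
        bpp = bitrate / (w * h * fps)
        print(f"{name}: ~{bpp:.3f} bits per pixel per frame")
    ```

    By that (admittedly rough) measure, the 1080p stream gets less than half the data per pixel of the 4K stream.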

    • Peter1986C@lemmings.world · 6 days ago

      For AV1 that could still be okay, lol. It would be kind of meh for e.g. H.264, but YT doesn’t even use that anymore AFAIK.

      • Ace! _SL/S@ani.social · 6 days ago

        YouTube uses VP9 for all resolutions most of the time. 1080p and below also offer AVC (H.264) as a fallback encoding.
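
        If you want to check this for a particular video yourself, here’s a small sketch using the yt-dlp Python package (the URL is a placeholder, and the exact fields returned can vary per video); it lists which codec and bitrate each resolution is actually served with:

        ```python
        # List the video-only formats YouTube offers for one video, with codec and bitrate.
        # Requires the yt-dlp package; the URL is a placeholder.
        from yt_dlp import YoutubeDL

        URL = "https://www.youtube.com/watch?v=VIDEO_ID"  # placeholder

        with YoutubeDL({"quiet": True}) as ydl:
            info = ydl.extract_info(URL, download=False)

        for f in info["formats"]:
            # keep video-only streams (YouTube serves audio separately)
            if f.get("vcodec") not in (None, "none") and f.get("acodec") == "none":
                print(f"{f.get('height')}p: codec={f.get('vcodec')}, ~{f.get('tbr')} kbps")
        ```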

  • DdCno1@beehaw.org · 6 days ago

    There’s something else that hasn’t been mentioned yet: video games in particular have become so detailed since the eighth console generation (XB1/PS4) that 1080p, with its significant compression artifacts on YouTube, swallows too many of those fine moving details: foliage, sharp textures, lots of moving elements (like particles) and full-screen effects that modify nearly every pixel of every frame.

    And no, you will not get a less sharp image by downsampling 1440p or even 4K to 1080p; on the contrary. I would recommend you take a few comparison screenshots and see for yourself. I have a 1440p monitor and prefer 4K content - it definitely looks sharper, even down to fine-grain detail. I did the same when I had a 1200p screen, preferring 1440p content back then (at least as soon as it was available - the early years were rough).

    If you are noticing high CPU usage at higher video resolutions, it’s possible that your GPU is too old to handle the latest codecs - or that your operating system (since you’re on Linux based on your comment history) doesn’t have the right drivers to take advantage of the GPU’s decoding ability and/or is struggling with certain codecs. Under normal circumstances, there should be no increase in CPU usage at higher video resolutions.

    • kevincox@lemmy.ml · 6 days ago

      It may be worth right-clicking the video and choosing “Stats for Nerds”; this will show you the video codec being used. For me 1080p is typically VP9, while 4K is usually AV1. Since AV1 is a newer codec, it is quite likely that you don’t have hardware decoding support for it.

  • stealth_cookies@lemmy.ca · 6 days ago

    The one I’ve noticed is that for videos with the 1080p “Enhanced Bitrate” option, the free 1080p video looks like a blurry mess compared to normal 1080p content.

    • kevincox@lemmy.ml · 6 days ago

      From my experience it doesn’t matter if there is an “Enhanced Bitrate” option or not. My assumption is that around the time that they added this option they dropped the regular 1080p bitrate for all videos. However they likely didn’t eagerly re-encode old videos. So old videos still look OK for “1080p” but newer videos look trash whether or not the “1080p Enhanced Bitrate” option is available.

  • Maxy@lemmy.blahaj.zone · 6 days ago

    About the “much higher CPU usage”: I’d recommend checking that hardware decoding is working correctly on your device, as that should ensure that even 4K content barely hits your CPU.
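
    One rough way to check on Linux is to look at the VA-API decode profiles the GPU driver exposes; this sketch assumes the vainfo tool (from libva-utils) is installed, and other platforms/browsers have their own reporting:

    ```python
    # Rough check for VA-API hardware decode support on Linux.
    # Assumes `vainfo` (libva-utils) is installed; "VLD" entries are decode entrypoints.
    import subprocess

    out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout

    for codec in ("H264", "HEVC", "VP9", "AV1"):
        found = any(codec in line and "VLD" in line for line in out.splitlines())
        print(f"{codec}: {'decode profile found' if found else 'no decode profile listed'}")
    ```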

    About the “less sharp image”: this depends on your downscaler, but a proper downscaler shouldn’t make higher-resolution content any more blurry than the lower-resolution version. I do believe integer scaling (e.g. 4K -> 1080p) is a lot less dependent on having a proper downscaler, so consider bumping the resolution up even further if the video, your internet connection, and your client allow it.

    • sexy_peach@feddit.org (OP) · 4 days ago

      I was just guessing about the higher CPU usage. You’re probably right that it doesn’t matter.

    • Peter1986C@lemmings.world · 6 days ago

      YouTube pushes the AV1 “format” heavily these days, which is hard to decode using hardware acceleration, given that a lot of devices out there still don’t support it.

      • kevincox@lemmy.ml · 6 days ago

        which is hard to decode using hardware acceleration

        This is a little misleading. There is nothing fundamental about AV1 that makes it hard to decode, support is just not widespread yet (mostly because it is a relatively new codec).

        • Peter1986C@lemmings.world · 6 days ago

          I mean, given that many devices do not support accelerating it, it is in practice “hard to accelerate” unless you add a new gfx card or buy a new device.

          I may not have worded it optimally (2L speaker), but I am sure it was fairly clear what I meant. 🙂

          • kevincox@lemmy.ml · 6 days ago

            I wouldn’t call a nail hard to use because I don’t have a hammer. Yes, you need the right hardware, but there is no difference in the difficulty. But I understand what you are trying to say, just wanted to clarify that it wasn’t hard, just not widespread yet.

      • Maxy@lemmy.blahaj.zone · 5 days ago

        Good point, though I believe you have to explicitly enable AV1 in Firefox for it to advertise AV1 support. YouTube on Firefox should fall back to VP9 by default (which is supported by a lot more hardware decoders), so not being able to decode AV1 shouldn’t be a problem for most Firefox users (and by extension most Lemmy users, I assume).

        • Peter1986C@lemmings.world · 4 days ago

          I am running mostly Firefox or Librewolf on Linux these days, and I don’t remember having to enable it. Not all of my systems support hardware-accelerated AV1, but they do play 1080p fine (with frame drops above 30 fps on the unaccelerated computer). But yeah, I do hope YT keeps VP9 around because of the wider acceleration support.

  • TranquilTurbulence@lemmy.zip · 6 days ago

    I haven’t noticed anything. Would you do me a disservice and explain what I’m missing in my blissful ignorance. Make me see something that can never be unseen.

    • sexy_peach@feddit.org (OP) · 4 days ago

      I sit quite close to a large 1080p monitor. That’s why I notice when the bitrate is low and the video I’m seeing lacks true 1920×1080 detail. Basically it’s compressed so much that the image is noticeably worse than what my monitor could display. That’s why, when I use a higher-resolution stream like 1440p, the compression problems don’t show as badly on a screen that will only display 1080p anyway. That’s what I’m talking about. On a phone or a laptop screen it will probably be less noticeable. I guess that’s why YouTube does it: it probably saves them a huge amount of bandwidth, and people who want really good quality might already have 4K displays, which then get a much higher-bitrate video feed anyway.

      I guess 1080p monitors are starting to become a niche. More and more viewers are on smartphones, so it really makes sense for them to use a very low bitrate.

      • TranquilTurbulence@lemmy.zip · 4 days ago

        Turns out, I have an old dumb FullHD TV that should be ideal for this experiment. So, if I watch a YT video at 1080p, I should be able to see compression artefacts that are invisible when using a higher resolution. How is that supposed to work anyway, given that the browser knows the output resolution? Will it just download a higher resolution video, drop every other pixel, and display the rest?

        • sexy_peach@feddit.org (OP) · 4 days ago

          Will it just download a higher resolution video, drop every other pixel, and display the rest?

          Yes, just like it can show a 1080p video not in fullscreen :)
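
          In practice the player’s scaler typically averages neighbouring source pixels rather than literally dropping every other one. A minimal sketch of a clean 2× box downscale, using numpy and a made-up frame:

          ```python
          # Minimal sketch of a 2x box downscale (4K -> 1080p): each output pixel is the
          # average of a 2x2 block of source pixels, rather than every other pixel dropped.
          import numpy as np

          frame_4k = np.random.randint(0, 256, size=(2160, 3840, 3), dtype=np.uint8)  # fake decoded frame
          frame_1080p = frame_4k.reshape(1080, 2, 1920, 2, 3).mean(axis=(1, 3)).astype(np.uint8)
          print(frame_1080p.shape)  # (1080, 1920, 3)
          ```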

    • Peter1986C@lemmings.world · 6 days ago

      I can only imagine that they (OP) have the quality setting on [auto]. That way YT might be constantly lowering the bitrate/resolution. I don’t have any issues either, but I use fixed quality settings.

      • DdCno1@beehaw.org · 6 days ago

        No, that’s not what they are talking about. Even if you set the video to 1080p and make sure that YouTube isn’t lowering it to a lower resolution, it still won’t look very good.

        Whether you notice or not depends on how perceptive you are, the quality of your eyesight and also the size and quality of your display. It’s hard to notice on a low-grade laptop screen (or smaller), as well as a cheap TN panel monitor, but go beyond around 20" and use a decent enough IPS panel and those blocky compression artifacts are hard to miss.