When square pixels aren't square

(alexwlchan.net)

59 points | by PaulHoule 5 hours ago

7 comments

  • drmpeg 4 hours ago
    > Videos with non-square pixels are pretty rare...

    Before HD, almost all video was non-square pixels. DVD is 720x480. SD channels on cable TV systems are 528x480.

    • m132 4 hours ago
      >Before HD, almost all video was non-square pixels

      Correct. This came from the ITU-R BT.601 standard, one of the first digital video standards, whose authors chose to define digital video as a sampled analog signal. Analog video never had a concept of pixels; it operated on lines instead. The rate at which you sampled a line could be arbitrary and affected only the horizontal resolution. The rate chosen by BT.601 was 13.5 MHz, which resulted in a 10/11 pixel aspect ratio for 4:3 NTSC video and 59/54 for 4:3 PAL.

      >SD channels on cable TV systems are 528x480

      I'm not actually sure about America, but here in Europe most digital cable and satellite SDTV is delivered as 720x576i 4:2:0 MPEG-2 Part 2. There are some outliers that use 544x576i, however.

    • GrantMoyer 47 minutes ago
      Even with modern digital codecs and streaming, there's usually chroma subsampling[1], so the color channels may have non-square "pixels" even if overall pixels are nominally square. I most often see 4:2:0 subsampling, which still has square pixels, but at half resolution in each dimension. However 4:2:2 is also fairly common, and it has half resolution in only one dimension, so the pixels are 2:1. You'd have trouble getting a video decoding library to mess this up though.

      [1]: https://en.wikipedia.org/wiki/Chroma_subsampling
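      The plane sizes those subsampling schemes imply can be sketched in a few lines (an illustrative helper, using the usual interpretation of the J:a:b notation):

```python
def chroma_plane_size(width, height, scheme):
    """Size of each chroma (color) plane for a given subsampling scheme.
    4:2:0 halves both dimensions; 4:2:2 halves only the horizontal one."""
    factors = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}
    fx, fy = factors[scheme]
    return width // fx, height // fy

print(chroma_plane_size(1920, 1080, "4:2:0"))  # (960, 540): square, half-res chroma
print(chroma_plane_size(1920, 1080, "4:2:2"))  # (960, 1080): 2:1 chroma "pixels"
```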

    • badc0ffee 2 hours ago
      Displaying content from a DVD on a panel with square pixels (LCD, plasma, etc.) required stretching or omitting some pixels. For widescreen content you'd need to stretch that 720x480 to 848x480, and for 4:3 content you'd need to stretch it to 720x540, or shrink it to 640x480, depending on the resolution of the panel.

      CRTs of course had no fixed horizontal resolution.

      Edit: I just realized I forgot about PAL DVDs which were 720x576. But the same principle applies.

    • binaryturtle 3 hours ago
      Just look at Japanese television… most channels broadcast 16:9 content at 1440x1080i instead of the full 1920x1080i (to save bandwidth for other things, I assume), so non-square pixels are still very common with HD too.
      • ndiddy 2 hours ago
        It may also be due to legacy reasons. Japan was a pioneer in adopting HD TV years before the rest of the world, but early HD cameras and video formats like HDCAM and HDV only recorded 1080i at 1440x1080. If their whole video processing chain is set up for 1440x1080, they’d likely have to replace a lot of equipment to switch over to full 1920x1080i.
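        The stretch implied by 1440x1080 is easy to check: for the frame to fill a 16:9 display, each stored pixel has to be 4/3 as wide as it is tall. A quick illustrative check in Python:

```python
from fractions import Fraction

display_aspect = Fraction(16, 9)
storage_aspect = Fraction(1440, 1080)   # reduces to 4/3
pixel_aspect = display_aspect / storage_aspect

print(pixel_aspect)          # 4/3: each pixel is 4/3 as wide as it is tall
print(1440 * pixel_aspect)   # 1920: stretched back to full HD width
```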
    • ranger_danger 3 hours ago
      I'm confused... what does DVD, SD or any arbitrary frame size have to do with the shape of pixels themselves? Is that not only relevant to the display itself and not the file format/container/codec?

      My understanding is that televisions would mostly have square/rectangular pixels, while computer monitors often had circular pixels.

      Or are you perhaps referring to pixel aspect ratios instead?

      • leguminous 10 minutes ago
        CRTs didn't have pixels at all. They had shadow masks (or aperture grilles) and phosphors, which could be a triad of rectangles, lines spanning basically the entire screen height, or dots. They did not line up with the signal, so it doesn't make sense to call them pixels.
      • badc0ffee 2 hours ago
        I'm not 100% sure I understand your question, but in order to display a DVD correctly, you need to either display the pixels stored in the video stream wider than they are tall (for widescreen), or narrower than they are tall (for 4:3). Displaying those pixels 1:1 on a display with square pixels would never be correct for DVD video.
      • binaryturtle 3 hours ago
        A square pixel has a 1:1 aspect ratio (its width equals its height). Any rectangular pixel whose width differs from its height is considered "non-square".

        For example, take a "4:3 720x480" frame. A quick test: 720/4 = 180 and 480/3 = 160. The results differ, which means the pixels in this frame aren't square, just rectangular. (Comparing 720/480 against 4/3 works too, of course.)
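        That test generalizes: the pixel aspect ratio is the display aspect ratio divided by the storage aspect ratio, and pixels are square exactly when the result is 1. A small sketch (the helper name is mine, not from any library):

```python
from fractions import Fraction

def pixel_aspect(width, height, dar):
    """Pixel aspect ratio needed for a width x height frame
    to fill a display of aspect ratio `dar`."""
    return Fraction(dar) / Fraction(width, height)

print(pixel_aspect(720, 480, Fraction(4, 3)))     # 8/9 -> non-square
print(pixel_aspect(1920, 1080, Fraction(16, 9)))  # 1   -> square
```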

        • ranger_danger 2 hours ago
          Again I think you're talking about pixel aspect ratios instead, and not physically non-square pixels, which would be display-dependent. OP only said "square pixels" but then only talked about aspect ratios, hence my confusion.
          • dahart 1 hour ago
            OP quoted “non-square pixels” from the article, which is talking about pixel aspect ratios, i.e., width vs height. The implicit alternative to square in this context is rectangular, we’re not talking about circular or other non-rectangular shapes. Whenever the display aspect ratio is different than the storage or format aspect ratio, that means the pixels have to be non-square. For example, if a DVD image is stored at 720x480 and displayed at 4:3, the pixel aspect ratio would have to be 8:9 to make it work out: (720x8)/(480x9)==4/3. I believe with NTSC, DVDs drop a few pixels off the sides and use 704x480 and a pixel aspect ratio of 10:11.
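            Both figures can be verified with exact arithmetic (a quick check, using Python's Fraction to avoid floating-point noise):

```python
from fractions import Fraction

# 720x480 with an 8:9 pixel aspect ratio:
print(Fraction(720, 480) * Fraction(8, 9))    # 4/3

# 704x480 with a 10:11 pixel aspect ratio:
print(Fraction(704, 480) * Fraction(10, 11))  # 4/3
```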
          • formerly_proven 1 hour ago
            Dots on a CRT are not pixels. Their shape depends on the shadow mask.
  • a012 3 hours ago
    I’m no expert but this sounds like a digital version of the anamorphic lens/system, doesn’t it?
    • pixelesque 2 hours ago
      It is.

      Some modern films are still shot with anamorphic lenses because the director/DP likes the look, so in the VFX industry we have to deal with plate footage captured that way. The software handling the images has to account for the non-square pixels (to de-squash the image, even though the digital camera sensor pixels that recorded it were square) in order to display it correctly, i.e. so that round things still look round and aren't squashed.

      Even to the degree that full CG element renders (i.e. rendered to EXR with a pathtracing renderer) should really use anisotropic pixel filter widths to look correct.

    • shrinks99 2 hours ago
      Yes, and when working with footage shot with anamorphic lenses one will have to render the footage as non-square pixels, mapped to the square pixels of our screens, to view it at its intended aspect ratio. This process is done either at the beginning (conforming the footage before sending to editorial / VFX) or end (conforming to square pixels as a final step) of the post-production workflow depending on the show.
  • alberth 3 hours ago
    Am I missing the obvious? It seems like the author is just messing with the aspect ratio.
    • avianlyric 32 minutes ago
      No the author is highlighting the fact that the aspect ratio a video is stored in doesn’t always match the aspect ratio a video is displayed in. So simply calculating the aspect ratio based on the number of horizontal and vertical pixels gives you the storage ratio, but doesn’t always result in the correct display ratio.
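      Put another way, display ratio = storage ratio × pixel aspect ratio, and the naive width/height calculation is only right when the pixel aspect ratio is 1. A sketch (the helper is illustrative; 8:9 and 32:27 are common NTSC DVD pixel aspect ratios for 4:3 and widescreen respectively):

```python
from fractions import Fraction

def display_aspect(width, height, par=Fraction(1)):
    """Display aspect ratio = storage aspect ratio x pixel aspect ratio."""
    return Fraction(width, height) * par

print(display_aspect(720, 480))                    # 3/2: naive, wrong for DVD
print(display_aspect(720, 480, Fraction(8, 9)))    # 4/3: 4:3 DVD
print(display_aspect(720, 480, Fraction(32, 27)))  # 16/9: widescreen DVD
```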
    • ranger_danger 3 hours ago
      Yes I think they are conflating square pixels with square pixel aspect ratios.

      If a video file only stores a singular color value for each pixel, why does it care what shape the pixel is in when it's displayed? It would be filled in with the single color value regardless.

      • 8n4vidtmkvmk 19 minutes ago
        Because if that pixel takes up 2 vertical pixels when displayed in your web browser... That takes up more space and causes layout shift.

        I thought i understood the article just fine but these comments are confusing.

  • sbondaryev 4 hours ago
    This reminded me of retina screenshots on mac — selecting a 100×100 area can produce a 200×200 file. Different cause but same idea - the stored pixels don’t always match what you see on screen.
    • m132 3 hours ago
      This is indeed similar in effect, but completely different in cause, to the phenomenon referenced in the article (device pixel ratio vs. pixel aspect ratio).

      What you're referring to stems from an assumption made a long time ago by Microsoft, later adopted as a de facto standard by most computer software. The assumption was that the pixel density of every display, unless otherwise specified, was 96 pixels per inch [1].

      The value stuck and started being taken for granted, while the pixel density of displays grew far beyond it, a shift mostly popularized by Apple's Retina displays. A solution was needed to allow new software to take advantage of the increased detail provided by high-density displays while still accommodating legacy software written exclusively for 96 PPI. This resulted in the decoupling of "logical" pixels from "physical" pixels, with the logical resolution being most commonly defined as "what the resolution of the display would be given its physical size and a PPI of 96" [2], and the physical resolution representing the real number of pixels. The 100x100 and 200x200 values in your example are respectively the logical and physical resolutions of your screenshot.

      Different software vendors refer to these "logical" pixels differently, but the names you're most likely to encounter are points (Apple), density-independent pixels ("DPs", Google), and device-independent pixels ("DIPs", Microsoft). The value of 96, while the most common, is also not a standard per se. Android uses 160 PPI as its base, and Apple long used 72.

      [1]: https://learn.microsoft.com/en-us/archive/blogs/fontblog/whe...

      [2]: https://developer.mozilla.org/en-US/docs/Web/API/Window/devi...
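      The logical-to-physical mapping described above is just multiplication by a scale factor (the ratio of the display's density to the base value). A minimal sketch under those assumptions:

```python
from fractions import Fraction

def physical_pixels(logical, ppi, base_ppi=96):
    """Convert a logical dimension to physical pixels for a display of
    the given density. base_ppi is the assumed reference density."""
    return logical * Fraction(ppi, base_ppi)

# A 2x "Retina-class" display at 192 PPI (96 PPI base):
print(physical_pixels(100, 192))                 # 200, as in the screenshot example
# An Android display at 320 PPI (160 PPI base):
print(physical_pixels(100, 320, base_ppi=160))   # 200
```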

      • 8n4vidtmkvmk 13 minutes ago
        Why does the PPI matter at all? Thought we only cared about the scaling factor. So 2 in this 100 to 200 scenario. It's not like I'm trying to display a true to life gummy bear on my monitor, we just want sharp images.
      • sublinear 3 hours ago
        I might be misunderstanding what you're saying, but I'm pretty sure print and web were already more popular than anything Apple did. The need to be aware of output size and scale pixels was not at all uncommon by the time retina displays came out.

        From what I recall only Microsoft had problems with this, and specifically on Windows. You might be right about software that was exclusive to desktop Windows. I don't remember having scaling issues even on other Microsoft products such as Windows Mobile.

        • m132 2 hours ago
          Print was always density-independent. This didn't translate into high-density displays, however. The web, at least how I remember it, for the longest time was "best viewed in Internet Explorer at 800x600", and later 1024x768, until vector-based Flash came along :)

          If my memory serves, it was Apple that popularized high pixel density in displays with the iPhone 4. They weren't the first to use such a display [1], but certainly the ones to start a chain reaction that resulted in phones adopting crazy resolutions all the way up to 4K.

          It's the desktop software that mostly had problems scaling. I'm not sure about Windows Mobile. Windows Phone and UWP have adopted an Android-like model.

          [1]: https://en.wikipedia.org/wiki/Retina_display#Competitors

  • drob518 3 hours ago
    Proving that everything is more complicated than you first think it is when you lift up a corner of the rug.
  • fasterik 3 hours ago
    Obligatory "A Pixel Is Not A Little Square"

    https://alvyray.com/Memos/CG/Microsoft/6_pixel.pdf

    • groundzeros2015 52 minutes ago
      Yep, daily reminder that pixels are discrete point samples.