• GamingChairModel@lemmy.world · 2 days ago

    This write-up is really, really good. I think about these concepts whenever people dismiss astrophotography or other computation-heavy photography as fake, software-generated images. In reality, translating sensor data into a graphical representation for the human eye (with all the quirks of human vision, especially around brightness and color) requires conscious decisions about how the charges or voltages on a sensor should be translated into pixels in a digital file.
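
    As a rough illustration (not any specific camera’s actual pipeline), here is a minimal NumPy sketch of the kind of decisions involved; the black level, white level, and the bare 1/2.2 gamma are all illustrative assumptions:

    ```python
    import numpy as np

    # Hypothetical 12-bit linear sensor readout: each value is a charge
    # count, not a color (the levels below are assumptions for illustration).
    raw = np.random.randint(0, 4096, size=(4, 4)).astype(np.float64)
    BLACK_LEVEL = 64     # assumed sensor pedestal
    WHITE_LEVEL = 4095   # ADC saturation point

    # Decision 1: subtract the pedestal and normalize to [0, 1].
    linear = np.clip((raw - BLACK_LEVEL) / (WHITE_LEVEL - BLACK_LEVEL), 0.0, 1.0)

    # Decision 2: apply a tone curve so equal brightness steps look equal
    # to a human viewer (a simple 1/2.2 gamma stands in for a real tone curve).
    tonemapped = linear ** (1.0 / 2.2)

    # Decision 3: quantize to 8-bit pixel values for the output file.
    pixels = np.round(tonemapped * 255).astype(np.uint8)
    ```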

    • XeroxCool@lemmy.world · 19 hours ago

      Same, especially because I’m a frequent sky-watcher but have to warn any ride-along that all we’re going to see by eye is pale fuzzy blobs, and all my camera is going to show them tonight is pale spindly clouds. I think it’s neat as hell that I can use some $150 binoculars to find interstellar objects, but many people are bored by the lack of Hubble-quality sights on tap. Like… yes, that’s exactly why we sent a telescope to space to get those images.

      That being said, I once had the opportunity to see the Orion Nebula through a ~30" reflector at an observatory, and damn. I got to eyeball roughly what my camera can do in a single frame with perfect tracking and settings.

  • JeeBaiChow@lemmy.world · 2 days ago

    Good read. Funny how I always assumed the sensor read RGB directly, when it actually reads simple light levels through a color filter pattern.
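
    For anyone curious, here is a rough sketch of how software turns those single-channel light levels into RGB: a naive bilinear demosaic, assuming an RGGB Bayer layout (real raw converters use far more sophisticated interpolation):

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def demosaic_rggb(mosaic):
        """Naive bilinear demosaic of an RGGB Bayer mosaic (2-D array in [0, 1])."""
        rows, cols = np.indices(mosaic.shape)
        # Where each color filter sits in the repeating 2x2 RGGB tile.
        masks = {
            0: (rows % 2 == 0) & (cols % 2 == 0),   # red
            1: (rows % 2) != (cols % 2),            # green (two sites per tile)
            2: (rows % 2 == 1) & (cols % 2 == 1),   # blue
        }
        # Interpolation kernels: cross + diagonals for R/B, cross-only for G.
        k_rb = np.array([[0.25, 0.5, 0.25], [0.5, 1.0, 0.5], [0.25, 0.5, 0.25]])
        k_g = np.array([[0.0, 0.25, 0.0], [0.25, 1.0, 0.25], [0.0, 0.25, 0.0]])
        rgb = np.zeros(mosaic.shape + (3,))
        for ch, mask in masks.items():
            k = k_g if ch == 1 else k_rb
            sampled = np.where(mask, mosaic, 0.0)
            # Dividing by the convolved mask turns the sums into local averages.
            rgb[..., ch] = convolve(sampled, k, mode="mirror") / convolve(
                mask.astype(float), k, mode="mirror")
        return rgb
    ```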

  • tyler@programming.dev · 2 days ago

    This is why I don’t say I need to edit my photos; I say I need to process them. “Editing” is clearly understood by the layperson to mean Photoshop, and while they don’t necessarily understand “processing,” many people still remember taking film to a store and getting it processed into photos they could give someone.

    • Fmstrat@lemmy.world · 10 hours ago

      As a former photographer back when digital was starting to become the default, I wish I had thought of this.

  • worhui@lemmy.world · 2 days ago

    Not sure how much this is worth mentioning, considering how good the overall write-up is.

    Even though the human visual system has a non-linear perception of luminance, the camera data is adjusted because the display has a non-linear response. The encoding is chosen so that the image appears linear in light output at the display’s face. So it’s the display’s non-linear response that is being corrected, not the human visual system.

    There is a bunch more that can be done and described.
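
    To make that concrete, here is a small sketch using the standard sRGB transfer function, with a crude gamma-2.2 model of the panel (the panel model is a simplification, not any real display’s measured response):

    ```python
    def srgb_encode(linear):
        """Standard sRGB transfer function: linear light in [0, 1] -> stored value."""
        if linear <= 0.0031308:
            return 12.92 * linear
        return 1.055 * linear ** (1 / 2.4) - 0.055

    stored = srgb_encode(0.18)   # mid-grey scene luminance
    emitted = stored ** 2.2      # crude model of the display's response
    # stored ~ 0.46, emitted ~ 0.18: the encoding cancels the display's
    # non-linearity, so light out ends up roughly proportional to light in.
    ```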