“We developed a deep neural network that maps the phase and amplitude of WiFi signals to UV coordinates within 24 human regions. The results of the study reveal that our model can estimate the dense pose of multiple subjects, with comparable performance to image-based approaches, by utilizing WiFi signals as the only input.”
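The quoted claim, reduced to shapes: a network consumes CSI amplitude and phase and emits per-region UV maps for 24 body parts. Below is a toy numpy sketch under invented dimensions (3×3 antenna pairs, 30 subcarriers, an 8×8 output grid), with a single random linear layer standing in for the paper's actual deep network — purely illustrative, not the authors' architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

N_REGIONS = 24          # human body regions, as in DensePose
H, W = 8, 8             # tiny output resolution, for illustration only

def csi_features(amplitude, phase):
    """Stack amplitude and unwrapped phase into one feature vector."""
    phase = np.unwrap(phase, axis=-1)   # raw WiFi phase is wrapped and noisy
    return np.concatenate([amplitude.ravel(), phase.ravel()])

# Assumed CSI shape: 3x3 transmit/receive antenna pairs, 30 subcarriers.
amplitude = rng.random((3, 3, 30))
phase = rng.uniform(-np.pi, np.pi, (3, 3, 30))
x = csi_features(amplitude, phase)

# One random linear layer in place of the deep network.
W_out = rng.normal(scale=0.01, size=(N_REGIONS * 2 * H * W, x.size))
uv = (W_out @ x).reshape(N_REGIONS, 2, H, W)   # per-region U and V maps
uv = 1 / (1 + np.exp(-uv))                     # squash into [0, 1] like UV coords

print(uv.shape)   # (24, 2, 8, 8)
```

The point of the sketch is only the input/output contract: flattened amplitude+phase in, a `(regions, 2, H, W)` tensor of UV coordinates out.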

  • Agent641@lemmy.world · 2 years ago

    You know what else lets you see through walls? Windows. (Suck it, Linux users!)

  • Dog@lemmy.world · 2 years ago

    If I didn’t already have a reason why Ethernet is superior, I have one now!

  • Xavier@lemmy.ca · 2 years ago

    Henceforth, the building code shall make it mandatory that every room be a perfectly grounded Faraday cage (/s).

    Still, imagine lethal drones integrated with that technology (of course, they already have infrared, and maybe even some suitable X-ray wavelengths).

    Nevertheless, pretty cool to see how far we can take preexisting technology with the help of some deep learning layers.

  • afraid_of_zombies@lemmy.world · 2 years ago

    I am going to repeat myself forever, it seems. We got it wrong when we decided that you only have privacy when someone can’t physically see what you are up to. Nothing else is treated this way. You are not allowed to drive as fast as your car can physically move. You are not allowed to break into anything locked just because you can pick the lock. You are not allowed to steal whatever you want as long as no one tackles you for it. And yet, somehow, it became accepted that merely because someone *can* get a photo of you, they have the legal right to do so.

    As if access to better technology means you should follow fewer moral rules rather than more. Someone with a junk camera from the ’80s can do far less perving than someone with today’s cameras and drones.

  • Maggoty@lemmy.world · 2 years ago

    Duh? I don’t think anyone in the right field of study thought this wasn’t possible. It just doesn’t have good use cases.

    • Milkyway@feddit.de · 2 years ago

      I’m an EE, and I have serious doubts about this working nearly as well as they make it sound. This sort of stuff is hard, even with purpose-built radar systems. I work with angle estimation in multipath environments, and that shit fucks your signals up. This may work if you have extremely precisely characterised the target room, the walls, and a ton of the stuff around them, and then change nothing but the motion of the people. But that’s not practical.

      • Maggoty@lemmy.world · 2 years ago

        It’s Popular Mechanics, of course it doesn’t work as well as they say it does. But the theory has been around a long time.
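A toy illustration of the multipath effect the EE comment describes: a naive phase-slope angle-of-arrival estimate on a 4-element array is exact for a single line-of-sight path, but one simulated wall reflection biases it by several degrees. All parameters here (5 GHz carrier, half-wavelength spacing, the 20° source, the −40° reflection with 0.8 gain) are invented for illustration:

```python
import numpy as np

c, f = 3e8, 5.0e9               # speed of light, 5 GHz WiFi carrier
lam = c / f
d = lam / 2                     # half-wavelength antenna spacing
n = np.arange(4)                # 4-element uniform linear array

def steering(theta_deg):
    """Array response for a plane wave arriving at theta_deg."""
    theta = np.radians(theta_deg)
    return np.exp(2j * np.pi * d * n * np.sin(theta) / lam)

def naive_aoa(x):
    """Estimate AoA from the mean phase slope between adjacent antennas."""
    slope = np.angle(np.mean(x[1:] * np.conj(x[:-1])))
    return np.degrees(np.arcsin(slope * lam / (2 * np.pi * d)))

direct = steering(20.0)                                    # line of sight, 20 deg
est_clean = naive_aoa(direct)                              # recovers ~20.0
est_multipath = naive_aoa(direct + 0.8 * steering(-40.0))  # biased by reflection

print(est_clean, est_multipath)
```

With the clean signal the phase slope inverts exactly; adding the reflection perturbs the inter-antenna phase differences and the estimate drifts away from 20°, which is the "multipath fucks your signals up" problem in miniature.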

    • deafboy@lemmy.world · 2 years ago

      Full-body VR tracking without sensors?

      Human presence sensors based on this are already on the consumer market; we just need to dial up the sensitivity.

    • brianorca@lemmy.world · 2 years ago

      There are already off-the-shelf smart light bulbs that use radio signals to detect when somebody is in the room and turn on the lights automatically, without a camera or infrared sensor in the area.