In 2025, researchers at La Sapienza University of Rome unveiled WhoFi, a new system that can identify and track individuals using only Wi-Fi signals. The technology is both groundbreaking and, frankly, deeply concerning. In their paper “Deep Person Re-Identification via Wi-Fi Channel Signal Encoding”, they explain how ordinary wireless networks can be turned into biometric sensors capable of recognizing people by how their bodies subtly disturb radio waves.
WhoFi relies on something called Channel State Information (CSI), data that describes how Wi-Fi signals change as they pass through the environment. Every person’s body reflects and absorbs those signals in slightly different ways. Movement, posture, and body composition all produce unique patterns, essentially a physical “signature” that can be learned and recognized.
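To make that concrete, here is a minimal sketch of what CSI preprocessing might look like. The array shapes, the random stand-in data, and the normalization choices below are illustrative assumptions, not the setup used in the paper.

```python
import numpy as np

# Illustrative only: the shapes (packets, subcarriers, antenna pairs) and the
# preprocessing steps below are assumptions, not the paper's exact pipeline.
rng = np.random.default_rng(0)
csi = rng.standard_normal((500, 114, 3)) + 1j * rng.standard_normal((500, 114, 3))

amplitude = np.abs(csi)                   # how strongly each subcarrier was attenuated
phase = np.unwrap(np.angle(csi), axis=1)  # phase shifts accumulated across subcarriers

# Normalize so captures from different environments are roughly comparable.
features = (amplitude - amplitude.mean()) / (amplitude.std() + 1e-8)
print(features.shape)  # (500, 114, 3): a time series a neural encoder can consume
```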
The La Sapienza team built a deep learning pipeline to interpret this data. It preprocesses CSI streams, feeds them into neural network encoders, and produces a compact signature vector for each person. When a new person passes through the Wi-Fi field, the system compares their signal pattern against its gallery of known signatures and identifies them, even through walls or in total darkness.
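The encode-then-compare pattern can be sketched in a few lines of PyTorch. Everything here, the LSTM encoder, the embedding size, and the toy gallery of random stand-in data, is an illustrative assumption rather than the authors’ actual architecture; it only shows the general flow described above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Sketch only: layer choices and sizes are assumptions, not the paper's model.

class CsiEncoder(nn.Module):
    """Maps a CSI time series (time, features) to a fixed-length signature vector."""
    def __init__(self, n_features: int = 342, embed_dim: int = 128):
        super().__init__()
        # 342 = 114 subcarriers x 3 antenna pairs, flattened per packet (assumed shape).
        self.lstm = nn.LSTM(n_features, embed_dim, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        _, (h_n, _) = self.lstm(x)           # final hidden state summarizes the sequence
        return F.normalize(h_n[-1], dim=-1)  # unit-length signature for cosine matching

encoder = CsiEncoder()

# "Enroll" two known people using random stand-in captures (500 packets each).
gallery = {
    "person_a": encoder(torch.randn(1, 500, 342)).detach(),
    "person_b": encoder(torch.randn(1, 500, 342)).detach(),
}

# A new capture is encoded and compared against every stored signature.
query = encoder(torch.randn(1, 500, 342)).detach()
scores = {name: F.cosine_similarity(query, sig).item() for name, sig in gallery.items()}
best_match = max(scores, key=scores.get)
print(best_match, scores[best_match])
```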
In tests on benchmark datasets, WhoFi achieved accuracy rates above 95% under controlled conditions. Unlike camera-based systems, it works without line of sight and requires no ID card, wearable, or phone. The person being tracked doesn’t need to participate or even know it’s happening.
WhoFi’s most immediate potential lies in physical access control and security. If matured and commercialized, it could dramatically change how buildings, campuses, and facilities manage entry and surveillance.
WhoFi could replace or supplement keycards and facial recognition for door access. As someone approaches, Wi-Fi sensors detect their unique signature and unlock automatically. No badges, fingerprints, or face scans needed — just your physical presence. In labs, hospitals, or clean rooms where hands-free access matters, this could simplify operations and reduce contamination risk.
It could serve as an extra verification step in existing systems. A match between an existing credential and a WhoFi signature could confirm identity with higher confidence. Conversely, mismatches could flag potential intrusions, making security more resilient.
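A hypothetical sketch of that second-factor check follows; the `verify_entry` helper, the 0.8 similarity threshold, and the dictionary layout are assumptions introduced purely for illustration.

```python
import torch
import torch.nn.functional as F

# Hypothetical illustration of a Wi-Fi signature as a second factor. The function
# name, the 0.8 threshold, and the data structures are assumptions for this sketch.

def verify_entry(badge_id: str, live_signature: torch.Tensor,
                 enrolled: dict[str, torch.Tensor], threshold: float = 0.8) -> str:
    stored = enrolled.get(badge_id)
    if stored is None:
        return "deny: unknown credential"
    similarity = F.cosine_similarity(live_signature, stored).item()
    if similarity >= threshold:
        return "grant: credential and body signature agree"
    # The badge is valid, but the body signature does not match its enrolled owner.
    return "flag: possible credential misuse"

# Toy usage with random unit vectors standing in for real signatures.
enrolled = {"badge-1042": F.normalize(torch.randn(1, 128), dim=-1)}
live = F.normalize(torch.randn(1, 128), dim=-1)
print(verify_entry("badge-1042", live, enrolled))
```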
Because it works through walls, WhoFi could alert staff to unauthorized presence in restricted areas, even in darkness or when cameras are obstructed. It could act as a silent alarm system that detects movement and identity simultaneously.
Retailers could use WhoFi to track returning customers, monitor dwell times, or analyze movement, all without cameras. The system’s ability to recognize individuals through obstacles could produce deeper behavioral data than visual tracking alone.
The technology still faces technical and operational challenges, and real-world use will test WhoFi’s robustness. Everyday environments are noisy, filled with reflections, interference, and multiple people moving at once, all of which can distort signals and degrade accuracy. Enrollment also presents challenges: to re-identify someone, the system must have a stored “signature,” which means an initial calibration step. Changes in body shape, posture, or clothing could shift that signature and require re-training.
Still, for access control and security, WhoFi represents a potential breakthrough, a step toward ambient, device-free authentication.
The more powerful WhoFi becomes, the more it challenges privacy and data protection laws. Its ability to detect, locate, and uniquely identify people without their cooperation collides directly with existing privacy frameworks.
Unlike cameras or card readers, WhoFi operates invisibly. People can be tracked simply by moving through a Wi-Fi zone: there is no obvious indicator of surveillance, no camera lens, no beep, no opt-out. Over time, these systems could generate movement logs and behavioral profiles, linking physical presence across multiple spaces. Even if names aren’t stored, unique signatures enable persistent tracking. This is the concerning part: while Personally Identifiable Information (PII) may not be attached, a unique signature can still be used to build a detailed profile and history that could, at some point, be paired with PII.
Under the EU’s General Data Protection Regulation (GDPR), biometric data used for unique identification counts as a “special category” of personal data, subject to strict rules. Collecting it requires explicit consent or a clear legal basis. WhoFi would almost certainly qualify. Even though it records signal distortions rather than facial features, the end result is the same: a stable identifier tied to an individual’s body.
That raises major compliance problems. How can consent be obtained from people who don’t know they’re being scanned? How can individuals exercise their rights to access, correction, or deletion if they can’t even tell a system is collecting their data?
Other jurisdictions are likely to follow the GDPR’s lead. U.S. states like Illinois and Texas already regulate biometric identifiers under specific laws, and Wi-Fi-based body signatures could fall under those definitions. If deployed carelessly, WhoFi systems could face lawsuits and bans similar to those that have hit facial recognition vendors.
The same traits that make WhoFi exciting for access control make it dangerous for surveillance abuse.
- Mass tracking: Cities or corporations could monitor people’s movements continuously, blurring the line between security and spying.
- Covert monitoring: In private homes or workplaces, WhoFi could be misused to follow individuals without consent.
- Opaque decisions: If access is denied or granted automatically, users may never know why or how to challenge errors.
To align with privacy law, future deployments would need safeguards: visible disclosure, opt-in consent, encrypted data handling, and limits on range and retention. Systems should store only temporary, anonymized embeddings and destroy raw data after use. Independent audits could enforce accountability.
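One way such retention limits could be operationalized is an embedding store that never persists raw CSI and expires signatures automatically. The class name, the pseudonymous keys, and the 24-hour window below are illustrative assumptions, not requirements drawn from any specific regulation.

```python
import time
from dataclasses import dataclass, field

# Illustrative sketch of the safeguards above; all names and limits are assumptions.

RETENTION_SECONDS = 24 * 60 * 60  # keep embeddings for at most one day


@dataclass
class EmbeddingStore:
    records: dict = field(default_factory=dict)  # pseudonym -> (embedding, stored_at)

    def add(self, pseudonym: str, embedding) -> None:
        # Only the compact embedding is kept; raw CSI captures are never passed in
        # or written anywhere by this store.
        self.records[pseudonym] = (embedding, time.time())

    def purge_expired(self) -> None:
        now = time.time()
        self.records = {
            k: (emb, t) for k, (emb, t) in self.records.items()
            if now - t < RETENTION_SECONDS
        }
```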
WhoFi hints at a future where the walls themselves can recognize you, where Wi-Fi becomes a biometric medium. For security and convenience, the implications are enormous. For privacy and civil liberties, they are equally profound.
Used responsibly, it could simplify secure access, reduce friction, and enhance safety. Used irresponsibly, it could create an invisible surveillance web that no one can see or escape.
The researchers at La Sapienza have opened a new frontier in sensing technology. What happens next depends on how policymakers, engineers, and society choose to govern it. The challenge isn’t just technical; it’s ethical. The question is no longer whether our surroundings can recognize us, but whether they should.