JavaScript-based biometrics

Has there been any discussion of JavaScript-based biometric de-anonymization techniques that use keyboard and mouse I/O? Are there any plans to introduce delay into these two sources of user input as an obfuscation method against this technique?

1 Like

I’m sure this has been discussed plenty of times (I don’t have links though), but it sounds extremely, extremely hard.
The best solution is, as always, to disable JavaScript (i.e. the “Safest” security level in Tor Browser).

Like all fingerprinting, you have to think about the precision available to whoever is observing the user. In the case of mouse/keyboard input fingerprinting, that precision comes from the timer available to the attacker. The big-hammer solution is to lower the precision of the JS timer API, inducing aliasing and reducing the number of bits contributed to an overall fingerprint of the user. However, it’s not that simple: there are ways to create high-precision timers other than the explicit timer API, counting loop iterations being the most obvious. The timing problem is very hard to solve, but there are other potential solutions.
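To make that concrete, here’s a minimal sketch of both halves of the argument, assuming an ordinary web page with script access; the 100 ms granularity and the SharedArrayBuffer counter are illustrative assumptions, not a description of what any browser actually ships:

```typescript
// Sketch of the two halves: coarsening the explicit clock, and rebuilding an
// implicit one anyway. Browser context assumed; the 100 ms granularity is an
// arbitrary illustration, not any browser's actual setting.

const interKeyIntervals: number[] = [];
let lastKey: number | null = null;

// Quantize a timestamp onto a coarse grid: fine-grained typing rhythm
// collapses into a few buckets, shrinking the contributed fingerprint bits.
function coarsen(t: number, granularityMs = 100): number {
  return Math.floor(t / granularityMs) * granularityMs;
}

document.addEventListener("keydown", () => {
  const now = coarsen(performance.now());
  if (lastKey !== null) interKeyIntervals.push(now - lastKey); // typing rhythm
  lastKey = now;                                               // = behavioural biometric
});

// The catch: an implicit clock. A Web Worker increments a shared counter in a
// tight loop, and the main thread samples it at each event. (SharedArrayBuffer
// needs cross-origin isolation; shown purely to illustrate the idea.)
const sab = new SharedArrayBuffer(4);
const ticks = new Int32Array(sab);
const workerSrc =
  "onmessage = (e) => { const t = new Int32Array(e.data); for (;;) Atomics.add(t, 0, 1); };";
const worker = new Worker(
  URL.createObjectURL(new Blob([workerSrc], { type: "text/javascript" }))
);
worker.postMessage(sab);
// Atomics.load(ticks, 0), read at two consecutive events, gives a timing
// difference far finer than the coarsened explicit clock allows.
```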

A similar timing-based attack shows up in network traffic flow analysis. The solution there is to delay and combine packets, transmitting them at a regular cadence so there is no opportunity to track a packet as it flows through a network simply from ingress/egress timing. Maybe there’s a solution where input updates from the keyboard/mouse are delayed slightly, batched up, and delivered to the browser at regular time intervals, smoothing out any observable variation between input events rather than trying to prevent high-resolution timers.
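Purely as a thought experiment (a real mitigation would have to live in the browser engine, and the 50 ms cadence below is an arbitrary assumption), that batching could look roughly like this:

```typescript
// Thought experiment: hold raw input in a queue and release it on a fixed
// cadence, so everything delivered within one tick looks simultaneous.
// The 50 ms tick is an arbitrary assumption; a real mitigation would live in
// the browser engine, not in page JS.

type QueuedInput =
  | { type: "keydown"; key: string }
  | { type: "mousemove"; x: number; y: number };

const pending: QueuedInput[] = [];

// Intercept raw events in the capture phase before page handlers see them.
window.addEventListener(
  "keydown",
  (e) => {
    e.stopImmediatePropagation();
    pending.push({ type: "keydown", key: e.key });
  },
  { capture: true }
);

window.addEventListener(
  "mousemove",
  (e) => {
    e.stopImmediatePropagation();
    pending.push({ type: "mousemove", x: e.clientX, y: e.clientY });
  },
  { capture: true }
);

// Flush on a regular cadence: the page only ever observes event timing
// quantized to the tick, like packets batched and sent at fixed intervals.
setInterval(() => {
  for (const ev of pending.splice(0, pending.length)) {
    document.dispatchEvent(new CustomEvent("batched-input", { detail: ev }));
  }
}, 50);
```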

This has been discussed for many years; thanks for bringing it up again, since it hasn’t been solved.

2 Likes

Out of curiosity, who is going to those lengths to perform mouse/keyboard input fingerprinting? It must be happening or it would not be a subject here. I’m assuming state actors and malicious regimes. Or are we talking about the big corporations?

I came across this today: A.I. can identify keystrokes by just the sound of your typing and steal information with 95% accuracy, new research shows
Laptop users are at risk of having sensitive information stolen just by the sound of typing on their keyboard

It’s not really fingerprinting. It might be BS.

(you might have to toggle reader mode to see it) A.I. can listen to you type and steal your information: study | Fortune

1 Like

I think, as is so very often the case, one must take the appropriate mitigations based on the perceived &/or actual threat model. For example, I’ve isolated a number of desktop computers in a soundproof glass cabinet with its own dedicated cooling system, so there is no risk of eavesdropping on the sound of keystrokes there. However, reading what I view on my monitor by directing a laser beam at my head? That’s another matter. There will always be a spectrum of perceived &/or actual threats. One can only endeavour to be suitably technically informed, to make the best possible choice. Of course, if one is specifically targeted as an individual, she might very well be doomed :scream:

However, in the context of the information provided by @dhuseby in his answer to the OP’s question, I personally found it to be a useful overview & point de départ into further exploration of this topic. Whether one attempts The Conga Dance or the High Jump & everything in between, the variables will inevitably continue to wax & wane.

But how would they pick up audio from typing on the keyboard? What if the microphone and speaker are disabled?

This article explains one example of an “AI system that can eavesdrop on your keyboard to collect potentially sensitive data”, using machine learning to differentiate the sounds of the different keyboard keys:
“The researchers first recorded audio samples of typing on a MacBook Pro, pressing each key 25 times. This allowed the AI system to analyze the minute variations between the sound emanating from each key.”[1]
And,
“The audio recordings were then transformed into spectrograms, which are visual representations of sound frequencies over time. The AI model was trained on these spectrograms, learning to associate different patterns with different keystrokes.”[1]
And,
“With microphones embedded in common consumer devices, typing acoustics are more exposed and accessible than ever before.”[1]
For example, other devices in the vicinity of the keyboard: Siri, Alexa, other mobile phones/tablets/Apple Watch et al., baby monitors, home surveillance cameras, VR headsets, gaming headphones, etc.
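Keeping with the JavaScript theme of this thread, the raw data collection isn’t exotic either. Here is a rough sketch of how a page that has already been granted microphone permission could gather spectrogram frames with the standard Web Audio API; the FFT size and frame rate are arbitrary choices, and the offline classifier training described in the cited research isn’t shown.

```typescript
// Sketch: capturing the spectrogram columns the researchers describe, from a
// web page that has been granted microphone access. Arbitrary FFT size and
// frame rate; the classifier itself (trained offline in the cited work) is
// not shown.

async function collectSpectrogramFrames(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;                 // frequency resolution per column
  ctx.createMediaStreamSource(stream).connect(analyser);

  const column = new Uint8Array(analyser.frequencyBinCount);
  const spectrogram: Uint8Array[] = [];

  // ~50 columns per second: each key strike appears as a short burst of
  // energy whose frequency pattern differs slightly from key to key.
  setInterval(() => {
    analyser.getByteFrequencyData(column);
    spectrogram.push(column.slice());      // copy the current column
  }, 20);
}
```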

Again, identifying the threat model towards the individual is key to determining the actual &/or potentially more advanced threats & mitigations. If the threat is against a specific individual, e.g. yourself, you might find yourself in the realm of Sousveillance [2].

Fear is often triggered by the unknown. Seemingly random threats could be heading my way?! Often the antidote to fear is awareness, stemming from knowledge. Find the mitigations that seem proportionate to your situation.

This is why developing an OPSEC [3] mindset will guide you beyond feeling bamboozled by potential-surveillance-threats-everywhere & lead you into a calm oasis of choice & strategy, with the composure of a well-seasoned chess player.

[1] AI Can Tell What You Type By Listening to Your Keystrokes - Decrypt

[2] Sousveillance - Wikipedia

[3] Why OPSEC Is for Everyone, Not Just for People with Something to Hide | Tripwire

1 Like

Don’t know. I’m assuming it is sound that is not detectable by human ears. A long time ago I remember reading that keyboards emit RF signals. Is that still true? The links below (or above) from ukmr are also interesting.

“laser eavesdropping”

edit: enjoy … Cone of Silence (Get Smart) - Wikipedia

[1] is very interesting. I think they reference the same research. “The AI system needs to be calibrated to specific keyboard models”.
Wow, that’s a tall order.

And I don’t understand the comment about “Touch typists seem to confuse the model, making its accuracy drop to 40%”. Or does that imply most people use one or two fingers to type? So touch typing would not apply to smartphones, which is what most people use.

The countermeasures are interesting.

[3] Ha, I’ve been saying this for ages. I always get that blank stare from people who have “nothing to hide”. In all those instances, not one person has allowed me to go through their wallet, phone, or computer, or been willing to let me go to their home and snoop around. Hmmmm.

Everyone should read this.

:grinning: Ha, you are giving away your age with the cone of silence… or maybe you watch the retro channels

It’s been talked about on GitLab repeatedly, but as far as I know there are no mitigations planned in TBB.

The good news is, it can be mitigated at the OS level using Kloak.

http://www.dds6qkxpwdeubwucdiaord2xgbbeyds25rbsgr73tbfpqpt4a6vjwsyd.onion/wiki/Keystroke_Deanonymization
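Roughly, kloak buffers each keystroke and releases it after a random delay, so the timing that reaches applications no longer matches your typing rhythm. A conceptual sketch (the 100 ms maximum delay is an illustrative assumption; see kloak’s documentation for its actual algorithm and defaults):

```typescript
// Conceptual sketch of kloak-style obfuscation: every keystroke is held for a
// random delay before being released, so the inter-key timing an application
// (or a page script) observes no longer matches the user's typing rhythm.
// The 100 ms maximum delay is an illustrative assumption, not kloak's default.

interface KeyPress {
  timeMs: number; // when the key was actually pressed
  key: string;
}

function obfuscate(events: KeyPress[], maxDelayMs = 100): KeyPress[] {
  let lastRelease = 0;
  return events.map((ev) => {
    const jittered = ev.timeMs + Math.random() * maxDelayMs;
    // Never release out of order: each event waits at least as long as the
    // previously released one.
    lastRelease = Math.max(lastRelease, jittered);
    return { timeMs: lastRelease, key: ev.key };
  });
}

// A distinctive 120/80/200 ms rhythm comes out with different gaps each run.
console.log(
  obfuscate([
    { timeMs: 0, key: "h" },
    { timeMs: 120, key: "e" },
    { timeMs: 200, key: "l" },
    { timeMs: 400, key: "p" },
  ])
);
```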

Can you please ELI5 how this works? And is this a targeted attack? Suppose you visit a website that requires JavaScript, and you upload a file there, or create an account and send a message to someone, etc. Are they still doing network traffic flow analysis on you, or do they do this only for high-value targets? And suppose that, after registering on this website, the user never logs in again; will any such analysis be performed using some other fingerprinting means?