Laptops are more susceptible to having their keyboards recorded in quieter areas, like coffee shops, libraries, and offices. Previous attempts at keylogging VoIP calls achieved 91.7 percent top-5 accuracy over Skype in 2017 and 74.3 percent accuracy in VoIP calls in 2018.
Mechanical keyboard users in shambles
This is the best summary I could come up with:
In their paper A Practical Deep Learning-Based Acoustic Side Channel Attack on Keyboards (full PDF), UK researchers Joshua Harrison, Ehsan Toreini, and Maryam Mehrnezhad claim that the trio of ubiquitous machine learning, microphones, and video calls “present a greater threat to keyboards than ever.”
Laptops, in particular, are more susceptible to having their keyboard recorded in quieter public areas, like coffee shops, libraries, or offices, the paper notes.
Combining the output of the keystroke interpretations with a “hidden Markov model” (HMM), which guesses at more-likely next-letter outcomes and could correct “hrllo” to “hello,” saw one prior side channel study’s accuracy jump from 72 to 95 percent—though that was an attack on dot-matrix printers.
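A toy sketch of that correction step (my own construction, not the cited study's model): treat the intended keys as hidden states of a first-order HMM, the classifier's noisy outputs as observations, and run Viterbi decoding. The tiny corpus, the add-k smoothing, and the partial QWERTY-adjacency emission model below are all made up for the demo.

```python
# Toy HMM spell-corrector: hidden states are intended keys, observations are
# the keys the acoustic classifier reported. Viterbi picks the most likely
# intended sequence, turning "hrllo" into "hello".
from collections import defaultdict
import math

CORPUS = ["hello", "help", "hell", "held", "her", "here", "he"]
ALPHABET = "abcdefghijklmnopqrstuvwxyz"
K = 0.01  # add-k smoothing constant

# Bigram transition counts from the toy corpus.
counts = defaultdict(lambda: defaultdict(float))
for word in CORPUS:
    for a, b in zip(word, word[1:]):
        counts[a][b] += 1

def log_trans(a, b):
    total = sum(counts[a].values()) + K * len(ALPHABET)
    return math.log((counts[a][b] + K) / total)

# Emission model: a mis-heard keystroke is usually a physically nearby key
# (partial QWERTY neighbour map, enough for this demo).
ADJ = {"e": "wrsd", "h": "gjyn", "l": "kop", "o": "ilp", "r": "etdf"}

def log_emit(state, obs):
    if state == obs:
        return math.log(0.6)
    if obs in ADJ.get(state, ""):
        return math.log(0.4 / len(ADJ[state]))
    return math.log(1e-6)

def correct(observed):
    """Viterbi-decode the most likely intended key sequence."""
    v = {s: log_emit(s, observed[0]) for s in ALPHABET}  # uniform prior
    back = [{}]
    for obs in observed[1:]:
        nv, bp = {}, {}
        for s in ALPHABET:
            prev = max(ALPHABET, key=lambda p: v[p] + log_trans(p, s))
            nv[s] = v[prev] + log_trans(prev, s) + log_emit(s, obs)
            bp[s] = prev
        v, back = nv, back + [bp]
    state = max(v, key=v.get)       # best final state, then trace back
    out = [state]
    for bp in reversed(back[1:]):
        state = bp[state]
        out.append(state)
    return "".join(reversed(out))

print(correct("hrllo"))  # hello
```

The language model supplies the "hello is likelier than hrllo" prior; the adjacency-based emission model supplies the "r is a plausible mis-hearing of e" likelihood, which is all Viterbi needs.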
The researchers believe their paper is the first to make use of the recent sea change in neural network technology, including self-attention layers, to propagate an audio side channel attack.
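For the curious, the self-attention operation itself is small. This is a minimal NumPy sketch of scaled dot-product self-attention (illustrative only, not the paper's architecture): each position in a sequence re-weights every other position by similarity, which is what lets a network relate keystroke frames across time.

```python
import numpy as np

def self_attention(x):
    """Scaled dot-product self-attention with queries = keys = values = x.
    Real layers add learned projection matrices and multiple heads."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                   # pairwise similarity
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)              # softmax over each row
    return w @ x                                    # weighted mix of positions

seq = np.random.default_rng(1).standard_normal((5, 8))  # 5 frames, 8 features
out = self_attention(seq)
print(out.shape)  # (5, 8): one context-mixed vector per input frame
```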
Because of this, the potential for a second machine-bolstered system to correct the false keys, given a large language corpus and the approximate location of a keystroke, seems strong.
The 2013 “Dropmire” scandal that saw the US spying on its European allies was highly likely to have involved some kind of side channel attack, whether through wires, radio frequencies, or sound.
I’m a bot and I’m open source!
Unless you turn on “original sound for musicians,” Zoom uses AI to filter the audio, mainly for voices. I rarely if ever hear any keystrokes or mouse clicks anymore… Lots of other non-voice noises get filtered out.
Basically, if I’m on a Zoom call, they can record my keystrokes and steal our passwords?
Only if you leave your mic unmuted.
This is a troubling advancement, they all are, but the methods of countering this specific one are plentiful.
Really, what’s needed is a more robust mute function with a good voice recognition system that automatically cuts off the mic when you’re not speaking. That, and people need to learn to use push to talk.
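A crude version of that auto-mute idea can be sketched as an energy gate (my own toy, not any client's actual feature, and real voice-activity detection is much smarter): pass a frame through only when its RMS level clears a threshold, so silence and faint keyboard clatter are muted automatically.

```python
import numpy as np

def gate_frames(frames, threshold=0.05):
    """Zero out audio frames whose RMS energy is below the threshold."""
    out = []
    for f in frames:
        rms = np.sqrt(np.mean(f ** 2))
        out.append(f if rms >= threshold else np.zeros_like(f))
    return out

rng = np.random.default_rng(0)
speech = 0.3 * rng.standard_normal(160)    # loud frame: passes the gate
clicks = 0.01 * rng.standard_normal(160)   # faint frame: muted
kept, muted = gate_frames([speech, clicks])
print(np.any(kept), np.any(muted))  # True False
```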
Also left out of the headline is the fact that this attack was specifically designed to be leveraged against one specific common laptop, a MacBook. Admittedly, if you are using one it can be a concern, but it’s safe to assume that unless your attacker knows the exact model of computer you are using and dedicates serious resources into building a targeted attack like this, you’re fine.
The tiniest variation will likely dramatically improve your security.
As a cybersecurity researcher, I can tell you there are plenty of other attacks that are cheaper and easier to implement that you should be concerned about.
Security via having lots of crumbs and hair and crud built up inside your keyboard. Check.
So, just leave it with my kids and dog for 10 minutes?
Yes. Why go to all the trouble of doing very technical things when you can instead do moderately technical but very cunning things?
An old episode of the defunct Reply All podcast comes to mind: “what kind of idiot gets phished?”
How long before they can recreate enough of your keyboard and screen via webcam using the reflections from your glasses / eyeballs?
I mean, there are a lot of ifs here. Is your microphone next to your keyboard? Is someone on the call attempting this with mature software? Do they know you are typing your password or something else sensitive?
I would imagine passwords are harder to pin down, assuming it’s like cryptography. If someone types three sentences, that’s a lot of coherent data to work with. If someone types 1234BossSucks!, then, from a cryptography perspective, there’s going to be a low chance of making sense of it.
My ill-informed two cents.
Edit:
Reading more of the bot summary, and yeah, this sounds more like cryptography. If the microphone can hear your keystrokes, data like the volume, timbre, overall rhythm, etc. can be collected and mapped to possible text.
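Here's a toy sketch of that idea (my own construction, nothing like the paper's actual pipeline, which uses mel spectrograms and deep nets): summarize each keystroke clip with a coarse spectral "fingerprint" and match new clips against known ones. Synthetic decaying tones stand in for real key sounds, and the key-to-frequency map is invented for the demo.

```python
import numpy as np

RATE = 8000  # samples per second

def fake_keystroke(freq, rng):
    """Synthetic stand-in for a recorded key press: a noisy decaying tone."""
    t = np.arange(int(0.05 * RATE)) / RATE
    return (np.sin(2 * np.pi * freq * t) * np.exp(-40 * t)
            + 0.02 * rng.standard_normal(t.size))

def fingerprint(clip):
    """Coarse timbre feature: relative energy in 8 frequency bands."""
    spec = np.abs(np.fft.rfft(clip))
    e = np.array([b.sum() for b in np.array_split(spec, 8)])
    return e / e.sum()          # normalize away volume differences

rng = np.random.default_rng(0)
key_freq = {"a": 400, "b": 900, "c": 1500}   # pretend resonance per key
profiles = {k: fingerprint(fake_keystroke(f, rng)) for k, f in key_freq.items()}

def classify(clip):
    """Nearest-profile match on the spectral fingerprint."""
    fp = fingerprint(clip)
    return min(profiles, key=lambda k: np.linalg.norm(profiles[k] - fp))

heard = [fake_keystroke(key_freq[k], rng) for k in "abcab"]
print("".join(classify(c) for c in heard))  # abcab
```

Volume gets normalized out here on purpose: in a real recording it varies with distance and typing force, while the spectral shape is closer to a per-key signature.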
New security guideline just dropped: frequently rotate your keyboard layout.
Reminds me of the researchers working to recreate sound from micro-vibrations in videos. Spooky stuff. Cool.
I’m a Nigerian Prince. Pls zoom with me!
That’s super interesting! Spooky, right up there with your AI-generated voice being used for fraud, but something I hadn’t thought about before.