There is no defense when your digital fingerprint is used as AI training data.
The headline reads:
New acoustic attack steals data from keystrokes with 95% accuracy
Highlights include:
A team of researchers from British universities has trained a deep learning model that can steal data from keyboard keystrokes recorded using a microphone with an accuracy of 95%.
What makes this so damaging is that everyone has a unique keyboard fingerprint: a personal rhythm and cadence in how we type.
Changing these patterns is hard because typing is largely muscle memory, and retraining muscle memory is slow, deliberate work, especially for veteran keyboard users.
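The typing-rhythm fingerprint idea can be illustrated with a minimal sketch. The session data, feature choice, and threshold below are all hypothetical stand-ins: real keystroke-dynamics systems use per-digraph latencies, key hold times, and far richer models than this toy two-number profile.

```python
from statistics import mean

# Hypothetical inter-key intervals (seconds between consecutive
# keystrokes) for two typing sessions by two different typists.
session_a = [0.11, 0.09, 0.13, 0.10, 0.12, 0.08, 0.11]
session_b = [0.21, 0.19, 0.24, 0.20, 0.22, 0.18, 0.23]

def timing_profile(intervals):
    """Summarize a session as (mean latency, mean absolute deviation)."""
    m = mean(intervals)
    mad = mean(abs(x - m) for x in intervals)
    return (m, mad)

def profile_distance(p, q):
    """Euclidean distance between two timing profiles."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

pa = timing_profile(session_a)
pb = timing_profile(session_b)

# A session compared against its own profile scores 0; a different
# typist's rhythm separates cleanly even in this toy feature space.
print(profile_distance(pa, pa))
print(profile_distance(pa, pb))
```

Even this crude feature pair separates the two typists; with enough recorded samples, a classifier trained on such features can match new typing to a known person, which is exactly the fingerprinting risk described above.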
Just as a personal typing pattern can be detected, any content a human produces can be fed into a detection algorithm.
The same training that makes ChatGPT so effective can be applied to deanonymizing you on the internet.
How many millions would an attacker spend on compute if it meant decoding Elon typing out his eX-Twitter password?