Machine learning has many useful cybersecurity applications, provided the technology behind it isn't tampered with by malicious actors who sabotage the integrity of its data.
Donnie Wendt, principal security researcher at Mastercard, calls this the “Uncle Ronnie” effect: “When my son was little, before he’d go over to visit my brother — his Uncle Ronnie — I’d say, ‘Please, please, don’t learn anything new from him.’ Because I know he’s going to teach my son something bad,” Wendt told SC Media in an interview at the CyberRisk Alliance’s 2022 InfoSec World Conference in Orlando, Florida.
Likewise, an adversary can compromise a machine learning system, teaching it bad habits that its operators must then undo, assuming the operators can even detect the breach in the first place.
On the cyber front, properly trained machine-learning systems can help with such tasks as classifying malware, identifying phishing attempts, detecting intrusions, analyzing user behavior, and predicting if and when a vulnerability will be exploited. But there are ways to skew the results.
“Our adversaries will try to figure out how to circumvent [machine learning] classification oftentimes by injecting adversarial samples that will poison the training,” explained Wendt, who presented on this very topic earlier this week at the InfoSec World conference. Alternatively, bad actors could launch an inference attack to gain unauthorized access to the data used to train the machine learning system.
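To make the poisoning scenario concrete, here is a minimal sketch, not taken from Wendt's talk, showing how flipping the labels on a fraction of training samples can degrade a classifier's accuracy on clean data. The synthetic dataset, the logistic-regression model, and the 20% poison rate are all illustrative assumptions.

```python
# Illustrative sketch of training-data poisoning via label flipping.
# The dataset, model choice, and 20% poison rate are assumptions for
# demonstration, not details from Wendt's presentation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Baseline: model trained on clean labels.
clean_model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("clean accuracy:", accuracy_score(y_test, clean_model.predict(X_test)))

# Adversary flips the labels of 20% of the training samples.
rng = np.random.default_rng(0)
poisoned = y_train.copy()
idx = rng.choice(len(poisoned), size=len(poisoned) // 5, replace=False)
poisoned[idx] = 1 - poisoned[idx]

# Same model architecture, corrupted training labels.
poisoned_model = LogisticRegression(max_iter=1000).fit(X_train, poisoned)
print("poisoned accuracy:", accuracy_score(y_test, poisoned_model.predict(X_test)))
```

Running the script shows the poisoned model scoring noticeably worse on the untouched test set, the kind of gap a defender could catch by comparing runs across data versions, as Wendt suggests below.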
To protect against attacks launched on machine learning models, Wendt recommended performing proper data sanitization and ensuring that you have “proper version control [and] access control around your data… so that if there is an attack… you can go back to prior versions of that data and rerun that model and look for drift.” If you find evidence of wrongdoing, then you can at least undo whatever it is that troublemaking “Uncle Ronnie” taught your machine learning system.
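As a rough illustration of that advice, the sketch below records a content hash of each training-data snapshot so prior versions can be identified, then flags drift when a model retrained on an earlier version scores meaningfully differently from the current one. The file paths, the drift threshold, and the train_and_score() helper are hypothetical placeholders, not part of any tool Wendt described.

```python
# Sketch of versioning training data and checking retrained models for drift.
# Paths, the 0.05 threshold, and the train_and_score() stub are hypothetical.
import hashlib
import json
from pathlib import Path


def snapshot(data_path: str, registry: str = "data_versions.json") -> str:
    """Record a content hash of the current training data so prior
    versions can be recognized and retrained against later."""
    digest = hashlib.sha256(Path(data_path).read_bytes()).hexdigest()
    reg = Path(registry)
    versions = json.loads(reg.read_text()) if reg.exists() else []
    versions.append({"file": data_path, "sha256": digest})
    reg.write_text(json.dumps(versions, indent=2))
    return digest


def drift_detected(metric_current: float, metric_prior: float,
                   threshold: float = 0.05) -> bool:
    """Flag drift when a model retrained on a prior data version scores
    meaningfully differently from the model trained on current data."""
    return abs(metric_current - metric_prior) > threshold


# Usage, assuming a train_and_score() helper that retrains the model on a
# given data file and returns an evaluation metric such as accuracy:
# snapshot("training_data.csv")
# if drift_detected(train_and_score("training_data.csv"),
#                   train_and_score("training_data_v1.csv")):
#     print("Investigate: data may have been poisoned between versions.")
```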
For more insight on how machine learning can be maliciously influenced, watch the embedded video below.