Artificial intelligence is being used more and more in the cybersecurity industry, as it is seen as a potential solution for discovering and combating malicious software and other cyber threats, and for stopping cyber attacks before they compromise devices.
However, security researchers have recently revealed that artificial intelligence can also be used by threat actors to power next-generation malware. Artificial intelligence-powered malware can potentially evade even the best antivirus defenses, compromise a computer network, or launch a cyber attack.
DeepLocker – newly developed Artificial Intelligence-based malware
Security researchers at IBM Research developed DeepLocker – a new type of highly targeted and evasive attack tool based on Artificial Intelligence. The malware is able to conceal its malicious intentions until it reaches its target.
DeepLocker is a stealthy new breed of malware that can fly under the radar of conventional security tools.
DeepLocker unleashes its malicious action as soon as the Artificial Intelligence model identifies the target through indicators like facial recognition, geolocation and voice recognition.
DeepLocker Utilizes the Spray-and-Pray Approach of Traditional Malware
According to the researchers at IBM Research, the malware spreads using the spray-and-pray approach of traditional malware. This kind of stealthy, artificial intelligence-powered malware is extremely dangerous and can compromise millions of systems without being detected.
DeepLocker is able to hide its malicious payload inside benign carrier apps in order to evade detection by security scans until it finds a particular target. Potential victims are identified using voice recognition, facial recognition, geolocation, or other system-level indicators.
“What is unique about DeepLocker is that the use of AI makes the ‘trigger conditions’ to unlock the attack almost impossible to reverse engineer. The malicious payload will only be unlocked if the intended target is reached,” the researchers explained.
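The idea behind such a trigger condition can be illustrated with a minimal sketch. The assumption here is that the payload is encrypted under a key derived from the AI model's output on the target's attributes (for example, a face-recognition embedding), so the key never appears in the binary and cannot be recovered by static analysis. The `target_embedding` value, the XOR-based keystream cipher, and all names below are hypothetical stand-ins, not DeepLocker's actual implementation:

```python
import hashlib

def derive_key(model_output: bytes) -> bytes:
    # The unlock key only materializes when the model emits the
    # target's output; it is never stored alongside the payload.
    return hashlib.sha256(model_output).digest()

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy keystream cipher for illustration only -- not real crypto.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Attacker side: encrypt the payload under a key derived from the
# intended target's embedding (a hypothetical model output).
target_embedding = b"target-face-embedding"
payload = b"malicious payload bytes"
locked = xor_stream(payload, derive_key(target_embedding))

# Victim side: each observed face yields an embedding; only the
# intended target's embedding reproduces the key and unlocks the payload.
def try_unlock(observed_embedding: bytes) -> bytes:
    return xor_stream(locked, derive_key(observed_embedding))

print(try_unlock(target_embedding) == payload)   # True: target unlocks it
print(try_unlock(b"someone-else") == payload)    # False: anyone else gets garbage
```

Because a defender analyzing the sample sees only `locked` and the model, not the key or the plaintext, brute-forcing the trigger condition amounts to enumerating the model's entire input space.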
A proof of concept was presented in order to demonstrate the malware's capabilities.
“Imagine that this video conferencing application is distributed and downloaded by millions of people, which is a plausible scenario nowadays on many public platforms. When launched, the app would surreptitiously feed camera snapshots into the embedded AI model, but otherwise behave normally for all users except the intended target,” the researchers at IBM Research added.
When the victim sits in front of the computer and uses the application, the camera would feed their face to the app, and the malicious payload would be secretly executed, the victim's face serving as the preprogrammed key to unlock it.
In order to perform an attack, all the DeepLocker malware needs is the potential victim's picture, which can easily be found on social media. To make it even easier, attackers can utilize Social Mapper – a free and simple tool for tracking people across social media networks.
Researchers will provide more information on the DeepLocker artificial intelligence-powered malware at the Black Hat USA security conference.