Growth of AI could boost cybercrime and security threats, report warns
Date: February 21, 2018
Source: Computer Crime Research Center
Wanton proliferation of artificial intelligence technologies could enable new forms of cybercrime, political disruption and even physical attacks within five years, a group of 26 experts from around the world have warned.
In a new report, the experts, drawn from academia, industry and the charitable sector, describe AI as a “dual-use technology” with potential military and civilian applications, akin to nuclear power, explosives and hacking tools.
“As AI capabilities become more powerful and widespread, we expect the growing use of AI systems to lead to the expansion of existing threats, the introduction of new threats and a change to the typical character of threats,” the report says.
They argue that researchers need to consider potential misuse of AI far earlier in the course of their studies than they do at present, and work to create appropriate regulatory frameworks to prevent malicious uses of AI.
If the advice is not followed, the report warns, AI is likely to revolutionise the power of bad actors to threaten everyday life. In the digital sphere, they say, AI could be used to lower the barrier to entry for carrying out damaging hacking attacks. The technology could automate the discovery of critical software bugs or rapidly select potential victims for financial crime. It could even be used to abuse Facebook-style algorithmic profiling to create “social engineering” attacks designed to maximise the likelihood that a user will click on a malicious link or download an infected attachment.
The increasing influence of AI on the physical world means physical systems are also vulnerable to misuse. The most widely discussed example involves weaponising “drone swarms”: fitting them with small explosives and self-driving technology, then setting them loose to carry out untraceable assassinations as so-called “slaughterbots”.