Image caption: Starcraft is a fast-moving strategy game played in real time (image copyright: Blizzard)
Robots that train themselves in battle tactics by playing video games could be used to mount cyber-attacks, the UK military fears.

The warning is in a Ministry of Defence report on artificial intelligence.

Researchers in Silicon Valley are using strategy games, such as Starcraft II, to teach systems how to solve complex problems on their own. But artificial intelligence (AI) programs can then "be readily adapted" to wage cyber-warfare, the MoD says.

Officials are particularly concerned about the ability of rogue states and terrorists to mount advanced persistent threat attacks, which can disable critical infrastructure and steal sensitive information.

"Not only will AI increase the variety and tempo of cyber-attacks, it will also decrease the cost and increase the variety of actors able to undertake this activity," the report says.
"As the requirement for skilled specialists involved in the attack diminishes, the limitation will become access to the AI algorithms needed to conduct such an attack.

"In other words, any actor with the financial resources to buy, or steal, an AI APT (advanced persistent threat) system could gain access to tremendous offensive cyber-capability, even if that actor is relatively ignorant of internet security technology.

"Given that the cost of replicating software can be nearly zero, that may hardly present any constraint at all. This is likely to be a live issue by 2020 or soon thereafter.
"For example, the state-of-the-art AI is being trained in tactical reasoning by playing computer strategy games.

"AIs like this could then be readily adapted to drive APT cyber-attack tactics, where the AI is competing against human or non-adaptive automated cyber-defenders."

The report cites Google's DeepMind artificial intelligence research project, which is using Starcraft II - a real-time strategy game for PC and Mac, launched in 2010 - to train programs to think for themselves.

London-based DeepMind, which says it is committed to making the world a "better place" through artificial intelligence by solving complex problems such as climate change, has also used Atari and Go games to train its systems.

But, like other artificial intelligence researchers, those at DeepMind were attracted to the complexity and fast-moving nature of Starcraft, which involves a three-way conflict between humans, the insectoid Zerg and the Protoss.
Image caption: Go player Lee Sedol pits his wits against Google's AI programme AlphaGo (image copyright: Yonhap/Reuters)
DeepMind said testing its artificial intelligence "agents" in games "that are not specifically designed for artificial intelligence research, and where humans play well" was "crucial" to their development.

Starcraft players build bases to gather resources, which are used to produce combat units that seek out and destroy opponents.

Other technology companies, including Facebook, have now developed artificial intelligence bots to play the game, after its maker, Blizzard Entertainment, released tools to enable them to do so.

Human players trounced artificial intelligence bots made by Facebook, DeepMind and other companies in a Starcraft tournament in November, suggesting there is still some way to go before the robots take over.

Steven Murdoch, an information security research fellow at University College London, said artificial intelligence bots with the ability to carry out sophisticated cyber-attacks on their own were "fairly far away". Even those that could play games, such as Go and Starcraft, against humans were "not very creative" and relied on following "a simple set of rules".
"As technology advances, more automation will be available, particularly for the delivery of malicious software, but the preparation of attacks and development of tactics will still require human expertise for the foreseeable future," he told TheIndianSubcontinent News.

AI programs could be stolen and misused, as the MoD says in its report, but current systems "are quite specific to a particular task and it takes considerable skill and expertise to adapt a system to a new application area," he added.

The MoD report says the private sector is leading the way in artificial intelligence research - and that the technology industry's reluctance to appear too close to defence or security agencies is creating a skills shortage in the military.

"Some Western commercial entities have publicly declared policies stating they will not contract with defence or security agencies, which may compound the challenges facing the MoD," says the report.

"This is in stark contrast to other states, which have enshrined access rights to expertise, technology and data in their national legislation."

The report proposes setting up a register "of security cleared UK nationals with AI and robotics skills" to be called on in