Where computer science is taking us at the 2018 Mathematical and Computational Sciences Showcase

24 September 2018
(L-R): Associate Professor Ian Watson; Head of the Department of Computer Science, Professor Robert Amor; and Professor Cris Calude.

What are the ethical issues around the development of artificial intelligence? And how smooth, really, is the road to quantum computational supremacy?

These were the subjects of two stimulating talks delivered by Computer Science staff at the 2018 Mathematical and Computational Sciences Showcase for alumni, postgraduate students and friends on Saturday.

The departments of Computer Science, Mathematics and Statistics joined forces to present short, lively talks on current research and to display the results of various projects. The annual event is now in its fourth year.

Associate Professor Ian Watson discussed the ethical issues associated with research into artificial intelligence (AI), a field in which he has worked for 30 years. He pointed out that there were limits to how much autonomy you might want to give to, say, a driverless car.

As an example, he said, if you were the sole passenger in a car with limitless autonomy and it came around a corner to find a truck on the wrong side of the road, would you want it to collide head-on and kill you, given that the only way to avoid the truck would be to plough into a roadside playground full of children and risk a higher death toll? As he remarked, no one would want to buy a car that might kill them.

Ian also discussed the ethical issues around the development of autonomous weapons systems, where armed vehicles or planes can search for and eliminate people meeting pre-defined criteria, with no human intervention. In such a world, he asked, who is morally responsible for the deaths? “We can’t hold software morally responsible.”

He said that the ethical risks inherent in the development of autonomous weapons are of such concern to AI and robotics researchers that they have issued an open letter to the United Nations calling for curbs.

So far more than 25,000 members of the AI sector have signed the letter, which says, in part: “Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”

Ian believes that, just as the United Nations successfully outlawed land mines, it can galvanise countries to outlaw autonomous weapons systems.

Quantum computers don’t yet exist – but theoretically, they could solve in days complex problems that would take a supercomputer billions of years to unravel. However, the quantum bits or qubits that power these computers require prodigious engineering feats, such as keeping superconducting circuits at temperatures colder than outer space.

On Saturday, Professor Cris Calude told the audience that, despite claims of progress towards quantum computational supremacy from Google, IBM, Microsoft and other institutions, quantum computing would not make classical computing obsolete.

He discussed several reasons why, observing that the road to quantum supremacy is currently heavier on claims than on results: “The conversation on quantum computing, quantum cryptography and their applications needs an infusion of modesty, if not humility, more technical understanding and clarity as well as less hype,” he said. “Raising false expectations could be harmful for the field.”

  • Read more about Cris’ discussion of quantum supremacy here.
  • See photos from the day on the Department of Statistics Facebook page.