Quantum computing breakthrough could help pave the way to artificial intelligence and other transformative developments, some people believe
Researchers at the University of New South Wales (UNSW) have built a quantum logic gate in affordable silicon, the material already used in chips for our smartphones, computers and tablets.
The news is important because regular computers read data as binary bits – 0 or 1 – but quantum computing allows for a quantum bit – a qubit – to exist in both states at once thanks to the strange behaviour of subatomic particles.
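The difference can be sketched in a few lines of code. This is an illustrative simulation only, not anything from the UNSW work: a qubit's state is represented by two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1, and a gate (here the standard Hadamard gate) can put the qubit into an equal superposition of both.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit's state is a pair of
# complex amplitudes (a, b) with |a|^2 + |b|^2 = 1: the probabilities
# of measuring 0 or 1 respectively.
zero = np.array([1, 0], dtype=complex)  # definitely 0
one = np.array([0, 1], dtype=complex)   # definitely 1

# The Hadamard gate rotates a definite 0 into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ zero
probs = np.abs(superposed) ** 2
print(probs)  # [0.5 0.5] -- equal chance of measuring 0 or 1
```

Until the qubit is measured, both outcomes coexist; measurement then yields 0 or 1 with the probabilities shown.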
Questions that might take a normal computer potentially millions of years to resolve might take a quantum computer a matter of days, reports the Sydney Morning Herald.
“What we have is a game changer,” said team leader Andrew Dzurak, Scientia Professor and Director of the Australian National Fabrication Facility at UNSW.
“We’ve demonstrated a two-qubit logic gate – the central building block of a quantum computer – and, significantly, done it in silicon.
“Because we use essentially the same device technology as existing computer chips, we believe it will be much easier to manufacture a full-scale processor chip than for any of the leading designs, which rely on more exotic technologies.
“This makes the building of a quantum computer much more feasible, since it is based on the same manufacturing technology as today’s computer industry,” he added.
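A two-qubit logic gate of the kind Prof Dzurak describes can be illustrated with the textbook CNOT (controlled-NOT) gate, which flips a target qubit only when a control qubit is 1. This is a generic mathematical sketch, not a model of the UNSW device itself:

```python
import numpy as np

# Two qubits together occupy a 4-dimensional state space with basis
# states |00>, |01>, |10>, |11> (control qubit first, then target).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],   # |10> -> |11>
                 [0, 0, 1, 0]],  # |11> -> |10>
                dtype=complex)

ket = {
    "00": np.array([1, 0, 0, 0], dtype=complex),
    "10": np.array([0, 0, 1, 0], dtype=complex),
    "11": np.array([0, 0, 0, 1], dtype=complex),
}

# Control = 0: the target qubit is left unchanged.
print(CNOT @ ket["00"])  # stays |00>
# Control = 1: the target qubit flips.
print(CNOT @ ket["10"])  # becomes |11>
```

Because the gate also acts on superpositions of these basis states, it can entangle the two qubits, which is what makes such gates the building blocks of quantum computation.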
The first prototype quantum computing chip could be here within five years, with the first commercial application likely to be in large data centres, according to Prof Dzurak.
Potential benefits include faster development of new medicines, by accelerating the computer-aided design of pharmaceutical compounds and reducing lengthy trial-and-error testing; the development of new, lighter and stronger materials for everything from consumer electronics to aircraft; and faster searching of large databases.
Over on the Mother Jones news website, their report quite simply says: “It’s possible that large-scale quantum computing could lead to breakthroughs in emulating human thought processes and speeding up the creation of artificial intelligence. Maybe.”