Sophieersted -
You might find the following of interest:
The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century
cellphone than you'd have found in a room-sized, specially cooled, military computer 50 years ago.
Despite such amazing advances, there are still plenty of complex problems that are beyond the reach of even the world's most powerful computers—and there's no guarantee we'll ever be able to tackle them. One problem is that the basic switching and memory units of computers, known as transistors, are now approaching the point where they'll soon be as small as individual atoms. If we want computers that are smaller and more powerful than today's, we'll soon need to do our computing in a radically different way. Entering the realm of atoms opens up powerful new possibilities in the shape of quantum computing, with processors that could work millions of times faster than the ones we use today. Sounds amazing, but the trouble is that quantum computing is hugely more complex than traditional computing and operates in the Alice in Wonderland world of quantum physics, where the "classical," sensible, everyday laws of physics no longer apply. What is quantum computing and how does it work?
However, even with the phenomenal strides we've made in technology and classical computers since the onset of the computer revolution, there remain problems that classical computers just can't solve. Many believe quantum computers are the answer.
Now that we have made the switching and memory units of computers, known as transistors, almost as small as an atom, we need to find an entirely new way of thinking about and building computers. Even though a classical computer helps us do many amazing things, "under the hood" it's really just a calculator that uses a sequence of bits (values of 0 and 1 representing two states, think of an on/off switch) to make sense of, and decisions about, the data we input, following a prearranged set of instructions. Quantum computers are not intended to replace classical computers; they are expected to be a different tool we will use to solve complex problems that are beyond the capabilities of a classical computer.
Basically, as we enter a big-data world in which the amount of information we need to store keeps growing, we need more and more ones and zeros, and more transistors to process them. For the most part, classical computers are limited to doing one thing at a time, so the more complex the problem, the longer it takes. A problem that requires more power and time than today's computers can accommodate is called an intractable problem. These are the problems that quantum computers are predicted to solve.
Conventional computers have two tricks that they do really well: they can store numbers in
memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be done as a series of additions, for example). Both of a computer's key tricks—storage and processing—are accomplished using switches called
transistors, which are like microscopic versions of the switches you have on your wall for turning lights on and off. A transistor can either be on or off, just as a light can either be lit or unlit. If it's on, we can use a transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of ones and zeros can be used to store any number, letter, or symbol using a code based on binary (so computers store an upper-case letter A as 01000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can store 256 different characters (such as A-Z, a-z, 0-9, and most common symbols). Computers calculate by using circuits called
logic gates, which are made from a number of transistors connected together. Logic gates compare patterns of bits, stored in temporary memories called registers, and then turn them into new patterns of bits—and that's the computer equivalent of what our human brains would call addition, subtraction, or multiplication. In physical terms, the algorithm that performs a particular calculation takes the form of an
electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.
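To see how those two tricks fit together, here's a tiny Python sketch (my own illustration, not from either article): it prints the binary codes for the letters mentioned above, then builds a "half adder" out of two logic gates, the simplest circuit that does real arithmetic.

```python
# Bits and gates in miniature (illustrative names, plain Python).

# Storage: characters are just patterns of bits.
print(format(ord('A'), '08b'))   # 01000001 -- upper-case A
print(format(ord('a'), '08b'))   # 01100001 -- lower-case a

# Processing: logic gates turn bit patterns into new bit patterns.
def AND(a, b): return a & b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    # The simplest adding circuit: XOR gives the sum bit, AND the carry.
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {c}")
```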
When the transistor was invented, back in 1947, the switch it replaced (which was called the vacuum tube) was about as big as one of your thumbs. Now, a state-of-the-art microprocessor (single-chip computer) packs up to
30 billion transistors onto a chip of silicon the size of your fingernail! Chips like these, which are called
integrated circuits, are an incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore realized that the power of computers doubles roughly every 18 months, and it's been doing so ever since. This apparently unshakeable trend is known as Moore's Law*.
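A quick back-of-the-envelope check of that doubling in Python (using the two-year doubling figure from the footnote below and Intel's first microprocessor as a starting point; the numbers are rough, not industry data):

```python
# Moore's Law, roughly: transistor counts double about every two years.
# Starting point: Intel's 4004 (1971), with roughly 2,300 transistors.
transistors = 2_300
for year in range(1971, 2021, 2):   # one doubling per two-year step
    transistors *= 2
print(f"Predicted count by 2021: {transistors:,}")
# ~77 billion -- the same order of magnitude as the tens of billions
# packed onto real state-of-the-art chips today.
```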
When you enter the world of atomic and subatomic particles, things begin to behave in unexpected ways. In fact, these particles can exist in more than one state at a time. It’s this ability that quantum computers take advantage of.
Instead of the bits that conventional computers use, a quantum computer uses quantum bits, known as qubits. To illustrate the difference, imagine a sphere. A bit can be at either of the two poles of the sphere, but a qubit can exist at any point on the sphere. This means that a computer using qubits can store an enormous amount of information while using less energy than a classical computer.
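You can capture that sphere picture with a few lines of plain Python/NumPy (a sketch of the underlying math, not a quantum library): a qubit is a pair of complex amplitudes whose squared magnitudes add up to 1, and the two classical bit values are the poles.

```python
import numpy as np

# A qubit: amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
zero = np.array([1, 0], dtype=complex)   # |0>, one pole of the sphere
one  = np.array([0, 1], dtype=complex)   # |1>, the opposite pole

# Any other point on the sphere is a blend of the two poles.
# Halfway between them: an equal superposition.
theta = np.pi / 2
qubit = np.cos(theta / 2) * zero + np.sin(theta / 2) * one

print(np.abs(qubit) ** 2)   # [0.5, 0.5]: equal chance of reading 0 or 1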
As
Richard P. Feynman, one of the greatest physicists of the 20th century, once put it: "Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen."
If you've studied
light, you may already know a bit about quantum theory. You might know that a beam of light sometimes behaves as though it's made up of particles (like a steady stream of cannonballs), and sometimes as though it's waves of
energy rippling through space (a bit like waves on the sea). That's called wave-particle duality and it's one of the ideas that comes to us from quantum theory. It's hard to grasp that something can be two things at once—a particle and a wave—because it's totally alien to our everyday experience: a car is not simultaneously a bicycle and a bus. In quantum theory, however, that's just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird world of quantum theory, we can imagine a situation where something like a cat could be alive and dead at the same time!
Instead of bits, a quantum computer has quantum bits or qubits, which work in a particularly intriguing way. Where a bit can store either a zero or a one, a qubit can store a zero, a one, both a zero and a one, or any of an infinite number of values in between, and it can be in multiple states (store multiple values) at the same time! If that sounds confusing, think back to light being a particle and a wave at the same time, Schrödinger's cat being alive and dead, or a car being a bicycle and a bus.
A gentler way to think of the numbers qubits store is through the physics concept of superposition (where two waves add to make a third one that contains both of the originals). If you blow on something like a flute, the pipe fills up with a standing wave: a wave made up of a fundamental frequency (the basic note you're playing) and lots of overtones or harmonics (higher-frequency multiples of the fundamental). The wave inside the pipe contains all these waves simultaneously: they're added together to make a combined wave that includes them all. Qubits use superposition to represent multiple states (multiple numeric values) simultaneously in a similar way.
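The wave version of superposition is easy to demonstrate numerically. This little sketch (illustrative, using NumPy) adds a fundamental and one overtone, just like the standing wave in the flute:

```python
import numpy as np

# Classical superposition: two waves added together make a third
# wave that still contains both of the originals.
t = np.linspace(0, 1, 44100)                     # one second of samples
fundamental = np.sin(2 * np.pi * 440 * t)        # the basic note (440 Hz)
overtone    = 0.5 * np.sin(2 * np.pi * 880 * t)  # first harmonic (880 Hz)
combined    = fundamental + overtone             # both at once

# The combined wave isn't "either" component; it's both simultaneously,
# which is the intuition qubit superposition borrows.
print(combined[:3])
```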
Just as a quantum computer can store multiple numbers at once, so it can process them simultaneously. Instead of working in serial (doing a series of things one at a time in a sequence), it can work in parallel (doing multiple things at the same time). Only when you try to find out what state it's actually in at any given moment (by measuring it, in other words) does it "collapse" into one of its possible states—and that gives you the answer to your problem. Estimates suggest a quantum computer's ability to work in parallel would make it millions of times faster than any conventional computer if only we could build it!
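Here's what that looks like in a toy simulation (plain NumPy again, a conceptual sketch rather than how real hardware works): a register of n qubits carries 2**n amplitudes at once, and measuring it picks out a single answer with probability given by each amplitude squared.

```python
import numpy as np

# An n-qubit register is described by 2**n complex amplitudes, so
# 3 qubits "hold" 8 values at once -- until you measure.
n = 3
amps = np.ones(2**n, dtype=complex) / np.sqrt(2**n)   # equal superposition

probs = np.abs(amps) ** 2                   # each outcome's probability
outcome = np.random.choice(2**n, p=probs)   # measurement collapses the state
print(f"Register collapsed to |{outcome:0{n}b}>")
```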
In practice, there are lots of possible ways of containing atoms and changing their states using
laser beams,
electromagnetic fields,
radio waves, and an assortment of other techniques. One method is to make qubits using
quantum dots, which are
nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method makes qubits from what are called ion traps: you add or take away electrons from an atom to make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states with laser pulses. In another technique, the qubits are photons inside optical cavities (spaces between extremely tiny mirrors). Don't worry if you don't understand; not many people do. Since the entire field of quantum computing is still largely abstract and theoretical, the only thing we really need to know is that qubits are stored by atoms or other quantum-scale particles that can exist in different states and be switched between them.
One task where we know for certain that a quantum computer could do better than a normal one is factorization: finding two unknown prime numbers that, when multiplied together, give a third, known number.
In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the "prime factors" of a large number, which would speed up the problem enormously.
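For contrast, here's the classical approach Shor's algorithm would leapfrog: simple trial division in Python. It works fine for small numbers, but the work grows roughly as the square root of the number, so every extra pair of digits multiplies the effort by ten (the specific primes below are just examples I picked):

```python
import math

# The classical baseline: trial division. Fine for small numbers,
# hopeless for the hundreds-of-digits numbers used in encryption.
def trial_division(n):
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d       # found the two prime factors
    return None                    # n itself is prime

print(trial_division(15))                 # (3, 5) -- instant
print(trial_division(104_723 * 104_729))  # ~100,000 divisions already
```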
Shor's algorithm really excited interest in quantum computing because virtually every modern computer (and every secure,
online shopping and banking website) uses public-key
encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an "intractable" computer problem). If quantum computers could indeed factor large numbers quickly, today's online security could be rendered obsolete at a stroke. But what goes around comes around, and some researchers believe quantum technology will lead to much stronger forms of encryption. (In 2017, Chinese researchers demonstrated for the first time
how quantum encryption could be used to make a very secure video call from Beijing to Vienna.)
Apart from Shor's algorithm, and a search method called
Grover's algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant—and even absurd.
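The scale of Grover's speedup is worth seeing in numbers. This snippet (plain Python, just the arithmetic from the standard analysis of Grover's algorithm) compares the roughly N classical queries for an unstructured search with the roughly (pi/4)*sqrt(N) quantum ones:

```python
import math

# Grover's algorithm searches N unsorted items in about (pi/4)*sqrt(N)
# steps instead of ~N -- a quadratic speedup, not an exponential one.
for n_items in (10**6, 10**9, 10**12):
    grover = math.ceil((math.pi / 4) * math.sqrt(n_items))
    print(f"N = {n_items:>14,}: classical ~{n_items:,}, Grover ~{grover:,}")
```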
There's also the fundamental issue of how you get data in and out of a quantum computer, which is, itself, a complex computing problem.
Some critics believe these issues are insurmountable;
others acknowledge the problems but argue the mission is too important to abandon.
One thing is beyond dispute: quantum computing is very exciting, and you can find out just how exciting by tinkering with it for yourself. In 2019, Amazon's AWS cloud-computing offshoot announced a service called
Braket, which gives its users access to quantum computing simulators based on machines being developed by three cutting-edge companies (D-Wave, IonQ, and Rigetti). Microsoft's Azure cloud platform offers a rival service called
Azure Quantum, while
Google's Quantum AI website offers access to its own research and resources. Take your pick—or try them all.
Suppose we keep on pushing Moore's Law—keep on making transistors smaller until they get to the point where they obey not the ordinary laws of physics (like old-style transistors) but the more bizarre laws of quantum mechanics. The question is whether computers designed this way can do things our conventional computers can't. If we can predict mathematically that they might be able to, can we actually make them work like that in practice?
By entering this quantum realm of computing, where the traditional laws of physics no longer apply, we may be able to create processors that are significantly faster (a million or more times) than the ones we use today. Sounds fantastic, but the challenge is that quantum computing is also incredibly complex.
Most researchers agree that we're unlikely to see practical quantum computers appearing for some years, and more likely several decades. The conclusion reached by an influential National Academies of Sciences, Engineering, and Medicine
report in December 2018 was that "it is still too early to be able to predict the time horizon for a practical quantum computer" and that "many technical challenges remain to be resolved before we reach this milestone."
The pressure is on the computer industry to find ways to make computing more efficient, since we are reaching the limits of energy efficiency using classical methods. By 2040, according to a report by the
Semiconductor Industry Association, computing will demand more energy than the world can produce. That's precisely why the computer industry is racing to make quantum computers work on a commercial scale. It's no small feat, but one that could pay extraordinary dividends.
It's difficult to predict how quantum computing will change our world, simply because there will be applications in all industries. We're venturing into an entirely new realm of physics, and there will be solutions and uses we have never even thought of yet. But when you consider how much classical computers revolutionized our world with a relatively simple use of bits and two options of 0 or 1, you can imagine the extraordinary possibilities when you have the processing power of qubits that can perform millions of calculations at the same moment.
What we do know is that it will be game-changing for every industry and will have a huge impact on the way we do business, develop new medicines and materials, safeguard our data, explore space, and predict weather events and climate change. It's no coincidence that some of the world's most influential companies, such as IBM and Google, as well as the world's governments, are investing in quantum computing technology. They are expecting quantum computing to change our world because it will allow us to solve problems and experience efficiencies that aren't possible today. In another post, I dig deeper into how quantum computing will change our world.
See:
https://www.forbes.com/sites/bernar...-easy-explanation-for-anyone/?sh=425c2c7d1d3b
See:
https://www.explainthatstuff.com/quantum-computing.html
There is progress on the hardware front, though: quantum computing devices can now operate at more than a degree above absolute zero, scientists report in two papers published in the April 16, 2020, Nature. Although still chilly, that temperature is much easier to achieve than the approximately 10 millikelvin (0.01 degrees above absolute zero) temperatures typical of a popular type of quantum computer based on superconductors, materials which transmit electricity without resistance.
Hartmann352
* Moore's Law: states that the number of transistors on a microchip doubles roughly every two years. The law implies that we can expect the speed and capability of our computers to increase every couple of years, yet pay less for them; in other words, the growth is exponential. The law is attributed to Gordon Moore, the co-founder and former CEO of Intel.
Gordon Moore did not call his observation "Moore's Law," nor did he set out to create a "law." Moore made that statement based on noticing emerging trends in chip manufacturing at Fairchild Semiconductor. Eventually, Moore's insight became a prediction, which in turn became the golden rule known as Moore's Law.
As transistors in
integrated circuits become more efficient, computers become smaller and faster. Chips and transistors are microscopic structures, built mainly from silicon, engineered so that electrical signals move along the circuit faster. The faster a microchip processes electrical signals, the more efficient a computer becomes. The cost of higher-powered computers has been dropping annually, partly because of lower labor costs and reduced semiconductor prices.
As Moore's Law advances, so the number of intractable problems diminishes: computers get more powerful and we can do more with them. The trouble is, transistors are just about as small as we can make them: we're getting to the point where the laws of physics seem likely to put a stop to Moore's Law. Unfortunately, there are still hugely difficult computing problems we can't tackle because even the most powerful computers find them intractable. That's one of the reasons why people are now getting interested in quantum computing.
See:
https://www.investopedia.com/terms/m/mooreslaw.asp