The Importance of Quantum Computing

To understand what quantum computing is and why scientists find it exciting, we first need to understand how traditional computing works. Today's computers use switching and memory units, known as transistors, to store and retrieve data, handling the kinds of tasks that dedicated calculators once did. Transistors have become remarkably small, approaching the scale of a single atom, but they still work the way the old calculators did: each one holds a bit that is either 0 or 1 (you can think of these as off and on), a scheme known as the binary system. The computer processes the data we provide by following a pre-arranged set of instructions, known as a program.
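As a toy illustration (the snippet below is our own sketch, not taken from any particular machine), here is how a single number maps onto that on-off bit pattern, shown in Python:

```python
# A toy illustration: every value a conventional computer stores
# is ultimately a pattern of bits, each of which is 0 or 1.
number = 42
bits = format(number, "08b")  # "00101010": 42 written as 8 binary digits
print(f"{number} is stored as the bit pattern {bits}")

# Each bit is strictly "off" (0) or "on" (1), never anything in between.
for position, bit in enumerate(bits):
    print(f"bit {position}: {bit}")
```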

Binary Process

We have come a very long way with this binary process. Our computers can carry out complicated processing and sorting tasks by following a step-by-step recipe of operations known as an algorithm. Google and other search engines use algorithms to make searching and sorting very fast. Underneath it all, the binary system of conventional computing performs additions, subtractions and multiplications almost instantly, and everything else is built from those simple operations.
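To make the word "algorithm" concrete, here is a minimal sketch of binary search, a classic recipe for finding an item in a sorted list quickly (the list of numbers is invented purely for illustration):

```python
# Binary search: repeatedly halve the search range until the target
# is found. Doubling the list's size adds only one extra step.
def binary_search(sorted_items, target):
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2       # look at the middle element
        if sorted_items[mid] == target:
            return mid                # found it
        elif sorted_items[mid] < target:
            low = mid + 1             # discard the lower half
        else:
            high = mid - 1            # discard the upper half
    return -1                         # not in the list

values = [3, 8, 15, 23, 42, 57, 91]
print(binary_search(values, 42))      # prints 4, the position of 42
```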

So why do we need a different way of computing? Miniaturization has let us pack hundreds of millions of transistors onto a silicon chip about the size of a fingernail. However, as computing demands grow, storing and processing more information requires more bits and more transistors, and transistors are now about as small as we can make them. Most everyday computing tasks are in no danger of maxing out this power. But as companies and public and private organizations hand ever more complex problems to their computers, some of those problems will hit a ceiling, exceeding the capacity and speed of any machine currently available. Scientists refer to these no-go situations as intractable problems: problems traditional computing cannot solve in any practical amount of time. Quantum computing, which computes with atomic particles, is seen as a possible answer to the capacity and time limitations inherent in binary systems.
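A rough sketch of why such problems get out of hand: any approach that must try every combination of n bits faces 2 to the power n possibilities, a count that doubles with every bit added.

```python
# A rough illustration of intractability: brute-forcing every
# combination of n bits means checking 2**n possibilities.
for n in (10, 40, 80, 160):
    print(f"{n} bits -> {2 ** n:,} combinations to check")
```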

Quantum Computing

Quantum theory deals with atoms and the subatomic particles they contain, which do not obey the basic rules of classical physics. In quantum computing, qubits take the place of bits. Unlike a bit, which is restricted to one of two values (think 0-1 or on-off), a qubit can exist in a superposition, effectively holding a blend of 0 and 1 at the same time. Don't worry too much about understanding exactly how it works; the practical upshot is that a quantum computer could explore many possibilities at once, unlike conventional computing, which works through them one at a time, and for certain problems it could run up to millions of times faster than our current binary systems.
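Real qubits need physical quantum hardware, but a crude classical simulation can convey the flavor of superposition. The toy sketch below (our own invention, not how a quantum computer is actually built) prepares an equal blend of 0 and 1 and "measures" it a thousand times:

```python
import math
import random

# A toy classical simulation, not real quantum hardware: a qubit is
# described by two amplitudes, one for the 0 state and one for the 1
# state, and measuring it yields 0 or 1 with probabilities equal to
# the squared amplitudes (which must sum to 1).
amp0 = 1 / math.sqrt(2)  # equal superposition: amp0**2 == amp1**2 == 0.5
amp1 = 1 / math.sqrt(2)

def measure():
    # Probability of reading 0 is amp0**2 (here 0.5); otherwise read 1.
    return 0 if random.random() < amp0 ** 2 else 1

results = [measure() for _ in range(1000)]
print(f"read 0 about {results.count(0)} times, 1 about {results.count(1)} times")
```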

Will quantum computing render traditional computing obsolete? No, that is unlikely. Most of us will never need such powerful computing technology, and the commercial launch of quantum computing is by no means a certainty. It has been about 30 years since researchers began to discuss quantum computing theory, and we have seen significant progress in the past seven or eight years, with Google and MIT both producing prototypes. Researchers estimate we won't see mainstream quantum computing for some years yet. Interestingly, if and when quantum computing comes of age, it would have a huge impact on our current encryption technology (encryption is really the deliberate manufacture of an intractable problem). Now, that might be something for us all to think about.
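To see why encryption amounts to the deliberate manufacture of an intractable problem, consider the asymmetry below. This is a simplified sketch with deliberately small primes, not a real encryption scheme: multiplying two primes is instant, while recovering them from the product by brute force becomes hopeless at the key sizes real systems use.

```python
# A simplified sketch (small primes, not a real encryption scheme):
# the easy direction is one multiplication; the hard direction is
# trial division, which scales hopelessly as the numbers grow.
p, q = 1_000_003, 1_000_033
n = p * q
print(f"public product: {n}")

def factor(n):
    # Naive trial division: fine for this toy example, but for the
    # hundreds-of-digits numbers used in practice it would take
    # longer than the age of the universe.
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None

print(factor(n))  # recovers (1000003, 1000033) here; infeasible at real sizes
```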

