We are pleased to share an interesting article contributed by Ajay Malik, Head of Architecture/Engineering, Worldwide Corporate Networking & Services at Google.
As the amount of data in the world rapidly increases, so does the time required for machines to process it. Augmented Reality, Virtual Reality, Artificial Intelligence, Robotics, Real-Time Analytics, and Machine Learning algorithms need the cloud to be ever faster, with ever more computing power and an endless amount of storage. Interestingly, this is happening on the heels of the slowdown of Moore's Law. Chip maker Intel has signaled a slowing of Moore's Law, a technological phenomenon that has played a role in almost every significant advance in engineering and technology for decades. We are no longer able to cram transistors into circuits at the pace we once did.
The cloud will soon be powered by quantum computing, and software will be written in a fundamentally different way.
Although work has been in progress for over five decades, the last five years have seen major strides in this space, and 2017 can indeed be called the year of quantum computing.
IBM, Microsoft, Google, Intel, and D-Wave have all made tremendous advances this year. Quantum computing is now here to push the bounds of computer performance further forward.
What is Quantum Computing?
Unlike classical bits, which are always either 0 or 1, qubits can exist in a superposition of both states at once. These superpositions allow qubits to perform multiple calculations at once rather than in sequence like a traditional machine. For example, you can compute four calculations with two qubits.
Imagine a game in which a prize is hidden behind one of four doors, and the goal is to find it by opening as few doors as possible. A traditional computer will need, on average, a little over two operations to find the prize, because it has to open each door in succession. The quantum computer, however, can locate the prize with one operation, because it can open all four doors at once! With three qubits you can perform eight calculations. The number of such computations doubles with each additional qubit, leading to an exponential speed-up. A quantum computer with 500 qubits would have the potential to do 2^500 calculations (a number much larger than the Shannon Number) in a single operation.
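To make the four-door example concrete, here is a small classical simulation of the idea behind Grover's search over four items. It is a sketch in plain Python/NumPy, not code for any particular quantum SDK, and the choice of prize_door is an arbitrary illustration. For four items with one marked item, a single "oracle plus diffusion" step concentrates essentially all of the probability on the prize.

```python
import numpy as np

# Four "doors" encoded in the amplitudes of a 2-qubit state,
# starting in an equal superposition (every door "opened at once").
n_doors = 4
state = np.full(n_doors, 1 / np.sqrt(n_doors))

prize_door = 2  # hypothetical choice; any of the four doors works

# Oracle: flip the sign of the amplitude of the prize door.
oracle = np.eye(n_doors)
oracle[prize_door, prize_door] = -1

# Diffusion: reflect every amplitude about the mean amplitude.
diffusion = 2 * np.full((n_doors, n_doors), 1 / n_doors) - np.eye(n_doors)

# For 4 items with 1 marked item, a single Grover iteration suffices.
state = diffusion @ oracle @ state

probabilities = np.abs(state) ** 2
print(probabilities)                  # ~[0, 0, 1, 0]
print(int(np.argmax(probabilities)))  # 2, the prize door
```

Running it prints a probability of essentially 1.0 on the prize door after one quantum-style step, whereas a door-by-door search needs just over two openings on average.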
Top five things you should know about it
3. Quantum computing will revolutionize the way we approach machine learning and artificial intelligence, accelerating machine learning remarkably. Quantum computers could also reduce power consumption by a factor of 100 to 1,000 because they exploit quantum tunneling.
4. Quantum computing will destroy internet security as we know it today. It would be able to crack several of today's encryption techniques, such as RSA and ECC, within days if not hours (see the toy RSA sketch after this list). In this regard, quantum computing is a déjà vu of discovering the enormous energy locked inside the atom. Nuclear fission was discovered in 1938, nine months before the beginning of the Second World War, and it changed the world. Quantum computing could be the IT equivalent of the atomic bomb. We are now in a race against time to prepare new cryptographic techniques before the current ones are broken: security methods that let us protect data using the laws of physics rather than external cryptographic assumptions.
5. Quantum computing is not for every problem. Classical computers are still better than quantum computers at everyday tasks such as spreadsheets or desktop publishing. Quantum computing, however, will be incredibly useful for solving hard problems in chemistry, self-driving car coordination, financial modeling, weather forecasting, and particle physics.
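To see why factoring power matters for RSA (item 4 above), here is a deliberately tiny, hypothetical example. The primes, message, and brute-force factoring loop below are illustrative stand-ins: trial division plays the role that Shor's algorithm would play efficiently on a quantum computer, and it only succeeds here because the key is toy-sized.

```python
from math import gcd

# Toy RSA key with hypothetical tiny primes, so the "attack" below is instant.
p, q = 1009, 1013
n, e = p * q, 65537          # public key (n, e)
phi = (p - 1) * (q - 1)
assert gcd(e, phi) == 1
d = pow(e, -1, phi)          # private exponent (kept secret)

message = 42
ciphertext = pow(message, e, n)   # anyone can encrypt with the public key

# Attacker's view: only (n, e) and the ciphertext are known.
# Shor's algorithm would factor n efficiently on a quantum computer;
# trial division stands in for it here and only works because n is tiny.
factor = next(f for f in range(2, n) if n % f == 0)
p_found, q_found = factor, n // factor
d_found = pow(e, -1, (p_found - 1) * (q_found - 1))

print(pow(ciphertext, d_found, n))   # 42 -> plaintext recovered without the key
```

With real 2048-bit keys, the factoring step is hopeless for classical machines; a large, fault-tolerant quantum computer running Shor's algorithm would change that, which is why post-quantum cryptography matters now.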
Have you written your first quantum program yet? In the next few articles, I will go over how to program a quantum computer, how to determine which problems are best addressed with quantum computing versus classical computing, how it impacts machine learning, and how you will develop killer apps such as self-driving car coordination, as well as the challenges and solutions where cryptography and quantum computing intersect. Quantum computing revolutionizes the way we approach computer science and logic. A lot of algorithms will need to be redesigned or rewritten using quantum computing paradigms. I am looking forward to it!
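As a teaser, here is roughly the simplest "first quantum program" one can write, simulated classically in NumPy rather than on real hardware or any particular quantum SDK: a single qubit put into superposition by a Hadamard gate and then measured, the quantum version of a coin flip.

```python
import numpy as np

# A single qubit starts in the state |0>, represented as the vector [1, 0].
qubit = np.array([1.0, 0.0])

# The Hadamard gate puts it into an equal superposition of |0> and |1>.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
qubit = H @ qubit

# Measurement collapses the superposition; each outcome has probability 0.5.
probs = np.abs(qubit) ** 2
samples = np.random.choice([0, 1], size=1_000, p=probs)

print(probs)                 # [0.5 0.5]
print(np.bincount(samples))  # roughly [500 500]
```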
PS: Isn’t this a remarkable pic? The heading picture is from the October 1927 Fifth Solvay International Conference on Electrons and Photons, where the world’s most notable physicists met to discuss the newly formulated quantum theory. Seventeen of the 29 attendees were or became Nobel Prize winners!