Quantum mechanics emerged as a branch of physics in the early 1900s to explain nature at the scale of atoms, and it led to advances such as transistors, lasers, and magnetic resonance imaging. The idea of merging quantum mechanics and information theory arose in the 1970s but garnered little attention until 1982, when physicist Richard Feynman gave a talk arguing that computing based on classical logic could not tractably process calculations describing quantum phenomena. Computing based on quantum phenomena configured to simulate other quantum phenomena, however, would not be subject to the same bottlenecks. Although this application eventually became the field of quantum simulation, it didn't spark much research activity at the time.
In 1994, however, interest in quantum computing rose dramatically when mathematician Peter Shor developed a quantum algorithm that could find the prime factors of large numbers efficiently. Here, "efficiently" means in a time of practical relevance, which is beyond the capability of state-of-the-art classical algorithms. Although this may seem like a mere curiosity, it is impossible to overstate the importance of Shor's insight: the security of nearly every online transaction today relies on RSA cryptosystems, whose security hinges on the intractability of the factoring problem for classical algorithms.
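To see why factoring matters so much, consider a toy RSA-style key pair. The numbers below are textbook-sized illustrations, not real cryptographic parameters (real RSA moduli are thousands of bits long): the public key is built from the product of two secret primes, and an attacker who could factor that product could recompute the private key.

```python
# Toy illustration (not real cryptography): RSA security rests on the
# difficulty of recovering two secret primes from their public product.
p, q = 61, 53                 # secret primes; real RSA uses ~1000-bit primes
n = p * q                     # public modulus: 3233
phi = (p - 1) * (q - 1)       # 3120, computable only if you know p and q
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent: modular inverse of e (Python 3.8+)

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key d
assert recovered == message

# Anyone who can factor n back into p and q can recompute phi and d and
# read every message -- which is why an efficient factoring algorithm,
# such as Shor's running on a large quantum computer, would break RSA.
```

For these toy values the private key is trivial to recover, but doubling the bit-length of the primes roughly squares the work a classical factoring attack must do, which is what keeps real RSA safe today.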
Looking back over the past few decades, it’s incredible how much technology has advanced. The smartphones we carry today have more computing power than NASA had available to guide the Apollo 11 astronauts to the moon and back 50 years ago. We can thank Moore’s law and the relentless pace of innovation.
But even with the amazing advances of Moore’s law, some computational tasks seem just as infeasible as they did 50 years ago. For the current classical computing paradigm, certain types of computational complexity are fundamentally out of reach.
In fact, when complexity grows exponentially, it's difficult for the human mind to comprehend how quickly even the fastest supercomputers can be brought to their knees. For example, Sudoku, which uses merely a 9 x 9 grid of numbers and rules so simple that they can be learned in a minute, has more than 6,670,903,752,021,072,936,960 possible completed grids, a number comparable to estimates of the number of stars in the observable universe!
This is a simple example of a class of problems that suffer from combinatorial explosion: even a relatively small increase in the number of degrees of freedom (e.g. the number of electrons and atoms in a molecule, or the number of vertices in a graph) causes exponential growth in complexity, so a conventional computer may take eons to search through all the candidate solutions.
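A quick sketch makes the explosion concrete. The function below is a hypothetical illustration (not a Sudoku solver): it counts the raw assignments a naive brute-force search would have to consider if every cell could independently take any symbol, which is an upper bound on the search space.

```python
# Hypothetical illustration of combinatorial explosion: the naive search
# space for a grid puzzle grows as (choices per cell) ** (number of cells).
def search_space(num_cells: int, choices_per_cell: int) -> int:
    """Raw assignments a brute-force search over the grid would enumerate."""
    return choices_per_cell ** num_cells

# A 4x4 mini-Sudoku with 4 symbols vs. the full 9x9 grid with 9 symbols:
print(search_space(16, 4))   # 4**16, about 4.3 billion -- tractable
print(search_space(81, 9))   # 9**81, roughly 2 x 10**77 -- hopeless
```

Going from a 16-cell toy to the 81-cell grid doesn't make the problem five times harder; it multiplies the naive search space by a factor of about 10^68, which is the sense in which "a relatively small increase in degrees of freedom" defeats conventional computing.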
Over the past few decades, a new computing paradigm has emerged to tackle these types of challenges. From its origins in the theoretical musings and debates of physicists and mathematicians in the early 1980s, quantum computing has steadily moved from speculative research towards practical applications. Today it’s poised to offer a fundamentally new approach for navigating extreme computational complexity.
Instead of a classical bit, which must be either 1 or 0 (i.e. yes or no), a quantum bit, or qubit, can exist in a superposition: a weighted combination of 1 and 0 at the same time. And quantum entanglement means that multiple qubits can be linked so that they behave as a single connected entity, even if they're separated by great distances.
This mind-bending flexibility allows quantum computing, for the classes of problems that can leverage superposition and entanglement, to explore certain computational spaces far more efficiently than any classical machine. By harnessing the physics of the subatomic realm, quantum computers can run simulations, solve problems, and answer questions that even the most powerful classical supercomputers find difficult or impossible to tackle.
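Superposition and entanglement can be made concrete with the standard textbook state-vector model (this is a generic simulation sketch, not any vendor's API): a qubit is a unit vector of two complex amplitudes, a Hadamard gate creates an equal superposition, and a CNOT gate then entangles two qubits into a Bell state.

```python
import numpy as np

# A qubit is a unit vector in C^2; n qubits live in C^(2^n).
zero = np.array([1, 0], dtype=complex)

# Superposition: the Hadamard gate turns |0> into an equal mix of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
plus = H @ zero                      # amplitudes (1/sqrt(2), 1/sqrt(2))

# Entanglement: CNOT applied to (H|0>) tensor |0> yields the Bell state
# (|00> + |11>) / sqrt(2) -- measuring one qubit fixes the other.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(plus, zero)
print(np.round(bell.real, 3))        # amplitudes on |00>, |01>, |10>, |11>
```

The catch, and the reason quantum computers aren't simply simulated classically, is visible in the model itself: the state vector for n qubits has 2^n amplitudes, so a classical simulation of even a few hundred qubits is already out of reach.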
Publicly available quantum computing has been on the horizon for decades, yet there's still no reliable forecast for when products capable of widespread adoption will be released, what form those systems will take, or how broad and deep quantum computing's impact will be. Many technical challenges stand between today's machines and a broadly applicable quantum computer that is useful at scale. And quantum computers are racing against classical computers, which continue to improve rapidly in power, versatility, and ease of use.
One important bottleneck for quantum computers is error: qubits are extremely fragile, and their delicate superposition states can decohere before a calculation completes, so correctly reading out the result of a quantum computation is prone to a very high error rate.
In 2019, Samsung Catalyst Fund co-led with Mubadala a $55 million investment round in IonQ, a leader in quantum computing. With this funding, which adds to prior investments from NEA, GV, and Amazon, IonQ made its trapped-ion universal quantum computer accessible to businesses via the cloud. IonQ has developed the most powerful trapped-ion quantum computing system to date and has harnessed its technology to break new ground in quantum computing, such as generating the world's first quantum computer simulation of the water molecule.
Samsung has been active in the field of quantum computing and has recently invested in several start-ups, including IonQ, a US-based company building ion-trap quantum machines. IonQ has raised $84 million in total funding; its 2019 round was co-led by Samsung Catalyst Fund, a venture capital fund backed by the Korean electronics giant, and Mubadala Capital, a venture fund backed by the Government of the UAE.
At Samsung, we see a unique opportunity to accelerate this exciting industry by leveraging our strengths in core technology and manufacturing, combined with our market leadership in semiconductors, displays, and batteries. This could create a virtuous cycle: by helping to advance quantum computing, we may in turn benefit the development of those same technologies. It is far from certain, but such advances could help tackle problems that are too complex for today's tools.