Fujitsu Laboratories and Quantum Benchmark Begin Joint Research on Algorithms
Fujitsu Laboratories and Quantum Benchmark of Canada have announced that they will conduct joint research on quantum algorithms using Quantum Benchmark’s error suppression technology, with the aim of advancing the capabilities of current-generation quantum computing platforms.
Quantum Benchmark, a startup founded by leading researchers from the University of Waterloo’s Institute for Quantum Computing, is the leading provider of software solutions for error characterization, error suppression, and performance validation for quantum computing hardware.
In this collaborative research project, the companies will develop practical quantum algorithms utilizing Fujitsu’s AI algorithm development technology as well as the knowledge it has gained through Digital Annealer applications in finance, medicine, and materials development. Quantum Benchmark’s patented True-Q™ software system, which enables optimal performance on current hardware, is key to this development.
Accordingly, Fujitsu Laboratories and Quantum Benchmark will endeavor to solve problems in the fields of materials science, drug development, and finance that are intractable for conventional computers.
Issues and Development Background
Quantum computers are expected to perform a new form of computation by harnessing fundamental properties of the quantum world, such as entanglement and superposition. This is often explained by the idea that they can process both 0 and 1 at the same time, along with the continuum of states in between. This advantage comes from performing calculations with quantum bits, called “qubits”, unlike conventional computers, which process bits that can only be 0 or 1.
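As a rough illustration of this idea (a minimal sketch unrelated to either company’s software, with an arbitrarily chosen equal superposition), a single qubit can be modelled as a normalized two-component complex vector; measuring it yields 0 or 1 with probabilities given by the squared amplitudes:

```python
import numpy as np

# A single-qubit state |psi> = alpha|0> + beta|1>, stored as a
# 2-component complex vector. Here we pick an equal superposition.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
state = np.array([alpha, beta], dtype=complex)

# Measurement collapses the superposition: outcome 0 occurs with
# probability |alpha|^2 and outcome 1 with probability |beta|^2.
probs = np.abs(state) ** 2
outcomes = np.random.choice([0, 1], size=1000, p=probs)

print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}")
print(f"Measured 0: {np.sum(outcomes == 0)} times, "
      f"1: {np.sum(outcomes == 1)} times")
```

Before measurement the qubit carries both amplitudes at once; only on readout does it resolve to a single classical bit.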
However, qubits are fragile and highly vulnerable to errors and noise; as a computation runs, the effects of noise accumulate, rendering the results inaccurate. Since calculations for pharmaceuticals and materials are time-consuming, there is a need to develop error-suppression methods that enable algorithms to overcome the effects of noise.
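To see why long computations are especially affected, consider a toy model (a hedged sketch with an assumed per-gate error rate; it does not represent True-Q’s actual error characterization) in which each gate fails independently with probability p, so the chance of an error-free run decays exponentially with circuit depth:

```python
# Toy model of error accumulation: if each gate succeeds with
# probability (1 - p), an n-gate circuit runs error-free with
# probability (1 - p) ** n.
p = 0.001  # assumed per-gate error rate, for illustration only

for n_gates in (100, 1_000, 10_000):
    fidelity = (1 - p) ** n_gates
    print(f"{n_gates:>6} gates: {fidelity:.3%} chance of an error-free run")
```

Even at a 0.1% per-gate error rate, a 10,000-gate computation of the kind needed for chemistry or materials problems almost never completes without an error, which is why error suppression is essential for useful results on current hardware.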