Your questions answered.

By Jeremy First and Ken Odegard

First, thank you for joining the "A practical perspective on quantum computing" webcast. I was humbled by how many tuned in and by the positive feedback. I hope you found it useful.

When preparing for the talk, I intended to pack in as much information as possible, understanding that I might not address all the questions in the live sessions. Here, I want to step through the questions that I did not have time to address and provide links to the literature for further reading.

Q: Are the current comparisons fair? Some seem between optimized quantum vs. suboptimal classical algorithms/technology.

A: Comparing the two types of architectures can be very tricky, and it is often impossible to compare the algorithms directly. The assumptions and resources provided to each algorithm must be considered carefully to ensure that any quantum speedup results only from access to a quantum resource.

In some cases, however, the quantum algorithm is given a resource for free (quantum access to the data, for example) that is not inherent to the quantum device and that the classical algorithm is denied. Equalizing these assumptions can lead to the discovery of “quantum-inspired" classical algorithms, such as those discovered by Ewin Tang (see this paper and this paper for examples).

There is also the case of Google's quantum supremacy announcement, in which they performed a calculation on their Sycamore processor in 200 seconds that they claim would have taken a classical computer 10,000 years. The calculation itself was designed to be easy on a quantum computer and extremely slow on a classical computer, so the classical side of the comparison can certainly be considered suboptimal. Moreover, after Google's announcement, IBM demonstrated that the calculation could be completed in less than 2.5 days with some optimization. The purpose of Google’s experiment was to demonstrate a calculation that is faster on a quantum architecture, which they accomplished. Still, I do not think this means that the comparison is "fair".

There is active research into quantum benchmarking, but the only way to be confident in the comparisons is to perform them yourself. While this is not yet possible, it is extremely valuable for companies to be considering now what benchmarks they will run on quantum architectures as they mature.

Q: How can we get large data into QC efficiently?

A: There are many different ways to encode classical data onto the quantum state of a quantum computer. If the data is loaded bitwise into the states of the individual qubits, this is trivial and can be done with a circuit depth of \(1\) (an X gate, classically controlled on the corresponding bit, applied to each qubit in the register). If instead you want to encode the data in the Hilbert space of the quantum register, you will need to prepare the quantum state using a state preparation circuit prior to your algorithm. Because the dimension of the Hilbert space grows exponentially with the number of qubits, this allows you to store \(n\) classical values in \(\log_2(n)\) qubits, but it adds an additional \(\log(n)\) steps to the algorithm for the state preparation circuit. A great review of these types of circuits can be found here, and an article discussing applications of state preparation circuits to the encoding of features for quantum machine learning can be found here.
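To make the two encodings concrete, here is a minimal sketch in Qiskit (my own illustration, not from the talk; the data vectors are hypothetical), contrasting bitwise loading with amplitude encoding via a state preparation routine:

```python
import numpy as np
from qiskit import QuantumCircuit

# Bitwise (basis) encoding: one classical bit per qubit, depth 1.
bits = [1, 0, 1, 1]                      # hypothetical classical data
basis_circuit = QuantumCircuit(len(bits))
for i, b in enumerate(bits):
    if b:
        basis_circuit.x(i)               # flip the qubit if its bit is 1

# Amplitude encoding: n = 4 values stored in log2(4) = 2 qubits.
data = np.array([0.1, 0.4, 0.8, 0.2])    # hypothetical feature vector
state = data / np.linalg.norm(data)      # amplitudes must be normalized
amp_circuit = QuantumCircuit(2)
amp_circuit.initialize(state, [0, 1])    # state preparation routine

# The cost of amplitude encoding is hidden in this decomposition.
print(amp_circuit.decompose().decompose().count_ops())
```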

Q: Are there national research programs in the field of QC and does this field get government support?

A: Because of the eventual threat of QC to RSA encryption, QC research programs have seen broad government support across different countries. For example, here in the US, President Trump signed the National Quantum Initiative Act into law in 2018, which allocated over $1B of federal research money for quantum technology over five years, and this effort is expected to be expanded under President Biden. This is not unique to the US, and many European countries are also heavily invested in QC. A thorough report on government-led quantum initiatives around the world can be found here.

Q: How will QC impact the financial industry?

A: Because of the prevalence of large optimization problems and machine learning in finance, the financial industry may become the first to experience significant impacts from QC. Quantum speedups are expected in Monte-Carlo methods and portfolio optimization. Advantages in quantum machine learning have already been demonstrated on today’s NISQ devices. For a good review of the quantum algorithms and expected speedups for the financial industry, see this paper.

Q: The persistent and annoying one: how long before “it happens"…i.e. it becomes a viable and better computing paradigm.

A: It is, of course, impossible to predict the future, but quantum computing power is experiencing at least exponential growth, so we can reasonably extrapolate the progress of QC architectures. Defining when it will become a "viable and better computing paradigm", however, depends entirely on what applications you are interested in. If your algorithm can be ported efficiently to NISQ devices and solved variationally, quantum computing is already here. If you require fault tolerance, then the first fault-tolerant machines will likely be here within the next five years. Within the next ten to fifteen years, it is reasonable to expect these fault-tolerant machines to be of sufficient scale to be used in production-type workflows. If you are interested in workflows that do not contain a quantum speedup (see the following question on sorting), quantum computing will never be here, and there is no reason to switch from classical computing at all.

A great resource for keeping up to date with QC developments, without the hype, is the Quantum Computing Report mailing list. These often include expert surveys on how long it will take QC to break through to mainstream computing (see here, for example).

Q: Is there an algorithm for sorting?

A: Yes, there is a quantum algorithm for sorting (since \(\textbf{BPP} \subseteq \textbf{BQP}\), an efficient quantum algorithm exists for any efficient classical algorithm), but at \(n \cdot \log(n)\) comparisons, the classical algorithms are just as efficient (see this paper). Intuitively, sorting is a data-intensive process, so it makes sense that there might not be a speedup in this area. However, a quantum algorithm exists for space-constrained sorting that outperforms the classical alternative (see this paper).

Q: Where does quantum computing stand with computer simulations?

A: I am assuming this question is referring to physics-based computer simulations. QC architectures are currently far too immature to handle physics-based simulations. There are promising QC algorithms for solving partial differential equations (and thus simulating physics), such as the HHL algorithm, but they require many fault-tolerant qubits, which do not yet exist. There are several initiatives aimed at solving PDEs variationally (see the variational quantum linear solver) so that they can run on NISQ devices, but it is not yet certain whether variational approaches can achieve quantum advantages. See this review for more discussion of variational approaches to QC. A quantum advantage in physical simulations will likely only occur well into the fault-tolerant era of quantum computing.

Q: Do you have the reference for the approximate QFT on slide 29?

A: Thank you for asking this question, as it exposed a typo in the complexities I quoted for the algorithms. The AQFT only approaches a double exponential speedup, and its complexity should be \(\log(N)\cdot\log(\log(N))\). The slides misrepresented the complexity as \(\log(\log(N))\). Note here that \(N=2^n\), where \(n\) is the number of bits (or qubits) encoding your input. You can find references to the AQFT and its complexity here and here.
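If you want to see the trade-off in practice, Qiskit's circuit library exposes it through the approximation_degree parameter of its QFT class. A small sketch of my own (not from the slides):

```python
from qiskit.circuit.library import QFT

n = 5  # number of qubits, so N = 2**n

# The exact QFT on n qubits uses O(n^2) controlled-phase rotations.
exact = QFT(num_qubits=n)

# The AQFT drops the smallest-angle rotations (here, the two smallest
# per qubit), reducing the gate count towards O(n log n), i.e.
# O(log(N) * log(log(N))), at a negligible cost in fidelity.
approx = QFT(num_qubits=n, approximation_degree=2)

print("exact :", exact.decompose().count_ops())
print("approx:", approx.decompose().count_ops())
```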

Q: Quantum parallelism: Is it possible to take advantage to efficiently aggregate the results of a given computation applied to a range of inputs? E.g. a Monte-Carlo simulation computing the average of a large number of parallel paths?

A: Yes; for this reason, Monte-Carlo algorithms are known to have a quadratic speedup on quantum devices over the classical alternatives (see this paper). This class of quantum algorithms is expected to have a large impact on many different computational fields, including molecular simulations and finance. For finance-specific applications, Chapter 4 of this article gives an excellent overview of quantum Monte-Carlo algorithms.
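As a rough illustration of where the quadratic speedup comes from (a toy example of my own, not from the paper): the statistical error of a classical Monte-Carlo estimate shrinks like \(1/\sqrt{M}\) in the number of samples \(M\), whereas quantum amplitude estimation improves this to roughly \(1/M\).

```python
import numpy as np

rng = np.random.default_rng(0)

def classical_mc_error(M, trials=200):
    """Spread of a classical Monte-Carlo estimate of E[X^2], X ~ U(0,1)."""
    estimates = [np.mean(rng.uniform(0.0, 1.0, M) ** 2) for _ in range(trials)]
    return np.std(estimates)  # true value is 1/3; error falls like 1/sqrt(M)

for M in (100, 400, 1600, 6400):
    print(f"M = {M:5d}   classical error ~ {classical_mc_error(M):.4f}"
          "   (amplitude estimation would scale ~ 1/M)")
```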

Q: In an early slide - you said 1000 qubits are needed for correcting one logical qubit (due to errors). Is this a theoretical/mathematical result - or just a comment on current state-of-the-art hardware? So, could this 1000 become 100 eventually, and then 10 etc. with technological progress?

A: It is not a fundamental mathematical result; the 1000 figure is estimated from the quantum threshold theorem, current error rates, and state-of-the-art error correction codes. Any improvement in the hardware or the algorithms could lower the estimated requirement. The current estimate is typically based on a 0.1% physical qubit error rate and the surface code for error correction, which has an error threshold of roughly 1% for fault tolerance.
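As a back-of-the-envelope illustration of where the figure comes from (the constants below are commonly quoted surface-code estimates, not numbers from the talk): the logical error rate of a distance-\(d\) surface code scales roughly as

\[ p_L \approx 0.1\left(\frac{p}{p_{\mathrm{th}}}\right)^{(d+1)/2}, \]

so with a physical error rate \(p = 10^{-3}\) and a threshold \(p_{\mathrm{th}} = 10^{-2}\), reaching \(p_L \sim 10^{-12}\) requires a code distance of about \(d = 21\), and a distance-21 patch uses roughly \(2d^2 \approx 900\) physical qubits per logical qubit. Lower physical error rates or better codes shrink that number directly.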

Q: Could, for some applications, quantum computing based on QUTRITS (3 state system) give a possible advantage over qubits?

A: Yes! The Hilbert space for \(n\) qutrits has dimension \(3^n\), rather than \(2^n\) for qubits, so there is a computational advantage to qutrit systems over qubit systems (see this paper for an investigation into the advantage). From a theoretical standpoint, the same principles of quantum information theory that we applied to qubits can be applied to qutrits as well. Implementing qubit systems is still challenging enough that only a few groups actively research physical realizations of higher-dimensional quantum systems. However, there has been significant interest in using photonic qutrits (and beyond) for quantum communication protocols. Refs. 8-12 of the linked paper explore different protocols in cryptography and communication using qutrit systems.
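To put that dimension counting in concrete terms (a simple calculation, not a result from the linked paper): \(m\) qutrits match the state space of \(n\) qubits whenever

\[ 3^{m} \ge 2^{n} \quad\Longleftrightarrow\quad m \ge n \log_3 2 \approx 0.63\,n, \]

so roughly 63 qutrits span a Hilbert space as large as that of 100 qubits.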

Q: For optimization problems, is NAG already investing in Quantum Inspired methods? (running on classical hardware)

A: While quantum-inspired algorithms are exciting, particularly in light of the debate on whether \(\textbf{BPP} \subset \textbf{BQP}\) or \(\textbf{BPP} = \textbf{BQP}\), for practical implementations, they often carry very stringent requirements on the problem (on the size, rank, and sparsity of the matrix, for example). Because of their limited application, we are not currently invested in any particular quantum-inspired algorithm, but we are actively monitoring the literature. If you have a particular use case for a quantum-inspired algorithm, we would be happy to hear more details of the problem to see if it might be a good fit.

Q: What simulation software should I choose?

A: This is entirely subjective and up to the user, and there are many different packages to choose from. A curated list of available QC software and their descriptions can be found here. The most popular packages are Qiskit (IBM) and Cirq (Google). My personal favorite is Qiskit because of the Qiskit textbook and the interface to IBM’s physical QC architectures via their cloud-based IBM Quantum Experience. However, all of the coding examples shown on the slides used Cirq, which is also very easy to learn and read. If you decide to start with Cirq, I highly recommend the book Quantum Computing: An Applied Approach by Jack Hidary, which uses the Cirq library for its exercises.
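If you would like to see what Cirq code looks like before committing, here is a minimal Bell-state circuit (my own example, not one from the slides) run on the built-in simulator:

```python
import cirq

# Two qubits on a line and a circuit that entangles them.
q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit(
    cirq.H(q0),                     # put q0 into superposition
    cirq.CNOT(q0, q1),              # entangle q0 with q1
    cirq.measure(q0, q1, key="m"),  # measure both qubits
)

# Expect roughly a 50/50 split between outcomes 00 and 11.
result = cirq.Simulator().run(circuit, repetitions=1000)
print(circuit)
print(result.histogram(key="m"))
```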

Q: Which material applications/deployments (in terms of types of software) could QC have in the future and to which degree?

A: As previously stated, there is no way to predict what the future will hold for QC, or how far its reach will extend. QC will likely only have a significant impact on a select subset of specialty problems. That is, all the trouble of porting a problem to a quantum architecture is only worth it if the problem is in \(\textbf{BQP}\) but not in \(\textbf{BPP}\). We have not yet proven that such a problem exists. We know of problems (e.g., integer factoring, solved by Shor’s algorithm) that are in \(\textbf{BQP}\) and probably not in \(\textbf{BPP}\), but we have yet to prove that they are not in \(\textbf{BPP}\).

If you allow me to speculate: in the immediate future (<5 years), it is fair to expect QC to remain mostly academic, with lots of proof-of-concept projects on NISQ devices. It is not yet apparent whether variational algorithms on NISQ devices can achieve a quantum advantage. In the more distant future (say 5-10 years), fault-tolerant quantum computers will be realized and will grow rapidly. As fault-tolerant algorithms become possible, the emphasis on variational algorithms will diminish. Eventually (say >10 years), qubit resources will be large enough to run production-scale calculations.

Still, from a software perspective, QC will only provide a speedup to particular types of calculations. Hopefully, by the time QC is ready for production use, the software will be sufficiently abstracted that the computational kernels that benefit from a quantum architecture are automatically offloaded to the device, without a developer having to write the algorithms from scratch. Perhaps one day your HPC cluster will have CPUs, GPUs, and QPUs all heterogeneously tackling your compute problems.

Q: Will quantum computing make cryptocurrency worthless?

A: Hopefully not! The case of QC and cryptocurrency is intimately tied to QC and RSA encryption. Yes, QC will one day break RSA encryption through Shor’s algorithm, but it also provides a quantum-robust alternative through quantum key distribution (QKD). In the same way, QC will one day break the public-key cryptography securing current blockchains (which, like RSA, is vulnerable to Shor’s algorithm), but quantum blockchain protocols relying on QKD are already known (see, for example, this paper).

Q: Is NAG active in the QC space?

A: NAG is not currently engaged in research and development in quantum computing, but we are actively monitoring the literature and engaging with thought leaders in the space. While QC remains mostly academic for now, we look forward to contributing more tangibly when it becomes of practical value to our customers.

Q: Could you share some material on the physical implementation of quantum computing gates and components?

A: There are many different physical realizations of quantum computers, which I will summarize. The most mature of these are superconducting, ion-trap, and photonic quantum computers. Each system takes advantage of an isolated degree of freedom with a sufficient energy gap between its discrete levels, so that the states are distinguishable. For superconductors, this is usually the magnetic flux or charge state. For ion-trap architectures, this can be the electronic energy levels of the suspended ions. Gate operations can then be implemented through spectroscopic pulses on the qubit. At resonance, a pulse causes the populations of the qubit’s states to oscillate, allowing you to implement arbitrary single-qubit rotations depending on the length of the pulse.
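To make the pulse picture concrete (a standard textbook relation, not something specific to any one device): driving a qubit on resonance with Rabi frequency \(\Omega\) for a time \(t\) rotates its state through an angle \(\theta = \Omega t\), so the excited-state population oscillates as

\[ P_1(t) = \sin^2\!\left(\frac{\Omega t}{2}\right). \]

Choosing \(t = \pi/\Omega\) (a "\(\pi\) pulse") implements an \(X\) gate, while \(t = \pi/(2\Omega)\) gives the \(\sqrt{X}\) gate.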

There is a large amount of literature on the specific implementations for different architectures. For superconducting qubits, see this paper for a great discussion of gate operations and this paper for a more general review of the technology. For ion-trap architectures, see this review. For photonic quantum computers, see this review. The experimental side of QC is rapidly evolving, and new exotic implementations of qubits are realized frequently. Still, at a theoretical level, they can all be interpreted and understood using the same framework discussed in the talk.

Comments
Jean-Jacques
Hello, about Quantum Inspired methods for optimization problems, I had in mind papers such as this one: https://advances.sciencemag.org/content/5/4/eaav2372
Regards.
Jeremy First
I had not yet seen that paper, and it was a very interesting read, so thank you for sharing. Solving Ising models on classical architectures is indeed an expensive task with many practical applications, and the simulated bifurcation (SB) algorithm looks to be a promising way to efficiently obtain a solution without needing to rely on the existence of a physical quantum computer. I'll keep an eye out for future papers on SB.

As far as I know, NAG has not yet invested resources specifically into SB as an algorithm, but we do support simulated annealing (SA) for comparable problems. If you are interested in testing our library for your specific use case (and discussing further whether SB might be worth exploring), please do get in touch with support@nag.com, and we can discuss it further.
Magdi Mohamed
How about "Parallel" Sorting Algorithms?
In general, how does Quantum Computing compare to existing Parallel Architecture?
https://www.linkedin.com/pulse/parallel-sorting-architectures-magdi-mohamed/?trackingId=GuABkVx6LQ9qE9X20Ea6jg%3D%3D
Jeremy First
In general, quantum architectures are not yet mature enough to achieve parallelism across multiple nodes. This is in part because the No-Cloning theorem forbids the direct copying of data from one processor to another, so we have to re-think how parallel architectures will work. However, sharing a quantum state across multiple processors was recently achieved (DOI: 10.1126/science.abg1919), which is the first step in achieving a usable network of quantum processors.

With specific regard to parallel sorting algorithms, I was able to find a few papers that claim a quantum speedup for the perfect shuffle and merge sorting algorithms (DOI: 10.1109/H2RC49586.2019.00011 and DOI: 10.1109/TCSI.2005.856669), but I have not yet read these papers in depth to evaluate their feasibility on current quantum architectures. My expectation is that these algorithms are not robust against noise and will require fault tolerance, still many years away, before they can even begin to be tested on real devices.