IBM’s Jay Gambetta: From Quantum Devices to Quantum Computers—and a New Era of Algorithm Discovery

IBM Fellow and VP of IBM Quantum Jay Gambetta argued that the field has crossed a threshold: we now have quantum computers—not just devices—capable of running circuits beyond brute-force classical simulation. His three-part mission remains steady: (1) demonstrate systems operating beyond classical brute force, (2) deliver clear, scientifically verifiable quantum advantage, and (3) stay on a path to a fault-tolerant quantum computer by 2029.

Devices → Computers: Speed, Scale, Quality

Gambetta framed quantum processor capability along three axes: speed, scale, and quality. IBM’s superconducting qubits, he said, lead on execution speed, measured in CLOPS (circuit layer operations per second), which translates to a lower cost to run. He drew a dividing line between “devices” and “computers,” noting that machines with more than 100 qubits and steadily improving error rates are now usable as scientific tools. Others, he acknowledged, have entered this regime as well.

Production-Grade Operations: Uptime, Reliability, Footprint

Since putting the first quantum computer on the cloud in 2016, IBM has deployed more than 60 devices and now operates about two dozen quantum computers across its data centers. He cited fleet metrics, roughly 99.4% job success, 97% availability, and guaranteed access to at least one top-tier system, as evidence that quantum computing must be judged by system reliability as much as by qubit counts. IBM also supports on-premises installations with strategic partners worldwide, including Japan, Germany, Spain’s Basque Country, India, and the U.S. (Chicago).

The Software Stack: Open, Performant, and Extensible

On software, Gambetta spotlighted Qiskit as the “gold standard,” with broad developer adoption, thousands of dependent projects, and recent work that made its compiler dramatically faster and more circuit-efficient. He emphasized an open ecosystem: Qiskit add-ons from partners, and Qiskit Functions, service wrappers that let enterprises plug proprietary IP into cloud workflows, bridging academic research and commercial use.
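
For a concrete sense of the compile step, here is a minimal sketch (an illustration, not code from the talk) of Qiskit’s transpile API, using the library’s generic simulated backend as a stand-in for real hardware; much of the speedup Gambetta alluded to comes from reimplementing core transpiler passes in Rust:

```python
from qiskit import QuantumCircuit, transpile
from qiskit.providers.fake_provider import GenericBackendV2

# Small illustrative circuit: a 3-qubit GHZ-style entangler
qc = QuantumCircuit(3)
qc.h(0)
qc.cx(0, 1)
qc.cx(1, 2)
qc.measure_all()

# GenericBackendV2 mimics a device's basis gates and connectivity;
# optimization_level=3 enables the heaviest optimization passes
backend = GenericBackendV2(num_qubits=5)
compiled = transpile(qc, backend=backend, optimization_level=3)
print("depth before:", qc.depth(), "after:", compiled.depth())
```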

Beyond Use Cases: Building Algorithm “Horizontals”

Rather than chase one-off demos, Gambetta called for a portfolio of computational horizontals that map to many industries:

  • Hamiltonian simulation (chemistry/materials): Work with RIKEN and Oak Ridge mixes quantum results with classical solvers to push into regimes hard for exact classical methods, including real-time dynamics on lattice models (with Basque collaborators); a minimal Trotterization sketch follows this list.

  • Optimization: A growing community is assembling benchmark suites; early studies with Los Alamos on multi-objective optimization and hybrid quantum-classical loops show promising separations from top classical baselines in specific problem classes (see the variational-loop sketch after this list).

  • Quantum machine learning: Structured-data approaches (e.g., covariant kernels) scale to larger problems; startups are exploring Hamiltonian data encoding to bypass classical data-ingest bottlenecks.

  • Differential equations: IBM teams reported an end-to-end algorithm indicating exponential-type speedups for stochastic differential equations, with related work (e.g., MIT) probing nonlinear dynamics and fusion-relevant models; alternative formulations map PDEs to loss-minimization objectives.
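
To make the Hamiltonian-simulation horizontal concrete, here is a minimal Trotterization sketch in Qiskit, using an illustrative transverse-field Ising chain (my example, not code from the RIKEN, Oak Ridge, or Basque collaborations):

```python
from qiskit import QuantumCircuit
from qiskit.circuit.library import PauliEvolutionGate
from qiskit.quantum_info import SparsePauliOp
from qiskit.synthesis import SuzukiTrotter

# Transverse-field Ising chain on 4 qubits (illustrative lattice model)
hamiltonian = SparsePauliOp.from_list([
    ("ZZII", 1.0), ("IZZI", 1.0), ("IIZZ", 1.0),                  # couplings
    ("XIII", 0.5), ("IXII", 0.5), ("IIXI", 0.5), ("IIIX", 0.5),   # transverse field
])

# Second-order Suzuki-Trotter product formula with 4 repetitions;
# more reps shrink the approximation error at the cost of circuit depth
evolution = PauliEvolutionGate(hamiltonian, time=1.0,
                               synthesis=SuzukiTrotter(order=2, reps=4))

qc = QuantumCircuit(4)
qc.append(evolution, range(4))
print(qc.decompose().count_ops())
```

Utility-scale experiments apply the same product-formula idea to 100+ qubit lattices, with error mitigation layered on top.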

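The hybrid quantum-classical loop mentioned under optimization follows a simple pattern: a classical optimizer proposes parameters, a quantum circuit evaluates the cost, and the loop repeats. Below is a minimal variational sketch using Qiskit’s reference Estimator primitive and SciPy, on a toy Max-Cut Hamiltonian of my choosing (an illustration of the pattern, not the Los Alamos study):

```python
import numpy as np
from scipy.optimize import minimize
from qiskit.circuit import QuantumCircuit, ParameterVector
from qiskit.quantum_info import SparsePauliOp
from qiskit.primitives import StatevectorEstimator

# Toy cost Hamiltonian: Max-Cut on a 3-node triangle graph
cost_op = SparsePauliOp.from_list([("ZZI", 0.5), ("IZZ", 0.5), ("ZIZ", 0.5)])

# Hardware-efficient ansatz: Ry rotations with CX entanglers
theta = ParameterVector("theta", 6)
ansatz = QuantumCircuit(3)
for i in range(3):
    ansatz.ry(theta[i], i)
ansatz.cx(0, 1)
ansatz.cx(1, 2)
for i in range(3):
    ansatz.ry(theta[3 + i], i)

estimator = StatevectorEstimator()

def energy(params):
    # One quantum evaluation per step of the classical loop
    result = estimator.run([(ansatz, cost_op, params)]).result()
    return float(result[0].data.evs)

rng = np.random.default_rng(7)
out = minimize(energy, rng.uniform(0, 2 * np.pi, 6), method="COBYLA")
print("minimum energy found:", out.fun)
```

On hardware, the reference StatevectorEstimator would be swapped for the Qiskit Runtime Estimator while the classical optimizer stays unchanged.
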
The point, he said, is to establish reusable building blocks where quantum shows reliable, reproducible advantages—then let industry use cases (from life sciences to energy and finance) flow from those foundations.

Partnerships as a Flywheel

IBM’s expanding network—universities, labs, startups, and enterprises—anchors the push from discovery to deployment. Examples included long-term collaborations in Japan (University of Tokyo, RIKEN), the Basque materials community, Chicago’s algorithm initiatives, and RPI on quantum-HPC integration (e.g., scheduling with SLURM). The aim: train talent, co-develop algorithms, seed startups, and embed quantum in HPC without reinventing the stack.

The Takeaway

Gambetta’s thesis is pragmatic and forward-leaning: the field has systems you can do science with today. The fastest path to broad utility is algorithm discovery across common computational horizontals, underpinned by reliable hardware, serious software, and open partnerships. Do that, he argued, and the milestones—verifiable advantage and, eventually, fault tolerance—stay within reach.
