From Energy Grids to Satellites and Blockchain: Cross-Sector Panel Shows Quantum’s First Real-World Wave Is Already Here

At Quantum World Congress 2025, a cross-sector panel led by Allison Schwartz, Head of Global Government Relations & Public Affairs at D-Wave, made a clear case: the first wave of applied quantum computing isn’t “five years away” — it’s already arriving in energy, logistics, national infrastructure, and space.

Schwartz framed the conversation as a three-way dialogue between hardware, software, and government, joined onstage by Rima Oueid of the U.S. Department of Energy (DOE), Dana Linnet, CEO of Artificial Brain, and Andrew King, a senior researcher at D-Wave. Together, they explored where real deployments are emerging, how quantum and AI intersect, and what governments must do to keep pace with global competitors.

We need to tell a better story about return on investment—phased, measurable, and grounded in what today’s hybrid quantum systems can already do for energy security and grid resilience.
— Rima Oueid, U.S. Department of Energy

From the government side, Oueid highlighted DOE’s Quantum Sandbox, an effort explicitly focused on applications rather than just basic research. After a first phase of investment in quantum information science centers, DOE is now asking a different question: What can we do with what we have today? In the energy sector, she noted, there are already compelling use cases for quantum annealing and noisy intermediate-scale quantum (NISQ) devices — especially in grid optimization and contingency analysis where probabilistic ranges, not exact answers, are what matter. The sandbox is about “learning by doing”: building algorithms, experimenting with hybrid quantum–AI approaches, and using live infrastructure models to probe where quantum makes sense.
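To make the "learning by doing" idea concrete, the kind of contingency screening Oueid described can be posed as a small QUBO and sampled, with the spread of returned samples standing in for the probabilistic ranges grid planners care about. The sketch below is illustrative only: it is not DOE sandbox code, the line names and penalty weights are invented, and it uses dimod's classical reference sampler in place of a quantum annealer.

```python
# Illustrative sketch only, not DOE sandbox code: pose a toy grid
# contingency-screening problem as a QUBO and sample it, reading the
# spread of low-energy samples as a probabilistic risk ranking rather
# than a single exact answer. Line names and penalty weights are invented.
# Requires the open-source `dimod` package (pip install dimod).
import dimod

Q = {
    ("line_a", "line_a"): -1.0,   # standalone outage risk for each line
    ("line_b", "line_b"): -0.8,
    ("line_c", "line_c"): -0.6,
    ("line_a", "line_b"): 2.0,    # joint-outage penalty: a and b overload each other
    ("line_b", "line_c"): 1.5,
}
bqm = dimod.BinaryQuadraticModel.from_qubo(Q)

# A classical reference sampler stands in for a quantum annealer here;
# against live hardware one would swap in a sampler from dwave.system.
sampler = dimod.SimulatedAnnealingSampler()
sampleset = sampler.sample(bqm, num_reads=100).aggregate()

# Report how often each contingency pattern appears: a probability-weighted
# range of risky configurations rather than one definitive answer.
for sample, energy, count in sampleset.data(["sample", "energy", "num_occurrences"]):
    print(sample, round(energy, 2), f"{count}/100")
```

Sampling many reads rather than taking a single optimum mirrors Oueid's framing: contingency analysis needs a weighted picture of plausible failure combinations, not one exact answer.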

Linnet offered the software-and-startup view from Artificial Brain, which has built a drag-and-drop quantum software platform called PLANC for complex optimization. She described their work modeling energy markets end-to-end — production, distribution across multiple jurisdictions, grid operations, and consumer demand — to simultaneously improve resilience and commercial performance. In a live demo at D-Wave’s Qubits conference, PLANC optimized a problem with 60,000 constraints and variables on D-Wave hardware in just 16 milliseconds, far beyond the scale where traditional AI methods typically break down. For her, this is proof that quantum is already usable for real-world optimization, not just a science project.

King connected these themes back to D-Wave’s production systems. On the customer side, he pointed to job-shop scheduling and workforce planning deployments such as Ford Otosan’s work in automotive production and Pattison Food Group’s personnel scheduling, where quantum annealing has reduced planning time by up to 85%. On the research side, D-Wave has demonstrated beyond-classical performance in quantum simulation, tapping annealers’ strengths in simulating quantum dynamics and solving hard optimization problems that surface everywhere from networks to logistics.
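For readers unfamiliar with how such scheduling problems reach an annealer, the usual pattern is to express assignments as binary variables, attach hard constraints and a cost objective, and hand the model to a hybrid solver. The toy below is a hedged illustration, not Ford Otosan's or Pattison Food Group's production model; the workers, shifts, and costs are made up, and it only builds the model with dimod rather than submitting it to Leap.

```python
# Hedged toy example, not Ford Otosan's or Pattison Food Group's actual
# model: a tiny shift-assignment problem written as a constrained
# quadratic model of the kind D-Wave's hybrid solvers accept. Workers,
# shifts, and preference costs are all invented for illustration.
import dimod

workers = ["w1", "w2", "w3"]
shifts = ["early", "late"]
cost = {  # hypothetical preference cost: lower is better for that pairing
    ("w1", "early"): 1, ("w1", "late"): 3,
    ("w2", "early"): 2, ("w2", "late"): 1,
    ("w3", "early"): 3, ("w3", "late"): 2,
}

# One binary variable per worker/shift pairing: 1 means "assigned".
x = {(w, s): dimod.Binary(f"{w}_{s}") for w in workers for s in shifts}

cqm = dimod.ConstrainedQuadraticModel()
cqm.set_objective(sum(cost[w, s] * x[w, s] for w in workers for s in shifts))

for w in workers:  # each worker works exactly one shift
    cqm.add_constraint(sum(x[w, s] for s in shifts) == 1, label=f"one_shift_{w}")
for s in shifts:   # each shift is covered by at least one worker
    cqm.add_constraint(sum(x[w, s] for w in workers) >= 1, label=f"cover_{s}")

# With Leap access this model would be submitted via
# dwave.system.LeapHybridCQMSampler; here we only confirm it builds.
print(len(cqm.variables), "variables,", len(cqm.constraints), "constraints")
```

Real deployments encode far richer rules, but the basic shape of binary decisions, hard constraints, and a cost objective is the same one behind the scheduling examples King cited.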

We have enough quantum capability now to be using it and deploying it. Other countries are doing it. The question is whether we’re willing to move from science projects to real, funded applications.
— Dana Linnet, Artificial Brain

The panel then zoomed out into emergency response and space-based infrastructure. Linnet described Artificial Brain’s satellite Earth observation optimization work, which recently won a European Space Agency prize. By dynamically redirecting constellations to capture wildfire or other critical imagery, PLANC can optimize what each satellite observes across thousands of assets. She also showcased PLANC Hyperspectral, a patented/patent-pending approach to auto-labeling hyperspectral imagery. Running on IBM hardware, the system labeled an image in one second using just 2–20 reference points and 19 qubits, compared with conventional AI methods that require thousands of reference points and far more time. Given the billions of hyperspectral images being collected, this could unlock enormous latent value for defense, climate, and disaster response.

Oueid added a forward-looking perspective on quantum in space. Microgravity, she explained, can reduce certain noise sources, increase bandwidth, and potentially change how quantum devices behave. Experiments on the International Space Station are already growing crystals and novel materials that could become key components for quantum computing or advanced semiconductors — potentially reshaping global supply chains. At the same time, quantum sensing and secure communications can help protect space assets and enable new modes of space-based infrastructure.

Inevitably, the conversation turned to AI’s energy appetite and blockchain. King noted that as AI workloads scale, data centers are increasingly eyeing nuclear reactors to keep up. Quantum systems, especially annealers, can solve sampling problems not just faster but far more efficiently in terms of energy per solution, making them attractive as part of future AI and data center architectures. He also referenced D-Wave’s recent work on a proof-of-quantum-work blockchain, replacing classical hashing with a quantum sampling function to sharply reduce the energy cost of consensus mechanisms.

In one of the panel’s most candid moments, Linnet critiqued the U.S. government’s current posture. Despite rhetoric about staying ahead of China, she argued that U.S. legislative and funding frameworks have not kept pace with the country’s ambitions: the National Quantum Initiative Act has not been reauthorized, and application-focused funding is limited compared to China, Europe, and the Gulf states, where large-scale deployments and generous ROI-backed investments are already underway. She warned of a coming “Sputnik moment” if the U.S. continues to insist that quantum is always five years away.

Oueid largely agreed on the need for phase-based ROI storytelling and public–private partnerships. DOE, she said, is pivoting from phase-one basic research to more applied phase-two work and is already planning for a phase three that focuses squarely on deployment and commercial uptake. To get there, the community needs to articulate the return on investment at each stage — from today’s hybrid optimizations to tomorrow’s fully fault-tolerant systems — and build clearer roadmaps for how current capabilities bridge into future breakthroughs.

Schwartz closed the session by returning to the panel’s central theme: collaboration across hardware, software, and government. From grid resilience to satellite tasking to blockchain and AI, the first generation of applied quantum systems is already being tested, deployed, and refined. The question now is whether industry and policymakers can match that technical progress with the funding, policy frameworks, and shared messaging needed to scale it.
