Climate modelling — what quantum changes in 2027
Climate models built on classical supercomputers have hit a wall that more cores cannot break through. The atmosphere is a coupled non-linear system with chemistry, fluid dynamics, radiative transfer and ocean–ice feedback all running at once, and the bits we still resolve poorly — sub-grid cloud microphysics, aerosol nucleation, ocean turbulent mixing, soil biogeochemistry — are exactly the bits where the physics turns quantum mechanical at the molecular layer. By 2027, when the first 100-qubit superconducting machines come online in Europe under sovereign-compute mandates, the question stops being "can quantum help climate science" and becomes "which subroutines do you offload first, and how do you stitch the answers back into a CMIP-class model run on classical hardware". That is the engineering problem worth talking about.
Where classical climate modelling actually breaks
If you sit with the output of a modern Earth-system model — CESM, EC-Earth, UKESM, IFS — the dominant uncertainty is not the dynamical core. The Navier–Stokes solver, the spectral transforms, the semi-Lagrangian advection — those are mature. The uncertainty lives in the parametrisations: the empirical functions standing in for processes we cannot resolve at grid scale. Cloud feedback alone accounts for roughly half the spread across IPCC AR6 climate sensitivity estimates. Aerosol–cloud interaction is worse. And underneath every parametrisation sits a chemistry or thermodynamics problem that, written down honestly, is a many-body Schrödinger equation we are approximating with empirical fits or density-functional theory at modest accuracy.
This is the structural opening for quantum. Not replacing the climate model — the climate model is a 50-million-line classical artefact and will remain one — but replacing the worst-conditioned subroutines with quantum kernels that compute molecular-scale truths the classical model currently fudges.
What a 100-qubit superconducting machine can actually compute
Let us be honest about hardware in 2027. A 100-physical-qubit transmon processor on a heavy-hex topology, running in a dilution refrigerator at sub-15 mK, with two-qubit gate fidelities in the 99.5–99.9% range and circuit depths bounded by T1 and T2 coherence times, is not a fault-tolerant device. Surface-code logical qubits remain a roadmap item, not a 2027 deliverable. What you have is a noisy intermediate-scale quantum (NISQ) machine of the larger sort, accessed via OpenQASM 3 and orchestrated through Qiskit, PennyLane or Cirq pipelines.
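To make that concrete, here is a minimal sketch of the access pattern in Qiskit: build a circuit, transpile it onto a heavy-hex coupling map, and sanity-check the depth against a coherence budget before burning shots. The gate duration and T2 figure are illustrative assumptions, not any vendor's specification, and the 12-qubit circuit is a stand-in for a real ansatz.

```python
# Minimal access-pattern sketch: circuit -> heavy-hex transpile -> depth check.
# Gate duration and T2 below are illustrative assumptions, not vendor specs.
from qiskit import QuantumCircuit, qasm3, transpile
from qiskit.transpiler import CouplingMap

# Toy hardware-efficient layers on 12 qubits, standing in for a real ansatz.
qc = QuantumCircuit(12)
for layer in range(4):
    for q in range(12):
        qc.ry(0.1 * (layer + 1), q)
    for q in range(11):
        qc.cx(q, q + 1)

cmap = CouplingMap.from_heavy_hex(7)  # a heavy-hex lattice in the ~100-qubit class
print(f"lattice size: {cmap.size()} physical qubits")

tqc = transpile(qc, coupling_map=cmap,
                basis_gates=["rz", "sx", "x", "cx"],
                optimization_level=3)

# Crude duration bound: pretend every layer is a 300 ns two-qubit gate.
est_us = tqc.depth() * 0.3
print(f"transpiled depth {tqc.depth()}, ~{est_us:.1f} us vs an assumed T2 of 100 us")

print(qasm3.dumps(tqc)[:300])  # the OpenQASM 3 the device actually receives
```

If the estimated duration approaches the coherence time, the circuit will not survive on real hardware, whatever the simulator says. That back-of-envelope check is the first filter on every use case below.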
That is enough for specific things. Variational quantum eigensolver (VQE) runs on small molecular Hamiltonians — tens of spin-orbitals — with active-space reduction and symmetry-adapted ansätze. Quantum approximate optimisation (QAOA) on combinatorial sub-problems with a few hundred binary variables. Quantum-enhanced sampling for stochastic differential equations underlying turbulence closures. Hamiltonian simulation of vibrational modes for infrared absorption cross-sections. None of these solves a full climate model. All of them feed inputs into one.
The atmospheric quantum simulation use cases that actually matter
The honest list of climate-relevant problems where a 100-qubit machine produces something a 100,000-core classical run cannot, by 2027:
- Atmospheric photochemistry of short-lived radicals. The OH radical governs methane lifetime in the troposphere. Its reaction kinetics with isoprene, dimethyl sulfide and halogenated species involve open-shell transition states where DFT routinely disagrees with experiment by factors of two. VQE with a properly chosen active space gets you closer to chemical accuracy on the rate-determining step. A minimal sketch of the active-space pattern follows this list.
- Aerosol nucleation thermodynamics. The formation of new particles from sulfuric acid, ammonia, amines and oxidised organics is a quantum cluster-formation problem. Classical molecular dynamics with empirical force fields underestimates binding energies of the smallest clusters — the ones that decide whether nucleation happens at all — by enough to shift cloud-condensation-nuclei budgets meaningfully.
- Carbon capture sorbent screening. Amine-functionalised metal–organic frameworks, alkaline earth carbonates, novel covalent organic frameworks. The CO₂ binding energetics and selectivity over N₂ and H₂O are governed by orbital interactions that quantum chemistry handles natively.
- Photovoltaic and photocatalytic materials. Excited-state dynamics in candidate perovskites, organic photovoltaics, water-splitting catalysts. Time-dependent DFT struggles with charge-transfer states; quantum subspace methods handle them more honestly.
- Battery electrolyte and electrode chemistry. Specifically the solid-electrolyte interphase formation reactions that decide the cycle life of grid-scale storage.
- Grid optimisation under high renewable penetration. Unit commitment with stochastic wind and solar inputs is a mixed-integer program where QAOA-style heuristics on a few hundred logical variables become competitive with classical branch-and-cut on certain problem structures.
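As promised in the first item, here is a minimal active-space VQE sketch in PennyLane. Water serves as a closed-shell stand-in for the open-shell OH chemistry (the real radical systems need a larger active space, careful spin treatment, and energies along the whole reaction path rather than a single point), but the active-space reduction and the ansatz pattern are the same.

```python
# Active-space VQE sketch in PennyLane; water as a closed-shell stand-in.
import pennylane as qml
from pennylane import numpy as np

symbols = ["O", "H", "H"]
coordinates = np.array([0.0, 0.0, 0.0,
                        0.0, 1.432, -1.107,
                        0.0, -1.432, -1.107])  # bohr

# Active-space reduction: freeze everything except 4 electrons in 4 orbitals,
# giving an 8-qubit problem instead of a full-space one.
H, n_qubits = qml.qchem.molecular_hamiltonian(
    symbols, coordinates, active_electrons=4, active_orbitals=4
)

electrons = 4
hf = qml.qchem.hf_state(electrons, n_qubits)
singles, doubles = qml.qchem.excitations(electrons, n_qubits)

dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def energy(params):
    # Symmetry-preserving singles-and-doubles ansatz on the Hartree-Fock state.
    qml.AllSinglesDoubles(params, range(n_qubits), hf, singles, doubles)
    return qml.expval(H)

params = np.zeros(len(singles) + len(doubles), requires_grad=True)
opt = qml.GradientDescentOptimizer(stepsize=0.4)
for _ in range(40):
    params, e = opt.step_and_cost(energy, params)
print(f"active-space ground-state estimate: {e:.6f} Ha")
```

On a 2027 device the `default.qubit` simulator is swapped for a hardware backend and the expectation values come back noisy; the active-space choice, not the optimiser, is where the chemistry judgement lives.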
Notice the framing. None of these is "the climate model". All of them are inputs to the climate model that are currently estimated with empirical fits, low-accuracy electronic-structure methods, or laboratory measurements that simply do not exist for the species involved.
How you actually wire a quantum kernel into a CMIP run
This is the engineering most quantum talks skip. The climate model runs for months on a classical machine, producing terabytes per simulated year. The quantum machine produces, per circuit execution, a handful of expectation values or sampled bitstrings. The bandwidth mismatch is enormous and the latencies do not align.
The integration pattern that works is offline pre-computation of lookup tables, not real-time coupling. You identify a parametrisation in the climate model — say, the temperature- and pressure-dependent rate coefficient for OH + isoprene — and you replace its empirical Arrhenius fit with a quantum-computed surface across the relevant (T, P, composition) grid. You run that quantum job once, you tabulate the result, you ship the table to the climate model, and the model interpolates as it always did. The quantum computer is a chemistry oracle consulted offline, not an inner-loop accelerator.
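A sketch of that pattern is below. `quantum_rate_coefficient` is a hypothetical stand-in for the entire offline quantum pipeline (VQE barrier heights fed through transition-state theory); the climate model only ever sees the interpolated table.

```python
# Offline lookup-table pattern: run quantum jobs once, tabulate, interpolate.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

T_grid = np.linspace(180.0, 320.0, 15)    # K, tropospheric range
P_grid = np.linspace(100.0, 1050.0, 20)   # hPa

def quantum_rate_coefficient(T, P):
    # Hypothetical stand-in for the offline pipeline (VQE barrier heights
    # fed through transition-state theory). An Arrhenius-shaped dummy keeps
    # the sketch runnable.
    return 1.0e-10 * np.exp(-400.0 / T)

# The expensive quantum step happens ONCE, offline; the result is a file.
table = np.array([[quantum_rate_coefficient(T, P) for P in P_grid]
                  for T in T_grid])
np.save("k_oh_isoprene.npy", table)

# Inside the climate model, interpolation works exactly as it did with the
# old empirical fit; nothing downstream knows a quantum machine was involved.
k_interp = RegularGridInterpolator((T_grid, P_grid), table)
k = k_interp([[288.0, 1013.25]])  # one grid cell's (T, P)
```

The design choice worth noticing: the interface between the two machines is a file, not an RPC. That is what makes the bandwidth and latency mismatch a non-problem.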
For a small set of problems — primarily online optimisation of grid dispatch or adaptive observational network design — you can imagine tighter coupling, with the quantum machine queried hourly or daily over a network. But for atmospheric chemistry, the offline lookup-table pattern is the realistic 2027 architecture. It also means that one quantum machine, well utilised, can serve a whole community of classical climate modellers. That is a sovereign-compute argument, not a per-user one. We have written about the broader Ireland Quantum 100 programme and why the workload prioritisation is what it is.
Quantum advantage for climate: what would actually count as proof
The phrase "quantum advantage" has been so abused in the literature that it is worth being narrow. For climate, the bar I would accept as a serious claim, on a 100-qubit superconducting machine in 2027, looks like this:
- A reaction rate or binding energy computed for a climate-relevant species where the quantum result agrees with the best available experimental measurement, and the best classical electronic-structure method (CCSD(T) with a large basis set, or equivalent) either disagrees materially or is computationally intractable for the system size.
- An aerosol cluster formation free energy where the quantum-corrected value, plugged into a global aerosol model, shifts the simulated CCN budget by an amount detectable against observational constraints.
- A grid-optimisation instance derived from real ENTSO-E or EirGrid scenarios where a quantum heuristic finds a feasible dispatch that classical solvers, given the same wall-clock budget, do not. A toy version of this encoding is sketched below.
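For the grid case, the encoding step looks like this: a toy unit-commitment QUBO mapped to an Ising cost Hamiltonian and optimised with QAOA in PennyLane. Every number here (generation capacities, start-up costs, the demand penalty) is invented for illustration; a real ENTSO-E instance carries ramp limits, reserve constraints and orders of magnitude more variables.

```python
# Toy unit-commitment QUBO -> Ising -> QAOA in PennyLane. All numbers invented.
import pennylane as qml
from pennylane import numpy as np

n_units = 4
gen = np.array([30.0, 25.0, 20.0, 15.0])  # MW per unit (assumed)
cost = np.array([4.0, 3.0, 2.5, 2.0])     # start-up cost per unit (assumed)
demand = 50.0                              # MW that must be covered
lam = 0.05                                 # demand-penalty weight

# Objective: sum_i cost_i x_i + lam * (sum_i gen_i x_i - demand)^2, x_i in {0,1}.
# Substitute x_i = (1 - Z_i)/2 and drop the constant (it cannot change argmin).
a = cost + lam * (gen**2 - 2.0 * demand * gen)  # linear QUBO coefficients
z = -a / 2.0
zz = {}
for i in range(n_units):
    for j in range(i + 1, n_units):
        b = 2.0 * lam * gen[i] * gen[j]  # quadratic QUBO coefficient
        z[i] -= b / 4.0
        z[j] -= b / 4.0
        zz[(i, j)] = b / 4.0

cost_h = qml.Hamiltonian(
    list(z) + list(zz.values()),
    [qml.PauliZ(i) for i in range(n_units)]
    + [qml.PauliZ(i) @ qml.PauliZ(j) for (i, j) in zz],
)
mixer_h = qml.Hamiltonian([1.0] * n_units,
                          [qml.PauliX(i) for i in range(n_units)])

depth = 2
dev = qml.device("default.qubit", wires=n_units)

def qaoa_layer(gamma, alpha):
    qml.qaoa.cost_layer(gamma, cost_h)
    qml.qaoa.mixer_layer(alpha, mixer_h)

@qml.qnode(dev)
def expected_cost(params):
    for w in range(n_units):
        qml.Hadamard(wires=w)
    qml.layer(qaoa_layer, depth, params[0], params[1])
    return qml.expval(cost_h)

@qml.qnode(dev)
def probs(params):
    for w in range(n_units):
        qml.Hadamard(wires=w)
    qml.layer(qaoa_layer, depth, params[0], params[1])
    return qml.probs(wires=range(n_units))

params = np.array([[0.5, 0.5], [0.5, 0.5]], requires_grad=True)
opt = qml.AdamOptimizer(0.1)
for _ in range(60):
    params = opt.step(expected_cost, params)

best = int(np.argmax(probs(params)))
print(f"most probable commitment: {best:0{n_units}b}  (1 = unit on)")
```

The falsifiable claim in the bullet above is about the sampled bitstrings, not the expectation value: does the quantum heuristic hand back a feasible dispatch the classical solver missed in the same wall-clock window?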
None of these is "the quantum computer simulated the climate". All of them are defensible, narrow, falsifiable claims. That is the standard worth holding 2027 results to. If you see a press release that does not pass this bar, treat it as marketing.
Why sovereign capacity matters for this specific workload
Climate science is a public good and the data is shared. But the priority queues on commercial quantum cloud services are not allocated by climate need; they are allocated by paying customers, and finance and pharma outbid climate every time. A sovereign machine with an explicit climate-first cohort policy changes the economics of who gets shots on the device. It also changes the data residency picture for European researchers operating under GDPR and increasingly under the EU AI Act for downstream model use. For Ireland specifically, a domestic 100-qubit transmon system is the difference between climate groups in Maynooth, Galway, Cork and TCD waiting in a queue behind US hyperscaler customers and getting prioritised access through a national programme. The same logic applies to climate-specific quantum workloads across the broader European research community.
Where to start this week
If you run a climate research group: pick the single parametrisation in your model that contributes the largest term to your output uncertainty, write down the underlying chemistry or physics Hamiltonian honestly, and ask whether a 50–80 spin-orbital active-space VQE could replace it. If the answer is yes, you have a 2027 collaboration brief. If you build climate software: start reading Qiskit and PennyLane tutorials, not because you will rewrite your model in them, but because understanding the API surface of a quantum kernel is what lets you design the lookup-table interface that will sit in your code by 2027.
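As a first sanity check on that brief, count the resources before writing any chemistry: under a Jordan–Wigner mapping one spin-orbital costs one qubit, so a 50–80 spin-orbital active space is a 50–80 qubit circuit before any error-mitigation overhead. A quick footprint estimate, assuming a singles-and-doubles ansatz (the electron count of 10 is an arbitrary example):

```python
# Footprint check: qubits and parameter counts for a singles-and-doubles
# ansatz over a candidate active space (Jordan-Wigner: one qubit per
# spin-orbital). The electron count used below (10) is an arbitrary example.
import pennylane as qml

def uccsd_footprint(active_electrons, spin_orbitals):
    singles, doubles = qml.qchem.excitations(active_electrons, spin_orbitals)
    return {
        "qubits_jordan_wigner": spin_orbitals,
        "single_excitations": len(singles),
        "double_excitations": len(doubles),
        "variational_parameters": len(singles) + len(doubles),
    }

for orbs in (20, 50, 80):
    print(orbs, uccsd_footprint(10, orbs))
```

If the parameter count lands in the tens of thousands, the circuit will not fit a 2027 coherence budget, and the active space needs trimming before anyone books time on the machine.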