Michael English

Ireland Quantum 100 · Technical Brief

Climate modelling — what quantum changes in 2027

Climate models built on classical supercomputers have hit a wall that more cores cannot break through. The atmosphere is a coupled non-linear system with chemistry, fluid dynamics, radiative transfer and ocean–ice feedback all running at once, and the bits we still resolve poorly — sub-grid cloud microphysics, aerosol nucleation, ocean turbulent mixing, soil biogeochemistry — are exactly the bits where the physics turns quantum mechanical at the molecular layer. By 2027, when the first 100-qubit superconducting machines come online in Europe under sovereign-compute mandates, the question stops being "can quantum help climate science" and becomes "which subroutines do you offload first, and how do you stitch the answers back into a CMIP-class model run on classical hardware". That is the engineering problem worth talking about.

Where classical climate modelling actually breaks

If you sit with the output of a modern Earth-system model — CESM, EC-Earth, UKESM, IFS — the dominant uncertainty is not the dynamical core. The Navier–Stokes solver, the spectral transforms, the semi-Lagrangian advection — those are mature. The uncertainty lives in the parametrisations: the empirical functions standing in for processes we cannot resolve at grid scale. Cloud feedback alone accounts for roughly half the spread across IPCC AR6 climate sensitivity estimates. Aerosol–cloud interaction is worse. And underneath every parametrisation sits a chemistry or thermodynamics problem that, written down honestly, is a many-body Schrödinger equation we are approximating with empirical fits or density-functional theory at modest accuracy.

This is the structural opening for quantum. Not replacing the climate model — the climate model is a 50-million-line classical artefact and will remain one — but replacing the worst-conditioned subroutines with quantum kernels that compute molecular-scale truths the classical model currently fudges.

What a 100-qubit superconducting machine can actually compute

Let us be honest about hardware in 2027. A 100-physical-qubit transmon processor on a heavy-hex topology, running in a dilution refrigerator at sub-15 mK, with two-qubit gate fidelities in the 99.5–99.9% range and circuit depths bounded by T1 and T2 coherence times, is not a fault-tolerant device. Surface-code logical qubits remain a roadmap item, not a 2027 deliverable. What you have is a noisy intermediate-scale quantum (NISQ) machine of the larger sort, accessed via OpenQASM 3 and orchestrated through Qiskit, PennyLane or Cirq pipelines.
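The coherence numbers above translate directly into a circuit budget. A back-of-envelope sketch, using illustrative figures (T2, gate duration, and fidelity here are assumptions in the ranges quoted, not specs of any real device):

```python
# Back-of-envelope coherence budget for a NISQ circuit.
# All numbers are illustrative assumptions, not measurements of any machine.

t2_us = 100.0        # assumed T2 coherence time, microseconds
gate_ns = 300.0      # assumed two-qubit gate duration, nanoseconds
fidelity_2q = 0.997  # assumed two-qubit gate fidelity (mid 99.5-99.9% range)

# Rough layer budget: how many sequential two-qubit gate layers fit in T2
max_layers = int(t2_us * 1000 / gate_ns)

def circuit_fidelity(n_gates, f=fidelity_2q):
    # Crude multiplicative estimate: each two-qubit gate succeeds with prob f
    return f ** n_gates

print(f"~{max_layers} sequential layers inside T2")
print(f"500-gate circuit fidelity ~ {circuit_fidelity(500):.3f}")
```

The multiplicative fidelity model is pessimistic about error cancellation and silent about crosstalk, but it makes the point: a few hundred two-qubit gates is the realistic ceiling before the output is mostly noise.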

That is enough for specific things. Variational quantum eigensolver (VQE) runs on small molecular Hamiltonians — tens of spin-orbitals — with active-space reduction and symmetry-adapted ansätze. Quantum approximate optimisation (QAOA) on combinatorial sub-problems with a few hundred binary variables. Quantum-enhanced sampling for stochastic differential equations underlying turbulence closures. Hamiltonian simulation of vibrational modes for infrared absorption cross-sections. None of these solves a full climate model. All of them feed inputs into one.

The atmospheric quantum simulation use cases that actually matter

The honest list of climate-relevant problems where, by 2027, a 100-qubit machine produces something a 100,000-core classical run cannot:

Notice the framing. None of these is "the climate model". All of them are inputs to the climate model that are currently estimated with empirical fits, low-accuracy electronic-structure methods, or laboratory measurements that simply do not exist for the species involved.

How you actually wire a quantum kernel into a CMIP run

This is the engineering most quantum talks skip. The climate model runs for months on a classical machine, producing terabytes per simulated year. The quantum machine produces, per circuit execution, a handful of expectation values or sampled bitstrings. The bandwidth mismatch is enormous and the latencies do not align.

The integration pattern that works is offline pre-computation of lookup tables, not real-time coupling. You identify a parametrisation in the climate model — say, the temperature- and pressure-dependent rate coefficient for OH + isoprene — and you replace its empirical Arrhenius fit with a quantum-computed surface across the relevant (T, P, composition) grid. You run that quantum job once, you tabulate the result, you ship the table to the climate model, and the model interpolates as it always did. The quantum computer is a chemistry oracle consulted offline, not an inner-loop accelerator.
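The offline lookup-table pattern is short enough to sketch end to end. A placeholder Arrhenius-style function stands in for the quantum-computed values (the functional form and grid are illustrative, not a real OH + isoprene rate); in the real workflow each grid point would come from a batch of quantum jobs:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Offline stage: tabulate a rate coefficient k(T, P) on a coarse grid.
T_grid = np.linspace(200.0, 320.0, 13)  # temperature, K
P_grid = np.linspace(200.0, 1000.0, 9)  # pressure, hPa

def placeholder_rate(T, P):
    # Illustrative stand-in for quantum-computed values; NOT a real rate
    return 1e-10 * np.exp(-500.0 / T) * (P / 1000.0) ** 0.1

TT, PP = np.meshgrid(T_grid, P_grid, indexing="ij")
table = placeholder_rate(TT, PP)  # this is the artefact you ship

# Online stage: the climate model interpolates the shipped table,
# exactly as it would any other tabulated parametrisation.
k_interp = RegularGridInterpolator((T_grid, P_grid), table)
k = k_interp([[288.15, 1000.0]])[0]  # query at 15 degC, 1000 hPa
```

The quantum machine never appears in the online stage at all — only the table does, which is precisely why the latency and bandwidth mismatch stops mattering.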

For a small set of problems — primarily online optimisation of grid dispatch or adaptive observational network design — you can imagine warmer coupling, with the quantum machine queried hourly or daily over a network. But for atmospheric chemistry, the offline lookup-table pattern is the realistic 2027 architecture. It also means that one quantum machine, well utilised, can serve a whole community of classical climate modellers. That is a sovereign-compute argument, not a per-user one. We have written about the broader Ireland Quantum 100 programme and why the workload prioritisation is what it is.

Quantum advantage for climate: what would actually count as proof

The phrase "quantum advantage" has been so abused in the literature that it is worth being narrow. For climate, the bar I would accept as a serious claim, on a 100-qubit superconducting machine in 2027, looks like this:

None of these is "the quantum computer simulated the climate". All of them are defensible, narrow, falsifiable claims. That is the standard worth holding 2027 results to. If you see a press release that does not pass this bar, treat it as marketing.

Why sovereign capacity matters for this specific workload

Climate science is a public good and the data is shared. But the priority queues on commercial quantum cloud services are not allocated by climate need; they are allocated by paying customers, and finance and pharma outbid climate every time. A sovereign machine with an explicit climate-first cohort policy changes the economics of who gets shots on the device. It also changes the data residency picture for European researchers operating under GDPR and increasingly under the EU AI Act for downstream model use. For Ireland specifically, a domestic 100-qubit transmon system is the difference between climate groups in Maynooth, Galway, Cork and TCD waiting in a queue behind US hyperscaler customers and getting prioritised access through a national programme. The same logic applies to climate-specific quantum workloads across the broader European research community.

Where to start this week

If you run a climate research group: pick the single parametrisation in your model that contributes the largest term to your output uncertainty, write down the underlying chemistry or physics Hamiltonian honestly, and ask whether a 50–80 spin-orbital active-space VQE could replace it. If the answer is yes, you have a 2027 collaboration brief. If you build climate software: start reading Qiskit and PennyLane tutorials, not because you will rewrite your model in them, but because understanding the API surface of a quantum kernel is what lets you design the lookup-table interface that will sit in your code by 2027.

Ireland Quantum 100

Read the full overview, the 12-month plan, and the climate-applications brief.
