Solid-state batteries and quantum chemistry
The lithium-ion cell in your phone is roughly the same chemistry as the one in your car. Liquid electrolyte, graphite anode, transition-metal oxide cathode, polymer separator. It works, it scales, and it occasionally catches fire. Solid-state batteries promise to fix the fire problem, double the energy density, and last a decade — but only if we can find a solid electrolyte that conducts lithium ions almost as well as a liquid does. That is a chemistry problem, and chemistry is where classical simulation runs out of road. This is one of the workloads we are building Ireland Quantum 100 to take on.
Why solid-state is hard, in one paragraph
A solid electrolyte has to do three things at once: conduct Li⁺ ions quickly, block electrons completely, and stay mechanically and chemically stable against both anode and cathode through thousands of charge cycles. The candidate families — sulphide glasses like Li₆PS₅Cl, oxide garnets like Li₇La₃Zr₂O₁₂ (LLZO), polymer composites, halide electrolytes like Li₃YCl₆ — each fail at least one of these requirements. Sulphides conduct beautifully but react with moisture and lithium metal. Oxides are stable but the grain boundaries kill ionic mobility. Halides sit somewhere in the middle. Picking which substitution, which dopant, which interface coating to try next is, today, mostly intuition plus expensive lab cycling. The underlying physics — how a lithium ion hops between coordination sites in a disordered solid — is governed by quantum mechanics that scales badly on classical hardware.
Where classical DFT hits the wall
Density Functional Theory is the workhorse of computational materials science and it has carried the field a long way. For a clean periodic crystal of a few dozen atoms, DFT gives you band structure, formation energies, migration barriers, and a reasonable first guess at ionic conductivity via nudged elastic band calculations. The problem is that real solid electrolytes are not clean periodic crystals. They are disordered, they have grain boundaries, they have cation mixing, and the interesting physics often involves strongly correlated electrons around transition metals or polaron formation around defects.
DFT with standard functionals (PBE, even hybrids like HSE06) systematically gets the wrong answer for systems where electron correlation matters — the same systems that dominate cathode chemistry. You can paper over it with DFT+U, but U is a fitted parameter, and fitting it on one composition does not transfer cleanly to the next. For battery materials specifically, the failure modes are predictable: wrong voltage plateaux for layered oxides, wrong migration barriers for polaronic hops, wrong reaction energies at the electrolyte-electrode interface. You end up calibrating against experiment, which defeats the purpose of predicting before you synthesise.
What quantum hardware actually changes
A superconducting transmon processor of the kind we are building does not "run DFT faster". It runs a fundamentally different algorithm — quantum phase estimation or, more practically in the near term, the variational quantum eigensolver (VQE) — that natively represents electronic wavefunctions including correlation. The exponential cost of tracking entangled electrons on classical hardware can become a polynomial cost on a quantum register, because the qubits themselves store the wavefunction natively.
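To make the variational idea concrete, here is a toy sketch of VQE-style energy minimisation on a single-qubit Hamiltonian, in plain NumPy rather than a quantum SDK. The Hamiltonian and one-parameter ansatz are invented for illustration; real chemistry Hamiltonians are sums of many-qubit Pauli strings, and the classical outer loop is an optimiser rather than a brute-force scan.

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# A made-up single-qubit Hamiltonian standing in for a mapped chemistry problem.
H = Z + 0.5 * X

def ansatz(theta):
    """|psi(theta)> = Ry(theta)|0> -- a one-parameter trial wavefunction."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

def energy(theta):
    """Expectation value <psi|H|psi> -- what the hardware would estimate by sampling."""
    psi = ansatz(theta)
    return np.real(psi.conj() @ H @ psi)

# Classical outer loop: scan the parameter, keep the lowest energy.
thetas = np.linspace(0, 2 * np.pi, 2001)
e_min = min(energy(t) for t in thetas)

# Compare with exact diagonalisation -- the answer VQE is trying to reach.
e_exact = np.linalg.eigvalsh(H).min()
print(e_min, e_exact)  # agree to ~1e-6 at this scan resolution
```

The structure is the point: the quantum device only ever evaluates `energy(theta)`; everything else stays classical.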
For a 100-qubit machine on a heavy-hex lattice, you are not solving a full battery cathode end-to-end. What you can credibly do is the active-space calculation: pick the strongly correlated fragment — the transition-metal cluster, the migration bottleneck, the interface dimer — embed it in a classical mean-field environment, and solve the hard part on the quantum processor. This is where the phrase quantum battery chemistry stops being marketing and starts being a real workflow. The classical DFT does what it is good at; the quantum coprocessor does what it is good at; the answers compose.
The hardware constraints that decide what is actually possible
It is worth being honest about the engineering. Transmon qubits sit at the bottom of a dilution refrigerator at around 10–15 millikelvin, colder than deep space (the cosmic microwave background sits at about 2.7 kelvin). Coherence times on modern fixed-frequency transmons run in the hundreds of microseconds; gate times are tens to hundreds of nanoseconds. Two-qubit gate fidelities on production hardware are around three nines (99.9%) on the best devices, which sounds fine until you remember that a useful chemistry circuit may need thousands of two-qubit gates. The errors compound multiplicatively.
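The compounding is easy to make concrete. A back-of-envelope sketch, assuming independent errors at a fixed per-gate fidelity (the numbers are illustrative, not measurements from any particular device):

```python
# Probability that a circuit runs with no two-qubit gate error,
# assuming independent errors at a fixed per-gate fidelity.
fidelity = 0.999  # "three nines" per two-qubit gate (illustrative)

for n_gates in (100, 1000, 10000):
    p_clean = fidelity ** n_gates
    print(f"{n_gates:>6} gates -> {p_clean:.3f} chance of an error-free run")

# At 1,000 gates, an unmitigated run is error-free only ~37% of the time;
# at 10,000 gates, essentially never. Hence shallow circuits and error mitigation.
```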
This is why surface-code error correction is on the roadmap rather than already in production: you need physical-qubit fidelities above the threshold (~99%) and you need enough physical qubits per logical qubit (a few hundred at typical code distances) before logical operations beat physical ones. A 100-physical-qubit machine like the one we are commissioning in Tipperary is in the noisy intermediate-scale quantum (NISQ) regime by design — useful for variational algorithms with shallow circuits, error mitigation rather than full correction, and active-space chemistry rather than full first-principles cathodes.
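The "few hundred physical qubits per logical qubit" figure follows directly from the code layout. Assuming a rotated surface code (one common layout, not the only one), a distance-d patch needs d² data qubits plus d² − 1 syndrome ancillas:

```python
def physical_per_logical(d):
    """Physical qubits for one logical qubit in a rotated surface code:
    d*d data qubits plus d*d - 1 ancilla (syndrome) qubits."""
    return 2 * d * d - 1

for d in (7, 11, 13, 17):
    print(f"distance {d:>2}: {physical_per_logical(d):>4} physical qubits")
```

At the distances people typically discuss for useful logical error rates, a 100-physical-qubit machine cannot host even one logical qubit, which is why it is a NISQ machine by design rather than by accident.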
That is not a limitation to apologise for. It is the right tool for the right slice of the problem. The slice happens to include the bits of Li-ion quantum chemistry — and post-Li chemistry, which matters more — that classical methods get wrong.
A realistic workflow for a solid-state battery candidate
Suppose you are screening a halide electrolyte family — Li₃MX₆ with M being a trivalent cation and X a halogen — for ionic conductivity and stability against a lithium-metal anode. A workflow that uses both classical and quantum resources sensibly looks like this:
- Classical pre-screen. Run DFT (VASP, Quantum ESPRESSO, or similar) across the compositional space. Compute formation energies, lattice parameters, electronic band gaps. Eliminate candidates that are obviously unstable or metallic.
- Identify the active space. For each survivor, isolate the migration pathway: a Li⁺ ion moving through a tetrahedral-octahedral-tetrahedral hop, with the surrounding coordination shell included. Typically 20–40 active orbitals.
- Map to the quantum register. Use a Jordan-Wigner or Bravyi-Kitaev transformation to encode the second-quantised Hamiltonian onto qubits. Apply active-space reduction techniques to fit within the available register width and circuit depth.
- Run VQE on the quantum processor. Compute the ground-state energy at the saddle point and at the equilibrium sites. The difference is the migration barrier — the number that determines ionic conductivity through an Arrhenius relation.
- Validate against experiment. Cycle the cell. The whole point of doing the quantum chemistry is that you cycle fewer cells, not zero cells.
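The migration barrier from the VQE step feeds a simple Arrhenius estimate, and this is why barrier accuracy matters so much. A sketch (the barrier values here are illustrative, not computed results):

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant, eV/K

def hop_rate_ratio(ea_ev, ea_ref_ev, temp_k=300.0):
    """Relative Li+ hop rate for two migration barriers at temperature T,
    via the Arrhenius factor exp(-Ea / kT)."""
    return math.exp(-(ea_ev - ea_ref_ev) / (K_B * temp_k))

# A 0.1 eV error in the computed barrier shifts the predicted
# room-temperature hop rate (and hence conductivity) by roughly 50x.
print(hop_rate_ratio(0.3, 0.4))
```

A method that gets barriers wrong by 0.1–0.2 eV — well within the scatter of standard DFT functionals for polaronic systems — therefore mis-ranks candidates by orders of magnitude in conductivity.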
The toolchain to do this is open and maturing fast. OpenQASM 3 as the gate-level intermediate representation, Qiskit Nature or PennyLane for the chemistry-to-circuit compilation, classical orchestration in Python, hardware backend swappable between simulator and real device. None of this is exotic. The exotic bit is the millikelvin cryostat at the bottom of the stack, and the discipline to know which problems actually need it.
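The fermion-to-qubit mapping step above can be sanity-checked without any SDK at all. A minimal sketch of the Jordan-Wigner construction for two fermionic modes, verifying numerically that the qubit operators obey the canonical anticommutation relations:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
# Raising operator in the qubit basis (|0> = empty, |1> = occupied):
# (X - iY)/2 acts as a creation operator on a single mode.
S = np.array([[0, 0], [1, 0]], dtype=complex)

# Jordan-Wigner for two modes: a string of Z's to the left of the target
# mode preserves the fermionic sign structure.
a1_dag = np.kron(S, I2)
a2_dag = np.kron(Z, S)
a1, a2 = a1_dag.conj().T, a2_dag.conj().T

def anticomm(A, B):
    return A @ B + B @ A

# Canonical relations: {a_i, a_j^dag} = delta_ij * I, {a_i, a_j} = 0.
assert np.allclose(anticomm(a1, a1_dag), np.eye(4))
assert np.allclose(anticomm(a2, a2_dag), np.eye(4))
assert np.allclose(anticomm(a1, a2_dag), 0)
assert np.allclose(anticomm(a1, a2), 0)
print("Jordan-Wigner anticommutation relations verified")
```

Qiskit Nature and PennyLane do exactly this construction at scale (plus the Bravyi-Kitaev alternative, which trades the length of the Z strings for bookkeeping complexity); seeing it in ten lines of NumPy demystifies the compilation step.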
Why this matters for climate, and why we are doing it in Tipperary
Battery chemistry is a climate technology. Grid-scale storage decides whether intermittent renewables can carry baseload. Vehicle electrification decides whether transport decarbonises in time. Better solid-state cells — safer, denser, longer-lived, ideally with less cobalt and nickel — are not a nice-to-have, they are a prerequisite. The reason climate workloads sit at the front of the queue on Ireland Quantum 100 is that the deployment timeline of this chemistry matters more than the deployment timeline of, say, portfolio optimisation. We have written more about that prioritisation on the Ireland Quantum 100 page, and the specific chemistry stack — including how we handle materials discovery workloads — is documented there.
Doing it sovereign, in Ireland, on Irish soil, matters for a separate reason: the queue. If your chemistry team has to wait three weeks for time on a US or Chinese cloud quantum service, your iteration loop is broken. Local hardware with local access, prioritised by national climate need rather than by who pays the most per shot, is a different operating model.
Where to start this week
If you work on battery materials and you have not yet touched a quantum SDK, start with PennyLane or Qiskit Nature on a laptop simulator. Pick one migration pathway from a paper you already know — LLZO is well-studied and a fair benchmark — and work through encoding the active-space Hamiltonian and running VQE against a classical simulator. You will hit the limits of simulation quickly, around 25–30 qubits, which is exactly the point: you will understand viscerally why hardware matters, and you will know what to ask for when access opens. That preparation is worth more than any roadmap slide.
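The 25–30 qubit wall is just memory arithmetic: a full statevector holds 2^n complex amplitudes. A quick check of where a laptop gives out:

```python
def statevector_bytes(n_qubits, bytes_per_amp=16):
    """Memory for a full statevector: 2**n complex128 amplitudes."""
    return (2 ** n_qubits) * bytes_per_amp

for n in (25, 30, 35, 40):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits: {gib:,.2f} GiB")

# 30 qubits already needs 16 GiB -- a typical laptop's entire RAM --
# and every additional qubit doubles it.
```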