Nuclear physicists working with the U.S. Department of Energy have pushed quantum hardware into serious “science mode”: they’ve run one of the largest digital quantum simulations ever performed on IBM’s quantum processors, targeting a particle physics problem that even top classical supercomputers struggle with.
The team focused on a one-dimensional quantum electrodynamics (QED) model, a simplified playground for understanding what happens in high-energy particle collisions. First, they used classical computers to design optimized circuits for small versions of the system. Then, by exploiting symmetries and the hierarchy of scales in the underlying equations, they scaled those circuits up to more than 100 qubits and executed them on IBM quantum machines.
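To make that "optimize small, then scale up" workflow concrete, here is a minimal Python sketch using Qiskit. It is not the team's actual code: the toy spin Hamiltonian, the brick-wall ansatz, the helper names (`toy_hamiltonian`, `build_ansatz`) and the chain sizes are illustrative assumptions. What it shows is the core trick: if the circuit's angles depend only on the layer and not on the lattice site, parameters optimized classically on a short chain can be reused on a much wider circuit destined for hardware.

```python
# Illustrative sketch only, not the researchers' code: a toy spin chain
# stands in for the lattice-QED model, and a site-independent ansatz is
# tuned classically on a small system, then reused on a larger one.
import numpy as np
from scipy.optimize import minimize
from qiskit import QuantumCircuit
from qiskit.circuit import ParameterVector
from qiskit.quantum_info import SparsePauliOp, Statevector


def toy_hamiltonian(n: int) -> SparsePauliOp:
    """Toy stand-in: nearest-neighbor XX+YY hopping plus a staggered-mass
    Z term, schematic of the spin form of a 1D lattice field theory."""
    terms = []
    for i in range(n - 1):
        for pauli in ("XX", "YY"):
            terms.append(("I" * i + pauli + "I" * (n - i - 2), 0.5))
    for i in range(n):
        terms.append(("I" * i + "Z" + "I" * (n - i - 1), 0.5 * (-1) ** i))
    return SparsePauliOp.from_list(terms)


def build_ansatz(n: int, layers: int) -> QuantumCircuit:
    """Brick-wall ansatz whose angles depend only on the layer, not the
    site, so angles found on a small chain transfer to a larger one."""
    theta = ParameterVector("t", 2 * layers)
    qc = QuantumCircuit(n)
    for layer in range(layers):
        for start in (0, 1):                  # even bonds, then odd bonds
            for q in range(start, n - 1, 2):
                qc.rxx(theta[2 * layer], q, q + 1)
                qc.ryy(theta[2 * layer + 1], q, q + 1)
    return qc


def energy(params, ansatz, ham):
    """Exact statevector energy of the ansatz, feasible only for small n."""
    state = Statevector.from_instruction(ansatz.assign_parameters(params))
    return float(np.real(state.expectation_value(ham)))


# Step 1: classical optimization on a small chain.
n_small, layers = 6, 2
small = build_ansatz(n_small, layers)
result = minimize(energy, x0=np.zeros(small.num_parameters),
                  args=(small, toy_hamiltonian(n_small)), method="COBYLA")

# Step 2: reuse the optimized angles on a much wider circuit; this is the
# circuit that would be transpiled and sent to the quantum processor.
n_large = 24
large = build_ansatz(n_large, layers).assign_parameters(result.x)
print("small-chain energy:", result.fun,
      "| large-circuit qubits:", large.num_qubits)
```

The real calculation replaces the toy Hamiltonian with the lattice-QED one and runs the scaled-up circuits on IBM hardware, but the division of labor is the same: classical computers handle the optimization, and the quantum processor executes the large circuit.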
The result: they could prepare vacuum states and localized, "hadron-like" wave packets with percent-level accuracy, at system sizes that push beyond what classical hardware can simulate directly. That opens the door to studying matter under extreme conditions – from supernova nucleosynthesis to the ultra-dense environments found in neutron stars – using quantum computers as science instruments, not just tech demos.
For anyone following the road to “useful quantum advantage,” this is a strong proof-of-concept: practical physics problems are starting to migrate from classical supercomputers to hybrid classical–quantum workflows. In the long run, the same techniques could be adapted to condensed-matter systems, exotic materials and other strongly interacting quantum systems.