Quantum Experiments and High-Performance Computing: New Method Enables Complex Calculations in the Shortest Possible Time
October 22, 2024 -- Scientists at Paderborn University have used high-performance computing (HPC) for the first time to analyse a quantum photonics experiment on a large scale. Specifically, this involved the tomographic reconstruction of experimental data from a quantum detector, a device that measures individual photons, i.e. particles of light. Among other things, the researchers developed new HPC software for this purpose. Their results have now been published in the journal ‘Quantum Science and Technology’.
Quantum tomography of a photonic quantum detector on a mega-scale
Highly scaled photon detectors are being used more and more frequently in quantum research. Characterising these devices precisely is essential for using them effectively in measurements, and has been a challenge to date, because it involves large amounts of data that must be analysed without neglecting their quantum mechanical structure. Suitable tools for processing such data sets are particularly important for future applications. While conventional approaches cannot perform comparable calculations on quantum systems beyond a certain scale, the Paderborn scientists have harnessed high-performance computing for these characterisation and certification tasks. ‘By developing customised open-source algorithms using high-performance computing, we have carried out quantum tomography on a photonic quantum detector on a mega-scale,’ explains physicist Timon Schapeler, who wrote the paper together with computer scientist Dr Robert Schade and colleagues from PhoQS (Institute for Photonic Quantum Systems) and PC2 (Paderborn Center for Parallel Computing). PC2, an interdisciplinary research centre at Paderborn University, operates the HPC systems. The university is one of the National High Performance Computing Centres in Germany and is therefore at the forefront of university high-performance computing.
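Detector tomography of this kind is commonly formulated as a constrained least-squares problem: probe states with well-known photon statistics (for example coherent states) are sent into the detector, and the operators describing the detector's outcomes (its POVM elements) are reconstructed from the measured outcome frequencies. The following Python sketch illustrates this general idea on a toy on/off detector; the efficiency value, Fock-space truncation and variable names are illustrative assumptions, not the authors' actual HPC implementation.

```python
# Minimal sketch of quantum detector tomography (illustrative only, not
# the Paderborn group's HPC code): reconstruct the "click" outcome of a
# phase-insensitive on/off detector from coherent-state probe data.
import numpy as np
from math import factorial
from scipy.optimize import nnls

K = 20                               # Fock-space truncation (assumption)
mus = np.linspace(0.05, 8.0, 40)     # probe mean photon numbers (assumption)

# Probe matrix: F[i, k] = P(k photons | coherent state with mean mu_i),
# i.e. Poissonian photon-number statistics of each probe state.
F = np.array([[np.exp(-mu) * mu**k / factorial(k) for k in range(K + 1)]
              for mu in mus])

# "True" detector used to simulate data: on/off detector with
# efficiency eta (a hypothetical value for this toy model).
eta = 0.6
theta_noclick = (1.0 - eta) ** np.arange(K + 1)  # P(no click | k photons)
theta_click = 1.0 - theta_noclick                # P(click | k photons)

# Simulated (noise-free) click probabilities for each probe state.
p_click = F @ theta_click

# Tomographic reconstruction: non-negative least squares recovers the
# detector response theta_click from the measured statistics alone.
theta_rec, residual = nnls(F, p_click)

print("fit residual:", residual)
```

In the real experiment the data set is far larger and noisy, which is why the reconstruction is ill-conditioned and demands the scalable, parallelised solvers described in the article; the structure of the problem, however, is the same constrained linear inversion as in this toy example.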
‘Unprecedented scale’
‘The results open up completely new possibilities in the field of scalable quantum photonics in terms of the size of the systems to be analysed. This also has implications for the characterisation of photonic quantum computer hardware, for example,’ continues Schapeler. The scientists completed their calculations describing a photon detector within a few minutes, faster than anyone before. The system also handled calculations with even larger amounts of data within a very short time. Schapeler: ‘This shows the unprecedented extent to which this tool can be applied to quantum photonic systems. As far as we know, our work is the first contribution in the field of classical high-performance computing that enables experimental quantum photonics on a large scale. This area will become increasingly important when it comes to demonstrating the quantum advantage in quantum photonic experiments, at scales that cannot be calculated by conventional means.’
Basic research to shape the future
Schapeler is a doctoral student in the ‘Mesoscopic Quantum Optics’ working group led by Prof Dr Tim Bartley. The team researches the fundamental physics of quantum states of light and their applications. These states consist of tens, hundreds or even thousands of photons. ‘The order of magnitude is crucial, as it illustrates the fundamental advantage of quantum over classical systems. The benefits are visible in many areas, including measurement technology, data processing and communication,’ explains Bartley. The broad field of quantum research is one of Paderborn University's profile areas, where recognised experts conduct basic research in order to shape the future through concrete applications.