Physics World, 7 October
Q2B conference spotlights advances in quantum error correction

The recent Q2B conference in Paris brought together more than 500 experts to discuss the latest developments in quantum computing, AI, sensing, communications and security. The meeting paid particular attention to quantum error correction (QEC), regarded as the key challenge in building fault-tolerant quantum computers. Although qubits are prone to errors caused by environmental noise, distributing information across multiple physical qubits can effectively protect the state of a logical qubit. Speakers presented the latest advances in QEC theory, including concatenated codes, directional codes and bivariate bicycle codes, and discussed hardware-level challenges such as real-time decoding and optimizing the performance of qubit platforms. The conference also highlighted the close collaboration between academia and industry on QEC, as well as the urgent need to train quantum talent.

💡 Quantum error correction (QEC) is the core technology behind future fault-tolerant quantum computers: although qubits are easily disturbed by environmental noise, multiple physical qubits working together can effectively protect the state of a logical qubit, enabling long computations.

🚀 QEC theory has seen several advances, including the return of concatenated codes, the potential of directional codes to exploit specific native gate operations, and IBM's new bivariate bicycle codes; these coding schemes offer different advantages in encoding rate and in simplifying decoding strategies.

⚙️ On the hardware side, the demand for real-time decoding and strict latency requirements pose challenges, while qubit platforms (trapped ions, neutral atoms, superconducting and photonic qubits) differ in speed, coherence time and fabrication yield, calling for co-optimization of algorithms and hardware.

🤝 Collaboration between academia and industry on QEC is growing ever closer and is jointly driving research forward; although future commercial competition may reduce how openly information is shared, the demand for quantum talent remains pressing.

This year’s Q2B meeting took place at the end of last month in Paris at the Cité des Sciences et de l’Industrie, a science museum in the north-east of the city. The event brought together more than 500 attendees and 70 speakers – world-leading experts from industry, government institutions and academia. All major quantum technologies were highlighted: computing, AI, sensing, communications and security.

Among the quantum computing topics was quantum error correction (QEC) – something that will be essential for building tomorrow’s fault-tolerant machines. Indeed, it could even be the technology’s most important and immediate challenge, according to the speakers on the State of Quantum Error Correction Panel: Paul Hilaire of Telecom Paris/IP Paris, Michael Vasmer of Inria, Quandela’s Boris Bourdoncle, Riverlane’s Joan Camps and Christophe Vuillot from Alice & Bob.

As was clear from the conference talks, quantum computers are undoubtedly advancing in leaps and bounds. One of their most important weak points, however, is that their fundamental building blocks (quantum bits, or qubits) are highly prone to errors. These errors are caused by interactions with the environment – also known as noise – and correcting them will require innovative software and hardware. Today’s machines are only capable of running on average a few hundred operations before an error occurs; but in the future, we will have to develop quantum computers capable of processing a million error-free quantum operations (known as a MegaQuOp) or even a trillion error-free operations (TeraQuOps).

QEC works by distributing one quantum bit of information – called a logical qubit – across several different physical qubits, such as superconducting circuits or trapped atoms. Each physical qubit is noisy, but they work together to preserve the quantum state of the logical qubit – at least for long enough to perform a calculation. It was Peter Shor who first discovered this method of formulating a quantum error correcting code by storing the information of one qubit onto a highly entangled state of nine qubits. A technique known as syndrome decoding is then used to diagnose which error was the likely source of corruption on an encoded state. The error can then be reversed by applying a corrective operation depending on the syndrome.
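The bit-flip repetition code, the simplest ancestor of Shor's nine-qubit code, makes this syndrome-decoding idea concrete. The following Python sketch is an illustrative classical simulation (not the quantum circuit itself, which the article does not spell out): it encodes one logical bit into three physical bits, flips each independently with probability `p`, and uses the two parity checks as a syndrome to apply the most likely single-bit correction.

```python
import random

# Classical toy simulation of the 3-qubit bit-flip repetition code.
# The logical bit is stored as three copies; the syndrome is the pair
# of parity checks (q0 XOR q1, q1 XOR q2).

def encode(logical_bit):
    return [logical_bit] * 3

def apply_noise(qubits, p):
    # Flip each physical bit independently with probability p.
    return [q ^ (random.random() < p) for q in qubits]

def syndrome(qubits):
    return (qubits[0] ^ qubits[1], qubits[1] ^ qubits[2])

# Map each syndrome to the single-bit flip most likely to have caused it.
CORRECTION = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def decode(qubits):
    idx = CORRECTION[syndrome(qubits)]
    if idx is not None:
        qubits = qubits.copy()
        qubits[idx] ^= 1
    return qubits

random.seed(0)
p = 0.05
trials = 100_000
failures = sum(decode(apply_noise(encode(0), p))[0] != 0 for _ in range(trials))
print(f"physical error rate: {p}, logical error rate: {failures / trials:.4f}")
```

Running it shows the logical error rate (roughly 3p² for small p) falling well below the physical rate p, which is exactly the protection QEC buys when errors on individual qubits are rare.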


While error correction should become more effective as the number of physical qubits in a logical qubit increases, adding more physical qubits to a logical qubit also adds more noise. Much progress has been made in addressing this and other noise issues in recent years, however.

“We can say there’s a ‘fight’ when increasing the length of a code,” explains Hilaire. “Doing so allows us to correct more errors, but we also introduce more sources of errors. The goal is thus being able to correct more errors than we introduce. What I like with this picture is the clear idea of the concept of a fault-tolerant threshold below which fault-tolerant quantum computing becomes feasible.”

Developments in QEC theory

Speakers at the Q2B25 meeting shared a comprehensive overview of the most recent advancements in the field – and they are varied. First up, concatenated error-correction codes. Prevalent in the early days of QEC, these fell by the wayside in favour of codes like the surface code, but recent work has brought them back: concatenated codes can achieve constant encoding rates, and a fault-tolerant architecture based on them that requires only linear, nearest-neighbour connectivity was recently put forward. Directional codes, the likes of which are being developed by Riverlane, are also being studied. These leverage native transmon qubit logic gates – for example, iSWAP gates – and could potentially outperform surface codes in some aspects.

The panellists then described bivariate bicycle codes, being developed by IBM, which offer better encoding rates than surface codes. While decoding them in real time can be challenging, IBM's "relay belief propagation" (relay BP) has made progress here by simplifying decoding strategies that previously combined BP with post-processing. Better still, this decoder is very general: it works for all "low-density parity-check" (LDPC) codes – one of the most studied classes of high-performance QEC codes, which also includes, for example, surface codes and directional codes.
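Relay BP itself is beyond a short sketch, but the ingredients of any LDPC decoder – a sparse parity-check matrix and a measured syndrome – can be illustrated with Gallager's classical bit-flipping rule, a much simpler cousin of belief propagation. The matrix `H` below is just the 3-bit repetition code's two checks, chosen for brevity; real quantum LDPC decoders replace the hard flips with probabilistic message passing over far larger matrices.

```python
import numpy as np

# Classical bit-flipping decoding over a parity-check matrix H:
# given a syndrome s, repeatedly flip the bit that participates in the
# most unsatisfied checks until the syndrome is explained.

H = np.array([[1, 1, 0],
              [0, 1, 1]])  # parity checks of the 3-bit repetition code

def bit_flip_decode(s, max_iters=10):
    e = np.zeros(H.shape[1], dtype=int)  # estimated error pattern
    for _ in range(max_iters):
        residual = (s + H @ e) % 2       # checks still unsatisfied
        if not residual.any():
            return e                     # syndrome fully explained
        votes = H.T @ residual           # unsatisfied-check count per bit
        e[np.argmax(votes)] ^= 1         # flip the worst offender
    return e

# Syndrome (1, 1) means both checks fired -> middle bit flipped.
print(bit_flip_decode(np.array([1, 1])))  # -> [0 1 0]
```

BP-style decoders refine exactly this loop by passing likelihoods between bits and checks instead of taking hard majority votes, which is what lets them scale to the sparse, high-rate codes discussed above.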

There is also renewed interest in decoders that can be parallelized and operate locally within a system, they said. These have shown promise for codes like the 1D repetition code, which could revive the concept of self-correcting or autonomous quantum memory. Another possibility is the increased use of the graphical language ZX calculus as a tool for optimizing QEC circuits and understanding spacetime error structures.

Hardware-specific challenges

The panel stressed that to achieve robust and reliable quantum systems, we will need to move beyond so-called hero experiments. For example, the demand for real-time decoding at megahertz frequencies with microsecond latencies is an important and unprecedented challenge. Indeed, breaking down the decoding problem into smaller, manageable bits has proven difficult so far.

There are also issues with qubit platforms themselves that need to be addressed: trapped ions and neutral atoms allow for high fidelities and long coherence times, but they are roughly 1000 times slower than superconducting and photonic qubits and therefore require algorithmic or hardware speed-ups. And that is not all: solid-state qubits (such as superconducting and spin qubits) suffer from a “yield problem”, with dead qubits on manufactured chips. Improved fabrication methods will thus be crucial, said the panellists.

Collaboration between academia and industry

The discussions then moved towards the subject of collaboration between academia and industry. In the field of QEC, such collaboration is highly productive today, with joint PhD programmes and shared conferences like Q2B, for example. Large companies also now boast substantial R&D departments capable of funding high-risk, high-reward research, blurring the lines between fundamental and application-oriented research. Both sectors also use similar foundational mathematics and physics tools.

At the moment there’s an unprecedented degree of openness and cooperation in the field. This situation might change, however, as commercial competition heats up, noted the panellists. In the future, for example, researchers from both sectors might be less inclined to share experimental chip details.

Last, but certainly not least, the panellists stressed the urgent need for more PhDs trained in quantum mechanics to address the talent deficit in both academia and industry. So, if you were thinking of switching to another field, perhaps now could be the time to jump.

The post Advances in quantum error correction showcased at Q2B25 appeared first on Physics World.
