Physicists have applied machine learning's ability to learn from experience to one of the biggest challenges currently facing quantum computing: quantum error correction, which underpins the design of noise-tolerant quantum computing protocols. In a new study, they demonstrate that a type of neural network called a Boltzmann machine can be trained to model the errors arising in a quantum computing protocol and then to devise and implement an appropriate correction strategy.
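To make the idea concrete, the sketch below (not the study's actual code; every name, code size, and hyperparameter here is an illustrative assumption) trains a small restricted Boltzmann machine on (syndrome, error) pairs from a toy 3-qubit bit-flip repetition code, then decodes by clamping the syndrome units and Gibbs-sampling the error units, in the spirit of neural-network decoders for error-correcting codes.

```python
# Minimal illustrative sketch: an RBM learns the joint distribution of
# syndromes and error patterns for a 3-qubit repetition code, then proposes
# corrections by conditional Gibbs sampling. Hyperparameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# --- Toy error model: i.i.d. bit flips on a 3-qubit repetition code ---
P_FLIP = 0.1
def sample_data(n):
    e = (rng.random((n, 3)) < P_FLIP).astype(float)           # error pattern
    s = np.stack([e[:, 0] != e[:, 1], e[:, 1] != e[:, 2]], 1).astype(float)
    return np.hstack([s, e])                                   # visible = [syndrome, error]

# --- RBM with 5 visible units (2 syndrome + 3 error) and 8 hidden units ---
NV, NH = 5, 8
W = 0.01 * rng.standard_normal((NV, NH))
b = np.zeros(NV)   # visible bias
c = np.zeros(NH)   # hidden bias

def train(data, epochs=300, lr=0.05):
    """One-step contrastive divergence (CD-1) on the full batch."""
    global W, b, c
    for _ in range(epochs):
        v0 = data
        ph0 = sigmoid(v0 @ W + c)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        pv1 = sigmoid(h0 @ W.T + b)                            # reconstruction
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = sigmoid(v1 @ W + c)
        n = len(data)
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
        b += lr * (v0 - v1).mean(0)
        c += lr * (ph0 - ph1).mean(0)

def decode(syndrome, n_steps=200):
    """Clamp the 2 syndrome units; Gibbs-sample the 3 error units."""
    v = np.hstack([syndrome, rng.integers(0, 2, 3).astype(float)])
    counts = {}
    for _ in range(n_steps):
        ph = sigmoid(v @ W + c)
        h = (rng.random(NH) < ph).astype(float)
        pv = sigmoid(h @ W.T + b)
        v = (rng.random(NV) < pv).astype(float)
        v[:2] = syndrome                                       # re-clamp syndrome
        key = tuple(v[2:])
        counts[key] = counts.get(key, 0) + 1
    return max(counts, key=counts.get)                         # most frequent sample

train(sample_data(5000))
for s in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print("syndrome", s, "-> proposed correction", decode(np.array(s, float)))
```

The design choice mirrors the reported approach at a high level: rather than hand-coding a decoder, the Boltzmann machine learns the statistics of the noise from examples, so the same training procedure could in principle adapt to whatever error model the hardware actually exhibits.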