
Micron technology glossary

Quantum computing

Quantum computing is an advanced computing paradigm that applies the principles of quantum mechanics to solve problems beyond the reach of classical computers. Traditional computers are based on classical physics, which governs the behavior of macroscopic objects and processes information using bits that exist as either a 0 or a 1.

Quantum mechanics, by contrast, describes the behavior of matter and light at atomic and subatomic levels. Quantum computers leverage these principles by using quantum bits (qubits). Unlike classical bits, qubits can exist in a state of superposition — a linear combination of 0 and 1 — that encodes the probability of each outcome. When a qubit is measured (for example, in the standard computational basis), the superposition collapses to a single definite result: 0 or 1. This capability lets quantum systems represent and manipulate many possible states at once, which can make certain complex calculations far more efficient than on classical computers.
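The superposition-and-measurement behavior described above can be sketched with a toy simulation. This is an illustrative classical model, not real quantum hardware or a quantum SDK: a qubit is just a pair of amplitudes, and measurement samples 0 or 1 with the corresponding squared-amplitude probabilities.

```python
import math
import random

# A single-qubit state as two amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Here: an equal superposition of 0 and 1.
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)

def measure(alpha, beta):
    """Collapse the superposition: return 0 with probability |alpha|^2, else 1."""
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# Measuring many freshly prepared copies approaches a 50/50 split,
# even though each individual measurement yields a definite 0 or 1.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)
```

Each run prints counts close to an even split, illustrating that superposition encodes probabilities that only become definite values at measurement.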

Learn more about quantum computing with Micron, and contact the Micron Sales Support team with any further questions about Micron products.

What is quantum computing?

Quantum computing definition: Quantum computing is a computing paradigm that processes information using quantum bits (qubits), governed by the principles of quantum mechanics.

  • Qubits: Qubits are fundamentally different from classical bits. Rather than holding a fixed value of 0 or 1, a qubit is described by a quantum state. This state defines the probabilities of measuring a 0 or a 1 when observed — a phenomenon known as superposition. A useful analogy is a dimmer switch rather than a simple on/off switch: the qubit can exist across many states until measurement occurs.
  • Entanglement: Qubits can become entangled, meaning the state of one qubit is strongly correlated with the state of another, even across large distances. Think of two dice that always show the same number, no matter how far apart they are. Measure one, and you instantly know the result of the other.
  • Measurement: When a qubit is measured, its superposition collapses to a definite value — either 0 or 1 — similar to observing the result of a coin that was previously spinning in the air.
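The scale of the state space behind these properties can be shown with a short sketch. This is a hypothetical illustration, not production code: an n-qubit register is described by 2^n amplitudes, one per classical outcome, and the squared magnitudes form a probability distribution over those outcomes.

```python
# An n-qubit register is described by 2**n amplitudes, one per classical
# outcome. A uniform superposition assigns equal amplitude to every outcome.
n = 3
amplitudes = [1 / (2 ** (n / 2))] * (2 ** n)  # 8 equal amplitudes for 3 qubits

# The probability of each measurement outcome is the squared amplitude
# magnitude; across all outcomes the probabilities sum to 1.
probs = [abs(a) ** 2 for a in amplitudes]
print(len(amplitudes), sum(probs))
```

Doubling the qubit count squares the number of amplitudes, which is why even a few dozen qubits describe a state space no classical memory can store explicitly.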

These quantum properties allow quantum computers to represent and explore multiple possibilities in ways that are difficult for classical computers to match. This enables new algorithmic strategies that can outperform classical approaches for certain types of problems (though not all problems). Classical computing can be thought of as evaluating possibilities sequentially, while quantum computing more closely resembles a multi‑lane, multi‑level highway where many possibilities are represented at once and shaped through quantum effects toward useful outcomes.

Quantum computing spans disciplines such as quantum mathematics, quantum algorithms and quantum hardware. Rather than replacing classical computing, quantum computing systems are expected to coexist with classical technologies, each contributing unique strengths to future computational workflows. Quantum computing is best understood as specialized machines designed to address specific challenges, such as simulating quantum systems, certain optimization tasks and specific cryptography-related problems that can be impractical for even the most advanced classical systems.

In the long term, quantum computing could push beyond the practical limits of classical computation, opening new possibilities for science, industry and society.

How does quantum computing work?

Quantum computing operates on principles of quantum mechanics, making it fundamentally different from traditional computing. While classical computers use bits — units that are always 0 or 1 — quantum computers use qubits, which are described by quantum states and can behave in ways classical bits cannot.

Quantum computing leverages two foundational quantum phenomena:

  • Superposition: Unlike a classical bit, which must be either 0 or 1, a qubit is described by a quantum state that can represent a combination of possibilities rather than a single fixed value. With multiple qubits, the combined quantum state can encode many possible outcomes at once, giving quantum algorithms a richer space to work in for certain problem types.
  • Entanglement: When qubits become entangled, the state of one qubit is strongly correlated with the state of another, even when they are separated. This correlation lets quantum circuits coordinate information across qubits in ways that are difficult to reproduce with classical systems, enabling new computational strategies.
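The perfect correlation of entangled qubits can be sketched with the same toy amplitude model used for single qubits. This is an illustrative classical simulation (not a quantum library): the Bell state assigns amplitude only to the joint outcomes 00 and 11, so sampling never produces 01 or 10.

```python
import math
import random

# Two-qubit state as amplitudes over the joint outcomes 00, 01, 10, 11.
# The Bell state (|00> + |11>) / sqrt(2): measuring one qubit fixes the other.
amp = {"00": 1 / math.sqrt(2), "01": 0.0, "10": 0.0, "11": 1 / math.sqrt(2)}

def measure_both(amp):
    """Sample a joint outcome according to the squared amplitude magnitudes."""
    outcomes = list(amp)
    weights = [abs(a) ** 2 for a in amp.values()]
    return random.choices(outcomes, weights=weights)[0]

# Only the correlated outcomes 00 and 11 ever appear: knowing one qubit's
# result immediately determines the other's.
samples = {measure_both(amp) for _ in range(1000)}
print(samples)
```

The absence of 01 and 10 in the samples is the "two dice that always match" behavior described above.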

Quantum computation is performed by manipulating qubits with quantum gates, which create and control superposition and entanglement. Many quantum algorithms then rely on interference to steer probabilities toward useful results: constructive interference increases the likelihood of measuring desired outcomes, while destructive interference reduces the likelihood of undesired ones. The final step is measurement, which converts the quantum state into a classical output we can interpret by collapsing the superposition into a definite result.
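Interference can be made concrete with a minimal sketch of one standard gate. Assuming the toy amplitude representation from above, the Hadamard gate maps amplitudes (a0, a1) to ((a0 + a1)/√2, (a0 − a1)/√2); applying it twice shows the |1⟩ paths cancelling (destructive interference) while the |0⟩ paths reinforce (constructive interference).

```python
import math

# Hadamard gate on a single-qubit amplitude pair (a0, a1).
def hadamard(a0, a1):
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

# Start in |0>, apply H twice. The second H cancels the |1> contributions
# and reinforces the |0> contributions, returning the state to ~(1.0, 0.0).
state = (1.0, 0.0)
state = hadamard(*state)   # equal superposition of 0 and 1
state = hadamard(*state)   # interference restores |0>
print(state)
```

Quantum algorithms orchestrate many such cancellations so that, at the final measurement, wrong answers have low probability and useful answers have high probability.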

What is the history of quantum computing?

The concept of quantum computing dates back several decades, originating as a theoretical solution to problems classical computers could not efficiently solve. Over time, it has evolved from abstract ideas into experimental systems and, today, into hardware capable of tackling real-world challenges.

  • 1980s, theoretical foundations: American physicist Richard Feynman observed that classical computers struggle to simulate quantum physical systems, suggesting the need for quantum-specific machines. British physicist David Deutsch described a quantum generalization of the Turing machine and introduced the idea of a universal quantum computer — theoretical, but foundational, models for quantum computing.
  • 1990s, breakthrough algorithms: Two landmark algorithms — Shor’s algorithm for factoring large numbers (posing a challenge to classical cryptography) and Grover’s algorithm for accelerating database searches — demonstrated the advantages and transformative potential of quantum computing.
  • 2000s, from theory to experiment: Researchers began building and testing qubit systems, validating quantum principles in the lab and moving the field from concept to tangible prototypes.
  • Present day, scaling up: Quantum hardware has advanced from devices with a handful of qubits to systems with hundreds of qubits, enabling computations once considered impossible. While the field is still in its early stages, this rapid progress is driving breakthroughs in optimization, simulation and cryptography.

What are the key types of quantum computing platforms?

Quantum computing platforms are often distinguished by the physical systems used to create and control qubits, the quantum equivalent of bits. Each qubit implementation has unique characteristics that influence performance, scalability and suitability for different quantum computing applications. Here are the most prominent approaches:

  • Spin qubits using quantum dots: This approach uses the spin of a single electron (or hole) confined in a quantum dot, with the spin states encoding the qubit. In silicon-based designs—often using MOS-like gate structures—these devices can align with established semiconductor manufacturing (CMOS/VLSI) methods, supporting dense integration and a potential path to scaling.
  • Superconducting qubits: Built from superconducting circuits operating at cryogenic temperatures, these qubits enable fast gate operations, making them ideal for problems that demand speed. Current platforms focus on improving fidelity (accuracy of operations) and control to scale quantum processors effectively.
  • Trapped ion qubits: These use individual ions suspended in electromagnetic fields and manipulated with lasers. Their strength lies in exceptional coherence (ability to maintain quantum state) and high-fidelity operations, meaning the quantum states and logic gates are extremely precise. While they operate more slowly than superconducting qubits, this precision makes them ideal for algorithms where accuracy matters more than speed.
  • Photonic qubits: Made from particles of light, photonic qubits operate at room temperature and excel in quantum communication, transmitting quantum information over long distances. They’re also promising for certain sampling and networking tasks where light-based systems have natural advantages.
  • Topological qubits (experimental): An emerging approach that aims for intrinsic error protection by encoding information in exotic quantum states involving particles called non-Abelian anyons. These quasiparticles have unusual properties: When they are braided (moved around each other), the system’s quantum state changes in a way that depends on the order of the braiding, not just the position. This makes them highly resistant to local disturbances, offering a path toward quantum computers that are far less error-prone and more stable than other methods.

How is quantum computing used?

As quantum computing advances, so do its applications. Quantum computing is expected to complement, not replace, classical systems. Two promising areas are artificial intelligence (AI) and machine learning (ML). These technologies already place heavy computational demands on classical systems, requiring the processing of massive datasets and complex models.

By representing and exploring many possibilities at once, quantum computing is potentially well suited to enhance AI and ML performance, supporting faster training and more efficient data analysis for certain workloads.

Other emerging applications include:

  • Drug discovery
  • Financial modeling
  • Materials science

In these fields, quantum simulations can help model physical and chemical systems, enabling insights that are difficult or impractical to achieve with classical computing alone.

As quantum computing matures and its scalability challenges are progressively resolved, it is expected to play a pivotal role in hybrid workflows that combine classical and quantum systems, transforming industries and accelerating innovation across science, technology and business.

Frequently asked questions


When will commercial quantum computers be available?

Commercial quantum computers are expected to emerge gradually, with significant milestones anticipated by 2040. Progress depends on advances in hardware stability, error correction and scalable architectures.

How does a qubit differ from a classical bit?

Classical bits represent a binary value — either 0 or 1. A qubit, by contrast, can exist in a superposition of 0 and 1, meaning it represents a probability of being measured as either value. When a qubit is measured, this probability collapses to a definite 0 or 1. This quantum property of superposition allows quantum computers to approach certain complex problems differently than classical systems. While quantum computers are not yet widely deployed alongside supercomputers, hybrid approaches are being explored for tasks such as optimization, simulation and cryptography.

What is Micron's role in quantum computing?

As quantum computing matures, the demand for high-performance, low-latency memory and storage will intensify. Micron’s expertise in advanced memory technologies positions it as a potential foundational enabler for quantum-classical hybrid systems, supporting data movement and future quantum data centers.

What are the advantages of quantum computing?

Quantum computing makes it possible to tackle problems that are intractable for classical systems, such as large-scale optimization, molecular simulation and some cryptographic analysis. Its key advantage lies in how quantum states can represent many possibilities and use interference, increasing the probability of finding useful answers for certain problem types. This capability can accelerate certain tasks, making quantum computing a powerful complement to classical high-performance systems.

What are the challenges of quantum computing?

Despite its promise, quantum computing remains fragile and error-prone. Qubits are highly sensitive to environmental noise, causing decoherence (the loss of quantum state), which often demands extreme conditions such as cryogenic temperatures to maintain stability. Current systems also face challenges in error correction, scalability and cost, keeping practical, large-scale quantum computing a challenge for the future.