Every programming language has its own set of primitives—the foundational data types that combine to create more complex structures. Some form natural pairs (e.g., integers vs. floats), while others stand alone (e.g., booleans). They can be grouped into indispensable “core primitives” and more specialised ones, useful in edge cases.
When I step into a new field, I always try to identify its primitives. Nowhere has this been more useful—or more challenging—than in quantum technology.
It’s useful because quantum is such a sprawling domain, filled with bottomless rabbit holes. If you don’t keep your curiosity in check, it’s easy to lose sight of the big picture.
At the same time, it’s also challenging because the quantum world is intrinsically enigmatic.
Take the photon, for example. Its “wave-particle duality” might seem like a straightforward pairing of opposites, but in truth, a photon is both a particle and a wave, depending on how you measure it. This is quantum’s nature: concepts that defy simple categorisation.
With that in mind, what follows is my attempt to organise an ontology of quantum primitives. I’ll aim for a high-level overview, while being aware that all things quantum have a tendency to get… complicated.
Here goes.
1. Architectures & Computational Paradigms
Quantum technology as a field comprises more than just computing, but let’s begin there, and let’s start with the most fundamental way of slicing the pie.
Most quantum computers adhere to the circuit model, where algorithms are expressed as sequences of logic gates that manipulate qubits. These systems operate on principles analogous to classical computers: while qubits can perform quantum magic like superposition and entanglement, they ultimately collapse into binary states upon measurement, much like traditional digital bits. This architecture is commonly referred to as gate-based.
Quantum annealers are different beasts, custom-built to solve specific optimisation problems. Instead of gates, problems are mapped onto an energy landscape, and the system uses quantum tunnelling to “cool” into its ground state, representing the optimal solution. While not programmable like gate-based machines, annealers are efficient for certain tasks. They can be seen as a subset of adiabatic quantum computing, which, in theory, can implement any quantum algorithm by encoding it in a Hamiltonian.
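To make the “energy landscape” idea a little more concrete, here’s a minimal sketch in plain Python (with made-up couplings): a tiny optimisation problem is written as an Ising Hamiltonian, and its ground state, found here by brute force rather than by quantum tunnelling, encodes the solution.

```python
from itertools import product

# Toy Ising problem: three spins, illustrative couplings (J) and local fields (h).
# An annealer would physically relax into the lowest-energy configuration;
# here we brute-force it to show what "ground state = solution" means.
J = {(0, 1): 1.0, (1, 2): -0.5, (0, 2): 0.75}   # coupling strengths (made up)
h = {0: -0.2, 1: 0.4, 2: 0.0}                    # local fields (made up)

def energy(spins):
    """Ising energy: sum of J_ij * s_i * s_j plus sum of h_i * s_i."""
    e = sum(J[i, j] * spins[i] * spins[j] for i, j in J)
    e += sum(h[i] * spins[i] for i in h)
    return e

# Enumerate all 2^3 spin configurations (+1 / -1) and pick the lowest-energy one.
ground = min(product([-1, 1], repeat=3), key=energy)
print("ground state:", ground, "energy:", energy(ground))
```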
Measurement-based quantum computing takes yet another approach. It begins with a large entangled structure called a cluster state or resource state. Computation proceeds through sequential measurements, where the outcome of one is fed forward to determine the next. In theory, it is as powerful as gate-based models, but it remains less mature and is often considered a curiosity.
Topological quantum computing encodes quantum information in the braiding of anyons—exotic particles in two-dimensional systems. These braiding operations correspond to logical gates and are inherently robust against local errors. Practical implementations are still in early stages.
In summary: Gate-based quantum computers dominate, with room for annealers and, to a lesser extent, measurement-based systems. Beyond topological architectures, there’s a long list of other exotic designs that are still in experimental phases.
2. Encodings & Computation Models
The previous list is organised around computational paradigms, by which we mean fundamentally different ways to structure and realise computations. This is not to be confused with computational models, which concern the formal principles underpinning said paradigms.
Both gate-based and measurement-based quantum computers rely on a digital computation model, whereas annealers are essentially analog machines.
Beyond the distinction between analog and digital, quantum computation can also be classified based on how quantum information is encoded. In the discrete-variable (DV) paradigm, quantum information is encoded in distinct states, such as the two-level states of a qubit.
But quantum information can also be represented in continuous variables (CV), utilising degrees of freedom like the amplitude and phase of a light field, or the position and momentum of particles. CV systems are typically realised using squeezed light or other harmonic oscillator systems, making them well-suited for implementations in the photonic and the microwave regimes.
It’s worth noting that the distinction between CV and DV does not directly map onto the analog versus digital computation model. Measurement-based quantum computing, for example, exists in both DV and CV paradigms yet consistently relies on a digital computation model.
3. Qubit Modalities
It’s not uncommon to hear people talk about quantum computer “designs”, when really they’re referring to different types of qubit modalities. The term differentiates between physical systems that can serve as the foundation for implementing qubits. There are at least five candidates to take seriously here, even though a creative physicist could easily dream up an infinite array.
The star in this crowd is the superconducting qubit. It’s by far the most popular, largely thanks to its high-speed gate operations and its compatibility with integrated circuit manufacturing processes.
Trapped-ion qubits are implemented using lasers for gate operations and electromagnetic fields as traps. They can potentially yield long coherence times (more on that later).
Photonic qubits, where information is encoded into properties such as the polarisation or phase of photons, can exist at room temperature and are naturally compatible with telecom equipment. One of their limiting factors is the difficulty of efficiently generating and detecting single photons.
Spin qubits encode information in the spin states of electrons or nuclei, controlled using electromagnetic fields. While coherence poses a significant challenge, their scalability is a distinct strength.
Topological qubits, lastly. I’ve had hour-long talks with world-leading scientists in this field, but I still can’t wrap my head around exactly what they are or how they work. Let’s just say that they’re good at resisting noise because they’re “topologically protected”, which means long coherence times.
4. Cold Atoms & Neutral Atom Arrays
The following two technologies occupy a unique space, sitting at the intersection of architecture, computational paradigm, and modality. Their shared traits warrant a combined discussion.
Cold atoms represent a broad paradigm in quantum technology, encompassing systems where neutral atoms are cooled to near absolute zero. At these temperatures, atoms exhibit quantum behaviours that are critical for a variety of applications, including quantum computing, simulation, and metrology. This paradigm supports multiple modalities for encoding and manipulating quantum information, such as spin qubits, Rydberg states, vibrational states, and hyperfine states, each suited to different tasks.
One of the most exciting implementations within the cold atom paradigm is the neutral atom array, which uses individual atoms trapped in precise arrangements, typically by optical tweezers or electromagnetic fields. In these systems, quantum information is encoded in the atoms’ electronic states and manipulated with precision, often by exciting them into specific configurations.
Neutral atom arrays shine at scalability, with the potential to host thousands of qubits, and are theoretically capable of evolving into gate-based universal quantum computers. However, implementing high-fidelity gates in these systems remains a challenge. For now, their primary use case is quantum simulations, especially in fields like quantum chemistry and condensed matter physics.
Like quantum annealers, neutral atom arrays rely on an analogue computation model, but they offer significantly more programmability. By adjusting the spatial arrangement of atoms and tuning their interactions, they can model a wide range of quantum systems. In essence, neutral atom arrays can be seen as analogue machines on an evolutionary path toward becoming digital systems. Companies like Pasqal and QuEra are leading the way in leveraging neutral atoms for both quantum simulation and computation. Meanwhile, Atom Computing takes a unique approach within the cold atom paradigm by focusing on nuclear spin qubits, a modality offering exceptional coherence times.
In summary: The technologies discussed in this section transcend traditional boundaries between architectures and modalities, and their relevance also extends beyond computing to broader applications in quantum technology.
5. Qubit Implementations
Looping back to qubit modalities, these can be further unpacked into a variety of specific implementations, each leveraging distinct physical systems and encoding methods. If qubit modality answers the what question, implementation answers the how question.
Take superconducting qubits, for example. The most common implementation of this modality is the transmon—short for transmission-line shunted plasma oscillation qubit. However, this is just one of many possible implementations. Other variations include flux qubits, phase qubits, and more exotic designs like gatemon qubits and quantum annealing qubits (not to be confused with the quantum annealing computational model).
The same principle applies to other modalities. For instance, trapped ions can involve different atomic species—typically ytterbium or calcium—and are manipulated using combinations of magnetic and electric fields. Each combination affects key parameters like gate fidelity, cooling efficiency, and scalability.
In the photonic regime, quantum information can be encoded in various ways, such as polarisation, time-binning, or specific spatial paths of photons. These choices determine how the photons interact and how information is processed, influencing everything from noise resilience to hardware integration.
It might seem like we’re delving too deeply into the specifics, but in quantum computing, there’s no avoiding the details. Building a functional machine demands balancing trade-offs, which requires close attention to implementation.
And it’s not just about performance; implementation choices can deeply influence system design. For instance, trapped ion systems use either lasers or microwaves for gate operations. This seemingly minor distinction can shape how external devices interact with the quantum processor—a crucial factor for specific projects. In quantum, the devil truly is in the details.
6. Error Correction & Logical Qubits
When a qubit collapses into a binary state, it does so probabilistically, meaning the same input can yield different outcomes across repeated measurements. That’s very different from how classical computers operate, and it’s the root cause of one of this technology’s biggest challenges: uncertainty.
There’s no elegant solution to this issue. The current approach relies on brute force—running quantum algorithms multiple times and averaging the results to statistically determine the most likely outcome.
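As a rough sketch of what that brute force looks like, here’s some plain Python: a stand-in “quantum program” whose measured bitstring is only right part of the time, run for many shots, with the most frequent outcome taken as the answer (the probabilities are invented purely for illustration).

```python
import random
from collections import Counter

def run_once():
    """Stand-in for one shot of a quantum algorithm: returns the 'right'
    bitstring 60% of the time, and a uniformly random 3-bit string otherwise
    (probabilities invented for illustration)."""
    if random.random() < 0.6:
        return "101"
    return format(random.getrandbits(3), "03b")

shots = 1000
counts = Counter(run_once() for _ in range(shots))
best, freq = counts.most_common(1)[0]
print(f"most likely outcome: {best} ({freq}/{shots} shots)")
```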
Repetition, however, only deals with the randomness inherent in measurement. Real hardware also suffers from noise and decoherence, which introduce outright errors, and correcting those errors is expensive. This is why it’s difficult to ascertain how “good” a quantum computer is just by looking at its spec sheet: in practice, a large portion of any given system’s available qubits will be dedicated to error correction.
So-called logical qubits represent one effort to address this. The way they work, in principle, is by encoding a single logical qubit into many physical qubits. This redundancy allows the system to detect and correct errors that arise due to quantum noise, decoherence, or imperfections in hardware operations.
The ratio between physical and logical qubits varies depending on the type of error-correcting code used, but also on the error rates of the underlying physical qubits.
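The simplest way to get a feel for this redundancy is the classical three-bit repetition code, sketched below in plain Python with an invented error probability. Real quantum codes, such as the surface code, are far more involved, since they must also handle phase errors and cannot simply read out the encoded state, but the majority-vote intuition carries over.

```python
import random

def encode(logical_bit):
    """Encode one logical bit into three physical bits (repetition code)."""
    return [logical_bit] * 3

def apply_noise(bits, p_flip=0.05):
    """Flip each physical bit independently with probability p_flip (illustrative)."""
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):
    """Majority vote: the logical bit survives as long as at most one copy flipped."""
    return int(sum(bits) >= 2)

trials = 10_000
failures = sum(decode(apply_noise(encode(1))) != 1 for _ in range(trials))
print(f"logical error rate: {failures / trials:.4f} (vs physical rate 0.05)")
```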
7. Coherence & Space-Time Volume
Coherence measures the time a single qubit can maintain its quantum state. It’s an important metric for benchmarking, but its practical relevance is limited, especially as actual quantum computers grow to consist of tens if not hundreds of qubits.
Space-Time Volume offers a more holistic way of looking at a system’s overall capacity. It’s a measure of how many qubits can be operated on coherently at the same time, and for how long the system as a whole can maintain coherence while running a quantum algorithm.
That is to say: Space-Time Volume measures the system’s capacity to perform useful quantum computations rather than just the longevity of a single qubit’s quantum state.
8. Gates & Gate Operations
In classical computing, gates perform well-defined and straightforward logical operations. Not so in the world of quantum computing, where different gate implementations can feel more like mystical incantations. For instance, the Hadamard gate creates superposition states, the CNOT gate entangles two qubits, and the Pauli gates perform rotations around the axes of the Bloch sphere (a geometric representation of the state of a single qubit).
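Here’s a minimal sketch of those three gates in Qiskit (assuming a recent version of the library): a Hadamard creates the superposition, a CNOT entangles the second qubit with the first, and a Pauli-X applies a rotation about the X axis.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)        # Hadamard: |0> -> (|0> + |1>) / sqrt(2), i.e. superposition
qc.cx(0, 1)    # CNOT: entangles qubit 1 with qubit 0 (a Bell state)
qc.x(1)        # Pauli-X: a pi rotation about the X axis of the Bloch sphere

state = Statevector.from_instruction(qc)
print(state)   # amplitudes of the resulting two-qubit state
```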
9. Frameworks
Qiskit (IBM), PennyLane (Xanadu), and Cirq (Google) are all prominent examples of quantum computing frameworks. While these may seem out of place in a discussion of primitives—and I agree to some extent—I’ve included them here for pragmatic reasons. Not only are they central tools for quantum computing, but as programming frameworks, they also define their own sets of so-called primitives.
Qiskit, by far the most popular, adopts a layered approach to primitives. At the core is the “Primitives Framework”, a collection of prebuilt, optimised components designed to streamline the development and execution of quantum algorithms. Additionally, Qiskit provides low- and mid-level primitives, which are more akin to what “primitives” traditionally mean in classical programming languages—basic operations and constructs on which higher-level functionality is built.
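As a hedged illustration (the exact import paths and result handling have shifted between Qiskit versions), this is roughly what using a Sampler from the Primitives Framework looks like:

```python
# A sketch of Qiskit's high-level primitives; API details vary by version.
from qiskit import QuantumCircuit
from qiskit.primitives import StatevectorSampler

qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)
qc.measure_all()   # adds a classical register named "meas"

sampler = StatevectorSampler()
result = sampler.run([qc], shots=1024).result()
print(result[0].data.meas.get_counts())   # roughly half '00', half '11'
```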
In contrast, PennyLane specialises in hybrid quantum-classical computation, focusing on primitives tailored for variational algorithms and quantum machine learning. Its emphasis on seamlessly integrating quantum circuits with classical machine learning frameworks makes it a go-to tool for researchers exploring hybrid approaches.
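A minimal sketch of that hybrid style, assuming a current PennyLane install: a one-parameter circuit is defined as a QNode and then differentiated like any classical function, ready to be fed into a classical optimiser.

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def circuit(theta):
    qml.RX(theta, wires=0)            # parameterised rotation
    return qml.expval(qml.PauliZ(0))  # expectation value used as the "cost"

theta = np.array(0.3, requires_grad=True)
print(circuit(theta))                 # cost at theta = 0.3
print(qml.grad(circuit)(theta))       # gradient, usable by a classical optimiser
```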
Cirq, on the other hand, targets the direct design and optimisation of quantum circuits, with an emphasis on interoperability with Google’s quantum hardware. Its primitives offer fine-grained control and customisation, aimed at the intersection of software and hardware.
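For comparison, here’s a hedged Cirq sketch showing that fine-grained control: qubits and moments (time slices) are laid out explicitly rather than inferred.

```python
import cirq

q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit([
    cirq.Moment([cirq.H(q0)]),                     # first time slice
    cirq.Moment([cirq.CNOT(q0, q1)]),              # second time slice
    cirq.Moment([cirq.measure(q0, q1, key="m")]),  # readout
])

result = cirq.Simulator().run(circuit, repetitions=100)
print(result.histogram(key="m"))   # counts of the measured two-bit outcomes
```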
While these frameworks abstract away many complexities of hardware-level primitives, their own primitives serve as critical tools for translating quantum concepts into functional algorithms and experiments. They provide the foundational building blocks necessary for tackling quantum problems, regardless of the hardware platform or application domain.
With that, it’s time to shift focus momentarily from engineering to science.
***
<Intermission : A Quantum of Physics
With the above nine chapters, we’ve covered the core primitives of quantum computing. Now, we turn to quantum technology’s broader horizons: communication and sensing. To prepare, let’s briefly explore two pivotal concepts—entanglement and teleportation.
Quantum entanglement links the states of two or more particles so that measuring one instantly determines the state of the other, no matter the distance between them. It’s a deeply strange phenomenon that has confounded the sharpest minds—Einstein famously called it “spooky action at a distance.”
Less mysterious—in spite of its intriguing name—is quantum teleportation, which builds on entanglement to transfer the quantum state of a system from one place to another.
Is it “beam me up, Scotty”? Not quite. While the effect is instantaneous, it still relies on a classical communication channel (e.g., a radio signal) to decode the information. This means teleporting a state to our closest neighbouring star system, Alpha Centauri, would still take at least 4.37 years (the time a light-speed signal needs to cover the distance)—no shortcuts.
Critically, quantum teleportation doesn’t move physical particles. Instead, it transfers the quantum state itself, with the original state destroyed in the process, adhering to the no-cloning theorem.
Both entanglement and teleportation play supporting roles in quantum computing, but they’re the stars of the show in the world of quantum communication and networking—areas we’ll explore next.
End of Physics Intermission/>
***
10. Quantum Communication : the Basics
The Internet is great and all, but from a technological standpoint, it’s really just a souped-up version of the old telegraph network. Sure, it’s resilient against nuclear attacks thanks to clever routing of information packets, but you can never be certain if an eavesdropper has intercepted or tampered with those packets before they reach you.
Encoding information into quantum systems, such as the polarisation of single photons, solves this issue because the laws of physics don’t allow an eavesdropper to intercept the data unnoticed. Any attempt to measure the quantum state would destroy the information, immediately alerting the communicating parties.
This simple idea is the seed that will eventually, I predict, grow into a full-blown quantum Internet. The road to that distant destination is long, and progress will be incremental. However, it’s already possible to outline some key milestones ahead, the first of which is spelled QKD.
11. Quantum Key Distribution
(& Post-Quantum Cryptography)
To understand the need for PQC and QKD, we must first establish a fundamental fact of modern-day cryptography: the vast majority of it depends on key pairs, where one key is shared publicly. If intercepted, the public key is of no value to an eavesdropper, since it would take eons for the fastest supercomputer to decrypt the information without the corresponding private key. This security hinges on the mathematical complexity of certain problems, such as factoring large numbers or solving discrete logarithms.
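To ground that in numbers, here’s the textbook toy version of RSA in plain Python, using deliberately tiny primes. The real thing uses primes hundreds of digits long, at which point recovering the private key means factoring n, which is hopeless for classical machines.

```python
# Textbook toy RSA with deliberately tiny primes (never do this for real).
p, q = 61, 53
n = p * q                 # public modulus; security rests on n being hard to factor
phi = (p - 1) * (q - 1)
e = 17                    # public exponent
d = pow(e, -1, phi)       # private exponent: modular inverse of e (Python 3.8+)

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
print(ciphertext, recovered)       # recovered == 65
```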
This level of security has been sufficient since its invention in the late 1970s (the design, known as RSA after the initials of its inventors, has been one of the most lucrative software patents of all time). It’s no longer considered future-proof, however, since a sufficiently powerful quantum computer running Shor’s algorithm could crack it. Given that the entire global banking system largely relies on RSA-style cryptography, that realisation has made a lot of people very nervous.
Two lines of defence have been proposed. The first one relies on better cryptographic schemes, which purport to stand up even to the most brutal of brute-force attacks from quantum computers, both as they exist today and as they may exist tomorrow.
These types of ciphers are marketed as Post-Quantum Cryptography, or PQC. I say “marketed” because there’s an element of optimism baked into the name: PQC relies on mathematical assumptions that haven’t yet faced the full scrutiny of future quantum capabilities. (For this same reason, I don’t really consider PQC to be a quantum primitive, but since the term is so pervasive I’ll look the other way.)
The second line of defence, then, is represented by Quantum Key Distribution, or QKD. There are two versions of QKD and understanding the nuances will prove useful in the following chapter.
The “simple” way of sending a secret key encoded in a series of quantum systems is referred to as prepare-and-measure QKD. That’s when points A and B (known as Alice and Bob) both use hardware that they trust, much like a dedicated line. As explained in the previous chapter, Bob knows when someone has listened in on the message (the eavesdropper is always referred to as Eve), because Eve’s interference introduces detectable errors in the key exchange process.
Prepare-and-measure QKD has been around since the 1980s, with BB84 as its most prominent protocol. It’s robust and secure, but its reliance on the precise generation and detection of single photons—an inherently imperfect process—has raised concerns about potential vulnerabilities.
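To see where those detectable errors come from, here’s a stripped-down, purely classical simulation of BB84 in plain Python: Alice sends random bits in random bases, an optional Eve intercepts and resends, Bob measures in his own random bases, and the sifted key is compared. With Eve listening, roughly a quarter of the sifted bits disagree.

```python
import random

def measure(bit, prep_basis, meas_basis):
    """If the bases match, the bit is read faithfully; otherwise the result is random."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def bb84(n_bits=10_000, eve_present=False):
    alice_bits = [random.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [random.choice("XZ") for _ in range(n_bits)]
    bob_bases = [random.choice("XZ") for _ in range(n_bits)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eve_present:  # intercept-resend attack: Eve measures, then re-prepares
            e_basis = random.choice("XZ")
            bit = measure(bit, a_basis, e_basis)
            a_basis = e_basis
        bob_bits.append(measure(bit, a_basis, b_basis))

    # Sifting: keep only positions where Alice's and Bob's bases agree.
    sifted = [(a, b) for a, b, x, y in zip(alice_bits, bob_bits, alice_bases, bob_bases) if x == y]
    errors = sum(a != b for a, b in sifted)
    return errors / len(sifted)

print("error rate without Eve:", bb84())                 # ~0.0
print("error rate with Eve:   ", bb84(eve_present=True)) # ~0.25
```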
The evolution of this technique is entanglement-based QKD, where a single source generates pairs of entangled photons, sending one to Alice and the other to Bob. Making this work is no small engineering feat, but in return, you gain a communication channel that is fundamentally immune to tampering. The go-to protocol here is E91, named after its creator, Artur Ekert—whose innovation is deeply intertwined with the work of Alain Aspect, John Clauser, and Anton Zeilinger, the recipients of the 2022 Nobel Prize in Physics.
In summary: QKD networks are already being deployed, connecting a select few sensitive nodes. The cost is high, certainly, but the value proposition is compelling.
12. Entanglement-based Networks
QKD is great and all, but no matter its flavour, it’s only designed to securely encrypt and exchange keys. Once that’s done, the actual communication happens over classical channels, using those keys to encrypt and decrypt the messages. As such, even the most flawless QKD protocol is just a step on the thorny path towards fully entanglement-based networks. Or to call that beast by the name that can only be whispered: the Quantum Internet.
The Quantum Internet is the stuff of dreams and science fiction. Unlike today’s Internet, where data travels as streams of classical bits, the Quantum Internet would transmit information using entangled quantum states. This vision promises not just unbreakable security but also capabilities far beyond the reach of classical networks.
If this penultimate chapter is brief, it’s because my imagination falls short. I know that everything we’ve achieved so far will pale in comparison, but I can’t quite envision what the new paradigm will bring. What I am certain of, though, is that it won’t be the Internet on steroids—it will be something entirely, fundamentally different.
12 ½. Quantum Sensing
This is the shortest section, and for good reason: I don’t know much about quantum sensing. What I do know is that it’s an important field in its own right, with a technological readiness level generally ahead of quantum computing and communication. Examples include atomic clocks and quantum-enhanced microscopes.
At its core, quantum sensing relies on the same principles—superposition, entanglement, and quantum coherence—that underpin quantum computing and communication. But rather than processing or transmitting information, it harnesses these properties to measure the world with unprecedented precision. While I can’t speak to the finer details, one thing is clear: quantum sensing isn’t a technology of tomorrow—it’s already here, and it’s already making an impact.
Conclusion & Personal Reflections
It’s fitting that this long post breaks into twelve (and a half) chapters, because it marks the one-year anniversary of my journey up the quantum river.
A year ago, I was asked to explore how I might help strengthen this country’s emerging quantum ecosystem. My only real credential was a brief stint working on a technology so exotic and niche it wouldn’t even feature in this ontology of quantum primitives (Coherent Ising Machines, which I’ve previously written about).
I doubt many readers have made it all the way here, but if you’re that one person, I hope you’ll indulge me as I zoom out of the quantum realm for a moment to share a reference that I feel wraps this post up nicely. It comes from an old writing coach of mine who, after working tirelessly to drum the fundamentals of storytelling into my bones, one day looked me in the eye and told me I was now ready to forget all I’d learnt and start doing the only thing that matters: writing the way I am meant to write.
By analogy, I’ve worked hard to arrive at the overview represented in this post. I’ve read stacks of books, sat through endless hours of dense lectures, and attended the most esoteric of conferences. I’ve paid my dues, and I finally know the lay of the land; I know what’s what.
Now the real journey begins—the one that isn’t about the what but the why. I’m looking forward to that: to sinking my teeth into understanding how this whole quantum endeavour can be put to use to serve a greater purpose.
Because, after all, getting the technology in place is the easy part—it’s putting it to good use that makes our hearts sing.