As I made my way through continental Europe this summer, I took a detour to Göttingen. I had just read Julia Ravanis’ biography of the mathematician Emmy Noether and wanted to walk the medieval cobblestone alleys she once did.

Of course, it wasn’t just her. For a time, Göttingen was the beating heart of mathematical and physical thought — home to Hilbert, Born, and Heisenberg, and deeply entangled with the revolutions sparked by Planck and Einstein. It was here that quantum mechanics blew the doors off classical physics.

A century later, the questions have changed, but Europe’s position in the quantum landscape hasn’t. Quietly, steadily, European researchers continue to lead — in publications and in foundational breakthroughs. The continent may not always shout the loudest, but in sheer scientific output, it still punches above its weight.

The European Commission’s newly released Quantum Europe Strategy is, in part, an attempt to turn that intellectual advantage into something more cohesive — to translate deep research into deep infrastructure.

It’s received a fair bit of flak from industry insiders, frustrated by the underwhelming ambition of reaching just 100 logical qubits by 2030.

I get where they’re coming from — it feels like too little, too late.

Still and all, I think the criticism misses the point.

The real issue isn’t the modest hardware targets — it’s that hardware may not be where our true opportunity lies.

We’re already seeing the early outlines of a global, cloud-based quantum ecosystem. Machines will be accessible. What’s scarce — and far harder to scale — is understanding. Not just how to operate these machines, but how to formulate problems in a way that quantum devices can actually solve.

That work — the conceptual heavy lifting — is deeply technical, often thankless, and almost invisible from the outside. But it’s also where Europe could lead. Not by out-building the competition, but by out-thinking it.

Which brings us to what, in my mind, is sorely lacking from the current strategy: a serious, concerted effort to develop real, grounded use cases.

Because from an algorithmic point of view, quantum doesn’t just mean “faster” or “more powerful” — it’s an entirely different kind of beast.

And if that beast is going to be useful, we need to start over from first principles: rethinking how we model real-world problems so they become quantum-native.

This isn’t about making old algorithms run faster. It’s about discovering which problems only become tractable once you start thinking in quantum terms. Simulating molecular interactions, for instance — something classical computers can only approximate — becomes natural on a quantum device, because molecules themselves are quantum systems. Certain optimisation problems, too, may reveal shortcuts not by brute force, but by letting solutions interfere and cancel out like waves. These aren’t incremental improvements. They’re structural shifts in what counts as solvable.
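
To make the interference point concrete, here is a toy sketch in plain NumPy. It simulates a single qubit classically, so it shows the cancellation mechanism rather than any speedup: two computational paths lead into the same outcome with opposite signs, and that outcome simply vanishes.

```python
import numpy as np

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

state = np.array([1.0, 0.0])  # start in |0>

state = H @ state  # amplitudes (1/sqrt(2), 1/sqrt(2)): both paths open
state = H @ state  # the two paths into |1> carry opposite signs and cancel

print(np.abs(state) ** 2)  # -> [1. 0.]: |1> is never observed
```

Quantum algorithms are, at heart, choreography for this effect at scale: arranging a computation so that paths to wrong answers cancel and paths to the right one reinforce.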

It’s also not a question of chips. It’s a question of mindset.

The problem, of course, is that mindsets don’t change just because a policy document says they should.

To be perfectly honest, they don’t even change when the maths is watertight.

I’m thinking, of course, of what may be the biggest quantum breakthrough I’ll witness in my lifetime: the recent announcement by Danish mathematicians Jørgen Ellegaard Andersen and Shan Shan that they had used Gaussian Boson Sampling, running on a photonic quantum device, to outperform classical Monte Carlo methods on a non-trivial, structured problem.

Simulations based on Monte Carlo techniques underpin large parts of modern science and finance. Demonstrating an exponential quantum speedup within that framework — not just on paper, but in practice — should be a big deal.
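
For readers outside that world, here is a minimal sketch of the classical baseline being challenged; the integrand below is a placeholder chosen for illustration, not the structured problem from the announcement. A Monte Carlo estimator’s statistical error shrinks only as 1/√N, so each extra digit of accuracy costs roughly a hundred times more samples. That slow convergence is the wall any quantum speedup has to beat.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def monte_carlo_estimate(f, n_samples):
    """Approximate E[f(X)] for X ~ N(0, 1) by averaging over random draws."""
    samples = rng.standard_normal(n_samples)
    return f(samples).mean()

f = lambda x: np.exp(-x ** 2)  # placeholder integrand; exact mean is 1/sqrt(3)

for n in (10**2, 10**4, 10**6):
    # Error shrinks like 1/sqrt(n): slow, but utterly general.
    print(f"N = {n:>9,}: estimate = {monte_carlo_estimate(f, n):.6f}")

print(f"exact value: {1 / np.sqrt(3):.6f}")  # about 0.577350
```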

And yet: silence.

No headlines. No investor excitement. Barely a ripple in the community. At the conference where it was presented, the result passed by almost unnoticed. I remember walking out thinking of that old joke — the mother watching her son march in a parade and proudly remarking that he’s the only one in step. That’s how I felt. If nobody else seems to care, maybe I’m the one who’s off-beat.

But I don’t think I am.

Because this — this quiet, under-recognised result — is exactly the kind of breakthrough that could reshape the field. Not a bigger machine, but a new kind of usefulness. A shift in what quantum computation is for.

And it’s precisely this kind of insight the EU strategy fails to grasp — the ability to recognise when something foundational has changed, even if it doesn’t come wrapped in a headline.

If Europe wants to lead, it shouldn’t just aim to build more qubits. It should invest in the thinking required to notice when something like this happens — and to act on it.

And look — if we really must keep developing hardware, then let’s at least do so on a platform where we have a fighting chance. The Commission itself notes that Europe is already ahead in photonic quantum computing — not just in papers, but in actual deployed systems.

That’s not a footnote — it’s a foothold. A chance to shape the field on our own terms, rather than racing to follow someone else’s playbook.

Insight, not scale, is what will shape the next chapter of quantum computing — and it will take courage to follow it, even when no one else is marching in step.