The problem with coming up to speed in a new field of knowledge is that there’s really no such thing as a distinct field. Disciplines endlessly bleed into each other, and in order to understand one, there’s always some fundamental piece of knowledge missing; a piece that seems to bear little relation to whatever you’re trying to come to grips with, but without which you’ll never achieve true understanding. That’s why trying to learn new things is so uncomfortable. Like Alice jumping down her rabbit hole, you have to be fine with letting one thing lead to the next, even when the connected dots seem to draw a completely random line.

I tend to get this a lot in my day job, which often requires me to wrap my head around new domains. That’s why most of the books on my bedside table are primers of one sort or another.

And here’s the thing with trying to “prime” your mind: you might think it feels like being enlightened, when really the opposite is true. Instead of getting surprising new perspectives on something fundamentally familiar—which is really the privilege of specialists—you’re flying high above the surface of a hitherto unknown planet, trying to get a general sense of the lay of a strange land. Which, more often than not, tends to be quite painful. Because if the field you’re looking at is interesting enough to merit your attention in the first place, it’s likely also foreign enough to induce anxiety.

All this is to say: I wish I had solid enough math and physics to really understand Brian Culshaw’s brief little book Introducing Photonics.

Even while missing most of the finer points, however, I do come away from the reading with a feeling that I’ve glimpsed something big. Maybe somewhat akin to what the folks who sent the first emails back in the ’70s must have felt. Like: give this technology enough time, and it’ll change everything.

And just like folks back then probably underestimated the deep impact the Internet would turn out to have, I can only assume that I’m underestimating what, for lack of a better term, I’ll call the photonic paradigm.

If such a paradigm is indeed in the making, then by some measure we might already be five years into it, because 2018 was the year when the photonics industry outgrew the electronics industry in Europe. (It was also the year when Professor Culshaw published his book.)

So what’s the big deal with photonics?

It’s just that, to me at least, it seems to be the discipline which most powerfully delivers on Arthur C. Clarke’s third law, the one which says that “Any sufficiently advanced technology is indistinguishable from magic”.

Because really, what other branch of science can lay claim to solving teleportation and levitation? Where else to turn for invisibility cloaks and computers that run on light alone?

Teleportation is a very real effect of quantum entanglement, the one phenomenon known to mankind whose correlations seem to defy the speed of light. Granted, the only “thing” that can be teleported is quantum information—the state of a particle, not the particle itself—and a classical signal still has to travel alongside it to complete the transfer, but hey, it’s a start.
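For the mathematically inclined, here’s the standard textbook sketch of the protocol (my summary, not Culshaw’s): Alice and Bob share an entangled pair, and two ordinary classical bits complete the transfer.

```latex
% Quantum teleportation in one breath (standard textbook form).
% Alice holds an unknown qubit and one half of a shared Bell pair:
\[
|\psi\rangle = \alpha|0\rangle + \beta|1\rangle,
\qquad
|\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr)
\]
% Alice measures her two qubits in the Bell basis; Bob's half instantly
% carries |psi> up to a simple correction. But the two bits telling Bob
% which correction to apply must travel classically, at light speed or less.
```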

Levitation is what actually happens when an object is held in place by optical tweezers, a method that has garnered two Nobel Prizes in Physics (1997 and 2018). The object in question needs to be tiny, but it really is literally held up by a ray of light.
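How can light hold matter in place? My own gloss, in the small-particle limit (not a derivation from the book): the focused beam polarizes the particle, and the resulting gradient force pulls it toward the point of highest intensity.

```latex
% Optical tweezers, dipole approximation: a particle of polarizability
% \alpha sitting in a tightly focused beam feels a gradient force
\[
\mathbf{F}_{\mathrm{grad}} = \tfrac{1}{2}\,\alpha\,\nabla\langle E^{2}\rangle
\]
% pulling it toward the focus, where the field is strongest. Displace the
% bead and the force pulls it back -- a trap made of nothing but light.
```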

As for invisibility, we might not quite be there yet, but according to Culshaw it’s perfectly possible, from a theoretical point of view, to construct a metamaterial, or a cloak, that:

“renders the object which it surrounds totally invisible. The necessary index distribution to ensure that this actually happens can, in principle, be readily calculated for a particular candidate object.”

In practice, “such a distribution is tractable only for simple objects, for example cylinders or spheres.” Still though, isn’t that pretty fantastic?
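To give a flavour of what such an index distribution looks like for one of those simple objects, here’s the classic transformation-optics recipe for a cylindrical cloak, taken from the wider literature rather than from Culshaw’s book: everything inside radius a is hidden by a shell extending out to radius b.

```latex
% Cylindrical cloak (Pendry-Schurig-Smith): the coordinate transform
% r' = a + r(b-a)/b squeezes the region r < b into the shell a < r' < b,
% which requires anisotropic material parameters
\[
\varepsilon_{r}=\mu_{r}=\frac{r-a}{r},\qquad
\varepsilon_{\theta}=\mu_{\theta}=\frac{r}{r-a},\qquad
\varepsilon_{z}=\mu_{z}=\Bigl(\frac{b}{b-a}\Bigr)^{2}\frac{r-a}{r}
\]
% Rays entering the shell bend smoothly around the inner cylinder and
% exit on their original trajectories, as if nothing were there.
```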

Lastly, photonic integrated circuits, or PICs (sometimes also referred to as “integrated optical circuits”), might seem mundane compared to the above-mentioned wizardry, but they’re what raise my pulse the most.

A word or two on the nomenclature first.

Although the term electronic was coined in the late 19th century and carried a different meaning back then, we tend to associate it now with the rise of electronic integrated circuits, a technological revolution largely driven by the invention of the transistor.

The term photonics carries different meanings depending on who you ask. To an academic, it’s a subfield of applied physics. To industry people, however, its connotation is closely tied to fiber-optic communication, and also to the idea of integrated circuits where the architecture is built around pulses of light instead of electric currents.

This isn’t as crazy as it first seems, given that light and electrical signals are both electromagnetic phenomena, just at very different frequencies. That’s why the idea has been around since the 1960s (when it was spurred by the invention of the laser).
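To put rough numbers on that frequency gap (my own back-of-the-envelope, not the book’s):

```python
# Back-of-the-envelope: carrier frequency f = c / wavelength.
c = 3.0e8  # speed of light, m/s

f_light = c / 1550e-9  # 1550 nm, the workhorse wavelength of fiber optics
print(f"telecom light: ~{f_light / 1e12:.0f} THz")  # ~193 THz

f_clock = 5e9          # a fast electronic clock, ~5 GHz, for comparison
print(f"headroom over electronics: ~{f_light / f_clock:,.0f}x")  # ~39,000x
```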

Moving from idea to implementation has proven challenging, though, for a couple of reasons.

First of all, there’s no equivalent to silicon for PICs.

Silicon isn’t necessarily the optimal candidate for building semiconductors. Its limited tolerance for heat necessitates cooling, and germanium and gallium arsenide would both have better electronic properties, not to mention graphene, the most conductive material researchers know of. Optimizing for robustness, there are still other options: a traditional integrated circuit would fry within minutes of landing on Venus, but the silicon-carbide chip developed by researchers at KTH is ready to be “sent to hell”.

Yet in spite of all these splendid alternatives, silicon remains the dominant material platform for semiconductors, because it’s cheap, abundant, and easy to manipulate.

When it comes to integrated photonics, there’s simply no such all-purpose, middle-of-the-road material to default to. Instead, pretty much every device on a PIC calls for a different material.

Indium phosphide is great for the active functions—laser generation, amplification, control, and detection—all of which are crucial in communication and sensing applications.

Silicon nitride is transparent across a broad spectral range, which makes it well suited for low-loss waveguides.

Lithium niobate exhibits strong nonlinear optical properties, and its refractive index can be modulated by an applied electric field, which makes it the perfect choice for building tunable filters (a back-of-the-envelope sketch of the effect follows below).

Then there’s gallium arsenide with its high electron mobility, which means transistors manufactured in this material are great for making integrated-circuit drivers for high-speed lasers and modulators.

The list goes on, and even includes silicon itself, which might not have the best optical properties, but is great for integrating photonic and electronic components.
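As for that lithium niobate trick, the knob being turned is the Pockels effect. In its usual textbook form (my sketch, not the book’s), the index shift is linear in the applied field:

```latex
% Pockels effect in lithium niobate: a field E applied along the optic
% axis shifts the extraordinary refractive index via the r_33 coefficient
\[
\Delta n \approx -\tfrac{1}{2}\, n_{e}^{3}\, r_{33}\, E
\]
% With n_e ~ 2.2 and r_33 ~ 30 pm/V, modest voltages shift the index
% enough to retune a filter or swing a modulator between on and off.
```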

This plethora of materials makes it difficult to scale production. Compounding the challenge, PICs also require quite a number of different devices to be integrated in order to work. While microchips have given rise to many innovations, what set off the revolution was really just one key enabling technology: the transistor. By contrast, for PICs to start delivering on something like Moore’s law, breakthroughs will need to happen in a number of key areas; from coming up with better waveguides, modulators and power splitters, to ways of getting light on and off chips efficiently.

All this is to say that while photonic circuits might very well be the future of computing, we probably shouldn’t hold our breath in anticipation of that future actually materializing anytime soon. After all, the history of technology is full of superior technologies that failed to change the status quo; electric vehicles are just one of many examples. But then again: once the shift does occur, it tends to take people by surprise. Just think of the Internet, which was decades in the making before becoming an “overnight success” with the invention of the web.

I think we’re on the brink of a similar breakthrough when it comes to optical computing. Research is moving faster than ever, the telecommunications industry has been honing implementations for a long time, and the push for quantum is making vast amounts of venture capital pour into the space.

When we’ve landed in this new paradigm, we’ll see radical gains in terms of higher speed and lower energy consumption, but that’s not what’s most exciting. I foresee the really interesting shift to be all the fundamentally new computer architectures that we’ll soon see flourish.

The digital computer as we know it is most often based on the von Neumann architecture. In fact, this model has been so pervasive over the last seven decades that all alternatives are lumped together as “non-von Neumann”, even though they can be vastly different from each other. (An analogy comes to mind: when I was traveling through India in the ’90s, vegetarianism was such an absolute default behaviour that the establishments which strayed had signs outside reading “Non-vegetarian restaurant”.)
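For the gist of what makes a machine “von Neumann”, here’s a toy sketch of my own (purely illustrative): a single memory holds both instructions and data, and one program counter walks through it, fetching and executing one step at a time.

```python
# Toy von Neumann machine: instructions and data share one memory,
# and a single program counter fetches, decodes, and executes in sequence.
memory = [
    ("LOAD", 6),    # acc = memory[6]
    ("ADD", 7),     # acc += memory[7]
    ("STORE", 8),   # memory[8] = acc
    ("PRINT", 8),
    ("HALT", None),
    None,           # padding
    2, 3, 0,        # data lives in the same memory as the code above
]

acc, pc = 0, 0
while True:
    op, arg = memory[pc]            # fetch + decode
    pc += 1
    if op == "LOAD":    acc = memory[arg]
    elif op == "ADD":   acc += memory[arg]
    elif op == "STORE": memory[arg] = acc
    elif op == "PRINT": print(memory[arg])  # -> 5
    elif op == "HALT":  break
```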

I’ve written previously about Ising machines, which I think are absolutely fascinating examples of non-von Neumann architectures. Neuromorphic computing is yet another emerging field, where processors are modelled on biological systems (i.e. brains), promising not just improvements over the existing state of the art, but completely new ways of looking at what computing is; ways which will probably require us to find a more fitting term than computing.
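To make the Ising idea concrete, here’s a minimal software toy of my own (the couplings are made up, and a real photonic Ising machine would do this with coupled optical pulses rather than a loop): spins settle into whichever pattern minimizes a coupling energy, and that minimum encodes the answer to an optimization problem.

```python
import random

# Toy Ising problem: find spins s_i in {-1, +1} minimizing the energy
# E = -sum_ij J[i][j] * s[i] * s[j]. The couplings below are made up,
# and form a "frustrated" triangle: no assignment satisfies all three.
J = {(0, 1): 1.0, (1, 2): -1.0, (0, 2): 1.0}

def energy(s):
    return -sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

s = [random.choice([-1, 1]) for _ in range(3)]
for _ in range(200):                 # crude single-spin-flip descent
    i = random.randrange(len(s))
    before = energy(s)
    s[i] = -s[i]                     # trial flip
    if energy(s) > before:
        s[i] = -s[i]                 # undo flips that raise the energy

print(s, energy(s))                  # a ground state, at energy -1 here
```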

Exciting times!