What happens when we are in an unpredictable world and one of the only reliable ‘answers’ is taken away? Any manager operating in the modern environment of digitisation, disintermediation and disruption will face this sooner or later, no matter how traditional or entrenched their industry or position. Just ask ex-Kodak executives or licensed taxi drivers about the folly of imagining otherwise. Where might we look for lessons?
The future of computing has always been difficult to predict, but for years one prophecy has held so true as to become axiomatic, a ‘law’. Almost 50 years ago Intel’s co-founder Gordon Moore foresaw that processing power would double roughly every two years as smaller transistors are packed onto silicon wafers, increasing performance and reducing costs. The upshot is that the ‘phones’ many of us carry in our pockets have more processing power than the ‘supercomputers’ of the 1980s that used to take up a whole room. Moore’s law is beginning to unravel, however: as the size of transistors approaches the atomic scale, making them smaller doesn’t mean they’ll be faster or cheaper. When things can’t get smaller, where does the space to improve exist?
Another great scientific seer, Arthur C. Clarke, created with Stanley Kubrick one of the defining visual shorthands for a technological game changer: the black monolith. A version of it lives on in the shape of the D-Wave 2X. Its matt black cuboid dimensions aren’t purely for show, of course. Most of that space is required to create the right kind of environment for a much smaller space. A liquid helium system cools the innards to 0.015 kelvin – a smidge higher than the coldest temperature that is physically possible. Its cold, dark heart is shielded from the natural but minuscule flutters and wafts in the earth’s magnetic field.
What requires this very certain environment?
D-Wave is a quantum computer. Quantum computers use the uncertainty of the state of quantum particles (along with their entanglement) to perform certain types of computing much, much more effectively than traditional binary, transistor-driven computing. The basic unit of binary computing is a bit: it holds one of two states, 1 or 0. The basic unit of a quantum computer is a qubit. A 300-qubit machine could represent 2^300 different strings of 1s and 0s at the same time – a number roughly comparable to the number of atoms in the visible universe. And because these myriad qubits are entangled, you can manipulate all of them in one go.
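To get a feel for that scale, here is a quick back-of-the-envelope sketch in Python (the 10^80 figure for atoms in the visible universe is a commonly quoted order-of-magnitude estimate, not taken from the text):

```python
# A 300-bit classical register holds exactly one 300-character string of 1s and 0s
# at any moment; a 300-qubit register spans a superposition over all of them.
n_qubits = 300
basis_states = 2 ** n_qubits       # number of distinct 300-bit strings
atoms_estimate = 10 ** 80          # rough estimate of atoms in the visible universe

# 2**300 has 91 decimal digits, i.e. it is on the order of 10**90.
print(f"2**{n_qubits} is roughly 10**{len(str(basis_states)) - 1}")
print(basis_states > atoms_estimate)  # True: it exceeds the atoms estimate
```

As the arithmetic shows, 2^300 is in fact some ten orders of magnitude beyond the usual estimate for atoms in the visible universe – the comparison in the text is, if anything, an understatement.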
The conditions required to deal effectively in the currency of complexity are very hard to create and can deteriorate – decohere – at the slightest fluctuation in temperature or the magnetic field. And here’s the lesson for managers. Those ‘80s supercomputers needed their own climate control, and you’d not have wanted to wave a big magnet in their general proximity, but the tangible world of 1s and 0s required only equally easy-to-measure-and-grasp environmental management: you could rely on existing technology, and its cost was marginal. When you get to the quantum scale of uncertainty and complexity, the management of the environment becomes one of the fundamental challenges. The cost is high; it will require bespoke and specialist tools and constant vigilance, and it will decohere at the slightest provocation. In other words, creating the right environment – the right space – for your teams to deal with uncertainty and complexity isn’t marginal anymore, it’s core. As a manager this means you can’t buy in a ‘black box’ solution; you’ve got to get to grips with the guts of it. That’s a high cost, and it’s difficult. The question is this: can you risk avoiding it?