
[Image: A robotic finger touching a keyboard]

6 surprising innovations for the future of computing

Researchers are pushing into new computing frontiers using carbon, DNA strands, and other means to transcend the limits of silicon.


As silicon-based transistors become so tiny that they bump up against the laws of physics, manufacturing techniques can no longer keep up. That signals the upper limits of Moore’s Law, which posits that the number of transistors on a microprocessor (and therefore its computing power) can double every two years. But does that mean the era of exponential tech-driven change is about to come to a screeching halt?
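The doubling the article describes compounds quickly. As a back-of-the-envelope sketch (the baseline below assumes the Intel 4004's roughly 2,300 transistors in 1971; the projections are illustrative, not actual chip counts):

```python
# Illustrative sketch of Moore's Law: transistor counts doubling every
# two years. Baseline assumed for demonstration: the Intel 4004 (1971),
# about 2,300 transistors. Projections here are arithmetic, not data.

def transistors(year, base_year=1971, base_count=2_300):
    """Project a transistor count, assuming one doubling every two years."""
    doublings = (year - base_year) // 2
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{transistors(year):,} transistors")
```

Fifty years of doubling turns a few thousand transistors into tens of billions, which is why even a modest slowdown in that cadence matters so much.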

Absolutely not.

Moore’s Law has never been an immutable truth, like gravity or the conservation of energy. It’s been more of a self-fulfilling prophecy: it set expectations for chip makers to meet, and so they did. That helped stoke the world’s insatiable hunger for more and more computing power—and that demand isn’t going to disappear just because we’ve taken silicon-based microprocessors about as far as they can go. So now we need to explore new ways of packing more power into ever tinier spaces.

The future of computing is being shaped by transistors made from materials other than silicon. It’s being amplified by approaches that have nothing to do with transistor speed, such as deep-learning software and the ability to crowdsource excess computing power to create what amounts to distributed supercomputers. It may even redefine computing itself.

[Image: A machine making a sensor]

Here are some of the landmarks on computing’s new frontiers:

We may be approaching the limits of what silicon chips can do, but technology itself is still accelerating. It’s unlikely to stop being the driving force in modern life. Its influence will only increase as new computing technologies push robotics, artificial intelligence, machine-to-human interfaces, nanotechnology, and other world-shaking advances past today’s accepted limits.

In short, exponential growth in computing may not be able to go on forever, but its end is still much further in the future than we might think.