AFTER SILICON

By  Adiah Qazi
08 May, 2026

Photons. Neurons. Qubits. The transistor era is not over, but its long monopoly on computing is beginning to fade. What follows is not a single replacement, but a convergence of new paradigms that will reshape every device, every network, and every nation on Earth.

COVER STORY

The transistor changed everything. From its invention in the late 1940s through seven uninterrupted decades of miniaturisation, it powered every major leap in computing: the mainframe, the personal computer, the smartphone and the deep-learning revolution now transforming global economies and scientific discovery. For most of this period, progress followed a simple formula: make transistors smaller, fit more of them onto a chip and performance would multiply. This principle, widely known as Moore’s Law, held true far longer than expected.

Today, however, that model is beginning to strain. Transistors have been reduced to dimensions measured in atoms. At this scale, the rules of classical physics give way to quantum effects. Electrons no longer behave predictably; they tunnel through barriers meant to contain them. Meanwhile, the energy demands of modern computing - particularly artificial intelligence - are reaching levels that challenge even national power grids. The industry’s own projections now acknowledge that transistor scaling, while not yet obsolete, is approaching its practical limits.

What comes next is not incremental improvement, but structural reinvention. Three emerging technologies - photonic computing, neuromorphic architecture and quantum information systems - are rapidly moving from research laboratories into real-world deployment. Together, they represent a fundamental shift in how computation is performed.

The transistor was never the end of computing history. It was only the first chapter. The next is being written in light, in quantum states and in the dynamic patterns of artificial neurons.

Why silicon is reaching its limits

Modern computing rests on the metal-oxide-semiconductor field-effect transistor (MOSFET), a device that acts as a switch, encoding binary information through electrical states. Over decades, engineers have continuously reduced its size. In the early 1970s, a transistor measured roughly ten micrometres. By 2026, leading-edge devices are approaching the two-nanometre class - features roughly the width of a strand of DNA.
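
To put that shrinkage in perspective, a rough back-of-the-envelope calculation is enough (a sketch only, since modern "node" names no longer map onto a single physical dimension):

```python
# Rough illustration of transistor scaling, using the figures quoted above:
# ~10 micrometres in the early 1970s versus ~2 nanometres today.
early_1970s_nm = 10_000   # 10 micrometres expressed in nanometres
today_nm = 2              # leading-edge "2 nm" class devices

linear_shrink = early_1970s_nm / today_nm
area_density_gain = linear_shrink ** 2   # devices per unit area scale with the square

print(f"Linear shrink factor: {linear_shrink:,.0f}x")           # ~5,000x
print(f"Implied area density gain: {area_density_gain:,.0f}x")  # ~25,000,000x
```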

At such scales, previously negligible quantum effects become dominant challenges. Electrons leak through insulating barriers, causing errors and inefficiencies. Heat density rises to levels that are difficult and costly to manage. At the same time, the economic burden of manufacturing advanced chips has soared, with fabrication plants costing tens of billions of dollars - accessible only to a handful of nations and corporations.

Energy consumption presents an equally pressing issue. Large-scale AI systems now require vast data centres consuming electricity comparable to entire cities. As demand for AI continues to grow, energy requirements are projected to outpace even aggressive expansions in renewable infrastructure. Computing, in other words, has become energy-constrained.

Several physical and economic limits are now driving the transition beyond silicon:

* Quantum tunnelling: At extremely small scales, electrons behave probabilistically, leading to leakage and reduced reliability

* Thermal density: Heat generation has reached practical cooling limits, restricting further performance gains

* Fabrication cost: The exponential rise in manufacturing expenses limits global participation

* Interconnect bottlenecks: Data transfer between components often consumes more energy than computation itself

* AI energy demand: Data-centre energy use is increasing at an unsustainable pace under current architectures

These constraints are not theoretical - they are already shaping the direction of the industry.

Computing at the speed of light: photonics

Photonic computing replaces electrons with photons as carriers of information. Instead of electrical currents moving through copper wires, light travels through optical waveguides - microscopic pathways that guide photons with extraordinary precision.

Remembering one of the co-inventors of the first transistor: Walter Brattain

The advantages are profound. Light can carry signals with far less loss and delay than electrical interconnects, and produces virtually no resistive heat. Moreover, different wavelengths of light can coexist in the same channel without interference, allowing multiple streams of data to travel simultaneously through a single waveguide. This principle, known as wavelength-division multiplexing, dramatically increases bandwidth.
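
A toy calculation shows why this matters; the channel count and per-channel data rate below are illustrative assumptions, not the figures of any particular system:

```python
# Toy illustration of wavelength-division multiplexing (WDM).
# Assumed, illustrative figures: 64 wavelength channels at 100 Gbit/s each.
channels = 64
per_channel_gbps = 100

aggregate_gbps = channels * per_channel_gbps
print(f"Aggregate capacity on one waveguide: {aggregate_gbps / 1000:.1f} Tbit/s")  # 6.4 Tbit/s
```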

For artificial intelligence, photonics offers an especially powerful advantage. Neural networks rely heavily on operations such as matrix multiplication. In photonic systems, these calculations can be performed naturally through the physics of light interference. Rather than consuming energy through active computation, the system allows light itself to carry out the operation.
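
A minimal numerical sketch of the idea: in a real device the weight matrix would be encoded in interferometer settings or optical attenuations, so the NumPy multiplication below stands in for the physics rather than modelling any particular chip.

```python
import numpy as np

# Idealised sketch: a photonic mesh configured to realise weight matrix W
# multiplies an input light field x "for free" as the light propagates.
# Here NumPy stands in for the optics; W and x are made-up example values.
W = np.array([[0.2, 0.7, 0.1],
              [0.5, 0.1, 0.4]])        # weights the optics would encode
x = np.array([1.0, 0.5, 0.25])         # input signal amplitudes

y = W @ x                              # the matrix-vector product the light performs
print(y)                               # -> [0.575 0.65]
```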

Laboratory demonstrations reported through 2025 and 2026 suggest that photonic AI systems can reduce energy consumption by up to 94 per cent compared with traditional GPU-based systems, while also achieving extremely low latency.

A photonic processor does not struggle against physical laws - it leverages them. Computation becomes an inherent property of light’s behaviour.

Key differences between photonic and electronic systems include:

* Signal medium: Photons instead of electrons

* Heat generation: Minimal in photonics, significant in electronics

* Parallelism: Multiple data streams via wavelength multiplexing

* Bandwidth: Terahertz-scale for photonics versus gigahertz for electronics

* AI efficiency: Native compatibility with neural network operations

The brain reimagined: neuromorphic computing

While photonics draws inspiration from physics, neuromorphic computing looks to biology - specifically, the human brain. The brain performs complex tasks such as pattern recognition and language processing using only about 20 watts of power. In contrast, modern AI systems require hundreds of kilowatts to achieve similar outcomes.

Neuromorphic systems replicate how the brain processes information. Instead of relying on a central clock, they operate asynchronously. Artificial neurons activate only when necessary, based on incoming signals. Information is encoded in the timing and frequency of these signals rather than simple binary states.
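
A common textbook cartoon of such a neuron is the leaky integrate-and-fire model, sketched below with arbitrary illustrative parameters: the neuron quietly accumulates (and slowly leaks) its input, and emits a spike only when a threshold is crossed.

```python
import numpy as np

# Leaky integrate-and-fire neuron: a cartoon of event-driven, spike-based
# computation. All parameters and inputs are arbitrary illustrative values.
leak, threshold, potential = 0.9, 1.0, 0.0
inputs = np.array([0.0, 0.3, 0.0, 0.6, 0.7, 0.0, 0.5, 0.9])

for t, current in enumerate(inputs):
    potential = leak * potential + current   # integrate input, with leakage
    if potential >= threshold:               # fire only when the threshold is crossed
        print(f"t={t}: spike")
        potential = 0.0                      # reset after the spike
# The neuron stays silent most of the time - it "computes" only on events.
```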

This approach offers significant advantages. Computation occurs locally, reducing the need for energy-intensive data transfer. Memory and processing are integrated, eliminating traditional bottlenecks.

Modern neuromorphic chips demonstrate remarkable efficiency, particularly in tasks such as real-time sensory processing and anomaly detection. They consume orders of magnitude less energy than conventional processors and respond instantly to incoming data.

Recent developments have further advanced this field through the integration of photonics. Neuromorphic photonic systems combine brain-inspired architecture with light-based communication, achieving both high efficiency and exceptional speed. Experimental systems now process billions of synaptic events per second, with the ability to learn directly on-chip.

By 2026, neuromorphic computing is already finding practical applications:

* Medical diagnostics at the point of care

* Real-time agricultural monitoring via drones

* Cybersecurity anomaly detection

* Industrial predictive maintenance

* Satellite-based edge computing

These systems excel where traditional architectures struggle: in low-power, real-time and decentralised environments.

Quantum information: the unconventional computer

While photonic and neuromorphic systems extend classical computing, quantum computing introduces an entirely new framework. Classical bits represent either 0 or 1. Quantum bits, or qubits, can exist in multiple states simultaneously through superposition. Furthermore, qubits can be entangled, meaning their states are interconnected regardless of distance.
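
Both ideas can be seen in miniature in a two-qubit state vector. The sketch below numerically simulates the textbook circuit - a Hadamard gate followed by a CNOT - that produces a maximally entangled Bell state; it is a toy simulation, not a model of physical qubit hardware.

```python
import numpy as np

# Toy state-vector simulation of two qubits (a textbook exercise, not hardware).
ket00 = np.array([1, 0, 0, 0])                  # both qubits start in |0>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard: creates superposition
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                 # flips qubit 2 if qubit 1 is |1>

state = CNOT @ np.kron(H, I) @ ket00            # Hadamard on qubit 1, then CNOT
print(np.round(state, 3))
# -> [0.707 0.    0.    0.707]: equal amplitudes for |00> and |11>, none for
#    |01> or |10>, so measuring one qubit fixes the outcome of the other.
```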

This allows quantum computers to work through vast solution spaces in ways classical machines cannot, offering potential breakthroughs in fields such as cryptography, optimisation and scientific simulation - although extracting a useful answer still depends on carefully designed quantum algorithms.

However, practical implementation remains challenging. Qubits are highly sensitive to environmental interference, a problem known as decoherence. Maintaining stable quantum states requires extreme conditions, often near absolute zero. Current systems, known as Noisy Intermediate-Scale Quantum (NISQ) devices, are limited in scale and reliability.

Despite these challenges, progress has accelerated significantly. Advances in error correction have improved qubit stability, with fidelity levels surpassing critical thresholds for practical use. The outlook has shifted: large-scale, fault-tolerant quantum computing is no longer a distant possibility but an approaching reality.

Quantum computing is particularly suited to specific problem domains:

* Cryptography: Potential to break traditional encryption

* Optimisation: Applications in logistics and finance

* Simulation: Modelling molecular and physical systems

* Machine learning: Enhancements in specialised algorithms

Importantly, quantum computing does not replace classical systems. Instead, it complements them by solving problems that are otherwise intractable.

Quantum security: a new defence layer

The rise of quantum computing poses a direct threat to current cryptographic systems. Widely used encryption methods such as RSA rely on mathematical problems, most famously the factoring of large numbers, that quantum algorithms such as Shor's algorithm could solve efficiently.

To address this, two approaches are emerging. First, post-quantum cryptography aims to develop classical encryption methods resistant to quantum attacks. Second, quantum key distribution (QKD) offers a fundamentally different solution.

QKD uses the quantum properties of photons to generate secure encryption keys. Any attempt to intercept these keys alters their state, making eavesdropping detectable. Real-world implementations have already demonstrated secure communication over hundreds of kilometres, with satellite-based systems extending this capability globally.
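
The textbook protocol behind QKD is BB84. The heavily simplified sketch below shows only its "sifting" step, in which sender and receiver keep the bits where their randomly chosen bases happen to agree; real QKD involves actual photons, error estimation and privacy amplification, none of which is modelled here.

```python
import random

# Heavily simplified BB84 sketch: random bits and bases for the sender,
# random measurement bases for the receiver, then basis "sifting".
n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]   # two polarisation bases
bob_bases   = [random.choice("+x") for _ in range(n)]

# When bases match, Bob reads Alice's bit correctly and both keep it for the key.
# (An eavesdropper measuring in the wrong basis would disturb those photons,
# which is what makes interception detectable in the real protocol.)
sifted_key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]

print("Alice bases:", "".join(alice_bases))
print("Bob bases:  ", "".join(bob_bases))
print("Sifted key: ", sifted_key)   # roughly half the bits survive on average
```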

This represents a shift from computational security to physical security - where protection is guaranteed by the laws of physics themselves.

Convergence - where technologies meet

The most transformative developments are not occurring within individual technologies, but at their intersections. Photonics, neuromorphic systems and quantum computing are increasingly being integrated into hybrid architectures.

Neuromorphic photonics combines energy-efficient neural processing with high-speed optical communication. Photonic quantum interconnects link separate quantum processors into larger systems. Quantum-secured photonic networks ensure that the infrastructure supporting these technologies remains secure.

These advancements are no longer theoretical. Prototypes and pilot deployments are already underway, with commercial applications expected within the next few years. The pace of development is accelerating, driven by both technological readiness and urgent economic pressures.

Challenges and the path forward

Despite their promise, these technologies face significant hurdles. Photonic systems struggle with certain programmable operations. Neuromorphic hardware lacks general-purpose flexibility. Quantum computing requires further scaling and infrastructure development.

These limitations do not halt progress - they define its timeline. The transition to post-silicon computing is not a question of possibility, but of readiness.

Organisations must begin preparing now:

* Assess vulnerabilities to quantum threats

* Explore photonic acceleration for AI workloads

* Pilot neuromorphic solutions for edge computing

* Invest in specialised technical expertise

* Engage with emerging global standards

Conclusion: the second chapter

Technological revolutions rarely arrive suddenly. They emerge gradually, at the intersection of necessity and innovation. We are now in that transitional phase.

Silicon continues to dominate, but its limitations are clear. Meanwhile, new paradigms are proving their capability in real-world systems. The future of computing will not be defined by a single technology, but by the integration of many.

The transistor was not the end - it was the beginning. The next chapter is already unfolding in light, in quantum states and in systems that learn and adapt in real time.

The silicon age was extraordinary. What follows may be even more so.
