An Electronics Revolution

Hi Future Fans,

I’ve been tracking some pretty exciting advancements and future promise in the area of electronics & computing, and thought it would be interesting to write a piece on this. It is a bit of a dense topic and is rather fascinating (in my opinion 😉). I hope you enjoy …

I’ve discussed a few times that progress in many fields is driven by the combination of increases in computing power (think Moore’s Law), the development of new methods (e.g. new computing architectures), and advances in related fields (e.g. AI impacting many other fields). These drivers can multiply together to push progress much faster than any one driver could alone.

This post will focus on the second of these three elements (new methods), as this driver is picking up pace and becoming more central to our progress over time.

Firstly, let’s dive into some context by looking at this Wired article about dedicated processors for Artificial Intelligence. The designer of Graphcore’s AI chip claims improvements (over time) of perhaps 100x in AI processing versus today’s less-tailored, more generalised processors.

https://www.wired.co.uk/article/graphcore-ai-ipu-chip-nigel-toon

For any specific computing task, there’s potentially a multiple-order-of-magnitude benefit in moving from a general-purpose processor (as found in most of today’s PCs) to a specifically tailored processor (known as an Application-Specific Integrated Circuit, or ‘ASIC’). Most consumers won’t notice it, but we benefit from this every day; as a simple example, our mobile phones contain specially designed processors for playing compressed music and video. If our phones used a general-purpose processor for this, they would either need huge batteries or run out of power very quickly indeed.
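
To make that power argument concrete, here’s a back-of-envelope sketch in Python. Every number below is an illustrative assumption of my own (not a measured figure), and the ~100x gap between software and hardware decoding is simply assumed for the arithmetic:

```python
# Back-of-envelope sketch with illustrative, assumed numbers (not measured
# figures) showing why a dedicated decoder chip matters for battery life.

BATTERY_WH = 10.0      # assumed phone battery capacity, in watt-hours
CPU_DECODE_W = 1.0     # assumed power draw decoding media in software on a CPU
ASIC_DECODE_W = 0.01   # assumed power draw of a dedicated hardware decoder

def playback_hours(battery_wh: float, draw_w: float) -> float:
    """Hours of continuous playback if decoding were the only load."""
    return battery_wh / draw_w

print(f"CPU decode:  {playback_hours(BATTERY_WH, CPU_DECODE_W):,.0f} hours")
print(f"ASIC decode: {playback_hours(BATTERY_WH, ASIC_DECODE_W):,.0f} hours")
```

Even with rough assumptions the gap is stark: dedicated silicon turns hours of playback into weeks.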

Bitcoin experienced the same journey: most coins were originally created (mined) on general-purpose CPUs, and are now commonly mined on dedicated application-specific processors (ASICs) that are (conservatively) 10,000x more efficient at this cryptographic task. Computer graphics took a similar path, moving from the general-purpose CPU doing the work to dedicated ‘Graphics Processing Units’ (GPUs) that deliver vastly more realistic graphics. This is a very well-trodden path that hasn’t fully hit AI yet. Of course, Graphcore aren’t the only company working on dedicated AI circuitry; notably, Google has been designing dedicated processors for machine learning (the Google ‘Tensor Processing Unit’, or ‘TPU’).
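
To put that 10,000x figure in perspective, here’s a rough hashes-per-joule comparison. The hardware numbers below are illustrative assumptions of my own (real figures vary enormously by device and generation), but they show how the orders of magnitude stack up:

```python
# Rough hashes-per-joule comparison with illustrative, assumed figures
# (real numbers vary enormously by hardware generation).

CPU_HASHES_PER_SEC = 20e6    # assumed: a CPU hashing SHA-256 at ~20 MH/s
CPU_WATTS = 100.0            # assumed CPU power draw
ASIC_HASHES_PER_SEC = 14e12  # assumed: a mining ASIC at ~14 TH/s
ASIC_WATTS = 1400.0          # assumed ASIC power draw

cpu_eff = CPU_HASHES_PER_SEC / CPU_WATTS     # hashes per joule
asic_eff = ASIC_HASHES_PER_SEC / ASIC_WATTS  # hashes per joule

print(f"CPU : {cpu_eff:,.0f} hashes/joule")
print(f"ASIC: {asic_eff:,.0f} hashes/joule")
print(f"ASIC advantage: ~{asic_eff / cpu_eff:,.0f}x")
```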

Moving away from current and historical context, we may be heading for a grand new era of boutique specialised computing and, indeed, a fundamental revolution in electronics.

DARPA announced the ‘Electronics Resurgence Initiative’ (ERI) about a year ago, and some of the more recent announcements prompted this futurist.tech post. This ERI program wraps together several significant investment streams that are designed to ensure ‘far reaching improvements in electronics performance beyond the limits of traditional scaling’.

https://www.darpa.mil/work-with-us/electronics-resurgence-initiative

The program combines multiple elements, which I’ll summarise here with a little commentary; some (more recent) further-reading links appear below.

1. The ‘Architectures’ area seeks to create computing systems that can re-configure themselves in real time, allowing them to adapt very efficiently to new types of work. This potentially provides ASIC-like efficiency from general-purpose software and devices, breaking down one of computing’s core historical trade-offs: efficiency versus generalisability (see the first sketch after this list).

2. The ‘Design’ area will dramatically reduce the cost and complexity of new chip design. This promises to usher in a new era of specialist, super-efficient chips for all kinds of specialist tasks. For me, this also dovetails nicely with the ‘Internet of Things’, whereby all kinds of objects around us will contain sensors, computing capability, and connectivity to the world. These IoT systems will ideally sip tiny amounts of power, which suggests application-specific circuitry could benefit all kinds of unique devices (see the second sketch after this list).

3. The ‘Materials & Integration’ area is possibly the most exciting of all, aiming for nothing less than to completely change the way electronics and computing are done. The reality is that the modern electronics industry still relies on the concepts of the transistor (circa the late 1940s) and the Integrated Circuit (circa the late 1950s to early 1960s). There is a huge opportunity to leverage modern capabilities and re-think electronics and computation from the ground up.
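
The hardware details of the ‘Architectures’ work are well beyond a blog post, but here’s a loose software analogy (entirely my own illustrative sketch, nothing to do with DARPA’s actual designs) of the core idea: detect the kind of work arriving and swap in a routine tuned for it, much as a reconfigurable chip would rewire its fabric.

```python
# Loose software analogy: pick a specialised implementation at run time.

from typing import Callable, Dict, List

def dense_matmul_config(data: List) -> str:
    # Stand-in for a configuration tuned for AI-style linear algebra.
    return f"matmul-optimised path handled {len(data)} items"

def stream_filter_config(data: List) -> str:
    # Stand-in for a configuration tuned for signal processing.
    return f"filter-optimised path handled {len(data)} items"

CONFIGS: Dict[str, Callable] = {
    "ai": dense_matmul_config,
    "dsp": stream_filter_config,
}

def run(workload_type: str, data: List) -> str:
    # "Reconfigure" by selecting the specialised implementation on the fly.
    config = CONFIGS.get(workload_type)
    if config is None:
        raise ValueError(f"no configuration for workload: {workload_type}")
    return config(data)

print(run("ai", [1, 2, 3]))   # matmul-optimised path handled 3 items
print(run("dsp", [4, 5]))     # filter-optimised path handled 2 items
```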
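
And to show what ‘sipping tiny amounts of power’ means in practice for IoT, here’s one more back-of-envelope sketch (again, all the numbers are assumptions of mine) for a hypothetical coin-cell sensor that spends almost all of its time asleep:

```python
# Back-of-envelope sketch of 'sipping' power, using illustrative assumed
# figures for a hypothetical battery-powered sensor.

BATTERY_MAH = 220.0  # assumed coin-cell capacity, milliamp-hours
ACTIVE_MA = 10.0     # assumed current while sensing/transmitting
SLEEP_MA = 0.005     # assumed deep-sleep current (5 microamps)
DUTY_CYCLE = 0.001   # assumed: awake 0.1% of the time

avg_ma = ACTIVE_MA * DUTY_CYCLE + SLEEP_MA * (1 - DUTY_CYCLE)
days = BATTERY_MAH / avg_ma / 24
print(f"Average draw: {avg_ma:.4f} mA -> roughly {days:,.0f} days on one cell")
```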

I, for one, am truly excited to see how this plays out over the coming years and decades. The impacts of these initiatives could be truly game changing.

Further reading:

https://www.darpa.mil/news-events/2017-06-01

https://www.darpa.mil/news-events/2018-07-24a
