There’s a common perception of how innovation happens: someone locked in a dark room has a lightbulb moment and, with a click of the fingers, a world-changing idea is born.
There is truth in this. Without people of immense passion, ambition and energy, we wouldn’t have much of the technology we take for granted today. But it doesn’t tell the whole story.
Many of the world’s most innovative technologies have come about thanks to great minds combined with institutional support. In fact, economist and author Mariana Mazzucato has argued that, of the 100 most important innovations from 1971 to 2006 (as identified by R&D Magazine), almost 90% depended heavily on government research support.
Although an inherent part of nearly every smart or connected device, Global Positioning System (GPS) technology wasn’t developed by a smartphone manufacturer like Apple or Samsung – it’s a product of the US Department of Defense.
In 1964, Roger Easton, a scientist at the US Naval Research Laboratory, began experimenting with tracking satellites to determine their orbits. In 1974, he was awarded a patent for a system that became GPS.
It was subsequently used by the US military to gain a more accurate picture of the global position of its assets. In 1998, however, US Vice President Al Gore announced a plan for GPS satellites to transmit two additional signals for civilian use (for example, ensuring aircraft safety), paving the way for the technology’s inclusion in any number of devices.
Creator of the first commercially successful electric sports car and some of the most promising battery technology in the energy industry, Tesla is arguably one of the most exciting companies of the past few decades. Much of this is owed to the ground-breaking vision of its CEO, Elon Musk. However, it is equally indebted to early government support.
Tesla received a $465 million loan from the US Department of Energy in 2009, while two of Musk’s other companies, SolarCity and SpaceX, have also received substantial support. In total, Musk’s three companies have received $4.9 billion in US government support.
Today jet engines carry millions of people around the world every day, but before the Second World War they existed only in labs. While training for the Royal Air Force (RAF) in the late 1920s, Coventry-born engineer Frank Whittle outlined in a thesis the idea of using a gas turbine to propel a jet.
At the time he struggled to drum up any interest, with the UK’s Air Ministry believing the concept was impractical. Nevertheless, Whittle went on to found his company, Power Jets, and proved his engine’s design in a lab in 1937.
With the political landscape in Europe becoming increasingly tense, the Air Ministry soon realised the potential importance of jet engines. It signed a contract in 1939 for Power Jets to develop an engine for aircraft manufacturer Gloster.
One of the most important innovations of the last 50 years, the internet has its roots in the US military. In the 1960s, the Department of Defense developed a communication system that could directly link computers, called the ARPANET (Advanced Research Projects Agency Network).
This led to further development of computer-connecting systems, and in 1989, British computer scientist Tim Berners-Lee, who was working at CERN at the time, proposed the World Wide Web, leading to the increasingly connected world we know today.
In 1959, Texas Instruments, one of the USA’s leading electronics companies, announced the launch of the integrated circuit – a small, self-contained circuit that didn’t require additional, disparate parts to function.
The potential of this early microchip was huge, but it was expensive to produce, which limited its growth. When the government realised it could use microchips to improve its missile and rocket guidance systems, the resulting demand enabled mass production of the new technology and sparked its real growth.
The black box
In the early 1950s, Australia’s Aeronautical Research Laboratories (ARL) was trying to figure out why British Comet aircraft were crashing. Chemist David Warren was researching fuels for the newly arrived jet engines when he realised what ARL needed was not speculation about crashing Comets, but more data on what was happening inside the planes.
Development of the black box was gradual, as it evolved from a simple tape recording of pilots’ conversations. The concept was even passed over by the Royal Australian Air Force.
It was a visit by Sir Robert Hardingham, the Secretary of the British Air Registration Board, in 1958 that eventually led to Warren presenting the idea at the UK’s Royal Aircraft Establishment. Production of what we now know as a black box soon began in the UK.
Modern touchscreen technology was initially developed by the University of Kentucky, but it wasn’t until 1996, when the NSF and CIA began funding research at the University of Delaware, that the technology truly took off.
In 2001 the first touchscreen tablet was introduced by a company called FingerWorks, which was started by the University of Delaware research team. In 2005, Apple purchased the company and adapted its technology to create the iPhone screen, which led to the technology’s ubiquity across many of the ‘smart devices’ we use today.
Renewable and low carbon energy
Transforming an entire industry takes time. But when the matter is as urgent as decarbonising power generation on a national or global scale, government backing of the technologies making this possible is crucial. The electric revolution, or ‘electricity era’, will help heating, manufacturing, transport and other industries lower their own carbon intensity.
The UK is aiming to drive adoption of renewable technologies and nuclear power through its Contracts for Difference (CfD) scheme. Under it, generators are paid the difference between the ‘strike price’ – the cost of electricity with investment in the low-carbon technology factored in – and the market price, stabilising revenues for low-carbon suppliers.
An earlier scheme, the Renewables Obligation (RO), now closed to new projects in favour of CfDs, involves power generators receiving Renewable Obligation Certificates (ROCs) for every megawatt-hour (MWh) of electricity they generate. If they fail to meet their obligation to produce renewable power, they pay a penalty. CfDs allow for longer-term certainty, both for suppliers and consumers of power.
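The CfD settlement arithmetic described above can be sketched in a few lines. This is a minimal illustration of the strike-price mechanism, not the scheme’s actual settlement rules, and the figures used are made-up examples rather than real strike or market prices:

```python
def cfd_payment(strike_price, market_price, output_mwh):
    """Top-up (or clawback) owed to a generator under a CfD.

    strike_price and market_price are in GBP per MWh; output_mwh is the
    generator's metered output. A positive result is paid to the
    generator; a negative result means the generator pays back the
    difference when the market price rises above the strike price.
    """
    return (strike_price - market_price) * output_mwh

# Illustrative figures only:
# market price below the strike price -> generator receives a top-up
print(cfd_payment(strike_price=92.50, market_price=50.00, output_mwh=1000))   # 42500.0
# market price above the strike price -> generator pays back the difference
print(cfd_payment(strike_price=92.50, market_price=110.00, output_mwh=1000))  # -17500.0
```

Either way, the generator’s effective revenue per MWh is fixed at the strike price, which is what gives suppliers (and their lenders) long-term certainty.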
On a smaller scale, feed-in tariffs support the adoption of renewable technologies by paying property owners for any electricity their installations generate in excess of their own usage.
It’s not easy to get a new technology up and running – or to adapt and improve it to meet a challenge such as man-made climate change. The infrastructure investment, the supplier network and the construction of manufacturing facilities all take time and money. It often means companies at the vanguard of cutting-edge industries need to spend years investing before they and their shareholders stand to see any returns.
CfDs encourage businesses to secure their shareholders’ approval to commit capital over periods of a decade or more. Such investment has enabled wind, nuclear and biomass projects to get off the ground more quickly, whereas without a CfD, the upfront costs may have been prohibitive. Where flexible, low carbon biomass power generation at Drax is concerned, it’s providing certainty up to 2027, after which we hope to be able to operate subsidy-free.
The examples above show how critical early government support has been in spurring commercial investment – and why it has played such a large part in developing the industries and technologies we now see as innovative.
This process of innovation may be more than a lightbulb moment, but it does help keep the lights on.