Whether we first heard about it in school, over dinner, or on TV, most of us are familiar with the rapid growth of technology in today’s society. Some claim that this growth is so extreme that technology doubles every year. Hearing this factoid often makes people wonder: How fast does technology really grow?
Technology builds on itself and is, in many ways, growing exponentially. Since 1965, the number of transistors on a computer chip has doubled roughly every year and a half; in an electronic sense, technology doubles about every 18 to 24 months.
This article discusses some current and future technologies and what may develop in their place. It also looks at modern transistors, whose smallest structures on the most advanced microchips are now measured in nanometers, through the lens of Moore’s Law and how it applies to technological growth today. Read on to learn more about the rate at which technology is growing.
How Fast Does Technology Grow?
When it comes to explaining the rate of technological growth in today’s society, many experts refer to Moore’s Law. Although it is called a law, it is not a scientific or legal rule. Instead, it is a principle used in the computing industry to describe the growth of technology.
Moore’s Law
Moore’s Law originated in the 1960s and is named after Intel co-founder Gordon Moore. Intel was an essential part of the development of computers in the late 1960s. In fact, the company is known to this day for the invention of the microprocessor; over 50 years later, we still use microprocessors in our computers and other electronic devices!
Gordon Moore’s involvement in developing the microprocessor led to some interesting observations. Over a few years, Moore realized that the number of transistors that could fit on a microchip was increasing at a remarkable rate. This observation led to the theory that technology doubles after a certain amount of time.
When the theory first emerged in the mid-1960s, technology was thought to double every year. However, Moore revisited his forecast in 1975 and predicted that technology would double every two years. This prediction marked the beginning of Moore’s Law as we know it today.
What Qualifies As Technology?
Technology is anything from a Neolithic axe head to artificial intelligence, depending on your time and place in history. As Arthur C. Clarke put it, “Any sufficiently advanced technology is indistinguishable from magic.”
Current and Upcoming Technologies Include:
- Artificial Intelligence (AI)
- Augmented Reality (AR)
- Big Data and Analytics
- DNA Data Storage
- Genetics, Genetic Engineering
- Internet of Things (IoT)
- Metaverse
- Nanotechnology
- Quantum Computing
- Robotics
- Virtual Reality (VR)
- 3D Integrated Circuits
- 3D Printing
- 5G
- 6G
In terms of Moore’s Law, technology refers to computers and their capabilities. To fully understand the growth of technology, it’s essential first to have some basic knowledge of how computers work.
At the most fundamental level, transistors and integrated circuits are what allow your computer to complete calculations and complex tasks.
What Are Transistors?
A transistor or nano transistor is an incredibly tiny binary device capable of switching between two modes or states. Believe it or not, these two simple states make it possible for computers to complete simple calculations and, eventually, complex tasks.
The most used semiconductor device globally is the metal-oxide-semiconductor field-effect transistor (MOSFET), which Mohamed M. Atalla and Dawon Kahng invented at Bell Labs in 1959.
Individual nano transistors today are unbelievably small. For reference, one nanometer is a billionth of a meter, narrower than a strand of DNA. The most advanced chips are now built at process nodes labeled as small as 2 nanometers.
Transistors work together to make an integrated circuit, or what is more commonly known as a microchip. The more transistors a computer has on a chip, the faster it can complete tasks and calculations. With all of the individual transistors working together, computers can complete more complex tasks.
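To make that idea concrete, here is a minimal sketch, using plain Python as a stand-in for hardware, of how two-state switches combine into arithmetic: a half adder adds two single bits using only an XOR and an AND operation, the same kinds of logic that transistors implement physically.

```python
# Minimal illustration: two-state (0 or 1) values plus simple logic gates
# are enough to do arithmetic. A half adder adds two bits, producing a
# sum bit and a carry bit, just like the gate networks built from
# transistors inside a chip.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two single bits; return (sum_bit, carry_bit)."""
    sum_bit = a ^ b  # XOR gate: 1 when exactly one input is 1
    carry = a & b    # AND gate: 1 only when both inputs are 1
    return sum_bit, carry

# All four possible inputs; 1 + 1 yields sum 0 with carry 1 (binary 10).
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

Chain enough of these simple gates together, billions of them on one chip, and you get the complex tasks modern processors handle.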
What is an Integrated Circuit?
Integrated circuits, or microchips, are at the heart of almost all current electronics and now rely on nanoscale components. An integrated circuit (IC) is a set of electronic circuits on a small substrate of semiconductor material such as silicon. Transistors act as miniature electrical switches that turn an electrical current on or off within the IC.
Integrated circuits can hold billions of nano transistors, resistors, and capacitors. The integrated circuit comes in many forms like a System on a Chip (SoC), microprocessors, microcontrollers, logic gates, motor controllers, and voltage regulators. They are in everything from your smartwatch and headphones to supercomputers.
Integrated circuits can perform calculations and store data using digital or analog technology. Digital integrated circuits use binary code with values of 1 and 0. Digital integrated circuits are what run all modern computers and most consumer electronics.
Moving Transistors, Integrated Circuits, and Moore’s Law Forward
Extreme ultraviolet lithography (EUV) is a testament to how Moore’s Law keeps moving forward and technology keeps advancing. If you look at what scientists and engineers are doing with tech like this, we would say Moore’s Law is far from dead.
A Dutch company named ASML builds school-bus-sized EUV machines that pattern billions of transistors onto the most advanced microchips at the 7, 5, and 3 nm nodes. Microchip companies use novel, ever-smaller transistor designs and chip architectures to make their chips more energy-efficient, more powerful, and faster.
Thanks to integrated circuit microchips, computing power has increased roughly a trillion-fold from the mid-1950s to today. Microchips run the SoC on-device artificial intelligence used in smartphones and robotics, along with many of the new and upcoming technologies listed above.
Please see some of our other interesting articles on “What Is DNA Data Storage?” and “Smartphone AI: Helpful Artificial Intelligence For The Beginner.”
How Do We Measure Technological Growth?
With a basic understanding of transistors and microchips, it becomes much easier to understand how technological growth is measured.
Today, such growth is calculated according to how many operations a computer can carry out in a second. Specifically, we can track this progress by measuring how many floating-point operations a computer can complete in one second.
What Are Floating-Point Operations?
Floating-point operations (FLOPs) are the most commonly used measure of technological growth. Scientists track this growth by measuring computer performance in floating-point operations per second (FLOPS).
Essentially, FLOPS represents the number of useful calculations a computer can complete each second. By measuring this figure regularly, we can see more clearly how computers’ performance changes over time.
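As a rough illustration (a sketch of the idea, not a formal benchmark like LINPACK), you can estimate a machine’s floating-point throughput by timing a known amount of work. Multiplying two n-by-n matrices takes roughly 2n^3 floating-point operations, so dividing that count by the elapsed time gives a ballpark FLOPS figure:

```python
# Ballpark FLOPS estimate: time a matrix multiply whose operation count we know.
import time
import numpy as np

n = 2000
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
_ = a @ b                      # the floating-point work being timed
elapsed = time.perf_counter() - start

flop_count = 2 * n ** 3        # ~2n^3 operations for an n x n matrix multiply
print(f"~{flop_count / elapsed / 1e9:.1f} GFLOPS on this machine")
```

Formal supercomputer rankings use far more careful benchmarks, but the principle is the same: count the operations and divide by the seconds.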
What Causes Technological Growth?
At any given moment, engineers are working to advance technology as we know it. The exponential growth we have seen in computers and technology results from a variety of factors.
Engineers and developers constantly discover ways to utilize their time and resources more efficiently. As they find new ways to increase productivity, tech developers also make discoveries that make technology continually more efficient. As a result, we see growth in overall technology.
At the same time, scientists are constantly researching to understand modern technology and its capabilities better. As a result, new scientific discoveries surface. This type of productive collaboration also leads to growth in technology.
Algorithms and Software Improvements
In today’s world, algorithm and software improvements will probably be the innovations that take us to the next level of computing capacity. We are already using AI-enhanced computational techniques, better algorithms, and machine-learning-powered neural nets to push forward.
AI is expected to be necessary for a hundredfold growth in technology from where we are today. With the expansion of deep learning, neural network architectures are moving into areas like self-supervised learning.
If AI can learn without labeled data and come closer to how the human mind works on its own, without human intervention, then it is definitely on the path to continued technological expansion and growth.
Technological Growth in History
In 1971, a state-of-the-art computer chip could hold 2,308 transistors. From 1975 to 2009, transistor counts doubled roughly every 18 to 24 months. By consistently progressing at this rate, by 2009 engineers could fit up to 2.31 billion transistors on a computer chip.
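As a quick back-of-the-envelope check (our own arithmetic using the figures above, not something from Moore’s original paper), those two data points imply roughly twenty doublings over that span, close to the two-year cadence Moore predicted:

```python
# How long did each doubling take, given ~2,308 transistors in 1971
# and ~2.31 billion in 2009?
import math

year0, count0 = 1971, 2_308
year1, count1 = 2009, 2_310_000_000

doublings = math.log2(count1 / count0)                  # about 20 doublings
months_per_doubling = (year1 - year0) * 12 / doublings  # about 23 months
print(f"{doublings:.1f} doublings, roughly {months_per_doubling:.0f} months each")
```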
On its own, Moore’s forecast has shaped the development of technology over the past 50 years. As of October 2021, the Apple M1 Max uses 5 nm technology to fit 57 billion transistors on a single chip. This exponential growth is an excellent display of Moore’s Law in action.
Depending on the source, TSMC’s first 3-nanometer chips will arrive in late 2022 or early 2023. IBM has already announced its 2-nanometer chips, so who knows how many transistors will ultimately fit on an IC.
How Quickly Will Technology Grow in the Future?
Technological growth has remained relatively consistent over the last five decades or so. However, recent developments in the modern world are making some scientists question whether this progress will continue at the same rate it has moved since the 1960s.
Is Moore’s Law Still True?
When Gordon Moore first formed his theory about technological growth, he predicted it would hold for ten years or more. Moore was correct in his prediction; the forecast has consistently represented technological progress over nearly fifty years!
As Jim Keller, a believer in Moore’s Law, discusses in this video, there are millions of people working on thousands of technologies to move it forward. Many things contribute to these systems: transistor architecture, materials, components, power dissipation, and operating voltage, to name a few.
Is Moore’s Law Ending?
Moore’s Law has held for 50 years, so why would it suddenly come to an end within the parameters and constraints of physics as we currently see them? In short, computer engineers have not run out of ways to make computer chips smaller while simultaneously increasing the number of transistors they can hold.
As we have already seen in recent years, and as discussed with EUV technology above, scientists and engineers have found ways around transistor and IC limitations and continue to do so.
With so many variants of technology, and with AI, algorithms, and software picking up any slack left by future hardware, it is unlikely that the growth Moore’s Law describes will end. It may no longer be Moore’s Law in the strict sense if transistors are the only measure of progress we track, but technology will continue to move forward.
What Will Happen Next?
Engineers are not at a loss for ways to keep innovating on the already remarkable computer chip, and plenty of progress is still possible. Many technology experts have made predictions about how engineers will achieve it. Some predict that instead of focusing on transistors, specialized engineers will devote more of their attention to increasing the energy efficiency and lifespan of the technology we use today.
We already have computer algorithms that improve automatically through experience and data. Soon, AI may design a device or chip beyond what human engineers could produce alone. We are already on our way.
We also expect arenas like quantum computing to intertwine with classical computing and machine learning, each facilitating the other’s growth. Better testing and metrics will be developed alongside the systems engineering work of building a quantum computer.
Quantum Computing
In the early 1980s, pioneers such as Yuri Manin, Paul Benioff, and Richard Feynman introduced the field of quantum computing. Feynman proposed building universal quantum computers to simulate other quantum systems.
While there is disagreement over the right metrics for measuring quantum computing systems, Google believes it can build a functional, error-corrected quantum computer within the decade. IBM is focused on quantum advantage, the point at which quantum computers outperform classical ones on problems companies actually care about.
IBM’s largest quantum computer to date, with 127 quantum bits (qubits), was introduced on November 15, 2021. In a Moore’s Law of qubits rather than transistors, IBM has met the first milestone on its roadmap: 127 qubits now, 433 qubits planned for 2022, and 1,121 qubits by 2023.
Comparison of a Classical Computer and a Quantum Computer
The difference between a classical computer and a quantum computer is space and scaling, besides the fact that quantum computers currently look like chandeliers. In theory, one hundred qubits on a single quantum processor could outperform all the supercomputers on the planet combined, and three hundred qubits can represent more states than there are atoms in the universe.
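A quick sanity check of that scaling claim (our own arithmetic, using the commonly cited estimate of about 10^80 atoms in the observable universe): n qubits can represent 2^n basis states, and 2^300 comfortably exceeds 10^80.

```python
# n qubits span 2**n basis states; compare 300 qubits against the commonly
# cited ~10**80 atoms in the observable universe.
states_300_qubits = 2 ** 300
atoms_in_universe = 10 ** 80   # order-of-magnitude estimate

print(f"2^300 is about {states_300_qubits:.2e}")      # roughly 2.04e+90
print(states_300_qubits > atoms_in_universe)          # True
```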
Quantum computers are still very much experimental devices, though they are being made available for research, with some practical access for businesses and the public. Major corporations are using them for specific tasks they are better at than classical computers. They cannot yet do what classical computers, as we understand them, do daily, but their potential reaches far beyond that of their classical counterparts.
The current goal is to turn experimental quantum machines into reliable ones built from “logical qubits,” with logical gate error rates below 1%.
Do We Know What We Are Doing In The Quantum World?
The Quantum world operates at a sub-atomic level. Our answer to the question “Do We Know What We Are Doing In The Quantum World?” is most assuredly “sort of” or “no.” We do not entirely understand quantum physics, let alone quantum computing.
At a quantum level, entanglement describes two particles that can be separated by vast distances, even across the universe, yet affect each other the moment one of them is measured. This suggests that quantum mechanics permits seemingly instantaneous connections between distant locations, or, as Einstein called it, “spooky action at a distance.”
Quantum tunneling shows that particles can exhibit the “Hartman effect,” in which a tunneling particle appears to cross a barrier faster than light. So good luck if we flip the wrong sub-atomic quantum switch! Oh, and let’s not forget parallel and multiple universes. Are you guys staying in touch with your buddies at CERN?
Why Is Technological Growth Important?
Improved technology makes our lives more convenient, but it also has a massive impact on economics and the overall social environment of today’s world.
In terms of economics, tech companies constantly try to increase product quality and decrease prices at the same time while still making a profit. As tech prices fluctuate over time, so do the consumer market and the flow of currency.
Final Thoughts
Since the 1960s, technology has grown steadily, doubling every 18 to 24 months. Moore’s Law has shaped innovation and our views of technological growth over the last fifty years. Even if the Law does someday end, growth will continue through upcoming innovations and discoveries in science and technology.
References:
- Moore’s Law: Moore’s Law
- ASML: Moore’s Law
- How Stuff Works: How Transistors Work
- New Atlas: IBM’s new 2-nm chips have transistors smaller than a strand of DNA
- Our World in Data: Technological Progress
- Economics Discussion: Technological Progress and Economic Growth | Economics
- Science Direct: Floating-Point Operation
- Niklas Rosenberg: What Does It Mean To Have 60 Billion Transistors In A Computer Chip?
- MIT Technology Review: We’re not prepared for the end of Moore’s Law
- Tech Jury: How Fast Is Technology Advancing in 2021?
- Communications of the ACM: Can Nanosheet Transistors Keep Moore’s Law Alive? (March 2020)