Why Doesn't Anyone Understand How Quickly Artificial Intelligence Will Develop? - Alternative View

Many of us are familiar with Moore's Law, the famous principle that computing power develops along an exponential curve, doubling in value for money (that is, in speed per unit of cost) roughly every 18 months. Yet when it comes to applying Moore's Law to their own business strategies, even forward-looking thinkers have a huge AI blind spot. Even the most successful, strategically minded business people, who know their industry inside and out, fail to grasp what exponential development really means. And one technology benefits from this exponential curve more than any other: artificial intelligence.
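The 18-month doubling rule from the paragraph above can be sketched in a few lines. This is a minimal illustration of the arithmetic, not data from the article's charts; the function name is my own.

```python
# Moore's Law as stated in the text: price-performance (speed per dollar)
# doubles roughly every 18 months.
def price_performance_factor(months, doubling_period=18):
    """Multiplicative improvement in speed-per-dollar after `months`."""
    return 2 ** (months / doubling_period)

# One doubling period gives exactly a factor of 2:
one_period = price_performance_factor(18)   # 2.0

# Over a decade (120 months) the rule compounds to roughly a hundredfold:
decade = price_performance_factor(120)      # ≈ 101.6
```

The hundredfold-per-decade figure is why the log chart discussed below can use one tick per factor of 100.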

Exponential curves on paper

One of the reasons people don't understand how fast artificial intelligence is advancing is laughably simple: exponential curves look bad when we humans try to draw them on paper. For practical reasons, it is nearly impossible to fully depict the steep trajectory of an exponential curve in a small space such as a chart or slide. Visually depicting the early stages of an exponential curve is easy. But as the steeper part of the curve rapidly gains momentum, things get more complicated.

To solve this problem of limited visual space, we use a convenient mathematical trick: the logarithm. Thanks to the "logarithmic scale", we have learned how to flatten exponential curves. Unfortunately, the widespread use of logarithmic scales can also cause a kind of scientific myopia.

Chart 1.

The logarithmic scale is designed so that each tick on the vertical y-axis corresponds not to a constant increase (as on the usual linear scale) but to a multiple, for example 100. The classic Moore's Law diagram (Chart 1) uses a logarithmic scale to show the exponential improvement in the cost of computing power (measured in computations per second per dollar) over the past 120 years, from the mechanical devices of the 1900s to modern silicon-based graphics cards.
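The tick rule just described can be made concrete. In this small sketch (my own helper, assuming one tick per factor of 100 as in Chart 1), equal vertical steps correspond to equal multiplicative jumps:

```python
import math

# On a log scale, equal vertical distances mean equal *multiplicative* jumps.
# With one tick per factor of 100, a value maps to log-base-100 ticks:
def ticks_up(value, factor_per_tick=100):
    return math.log(value, factor_per_tick)

# 1, 100 and 10,000 computations/second/dollar land exactly one tick apart,
# even though the last jump is a hundred times larger than the first:
positions = [ticks_up(v) for v in (1, 100, 10_000)]  # [0.0, 1.0, 2.0]
```

This is exactly the compression that makes a century of hundredfold-per-decade improvement fit on one slide.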

Log charts have become a valuable form of shorthand for people who are aware of the visual distortion that such charts present. It is now a convenient and compact way to display any curve that grows rapidly and radically over time.


However, logarithmic charts fool the human eye.

By mathematically compressing huge numbers, logarithms make exponential growth look linear. Because they compress exponential curves into straight lines, it is easier for people to look at them and reason about the coming growth in computing power.
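The claim that a logarithm turns an exponential into a straight line can be checked directly. A toy doubling series (my own example values, not the article's data) becomes a line with constant slope once logged:

```python
import math

# A hypothetical exponential series: the value doubles at every step.
series = [2 ** t for t in range(10)]   # 1, 2, 4, ..., 512

# Taking base-2 logs turns the doubling curve into a straight line:
logged = [math.log2(v) for v in series]
steps = [b - a for a, b in zip(logged, logged[1:])]
# every step is exactly 1.0 — a constant slope, i.e. a line on log paper
```

A constant-slope line is precisely what our eyes read as "steady, linear progress", which is the visual trap the article describes.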

Our logical brains understand the mathematics of logarithms. But our subconscious brains see the straight line and calibrate to it.

What to do? First, you need to go back to the original linear scale.

In Chart 2 below, the data follows the same exponential curve, but the vertical axis is scaled linearly. Again, the vertical axis represents the computing speed (in gigaflops) that one dollar can buy, and the horizontal axis represents time. Here, however, each tick on the vertical axis corresponds to a simple linear increase of just one gigaflop (not a factor of 100, as in Chart 1). A flop is a standard unit of computing speed, short for "floating-point operations per second".
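To make the flops unit concrete, here is a toy measurement sketch (my own illustration, not from the article). Note that a pure-Python loop mostly measures interpreter overhead, so the number it reports vastly understates what the hardware can actually do; it only demonstrates the unit's definition, operations per second:

```python
import time

# FLOPS = floating-point operations per second. A crude estimate:
# time a known number of floating-point multiplications.
def rough_gflops(n=1_000_000):
    x = 1.0
    start = time.perf_counter()
    for _ in range(n):
        x = x * 1.0000001          # one floating-point multiply per iteration
    elapsed = time.perf_counter() - start
    return n / elapsed / 1e9       # operations per second, in gigaflops
```

A real gigaflops figure for a machine would come from optimized native code, not an interpreted loop; the point here is only what the y-axis of Chart 2 is counting.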

Chart 2.

Chart 2 shows the true exponential curve that characterizes Moore's Law. Drawn this way, it is easy for our human eyes to grasp how quickly computer performance has grown over the past ten years.

But there is something wrong with the second chart. It seems as if, for most of the 20th century, the cost and performance of computers barely improved at all. This is obviously not the case.

Chart 2 shows that using a linear scale to depict Moore's Law over time can be deceiving. The past looks flat, as if there had been no progress. Worse, people mistakenly conclude that the current point in time represents a period of unique, "near-vertical" technological progress.

Linear scales can trick people into believing they are living at the height of change.

The blind spot of living in the present

Let's take another look at Chart 2. Viewed from 2018, the earlier price-performance doublings that occurred every decade for much of the 20th century look flat, almost insignificant. A person studying this chart might say: "How lucky I am to live now. I remember 2009, when I thought my new iPhone was fast. I had no idea how slow it was. It's a good thing I've reached the vertical part."

People say we have passed the "bend in the hockey stick". But no such transition point exists.

The curve looks the same in the future as it did in the past. Chart 3 below shows the exponential curve of Moore's Law on a linear scale, but this time from a 2028 vantage point. It assumes that the growth we have experienced over the past 100 years will continue for at least another ten. This chart implies that in 2028, one dollar will buy 200 gigaflops of computing power.
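The 200-gigaflops figure can be sanity-checked against the 18-month doubling rule. This is a back-of-the-envelope sketch under the article's stated assumptions (the exact values on the charts are not given, so the implied 2018 figure below is an inference, not a quoted number):

```python
# If a dollar buys ~200 gigaflops in 2028 (Chart 3) and price-performance
# doubles every 18 months, the implied 2018 figure follows by dividing out
# a decade's worth of doublings.
gflops_2028 = 200.0
doublings_per_decade = 120 / 18                          # ≈ 6.67 doublings
gflops_2018 = gflops_2028 / 2 ** doublings_per_decade    # ≈ 2 gigaflops
```

Roughly 2 gigaflops per dollar in 2018 versus 200 in 2028: a hundredfold gap, which is exactly why 2018 looks like a flat speck from the 2028 vantage point.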

Chart 3.

However, Chart 3 also presents a trap for the analyst.

Take a close look at exactly where today's computing power (2018) sits on the curve in Chart 3. From the vantage point of a person living and working in 2028, it would seem that there were practically no improvements in computing power during the entire 20th century. It looks as if the computing devices used in 2018 were barely more powerful than those used in 1950. An observer might also conclude that 2028 represents the culmination of Moore's Law, the moment when advances in computing power finally skyrocket.

Chart 3 could be redrawn every year, changing only the time span shown. The shape of the curve would be identical; only the tick labels on the vertical scale would change. Note that Charts 2 and 3 look the same except for the vertical scale. On every such graph, every past moment looks flat when viewed from the future, and every future moment looks like a sharp departure from the past. Alas, this misperception leads to flawed business strategy, at least when it comes to artificial intelligence.

What does it mean?

Exponential change is difficult for the human mind to comprehend and for the human eye to see. Exponential curves are unique in that they are mathematically self-similar at every point. This means that a constantly doubling curve has no flat parts, no especially steep parts, and none of the bends and kinks people talk about. Its shape is always the same.
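Self-similarity has a precise meaning here, and it can be verified in a few lines. In this sketch (my own function, using the article's 18-month doubling period), the growth ratio over any fixed window is identical no matter where on the curve you stand:

```python
# Self-similarity of an exponential: the growth over any fixed time window
# is the same multiplicative factor, wherever on the curve you measure it.
def f(t, doubling_period=18):
    """Price-performance at month t, doubling every `doubling_period` months."""
    return 2 ** (t / doubling_period)

window = 36  # months, i.e. exactly two doubling periods
ratios = [f(t + window) / f(t) for t in (0, 60, 120, 600)]
# every ratio is 4: two doublings, whether measured in 1900 or in 2028
```

There is no privileged "knee" in such a curve; every point sees the same factor-of-4 jump over the next three years, which is the article's core argument.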

As Moore's Law continues to operate, it is tempting to believe that this very moment is a unique stage of great change in the development of artificial intelligence (or any other technology to which Moore's Law extends). However, as long as computing power continues to follow an exponential price-performance curve, every future generation will likely look back at the past as an era of relatively little progress. And the converse will also hold: each current generation will look ten years into the future and fail to appreciate how much progress in AI still lies ahead.

Thus, for anyone planning for a future driven by the exponential growth of computing, the challenge is to overcome their own misperceptions. Keep all three charts in mind to truly appreciate the power of exponential growth. Because the past will always look flat, and the future will always look vertical.

Ilya Khel