Friday, April 7, 2023

What Era of Computing Comes Next?

By now, all of us are aware that rapid reductions in computing and storage costs, combined with rapid increases in capability, enable applications, use cases and revenue models that were not feasible in the past because those costs precluded them.


So ridesharing is possible because people have capable smartphones and mobile internet access fast enough to support the use case. Netflix and other video streaming services are possible because digital infrastructure capabilities have improved at Moore’s Law rates.


Applied artificial intelligence is among the capabilities that benefit directly from rapid processing improvements. One study finds that, “before 2010 training compute grew in line with Moore’s law, doubling roughly every 20 months.”


But “since the advent of deep learning in the early 2010s, the scaling of training compute has accelerated, doubling approximately every six months,” write researchers Jaime Sevilla, Lennart Heim, Anson Ho, Tamay Besiroglu, Marius Hobbhahn and Pablo Villalobos in the study.


source: Jaime Sevilla, Lennart Heim, Anson Ho, Tamay Besiroglu, Marius Hobbhahn and Pablo Villalobos


“Our findings seem consistent with previous work, though they indicate a more moderate scaling of training compute,” the researchers say. “In particular, we identify an 18-month doubling time between 1952 and 2010, a six-month doubling time between 2010 and 2022, and a new trend of large-scale models between late 2015 and 2022, which started two to three orders of magnitude over the previous trend and displays a 10-month doubling time.”
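To make those doubling times concrete, here is a quick back-of-the-envelope conversion (my own illustration, not part of the study) from doubling time to the implied annual growth multiplier, growth = 2^(12 / doubling_months):

# Converts a doubling time in months into the implied annual growth multiplier.
def annual_growth(doubling_months: float) -> float:
    return 2 ** (12 / doubling_months)

for era, months in [("1952-2010", 18), ("2010-2022", 6), ("large-scale models", 10)]:
    print(f"{era}: compute grows roughly {annual_growth(months):.1f}x per year")

# Prints roughly 1.6x per year for 1952-2010, 4.0x for 2010-2022
# and 2.3x for the large-scale model trend.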


Moore's Law and rapid increases in computing power, with corresponding reductions in price, matter hugely. They allow entrepreneurs to innovate by asking the question “what would my business look like if computing or bandwidth no longer were barriers?”


Does anybody doubt that near-zero pricing remains among the biggest threats in the connectivity business? And does anybody really doubt that Moore’s Law has led to substitute products for telco voice and messaging while diminishing the cost of transporting bits?


Has headline bandwidth in lead markets not increased at about the rate Moore’s Law or Nielsen’s Law predicts?


Edholm’s Law holds that top-end internet access bandwidth increases at about the same rate that Moore’s Law suggests computing power will increase.


Nielsen's Law is essentially the same as Edholm’s Law, predicting that headline speed increases about 50 percent per year.
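The two cadences line up. A 50 percent annual increase implies a doubling time of log(2) / log(1.5) ≈ 1.7 years, or roughly 21 months, which is in the same range as the roughly 20-month Moore’s Law doubling cited above.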


The point is that applied artificial intelligence has reached an inflection point. Where we once asked “what would my business look like if computing or bandwidth were essentially free?” we now must start asking “what would my business look like if artificial intelligence were available to use?”


As when those earlier questions were asked, the cost of training AI models is nowhere near “free.” But neither were computing and bandwidth when the founders of Microsoft and Netflix laid out their plans.


The most startling strategic assumption Bill Gates ever made was his belief that horrendously expensive computing hardware would eventually become so low cost that he could build his own business on software for ubiquitous devices.


How startling was the assumption? Consider that, in constant-dollar terms, the computing power of an Apple iPad 2 would have cost between US$100 million and $10 billion when Microsoft was founded in 1975.


Netflix founder Reed Hastings apparently made a similar decision. For Bill Gates, the insight that nearly free computing would become a reality meant he should build his business on the software those computers would run.


Hastings came to the same conclusion as he looked at bandwidth trends in terms of both capacity and price. At a time when dial-up modems ran at 56 kbps, he extrapolated from Moore's Law to understand where bandwidth would be in the future, not where it was “right now.”


“We took out our spreadsheets and we figured we’d get 14 megabits per second to the home by 2012, which turns out is about what we will get,” Hastings has said. “If you drag it out to 2021, we will all have a gigabit to the home.” So far, internet access speeds have increased at just about those rates.
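That extrapolation is easy to reproduce. Here is a minimal sketch, assuming a 56 kbps baseline in 1997 (the baseline year is my assumption) and Nielsen’s 50-percent-per-year growth:

# Projects headline access speed forward from an assumed 56 kbps baseline
# in 1997, growing at Nielsen's Law rate of 50 percent per year.
BASE_KBPS = 56.0
BASE_YEAR = 1997
GROWTH = 1.5  # 50 percent per year

def projected_kbps(year: int) -> float:
    return BASE_KBPS * GROWTH ** (year - BASE_YEAR)

for year in (2012, 2021):
    print(f"{year}: ~{projected_kbps(year) / 1000:.0f} Mbps")

# Prints roughly 25 Mbps for 2012 and 943 Mbps for 2021: the same order of
# magnitude as the 14 Mbps and gigabit figures Hastings projected.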


Everyone has struggled to define what the next era of computing will look like. We might have found our answer, at least where nomenclature is concerned. Some say we are in the era of cloud computing. Others might prefer mobile computing or web-based computing.


The point is that we have left the mainframe, minicomputer, personal computer and client-server eras behind. Where we are now might be considered the internet, web, cloud or mobile era; we have not yet agreed on a specific term.


What comes next might well be the AI era.


