Friday, December 23, 2022

Does Technology Leadership Lead to Financial Outperformance? Maybe Not

High achievers exist in virtually every sphere of life and business. Leaders and laggards often are differentiated by heavy use of digital technologies, so many equate proficiency with digital technology and financial outperformance. 


But profits and technology use do not seem to correlate all that closely: financial results often do not track digital technology spending, and some studies show only a small percentage of firms with high profits also are digital technology leaders. 


It is always possible that the expected correlation between digital investment ("digital transformation" or "digitalization") and performance exists only weakly. That suggests organizational high performance is not necessarily and directly caused by the investment.


Some firms might have been better at thinking through how the technology could boost performance. Those firms might have other assets and levers to pull to maximize the impact and return. 


High-performing firms (probably measured by revenue growth or profit margins and growth) that excel with technology might also tend to be firms that manage people, operations, acquisitions, product development, logistics or other functions unusually well. Perhaps high-performing organizations also have unique intellectual property, marketing prowess or better distribution networks. 


In fact, digital technology success appears relatively random when it comes to producing desired outcomes. In part, that is because it is devilishly difficult to determine the technology impact on knowledge work or office work productivity at all. 


So productivity measurement is an issue. To be sure, most of us assume that higher investment and use of technology improves productivity. That might not be true, or true only under some circumstances. 


Nor is this a new problem. Investing in more information technology has often and consistently failed to boost measured productivity. Others would argue the gains are there, just hard to measure. There is evidence to support either conclusion. 


If the productivity paradox exists, then digital transformation benefits also should lag investment. Before investment in IT became widespread, the expected return on investment in terms of productivity was three percent to four percent, in line with what was seen in mechanization and automation of the farm and factory sectors.


When IT was applied over two decades from 1970 to 1990, the normal return on investment was only one percent.


So this productivity paradox is not new. Information technology investments did not measurably help improve white collar job productivity for decades. In fact, it can be argued that researchers have failed to measure any improvement in productivity. So some might argue nearly all the investment has been wasted.


We have seen in the past that there is a lag between the massive introduction of new information technology and measurable productivity results, a lag that might conceivably take a decade or two to emerge.


The Solow productivity paradox suggests that technology can boost--or harm--productivity. Though perhaps shocking, the productivity impact of technology adoption can be negative.


That there are leaders and laggards should not surprise. That there are higher performers and trailing performers in business should not surprise. Perhaps leaders outperform for reasons other than technology.


Thursday, December 22, 2022

ChatGPT Hype is All About Automated Content Creation

Chat Generative Pre-trained Transformer, or ChatGPT, is the hype term of the moment. The interest comes from ChatGPT abilities to create content and provide conversational results to an inquiry. It essentially promises to connect artificial intelligence processing with automatic conversational responses. 


The applications for customer service are obvious. Also perhaps obvious are applications that could augment or replace “search,” or “writing.” As TikTok alarmed Facebook, perhaps ChatGPT now alarms Google. 


The big deal is the ability to create content based on existing content and data. It is not so much a use case related to the equivalent of human “thinking” as to “content creation” based on precedent and existing data. 


Generative AI is the larger trend that ChatGPT is part of. AI-created original content is the promise. An annual or quarterly report, for example, is a fairly structured document drawn from existing data, the sort of thing generative AI is supposed to be good at. News stories, sports scores and advice (legal, financial, business strategy, for example) are the sorts of content that are based on existing formats, precedents, databases and conventional wisdom or rules of thumb. 


source: Sequoia Capital 


When to buy a product; why it provides value; how to buy; where to buy; from whom to buy; understanding pros and cons: these are some of the questions generative AI ultimately is expected to answer. 


What options might be in any legal matter, what the precedents are, and what courses of action can be taken are legal questions all based on past experience. “How to invest, at a given age, with assets of different amounts, with defined goals, in what instruments, for how long, and why” are all questions with answers based on clear rules of thumb used by financial advisors. 


The uses in education, which mostly consists of knowledge transfer, are endless. 


It probably is not too hard to see how generative AI could be used to create personalized marketing, social media, and technical sales content (including text, images, and video). 


Some believe generative AI could write, document and review code. Applications in many other fields, ranging from pharmaceutical development to health outcomes, in fact all human endeavors with large existing data sets and “expert” advice, could be enhanced. 


Anywhere there are patterns in data, and lots of data to be worked with, it is possible that generative AI could add value. The more complicated processes are--such as weather--the more value could be obtained, in principle. Generative AI essentially creates based on existing data. So the more data, the more creation is possible. 
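As a loose illustration of "creating from patterns in existing data," consider a toy Markov chain text generator (a sketch for illustration only; modern generative AI models are vastly more sophisticated than this). It learns which words follow which in a source corpus, then produces "new" text that is entirely derived from those observed patterns:

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the corpus."""
    chain = defaultdict(list)
    words = text.split()
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=8, seed=0):
    """Walk the chain from a start word, picking observed successors at random."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break  # dead end: no word ever followed this one in the corpus
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "the more data the more creation is possible the more value"
chain = build_chain(corpus)
print(generate(chain, "the"))
```

The point of the sketch is the dependency, not the technique: with no corpus there is no output, and a richer corpus yields richer output, which is the same sense in which more data enables more generative AI creation.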


Is that “new” content derivative? Yes. It is based on the existing data, which can be manipulated and displayed in original ways. And generative AI is about creating content, not “thought.” But content creation is expensive and important in almost every sphere of life. 


The hype will pass. But disruption and substitution clearly can be seen as possible outcomes, eventually. Anywhere content has to be created, where there are existing rules of thumb about what is important, where lots of precedent and data exists, where some questions have obvious standard answers, generative AI is likely to be valuable and important. 


It is not simply content creators, but advice givers that could ultimately see their output devalued. If you ask me when MPLS adds value, and why, and how it compares to SD-WAN, there are a limited set of answers I can provide that correspond with industry wisdom about such choices. 


Over time, a greater number of questions will have answers computers can assemble and deliver. It’s coming, if not right away.


Tuesday, December 20, 2022

Meta Mixed Reality Viewed as Key to Next Generation of Computing

Meta has gotten criticism in some quarters for allotting as much as 20 percent of its research and development spending to new products, rather than existing products. Investors argue Meta is spending too much, too soon on metaverse products and software that might not be commercially viable for some time. 


Meta, on the other hand, is betting on possible leadership of the next generation of computing, and believes its investments now will pay off. 


If one believes in product life cycles, then one is forced to look at business strategy as including the development of new products to replace sales of products with declining demand. That, in turn, presupposes capital allocation and effort to discover or create those new products. 


Virtually nobody disagrees with that general principle. But as with staffing levels at software and technology firms, there is concern firms have “overhired,” and need to cut back on spending in the face of expected recession in 2023 and possible slower growth beyond. 


It should be noted that financial analysts often prefer that firms “stick to their core” business while business strategists more often emphasize what is needed to ignite and sustain growth. Either view has merit at times. 


Much the same possible divergence of opinion about research and development investment also exists. 


Some industries invest more in research and development than others. Pharmaceutical, information technology and computing industries are heavy R&D spenders: computing and technology firms spend about 13.6 percent of revenue on R&D, for example. In many cases, R&D as a percentage of gross profits is higher. 


But Meta has in recent years been a heavy spender on R&D, compared to other firms. Microsoft, for example, spent 13 percent of net sales on R&D in 2020, compared to Meta’s 22 percent level.  


At least right now, many criticize Meta’s investment priorities. Meta seems determined to plow ahead. And few public companies of its size have a governance structure that, like Meta’s, allows them to proceed aggressively without risking pushback from equity investors. 


One can always make the argument that some of the R&D investment is essentially wasted, and that Meta might be able to achieve what it wants at a lower spending level. But that is a judgment call. 


But a recent statement by Andrew Bosworth, Meta CTO, makes clear the firm’s continued belief that mixed reality is so vital that high levels of research and development must be sustained, even as 80 percent of research and development continues to support existing lines of business. 


We may agree or disagree, but Meta is clearly betting that something else is coming, and that Meta has to spend now to lead that “something” that comes next, and represents the next era of computing.

