Friday, December 23, 2022

Does Technology Leadership Lead to Financial Outperformance? Maybe Not

High achievers exist in virtually every sphere of life and business. Leaders and laggards often are differentiated by heavy use of digital technologies, so many equate digital technology proficiency with financial outperformance. 


But profits and technology use do not seem to correlate all that closely, and financial results often do not track digital technology spending. Some studies show that only a small percentage of highly profitable firms also are digital technology leaders. 


It is always possible that the expected correlation between digital investment--“digital transformation” or “digitalization”--and financial performance exists only weakly. That suggests organizational high performance is not necessarily and directly caused by the investment.


Some firms might have been better at thinking through how the technology could boost performance. Those firms might have other assets and levers to pull to maximize the impact and return. 


High-performing firms (probably measured by revenue growth or profit margins) that excel with technology might also tend to be firms that manage people, operations, acquisitions, product development, logistics or other functions unusually well. Perhaps high-performing organizations also have unique intellectual property, marketing prowess or better distribution networks. 


In fact, digital technology success appears relatively random when it comes to producing desired outcomes. In part, that is because it is devilishly difficult to determine the technology impact on knowledge work or office work productivity at all. 


So productivity measurement is an issue. To be sure, most of us assume that higher investment and use of technology improves productivity. That might not be true, or true only under some circumstances. 


Nor is this a new problem. Investing in more information technology has often failed to boost measured productivity. Others would argue the gains are there, just hard to measure. There is evidence to support either conclusion. 


If the productivity paradox exists, then digital transformation benefits also should lag investment. Before investment in IT became widespread, the expected return on investment in terms of productivity was three percent to four percent, in line with what was seen in mechanization and automation of the farm and factory sectors.


When IT was applied over two decades from 1970 to 1990, the normal return on investment was only one percent.


So this productivity paradox is not new. Information technology investments did not measurably help improve white collar job productivity for decades. In fact, it can be argued that researchers have failed to measure any improvement in productivity. So some might argue nearly all the investment has been wasted.


We have seen in the past that there is a lag between the massive introduction of new information technology and measurable productivity results, and that this lag might take a decade or two to emerge.


The Solow productivity paradox suggests that technology can boost--or harm--productivity. Though perhaps shocking, it appears that the productivity impact of technology adoption can be negative.


That there are leaders and laggards should not surprise. That there are higher performers and trailing performers in business should not surprise. Perhaps leaders outperform for reasons other than technology.


Thursday, December 22, 2022

ChatGPT Hype is All About Automated Content Creation

Chat Generative Pre-trained Transformer, or ChatGPT, is the hype term of the moment. The interest comes from ChatGPT’s ability to create content and provide conversational results to an inquiry. It essentially promises to connect artificial intelligence processing with automatic conversational responses. 


The applications for customer service are obvious. Also perhaps obvious are applications that could augment or replace “search,” or “writing.” As TikTok alarmed Facebook, perhaps ChatGPT now alarms Google. 


The big deal is the ability to create content based on existing content and data. It is not so much a use case related to the equivalent of human “thinking” as to “content creation” based on precedent and existing data. 


Generative AI is the larger trend that ChatGPT is part of. AI-created original content is the promise. An annual or quarterly report, for example, is a fairly structured document drawn from existing data, the sort of thing generative AI is supposed to be good at. News stories, sports scores and advice (legal, financial, business strategy, for example) are the sorts of content that are based on existing formats, precedents, databases and conventional wisdom or rules of thumb. 
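To make that concrete, here is a minimal sketch of content creation from existing, structured data, assuming the openai Python package’s Completion endpoint roughly as it existed at the ChatGPT launch; the model name, prompt and figures are illustrative assumptions, not a recommendation.

```python
# Sketch: drafting a structured quarterly-report paragraph from existing data.
# Assumes the openai Python package (pre-1.0) and an API key in OPENAI_API_KEY.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# Existing, structured data: the kind of precedent generative AI works from.
quarter_figures = {
    "quarter": "Q3 2022",            # hypothetical figures for illustration
    "revenue": "$41.2 million",
    "revenue_growth": "8 percent year over year",
    "operating_margin": "12 percent",
}

prompt = (
    "Write a two-sentence quarterly report summary in a neutral tone "
    f"using only these figures: {quarter_figures}"
)

response = openai.Completion.create(
    model="text-davinci-003",  # illustrative model choice
    prompt=prompt,
    max_tokens=120,
    temperature=0.2,           # low temperature: stay close to the source data
)

print(response.choices[0].text.strip())
```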


source: Sequoia Capital 


When to buy a product; why it provides value; how to buy; where to buy; from whom to buy; understanding pros and cons: these are some of the questions generative AI is ultimately expected to answer. 


What options might be in any legal matter, what the precedents are, and what courses of action can be taken are legal questions all based on past experience. “How to invest, at a given age, with assets of different amounts, with defined goals, in what instruments, for how long, and why” are all questions with answers based on clear rules of thumb used by financial advisors. 


The uses in education, which mostly consists of knowledge transfer, are endless. 


It probably is not too hard to see how generative AI could be used to create personalized marketing, social media, and technical sales content (including text, images, and video). 


Some believe generative AI could write, document and review code. Applications in many other fields, ranging from pharmaceutical development to health outcomes, in fact all human endeavors with large existing data sets and “expert” advice, could be enhanced. 


Anywhere there are patterns in data, and lots of data to be worked with, it is possible that generative AI could add value. The more complicated processes are--such as weather--the more value could be obtained, in principle. Generative AI essentially creates based on existing data. So the more data, the more creation is possible. 


Is that “new” content derivative? Yes. It is based on the existing data, which can be manipulated and displayed in original ways. And generative AI is about creating content, not “thought.” But content creation is expensive and important in almost every sphere of life. 


The hype will pass. But disruption and substitution clearly can be seen as possible outcomes, eventually. Anywhere content has to be created, where there are existing rules of thumb about what is important, where lots of precedent and data exists, where some questions have obvious standard answers, generative AI is likely to be valuable and important. 


It is not simply content creators, but advice givers that could ultimately see their output devalued. If you ask me when MPLS adds value, and why, and how it compares to SD-WAN, there are a limited set of answers I can provide that correspond with industry wisdom about such choices. 


Over time, a greater number of questions will have answers computers can assemble and deliver. It’s coming, if not right away.


Tuesday, December 20, 2022

Meta Mixed Reality Viewed as Key to Next Generation of Computing

Meta has gotten criticism in some quarters for allotting as much as 20 percent of its research and development spending to new products, rather than existing products. Investors argue Meta is spending too much, too soon, on metaverse products and software that might not be commercially viable for some time. 


Meta, on the other hand, is betting on possible leadership of the next generation of computing, and believes its investments now will pay off. 


If one believes in product life cycles, then one is forced to look at business strategy as including the development of new products to replace sales of products with declining demand. That, in turn, presupposes capital allocation and effort to discover or create those new products. 


Virtually nobody disagrees with that general principle. But as with staffing levels at software and technology firms, there is concern firms have “overhired,” and need to cut back on spending in the face of expected recession in 2023 and possible slower growth beyond. 


It should be noted that financial analysts often prefer that firms “stick to their core” business while business strategists more often emphasize what is needed to ignite and sustain growth. Either view has merit at times. 


Much the same possible divergence of opinion about research and development investment also exists. 


Some industries invest more in research and development than do others. Pharmaceutical, information technology and computing industries are heavy R&D spenders, for example. Computing and technology firms spend about 13.6 percent of revenue on R&D, for example. In many cases, R&D as a percentage of gross profits is higher. 


But Meta has in recent years been a heavy spender on R&D, compared to other firms. Microsoft, for example, spent 13 percent of net sales on R&D in 2020, compared to Meta’s 22 percent level.  


At least right now, many criticize Meta’s investment priorities. Meta seems determined to plow ahead. And few public companies of its size have a governance structure that, as Meta’s does, allows management to proceed aggressively without risking pushback from equity investors. 


One can always make the argument that some of the R&D investment is essentially wasted, and that Meta might be able to achieve what it wants at a lower spending level. But that is a judgment call. 


But a recent statement by Andrew Bosworth, Meta CTO, makes clear the firm’s continued belief that mixed reality is so vital that high levels of research and development must be sustained, even as 80 percent of research and development continues to support existing lines of business. 


We may agree or disagree, but Meta is clearly betting that something else is coming, and that Meta has to spend now to lead that “something” that comes next, and represents the next era of computing.


Saturday, December 17, 2022

Will User-Generated Content be Eclipsed by AI Content?

We sometimes forget the debates over whether user-generated content could actually replace professionally-produced content in the news, video and other content realms, though two to three decades ago, that was a key issue. 


And we continue to see new forms of that older debate played out when proponents talk about the architecture of Web3. Some might say the original ethos of the early pre-Web internet-- participation and sharing--continues as a meme. 


But content generated directly by artificial intelligence might limit UGC’s influence and the opportunities to monetize UGC.


If there is a difference to the Web3 debate, it is over the concept that UGC can be monetized by its creators. The interest in blockchain is partly its role in allowing content creators to monetize their creations. In a real sense, the principle of decentralization is driven by the belief that UGC can be monetized directly by UGC creators. 


We have already ended the debate about UGC monetization at a macro level. The business success of YouTube, Facebook and social media in general, where users create the content, is clear. The rise of social media influencers also has shown an early glimpse of how UGC can be monetized by its creators. 


Platforms use customer reviews as a commerce-related form of UGC as well. And the existence of robust advertising and marketing revenue streams attests to the viewership UGC drives. The existence of what we now call the gig economy is further evidence of how monetization can happen when users monetize their time. 


But that is where the debate now rests. We no longer debate the attention-getting value of UGC. We do not doubt that platforms can monetize that content. The only unsolved issue is monetization of their UGC by content creators. 


The big debates that started in the 1990s were about whether UGC could actually gain attention at the expense of professionally-produced media. The juxtaposition was usually something like “blogs versus newspapers and magazines; short-form video versus TV and cable; direct distribution of UGC music versus sponsorship by branded labels” and so forth. 


Two and a half decades ago, the big issues rarely were about monetization of their content by the creators. But monetization did happen for the aggregation platforms: IMDb, Wikipedia, Facebook, YouTube and Google, to name just a few. 


In an extended sense, the existence of any platform for buyers and sellers also uses the “UGC” concept, only in the sphere of asset ownership. So Amazon, Airbnb, Uber, Lyft and other e-commerce platforms essentially mobilize latent assets, with the owners being paid for participation. 


Some now see live streaming, mostly from mobile devices, as an incipient market. Gaming might be another content area where UGC could grow.  


source: Andreessen Horowitz 


In fact, the value of metaverse worlds might be said to be as dependent on UGC as on realism, experience persistence, three-dimensional environments and the ability to interact, create, consume, buy and sell within the experience.  


source: eMarketer 


Left to be decided are issues such as the trust level of content, recommendations, opinions and arguments made by users, as compared to brands (who supply the equivalent of professionally-created content), in the metaverse. If history is a guide, UGC will be quite potent. 


How much artificial intelligence contributes content might be a growing issue. Already, artists object to AI-generated art. Writers will soon have to contend with AI-generated text and content. And so on. 


But the extent to which UGC can be monetized directly by creators is going to be an issue. Opportunities will be greater in some contexts than in others, and it might still be relatively rare for creators to generate significant amounts of revenue.


Sunday, November 20, 2022

70% of Digital Transformation Efforts Fail

Perhaps only fools or those without the means refuse to take advantage of obviously-useful new tools and technology. On the other hand, we possibly should stop fetishizing the use of new tools. By all means, explore the application of artificial intelligence, virtual reality, augmented reality, internet of things, network slicing, 5G, 6G, the internet and so forth.


New technologies will be applied. But maybe we should stop confusing the matter by insisting that "digital transformation" is qualitatively different from all the other adaptations we have made over the decades and centuries.


Looking only at agriculture, we can note the application of new technology over time that has transformed that business and pursuit.


source: Researchgate 


But all that happens whether the term "digital transformation" exists or not. And keep in mind that 70 percent or more of DX efforts will fail. In fact, that has been true of information technology programs in the past. 

“74 percent of cloud-related transformations fail to capture expected savings or business value,” say McKinsey consultants  Matthias Kässer, Wolf Richter, Gundbert Scherf, and Christoph Schrey. 


Those results would not be unfamiliar to anyone who follows success rates of information technology initiatives, where the rule of thumb is that 70 percent of projects fail in some way.


Of the $1.3 trillion that was spent on digital transformation--using digital technologies to create new or modify existing business processes--in 2018, it is estimated that $900 billion went to waste, say Ed Lam, Li & Fung CFO; Kirk Girard, former Director of Planning and Development in Santa Clara County; and Vernon Irvin, Lumen Technologies president of Government, Education, and Mid & Small Business. 


That should not come as a surprise, as historically, most big information technology projects fail. BCG research suggests that 70 percent of digital transformations fall short of their objectives. 


From 2003 to 2012, only 6.4 percent of federal IT projects with $10 million or more in labor costs were successful, according to a study by Standish, noted by Brookings.


IT project success rates range between 28 percent and 30 percent, Standish also notes. The World Bank has estimated that large-scale information and communication projects (each worth over U.S. $6 million) fail or partially fail at a rate of 71 percent. 


McKinsey says that big IT projects also often run over budget. Roughly half of all large IT projects—defined as those with initial price tags exceeding $15 million—run over budget. On average, large IT projects run 45 percent over budget and seven percent over time, while delivering 56 percent less value than predicted, McKinsey says. 
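A back-of-the-envelope sketch of what those averages imply, applied to a purely hypothetical project (the planned-value figure is an assumption; the overrun and value-shortfall percentages are the McKinsey averages quoted above):

```python
# Back-of-the-envelope: a hypothetical large IT project, using the quoted averages.
planned_cost = 15_000_000        # minimum "large project" price tag cited above
planned_value = 30_000_000       # hypothetical expected business value

actual_cost = planned_cost * 1.45          # 45 percent over budget, on average
actual_value = planned_value * (1 - 0.56)  # 56 percent less value than predicted

print(f"Planned value per dollar: {planned_value / planned_cost:.2f}")
print(f"Actual value per dollar:  {actual_value / actual_cost:.2f}")
# Planned: 2.00, actual: roughly 0.61 -- less than a third of the expected return.
```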


Beyond IT, virtually all efforts at organizational change arguably also fail. The rule of thumb is that 70 percent of organizational change programs fail, in part or completely. 


There is a reason for that experience. Assume you propose some change that requires just two approvals to proceed, with the odds of approval at 50 percent for each step. The odds of getting “yes” decisions in a two-step process are about 25 percent (.5x.5=.25). 


In other words, if only two approvals are required to make any change, and the odds of success are 50-50 for each stage, the odds of success are one in four. 


The odds of success get longer for any change process that actually requires multiple approvals. 


Assume there are five sets of approvals. Assume your odds of success are high--about 66 percent--at each stage. In that case, your odds of success are about one in eight for any change that requires five key approvals (.66 x .66 x .66 x .66 x .66 ≈ .13, or roughly 32/243). 


In a more realistic scenario where odds of approval at any key chokepoint are 50 percent, and there are 15 such approval gates, the odds of success are about 0.0000305. 
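The same arithmetic, as a short sketch that can be rerun with different assumptions about the number of approval gates and the odds at each one:

```python
# Probability that a change survives every approval gate,
# assuming independent gates with the same approval odds.
def survival_odds(gates: int, odds_per_gate: float) -> float:
    return odds_per_gate ** gates

print(survival_odds(2, 0.5))    # two 50-50 gates: 0.25 (one in four)
print(survival_odds(5, 2 / 3))  # five gates at ~66 percent: ~0.13 (about one in eight)
print(survival_odds(15, 0.5))   # fifteen 50-50 gates: ~0.0000305
```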


source: John Troller 


So it is not digital transformation specifically which tends to fail. Most big IT projects fail.

If one defines digital transformation “as the integration of digital technology into all areas of a business resulting in fundamental changes to how businesses operate and how they deliver value to customers,” you can see why it is so hard to measure. 


DX affects “all” of the business; produces “fundamental change” in “operations and value” creation, it often is said. How often does any single technology change or program affect “the whole business?” How often does any technology program produce “fundamental change” in operations or value creation? 


Also, by that standard of “fundamental change,” many industries arguably already have achieved most of the value of DX. If “value for customers” is correlated with “how we make our money,” then content businesses and many retailers already have succeeded, for the most part. They sell online; they fulfill remotely; they handle customer interactions online. 


source: CaixaBank Research

 

Many other industries, such as marketing, consulting and research, likewise largely rely on online processes and fulfillment. Other industries possibly cannot pursue “fundamental transformation.” 


The qualifications, such as saying DX “will look different for every company,” only highlight  the problem. DX requires technology, to be sure, but also cultural change. And how do you measure that? 


Some might say DX requires a “culture of experimentation; a willingness to fail or an ability to successfully challenge older ways of doing things.” Some of us would say success in any single one of those areas will succeed only about 30 percent of the time. 


So what people should expect, but often do not, is failure. And there is no reason to believe any single effort at some part of DX will succeed more often than three times out of 10. 


source: BCG 


The e-conomy 2022 report produced by Bain, Google and Temasek provides an example of why DX is so hard to define or measure. Literally “all” of a business, all processes and economic or social outcomes are linked in some way to applied digital technology. 


And what we cannot precisely quantify or measure is hard to track or monitor. If one thinks of DX as simply the latest description of “applying digital technology” to processes, then one also understands why there actually is no end point. We simply keep evolving our technology use over time. 

source: Harvard Business Review 


We should not expect people and organizations to stop talking about “digital transformation.” But maybe we shouldn’t listen quite so much. 


source: The Marketing Technologist 


Yes, by all means continue to experiment with new ways to apply internet, communications and  computing technologies to improve operations, product value and customer interaction capabilities.


But we also should understand that some businesses and industries make money pushing "digital transformation." That is just reality. They make money even if you do not.


And it always is harder than it might seem.


So maybe it is time to stop paying attention. Apply new tools as you can see the value. You know, like we always have done. But tune out the hype.


Tuesday, November 15, 2022

Why Telcos Might Use Public Cloud Even if it is Not Cheaper than Private Cloud

One reason enterprises--including access service providers--might eventually rethink why, where and how they use public cloud is the normal economies of scale that apply to enterprise computing. In almost every case, small volume tends to support the economics of a cloud approach that comes with leased access to compute cycles. 


But in almost every case, very-high volume tends to shift economics back to ownership. In other words, pay-per-use is affordable at low use volumes but increasingly expensive as volume grows. At some point, as the number of instances, licenses, subscriptions or other usage metrics grow very large, ownership starts to offer lower cost.
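A minimal sketch of that crossover logic, using purely hypothetical cost figures rather than any real cloud pricing: pay-per-use wins at low volume, while ownership wins once volume is large enough to amortize the fixed investment.

```python
# Hypothetical crossover between pay-per-use (public cloud) and ownership (private cloud).
# All cost figures are illustrative assumptions, not benchmarks.
OWNED_FIXED_COST = 500_000      # annual cost of owning and operating private capacity
OWNED_COST_PER_UNIT = 0.02      # marginal cost per unit of workload, owned
LEASED_COST_PER_UNIT = 0.10     # pay-per-use price per unit of workload

def annual_cost_owned(units: float) -> float:
    return OWNED_FIXED_COST + OWNED_COST_PER_UNIT * units

def annual_cost_leased(units: float) -> float:
    return LEASED_COST_PER_UNIT * units

# Break-even volume: where the two cost curves cross.
breakeven = OWNED_FIXED_COST / (LEASED_COST_PER_UNIT - OWNED_COST_PER_UNIT)
print(f"Break-even at {breakeven:,.0f} units per year")

for units in (1_000_000, 10_000_000, 20_000_000):
    cheaper = "leased" if annual_cost_leased(units) < annual_cost_owned(units) else "owned"
    print(f"{units:>12,.0f} units: cheaper option is {cheaper}")
```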


Private Cloud, Public Cloud, On-Premises Stack Considerations

| Issue | Private cloud | Public cloud | Public cloud on-premises stack |
|---|---|---|---|
| Financial treatment | Capital expenditure | Operational expense (opex) | Operational expense |
| Capital investment | Required | None (costs were opex) | Commodity |
| Operational expense | People, environmental, space, maintenance | Consumption-based | People, environmental, space, maintenance |
| Existing infrastructure | Sunk cost, repurpose | Excess | Excess |
| Consistent operations | High with control of hardware and software | Variability when using multiple public clouds | Variability when choosing more than one solution |
| People | High | Low/medium | Medium, depending upon support |
| Capacity provisioned/scaling | Provision to peak demand. Slower to scale beyond | Public cloud provider can absorb surge in traffic | Provision to peak demand. Could burst/scale into public cloud |
| Life cycle | Control selection and upgrades of hardware and software | Controlled by cloud provider | Controlled by cloud provider |
| Demand variability | Predictable, less variable demand; must plan capacity | Variable demand supported by cloud infrastructure | Less variable demand to fit capacity; could burst to public cloud |
| Cost per traffic volume | Cost per gigabyte can be lower for high volume | Cost efficient for low volume or varied traffic patterns | Cost per gigabyte can be lower at high volume |
| Application dependencies (e.g., multicast, real-time, network fabric, VxLAN) | Adaptable, customizable to meet complex requirements | Most are not supported. Standard compute, memory, storage, network. May offer telco services | Most are not supported. Standard compute, memory, storage, network |
| Compute intensity | GPU, CPU pinning under control | Extra costs for GPU, no CPU pinning | Extra costs for GPU, no CPU pinning |
| Storage intensity | Control storage costs and performance | High cost of storage services | High cost of storage services |
| Data privacy and security | Control | Some providers may collect information | Some providers may collect information |
| Location and data sovereignty | Control | Country- and provider-dependent | Control |
| Regulatory compliance | Built to requirements | May not meet all requirements | May not meet all requirements |
| Portability and interoperability | High within private cloud environment | Lower when public cloud(s) used with private cloud | Lower when public cloud(s) used with private cloud |
| Intellectual property | Owned, controlled, managed. Can differentiate offers | Subject to terms of service. Can add cost, limit differentiation | Subject to terms of service. Can add cost, limit differentiation |

source: Red Hat 


The key, though, is whether the “owned” capabilities are functionally similar to the leased facilities. And that might be the key insight about access service providers using public cloud for their computing infrastructure. The advantages of using public cloud might, in fact, not be scale advantages at all, but the ability to take advantage of the most-advanced and up-to-date computing architectures. 


Flexibility and agility or skills availability might be cited as advantages for using a public cloud, but those same advantages might be attributed to sufficiently-large private clouds as well. The issue there is “might be.” If there are advantages to scale for cloud computing, then perhaps a private cloud built by a single enterprise never can approach the scale advantages of a public cloud supplier.


But again, scale advantages might not be the main issue. Even when reliance on public cloud costs more than an equivalent private cloud, the perceived upside of using public cloud might lie in its expected advantage for creating new applications and use cases at scale. 


In other words, customers might trade higher cost for higher agility. Those with long memories will recall the operational problems caused by many decades of accumulated proprietary telco operations support systems, for example. 


The shift to reliance on public clouds for OSS type functions is a move to eliminate the downside of proprietary approaches and legacy solutions.

