Tuesday, November 15, 2022

Why Telcos Might Use Public Cloud Even if it is Not Cheaper than Private Cloud

One reason enterprises, including access service providers, might eventually rethink why, where and how they use public cloud is the normal economics of scale that apply to enterprise computing. In almost every case, low volume tends to favor the economics of a cloud approach built on leased access to compute cycles.


But in almost every case, very high volume tends to shift the economics back toward ownership. In other words, pay-per-use is affordable at low usage volumes but increasingly expensive as volume grows. At some point, as the number of instances, licenses, subscriptions or other usage metrics grows very large, ownership starts to offer lower cost.
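As a rough illustration of that crossover, consider a minimal break-even sketch. All of the rates below are hypothetical assumptions, not actual cloud or hardware prices; the point is only the shape of the two cost curves.

```python
# Hypothetical break-even sketch: leased (pay-per-use) vs. owned capacity.
# All prices are illustrative assumptions, not actual cloud or hardware rates.

LEASE_COST_PER_UNIT = 0.12      # assumed cost per compute-unit-hour when leased
OWNED_FIXED_COST = 250_000.0    # assumed annualized cost of owning: hardware, space, staff
OWNED_COST_PER_UNIT = 0.03      # assumed marginal cost per compute-unit-hour when owned

def annual_cost_leased(units: float) -> float:
    """Pay-per-use: cost scales linearly with usage."""
    return LEASE_COST_PER_UNIT * units

def annual_cost_owned(units: float) -> float:
    """Ownership: large fixed cost plus a small marginal cost per unit."""
    return OWNED_FIXED_COST + OWNED_COST_PER_UNIT * units

# Break-even volume: the usage level where the two cost curves cross.
break_even = OWNED_FIXED_COST / (LEASE_COST_PER_UNIT - OWNED_COST_PER_UNIT)

for units in (100_000, 1_000_000, break_even, 10_000_000):
    leased, owned = annual_cost_leased(units), annual_cost_owned(units)
    cheaper = "leased" if leased < owned else "owned" if owned < leased else "equal"
    print(f"{units:>12,.0f} unit-hours: leased ${leased:>12,.0f}  owned ${owned:>12,.0f} -> {cheaper}")
```

The exact crossover point depends entirely on the assumed rates, but the linear pay-per-use curve versus the fixed-plus-marginal ownership curve is what drives the "rent at low volume, own at high volume" logic.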


Private Cloud, Public Cloud, On-Premises Stack Considerations

| Issue | Private cloud | Public cloud | Public cloud on-premises stack |
| --- | --- | --- | --- |
| Financial treatment | Capital expenditure | Operational expense (opex) | Operational expense |
| Capital investment | Required | None (costs were opex) | Commodity |
| Operational expense | People, environmental, space, maintenance | Consumption-based | People, environmental, space, maintenance |
| Existing infrastructure | Sunk cost, repurpose | Excess | Excess |
| Consistent operations | High, with control of hardware and software | Variability when using multiple public clouds | Variability when choosing more than one solution |
| People | High | Low/medium | Medium, depending upon support |
| Capacity provisioned/scaling | Provision to peak demand; slower to scale beyond | Public cloud provider can absorb surge in traffic | Provision to peak demand; could burst/scale into public cloud |
| Life cycle | Control selection and upgrades of hardware and software | Controlled by cloud provider | Controlled by cloud provider |
| Demand variability | Predictable, less variable demand; must plan capacity | Variable demand supported by cloud infrastructure | Less variable demand to fit capacity; could burst to public cloud |
| Cost per traffic volume | Cost per gigabyte can be lower for high volume | Cost efficient for low volume or varied traffic patterns | Cost per gigabyte can be lower at high volume |
| Application dependencies (e.g., multicast, real-time, network fabric, VxLAN) | Adaptable, customizable to meet complex requirements | Most are not supported; standard compute, memory, storage, network; may offer telco services | Most are not supported; standard compute, memory, storage, network |
| Compute intensity | GPU, CPU pinning under control | Extra costs for GPU, no CPU pinning | Extra costs for GPU, no CPU pinning |
| Storage intensity | Control storage costs and performance | High cost of storage services | High cost of storage services |
| Data privacy and security | Control | Some providers may collect information | Some providers may collect information |
| Location and data sovereignty | Control | Country- and provider-dependent | Control |
| Regulatory compliance | Built to requirements | May not meet all requirements | May not meet all requirements |
| Portability and interoperability | High within private cloud environment | Lower when public cloud(s) used with private cloud | Lower when public cloud(s) used with private cloud |
| Intellectual property | Owned, controlled, managed; can differentiate offers | Subject to terms of service; can add cost, limit differentiation | Subject to terms of service; can add cost, limit differentiation |

source: Red Hat
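One way to use a comparison such as the table above is as a weighted decision matrix. The sketch below is purely illustrative: the criteria, weights and 1-to-5 scores are my own hypothetical inputs, not Red Hat figures, and a real evaluation would substitute an operator's own priorities.

```python
# Hypothetical weighted-scoring sketch for the three deployment models in the
# table above. Criteria, weights and scores are illustrative assumptions only.

criteria_weights = {            # relative importance, summing to 1.0
    "cost_at_high_volume": 0.30,
    "scaling_and_burst":   0.20,
    "data_sovereignty":    0.20,
    "telco_app_support":   0.15,  # multicast, real-time, VxLAN, etc.
    "operations_effort":   0.15,
}

# Scores from 1 (weak) to 5 (strong) for each model, per criterion.
scores = {
    "private cloud":        {"cost_at_high_volume": 5, "scaling_and_burst": 2,
                             "data_sovereignty": 5, "telco_app_support": 5,
                             "operations_effort": 2},
    "public cloud":         {"cost_at_high_volume": 2, "scaling_and_burst": 5,
                             "data_sovereignty": 3, "telco_app_support": 2,
                             "operations_effort": 4},
    "public cloud on-prem": {"cost_at_high_volume": 4, "scaling_and_burst": 4,
                             "data_sovereignty": 5, "telco_app_support": 2,
                             "operations_effort": 3},
}

for model, model_scores in scores.items():
    total = sum(criteria_weights[c] * s for c, s in model_scores.items())
    print(f"{model:<22} weighted score: {total:.2f}")
```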


The key question, though, is whether the “owned” capabilities are functionally similar to the leased facilities. And that might be the central insight about access service providers using public cloud for their computing infrastructure. The advantage of using public cloud might, in fact, not be a scale advantage at all, but the ability to use the most advanced and up-to-date computing architectures.


Flexibility and agility or skills availability might be cited as advantages of using a public cloud, but those same advantages might be attributed to sufficiently large private clouds as well. The issue there is “might be.” If there are scale advantages in cloud computing, then perhaps a private cloud built by an enterprise never can approach the scale advantages of a public cloud supplier.


But again, scale advantages might not be the main issue. Even when reliance on public cloud costs more than an equivalent private cloud, the perceived upside of using public cloud might lie in its expected advantage for creating new applications and use cases at scale. 


In other words, customers might trade higher cost for higher agility. Those with long memories will recall the operational problems caused by many decades of accumulated proprietary telco operations support systems, for example. 


The shift to reliance on public clouds for OSS-type functions is a move to eliminate the downsides of proprietary approaches and legacy solutions.


Sunday, November 13, 2022

Brace for 70% Failure Rates for AI Initiatives

It long has been conventional wisdom that up to 70 percent of innovation efforts and major information technology projects fail in significant ways, either failing to produce the predicted gains or producing only very small results. If we assume applied artificial intelligence, virtual reality, metaverse, web3 or internet of things are “major IT projects,” we likewise should assume initial failure rates as high as 70 percent.


That does not mean ultimate success will fail to happen, only that early failure rates will be quite high. As a corollary, we should continue to expect many companies and projects to fail early on. Venture capitalists will not be surprised, as they expect such high failure rates when investing in startups.


But all of us need to remember that innovation generally, and major IT efforts specifically, will have failure rates of up to 70 percent. So steel yourself for bad news as major innovations are attempted in areas ranging from the metaverse, web3 and cryptocurrency to AR and VR, or even less “risky” efforts such as internet of things, network slicing, private networks or edge computing.


Gartner estimated in 2018 that through 2022, 85 percent of AI projects would deliver erroneous outcomes due to bias in data, algorithms or the teams responsible for managing them.


That is analogous to arguing that most AI projects will fail, at least in part. Seven out of 10 companies surveyed in one study report minimal or no impact from AI so far. The caveat is that many such big IT projects can take as much as a decade to produce quantifiable results.


Investing in more information technology has often and consistently failed to boost productivity, or has appeared to do so only after about a decade of tracking. Some would argue the gains are there, just hard to measure, but the point is that progress often is hard to discern.


Still, the productivity paradox seems to exist. Before investment in IT became widespread, the expected return on investment in terms of productivity was three percent to four percent, in line with what was seen in mechanization and automation of the farm and factory sectors.


When IT was applied over two decades from 1970 to 1990, the normal return on investment was only one percent.


This productivity paradox is not new. Even when investment does eventually seem to produce improvements, it often takes a while to produce those results. So perhaps even an AI project that fails in the near term might be seen as a success a decade or more later.


Sometimes measurable change takes longer. Information technology investments did not measurably improve white-collar productivity for decades, for example. In fact, it can be argued that researchers have failed to measure any improvement in productivity. So some might argue nearly all the investment has been wasted.


Most might simply agree that there is a lag between the massive introduction of new information technology and measurable productivity results.


Most of us likely assume quality broadband “must” boost productivity. Except when it does not. The consensus view on broadband access for business is that it leads to higher productivity. 


But a study by Ireland’s Economic and Social Research Institute finds that although there are “small positive associations between broadband and firms’ productivity levels, none of these effects are statistically significant.”


Among the 90 percent of companies that have made some investment in AI, fewer than 40 percent report business gains from AI in the past three years, for example.

 
