AI has become a big topic of conversation this year. I was in Davos last week, and it probably came up at almost every single discussion around technology: what’s happening with AI, as well as what’s happening with OpenAI.
So said IBM CEO Arvind Krishna as he outlined his company’s hybrid cloud and AI strategy this week. In his view, there have been three moments that matter in the past decade in terms of AI emerging to have business potential and impact:
One, when IBM won Jeopardy! with Watson, I think it was a big moment, and AI came onto everyone’s roadmap. Second, when DeepMind from Google, or Alphabet, started winning competitions around, for example, Go, and that became another big moment, along with the protein folding that they did. And now, with OpenAI and ChatGPT.
Despite accusations of hype, those latest developments are built on solid tech foundations, he suggested:
All of this latest version is based on what is called large language models as the underlying science. Universities do it, Google does it, IBM does it, as does OpenAI. To get to why it’s so exciting for us: it allows us to do 13 language models, when we are looking at understanding different natural languages, at the same cost as originally one. That is what is so exciting about these technologies, because if you can get an order of magnitude improvement in cost and speed and the resource consumed, both in terms of hardware and people, that is incredibly exciting.
Show me the money
Fair enough, that’s the technocrat thesis. Now, show us the money! Krishna said that there is a clear path to monetization:
Our monetization of AI is very much focused on that $16 trillion of productivity that I’ve talked about that we’re going to get over the decade. The vast majority of that comes from enterprise automation, and when I say enterprise, I include governments in it. Some examples – if you can automate the drive-through and order taking for quick-serve restaurants, that’s an example of what can happen. If we can get deflection rates of 40%, 50%, 60% at everyone’s call centers, that’s a massive operational efficiency for all of our clients. If we can help retirees get their pensions through interacting with a Watson-powered AI chatbot, that is an enterprise use case where all of these technologies come into play.
By the way, all three of my examples are real clients, where we are delivering efficiency equivalent to anywhere from hundreds to thousands of people for each of these clients. So that’s how we get it. If I look inside IBM, how we do promotions, how we do people movement, how we begin to improve our code-to-cash, how we improve our customer service when people ask complicated questions around triage of IT systems going down – those are all very real examples where we are improving client service and saving money at the same time.
Krishna also pointed to IBM’s play in AIOps:
We made a couple of small acquisitions, Instana and Turbonomic, and we built our own AIOps portfolio. And we’re seeing tremendous pick-up from that, as our clients want to take out labor complexity, but also want to optimize their overall IT infrastructure, hardware and software. They also want to have uptime; the talk now is not just two nines and three nines, but up to five nines. They also want to make sure some systems go to always-on.
I think our AIOps portfolio there really advantages us, and I believe we’re in a unique position because we help our clients in an environment across multiple public clouds and on-premise, and with their private clouds in that space. If I think about data and AI, our focus on data fabric and allowing our clients to leverage the data wherever it is, not always moving it, but allowing them to catalog it, leveraging AI, deep inside our products is another example of where we have a unique capability.
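For readers unfamiliar with the jargon, the "nines" Krishna cites map to concrete annual downtime budgets. A quick sketch of the arithmetic (the function name is my own, for illustration):

```python
def downtime_minutes_per_year(nines: int) -> float:
    """Allowed downtime per year for an availability of the given number of nines.

    Two nines = 99% uptime, three nines = 99.9%, five nines = 99.999%.
    """
    minutes_per_year = 365 * 24 * 60  # 525,600 minutes, ignoring leap years
    unavailability = 10 ** -nines     # e.g. five nines -> 0.00001
    return minutes_per_year * unavailability

for n in (2, 3, 5):
    print(f"{n} nines: ~{downtime_minutes_per_year(n):,.2f} minutes of downtime per year")
```

Moving from three nines to five shrinks the annual downtime budget from roughly eight and three-quarter hours to about five minutes, which is why "always-on" is a materially harder engineering target.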
Clients are still doing new development in the current turbulent macro-economic climate, he added:
From our perspective, if somebody is doing an expanded Salesforce deployment, I call that a new application. If somebody is doing a new application on Azure, or if they are…refactoring, putting in new function, integrating with other applications they might have in their shop, or with SaaS properties that they buy, we consider all that new development. For us, our consulting teams are largely doing that new development for our clients. And in that process, they tend to use OpenShift from Red Hat, they tend to use Red Hat Linux, they tend to use our AI automation.
Our AI automation then surrounds all those things to make them much more resilient, much more robust, much more secure, and those are the capabilities we bring…There is likely a focus in that new application: is it helping automate things more? Is it helping make things ‘straight through’, as opposed to requiring a lot of manual intervention?
IBM has certainly made the tech investment in AI, and if Krishna is correct, it’s about to pay off.
The firm this week announced its highest annual growth in over a decade. Separately, we need to add IBM to the increasingly long list of enterprise firms making layoffs. In IBM’s case, it’s 3,900 roles, roughly 1.5% of total headcount. It’s also worth noting that some of the layoffs come from the AI-focused Watson Health unit.