Apple Intelligence is the beginning of the end of the AI bubble

Written by: Senior Director and Principal Analyst

Recently, Apple had its annual developer conference, much of which was devoted to its new AI features and products. The company made a number of announcements, the largest of which was undoubtedly its partnership with Microsoft-backed AI startup OpenAI. The announcement also painted a much clearer picture of how AI services will be rolled out to consumers and the kinds of business models that will ultimately support the continued operation of large language models (LLMs). A few highlights:

  • No killer apps: Perhaps the most famous tech demo ever is Apple’s launch of the iPhone, where Steve Jobs boldly and correctly proclaimed the iPhone to be five years ahead of anything else on the market. That’s a standard few launches will ever meet, but despite the AI hype, nothing anywhere close to it was presented. Features ranged from moderately useful incremental improvements, like an AI feature for email that helps you rewrite or edit messages, to the downright pointless, like the ability to generate pictures in chat to send to your friends. The demos were all prerecorded, which did nothing to ease the growing consumer skepticism about the functionality of AI tools after high-profile struggles like Google’s Gemini rollout. 
  • Siri will get ChatGPT: Siri (Apple’s voice assistant) will use OpenAI’s LLMs to better respond to questions and more fluidly complete tasks. The demo showed Siri understanding and responding to more contextual questions like “When will my mom’s flight arrive?” (which requires knowing who “mom” is and correctly identifying the right flight). Much of this will work by giving Siri broad access to user data so that it has that context. 
  • Privacy raises red flags: Perhaps most significantly, Apple has long billed itself as a privacy-focused tech company and has generally backed up those claims, most notably by limiting iPhone user tracking. Apple is trying to bring this privacy focus to AI, which may be a tightrope to walk given the contradiction between Apple’s general policy of encrypting user data (like texts) in ways that even it can’t access and the need to access that data to perform contextual AI functions. This blog from Apple (published at the same time as the event) digs into its approach: Apple will be building custom data center hardware and software to allow LLM processing of customer data in a completely closed environment. It’s a potentially huge commitment of resources (more on that later), but it wasn’t enough to stop noted OpenAI hater Elon Musk from firing off a few incendiary tweets threatening to ban Apple devices from his companies. That’s extreme, but the shift from “We’re the company who won’t let companies snoop in your data” to “We’re the company you trust to snoop in your data” is still quite a letdown. It remains to be seen if Apple can convince both actual experts and the public that privacy is still a core value.

Our consumer insights research shows that consumers want AI tools to say something about themselves — AI that makes them smarter and reflects their sophistication and embrace of novel technologies. The Apple demos were largely practical and failed to capture this desire for connection or self-actualization, despite Apple explicitly positioning the product as “AI for the rest of us.” This failure to build a resonant consumer message, coupled with the wobbling privacy announcement, marks Apple Intelligence as a distinct failure.

What’s interesting about Apple Intelligence is what it tells us about business models. I’ve written a bit about AI business model issues in the Innovation Matters column in the past, but to sum it up, I see three possible business models. The first is the monopoly model: Like Google in search, one player (let’s say OpenAI) will get 90% or more of the LLM business and use that huge market share to drive huge volumes of tokens through models while charging per token. More on this later. Option two is a kind of hybrid, where you charge upfront fees to developers and sell tokens at the same time — this sort of approach is common in a lot of B2B software, and it’s generally a good option for high-value but relatively low-volume applications. Option three is the Apple App Store walled garden model: You take a percentage of revenue from every developer you let into your ecosystem (here, every company that leverages OpenAI’s GPT models). You don’t need a monopoly to make this work, but you do need a large user base and a strong ability to exclude companies from it. 
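The three business models above can be made concrete with a quick sketch. The functions below are purely illustrative: the pricing figures, token volumes, and take rates in the usage example are hypothetical assumptions for comparison, not reported numbers from OpenAI or Apple.

```python
# Illustrative revenue math for the three LLM business models discussed above.
# All inputs are hypothetical assumptions, not actual OpenAI/Apple figures.

def per_token_revenue(tokens_per_year: float, price_per_million_usd: float) -> float:
    """Monopoly model: revenue scales directly with token volume."""
    return tokens_per_year / 1e6 * price_per_million_usd

def hybrid_revenue(developers: int, upfront_fee_usd: float,
                   tokens_per_year: float, price_per_million_usd: float) -> float:
    """Hybrid model: upfront developer fees plus per-token charges."""
    return developers * upfront_fee_usd + per_token_revenue(tokens_per_year, price_per_million_usd)

def walled_garden_revenue(ecosystem_gross_usd: float, take_rate: float) -> float:
    """Walled-garden model: a percentage cut of every developer's revenue."""
    return ecosystem_gross_usd * take_rate

# Hypothetical example: 1 trillion tokens/year at USD 10 per million tokens,
# 1,000 developers paying USD 50,000 upfront, or a 30% cut of USD 1B in gross sales.
print(per_token_revenue(1e12, 10.0))          # USD 10M from tokens alone
print(hybrid_revenue(1_000, 50_000.0, 1e12, 10.0))  # USD 60M with upfront fees
print(walled_garden_revenue(1e9, 0.30))       # USD 300M from a revenue cut
```

The sketch makes the structural point plain: per-token pricing only pays off at enormous volume (hence the monopoly requirement), while a walled-garden take rate needs a large ecosystem generating revenue rather than raw token throughput.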

This launch points in the direction of the first model. The Apple-OpenAI relationship is strongly reminiscent of the Apple-Google relationship, in which Google pays Apple approximately USD 20 billion a year to be its search provider. Google is the ultimate tech monopolist, and it increasingly seems like OpenAI is trying to replicate this success, having formed relationships with both Microsoft and Apple to become the provider of AI for consumers. What’s still a little unclear is how AI generates revenue, even for a monopolist. Early reporting indicates that no money is changing hands: Apple is giving OpenAI access to its users as a form of payment. Google can afford to pay Apple billions because its search business makes a ton of money by promoting advertisers. 

Who’s paying for AI? Siri, and the other AI products rolled out by Apple, appear to be free of charge (Siri certainly has been). Will Apple ultimately pay OpenAI to use its models? That seems like the most sensible option, but how would Apple justify that payment? Apple makes most of its money from selling devices; if we assume a modest one-time fee (USD 20) for OpenAI baked into the sale price of each of the roughly 200 million iPhones sold in a year, that’s about USD 4 billion. Apple also likely has around 1 billion subscribers to services like iCloud; assuming it devotes some of that revenue stream to paying for OpenAI, it could generate as much or more. That’s a lot of money, but it’s not clear to me that it’s enough to cover the high capital and operational costs of massive AI models like OpenAI’s. 
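The back-of-envelope math above is easy to check. The USD 20 per-device fee and 200 million iPhones per year come from the text; the per-subscriber allocation of USD 0.50/month is a hypothetical figure chosen only to show how a services-side contribution could match or exceed the device-side one.

```python
# Back-of-envelope check of the revenue estimates in the text.
# USD 20/device and 200M iPhones/year are the article's assumptions;
# the USD 0.50/month services allocation is a hypothetical illustration.
iphones_per_year = 200_000_000
hidden_fee_usd = 20
device_revenue = iphones_per_year * hidden_fee_usd  # = USD 4 billion

subscribers = 1_000_000_000
monthly_allocation_usd = 0.50  # hypothetical slice of each subscription
services_revenue = subscribers * monthly_allocation_usd * 12  # = USD 6 billion/year

total = device_revenue + services_revenue  # = USD 10 billion/year
print(f"Devices: {device_revenue/1e9:.0f}B, services: {services_revenue/1e9:.0f}B, total: {total/1e9:.0f}B")
```

Even under these generous assumptions, the combined figure is in the same range as Google’s reported search payment to Apple, yet it would have to fund training and inference for frontier-scale models rather than a mature, ad-subsidized search business.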

It’s clear that OpenAI wants to be a tech monopolist and mediate every interaction between people and computers with AI, but it’s not clear how that makes any money. Consumers have long been the product in the tech world, with Google selling their eyeballs to companies in the form of ads. Do these AI services have a way to turn consumers into a new kind of product? It’s hard to square Apple’s privacy guarantees with the kind of business models that have powered tech for so long and that OpenAI seemingly wishes to replicate. The Silicon Valley playbook has long been to build a user base first and figure out how to make money from it later, and that has worked for services like YouTube that can monetize themselves via ads. However, AI requires so much investment to create and serve that traditional consumer software models may not work. We’re this far into the AI hype wave, and yet no one has a clue how to actually turn a profit; for all the potential impact of AI, as a business it looks less like the start of a revolution and more like the peak of a bubble. Companies that are building business cases around leveraging AI and LLMs should expect turbulent times ahead for their AI suppliers: increased costs, bankruptcies, and cancellations of major planned features are all on the table. 

