Beyond the Headlines: Why Everyone Is Wrong About Apple's AI Strategy
How one Apple expert sees the company winning the AI race ...
In this issue of Beyond the Headlines, our focus is on Apple Inc. (AAPL).
Shares are climbing today on the back of robust iPhone 17 sales, yet the tech giant continues to carry the reputation of being a laggard in the artificial intelligence (AI) race.
That perception comes from delayed launches of AI-enhanced features and tools, Apple's reliance on partners like OpenAI to integrate features into the virtual assistant Siri, and the departure of key talent from the Cupertino company.

But not everyone agrees with that narrative.
David Zeiler, a veteran financial reporter who once interviewed Apple co-founder Steve Wozniak, offers a different perspective. With a knack for writing predictive stories that often come true, Dave was the perfect person to weigh in on Apple’s evolving AI strategy.
Here’s what he had to say …
No. 1: The general narrative of the mainstream seems to be Apple is falling behind in the AI race. Is that the wrong take? What do you think Apple is doing well with AI? What could they improve on?
I think the mainstream media – and Wall Street analysts for that matter – are looking at Apple’s strategy all wrong. Wall Street is obsessed with the cloud-based AI represented by OpenAI’s ChatGPT, Microsoft’s Copilot, and Alphabet’s Google Cloud. Because Apple is not dropping billions of dollars on Nvidia Corp. (NVDA) chips and building out massive AI data centers, the perception is that Apple is falling behind.
I think they’re underestimating Apple’s different approach, which is to do as much AI as possible on its devices. Apple has invested considerable effort in giving its in-house chip designs, particularly the A-series chips in iPhones and the M-series chips in Macs and iPads, AI capabilities through what it calls the Neural Engine.
This on-device approach will make Apple’s AI more responsive while minimizing the need for huge – and very expensive – AI data centers. And that in turn will help keep Apple’s AI from being a drain on profits.
I also think that Apple’s vertically integrated ecosystem will give it an edge as it implements Apple Intelligence. They can optimize AI features for their chips and operating systems like no other tech giant.
And then there’s the appeal, or lack thereof, of AI to the general consumer. Sure, we’ve seen plenty of businesses use AI internally to become more efficient and profitable, or to make their products and services better. But regular people haven’t been all that impressed.
CNET did a survey about a month ago that asked U.S. adults what motivated them to upgrade their smartphone. Price (62%), longer battery life (54%), and more storage (39%) topped the list. Getting AI features came in 7th, with just 11% of the vote.
That consumers are not yet clamoring for AI features tells me that Apple has more time to develop Apple Intelligence. It also tells me we still haven’t seen an AI “killer app” that would make people say, “I need this on my phone or PC.”
Apple has a good chance of coming up with such a killer use case. Let me give you an example of the sort of thing I’m talking about. One of the AI features Apple released in September was Live Translation in its AirPods Pro earbuds. When activated, it listens to a nearby person speaking another language while a synthetic voice in your AirPods translates it into your preferred language in real time.
Tech reviewers who have tried it have walked away impressed.
“This was the strongest example I had seen of AI technology working in a seamless, practical way that could be beneficial for lots of people,” Brian X. Chen wrote in The New York Times.
But Apple has done something else that could also help it win the race to find AI features people actually want and would inspire them to upgrade to new devices.
It just released its Foundation Models framework to app developers. That means third-party developers will now be able to tap directly into the AI capabilities built into Apple’s operating systems.
I think this is a huge deal. If you remember, when the iPhone first came out, the only apps included were made by Apple. About a year later, Apple opened up the platform to third-party developers and created the App Store. The explosion of apps was key to making the iPhone one of the most successful devices of all time. It’s possible that history will repeat now that Apple has opened its AI framework to third-party developers.
No. 2: Without picking any specific companies, do you think there is any AI sector where Apple will start making acquisitions?
Apple is rumored to be looking at acquiring a number of AI companies. One company on the rumor list, Perplexity, is focused on AI-powered search, which would make it an attractive target.
It’s likely Apple has already quietly bought several small AI companies. Apple historically buys a bunch of small companies every year, almost always under the radar.
One AI acquisition that came to light over the summer was a company called WhyLabs, which monitors AI output to prevent “hallucinations” (when AI produces wrong answers). Apple probably acquired WhyLabs in mid-to-late 2024.
In a July earnings call with analysts, Apple CEO Tim Cook said Apple had acquired “around seven companies” so far in 2025, some of which he said were AI-related. He also said Apple was always on the lookout for any company that could boost its AI development.
So Apple almost surely is snapping up AI companies fairly regularly. But since the company rarely announces them, we’ll only know of one if it’s too large to go unnoticed.
Also, Apple has been working on partnerships with the big AI companies to make their tech available to Apple users. A deal with OpenAI put ChatGPT on all new Macs, iPads, and iPhones. Apple is also rumored to be in talks with Alphabet about adding Google Gemini to Siri.
It looks like the company is taking a multi-pronged approach, to be sure.
No. 3: Apple announced it is pausing the Vision Pro headset revamp to pivot to making what Bloomberg called “Meta-like AI glasses.” Do you think the headset still has long-term potential as a revenue generator and as a way to get people into the Apple ecosystem?
I don’t think this is just about what Meta is doing, although the Ray-Ban Meta glasses have drawn the most attention. A bunch of smart glasses have come out over the past year or so, and guess what? They’re selling. That’s what has caught Apple’s attention.
Grand View Research estimates that the global smart glasses market was worth about $1.93 billion in 2024. The firm estimates the market will grow to $8.26 billion by 2035, a CAGR of 27.3%.
Meanwhile, Apple’s high-end Vision Pro hasn’t sold that well, which even Apple kind of expected. A $3,500 price tag is a big ask. And while the tech is impressive, the device is pretty much limited to indoor use. That’s why Apple was working on a cheaper, less sophisticated version, which I expect will be called the Vision Air. But that will still probably cost about $1,500 and wasn’t expected to launch until 2027.
Smart glasses, meanwhile, sell for between $200 at the low end and $800 at the high end. They can’t do what the Vision Pro can do, but most work with audio controls and feedback and have cameras so the wearer can take photos and video. And unlike the Vision Pro, they’re very portable.
With these simpler devices clearly gaining traction among consumers, Apple felt compelled to join the party. I think it’s a smart move. The company has a lot of in-house tech developed for the Vision Pro that should give it a head start in creating affordable smart glasses.
And to be clear, Apple isn’t giving up on the Vision Pro. It’s just reallocating resources to get an Apple-branded smart glasses device out the door as soon as possible.
The truth is, it’s still early for this product category. I expect most of the big tech companies to jump on this bandwagon. And don’t forget Apple will have the advantage of being able to integrate its version of smart glasses into its vast ecosystem.
Anshel Sag, a principal analyst at Moor Insights & Strategy, recently told Wired that these so-called XR (extended reality) devices are “a spectrum,” with simpler devices being more commonplace while more sophisticated ones – like a Vision Air – will appeal to those who want more features and are willing to pay for them.
Apple may continue to fumble around a bit in this space for a while as consumers vote with their pocketbooks. But Apple will find its niche.
As for the role any of these devices will play in the Apple ecosystem, that theme is always the same. Every device connects to and enhances the ecosystem, while incentivizing customers to continue buying within the Apple universe. Any XR device Apple puts out there, whether we’re talking about a Vision Air or smart glasses, will do the same.
We want to thank Dave for sharing his insights with the SPC community.
