WWDC 2026: The Year Siri Runs Out of Excuses (and Developers Run the Show)
I watched Apple’s WWDC 2026 announcement replay this morning while I waited for the usual buzzwords. Innovation. Platforms. Developers. You know the script by now.
But when they leaned into “AI advancements” and everyone immediately started talking about Siri’s big comeback, something clicked for me. Siri just ran out of excuses.
For years, Apple’s been able to hide behind the usual walls. We don’t have the right hardware yet. The models aren’t ready. We care about privacy. The APIs aren’t there. Now they’re walking into a June 8–12 WWDC that’s literally marketed around AI, with a new cross‑platform “Core AI” framework on deck and a reported partnership with Google’s Gemini models hanging in the background like a neon sign. If Siri is still the same old unreliable roommate after this, that’s not a limitation. That’s a choice.
And the wild part is: developers are the ones holding the bag for that choice.
Apple’s press release is unusually blunt for them. WWDC 2026 runs June 8–12, mostly online and free, keynote and State of the Union on June 8, the usual 100‑plus sessions and labs. But instead of the vague “latest software” language, they promise to “spotlight” AI advancements and “exciting new software and developer tools.” Bloomberg doesn’t even pretend to be neutral about it; they frame the whole thing as Apple’s “AI comeback bid” and call this collection of features make‑or‑break.
Don’t get me wrong. I’ve heard this before. Every couple of years, Siri supposedly gets her big brain upgrade. She learns context. She understands follow‑up questions. She magically becomes more than a slightly nicer command prompt. The problem is, those promises have always floated on top of a platform that still treats developers like potential attackers instead of partners.
This year is different for one simple reason: Apple is finally giving itself the tools it always told us it didn’t have.
We already know iOS 27, iPadOS 27, and macOS 27 are expected to share a new “Core AI” framework that effectively succeeds Core ML. Same ecosystem, same Apple Silicon, same Apple‑blessed models, just exposed in a way that, in theory, lets third‑party apps tap into whatever is powering the new Siri and this “Apple Intelligence” layer everyone keeps whispering about. Combine that with the Gemini deal, where multiple reports say Apple is leaning on Google’s foundation models to juice its own AI stack, and you’re out of technical excuses. Hardware? Solved. Models? Rented. OS‑level plumbing? Apparently on the way.
If Siri still feels brittle in June 2026, it won’t be because Apple couldn’t fix her. It’ll be because they wouldn’t.
That’s the first half of this story. The second half is uglier: Apple’s AI success now depends on the same developers they’ve spent a decade telling “no” on iPad.
WWDC is always pitched as a love letter to developers, but look at what they’re actually selling this year. A mostly online event, free to attend, with a few hundred lucky folks invited to Apple Park on June 8 for the keynote, labs, and an all‑day campus hangout. Dozens of AI‑centric sessions, one‑on‑one labs to help you wire your app into whatever Core AI looks like, and all of it streamed in high resolution through the Developer app, the web, and YouTube.
The subtext is clear: Apple desperately needs third‑party apps to stop feeling dumber than the system. If Mail and Notes and Reminders show up at WWDC with context‑aware summaries and frighteningly good autocomplete, but your to‑do app, your writing tool, and your project management system all feel like 2022, the narrative collapses.
So Apple is about to walk out on stage and say, “We’ve fixed Siri. We’ve built Core AI. Here are the tools. Go make your apps magical.”
And at the same time, they’ve quietly shipped the most honest product they’ve released in years: MacBook Neo, the $599 machine that tells every frustrated iPad user, “You were right. You needed a Mac.”
Neo is fascinating because it’s not subtle. It’s a cheap, fanless MacBook running full macOS, positioned explicitly as “Apple’s AI hardware for everyone” and aimed at students, casual creators, and the growing crowd that just wants a laptop that can actually use these new intelligence features without thinking about it. It lives in the same price neighborhood as a decently‑spec’d iPad plus a Magic Keyboard, which reviewers are already pointing out while asking a brutal question: if you were using an iPad as your main computer, why wouldn’t you just buy this instead?
Reddit and forums didn’t even pretend to be gentle about it. One thread in r/ipad flat‑out calls MacBook Neo “probably bad news for those of us” still trying to make the iPad a laptop, because now Apple has a cheap Mac to point at whenever anyone complains about missing features. You want a terminal? Virtual machines? Pro apps that use real plugins and not some neutered sandbox variant? Cool. Here’s Neo. Problem solved.
For me, that’s the part that stings.
I’ve spent years trying to bend iPadOS into something it clearly doesn’t want to be. I’ve written whole essays about the fantasy version of iPadOS 27 where Apple just admits the obvious and ships macOS with training wheels. I’ve got an entire wishlist of things I want the platform to do so it can actually replace a laptop: better external display behavior, more reliable multitasking, background processes that don’t get strangled the second you look away, more honest file management.
And every time I think, “Okay, this is the year,” Apple ships another half‑measure. Stage Manager that arrives years too late. External monitor support that looks good in keynotes but falls apart in daily use. Files that still behaves like a costume party version of Finder.
The more honest voices in the community have started calling it what it is: structural. MacStories literally ran a piece titled “The iPad’s Software Problem Is Permanent,” arguing that these limitations aren’t bugs – they’re the product’s identity. iPadOS is supposed to be safer, simpler, more controlled. It is not supposed to turn into a Mac.
MacBook Neo makes that argument bulletproof.
Before Neo, Apple could at least pretend the iPad Pro was “the computer” for some slice of people. That was the whole point of the M‑series chips in tablets and the expensive keyboard accessories. If you pushed hard enough, they might eventually have to give in and let the software match the hardware.
After Neo, if you want Mac‑level power, Apple can shrug and say, “We already make that. It’s called a Mac.”
So what does that mean for iPadOS 27 in the shadow of WWDC 2026?
If I’m being honest, I think we get two very different kinds of change.
On the surface, iPadOS 27 probably looks like a genuine step forward. It will almost certainly get the same Core AI framework as iOS and macOS, with on‑device intelligence for summarizing notes, rewriting text, surfacing the right document or project when you need it. The new Siri, whatever branding they give it, will show up on iPad too, with better on‑screen awareness and the ability to chain actions without you repeating yourself like an idiot. The windowing system that started to feel more Mac‑like in iPadOS 26 will get faster, less buggy, more predictable.
That stuff is relatively cheap for Apple. It lives inside the same locked‑down model they already use. It sells the AI story. It photographs well at WWDC.
But underneath, I don’t think the deepest restrictions move at all.
True sideloading? Apple has spent years telling regulators and users that the Mac and iPad have different security assumptions, and has gone out of its way to close every little sideloading loophole on both. Terminal? System‑level virtualization? Real plugin architectures that let apps ship their own executable code? They’ve been stomping those out on iPad, not quietly preparing to unlock them.
And now, when a power user asks for any of that, Apple can point at MacBook Neo and say, “We hear you. That’s why we made this.”
From Apple’s perspective, it’s a clean split:
iPadOS gets the friendly, curated version of Apple Intelligence. A better Siri, smarter defaults, more context on device. Enough multiwindow and external display support to make it feel productive, especially if you live inside Apple’s own apps and a handful of blessed third‑party tools.
macOS – including the bottom‑of‑the‑line Neo – gets the messy stuff. Terminals. Containers. Weird developer tools. All the things that make the security team nervous.
So when we talk about WWDC 2026 as “the WWDC where Siri runs out of excuses,” we also have to admit it might be the WWDC where Apple stops pretending the iPad is ever going to be what some of us wanted.
Developers are smack in the middle of that contradiction.
On one hand, Apple’s AI gamble literally cannot work without them. They need indie note‑taking apps and task managers and creative tools to embrace Core AI, to wire in Siri‑adjacent features, to let that intelligence feel omnipresent instead of trapped in a few first‑party apps.
On the other, Apple’s platform strategy still treats them like guests who might accidentally break something if they’re allowed into the kitchen. That’s especially true on iPadOS, where the things developers have been begging for – better background execution, richer file access, more honest integration points – keep running into the “but what if a kid taps this?” wall.
Long story short: Apple is asking developers to help make Siri and Apple Intelligence feel inevitable while simultaneously giving them just enough rope on iPadOS to build nice, safe, App Store‑approved experiences that never quite reach Mac territory.
And boy, was I right in that earlier piece when I said the only way out was for Apple to decide what the iPad actually wants to be.
Sitting here early in the morning writing this, I’m a bit torn. Part of me wants to be optimistic and say WWDC 2026 could be the moment Apple finally surprises us. That in the middle of all the AI demos and the inevitable “Siri is so much smarter now” montage, they quietly ship three or four iPadOS 27 features that show they’re willing to loosen their grip.
The other part of me looks at MacBook Neo, at Apple’s permanent‑sounding explanations for why iPadOS will never be as open as the Mac, and at the market pressure to show Wall Street a big, clean AI story, and thinks: why would they?
If Siri nails it this year, most people won’t care what OS is underneath. They’ll just know their iPhone, their iPad, their cheap Mac all suddenly feel smarter. If Siri blows it again, no amount of Core AI sessions and developer labs will save the narrative.
Either way, June is going to tell us something uncomfortable about Apple’s priorities. Not just whether they can finally deliver the assistant they’ve been promising since 2011, but whether they’re willing to give developers – and especially iPad developers – enough freedom to make that intelligence feel real everywhere, not just in the places Apple controls.
And if they aren’t, you already know what the answer is. It’s sitting on the shelf with “Neo” stamped on the lid, quietly whispering to every frustrated iPad user: you were right all along.
As for me, I was foolish enough to believe investing in an iPad Pro would be worth it in the long run. I guess I’ll have to wait until June to find out whether I was foolish or actually on to something. If the former, I’m not sure I’ll have anything more to say about iPadOS, other than that it powers an amazing tablet and nothing more.


