Updating iOS for the AI Era
Apple has two choices: adapt or die
Sponsored By: Brilliant
This article is brought to you by Brilliant, the app that helps you master core concepts behind AI, neural networks, and more in minutes a day.
Somewhere deep in the bowels of Apple Park, there almost certainly exists a secret team of designers working day and night to answer one question:
How should we update iOS to take advantage of AI?
Apple is smart. It knows we’re living through a Cambrian explosion of technology thanks to AI. It knows user habits and expectations are changing. And it definitely knows that no product category will be spared. Today it may seem like only Google Search is at risk of disruption from products like ChatGPT, but that is just the beginning. Smartphone operating systems will be forced to change, too.
The stakes for Apple couldn’t be higher. The iPhone is the cornerstone of its entire $2.6 trillion empire. iPhone hardware sales alone make up half of Apple’s nearly $400 billion in annual revenue, and much of the other half—App Store revenue, AirPods, Apple Watch, etc.—directly depends on the success of the iPhone.
In the next few years, there are only two options for the iPhone: adapt, or die. Of course, it will take time for the consequences to fully play out—the iPhone is not going to die anytime soon. Apple’s moat means it doesn’t necessarily need to be the first mover. But it will need to move quickly—and the next few years will be pivotal.
I’ve been an obsessive iPhone user since the day it came out in 2007, so I can’t help but wonder how this product that is such a big part of my life will evolve in the face of perhaps the biggest technology shift I have ever seen.
I’m sure you could come up with all sorts of completely new designs that totally reinvent the user interface, but I am hopelessly pragmatic. Radical new concepts that would never work in practice don’t interest me. In the real world, you can’t change everything about a product that’s as established as the iPhone and expect it to work. People’s habits and expectations are an important part of why iPhones keep selling year after year. This is a specific example of the more general principle of path dependency: the future depends on (and has to emerge from) the past and present. Real, working systems don’t get totally reinvented. They incrementally evolve.
WWDC is just a few months away. Hopefully we’ll get some early glimpses of what Apple has in store for us when it unveils iOS 17. My guess is that not much will change. But I think it’s entirely possible that over the next two or three years they will make bigger changes than they have in the past twelve.
So let’s walk through how the iPhone might evolve going forward as AI becomes a more useful and important part of our computing lives. What needs to change? What role will Siri play? What is the simplest path forward? Let’s start at the beginning.
AI won’t take your job. Someone using AI will.
Fortunately, Brilliant is the best way to level up your understanding of cutting-edge technology like AI, neural networks, and more.
They have thousands of bite-size lessons in math, data, and computer science that help you master core concepts in 15 minutes a day. Each lesson is built using visual, interactive problem-solving that makes even complex topics click.
Join over 10 million people around the world and try Brilliant today with a 30-day free trial.
The Lock Screen
This is what you see when you first pick up your phone. It has not changed much since 2007:
- You swipe up from the bottom to unlock the phone
- There are two buttons at the bottom for functions you need in a hurry: the flashlight and camera
- At the top there are now widgets to show you quick bits of information
- There are now more options for visual customization
So how should this screen incorporate AI? I see two main ways.
The first is about having a faster, more obvious way to access Siri. Siri should become much more useful and important in the coming years, so I wouldn’t be surprised if we see a new button to launch Siri at the bottom of this screen:
You can currently access Siri from this screen by holding down the side button on the right of the phone while you speak a command. But I don’t think this design is going to work in the future, for two reasons:
- I can’t always speak out loud to an AI. Sometimes I want to type.
- Holding down an unlabeled button is not an obvious action. A lot of users will miss it or forget to do it. This is fine today, but it would become a problem if Siri became more important.
When you tap this new Siri button, it would launch a new “Siri App” similar to ChatGPT. (To be clear: Apple has not announced anything like a ChatGPT-style Siri App; my point is that I think it should exist. I’ll go into detail below about how I imagine it could work.)
So if the first change to the lock screen is adding a Siri button, the second big change I think Apple should make is to improve the experience of reading through a big list of push notifications.
Notifications are a crucial part of the smartphone user experience, but they can be overwhelming. It would be great if the lock screen acted more like a personal assistant who could quickly summarize the most important things that have happened in your apps since you last checked in.
I’m imagining it could look something like this:
Can you spot how this is different from the current notifications UI? It’s subtle, but important. Today, the iPhone simply shows a pile of notifications from a bunch of different apps, and the text of a notification often doesn’t contain the most important information. For example, maybe your mom sent you five text messages, but the most important thing, where she asked about dates for an upcoming trip, was in the third one. Today that would be buried, but in the future AI could parse the most important content from the most important notifications and present it as succinctly as possible.
I can’t be certain whether this is feasible for Apple today. Huge corporations have to contend with a lot of complexities that startups don’t—not just technical, but also privacy, legal, PR, etc. That said, based on my experience working with LLMs, I don’t think this kind of feature is impossible to build. And I personally would love it.
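To make the idea a little more concrete, here’s a rough sketch of how something like this could work under the hood: gather the pending notifications, group them by app, and hand the whole batch to a language model with instructions to surface only what matters. Everything here is invented for illustration (the `Notification` type, `build_summary_prompt`, and the `local_llm` call are not real Apple APIs); it’s just one plausible shape for the feature.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Notification:
    app: str   # e.g. "Messages"
    text: str  # the raw notification text

def build_summary_prompt(notifications):
    """Group pending notifications by app and build a single prompt
    asking a language model for a short, prioritized digest."""
    by_app = defaultdict(list)
    for n in notifications:
        by_app[n.app].append(n.text)

    sections = []
    for app, texts in by_app.items():
        bullets = "\n".join(f"- {t}" for t in texts)
        sections.append(f"{app}:\n{bullets}")

    return (
        "Summarize only the most important information from these "
        "notifications, as succinctly as possible:\n\n"
        + "\n\n".join(sections)
    )

# An on-device model would then turn this prompt into the short digest
# shown on the lock screen, e.g. (hypothetical API):
#   digest = local_llm.complete(build_summary_prompt(pending_notifications))
```

The interesting design question isn’t the plumbing, it’s the prompt: the model has to decide that “what dates work for the trip?” matters more than a news alert, which is exactly the kind of prioritization a personal assistant does.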
The Siri App (SiriGPT?)
When Siri launched, it was positioned as a convenient way to issue simple commands to your phone when you can’t use your hands and eyes. The launch video featured people who were running, driving, cooking, packing, and reading braille (Siri is especially useful for those who are blind or visually impaired).
But now, if Apple chooses to focus on Siri, it could become as smart as ChatGPT, or even smarter. That is, of course, a bold claim. Siri is notoriously “as dumb as a rock”; Microsoft CEO Satya Nadella said it, but we all know it’s true 😅. Honestly, I do think it’s a long shot for Siri to become as useful as ChatGPT any time soon, but crazier things have happened. Apple is the world’s most valuable company, and it will do everything possible to keep it that way. ChatGPT is not magic. Other, relatively tiny companies, such as Anthropic, have built models that rival ChatGPT in quality. I would be surprised if Apple couldn’t build or buy its way to similar functionality.
This would make it useful in a wide variety of circumstances beyond asking simple questions or issuing simple commands. In this world, it no longer makes sense for Siri to be confined to a simple overlay.
Today, if you want to use Siri, you either say “Hey Siri” or hold down the side button, and you see an overlay on top of whatever screen you’re looking at, like this:
You can barely type a question or command into Siri, and there is no conversation history. You can only ask follow-up questions or have multi-turn conversations in extremely limited circumstances. And when Siri does respond, the answers are often incorrect, incomplete, or both.
Apple should solve these issues and make Siri into something that works much more like ChatGPT. In order to do this, Siri will need to become an app.
Here’s how it could look:
Pretty simple! Very similar to ChatGPT. But simplicity and familiarity are good things, and it would be really useful to have a dedicated space to chat with Siri and see a history of conversations and requests.
Apple is in a unique position to launch a ChatGPT competitor, because it has so much personal information about its users, and it has the ability to run AI models—perhaps even large(ish) language models—locally on the device. This enables features like:
- Offline mode
- Access to photos, contacts, text messages, and even data from Apple Health and other very personal info
- Control of home automations and security through Apple Home
It might even make sense for Apple to acquire and integrate a product like Rewind, which records everything you see and hear on your computer and pitches itself as a “hearing aid for your memory.” This would be extremely useful if built into Siri with strict privacy standards.
Siri in the Share Sheet
The share sheet is what you see when you press the “share” icon. Usually this happens on a URL or a photo or video.
Siri should show up here! I added a button called “Ask Siri” below the “Copy” command:
On webpages or any other kind of text document, this would let you ask any question and have Siri use the page’s content to answer it. For example, the page in the screenshot above shows Mac cases, and you could ask Siri which is the most popular waterproof option.
On photos or videos, you could also ask questions, like “who is in this photo?” or even issue commands like “remove the cars from the background” or “style this like it was a Monet painting.”
Other fun ideas
I wish I had more time to design all of these, but just to get your brain going…
- Maps — knowing your location history, Siri could answer questions like, “what was that mini golf place we drove by last night on the 405?” It could also help you find restaurants, coffee shops, bars, and stores of all kinds.
- Fitness — knowing your activity levels, especially if you wear an Apple Watch, Siri could chat with you and answer health questions and give personalized advice, or even function as a personal trainer and nutritionist of sorts.
- Memories — Apple already makes really fun videos based on detected events like a family outing or a concert. It would be really cool if this were more interactive and creative, so you could generate these with prompts, similar to how people create with Midjourney and Runway.
- Weather — What if you could take a photo of your house, and the background of your weather app showed a generated rendering of the current conditions? (Sunny, thunderstorms, snowy, etc.)
- GarageBand — I’ll let your imagination run wild here.
The point is, a lot is going to change in the coming months and years. Apple has the data. They have the compute. They have every incentive to do something with it. Now, we wait to see what happens next.
Thanks to our Sponsor: Brilliant
Whether you're a professional looking for an edge or a lifelong learner building new skills, Brilliant is the best way to learn. Level up on AI and other cutting-edge topics with quick, visual, hands-on learning.