Apple Intelligence: A Guide to Apple’s AI-In-Everything Strategy

By Joanna Stern

CUPERTINO, Calif. — If I had a dollar for every time an executive said “Apple Intelligence” at Apple’s developers conference on Monday, I’d have a steak dinner with all the trimmings. (So, yes, about 60 times.)

After nearly two years of sitting out the generative-artificial-intelligence frenzy, Apple finally jumped in the deep end. The company is injecting AI tools throughout coming versions of its biggest platforms: iOS 18, iPadOS 18 and MacOS Sequoia. There’s a new and improved Siri you can even text with. There’s a new partnership with OpenAI that will bring ChatGPT into Apple apps. There are ways to generate new images and emojis on the fly.

The company says we’ll start seeing the features this fall, but some of the more ambitious ones might not debut until 2025.

So what was Apple’s strategy with all these updates? To show AI integrated into the apps and products you already use — rather than powering a tacked-on perk or stand-alone chatbot. And it put a big emphasis on privacy and processing things on the device when possible.

“We think AI’s role is not to replace our users but to empower them,” said Craig Federighi, Apple’s senior vice president of software engineering, after the event. “It needs to be integrated in the experience you’re using all the time. It needs to be intuitive. But it also needs to be informed by your personal context.”

Yes, Apple can do far more than any free-standing chatbot when it comes to answering questions about our lives. Our photos, notes and messages — even notifications — could be aided by some good AI tools.

As I sat through Apple’s event and talked with executives on Monday, my big thought was: Are these really cutting-edge, useful tools? Or is Apple just stuffing in AI to catch up to current front-runners Microsoft and Google, and sell us more iPhones to boot? Spoiler: It’s a bit of both.

I’ll know more when I can actually test it. For now, we have to go on Apple’s presentation and tightly controlled demos.

A brand-new Siri

Ever since Siri’s birth in 2011, Apple’s been promising “a humble, intelligent, personal assistant that goes everywhere with you and can do things for you, just by you asking.”

Yeeaaah. That’s not exactly how things worked out. (See Larry David.) Yet maybe now is Siri’s moment. Apple envisions Siri as your AI personal assistant, one you can ask about your private info — or about public info out in the world.

As with other chatbots, you can now text with Siri. But unlike other chatbots, Siri has access to all your Apple stuff. When all of the promised updates arrive, it will be able to see what’s on your screen and work across apps. “Add this address to his contact card.” “Text yesterday’s picnic photos to my mom.” Requests like these make total sense to a human but until now have been out of Siri’s reach.

In one demo, Apple showed Siri helping to fill out a PDF form. The assistant was also able to find a photo of the user’s driver’s license, extract the number and type it into the form. In another, Siri was able to search across Messages and Mail for a recipe sent by a friend.

The thing that really elevates Siri is its new friend, ChatGPT. When you ask Siri to do something it doesn’t know how to do — say, come up with dinner ideas based on your recent grocery haul — it asks your permission to check with an integrated version of OpenAI’s bot.

You don’t have to be a ChatGPT Plus subscriber. If you do have an account, you can connect it and access the paid extras. Federighi said Apple plans to let users choose their own preferred large language models later on, including, say, Google’s Gemini.

My AI-nalysis: If Apple can actually pull off what it showed and convince people that Siri is no longer painfully stupid, it might be a tech miracle. That’s a big if. The company has a decadelong history of underwhelming Siri improvements.

Writing tools

What do I use generative AI for every day? Writing. No, not my columns. (You could tell!) I use it to sum up articles and reports, draft emails and more.

Apple is adding summarization, rewriting and proofreading to Notes, Mail, Pages and more. The integrated writing tools will also suggest responses and different tones for messages and emails. While most of that is done using Apple’s own AI, more creative writing (say, “A poem about Apple’s iPhone”) will be farmed out to ChatGPT. Again, it will ask your permission before sending the prompt.

My AI-nalysis: This one was table stakes. Google, Samsung and Microsoft have been integrating these tools throughout their operating systems and devices for the past year, and summarization is perhaps the lowest-risk/highest-reward use of generative AI to date. Still, it’s a very welcome update.

Voice transcription

Nothing gets a journalist more excited than easy ways to transcribe audio. Within the Notes app, you’ll be able to record a conversation, get a transcription and then an AI summary of it.

Even better, the Phone app will now be able to record calls and do the same. Yes, Apple says the other caller will be notified that the call is being recorded.

My AI-nalysis: Samsung and Google both have on-device AI transcription. Apple is just getting with the program.

Image generation

We were promised flying cars and we got…AI that can make emojis of flying cars. Yes, Genmoji. You can generate these new emojis by adding a description. You can use them as stickers, in-line emojis or even “tapback” reactions in Messages.

The Image Playground app is Apple’s answer to OpenAI’s DALL-E and other image-generation apps. It’s also built into apps like Messages and Pages. But it won’t give you photorealistic images or anything close to a deepfake. You can choose from just three cartoony styles: animation, sketch and illustration.

Apple Intelligence will also make it easier to search through photos with more specificity. (“Pictures of Fred standing in front of the red car.”) Plus, like Google’s Magic Eraser, the Clean Up tool will help you remove objects or people in the background of your photos.

My AI-nalysis: Instagram and Facebook have had generative image creators built in for months, and I can’t say I’ve used them more than once. What’s exciting here is the blending of AI with the photos we’ve already taken. But as I’ve also said before, if you can change a photo you’ve taken so drastically, what is reality?

Notifications and more

Taking another page from Google and Samsung, Apple will use AI to prioritize your notifications. The most important will appear on top. And if you get a ton of group messages, your iPhone will summarize the chatter for you. No more plowing through 52 texts to see if Thursday drinks are still on.

My AI-nalysis: Woo hoo! That’s your analysis right there.

The bad news

Apple Intelligence will start appearing with iOS 18, iPadOS 18 and MacOS Sequoia, first in the beta versions this summer and then more broadly this fall. And it will be available only in English at first.

And that brings me to the big bummers: As I said, not all the features will be available right away. For example, Apple won’t say specifically when the ChatGPT integration will arrive, only that it will come before the year’s end. Some AI features will likely take even longer.

And all of this requires new — or at least very recent — hardware, and not the lower-end stuff. You’d need an iPhone 15 Pro or Pro Max, or an iPad or Mac with an M-series chip. That excludes several iPhone and iPad models that Apple currently markets.

AI might change the way we interact with our devices. But it will also force us to buy new ones.

“Hey Siri, do you know where I can dig up $1,200 for a new phone?”

Write to Joanna Stern at joanna.stern@wsj.com