Apple Intelligence Early Review: Don't Expect Your iPhone to Feel Radically Different

One of my favorite things about the iPhone 16 isn’t its new Camera Control button or macro photography mode. Instead, what I’ve come to appreciate most is that I can look down at my phone after getting a barrage of texts or Slack messages and know whether it’s an emergency just from the lock screen.

Of all the features available in Apple Intelligence at launch as part of Apple’s iOS 18.1 update, message and notification summaries are by far the most useful. The summaries aren’t perfect (AI, as it turns out, can’t nail sarcasm and doesn’t know the inside jokes I share with my friends). But this is exactly the type of passive, practical intelligence I’m hoping to see more of on smartphones in the future.


Otherwise, the first wave of Apple Intelligence features, which I’ve been testing in a beta ahead of iOS 18.1’s official launch, is mostly geared toward specific tasks such as editing a photo or writing an email. It’s worth noting that the version I’ve been using is a preview meant for developers rather than the publicly available build, so functionality will likely improve over time. Still, my experience is enough to give me a sense of the role these AI tools may or may not play in our everyday lives.

Many of Apple’s features address similar scenarios and use cases as the Android-based AI tools that have emerged over the last year, like Samsung’s Galaxy AI and Google’s Pixel AI features. Arguably, the more exciting Apple Intelligence additions will come in a future iOS 18.2 update, which just arrived in developer beta last week. That upgrade will bring ChatGPT integration, Visual Intelligence for scanning the world around you with your iPhone 16’s camera, and image generation tools like Image Playground and Genmoji, among other additions.

It’s impossible to tell exactly how much Apple Intelligence brings to the iPhone experience based on this first set of features. But so far, I’ve found a few instances in which it’s been meaningfully helpful, such as text message summaries, that offer a glimpse into how our iPhones will get more intelligent in the future. Other features, like the ability to rewrite text, I can see being easily forgotten.

Apple Intelligence is beginning to roll out with iOS 18.1 for the iPhone 16 lineup and the iPhone 15 Pro and Pro Max. It’ll also be available for compatible Mac and iPad models, although I’ve only been testing it on an iPhone 16.

Read more: I’ve Been Using the iPhone 16 for a Month. Here’s What Stands Out

Message summaries are sometimes useful, sometimes amusing

I’m grateful Apple is trying to solve one of the most annoying problems with our phones: being inundated with notifications. Apple Intelligence summarizes incoming texts and alerts, which in my experience has been accurate enough to get the general gist of a conversation at a glance.

For example, I’ve found it useful for seeing that my friends decided on dinner plans on a Friday night or knowing whether my editor will be available to edit a story. The same goes for notifications from messaging apps like Discord, Slack, Google Chat and WhatsApp, which certainly comes in handy when messages from my large group chats start to pile up. If someone sends a photo, Apple Intelligence will usually include a description of the image, although in my experience it didn’t always do so.

I found these summaries more useful than the email summaries in the Mail app, since the preview section that shows up in your inbox is too small to show a meaningful summary of a lengthy email.

Two screenshots showing bundles of messages with a short summary.

Lisa Eadicicco/CNET

But you’ll definitely want to open the notifications to get the full picture. Apple Intelligence can sum up simple messages pretty well. But how many of your conversations with friends and family are actually straightforward? Most of the time, chats are infused with inside jokes, sarcasm and references that only humans, specifically the humans you talk to regularly, will understand. And that certainly shows in Apple Intelligence. 

Look at this example of how Apple Intelligence summarized a conversation in Google Chat to see what I mean. The conversation was about friends who couldn’t make it to another friend’s birthday dinner because of a relative’s party that same weekend. Throughout the conversation, some friends joked about what to wear to the birthday dinner.

A screenshot of Google Chat messages with a short summary.

Lisa Eadicicco/CNET

I also noticed that Apple Intelligence will exclude certain details from message summaries, so I wouldn’t recommend relying on them too heavily.

It’s also worth noting that in my experience, Apple Intelligence will not summarize messages that include explicit or sensitive content, such as texts that reference self-harm.

Message summaries are far from perfect, and they’re often too clinical to capture the conversation. But I do think it’s the best example so far of how Apple Intelligence can be useful in a way that feels natural and practical. 

One of my biggest criticisms about new AI tools in general is that they require the user to think of a prompt or go out of their way to take advantage of these features. I’m a fan of passive features, like message summaries, that are infused into the operating system and don’t require a user’s effort. It’s one of the few Apple Intelligence features I might actually miss if I were to switch back to an iPhone 15. 

A lot of these AI features involve handling personal information, which is why Apple uses a system called Private Cloud Compute. Apple says it analyzes whether a request can be executed on the device or if it requires more powerful cloud-based computing. If it does need the cloud, Apple says it will only share the information required to complete the task, which Apple doesn’t store and can’t access. 
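To make that routing idea concrete, here’s a minimal, purely hypothetical sketch in Swift. None of these types or functions exist in Apple’s SDKs; this illustrates the on-device-versus-cloud decision Apple describes, not Private Cloud Compute’s actual implementation.

    import Foundation

    // Hypothetical sketch only: these types are invented for
    // illustration and are not part of any Apple SDK.
    enum ExecutionTarget {
        case onDevice
        case privateCloud
    }

    struct AIRequest {
        let task: String    // e.g. "summarize_notification"
        let payload: Data   // only the data needed to complete the task
        let complexity: Int // rough cost estimate for the request
    }

    // Keep simple requests on the device; send heavier ones to the
    // cloud with only the minimal payload, matching Apple's claim that
    // cloud requests aren't stored and can't be accessed by Apple.
    func route(_ request: AIRequest, onDeviceBudget: Int = 10) -> ExecutionTarget {
        return request.complexity <= onDeviceBudget ? .onDevice : .privateCloud
    }

    let summary = AIRequest(task: "summarize_notification",
                            payload: Data("Dinner at 7?".utf8),
                            complexity: 2)
    print(route(summary)) // prints "onDevice"

The point of the design, as Apple describes it, is that the heavier cloud path only ever sees the slice of data a task needs, never the whole conversation or photo library.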

Read more: Your Phone’s Virtual Assistant Is About to Change in a Big Way

Apple’s photo app can erase objects from photos, create custom montages

Like Google and Samsung, Apple makes it possible to remove pesky background objects from photos. Its new tool, called Clean Up, lets you circle unwanted elements in an image to remove them.

What stood out to me the most is the way Apple highlights objects it thinks you’ll want to delete, which I’ve found to be accurate. When editing a photo of me sitting on a bench sipping an iced matcha drink, for example, Apple highlighted my colleague’s backpack and beverage. 

Like similar tools from Google and Samsung, the edits are pretty seamless when removing small objects but can sometimes make the photo look warped when removing bigger items. The results can vary, so if you’re unhappy with the outcome it’s a good idea to try again.

Look at the photo below to see what I mean. Can you tell where the backpack was in the photo?


An image that was edited with Apple’s Clean Up tool.

Numi Prasarn/CNET

Apple’s approach to image alteration is more reserved than Google’s. On Pixel phones, you can add objects to photos that weren’t there and swap facial expressions between images. Apple’s features are more focused on object removal. 

One of the other new Apple Intelligence features coming to the Photos app is the ability to create custom memory montages based on a prompt. The idea is that you can type in any phrase that describes photos in your library, such as “Graysen learning to walk,” and the iPhone will do its best to compile a movie of images that fit that theme.

This was hit-or-miss in my experience. For some prompts, like images of landmarks, it worked well. But in other instances, I had to try multiple times to get the result I wanted, and even then it wasn’t exactly right. When I typed in “best things I ate in Italy,” it pulled up a ton of photos that didn’t include food.


With Apple Intelligence, you can create a Memory Movie by entering a prompt.

Numi Prasarn/CNET

The same goes for general search in the Photos app, which gets a boost from Apple Intelligence so you can use natural language to search for a specific image. Like the memory movies, the results were inconsistent. Typing in a search term like “Oscar being cuddly” worked well and resulted in dozens of images of my cat lying in my lap. But others, like “Me and Courtney at dinner,” only pulled up a fraction of the matching photos in my library.

Siri gets a new look and better understanding 

Siri’s bigger upgrade, which introduces ChatGPT integration and the ability to draw on personal context, will come in future updates. But in iOS 18.1, Siri gets a more modern look with a glowing border that wraps around the screen, along with better comprehension when you stumble over your words, product knowledge for answering questions about your iPhone and support for typed queries. These upgrades don’t feel overwhelmingly new, but they do make Siri feel a bit more convenient than before.


Siri is better at answering questions even when you stumble over your words with Apple Intelligence.

Numi Prasarn/CNET

I tried to stump Siri on multiple occasions with “umms” and “actuallys,” and sure enough, Apple’s virtual helper was able to understand the intention behind my requests. When I asked what the most popular Jennifer Garner movie was but switched to Jennifer Lawrence mid-sentence, Siri correctly mentioned films in the “X-Men” and “Hunger Games” series. Similarly, Siri was able to set an alarm even though I initially asked it to set a timer by mistake.

I don’t think an update like this will encourage you to use Siri more often if you don’t already do so. But it should make Siri interactions a bit smoother for the things you do use it for.

Read more: iPhone 17 ‘Slim’ Rumors: Apple’s Thinnest Phone Ever May Come Next Year

Apple can rewrite text for you

Productivity is also a major theme in Apple Intelligence. You can have Apple rewrite, proofread and summarize text throughout the operating system, meaning you can use it whether you’re writing an email, text message or note. Just highlight the text you want to work with and you’ll see an option called Writing Tools along with the usual options for copy and paste. Samsung offers similar features on its phones. 

I tested this tool by rewriting text messages in a different tone. Apple provides options to make writing more concise, friendly or professional, or to simply rewrite the text. I chose the friendly option and found that Apple did a good job of keeping my original tone, more so than Samsung. Apple rephrased a few portions to sound friendlier, such as saying “How’s life treating you?” instead of “How’s everything going?”, while Samsung significantly shortened my message and added the word “Yo,” which I never use.

Screenshots of suggested text from Apple and Samsung.

Lisa Eadicicco/CNET

I don’t find myself going out of my way to use Writing Tools often. It’s not because they don’t work well, but because, like many AI features, they require building a new habit. Maybe it’s because I’m a writer, but I’m used to proofreading and rephrasing my own messages and emails before sending them. Still, there’s definitely a practical use case, especially for those who have a hard time composing professional emails from their phones.

Beyond Writing Tools, there are a handful of features to help you feel more organized, particularly in the Mail app, which can now highlight priority emails. I like this in theory, but I found that it only highlighted emails from my credit card company, which felt a little unbalanced.

I wish it highlighted other timely emails, like my Amazon shipment notifications and Venmo transactions. On one or two occasions, it surfaced an email thread with friends about a weekend trip to Maine, but it didn’t highlight the thread every time someone responded. Overall, priority email groupings have potential but don’t feel useful yet.

Overall thoughts


The Clean Up tool in Apple Intelligence.

Numi Prasarn/CNET

With iOS 18.1, it’s clear that Apple Intelligence is in its early stages. That’s not necessarily a bad thing; it just means you shouldn’t expect your iPhone to feel radically different after this update. 

There’s a lot of potential in features like message and notification summaries and photo memories, both of which aim to solve the very real pain point of sorting through the massive amounts of alerts, messages and photos found on our devices. These features don’t always work as intended, but they point towards a promising future.

But the most intriguing stuff is yet to come. I expect Apple’s iOS 18.2 update to provide a much clearer picture about what Apple Intelligence will bring to the iPhone.

So is it worth buying a new iPhone for Apple Intelligence? No, at least not yet. But it does make me believe iOS updates and software support will become a more important part of the iPhone upgrade decision in the future. Just how far into the future isn’t clear yet. 
