Friday, November 22, 2024

Apple’s iPhone 16 Pro Gets a Little Smarter With Apple Intelligence


Creating summaries seems to be the thing everyone wants AI to do, and Apple Intelligence is no exception. You can have your emails, your messages, and even your notifications from third-party apps summarized. Some of this can be handy, like when the Mail app calls out an urgent-sounding email in its summary, one I would have missed had I just glanced at the giant pile of emails. But more often than not I just swipe away the summary and dive into all the notifications anyway.

Speaking of summaries, there’s a summarize feature built into Safari, but you have to put the web page into Reader mode first. It’s this kind of buried placement that makes these smart features hard to find and easy to forget. At the very least, I was able to summarize an 11,000-word story and get the gist of it when I didn’t have time to sit down and read it. (Sorry.) I’ll forgive you if you summarize this review.

Arguably the most helpful Apple Intelligence features for me, a journalist who attends multiple briefings a month, are the new transcription tools in the Notes and Voice Memos apps, and even in the Phone app. Hit record in Voice Memos or Notes and the app will transcribe the conversation in real time. If you’re on a phone call, tap the record button; after both parties are notified, the call starts recording, and you’ll get a transcription saved to your Notes app.

For all of these, accuracy depends heavily on the microphone quality of the person on the other end. Either way, it’s certainly better than no transcription at all. It’s too bad there are no speaker labels, like in Google’s Recorder app. You also can’t search these recordings to find a specific quote. (Technically, you can if you add the transcript to a note in the Notes app, but that’s an extra step.)

The Photos app is getting an Apple Intelligence infusion too, and the highlight here is the Clean Up feature. Much like on Google’s Pixel phones, which debuted Magic Eraser more than three years ago, you can now delete unwanted objects in the background of your iPhone photos. This works pretty well in my experience, though I’m a little surprised Apple gives you so much freedom to erase anything. I completely erased my eye from existence in a selfie. I erased all the fingers off my hand. (Google’s feature doesn’t let you erase parts of a person’s face.)

Video: Julian Chokkattu

Next, I erased my mug, which was in front of my face as I went for a sip, and Clean Up tried to generate the previously hidden parts of my face, with some horrifying results. (For what it’s worth, I tried this on the Pixel 9 and the results were just as bad, though Google did give me more options.) As my coworker said in Slack, “They both seem to have been trained on images of Bugs Bunny.”

There’s more to come in Apple Intelligence. Image Playground will let you generate images. Genmoji will let you create new kinds of emoji that right now exist only in your mind. Siri will be able to serve up more contextually relevant information. But I’ll have to take another dive into Apple Intelligence when those features arrive later this year. Just a reminder that Apple Intelligence is part of the next iOS 18 update, and it’s only available on select devices: the iPhone 15 Pro, 15 Pro Max, and the entire iPhone 16 range.
