Apple Intelligence: The features I can’t wait to try

Apple Intelligence features | Apple promo image

The bad news from yesterday’s keynote is that Apple has never before listed so many new features as “coming later,” and that includes all of the Apple Intelligence ones.

The other bad news is that Apple Intelligence features will initially be limited to US English, although Apple’s wording here does suggest that those of us in other countries will still be able to try them …

Apple says (our emphasis):

Apple Intelligence is free for users, and will be available in beta as part of iOS 18, iPadOS 18, and macOS Sequoia this fall in U.S. English […]

Apple Intelligence will be available on iPhone 15 Pro, iPhone 15 Pro Max, and iPad and Mac with M1 and later, with Siri and device language set to U.S. English.

This implies that you can try it from anywhere in the world if you’re willing to switch language. Since I already live a mid-Atlantic existence where writing is concerned, I can survive that.

So which of the Apple Intelligence features am I most keen to try?

All-new Siri

This is the biggest one by far for me. It’s almost a decade since I wrote a Feature Request asking for Siri to be given access to third-party apps, and I’m finally going to get my wish!

The original developers of Siri had very grand ambitions for what was then an app, and we’ve made very little progress toward that vision as yet. But that now looks set to change, albeit in increments.

With onscreen awareness, Siri will be able to understand and take action with users’ content in more apps over time. For example, if a friend texts a user their new address in Messages, the receiver can say, “Add this address to his contact card.”

With Apple Intelligence, Siri will be able to take hundreds of new actions in and across Apple and third-party apps. For example, a user could say, “Bring up that article about cicadas from my Reading List,” or “Send the photos from the barbecue on Saturday to Malia,” and Siri will take care of it.

Siri will be able to deliver intelligence that’s tailored to the user and their on-device information. For example, a user can say, “Play that podcast that Jamie recommended,” and Siri will locate and play the episode, without the user having to remember whether it was mentioned in a text or an email. Or they could ask, “When is Mom’s flight landing?” and Siri will find the flight details and cross-reference them with real-time flight tracking to give an arrival time.

Apple’s intelligent assistant will finally be able to use intelligence to assist us with real-life tasks.
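Under the hood, this broader app access is expected to build on the App Intents framework developers already use to expose actions to Shortcuts. As a rough sketch only, with a hypothetical reading-list app and made-up names, an action Siri could invoke might look something like this:

```swift
import AppIntents

// Hypothetical intent for an imaginary reading-list app; the app
// and its data store are placeholders, not a shipping API.
struct OpenSavedArticleIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Saved Article"

    @Parameter(title: "Topic")
    var topic: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would query its own database here, e.g. finding
        // the saved cicadas article when topic == "cicadas".
        return .result(dialog: "Opening your saved article about \(topic)")
    }
}
```

Apps that declare actions this way are already discoverable by Shortcuts today; the “hundreds of new actions” Apple describes appear to be Siri learning to find and chain these intents on our behalf.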

Integrated ChatGPT

ChatGPT is far from trustworthy, and Apple made a very smart decision to specifically warn us whenever it’s ChatGPT, rather than Siri itself, answering a question we’ve posed. But it does have its uses, so having it deeply integrated into the OS is certainly handy.

Proof-reading and summarizing

Anyone who writes for a living will know the value of proof-reading. It can be especially difficult to spot mistakes in our own text, because we already have a mental picture of what we intended to say, so typos may not always jump out at us.

Existing proof-reading apps like Grammarly can be a big help, though they certainly don’t pick up on everything. It’ll be interesting to see how Apple’s version compares. I’m preparing myself for it to get pedantic about official Apple language which absolutely nobody else uses, like always referring to Vision Pro as Apple Vision Pro.

But it’s the summarizing features which most excite me.

With Summarize, users can select text and have it recapped in the form of a digestible paragraph, bulleted key points, a table, or a list.

This could be especially useful with emails; Apple promises it will allow us to “view pertinent details with just a tap.”
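Apple hasn’t published a developer API for any of this yet, so purely as an illustration of what “bulleted key points” means mechanically, here’s a toy extractive summarizer built on the system NaturalLanguage framework. The frequency-based scoring is my own naive heuristic, nothing like Apple’s model:

```swift
import Foundation
import NaturalLanguage

// Toy extractive summarizer: score each sentence by the average
// frequency of its words across the whole text, then keep the top
// few as bullets. Purely illustrative.
func keyPoints(from text: String, count: Int = 3) -> [String] {
    // Split the text into sentences.
    var sentences: [String] = []
    let sentenceTokenizer = NLTokenizer(unit: .sentence)
    sentenceTokenizer.string = text
    sentenceTokenizer.enumerateTokens(in: text.startIndex..<text.endIndex) { range, _ in
        sentences.append(text[range].trimmingCharacters(in: .whitespacesAndNewlines))
        return true
    }

    // Count how often each word appears overall.
    var frequency: [String: Int] = [:]
    let wordTokenizer = NLTokenizer(unit: .word)
    wordTokenizer.string = text
    wordTokenizer.enumerateTokens(in: text.startIndex..<text.endIndex) { range, _ in
        frequency[text[range].lowercased(), default: 0] += 1
        return true
    }

    // A sentence's score is the mean frequency of its words.
    func score(_ sentence: String) -> Double {
        let words = sentence.lowercased().split { !$0.isLetter }
        guard !words.isEmpty else { return 0 }
        let total = words.reduce(0) { $0 + (frequency[String($1)] ?? 0) }
        return Double(total) / Double(words.count)
    }

    return sentences.sorted { score($0) > score($1) }
        .prefix(count)
        .map { "• " + $0 }
}
```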

Recording and transcribing

I’ve become a huge fan of the MacWhisper app. It’s great to be able to record an interview, presentation, or video, and quickly get an accurate transcription. I used it yesterday to record and transcribe the keynote.

I said beforehand that I’d ideally like to see iPhone voice transcription as a universal feature. We’re not yet getting this, but Apple is at least getting us off to a good start.

In the Notes and Phone apps, users can now record, transcribe, and summarize audio. When a recording is initiated while on a call, participants are automatically notified, and once the call ends, Apple Intelligence generates a summary to help recall key points.
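The new Notes and Phone capabilities don’t have a public API either, but on-device transcription itself has been possible for a while via Apple’s Speech framework. A minimal sketch of transcribing an audio file, assuming a US English recognizer is available on the device:

```swift
import Speech

// Minimal sketch using the existing Speech framework, not the new
// Notes/Phone feature. Assumes microphone/speech permission strings
// are set up in the app and a US English model is available.
func transcribe(fileAt url: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_US")),
              recognizer.isAvailable else { return }

        let request = SFSpeechURLRecognitionRequest(url: url)
        // Keep audio on device where the OS supports it.
        request.requiresOnDeviceRecognition = true

        recognizer.recognitionTask(with: request) { result, error in
            if let result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```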

Natural-language search in Photos

This one could be huge.

Natural language can be used to search for specific photos, such as “Maya skateboarding in a tie-dye shirt,” or “Katie with stickers on her face.” Search in videos also becomes more powerful with the ability to find specific moments in clips so users can go right to the relevant segment.

The video clip part of this will be fantastic if it works well.
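Apple’s search presumably works on the image and video content itself, but the text-matching half of the problem can be illustrated with the built-in sentence embeddings in the NaturalLanguage framework. A toy sketch that ranks photo captions, hypothetical strings standing in for whatever Apple derives from the images, against a free-form query:

```swift
import NaturalLanguage

// Toy idea only: rank caption strings against a natural-language
// query using Apple's built-in sentence embeddings.
func bestMatches(query: String, captions: [String], top: Int = 3) -> [String] {
    guard let embedding = NLEmbedding.sentenceEmbedding(for: .english) else { return [] }
    return captions
        .map { ($0, embedding.distance(between: query, and: $0, distanceType: .cosine)) }
        .sorted { $0.1 < $1.1 }  // smaller cosine distance = closer match
        .prefix(top)
        .map(\.0)
}

// e.g. bestMatches(query: "skateboarding in a tie-dye shirt",
//                  captions: ["Maya at the skate park", "Katie's birthday cake"])
```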

Things that don’t interest me – but I could be wrong

There were other Apple Intelligence features which didn’t excite or interest me.

I’m not interested in having my devices rewrite my emails or other messages; my notifications diet means I don’t need notification summaries; I can’t imagine using Image Playground; and Genmoji doesn’t strike me as my kind of thing, any more than Memoji was.

But I’ve learned not to dismiss these things out of hand. Emoji, for example, went from something I didn’t use on principle to a really handy shortcut form of reply. So we’ll see!

Which Apple Intelligence features excite you?

What Apple Intelligence features most excite you? Please let us know in the comments.

Image: Apple
