It's safe to say Apple Intelligence hasn't landed in the way Apple likely hoped it would. However, that's not stopping the company from continuing to iterate on its suite of AI features. During its WWDC 2025 conference on Monday, Apple announced a collection of new features for Apple Intelligence, starting with upgrades to Genmoji and Image Playground.
In Messages, for instance, you'll be able to use Image Playground to generate colorful backgrounds for your group chats. Apple has also added ChatGPT integration to the tool, which means it can produce images in entirely new styles. As before, if you decide to use ChatGPT through your iPhone this way, your information won't be shared with OpenAI without your permission.
Separately, Genmoji will allow users to combine two emoji from the Unicode library to create new characters. For example, you might merge the sloth and light bulb emoji if you want to poke fun at yourself for being slow to understand a joke.
Across Messages, FaceTime and the Phone app, Apple is bringing live translation to the mix. In Messages, the company's on-device AI models will translate a message into your recipient's preferred language as you type. When they respond, each message will be instantly translated into your language. In FaceTime, you'll see live captions as the person you're chatting with speaks, and over a phone call, Apple Intelligence will generate a voiced translation.
Visual Intelligence is also in line for an upgrade. Now, in addition to working with your iPhone's camera, the tool can scan what's on your screen. Like Genmoji, Visual Intelligence will benefit from deeper integration with ChatGPT, allowing you to ask the chatbot questions about what you see. Alternatively, you can search Google, Etsy and other supported apps to find images or products that might be a visual match. And if the tool detects that you're looking at an event, iOS 26 will suggest adding a reminder to your calendar. Nifty, that. To access Visual Intelligence, all you need to do is press the same buttons you would to take a screenshot on your iPhone.
As expected, Apple is also making it possible for developers to use its on-device foundation models in their own apps. "With the Foundation Models framework, app developers will be able to build on Apple Intelligence to bring users new experiences that are intelligent, available when they’re offline, and that protect their privacy, using AI inference that is free of cost," the company said in its press release. Apple suggests an educational app like Kahoot! might use its on-device model to generate personalized quizzes for users. According to the company, the framework has native support for Swift, Apple's own programming language, and tapping into the model can take as few as three lines of code.
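For a sense of what that three-line pitch looks like in practice, here's a minimal Swift sketch of prompting the on-device model, assuming the `LanguageModelSession` API shape Apple previewed alongside the Foundation Models framework; the prompt and exact names are illustrative rather than final documentation:

```swift
import FoundationModels

// Hypothetical usage based on Apple's previewed Foundation Models API:
// open a session with the on-device model, then ask it for text.
let session = LanguageModelSession()

// The quiz prompt is a made-up example echoing Apple's Kahoot! scenario.
let response = try await session.respond(
    to: "Write three quiz questions about the French Revolution."
)
print(response.content)
```

Because inference runs on-device, a call like this wouldn't rack up per-request API fees, which is the "free of cost" point Apple is making in its release.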
An upgraded Shortcuts app for both iOS and macOS is also on the way, with support for actions powered by Apple Intelligence. You'll be able to tap into either the company's on-device models or its Private Cloud Compute model to generate responses as part of whatever shortcut you want carried out. Apple suggests students might use this feature to create a shortcut that compares an audio transcript of a class lecture against the notes they wrote on their own. Here again, users can turn to ChatGPT if they want.
Developing…