
Exactly one year after Apple introduced Apple Intelligence, the Cupertino-headquartered tech giant has announced a set of new features. The announcements were made at its flagship event, the Worldwide Developers Conference (WWDC) 2025 in California, which is taking place between June 9 and 13.
Apple Intelligence is a suite of on-device and cloud-based Artificial Intelligence (AI) tools integrated across Apple's platforms, including iPhone, iPad, Mac, Apple Watch, and Vision Pro. On Monday, Apple introduced new Apple Intelligence features that elevate the user experience across iPhone, iPad, Mac, Apple Watch, and Apple Vision Pro.
This time, the new AI capabilities announced at the event were rather measured, in contrast to rivals' aggressive AI push. That said, here are some of the standout introductions:
Live translation

The live translation feature will help users communicate across languages while messaging or speaking. Powered by Apple-built models that run entirely on device, these capabilities are integrated into Messages, FaceTime, and Phone. This means Live Translation can automatically translate messages, provide live translated captions on FaceTime calls, and speak translations aloud throughout a phone call.
Visual intelligence
Built on Apple Intelligence, visual intelligence helps users learn about their environment using their iPhone camera. Going a step further, users can now ask ChatGPT questions about what they're looking at on their screen to learn more, as well as search Google, Etsy, or other supported apps to find similar images and products.
Opening AI models to developers
After weeks of speculation, Apple confirmed that it is opening up access for any app to tap directly into the on-device foundation model at the core of Apple Intelligence. This will allow developers to build AI-powered features into their apps using the same underlying technology.

This means developers can create new, privacy-focused experiences that work offline and don’t rely on paid cloud APIs for inference.
According to Apple, the Foundation Models framework offers native support for Swift, the company's preferred language for app development across its platforms. Apple says developers can start using its Apple Intelligence models with as few as three lines of code, aiming to make AI integration simple and accessible.
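To illustrate, here is a minimal Swift sketch of what calling the on-device model through the Foundation Models framework could look like. The type and method names used here (LanguageModelSession, respond(to:)) are based on Apple's WWDC developer materials and should be read as illustrative rather than as the definitive shipping API.

    import FoundationModels

    // Minimal sketch: open a session with the on-device model and
    // ask it to respond to a prompt. Names such as LanguageModelSession
    // and respond(to:) follow Apple's WWDC examples and may differ in
    // the final SDK.
    func suggestTitle(for note: String) async throws -> String {
        let session = LanguageModelSession()
        let response = try await session.respond(
            to: "Suggest a short title for this journal entry: \(note)"
        )
        return response.content
    }

Because the model runs on device, a call like this would work offline and would not incur per-request cloud inference costs, which is the privacy and cost argument Apple is making to developers.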
Other announcements
Beyond the features mentioned above, Apple announced a slew of other capabilities, including intelligent Shortcuts; identification and summarisation of order tracking details in Apple Wallet; polls in Messages; improved writing tools for rewriting, proofreading, and summarising; and natural language search.
