At WWDC 2025, Apple unveiled an expansion of its Visual Intelligence technology that lets iPhone users analyze and interact with on-screen content in any app. Invoked with the same button combination used for screenshots, the feature enables image search, extracting events into the calendar, and ChatGPT-powered analysis of whatever is on screen.
Visual Intelligence in iOS 26 integrates directly with the screenshot flow. When users take a screenshot, new Visual Intelligence options appear alongside the traditional editing tools: users can search for products they spot while browsing social media, add events to their calendar in one step, or send the screenshot to ChatGPT for analysis and additional information.
For developers, Apple is providing tools to integrate their apps with this experience through App Intents, allowing third-party applications to become part of the Visual Intelligence ecosystem. Craig Federighi, Apple's head of software engineering, emphasized that the feature will make it possible to "search visually across your most-used apps using Visual Intelligence with the iPhone camera." The system maintains Apple's commitment to on-device processing for privacy while delivering contextual assistance that works automatically with any app.
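Apple did not publish the full developer documentation alongside the keynote, so the exact Visual Intelligence hooks are not shown here. As a rough sketch of the kind of App Intents plumbing involved, an app might expose its content as searchable entities the system can resolve; the `Product` and `ProductQuery` types below are hypothetical placeholders rather than Apple API, and the lookups are stubbed.

```swift
import AppIntents

// Hypothetical entity representing a product in a shopping app.
struct Product: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Product"
    static var defaultQuery = ProductQuery()

    var id: UUID
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

// A string-based query lets the system resolve entities from search terms.
struct ProductQuery: EntityStringQuery {
    func entities(for identifiers: [UUID]) async throws -> [Product] {
        // Look up products by identifier in the app's own store (stubbed here).
        []
    }

    func entities(matching string: String) async throws -> [Product] {
        // Return products whose names match the query text (stubbed here).
        []
    }
}
```

Exposing entities this way is the same mechanism apps already use for Shortcuts and Spotlight, which is presumably why App Intents is the integration point Apple highlighted.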
Visual Intelligence in iOS 26 also introduces ChatGPT integration for analyzing anything on the iPhone screen. Users can highlight objects of interest and ask ChatGPT specific questions about what they're viewing to get deeper insights and contextual information. For example, when looking at a photo of food items, a user could ask, "What recipe could I make from these?" Under the hood, the system takes a screenshot and uploads it to ChatGPT, which uses its knowledge base to send relevant information back to the user.
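Apple has not documented how that hand-off works internally. As a loose approximation of the screenshot-plus-question flow, here is what an equivalent request might look like against OpenAI's public chat completions API; the model name, payload shape, and placeholder API key are assumptions for illustration, not Apple's implementation.

```swift
import Foundation

// Rough approximation of "send a screenshot plus a question to ChatGPT"
// using OpenAI's public chat completions API. Not Apple's internal mechanism.
func askChatGPT(aboutScreenshot pngData: Data, question: String) async throws -> String {
    let base64Image = pngData.base64EncodedString()
    let payload: [String: Any] = [
        "model": "gpt-4o",
        "messages": [[
            "role": "user",
            "content": [
                ["type": "text", "text": question],
                ["type": "image_url",
                 "image_url": ["url": "data:image/png;base64,\(base64Image)"]]
            ]
        ]]
    ]

    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    // Placeholder credential; supply a real key in practice.
    request.setValue("Bearer <YOUR_API_KEY>", forHTTPHeaderField: "Authorization")
    request.httpBody = try JSONSerialization.data(withJSONObject: payload)

    let (data, _) = try await URLSession.shared.data(for: request)

    // Pull the first choice's message text out of the JSON response.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let choices = json?["choices"] as? [[String: Any]]
    let message = choices?.first?["message"] as? [String: Any]
    return message?["content"] as? String ?? ""
}
```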
This builds upon the ChatGPT integration first introduced in iOS 18.2, but offers more seamless visual analysis across the entire operating system. Developers can participate in the visual search experience through App Intents while staying within Apple's privacy standards. The screen analysis capability is particularly valuable for users without the latest iPhone models that include dedicated Visual Intelligence hardware, as it provides similar functionality through software integration.
Apple's new "Liquid Glass" design language represents a significant visual overhaul across all its operating systems, including iOS 26, iPadOS 26, macOS Tahoe, watchOS 26, and tvOS 26.12 This unified interface features translucent elements with rounded corners that perfectly match Apple's modern hardware design, creating greater harmony between software and devices.2 The glass-like material dynamically refracts light, responds to movement with specular highlights, and adapts its color based on surrounding content through real-time rendering.13
Liquid Glass elements behave with a gel-like flexibility, illuminating from within when touched and morphing between states as users navigate apps. The design creates a distinct functional layer of controls that floats above content, with elements nesting concentrically within the rounded corners of windows and screens. The approach prioritizes content visibility while making the interface feel more responsive and alive; Apple describes the experience as "lively" and "delightful." The system also includes accessibility options such as Reduced Transparency, Increased Contrast, and Reduced Motion so the interface remains usable for everyone.
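Apple's new Liquid Glass APIs weren't detailed in the announcement, but the translucent, rounded look can already be loosely approximated with SwiftUI's existing materials while honoring the accessibility settings mentioned above. The `NowPlayingCard` view below is an illustrative sketch, not Apple's implementation: it swaps the translucent material for a solid fill when Reduce Transparency is on and drops the spring animation when Reduce Motion is on.

```swift
import SwiftUI

// Illustrative sketch: a translucent, rounded "glass" surface built from
// SwiftUI's existing materials, with fallbacks for the accessibility
// settings mentioned above. It does not reproduce Liquid Glass's
// refraction or specular highlights, which are new system behaviors.
struct NowPlayingCard: View {
    @Environment(\.accessibilityReduceTransparency) private var reduceTransparency
    @Environment(\.accessibilityReduceMotion) private var reduceMotion
    @State private var expanded = false

    var body: some View {
        VStack(spacing: 8) {
            Text("Now Playing")
                .font(.headline)
            if expanded {
                Text("Up Next")
                    .font(.subheadline)
            }
        }
        .padding(20)
        .background(
            // Solid fill when Reduce Transparency is on; translucent material otherwise.
            reduceTransparency
                ? AnyShapeStyle(Color(.systemBackground))
                : AnyShapeStyle(.ultraThinMaterial),
            in: RoundedRectangle(cornerRadius: 24, style: .continuous)
        )
        .onTapGesture {
            // Skip the springy morph when Reduce Motion is on.
            let animation: Animation? = reduceMotion ? nil : .spring()
            withAnimation(animation) {
                expanded.toggle()
            }
        }
    }
}
```

Reading the system-wide settings rather than adding a custom toggle means the fallback tracks whatever the user has already chosen in Settings > Accessibility.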