Apple has unveiled iOS 26, the latest iteration of its iPhone operating system, at this year’s Worldwide Developers Conference (WWDC). More than a routine update, iOS 26 delivers a visual overhaul led by the all-new Liquid Glass design language—an interface imbued with depth, translucency, and fluid motion—alongside refreshed layouts and enhancements across core apps.
In a notable shift, Apple is also revamping its versioning system. Instead of continuing with sequential numbering—where this release would’ve been iOS 19—major iOS updates will now align with the calendar year they launch. So, iOS 26 reflects its 2025 debut, setting a new precedent for future updates. Here’s a closer look at what iOS 26 brings to the table.
A visual overhaul
Apple’s Liquid Glass aesthetic ushers in a refined, translucent visual language in iOS 26, blending subtle refractions and layered depth to create a system that, according to Apple, feels both fresh and familiar. This design evolution breathes new life into icons, widgets, and navigation elements.
The redesign extends across the Home and Lock Screens too. Time elements now adjust fluidly around wallpaper images, while spatial scenes create a subtle 3D effect as users move their device. Core apps get polished refinements — Camera becomes less cluttered, Photos is split into Library and Collections views, and Safari stretches content edge-to-edge for a more immersive reading experience. In Music, News, and Podcasts, the floating tab bar shrinks and expands based on user interaction, keeping content front and centre.
Apple has also added new functionality to the Phone app, including call screening, which answers calls from unknown numbers and asks why they’re calling, and hold assist, which waits on hold for you and alerts you when a person picks up — more control without ever lifting the phone. And for developers, new APIs unlock Liquid Glass elements so third-party apps can feel just as alive.
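Apple hasn’t finalised the developer documentation yet, but based on the SwiftUI `glassEffect` modifier previewed at WWDC, adopting Liquid Glass in a third-party app could look something like the sketch below — names and defaults may change before release.

```swift
import SwiftUI

// Illustrative sketch of a Liquid Glass element, based on the SwiftUI
// glassEffect modifier Apple previewed at WWDC; the final API may differ.
struct GlassBadge: View {
    var body: some View {
        Label("Now Playing", systemImage: "music.note")
            .padding()
            // Renders the label on a translucent, refractive glass layer
            // that picks up colour and depth from the content behind it.
            .glassEffect(.regular, in: .capsule)
    }
}
```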
Messages gets a makeover
The Messages app gets a playful, personalised makeover in iOS 26. You can now set custom chat backgrounds from Apple’s curated options, your own photos, or AI-generated images via Image Playground. Polls are also coming to group chats, with Apple Intelligence suggesting one when it detects a decision needs to be made. Typing indicators for group conversations add a welcome layer of real-time feedback.
On the emoji front, Genmoji gets an upgrade—you’ll now be able to fuse two emojis to create unique hybrids, moving beyond just text-based prompts. Image Playground itself also expands with the ability to generate visuals using OpenAI’s ChatGPT, giving users a more expressive toolkit.
Live translation, on-device and instant
Thanks to Apple Intelligence, iOS 26 introduces Live Translation, a feature that can interpret conversations in real time, whether you’re texting, on a call, or FaceTiming. All translation happens on-device. Developers will be able to build Live Translation directly into their apps using Apple’s new API, making cross-language communication more seamless than ever.
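The new Live Translation API isn’t public yet, but Apple’s existing Translation framework (available since iOS 17.4) hints at the shape such an integration takes. The sketch below is illustrative only, built on that earlier framework rather than the iOS 26 API.

```swift
import SwiftUI
import Translation

// Illustrative sketch using Apple's existing Translation framework
// (iOS 17.4+); the new Live Translation API in iOS 26 may differ.
struct TranslatedMessageView: View {
    let original: String              // incoming message text
    @State private var translated: String?

    var body: some View {
        Text(translated ?? original)
            // Kicks off a translation when the view appears (English -> Spanish).
            .translationTask(
                TranslationSession.Configuration(
                    source: Locale.Language(identifier: "en"),
                    target: Locale.Language(identifier: "es"))
            ) { session in
                // Translation runs on-device; the text never leaves the phone.
                if let response = try? await session.translate(original) {
                    translated = response.targetText
                }
            }
    }
}
```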
Apple Music also gets real-time lyrics translation and even pronunciation guides, perfect for learning songs in other languages. A new AutoMix feature brings DJ-style transitions between tracks, making your playlists flow effortlessly. You can also now pin favourite artists and playlists to the top of the app for quick access.
Apple Maps gets smarter navigation that learns you
In iOS 26, Apple Maps becomes more intuitive, learning your preferred routes and alerting you to delays or traffic on the way. There's also a new visited places log, making it easier to revisit or share locations with friends.
Games app, one hub to rule them all
Apple is introducing a dedicated Games app, offering a central space for all your App Store downloads, Apple Arcade titles, and social features. The Play Together tab lets you see what friends are playing, with shared leaderboards and challenges to keep the competition lively.
Apple Wallet to go fully digital
This fall, Apple Wallet will start supporting digital IDs, allowing users to store official identification securely. Boarding passes are also getting smarter, with access to indoor airport maps for smoother travel. Plus, Apple Pay is tapping into Apple Intelligence to track purchases made outside the Apple ecosystem, consolidating your order history in one place.
Visual Intelligence for your iPhone
Apple is expanding its AI toolkit with Visual Intelligence, a new feature that allows your iPhone to understand and act on what’s displayed on your screen. As demonstrated during the WWDC keynote, you can take a screenshot of an item, like a jacket in a photo, and search for it online (via Google), or capture an event listing and have Visual Intelligence suggest adding it to your calendar. You can also ask ChatGPT directly about on-screen content, making it easier to get context or next steps without switching apps.
This marks the second wave of AI upgrades under the Apple Intelligence banner. While iOS 18 introduced writing tools, Genmoji, and early ChatGPT integration, iOS 26 pushes further into on-device contextual awareness. Notably, features like Siri acting on visual cues—previously announced but delayed—are now taking shape in this more evolved form.