At WWDC 2025, Apple announced iOS 26, a renamed new version of the software powering the iPhone. It is due to ship in the fall, likely alongside new iPhone 17 models, but developers will get access to the first developer beta on Monday; a public beta is expected in July.
With the next version of iOS, Apple has decided the iPhone's future is brighter and more translucent: a new look called Liquid Glass takes on the visual characteristics of glass, similar to the VisionOS interface on the Vision Pro. And how do we know it's the future? Because the next iPhone system is now iOS 26, renamed to coincide with the coming year as part of a lineup-wide rebranding that brings symmetry to the system names, such as MacOS Tahoe 26 and WatchOS 26.
After more than a decade of a flat, clean user interface — a revamp introduced in iOS 7 when former Apple Chief Design Officer Jony Ive took over the design of software as well as hardware — the iPhone is getting a new look. The new design extends throughout the Apple product lineup, from iOS to WatchOS, TVOS and iPadOS.
The Liquid Glass interface also now enables a third way to view app icons on the iPhone home screen. Not content with Light and Dark modes, iOS 26 now features an All Clear look — every icon is clear glass with no color. Lock screens can also have an enhanced 3D effect using spatial scenes, which use machine learning to give depth to your background photos.
The new All Clear icon look in iOS 26 is part of the Liquid Glass design.
Translucency is the defining characteristic of Liquid Glass, which behaves like glass in the real world in the way it handles light and color from objects behind and near controls. But it's not just a glassy look: The "liquid" part of Liquid Glass refers to how controls can merge and adapt (dynamically morphing, in Apple's words). In the example Apple showed, the glassy time numerals on an iPhone lock screen stretched to accommodate the image of a dog and even shrank as the image shifted to make room for incoming notifications. The dock and widgets are now rounded, glassy panels that float above the background.
As notifications fill the bottom of the screen, the subject in the background image is pushed up and the time numerals resize to accommodate.
The Camera app is getting a new, simplified interface. You could argue that the current Camera app is pretty minimal, designed to make it quick to frame a shot and hit the big shutter button. But the moment you get into the periphery, it becomes a weird mix of hidden controls and unintuitive icons.
The Camera app has fewer distractions.
Now, the Camera app in iOS 26 features a “new, more intuitive design” that takes minimalism to the extreme. The streamlined design shows just two controls: Video or Camera. Swipe left or right to choose modes. Swipe up for settings such as aspect ratio and timers, and tap for additional preferences.
With the updated Photos app, viewing the pictures you capture should be a better experience — a welcome change that customers have clamored for since iOS 18’s cluttered attempt. Instead of a long, difficult-to-discover scrolling interface, Photos regains a Liquid Glass menu at the bottom of the screen.
The Photos app gets a welcome redesign.
The Phone app has stuck more closely to the look of its namesake than other apps: a sparse interface with large buttons, as if you're holding an old-fashioned handset or pre-smartphone cellular phone. iOS 26 finally updates that look, not just with the new overall interface but with a unified layout that takes advantage of the larger screens on today's iPhone models.
The Phone app’s unified layout should make for less switching between screens when dealing with calls.
It’s not just looks that are different, though. The Phone app is trying to be more useful for dealing with actual calls — the ones you want to take. The Call Screening feature automatically answers calls from unknown numbers, and your phone rings only when the caller shares their name and reason for calling.
Or what about all the time wasted on hold? Hold Assist automatically detects hold music and can mute the music but keep the call connected. Once a live agent becomes available, the phone rings and lets the agent know you’ll be available shortly.
The Messages app is probably one of the most used apps on the iPhone, and for iOS 26, Apple is making it a more colorful experience. You can add backgrounds to the chat window, including dynamic backgrounds that show off the new Liquid Glass interface.
Enliven your daily chats with backgrounds and more group features.
In addition to the new look, group texts in Messages can incorporate polls for everyone in the group to reply to — no more scrolling back to find out which restaurant Brett suggested for lunch that you missed. Other members in the chat can also add their own items to a poll.
More useful is the ability to better detect spam texts and screen unknown numbers, so the messages you see in the app are the ones you want to see, not the ones that distract you.
In the Safari app, the Liquid Glass design floats the tab bar above the web page (although that looks right where your thumb is going to be, so it will be interesting to see if you can move the bar to the top of the screen). As you scroll, the tab bar shrinks.
Web pages occupy the entire screen and the address bar shrinks to get out of the way.
FaceTime also gets the minimal look, with controls in the lower-right corner that disappear during the call to get out of the way. On the FaceTime landing page, posters of your contacts, including video clips of previous calls, are designed to make the app more appealing.
FaceTime minimizes its controls into one corner.
Do you like the sound of that song your friend is playing but don't understand the language the lyrics are in? The Music app includes a new lyrics translation feature that displays along with the lyrics as the song plays. And when you want to sing along with one of your favorite K-pop songs, for example, but don't speak or read Korean, a lyrics pronunciation feature spells out how to form the sounds.
AutoMix blends songs like a DJ, matching the beat and time-stretching for a seamless transition.
And if you find yourself obsessively listening to artists and albums again and again, you can pin them to the top of your music library for quick access.
Keep the beat going with intelligence-based song transitions using AutoMix.
The iPhone doesn’t get the same kind of gaming affection as Nintendo’s Switch or Valve’s Steam Deck, but the truth is that the iPhone and Android phones are used extensively for gaming — Apple says half a billion people play games on iPhone.
Trying to capitalize on that, a new Games app acts as a specific portal to Apple Arcade and other games. Yes, you can get to those from the App Store app, but the Games app is designed to remove a layer of friction so you can get right to the gaming action.
The new Games hub has a simple control screen to let you navigate all of your Apple games on any device.
Although not specific to iOS, Apple's new live translation feature is ideal on the iPhone when you're communicating with others. It uses Apple Intelligence to let you talk with someone who speaks a different language in near-real time, showing live translated captions during the conversation. It's available in the Messages, FaceTime and Phone apps.
Live translation during a voice call
Updates to the Maps app sometimes involve adding more detail to popular areas or restructuring the way you store locations. Now, the app takes note of routes you travel frequently and can alert you of any delays before you get on the road.
A Maps widget shows a frequently used route.
It also includes a welcome feature for those of us who get our favorite restaurants mixed up: visited places. The app notes how many times you’ve been to a place, be that a local business, eatery or tourist destination. It organizes them in categories or other criteria such as by city to make them easier to find the next time.
Liquid Glass also makes its way to CarPlay in your vehicle, with a more compact design when a call comes in that doesn’t obscure other items, such as a directional map. In Messages, you can apply tapbacks and pin conversations for easy access.
Widgets are now part of the CarPlay experience, so you can focus on just the data you want, like the current weather conditions. And Live Activities appear on the CarPlay screen, so you’ll know when that coffee you ordered will be done or when a friend’s flight is about to arrive.
The new CarPlay interface with Liquid Glass.
The Wallet app is already home to Apple Card, Apple Pay, electronic car keys, tickets and passes. In iOS 26, you can create a new Digital ID that acts like a passport for age and identity verification at TSA screening for domestic air travel (though it does not replace a physical passport).
The app can also let you use rewards and set up installment payments when you purchase items in a store, not just for online orders. And with the help of Apple Intelligence, the Wallet app can help you track product orders, even if you did not use Apple Pay to purchase them. It can pull details such as shipping numbers from emails and texts so that information is all in one place.
The Wallet app supports legal state IDs and national IDs for age and identity verification.
Although last year's WWDC leaned heavily on Apple Intelligence, improvements to the AI tech were less prominent this year, folded into announcements throughout the WWDC keynote.
As an alternative to creating Genmoji from scratch, you can combine existing emojis — "like a sloth and a light bulb when you're the last one in the group chat to get the joke," to use Apple's example. You can also change the expressions of Genmoji you've created from images of people you know.
Combine existing emoji using Apple Intelligence.
Image Playground adds the ability to tap into ChatGPT’s image generation tools to go beyond the app’s animation or sketch styles.
Visual Intelligence can already use the camera to try to decipher what’s in front of the lens. Now the technology works on the content on the iPhone’s screen, too. It does this by taking a screenshot (press the sleep and volume up buttons) and then including a new Image Search option in that interface to find results across the web or in other apps such as Etsy.
This is also a way to add event details from images you come across, like posters for concerts or large gatherings. (Perhaps this could work for QR codes as well?) In the screenshot interface, Visual Intelligence can parse the text and create an event in the Calendar app.
Not everything fits into a keynote presentation — even, or maybe especially, when it's all pre-recorded — and some of the more interesting new features in iOS 26 went unremarked during the big reveal.
The finished version of iOS 26 will be released in September or October with new iPhone 17 models. In the meantime, developers will get access to the first developer betas starting on Monday, with an initial public beta arriving in July. (Don’t forget to go into any beta software with open eyes and clear expectations.)
Follow the WWDC 2025 live blog for details about Apple’s announcements.
iOS 26 will run on the iPhone 11 and later models, including the iPhone SE (2nd generation and later).
This is a developing story.