iOS 26 screenshots could be an intriguing preview of Apple’s delayed Siri rework


When Apple first introduced Visual Intelligence, the feature let you point your iPhone's camera at things around you, then run a Google image search or ask questions via ChatGPT. At WWDC 2025, the company announced updates to expand the usefulness of Visual Intelligence, including bringing it to the screenshot system. To quote the company's press release: "Visual intelligence already helps users learn about objects and places around them using their iPhone camera, and now enables users to do more, faster, with the content on their iPhone screen."

This reminded me of the "onscreen awareness" that was described as one of Siri's capabilities when Apple announced Apple Intelligence last year. In that press release, the company said: "With onscreen awareness, Siri will be able to understand and take action with users' content in more apps over time." Though not quite the same, the updated screenshot experience built on Visual Intelligence more or less lets your iPhone surface contextual actions based on your screen's content.

In a way, it makes sense. Most people are already accustomed to taking a screenshot when they want to share or save important information they see on a website or an Instagram post. Instead of making Apple Intelligence actions something you have to ask Siri for, Apple has embedded them in a tool you already reach for (or at least, in its updated version).

Basically, on iOS 26 (on devices that support Apple Intelligence), pressing the power and volume buttons to take a screenshot will bring up a new full-page view. Instead of a small thumbnail of your capture appearing in the bottom-left corner, you'll see a larger preview with options to edit, share or save it, along with Apple Intelligence-based answers and actions. Buttons to ask ChatGPT and to run a Google image search sit in the bottom-left and bottom-right corners, respectively.

Depending on what's in your screenshot, Apple Intelligence can suggest different actions at the bottom of the image. These might include adding an event to your calendar, identifying a plant, animal or food, or finding where to buy a similar-looking item. If there's a lot going on in your screenshot, you can draw over an item to highlight it (much like selecting an object to remove from a photo) and get information specific to that part of the image.

Third-party apps and services such as Google, Etsy and Pinterest can also plug into this space, so their actions may appear here as well. For example, if you spot an item you like and highlight it, you could shop for it on Etsy or save it to a board on Pinterest.

This is the one aspect of the Visual Intelligence update that might grate on people who, like me, don't want their phone constantly nudging them to buy things. Thankfully, it sounds like you can turn off the new interface and stick with the existing screenshot system.

Apple's examples of Siri's onscreen awareness felt somewhat similar, in that they also drew on what's on your screen. Per last year's press release: "For example, if a friend texts a user their new address in Messages, the receiver can add this address to their contact card."

Like Visual Intelligence in screenshots, it would scan the content on your screen for relevant information and send it where it's most useful (like Contacts or Calendar). But along with the rest of Siri's promised new era, onscreen awareness was pitched as part of a deeper integration, one where the assistant could interact with first- and third-party apps across your phone. That way, you could ask the assistant to open an article you'd added to your Reading List in Safari, or send photos from a particular event to a contact.

Apple still hasn't delivered those upgrades to Siri, and during the WWDC 2025 keynote, Craig Federighi said only that they would be discussed later this year. As we await that status update, the changes coming to screenshots could be a preview of things to come.

If you buy something through a link in this article, we may earn a commission.
