Apple has been known to play the long game. It rarely rushes a feature out the door, and when it finally makes a move, it tends to go all the way. The latest report from Bloomberg's Mark Gurman – who has arguably the most reliable track record when it comes to Apple leaks – suggests that iOS 27 will bring one of the most meaningful upgrades to the iPhone camera experience in years. And it doesn't involve a new sensor, a bigger aperture, or any hardware at all. It's all about making your iPhone's camera smarter – powered by Siri and Apple Intelligence.
Right now, a lot of iPhone users don't even know Visual Intelligence exists. That's not surprising, because the way Apple implemented it in iOS 26 wasn't exactly designed for discoverability. You had to either press and hold the Camera Control button – a hardware button only available on iPhone 16 and newer models – or dig into Control Center to find it. For the majority of iPhone users, especially those on older devices or anyone who hasn't explored every corner of their phone's interface, Visual Intelligence has been hiding in plain sight.
Apple appears to know this. According to a Bloomberg report, the company is planning to move Visual Intelligence directly into the Camera app in iOS 27, giving it the same first-class treatment as Photo, Video, Portrait, and Panorama modes. The new mode will reportedly be called “Siri mode,” and when activated, the standard white shutter button will transform into an Apple Intelligence-styled icon – making it immediately clear that something different is happening. This isn’t just a cosmetic tweak. It’s a signal that Apple wants this feature to become a core part of how people interact with their cameras every single day.
Think about what Visual Intelligence already does and you start to see why Apple is putting it front and center. In its current form, the feature can identify objects, translate text, pull up search results from Google or ChatGPT, and help you figure out what you're looking at in real time. But the version coming in iOS 27 is reportedly going to be significantly more capable – and notably, much less dependent on ChatGPT. That's a meaningful shift. Apple has been working to build out its own on-device AI capabilities through Apple Foundation Models, and reducing the reliance on third-party AI services like OpenAI is very much in line with where the company wants to head.
So what can the upgraded Visual Intelligence actually do? The new capabilities that have leaked so far are genuinely practical. One of the most talked-about features is the ability to scan a nutrition label on food packaging and automatically log that dietary information into the Health app. It sounds simple, but for anyone who tracks their food intake, it removes an incredibly tedious manual step. Another feature will let you point your camera at a business card or any contact information – a phone number on a flyer, an address on a receipt – and have it automatically pulled into your Contacts app. These aren’t flashy demo moments. These are the kinds of features people will actually use on a Tuesday morning when they’re rushing around and don’t have time to type anything manually.
There’s also the broader upgrade to the Visual Intelligence experience itself. Gurman previously reported that hidden code found in Apple’s software pointed to the ability to convert physical tickets and paper passes into digital Wallet passes by simply pointing your camera at them. The era of digging through your junk drawer for a concert ticket stub could genuinely be coming to an end. Safari is reportedly also getting AI-powered tab grouping as part of this same wave of iOS 27 features, which suggests Apple is doubling down on making intelligence feel ambient and ever-present across the entire iOS ecosystem.
What makes the “Siri mode” framing so interesting is the branding choice itself. Apple has been under significant pressure to prove that Siri can compete with the likes of ChatGPT, Google Gemini, and other AI assistants that have leapfrogged it in terms of conversational ability and real-world usefulness. Placing a Siri-branded mode directly inside the Camera app – one of the most-opened apps on any iPhone – is a clear statement of intent. It says, “Siri is here, Siri is capable, and you’re going to see it every time you open your camera.” It’s less about the technology itself and more about changing user perception, which has always been half the battle for Apple’s AI ambitions.
Beyond the camera, Gurman’s reporting paints a picture of a much broader Siri overhaul coming in iOS 27. This includes a standalone Siri app and a persistent chatbot-like interface – essentially Apple’s answer to what ChatGPT and Gemini have been offering for some time. Siri is also expected to get better at handling multi-step commands, so instead of asking it one thing at a time, you could string together a more complex request and actually get a coherent result. Whether Apple delivers on that promise is something we’ll find out soon enough.
The big reveal is expected on June 8, when Apple kicks off WWDC 2026 at Apple Park in Cupertino. That’s when iOS 27 – along with updates to macOS, watchOS, and the rest of Apple’s operating systems – will be announced to the world. Developer betas will follow shortly after, and the public release is expected in September, likely alongside the iPhone 18 lineup. For now, what we have is a compelling preview of an Apple that is clearly trying to make AI feel less like an experiment and more like something genuinely useful – something that works for you quietly in the background, or right there in the app you already open dozens of times a day.
Whether Siri mode in the Camera app will live up to the hype is a fair question. Apple has made big promises about Siri's AI capabilities before, only to deliver late or underwhelm. But integrating Visual Intelligence into the Camera app, making it harder to ignore and easier to use, is exactly the kind of thoughtful, user-focused move that Apple does best. If the features work as advertised, iOS 27 could be the update that finally makes people think of Siri the way they think of Google Lens – as a camera tool that genuinely gets things done.
