Apple has launched a new feature called Visual Intelligence to improve how users interact with the world through their iPhones. The tool, comparable to Google Lens, lets users point the phone’s camera at real-world objects to get information about them. The feature debuted with the iPhone 16 lineup, and alongside the iPhone 16e announcement Apple confirmed it is also coming to the iPhone 15 Pro.
Visual Intelligence works by analyzing what the camera sees. Apple hasn’t shared much about how it operates under the hood, but like other Apple Intelligence features it appears to depend on having enough RAM, which is why support is limited to recent models. Its main purpose is to surface useful information about whatever the camera is pointed at, although Apple hasn’t fully spelled out the specific types of information it can offer.

Currently, Visual Intelligence is available on the iPhone 16 and iPhone 16 Pro, which launched with the feature. The new iPhone 16e, which lacks the Camera Control button found on the other iPhone 16 models, still supports it: Apple has made it accessible through the Control Center, and it can also be assigned to the Action Button.
The iPhone 15 Pro didn’t originally get the feature because it lacks the Camera Control button, but Apple plans to add it in a future software update. The company has confirmed that it will be available through the Action Button and Control Center on the iPhone 15 Pro as well. Apple didn’t specify which software version will include it, but speculation points to iOS 18.4.
The rollout of Visual Intelligence is a notable move for Apple, expanding the iPhone’s camera beyond just taking photos: users can identify objects and pull up related information in seconds. Bringing the feature to a range of iPhone models, including the more affordable iPhone 16e, also makes it far more widely accessible.
Source: daringfireball
Published: Feb 20, 2025 08:00 pm