Apple unveils "Visual Intelligence," a Google Lens-style feature expected to launch with iOS 18

2024-09-10

Apple has announced a new feature called "Visual Intelligence," expected to launch later this year as part of the Apple Intelligence AI suite in iOS 18. Similar in concept to the multimodal AI systems offered by Google and OpenAI, Visual Intelligence lets users "instantly understand everything they see."

Craig Federighi introduced the feature during today's press conference. It is invoked through Camera Control, the new capacitive camera button on the side of the iPhone 16 and 16 Pro: users press and hold the button, then point the phone's camera at an object of interest.

Visual Intelligence is powered by a combination of on-device intelligence and Apple services which, according to Apple, do not store users' photos. For example, users can photograph a restaurant to pull up its operating hours, or point the camera at a flyer to automatically capture details such as the title, date, and location.

Federighi also described the feature as a "gateway to third-party models": users can, for instance, search Google for a bike they spot out in the world, or photograph class notes to get help understanding a concept. Apple has not disclosed a specific release date, saying only that Visual Intelligence will arrive "later this year along with the camera control feature."
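Apple has not published how Visual Intelligence works internally, but the flyer example (reading text and picking out a date on-device) can be sketched with Apple's public Vision framework. The following is a minimal, illustrative sketch only, assuming a `CGImage` input; the function name `extractFlyerDetails` is hypothetical and this is not Apple's implementation:

```swift
import Foundation
import CoreGraphics
import Vision

// Hypothetical helper: runs on-device text recognition on an image of a
// flyer, then scans the recognized text for dates. Illustrative only —
// not Apple's Visual Intelligence code.
func extractFlyerDetails(from image: CGImage) throws -> (lines: [String], dates: [Date]) {
    // On-device OCR via the Vision framework.
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])

    // Collect the top text candidate from each recognized region.
    let lines = (request.results ?? []).compactMap { $0.topCandidates(1).first?.string }
    let text = lines.joined(separator: "\n")

    // Pick out dates (e.g. an event date on the flyer) with NSDataDetector.
    let detector = try NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
    let dates = detector
        .matches(in: text, options: [], range: NSRange(text.startIndex..., in: text))
        .compactMap { $0.date }

    return (lines, dates)
}
```

In a real app the `CGImage` would come from the camera feed; everything here runs on-device, which is consistent with Apple's stated privacy framing, though the actual pipeline behind Visual Intelligence is unknown.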