Earlier this month, Apple updated a section of its website describing how it collects and uses images for the Look Around feature in Apple Maps, its equivalent of Google Maps' Street View. The change, spotted by 9to5Mac, reveals that beginning in March 2025, Apple will use images and data gathered during Look Around surveys to "train models that support Apple products and services, including those related to image recognition, creation, and enhancement."
Apple captures images and 3D data using vehicles and backpacks equipped with cameras, sensors, and other devices, including iPhones and iPads, to improve Apple Maps. As part of its commitment to privacy, the company says that any images published in Look Around will have faces and license plates blurred, and that only these blurred images will be used to train models. Apple also accepts requests to blur specific houses, though such blurring is not applied by default.
We have reached out to Apple to clarify which specific models will be trained on these images and will update this story if we hear back. Several Apple Intelligence features rely on image models, including Image Playground, the Clean Up tool in the Photos app (which can remove objects from an image), and the image recognition that powers photo search in the app.