According to Apple expert Mark Gurman from Bloomberg’s Power On Newsletter, Apple is in the process of integrating cameras into the Apple Watch to introduce AI capabilities like Visual Intelligence within the next two years.
Gurman says camera placement will differ by model: on the standard Apple Watch Series, the camera will sit “inside the display,” while on the Apple Watch Ultra it will be on the side of the case, next to the Digital Crown and button. The cameras would let the Apple Watch use AI to perceive its surroundings and surface relevant information to the wearer, mirroring Apple’s rumored plans for camera-equipped AirPods.
Visual Intelligence features, which debuted on the iPhone 16, use the device’s camera for tasks like pulling event details from a flyer and adding them to your calendar, or looking up information about a restaurant. Those features currently rely on AI models from outside companies, but Gurman reports that Apple aims to switch to its own in-house models by 2027, timed to coincide with the anticipated release of the camera-equipped Apple Watches and AirPods.
Bringing Visual Intelligence and other AI features to Apple’s wearables would depend heavily on the leadership of Mike Rockwell, who Gurman reported last week is now in charge of getting the delayed Siri LLM upgrade back on track. Rockwell previously led the Vision Pro effort and will reportedly continue to oversee visionOS. That software is expected to power another AI-heavy Apple wearable that is likely still several years away: AR glasses in the vein of the Orion concept Meta showed off last year.