One Apple Intelligence feature that has landed successfully is Visual Intelligence. It uses your iPhone’s camera to recognize the items and places you encounter in daily life and pull up information about them.
For instance, you can take a picture of a pizza restaurant to learn its operating hours, or use the camera to identify a plant and receive guidance on its care. If you’ve previously used Google Lens, you’ll understand how this works.
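If you’re curious how this kind of on-device recognition works, Apple doesn’t expose Visual Intelligence itself to developers, but the Vision framework offers a much simpler public building block that illustrates the general idea. Here’s a minimal sketch of classifying a photo’s subject on-device; the function name and the 0.3 confidence cutoff are just illustrative choices, not anything from Apple’s feature.

```swift
import UIKit
import Vision

// A minimal sketch of on-device image classification with Apple's Vision
// framework. Visual Intelligence's own API is not public; this only shows
// the kind of recognition the feature performs.
func identifySubjects(in image: UIImage) throws -> [(label: String, confidence: Float)] {
    guard let cgImage = image.cgImage else { return [] }

    // VNClassifyImageRequest runs Apple's built-in image classifier on-device.
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    // Keep only labels the model is reasonably confident about.
    return (request.results ?? [])
        .filter { $0.confidence > 0.3 }
        .map { ($0.identifier, $0.confidence) }
}
```

Run against a photo of a houseplant, a call like this typically returns broad labels such as “plant” or “flower” with confidence scores, which is the raw signal a feature like Visual Intelligence layers richer lookups on top of.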
However, this capability is not universally available. You need iOS 18.2 or later on the iPhone 16 lineup, meaning the iPhone 16, iPhone 16 Plus, iPhone 16 Pro, or iPhone 16 Pro Max; iOS 18.3 or later on the iPhone 16e; or iOS 18.4 or later on the iPhone 15 Pro and iPhone 15 Pro Max. You also need Apple Intelligence turned on via the Apple Intelligence & Siri settings.
How to launch Visual Intelligence
If you have an iPhone 16 with a Camera Control button on the right-hand side, you can press and hold this button to bring up the camera and Visual Intelligence.
If you’ve got an iPhone 16e, iPhone 15 Pro, or iPhone 15 Pro Max, you’ve got a few different options to choose from:

- Assign Visual Intelligence to the Action button via Settings > Action Button.
- Add the Visual Intelligence control to Control Center.
- Customize your Lock Screen and swap one of its two shortcuts for Visual Intelligence.
How to use Visual Intelligence
There are all kinds of ways to use Visual Intelligence. Most of the time, it’ll be able to recognize and respond to prompts about anything you show it, so try experimenting and see what you get.
Beyond those freeform prompts, you’ve got two dedicated features, which appear as buttons onscreen whenever Visual Intelligence is looking at something: Ask, which sends what you’re viewing to ChatGPT so you can ask questions about it, and Search, which runs a Google image search on it.
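Third-party apps can’t call the Ask and Search buttons directly, but VisionKit’s visual look-up is the closest public analogue to what the recognition side of the feature does. A minimal sketch, assuming your own app already has a UIImageView and a UIImage to hand (both names here are placeholders):

```swift
import UIKit
import VisionKit

// A minimal sketch of VisionKit's visual look-up, the closest public
// analogue to Visual Intelligence's recognition. `imageView` and `photo`
// are assumed to come from your own app; requires iOS 16 or later.
@MainActor
func enableLookUp(on imageView: UIImageView, photo: UIImage) async throws {
    // Attach the system interaction that overlays look-up badges on subjects.
    let interaction = ImageAnalysisInteraction()
    imageView.addInteraction(interaction)

    // Analyze the photo for live text and visual look-up results.
    let analyzer = ImageAnalyzer()
    let configuration = ImageAnalyzer.Configuration([.text, .visualLookUp])
    interaction.analysis = try await analyzer.analyze(photo, configuration: configuration)
    interaction.preferredInteractionTypes = .automatic
}
```

With the interaction attached, tapping a recognized subject in the image view brings up the system’s look-up sheet, much like tapping a result inside Visual Intelligence.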
To exit Visual Intelligence at any time, swipe up from the bottom of the screen.