Learn More With Visual Intelligence
Visual Intelligence is a handy new feature that started out exclusive to the iPhone 16 models because it relied on the new Camera Control. However, Apple also brought it to the iPhone 16e. Since that model doesn't have the Camera Control, Apple updated Visual Intelligence to work with the Action button instead, a change that has paved the way for the feature to come to the iPhone 15 Pro when iOS 18.4 is released next month.
With this feature, you can use your iPhone's camera to learn more about anything you point it at. For instance, you can point your camera at an animal or a plant and instantly get details about it. Point it at a restaurant, and you can search the web for things like its menu or phone number.
Additionally, you can ask ChatGPT about whatever you're pointing at. For instance, you can point your camera at an equation and ask ChatGPT for the solution. You won't just see the answer; it will also show you how to work through it yourself.
To use Visual Intelligence on an iPhone 16, iPhone 16 Plus, iPhone 16 Pro, or iPhone 16 Pro Max, press and hold the Camera Control on the lower right side of your phone.
On an iPhone 16e, you'll need to add the Visual Intelligence control to Control Center or assign it to the Action button. You can also do this on an iPhone 15 Pro or iPhone 15 Pro Max right now if you're running the iOS 18.4 beta.
Once you open Visual Intelligence, point your camera at an object, then tap the button on the bottom left of the screen to ask ChatGPT or the button on the bottom right to search the web.