
Visual Intelligence Made Using Camera Control on My iPhone 16 Worth It

One of the biggest selling points of the iPhone 16 hardware is the Camera Control button. It’s a small physical button with capacitive capabilities, located toward the bottom of the phone’s right side. With the initial launch of iOS 18, a single press launches the camera app of your choice, and you can use light presses and swipe gestures to adjust camera settings. It’s a good idea, but it has a few flaws that prevent it from being a great shutter trigger.

But now we have iOS 18.2, which brought many new Apple Intelligence features to our phones, especially if you own an iPhone 16. With iOS 18.2, Apple finally added Visual Intelligence, a Google Lens-like feature for the iPhone.

After playing around with the latest update, I’m happy to report that Visual Intelligence is a real game-changer for camera control, and that the good but awkward button is finally worth using.

Camera Control let me down at first

Andy Boxall / Digital Trends

When I received my iPhone 16 Pro on launch day, I couldn’t wait to try Camera Control for all my photography needs. But as I used it as a shutter trigger and a way to adjust settings, problems started to arise.

Firstly, the position of Camera Control is not ideal for use as a shutter trigger. It sits more toward the center of the frame than near the corner, so if you’re taking a photo in landscape orientation, you may still need to stretch to reach it. For me, that meant part of my thumb would end up in front of the screen, obstructing it. This would be an even bigger problem on the iPhone 16 Pro Max because of its size.

Christine Romero-Chan / Digital Trends

Another issue I encountered was that pressing Camera Control introduced slight camera shake, which could blur a still image. Adjusting the pressure sensitivity helped a bit, but there was still more shake than when tapping the on-screen shutter button.

When the iPhone 16 first launched, I struggled with how much pressure a half-press required to access the camera settings. Apple seems to have fixed this in recent updates, but I still find it faster to just use the touchscreen. A few months after launch, I pretty much only use Camera Control to launch the Camera app, and I continue to take photos with the on-screen shutter to make sure they aren’t blurry.

Visual Intelligence is what Camera Control needed

Christine Romero-Chan / Digital Trends

Before iOS 18.2, Camera Control was, for me, just a camera-only Action button. But now that I’ve updated my phone and finally have access to Visual Intelligence, I’m using Camera Control more.

To activate Visual Intelligence, press and hold Camera Control. A viewfinder appears so you can point the camera at something in the real world. From there, you can press the shutter/Camera Control button to take a quick capture (one that isn’t saved to your library) before asking about it, or select Ask or Search. The Ask option defaults to asking ChatGPT a simple “What is this?”, or you can ask for more details about what you’re looking at. Search displays Google results related to whatever you’re pointing at.

Jesse Hollington / Digital Trends

What you can do with Visual Intelligence depends on what you point your camera at. So far, I’ve used it to identify plants, animals, and random objects. But you can also use it to look up details about points of interest, businesses, and services, pull up contact details, translate text, and more.

Although I haven’t had much time with it since installing iOS 18.2, I can see myself using this feature quite a bit when I’m on the go. Camera Control’s placement also works better for Visual Intelligence than it does as a shutter button. I’m right-handed, so I usually hold my phone that way, with my thumb resting on Camera Control. I can easily use Visual Intelligence with one hand, unlike when using Camera Control as a camera trigger.

I no longer regret buying my iPhone 16 Pro

Nirave Gondhia / Digital Trends

I’ve upgraded my iPhone every year since the beginning, but this was the first year I had doubts, at least at first. When the iPhone 16 series launched, Apple Intelligence didn’t ship with it, so while the hardware was good, the software felt incomplete.

But now that Apple has rolled out the much-hyped Apple Intelligence features it’s been advertising, I’m happy with my iPhone 16 Pro purchase. In my opinion, Visual Intelligence, along with quick access to the camera, is the main purpose of Camera Control. Add to that the fact that the smaller iPhone 16 Pro now has 5x optical zoom, and yes, I’m a happy camper.

From what I’ve seen, it doesn’t seem like many people have used Camera Control much since its debut; I certainly only used it for one thing. But now, with iOS 18.2 and Visual Intelligence, I think Camera Control might be my favorite new iPhone feature.