From CNET: While Pixel 9 and Samsung Galaxy S25 owners have had access to Gemini Live's camera mode for a while now, at its I/O conference earlier this month Google announced that the feature was rolling out to all Android and iOS users, too. The big news here is that iPhone owners now have access to one of the coolest AI features out there, especially since all other Android users supposedly got access to the camera mode back in April.
If you're unfamiliar with the camera mode feature, to put it in simple terms: Google gave Gemini the ability to see, as it can recognize objects you put in front of your camera.
It's not just a party trick, either. Not only can it identify objects, but you can also ask questions about them -- and it works well for the most part. In addition, you can share your screen with Gemini so it can identify things you surface on your phone's display. When you start a live session with Gemini, you now have the option to enable a live camera view, where you can talk to the chatbot and ask it about anything the camera sees.
I spent some time with it when it showed up on my Pixel 9 Pro XL in early April and was pretty wowed overall. I was most impressed when I asked Gemini where I misplaced my scissors during one of my initial tests.
"I just spotted your scissors on the table, right next to the green package of pistachios. Do you see them?"