The newest generation of Android phones is getting a Google Lens-like feature that puts Search front and center in every app by letting you highlight parts of images or text to generate search results. The newly announced Samsung Galaxy S24 and last year’s Pixel 8 models are all getting access, and notably, it’s the first time any phone has offered AI Search features natively without requiring a beta sign-up.

The Circle to Search feature works much like Google’s multisearch, but it can be invoked from any app, whether you’re looking at a picture, text, or video. You highlight the object with a gesture, which could be a tap, a swipe, or a circle drawn with your finger, and a search bar pops up from the bottom of the screen with information about the image—including pricing on products—or search results based on the text. You trigger the feature with a long press on the navigation handle, or on the home button for those who refuse to get rid of their precious back button.

So, if you’re browsing Reddit and spot an image, you don’t have to resort to a reverse image search to figure out where your favorite meme came from. As for text-based Circle to Search, you only need to highlight the text to get search results, as if you had typed the query into Google. When you’re done, swipe down to return to your previous app without having to close the new interface manually.
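Google hasn’t said how Circle to Search is wired up under the hood, but the text path behaves like a standard search handoff. As a rough sketch only, and not Google’s actual implementation, here’s how an ordinary Android app could fire an equivalent web search for a highlighted string using the platform’s stock search intent:

```kotlin
import android.app.SearchManager
import android.content.Context
import android.content.Intent

// Illustrative sketch only: Circle to Search's internals aren't public.
// This is the stock Android equivalent of "search the highlighted text,"
// handing a selected string to the device's default search app.
fun searchHighlightedText(context: Context, highlighted: String) {
    val intent = Intent(Intent.ACTION_WEB_SEARCH).apply {
        putExtra(SearchManager.QUERY, highlighted)
    }
    context.startActivity(intent)
}
```

On most phones, the system routes that intent to the same Google app that hosts Circle to Search.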

Circle to Search arrives Jan. 31, exclusively on the Pixel 8, Pixel 8 Pro, and the shiny new Samsung Galaxy S24 series, at least for now. The feature should make its way to the wider Android ecosystem over time, but for the moment, only those few phones get access. Circle to Search is baked into the Google app, so your phone needs that application enabled and up to date for the feature to work.
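For the curious, you can check whether that app is present and enabled yourself; a minimal sketch, assuming the Google app’s usual package name (com.google.android.googlequicksearchbox):

```kotlin
import android.content.Context
import android.content.pm.PackageManager

// Illustrative sketch, assuming the Google app's usual package name.
// Returns true if the Google app is installed and enabled on this device.
fun isGoogleAppEnabled(context: Context): Boolean = try {
    context.packageManager
        .getApplicationInfo("com.google.android.googlequicksearchbox", 0)
        .enabled
} catch (e: PackageManager.NameNotFoundException) {
    false
}
```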

But that’s where things get a little more AI-heavy. This week, Google is adding even more search options on both Android and iOS through the multisearch function in Google Lens. Now, instead of just returning matching images and text, Lens can also generate a response from the company’s still-in-beta Search AI. Say you take a photo of your dying azalea and ask Lens how to care for the plant: Google should generate a text answer explaining how to help your shrub survive, so long as it doesn’t hallucinate and serve up advice that will kill your precious plant instead.
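Google publishes no public API for SGE or multisearch, so any code here is purely hypothetical; these types just model the shape of the flow described above, an image plus a question going in, a generated answer with source links coming back:

```kotlin
// Purely hypothetical types — Google exposes no SGE/multisearch API.
// They only model the flow described above: an image plus a question in,
// a generated answer plus the links it cites out.
data class MultisearchQuery(val imageBytes: ByteArray, val question: String)
data class MultisearchAnswer(val generatedText: String, val sourceLinks: List<String>)
```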

That function runs through the current Search Generative Experience (SGE) beta. If you haven’t seen these in-Search responses yet, they appear as collapsible answers with links underneath to the videos or articles where the AI supposedly found (read: stole) the information. With the new AI-connected multisearch, you can get a response from the in-Search AI chatbot, and it works with the new Circle to Search as well.

It’s the first time Google has pushed SGE toward primetime, or at least toward all those current Pixel 8 and future Galaxy S24 users, with no sign-up required. Even more Search AI features could make their way to Android users directly through the company’s AI beta, though Google continues to call these features experimental. With more users getting easy access to text generation directly on their phones, it’s getting harder to tell who the regular users are and who the company’s guinea pigs are.
