We recently caught up with Clark Boyd, a visual search expert and regular contributor to Search Engine Watch. We discussed camera-based visual search – the futuristic technology that lets you search the physical world with your smartphone – what it means for the way search is changing, and whether it’s going to become truly commonplace any time soon.

In case any of our readers aren’t up to speed on what ‘camera-based visual search’ actually is, we’re talking about technology like Google Lens and Pinterest Lens: you point your smartphone camera at an object, the app recognizes it, and it then performs a search based on what it identifies.

So you can point it at, for example, a pair of red shoes; the technology will recognize that they are red shoes and pull up search results – such as shopping listings – for similar-looking pairs.

In other words, if you’ve ever been out and about and seen someone with a really cool piece of clothing that you wish you could buy for yourself – now you can.


First of all – what’s your personal take on camera-based visual search – the likes of Google Lens and Pinterest Lens? Do you use these technologies often?

I have used visual search on Google, Pinterest, and Amazon quite a lot. For those who haven’t tried it yet, you can do so within the Google Lens app (now available on iOS), the Pinterest app, and the Amazon app.

In essence, I can point my smartphone at an object and the app will interpret it based not only on what it sees, but also on what it assumes I want to know.
