CNBC recently aired a program called “Inside the Mind of Google” that revealed Google is working on a visual search project internally referred to as Google Goggles. The application, which will premiere on Android devices, lets a user snap a photo of anything, and Google then delivers search results based on that image. During a demo for CNBC interviewer Maria Bartiromo, Google Product Manager Hartmut Neven took a photo of the Santa Monica pier, and the application correctly identified the photograph and returned relevant search results.

Thankfully it worked for the purposes of the demo, because back in August of this year, when Google was testing the application with a focus group, participants didn’t like it at all and even called it “useless”, saying they’d rather just type in a query.

Google Goggles is built on technology Google acquired when it purchased Neven Vision in 2006. Neven’s software was originally used for facial recognition, but Google has since become more ambitious. The application is available now for Android devices: just open Android Market and search for “Google Goggles”. More platforms are expected, but no time frame has been given.
[Via: eWeek, Mobile Burn]
Update: A video with a demo of the service has been found on Search Engine Land. Fast-forward to 3 minutes 30 seconds and turn up the volume, since the audio is very faint. Nokia is also working on similar technology, which it calls Point and Find.
Update: Official video from Google: