In Demand Horizon, a book about product innovation, there’s a sidebar that describes Search as a perfect marketplace, marrying user intent with market solutions. This jumped out at me. I consider Search a good marketplace, but far from perfect – for the simple reason that words are hard.
Search was a pretty amazing breakthrough. It connected markets with consumer intent more tightly than anything before it. But search is only as good as what goes into it. That’s why auto-suggest is so popular: we often fall short of expressing our intent, and search engines are good at identifying trends and interests.
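The core idea behind auto-suggest can be sketched in a few lines. This is a minimal, hypothetical illustration assuming completions are ranked purely by query popularity; real engines layer in many more signals (freshness, personalization, location).

```python
# Minimal sketch of frequency-ranked auto-suggest.
# Real search engines combine many signals; here we rank candidate
# completions purely by how often each past query was issued.

def suggest(prefix, query_counts, limit=3):
    """Return up to `limit` past queries starting with `prefix`,
    most popular first."""
    matches = [q for q in query_counts if q.startswith(prefix)]
    return sorted(matches, key=lambda q: -query_counts[q])[:limit]

# Hypothetical query log with popularity counts.
counts = {
    "weather today": 900,
    "weather tomorrow": 450,
    "web search history": 120,
    "visual search": 300,
}

print(suggest("wea", counts))
# → ['weather today', 'weather tomorrow']
```

Even this toy version shows why auto-suggest helps: the engine meets a half-expressed intent with what the crowd most often meant.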
Keywords vs. Natural Language
Progress continues to be made with text-based search. Back when Google trained us to use keywords, it made sense. We’re not all speed typists, so fewer words were better.
Now natural language search is getting better and more relevant. Ask.com, back when it was Ask Jeeves, encouraged us to ask questions, and Bing added natural language search in its first few years. So this is nothing new, but Google recently improved its natural language search efforts with the Hummingbird release, since more of us are searching from our phones and using voice, instead of the keyboard, to define the query.
No-Text Search Queries
Yet this still doesn’t address the core issue of how poorly we all tend to communicate when we use only our words. There’s good research comparing how well we communicate in email with how we communicate directly with another person, and we’re really not that good at it. That’s notable whether we’re talking to a search engine, a personal relation or a professional partner.
Basically, people tend to believe they can communicate more effectively than they actually can. That gets me back to my position that Search is a good marketplace, but still just an early prototype of what will eventually emerge.
Visual Search – The Growing Trend
One of the things emerging is visual search. It’s not to be confused with a product like Oscope, which lets you search by looking through a bunch of pictures (think Google Images). Visual search means using an image of what you want as the defining query of your search.
I spent the past few months advising an Israel-based startup, Imagry, one of the companies in the 10xelerator. They are advancing image recognition technology in a significant way, removing many of the barriers seen in the current marketplace.
Yet even in these limited use cases, what’s available in the visual search space is pretty exciting.
Google Goggles works with certain types of queries, like books, landmarks, text and logos, and it can recognize up to three items at a time. It has limitations with less rigid subjects like plants, animals or clothing.
Amazon Snap It and the Flow app are both visual search engines from Amazon. One lets you use pictures of the covers of books, CDs and DVDs to launch a traditional search query (on Amazon). The other is an augmented reality tool that provides an overlay of all the information Amazon knows about a given book, CD or DVD.
Pounce uses pictures of “real world advertisements” to return shopping results and a click-to-buy option from any picture you take of select store flyers and advertisements.
Kooaba has a long history in visual search, with a number of apps in market based on its technology. Its Shortcut and Deja Vu apps seem most noteworthy, but the full list can be found on the Kooaba apps page.
Kooaba’s basic technology is very similar to Google Goggles, allowing for recognition of print and paintings (including newspapers, magazines, catalogs, billboards and more), many products (including consumer goods with labels or printed packaging), wine labels, and places like landmark buildings, statues and houses.
TinEye is another early player in this space, but with a different approach. TinEye is a reverse image search engine. The intended user base of the site includes owners of copyrighted images who are looking to see if their work is being used online without authorization, as well as brand marketers who are tracking where their brands are showing up on the web.
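The idea behind reverse image search can be illustrated with a toy perceptual hash: reduce each image to a compact bit signature, then compare signatures by how many bits differ. TinEye’s actual matching technology is proprietary; the “average hash” below is just one simple, well-known stand-in for the concept, using tiny hand-made grayscale grids instead of real image files.

```python
# Toy illustration of reverse image search via perceptual hashing.
# Each "image" is a small 2D list of grayscale values (0-255).

def average_hash(pixels):
    """One bit per pixel: set if the pixel is brighter than the
    image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200], [220, 15]]
brightened = [[30, 230], [250, 35]]   # same image, lightly edited
unrelated = [[200, 10], [15, 220]]    # different image

h0, h1, h2 = map(average_hash, (original, brightened, unrelated))
print(hamming(h0, h1))  # → 0: near-duplicate survives the edit
print(hamming(h0, h2))  # → 4: clearly a different image
```

The point is that matching happens on signatures, not raw pixels, which is what lets a reverse image search engine find copies of an image even after resizing or brightness changes.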
Still in the Early Days of Search
Visual search is the emerging technology, but far from the optimal tool. While it’s much more complete in some respects, it still falls short of capturing a full range of intent, meaning and emotion… all the things that combine to make for clearer and more meaningful communication.
That’s why mobile search will soon integrate the other sensory fields available to the phone, including touch, sight and sound, so our temperature, facial expression and tone of voice will be reflected in the queries we make. There are interesting days ahead.
What are your predictions for what will emerge in the search space? Your thoughts and comments are welcome below.