CamFind: Snap a Picture, Learn More about Any Object

February 28, 2013

5:02 pm

With two weeks left, CamFind is raising money on Kickstarter to bring its vision of the future of search into reality.

That reality might look like this: you see a pair of shoes you like, snap a picture, and see stores nearby that carry them. A photo of a movie poster pulls up a preview; a snapshot of a Starbucks coffee can bring up the word “coffee” in Chinese.

The technology behind CamFind is ten years in the making. Brad Folkens and Dominik Mazur used part of it to build TapTapSee: a camera for the blind that tells them what the object in front of them is. Now, CamFind is combining image recognition and some human crowdsourcing to recognize objects almost instantly.

And not just generic objects: the goal is to have CamFind see not a “leather purse,” but a “leather Louis Vuitton monogram clutch,” for example.

According to Dmitriy Konopatskiy, who handles marketing for CamFind, the app has already achieved 80-90 percent accuracy, which he claims is much better than Google Goggles.

“The evolution of search engines is inevitable. People love taking pictures, and people already have their cell phones. We just combined the three things – we took search, we took the love for cell phones, and we took the love and passion that people have for taking pictures,” says Konopatskiy.

If the Kickstarter project reaches its $60,000 goal, CamFind will use the funds to hire developers and improve the prototype – increasing accuracy, speed, and features.

Update, April 2013: The app is now available. 


Kira M. Newman is a Tech Cocktail writer interested in the harsh reality of entrepreneurship, work-life balance, and psychology. She is the founder of The Year of Happy and has been traveling around the world interviewing entrepreneurs in Asia, Europe, and North America since 2011. Follow her @kiramnewman.
