Portland-based OnTheGo Platforms (OTG) today announced the beta release of its Augmented Reality Interface SDK, known as Ari. The new SDK lets developers building for smart glasses, such as Google Glass, implement hand-gesture controls for users.
“We’ve been moving quickly in this land-grab space of wearables,” says Ryan Fink, OTG CEO.
The OTG team recognized that while voice and touch controls work well on smartphones, smart glasses demand a different kind of interface. Part of the inspiration behind Ari, then, is to give developers the tools and momentum they need to create compelling apps for the smart glasses market.
Developers, enterprises, and smart glass OEMs can integrate Ari directly into their apps, products, or hardware. Ari uses a single, outward-facing camera to track the user's hand motions and gestures, letting users interact naturally with their displayed content.
Since all current smart glass technology on the market runs Android, Ari was built on an Android stack. OTG wants to remain device agnostic and distribute Ari across all major smart glass devices for the foreseeable future, and the company hopes the result will be a universal smart glasses user experience regardless of hardware.
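To make the camera-driven integration model concrete, the sketch below shows what a gesture-listener hookup might look like in plain Java. Every name here (`GestureDetector`, `GestureListener`, the `Gesture` values) is an illustrative assumption for this article, not OTG's actual Ari API, and the camera pipeline is stubbed out with a simulation method.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a gesture-driven SDK integration.
// None of these names come from OTG's Ari SDK; they are assumptions.
public class GestureSketch {

    // A few plausible hand gestures a single outward-facing camera might classify.
    enum Gesture { SWIPE_LEFT, SWIPE_RIGHT, PINCH, OPEN_PALM }

    // An app would implement a callback like this to react to recognized gestures.
    interface GestureListener {
        void onGesture(Gesture gesture);
    }

    // Minimal dispatcher standing in for the SDK's recognition pipeline.
    // In a real system, camera frames would be analyzed and classified here.
    static class GestureDetector {
        private final List<GestureListener> listeners = new ArrayList<>();

        void addListener(GestureListener listener) {
            listeners.add(listener);
        }

        // Stands in for the SDK recognizing a gesture from camera input.
        void simulateRecognition(Gesture gesture) {
            for (GestureListener l : listeners) {
                l.onGesture(gesture);
            }
        }
    }

    public static void main(String[] args) {
        GestureDetector detector = new GestureDetector();
        List<Gesture> received = new ArrayList<>();
        detector.addListener(received::add);
        detector.simulateRecognition(Gesture.SWIPE_LEFT);
        System.out.println("Received: " + received);
    }
}
```

The listener pattern here mirrors how Android apps typically consume sensor and input events, which is one reason a camera-based gesture SDK fits naturally on an Android stack.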