Integrating gesture control into AR apps - a customer case

As a company, we started our journey in 2012 with the founding idea of turning AR into an efficient tool for enterprises. Back then, a major obstacle to the adoption of augmented reality and smartglasses was the lack of user-friendly interaction. There were devices and content, but no sensible way for a user to manipulate the content. We decided to change that and created the first version of the Augumenta Interaction Platform (AIP) SDK, a toolset that enables developers to integrate simple and efficient hand gestures into their smartglasses AR apps.


Since the early days, our offering has evolved from one tool into a whole suite of apps and tools for enterprise customers, and all of our solutions support a wide variety of interaction methods. But our advanced gesture control technology holds its place at the core of each and every app, and we want to keep sharing that know-how with others. And so we have done: developers in every corner of the world are currently working with our SDK.

We had the opportunity to conduct a short interview with one of our customers. It was a great privilege to talk to Mr. Joeie Oon, co-founder of FXMedia, and ask a few questions to shed light on an industrial AR project that uses gesture control.


FXMedia

FXMedia, founded in 1994, specializes in advanced visualization technologies: AR, VR and MR. They help customers and audiences see things in completely new ways, enhancing understanding, creating a wow effect and getting the message across more effectively. Their current team of over 30 people consists of creative and dedicated experts focusing on Web, Mobile and Immersive Multimedia, including Augmented (AR), Virtual (VR) and Mixed Reality (MR) development.


What AR application did you develop, and why?

We developed a trial app for industrial use. The app uses AR overlay information to assist onsite service personnel in getting the job done and in recording the completed tasks. The app is currently used as a proof of concept when applying for government funding to build the next release.

The solution aims to address problems such as manpower shortages caused by high turnover rates, knowledge retention (best practices and work steps can be recorded), and inconsistency in service delivery, while improving the speed and accuracy of service quality.


Why did you choose to develop an AR solution for smartglasses?

We chose it for hands-free operation: UX is important and should enhance, not hinder, the user's efficiency. The app currently works on the Epson Moverio BT-300 and BT-350.


What type of interaction did you choose for this app?

We used the Augumenta toolkit to include:

  • Palms open and close gestures to open and close overlay information.
  • Thumbs up and down to signify check passed or failed.

UX is important because the end users aren't necessarily IT/computer literate; they come from all walks of life. The gestures we use must be as simple and common as possible.
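The gesture-to-action mapping described above can be pictured as a simple dispatch table that routes recognized gestures to UI actions. The sketch below is illustrative only and is not the Augumenta SDK API; the dispatcher class, gesture labels and action strings are all hypothetical:

```python
from typing import Callable, Dict, Optional

class GestureDispatcher:
    """Hypothetical sketch: routes gesture labels from a recognizer to actions."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], str]] = {}

    def on(self, gesture: str, handler: Callable[[], str]) -> None:
        """Register an action to run when `gesture` is recognized."""
        self._handlers[gesture] = handler

    def dispatch(self, gesture: str) -> Optional[str]:
        # Unknown gestures are ignored, so accidental hand movements
        # do not trigger any action.
        handler = self._handlers.get(gesture)
        return handler() if handler else None

# Mapping the four gestures mentioned in the interview to actions.
dispatcher = GestureDispatcher()
dispatcher.on("PALM_OPEN", lambda: "overlay shown")
dispatcher.on("PALM_CLOSE", lambda: "overlay hidden")
dispatcher.on("THUMB_UP", lambda: "check passed")
dispatcher.on("THUMB_DOWN", lambda: "check failed")
```

Keeping the mapping this small reflects the point above: a handful of simple, common gestures is easier for non-technical users to learn than a large gesture vocabulary.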


Why did you choose Augumenta SDK for your project?

We learned about your toolkit via Epson's website. The built-in gesture support on Epson devices is limited. We chose the Augumenta SDK because it is simple and easy to integrate into the Unity platform, which we specialize in.


How did you find the SDK to work with?

For now, we have only scratched the surface of the toolkit, using pre-defined gestures. So far, the experience has been really smooth and easy. We are still using the SDK for the pilot project; further use depends on the take-up rate and industry acceptance.


What kind of future do you envision for AR?

AR has been around for a long time and is here to stay. With advancements in technology, edge computing devices and 5G, AR will boom! We are only scratching the surface of how AR can change everyday life, but it is not hard to imagine what else it can do for us. What we see in movies is coming closer and closer to reality.



Intuitive industrial AR

Today, augmented reality applications and devices are operated in a multitude of ways. No single method works for all end users or industrial use cases: end users come from different training and skill backgrounds, and the environment imposes many kinds of restrictions. Users must have the freedom to pick the type of interaction that works for them, and that method should be as intuitive as possible, without a steep learning curve. Gesture control is, without a doubt, such a method. There is hardly anything more natural for a human being than communicating with the world through gestures.


For more info and a free trial, check out the SDK website.


Tags: augmented reality