
What is gesture-based UI?


Gesture-based UI refers to operating an interface with specific physical gestures. Take your smartphone, for instance: you can already interact with your phone without the keypad by swiping, tapping, pinching, and scrolling. The latest smart devices also allow for “touchless” gestures, where users can scroll or click without ever touching the screen.
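
To make that concrete, here is a minimal sketch of how a device might tell a tap from a swipe by comparing how far, and for how long, a touch moved between press and release. The touch_event type, the thresholds, and the callbacks are illustrative assumptions for this sketch, not any particular framework’s API.

/* Minimal sketch: classifying a touch release as a tap, swipe, or long press.
 * The touch_event type and thresholds are illustrative assumptions. */

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

typedef struct {
    int x, y;           /* touch position in pixels  */
    unsigned long t_ms; /* timestamp in milliseconds */
} touch_event;

#define SWIPE_MIN_DIST_PX 40   /* movement below this is treated as a tap */
#define TAP_MAX_TIME_MS   300  /* presses longer than this are not taps   */

static void classify_release(touch_event down, touch_event up)
{
    int dx = up.x - down.x;
    int dy = up.y - down.y;
    double dist = sqrt((double)(dx * dx + dy * dy));
    unsigned long held_ms = up.t_ms - down.t_ms;

    if (dist >= SWIPE_MIN_DIST_PX) {
        /* The dominant axis decides the swipe direction. */
        if (abs(dx) > abs(dy))
            printf("swipe %s\n", dx > 0 ? "right" : "left");
        else
            printf("swipe %s\n", dy > 0 ? "down" : "up");
    } else if (held_ms <= TAP_MAX_TIME_MS) {
        printf("tap at (%d, %d)\n", up.x, up.y);
    } else {
        printf("long press at (%d, %d)\n", up.x, up.y);
    }
}

int main(void)
{
    touch_event down = { 100, 200, 0 };
    touch_event up   = { 180, 205, 150 };
    classify_release(down, up);  /* prints "swipe right" */
    return 0;
}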

With some embedded GUIs, you can simply tilt or shake the device to engage with it, using built-in accelerometers, gyroscopes, or magnetic sensors. Some products on the market today also support advanced camera and sensor technology that picks up facial expressions and eye movements to scroll, click, and interact.
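
As a rough illustration, a shake gesture can be detected by watching for repeated spikes in the accelerometer’s acceleration magnitude within a short window of samples. The thresholds, sample values, and function name below are hypothetical; a real product would read live data from its sensor driver.

/* Minimal sketch: detecting a "shake" gesture from raw accelerometer
 * samples in g. Thresholds and sample data are illustrative assumptions. */

#include <stdio.h>
#include <math.h>

#define SHAKE_THRESH_G   1.8  /* spikes above this count toward a shake  */
#define SHAKE_MIN_SPIKES 3    /* spikes needed within the sample window  */

/* Returns 1 if the window of (x, y, z) samples looks like a shake. */
static int detect_shake(const double (*samples)[3], int count)
{
    int spikes = 0;
    for (int i = 0; i < count; i++) {
        double mag = sqrt(samples[i][0] * samples[i][0] +
                          samples[i][1] * samples[i][1] +
                          samples[i][2] * samples[i][2]);
        if (mag > SHAKE_THRESH_G)
            spikes++;
    }
    return spikes >= SHAKE_MIN_SPIKES;
}

int main(void)
{
    /* A short window of fake samples: the device is jolted three times. */
    const double window[][3] = {
        { 0.0, 0.0, 1.0 }, { 2.1, 0.3, 1.0 }, { 0.1, 0.0, 1.0 },
        { 1.9, 0.5, 0.9 }, { 0.0, 0.1, 1.0 }, { 2.4, 0.2, 1.1 },
    };
    int n = sizeof(window) / sizeof(window[0]);

    puts(detect_shake(window, n) ? "shake detected" : "no shake");
    return 0;
}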

Read our gesture documentation


Where do we commonly see gesture-based UI?

Mobile devices aren’t the only place where gesture-based interactions are already in use. Gestural UI is also commonplace in the gaming, automotive, and medical industries. Popular consoles, such as Xbox, use cameras and sensors to track player movements and gestures for many of their interactive games. Automotive engineers are integrating gesture-based interfaces into their driver information displays so that drivers can change temperature and volume with a touchless gesture in front of the screen. Jaguar Land Rover, for example, teamed up with Cambridge University to develop a gesture-reading system in response to the COVID-19 pandemic:

"In the ‘new normal’ once lockdowns around the world are lifted, a greater emphasis will be placed on safe, clean mobility where personal space and hygiene will carry premiums.”

In hospitals, gesture-based UI is being used to help doctors and surgeons view images and records without having to set foot outside the operating room. These are just a few examples of the many industries exploring the benefits this technology can provide for their customers.

Why touchless gestures matter

Consumers’ attitudes toward public touchscreens have changed quickly since the COVID-19 pandemic started. With social distancing and hygiene top of mind, there is a growing need for “touchless” devices, as more consumers now fear having to physically touch screens in public areas. In fact, experts have already noticed a sudden decline in fingerprint technology shipments worldwide due to hygiene concerns. The demand for touchless ways to pay for items at the grocery store, access money from a banking machine, or sign for packages has already started to fuel the future of touchless GUIs.

Read about the significance of voice and gesture control in embedded products.

Gesture-based UI using Ultraleap Touchfree air-push interaction

The evolution of GUI technologies has also improved UX design and development processes in general. Earlier interfaces quickly became a huge source of frustration for consumers: low-quality touchscreens on smart devices meant poor response times and precision, faulty fingerprint scanning, and even damage from wear and tear. Today, consumers want higher-quality touchscreens with better graphics, faster response times, and touchless GUI designs that are far more hygienic and convenient to use.

Advancements in displays, cameras, and sensors over the past few years have opened the door to better experiences for consumers and developers alike. For example, a recent update to the Google Fit app allows mobile device users to measure their heart rate and respiratory rate to track day-to-day wellness, all using the device’s camera.

The future of embedded GUIs

Experts believe that we are only at the tip of the iceberg when it comes to the gesture capabilities of embedded GUIs. As the technology evolves, we can expect improved responsiveness, moving toward the ability to predict what users are going to do before they even do it. The gaming industry is already developing GUI-enhanced software that uses gaze tracking and even thought-based input to create better virtual reality experiences. The medical industry is moving toward highly advanced GUI devices for high-risk environments that will reduce the transmission of germs and bacteria and speed up necessary processes.

There’s no doubt that demand for gesture-based GUIs is exploding, along with the technology around it. Developers and design teams that are integrating these technologies now are already getting a leg up on their competition. For any embedded GUI team, now is the time to start building touchless gestures into your development framework.

Are you ready to get started with gesture-based GUI development? Check out this webinar with Crank, NXP, and Ultraleap on gesture control, touchless technologies, and hardware platform selection:
