Human-computer interaction via human machine interface (HMI)—using touch-sensing devices, such as keyboards, buttons, sliders and touchpads—is something we have been accustomed to since the beginning of the computer era. Today, touch-free technologies are gaining more and more attention, taking user experience to a completely new level of engagement. Thanks to Microsoft Kinect and Apple motion coprocessors, touchless interaction through body gesture recognition has been introduced into multiple industries, bringing in a fresh perspective on the human–computer interaction paradigm and resulting in productive ideas and unique user experiences.

Everything Begins with an Idea

Natural User Interface (NUI) models allow users to interact with a device through actions that are natural, intuitive and common to everyday human behavior. Powered by Kinect sensors, user interfaces can react to human gestures and body movements, providing an immediate and seamless response. Motion sensing and gesture recognition as part of NUI interactions introduce new approaches to interactive user experience in the areas that require audience attraction and involvement, such as education, entertainment, public relations, and marketing. Presentations, training sessions, and promotional events can become highly interactive and engaging with NUI applied.

To that end, earlier this year we applied Microsoft Kinect motion technology to enhance audience engagement at the Behance Portfolio Review held for the local designer community. This volunteer-organized event was initiated by our company’s UX team with the goal of bringing together creative professionals for knowledge sharing and networking.

Bringing Innovation into Event Management

The depth of the audience engagement is one of the keys to successful events. That is why organizers sometimes opt to do things differently and look for alternative ways to engage with attendees. Inspired by the NUI trend, we started experimenting with touch-free interaction to develop our own unique design patterns.

The concept of the interactive installation began with an idea to bring the Behance online experience into physical reality, making the evaluation process not only transparent and constructive, but also interactive and highly engaging. We strived to completely reimagine the way people review and evaluate Behance showcases. Our intent was to let attendees review works from the Behance website and vote for them in real time using simple hand movements. While preparing the interaction scenario for the installation interface, we aimed at both high responsiveness and simplicity. We chose swipe gestures to navigate through the selection of works, up and down swipes to scroll a portfolio, and a two-thumbs-up gesture to appreciate a work while voting.

To power this concept we opted for Microsoft Kinect motion technology, which enables body gesture recognition. From the visual perspective, the intent was to create a unique UI in the form of a 3D environment that would let users experience a picture’s depth while it was projected onto the wall with the lights in the room dimmed down. This idea took shape as a concept of an interactive installation powered by Kinect technology, intended to break the wall between the digital and the real experience.

Shaping the User Experience

Overcoming the pitfalls of gesture recognition

Kinect enabled the use of body gestures as UI controllers, so people didn’t need their gadgets to interact with the installation while receiving live feedback on their works and reviewing the works of others. This scenario seemed like a great opportunity to boost our event. However, implementing this idea turned out to be even more challenging than expected.

NUI is an emerging technology, so it comes with a number of technological constraints. With touch devices, you receive tactile feedback even without visual feedback from the user interface. With NUI, when gesture recognition fails, you receive no feedback at all. This can happen because the motion sensor, which captures skeleton data in real time, may detect and interpret an individual motion trajectory with discrepancies, as the same gesture differs from person to person.

As simple as possible

For the first prototype of the installation we used simple swipe and drag gestures to navigate through the listed showcases, but in our case this approach didn’t work as expected.


The thing is that after swiping right, the person’s hand reflexively moves back to the left, and Kinect can detect this return motion as a separate swipe, which breaks the whole user experience. Even with the second version of our installation, powered by Kinect v2 with its machine learning capabilities and enhanced camera features, one-handed swipes in both directions didn’t always work properly. Clearly, getting users to move their hands exactly the way we needed was not likely to happen. To reduce the risk of failure, we had to stick to the simplest and most intuitive gestures, so we dropped the drag and kept only the swipe.
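To illustrate one way around the return-swipe problem, here is a minimal sketch (not the actual installation code) of a cooldown-based debouncer that ignores the reflexive counter-swipe for a short window after an accepted swipe; the class name and threshold are illustrative:

```typescript
// Hypothetical sketch: suppress the reflexive return swipe by enforcing a
// cooldown after each accepted swipe. Names and thresholds are illustrative.

type SwipeDirection = "left" | "right";

class SwipeDebouncer {
  private lastSwipeAt = -Infinity; // timestamp of last accepted swipe (ms)

  constructor(private cooldownMs = 800) {}

  // Returns the swipe if accepted, or null if it falls inside the cooldown
  // window (e.g. the hand reflexively moving back after a real swipe).
  accept(direction: SwipeDirection, timestampMs: number): SwipeDirection | null {
    if (timestampMs - this.lastSwipeAt < this.cooldownMs) return null;
    this.lastSwipeAt = timestampMs;
    return direction;
  }
}
```

The cooldown length would need tuning against real users: too short and the return motion still registers, too long and deliberate rapid swipes get swallowed.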

While designing the UI controls we aimed at both simplicity and high interactivity, and we managed to implement this approach successfully. We assigned swipe gestures to navigate through the selection of works, up and down swipes to scroll a portfolio, and two thumbs up to appreciate a work. It’s that simple.
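The resulting mapping can be sketched as a simple lookup from recognized gesture to UI action; the action names here are hypothetical, not taken from the installation’s code:

```typescript
// Illustrative gesture-to-action mapping for the installation UI.
// Gesture and action names are assumptions for the sketch.

type Gesture =
  | "swipe-left" | "swipe-right"
  | "scroll-up" | "scroll-down"
  | "two-thumbs-up";

const gestureActions: Record<Gesture, string> = {
  "swipe-left": "show-next-work",
  "swipe-right": "show-previous-work",
  "scroll-up": "scroll-portfolio-up",
  "scroll-down": "scroll-portfolio-down",
  "two-thumbs-up": "appreciate-work",
};

// Resolve a recognized gesture to the UI action it should trigger.
function actionFor(gesture: Gesture): string {
  return gestureActions[gesture];
}
```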

Active user recognition and voting

Another challenge was active user detection when multiple people stood in front of the installation in the interaction zone. To solve this, we adjusted the installation to rotate the whole interface towards the active user, singling that person out from the group and, at the same time, signaling to the others that they were not the ones interacting with the UI. This way we not only ensured transparent and straightforward interaction but also created a unique feeling of a live response from the UI.
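One common way to single out the active user from a group of tracked skeletons is to pick the closest tracked body within range; the sketch below illustrates that idea and is not the actual implementation (the body type is a simplified stand-in for the Kinect SDK’s):

```typescript
// Illustrative sketch: pick the active user as the tracked body closest to
// the sensor, so the UI can rotate towards that person and ignore the rest.

interface TrackedBody {
  id: number;
  distanceMeters: number; // distance from the Kinect sensor
  isTracked: boolean;
}

function pickActiveUser(
  bodies: TrackedBody[],
  maxRangeMeters = 3.5, // assumed usable interaction range
): TrackedBody | null {
  const candidates = bodies.filter(
    (b) => b.isTracked && b.distanceMeters <= maxRangeMeters,
  );
  if (candidates.length === 0) return null;
  // Closest person wins; everyone else sees the interface turned away.
  return candidates.reduce((a, b) => (a.distanceMeters <= b.distanceMeters ? a : b));
}
```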

As for voting, we wanted the user experience to feel the same as on the Behance platform, assigning the simple thumbs-up gesture as the UI control for work appreciation. But, unfortunately, even with Kinect v2 we couldn’t guarantee this hand move would always be properly recognized. That’s why we designed a different and even more interesting control for expressing appreciation: both hands raised with the thumbs up. It’s quite easy to perform and adds even more fun to the overall user experience.
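The two-thumbs-up vote could be approximated from skeleton joint positions, for example as both hands raised above shoulder height with the thumbs above the hands. The simplified joint layout below is a stand-in for the real Kinect SDK types, not the installation’s actual detection logic:

```typescript
// Hedged sketch of the "both thumbs up" vote, using only vertical joint
// positions for brevity. Real Kinect v2 joints are 3D camera-space points.

interface Joint {
  y: number; // vertical position
}

interface VoteSkeleton {
  leftHand: Joint;
  rightHand: Joint;
  leftThumb: Joint;
  rightThumb: Joint;
  shoulderCenter: Joint;
}

function isThumbsUpVote(s: VoteSkeleton): boolean {
  // A hand counts as "voting" when raised above the shoulders with the
  // thumb above the hand joint.
  const handUp = (hand: Joint, thumb: Joint) =>
    hand.y > s.shoulderCenter.y && thumb.y > hand.y;
  return handUp(s.leftHand, s.leftThumb) && handUp(s.rightHand, s.rightThumb);
}
```

Requiring both hands makes the pose far less likely to be triggered accidentally than a single thumbs up, which matches why the two-handed gesture proved more reliable.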

Implementing the Concept

A technology stack

To create an immediate response for users, we had to recognize gestures as quickly as possible and communicate with the interface at the lowest possible latency. The first challenge of the UI development process was choosing the underlying technology. The most common way to develop applications with a rich UI that can interact with Kinect is to use Windows Presentation Foundation (WPF), Microsoft’s graphical subsystem for rendering user interfaces in Windows-based applications.


WPF is a very powerful UI library that enables you to build almost any UI paradigm. However, if you need special controls, the standard library may not have them. For example, if you need to create a parallax effect for your background, there are almost no options available, in contrast to HTML, which offers dozens of well-designed and easily maintained libraries. Therefore, we replaced WPF with HTML5 for the UI and chose AngularJS as the application framework.

Reducing response latency

Another technical challenge was establishing communication between the driver application and the HTML application. Obviously, we needed it to be very fast, as any lag between the gesture and the actual response of the UI would be noticeable. This is where HTML5 WebSockets came in handy. We had a few concerns about the result, but it actually exceeded our expectations: the application turned out to be fast enough to send the video source as a raw binary image for each frame without any noticeable latency. Since we used HTML5 for UI development, we couldn’t utilize the WPF controls from the Kinect SDK. On the other hand, not being limited to WPF’s capabilities meant that we could choose an alternative approach: for example, we opted for a swipe move instead of grab-and-drag interactions.
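The browser side of this channel can be sketched as follows, assuming (as a hypothetical message format) that the driver sends gesture events as JSON text frames and camera images as binary frames. The dispatch logic is split out of `socket.onmessage` so it stays testable:

```typescript
// Illustrative sketch of routing WebSocket traffic from a Kinect driver app:
// text frames carry gesture events as JSON, binary frames carry the raw
// camera image. The message shape is an assumption, not the real protocol.

interface GestureMessage {
  gesture: string;
  userId: number;
}

type Handlers = {
  onGesture: (msg: GestureMessage) => void;
  onVideoFrame: (frame: ArrayBuffer) => void;
};

// Would be wired up as: socket.onmessage = (e) => dispatchSocketMessage(e.data, handlers)
function dispatchSocketMessage(data: string | ArrayBuffer, handlers: Handlers): void {
  if (typeof data === "string") {
    handlers.onGesture(JSON.parse(data) as GestureMessage);
  } else {
    handlers.onVideoFrame(data);
  }
}
```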

Maintaining a seamless interaction

To add more interactivity to the voting process, we decided to connect the installation parts with the help of a statistics server. The idea behind this was to connect two or more UIs into a single system so that, for example, when somebody “likes” a project on one installation, the other nodes become aware of it and their interfaces respond accordingly. From the technical perspective, we used ZeroMQ for communication between the nodes: a Pub/Sub channel to notify installations about any changes in statistics, and Push/Pull for one-way communication. As a result, through trial and error, we managed to maintain seamless interaction between the user interface and the Kinect driver, reducing latency to an almost unnoticeable value.
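The statistics fan-out can be sketched as a small mirror that each node keeps locally, applying the updates it receives over the Pub/Sub channel; the message shape here is an assumption for illustration:

```typescript
// Illustrative sketch: each installation node mirrors the vote counts
// published by the statistics server, so every connected UI shows the same
// totals. In the real setup these updates travelled over ZeroMQ Pub/Sub.

interface StatsUpdate {
  projectId: string;
  likes: number;
}

class StatsMirror {
  private likes = new Map<string, number>();

  // Handler for a Pub/Sub notification: overwrite the local count.
  apply(update: StatsUpdate): void {
    this.likes.set(update.projectId, update.likes);
  }

  likesFor(projectId: string): number {
    return this.likes.get(projectId) ?? 0;
  }
}
```

Publishing absolute counts rather than increments keeps nodes consistent even if a notification is missed, since the next update overwrites the stale value.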

How to Get the Most out of the Technology

Leveraging technology in a creative way was not only a smart marketing move that helped us transform a promo event into an interactive performance with more than 150 participants. It was also about creating a special atmosphere that motivates people to engage in the process, to communicate and share their feedback. Our goal was to inspire innovative thinking and promote non-standard approaches to event organization, and it looks like we have succeeded. Thanks to our experiment with Kinect technology, we not only managed to boost engagement and improve attendance but also created a truly memorable experience.

We believe that this approach can be customized for promotional events of almost any kind, as it’s flexible enough to meet and even exceed the expectations of various audiences. We recommend keeping several important things in mind that can help ensure your audience receives a truly impressive and memorable user experience.

  • Consider all aspects of the physical space at the event location and think through the visualization of your concept carefully.
  • Use animation to bring your installation to life and give your audience the feeling of a truly responsive, live UI.
  • Make sure the latency of the UI response is minimal, so the person won’t feel any noticeable lag; immediate feedback to every user action is crucial here.
  • When assigning gestures as UI controls, stick to a simple and transparent scenario, making interaction intuitive, straightforward and natural for your audience.
  • And last but not least: don’t limit yourself to commonly used design patterns. Dare to innovate!

After the event, we received dozens of likes on the Facebook page and numerous comments like "Awesome," "Absolutely fantastic," and many other words of praise. But what we liked most was hearing from people that we had created a kind of benchmark for others to follow. We believe Albert Einstein was right in saying: “creativity is contagious, pass it on”.
