AWARE UI

An interface for natural interactions

Field:

Interface Design

Team:

Malte Fial & me

My Role:

Research, Ideation, Concept, UI Design, Sound Design

Context:

Study project, third semester

Duration:

3 weeks

Year:

2023

A person looks at the Aware UI interface hanging on a wall

Aware UI is an innovative interface concept based on machine learning that enables natural interaction. The system combines a touchscreen with voice control, taking human gestures, speech, and non-verbal communication into account to provide a familiar and user-friendly experience.

Objective

The goal of the project was to enable a new type of communication with an interface, one that comes closer to natural communication between people and is directly derived from it. Natural actions and reactions, such as focusing on the other person, making eye contact, and listening, should carry over to the interface.

Interaction Capabilities

Voice User Interface

Aware UI uses voice control because conversation is a deeply human form of communication. It also allows the user to move freely around the room during the interaction and keep their hands free for other activities.

Graphical User Interface

In addition to the VUI, Aware UI includes a touchscreen for information that is difficult to convey via voice output, such as complex lists and tables, maps, and videos. Depending on the user's position, the display content adjusts dynamically: the amount of information shown and the size of its depiction vary so that the screen is always optimally utilized while remaining legible.
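
One way to picture this behavior is a layout tier chosen from the estimated viewer distance, as sketched below in TypeScript. The tiers, thresholds, and names are illustrative assumptions, not the project's actual code:

```typescript
// Hypothetical layout tiers for a distance-adaptive display (illustrative values).
type LayoutTier = "far" | "mid" | "near";

interface LayoutConfig {
  fontScale: number;    // multiplier applied to the base font size
  maxListItems: number; // how much detail fits at this viewing distance
}

const TIERS: Record<LayoutTier, LayoutConfig> = {
  far:  { fontScale: 2.5, maxListItems: 3 },  // large type, headlines only
  mid:  { fontScale: 1.5, maxListItems: 8 },  // medium type, short lists
  near: { fontScale: 1.0, maxListItems: 20 }, // full detail within arm's reach
};

// Map an estimated viewer distance (in meters) to a layout tier.
function tierForDistance(distanceM: number): LayoutTier {
  if (distanceM > 3.0) return "far";
  if (distanceM > 1.2) return "mid";
  return "near";
}

// Re-render whenever a new distance estimate arrives.
function onDistanceUpdate(distanceM: number): void {
  const { fontScale, maxListItems } = TIERS[tierForDistance(distanceM)];
  document.documentElement.style.setProperty("--font-scale", String(fontScale));
  console.log(`showing at most ${maxListItems} list items`);
}
```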

Character of the System

To avoid a robotic-feeling interaction, Aware UI needed a character. The smart system is visualized as an animated circle on a dot grid. The animations are supported by sounds that mimic human non-verbal communication, bringing the circle to life. When Aware UI speaks, it uses a gender-neutral voice; the goal is to challenge the notion that a female voice is generally preferred for supportive tasks and a male voice for commanding tasks.
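
As a rough illustration of the visual idea only, a browser canvas loop could draw the dot grid and let the circle "breathe" while idle; all sizes and timings below are assumptions, not the project's animation code:

```typescript
// Minimal canvas sketch: a breathing circle over a dot grid (illustrative only).
const canvas = document.querySelector("canvas")!;
const ctx = canvas.getContext("2d")!;

function draw(timeMs: number): void {
  ctx.clearRect(0, 0, canvas.width, canvas.height);

  // Dot grid backdrop.
  ctx.fillStyle = "#444";
  for (let x = 20; x < canvas.width; x += 40) {
    for (let y = 20; y < canvas.height; y += 40) {
      ctx.beginPath();
      ctx.arc(x, y, 2, 0, Math.PI * 2);
      ctx.fill();
    }
  }

  // The character: a circle that slowly "breathes" while idle.
  const breathe = 1 + 0.08 * Math.sin(timeMs / 800);
  ctx.fillStyle = "#fff";
  ctx.beginPath();
  ctx.arc(canvas.width / 2, canvas.height / 2, 40 * breathe, 0, Math.PI * 2);
  ctx.fill();

  requestAnimationFrame(draw);
}
requestAnimationFrame(draw);
```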

Technology and Methodology

Aware UI was coded using, among others, the ml5.js library and runs in the browser. It uses technologies such as Facemesh, Object Detection, and Teachable Machine to process information from the webcam and microphone and to track the user. Once a person is detected in the room, the UI responds with visual feedback. The distance to the user is calculated from facial landmarks, the focal length of the webcam, and the pixel density. Speech input is processed through a ChatGPT connection, which enables context-sensitive responses.
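
The distance estimate described above follows the standard pinhole-camera relation, distance = (real size × focal length in pixels) / size in pixels. The sketch below illustrates this; the landmark indices, the assumed eye-corner spacing, and the ml5 wiring in the comments are assumptions for illustration, not the project's source:

```typescript
// Pinhole-camera distance estimate from two facial landmarks:
//   distanceMm = (realWidthMm * focalLengthPx) / pixelWidth

type Point = [number, number, number]; // Facemesh landmarks are [x, y, z]

const OUTER_EYE_CORNER_MM = 90; // assumed average outer-canthal distance
const FOCAL_LENGTH_PX = 600;    // assumed webcam focal length in pixels

function pixelDistance(a: Point, b: Point): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1]);
}

function estimateDistanceMm(leftEye: Point, rightEye: Point): number {
  const px = pixelDistance(leftEye, rightEye);
  return (OUTER_EYE_CORNER_MM * FOCAL_LENGTH_PX) / px;
}

// Sketch of the wiring, assuming the ml5 0.x Facemesh API:
// const facemesh = ml5.facemesh(video, () => console.log("model ready"));
// facemesh.on("predict", (faces) => {
//   if (faces.length === 0) return;   // nobody in the room: no feedback
//   const mesh = faces[0].scaledMesh; // 468 landmarks in pixel coordinates
//   // MediaPipe indices 33 and 263 are the outer eye corners (assumed here).
//   const mm = estimateDistanceMm(mesh[33], mesh[263]);
//   onDistanceUpdate(mm / 1000);      // feed the adaptive layout sketched above
// });
```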

Use Case: Smart Home

In the smart home scenario, Aware UI can be placed in the kitchen, where it lets the user start music, find shopping lists, and control other smart home devices.

Conclusion

Aware UI is an innovative concept that enables natural and human-like interaction with digital systems. Using machine learning and a combination of voice and touchscreen interactions, it provides a user-friendly experience that adapts to the user's needs and context.

NEXT PROJECT: SPLITWISE REDESIGN

Mockup of four Splitwise screen designs

With over 10 million downloads, Splitwise is a popular app that helps shared flats and travel groups keep track of money spent together and calculate each member's share. However, users often get lost in the abundance of features and have difficulty navigating the core functions.

Objective

All team members used the app and considered it very useful, but the user experience was sometimes frustrating, and from our perspective the UI design did not meet professional standards. We decided to undertake a fundamental redesign of the app.