

An interface for natural interactions


Interface Design


Malte Fial & me

My Role:

Research, Ideation, Concept, UI-Design


study project, third semester


3 weeks



Aware UI is an innovative interface concept based on Machine Learning that enables natural interaction. The system combines touchscreen and voice control to create a way of interaction that takes human gestures, speech and non-verbal communication into account to provide a familiar and user-friendly experience.


The goal of the project was to create a new way of communicating with an interface, one that comes closer to natural communication between people and is directly derived from it. Natural actions and reactions, such as focusing on the conversation partner, maintaining eye contact, and listening, should carry over to the interface.

Interaction Capabilities

Voice User Interface

Aware UI uses voice control because conversation is a deeply human form of communication. It also lets users move freely around the room during the interaction and keep their hands free for other activities.
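Once a spoken command has been transcribed, it has to be mapped to an action. A minimal sketch of such intent matching is shown below; the intent names and patterns are illustrative assumptions, not taken from the project, and the actual speech recognition step is omitted.

```javascript
// Illustrative intent matching on a recognized transcript.
// In the browser, the transcript would come from speech recognition;
// intent names and patterns here are assumptions for demonstration.
const INTENTS = [
  { name: "play_music", pattern: /\b(play|start)\b.*\bmusic\b/i },
  { name: "shopping_list", pattern: /\bshopping\b/i },
  { name: "lights", pattern: /\blights?\b/i },
];

function matchIntent(transcript) {
  // Return the first intent whose pattern matches the transcript
  const intent = INTENTS.find((i) => i.pattern.test(transcript));
  return intent ? intent.name : "unknown";
}
```

A real system would hand unmatched transcripts to the language-model backend instead of returning "unknown".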

Graphical User Interface

In addition to the VUI, a touchscreen is used. The screen shows information that is difficult to convey via voice output, such as complex lists and tables, maps, and videos. Depending on the user's position, the contents of the display are adjusted dynamically: the amount of information and its size vary so that the screen is always used optimally while remaining legible.
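The distance-dependent adjustment described above can be sketched as a simple mapping from the estimated user distance to a layout configuration. The thresholds and scale factors below are illustrative assumptions, not values from the project.

```javascript
// Sketch of distance-dependent layout scaling.
// Thresholds (in cm) and scale factors are assumptions for illustration.
function layoutForDistance(distanceCm) {
  if (distanceCm < 100) {
    // User is close: small type, full detail, touch targets active
    return { fontScale: 1.0, detail: "full", touch: true };
  } else if (distanceCm < 250) {
    // Mid-range: larger type, reduced detail, touch out of reach
    return { fontScale: 1.6, detail: "reduced", touch: false };
  }
  // Far away: only the largest, most important elements remain
  return { fontScale: 2.4, detail: "minimal", touch: false };
}
```

In practice the result would drive CSS variables or component props rather than being read directly.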

Character of the System

To make interacting with Aware UI feel like talking to an intelligent entity rather than a machine, the system needed a character. It is visualized as an animated circle on a dot grid on the screen; the circle indicates whether Aware UI is listening, talking, or simply activated. The grid can also dynamically free up space for information by shrinking its dots. To prevent gender bias, Aware UI has a gender-neutral voice, with the goal of breaking the mindset that female voices are preferred for supportive tasks and male voices for commanding tasks.
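The circle's behavior can be modeled as a small state machine; a minimal sketch, assuming the states named in the description (activated, listening, talking) plus an idle state, could look like this:

```javascript
// Minimal state model for the animated circle indicator.
// State names are taken from the description; the "idle" state and the
// transition rules are assumptions for illustration.
class CircleIndicator {
  constructor() {
    this.state = "idle";
  }
  activate() {
    this.state = "activated";
  }
  listen() {
    // Only an active system can start listening
    if (this.state !== "idle") this.state = "listening";
  }
  speak() {
    if (this.state !== "idle") this.state = "talking";
  }
  deactivate() {
    this.state = "idle";
  }
}
```

The animation layer would then observe `state` and render the matching pulse or wave pattern.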

Technology and Methodology

Aware UI was built with the ml5.js library, among others, and runs in the browser. It uses technologies such as Facemesh, object detection, and Teachable Machine to process input from the webcam and microphone and track the user. Once a person is detected in the room, the UI responds with visual feedback. The distance to the user is calculated from facial landmarks, the focal length of the webcam, and the pixel density. Speech input is processed via a ChatGPT connection, enabling context-sensitive responses.
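The distance calculation from facial landmarks follows the pinhole camera model: a known real-world size divided by its apparent size in pixels, scaled by the focal length. A minimal sketch, assuming the average interpupillary distance (~6.3 cm) as the known size and a focal length given in pixels:

```javascript
// Distance estimation from two eye landmarks via the pinhole camera model.
// The interpupillary distance constant and the pixel focal length are
// assumptions; the project derives these from webcam specs and pixel density.
const EYE_DISTANCE_CM = 6.3;

function estimateDistanceCm(leftEye, rightEye, focalLengthPx) {
  // Apparent eye distance on the image plane, in pixels
  const dx = rightEye.x - leftEye.x;
  const dy = rightEye.y - leftEye.y;
  const pixelDistance = Math.hypot(dx, dy);
  // Pinhole model: realSize / distance = pixelSize / focalLength
  return (EYE_DISTANCE_CM * focalLengthPx) / pixelDistance;
}
```

With Facemesh, `leftEye` and `rightEye` would be read from the model's keypoint output each frame.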

Use Case: Smart Home

In the smart home scenario, Aware UI can be placed in the kitchen, where it lets the user start music, look up shopping lists, and control other smart home devices.


Aware UI is an innovative concept that enables natural and human-like interaction with digital systems. Using machine learning and a combination of voice and touchscreen interactions, it provides a user-friendly experience that adapts to the user's needs and context.