DEFI Kitchen
Developing a comprehensive VUI for a kitchen scale that adaptively assists visually impaired users during meal preparation and cooking.
Project Info
Duration: Spring 2024
My Role: UX Research, Prototyping, Usability Testing
Team: 7 undergrad + 1 grad UX students
Background
Our client aimed to investigate how a voice user interface (VUI) kitchen scale could assist individuals with low vision and those who are non-sighted in meal preparation. To accomplish this, our team outlined the user flow for interactions with the adaptive kitchen scale and developed a dynamic VUI script to guide users throughout the cooking process.
How do our Users Currently Operate?
We interviewed seven fully-blind participants to learn about their experiences in the kitchen and how they currently utilize assistive technologies. These interviews surfaced key insights about technology usage and adaptive strategies.
Most of our interviewees used some combination of assistive technology (Siri, Alexa, PenFriend) while cooking
Users are less inclined to cook recipes that require precise steps such as dicing or mincing, because these steps tend to rely heavily on vision
Users struggle with portioning and often resort to tactile, and potentially unsanitary, methods of measuring ingredients
The process of finding recipes is tedious, causing users to stick with cooking food they are familiar with

Creating an iterative journey map using our interview findings
Examining Large-Language Models
We examined Gemini, ChatGPT, and Copilot to evaluate how these tools respond to prompts from visually impaired users. This analysis also informed how we might structure the VUI's responses, drawing on the response patterns we recognized across these common AI tools.
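To illustrate what structuring responses around those recognized patterns could look like, here is a minimal, hypothetical sketch in Python. The acknowledge → instruct → confirm pattern and the function name are our own framing for this example, not part of the client's system or any specific AI tool's API.

```python
# Hypothetical sketch: composing one VUI turn from the recurring parts
# (acknowledgement, instruction, confirmation prompt) we observed in
# LLM-style responses. Names and wording are illustrative only.

def build_response(acknowledgement: str, instruction: str, confirmation_prompt: str) -> str:
    """Compose a single spoken turn from the three recurring parts."""
    return f"{acknowledgement} {instruction} {confirmation_prompt}"

# Example turn for a weighing step
print(build_response(
    "Got it, pasta is next.",
    "Place the empty pot on the scale, then add spaghetti until I say stop.",
    "Say 'ready' when the pot is on the scale.",
))
```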

Constructing the Workflow
After familiarizing ourselves with how different tools guide users through meal-preparation and cooking, our team began mapping out the different steps and interactions that would occur post-recipe selection. The flow enables users to adjust settings and controls as needed, tailoring the device to their preferences — these features include language, voice, volume, speed, and unit outputs.
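As a concrete illustration of the settings portion of this flow, here is a minimal sketch of the preference set the flow exposes. The field names, defaults, and the apply helper are assumptions made for this example only, not the actual device's configuration.

```python
from dataclasses import dataclass, replace

# Hypothetical sketch of the user-adjustable preferences described in the flow.
# Field names and default values are assumptions for illustration.
@dataclass
class ScaleSettings:
    language: str = "en-US"     # spoken language of the VUI
    voice: str = "default"      # selected voice profile
    volume: int = 70            # output volume, 0-100
    speech_rate: float = 1.0    # playback speed multiplier
    units: str = "grams"        # "grams", "ounces", "cups", etc.

    def apply(self, **changes) -> "ScaleSettings":
        """Return a copy with the requested adjustments, e.g. from a voice command."""
        return replace(self, **changes)

# e.g. the user says "switch to ounces and slow down a little"
settings = ScaleSettings().apply(units="ounces", speech_rate=0.8)
```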

Scripting the Scale's VUI
The team drafted a script outlining scale actions and corresponding voice-over lines. Our VUI dialogue was written for a simple spaghetti recipe, chosen for its straightforward cooking process and how easily it can be adapted for personal tastes and dietary restrictions. Each step requires manual confirmation from the user, which minimizes errors during cooking and centers a user-driven experience in which the user sets a pace they are comfortable with.
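To show how this confirmation-gated pacing could work in practice, here is a minimal sketch: the scale announces a step, then waits for an explicit confirmation before advancing. The step wording is abbreviated from our spaghetti script, and speak() and listen() are placeholders we invented to stand in for real text-to-speech and speech-recognition calls.

```python
# Minimal sketch of confirmation-gated pacing: announce a step, then wait
# for the user to confirm before moving on. speak() and listen() are
# placeholders for real TTS/ASR; step text is abbreviated for illustration.

SPAGHETTI_STEPS = [
    "Place a pot on the scale and add water until I say stop.",
    "Put the pot on the stove and bring the water to a boil.",
    "Add the spaghetti. I will start a ten-minute timer when you confirm.",
    "Drain the pasta carefully, then add your sauce.",
]

def speak(text: str) -> None:
    print(f"SCALE: {text}")                     # placeholder for text-to-speech

def listen() -> str:
    return input("USER: ").strip().lower()      # placeholder for speech recognition

def run_recipe(steps: list[str]) -> None:
    for step in steps:
        speak(step)
        # The user sets the pace: nothing advances until they confirm.
        while listen() not in {"done", "next", "ready"}:
            speak("No rush. Say 'done' when you're ready for the next step.")
    speak("That's everything. Enjoy your spaghetti!")

if __name__ == "__main__":
    run_recipe(SPAGHETTI_STEPS)
```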
Evaluating our Design with Wizard of Oz Testing
The team conducted in-home testing with participants to observe their cooking process while simulating the experience of interacting with our scale's VUI. Using our script, one team member posed as the scale, guiding and responding to the participant. Throughout the process, we maintained a double-blind format for a more accurate simulation of user interaction.

Presenting the Final Design: Functionality in Context
We developed this demonstration to creatively and effectively "pitch" the features of our VUI scale. We started by drafting a script for the video, then moved on to produce a stop-motion video using Adobe Premiere Pro and Canva assets. You can view the finished product here!
Reflection
Working with DEFI introduced a unique design challenge, which I'm grateful to have been involved with. Exploring non-traditional interfaces (VUIs) and audio-based technology has presented opportunities for me to experiment with new processes and design methods, such as bodystorming and Wizard of Oz Testing. This project sparked my interest in accessible design and designing solutions for individuals with disabilities. Moving forward, I am eager to pursue more projects that drive positive change and make technology more inclusive for everyone.