Keep your hands busy with the food,
not with your tech

SideChef kitchen assistant app (spec work)

Hero Apple TV mobile mockups


Client Overview

SideChef is a company that wants to “empower eaters everywhere to cook great food” (SideChef Mission Statement).

Their existing app on Android and iOS allows users to search through over 11,000 recipes and guides them step-by-step with photos, how-to videos, and basic voice control.

The Request

Bring the SideChef experience to the Apple TV, targeting busy millennials.

The Solution

Redesign the existing SideChef app for the context of a user following recipes on a TV – make the experience hands-free and rich with images and video. Incorporate a search function designed around the habits of young professionals.

My role

I was the lead researcher, responsible for the user survey and a task analysis. I also designed and prototyped animations for the high-fidelity mockup.
Project management was handled by Tucker Adelman, and high-fidelity visual design and applying style guides to artifacts was led by Kristen Yang. All team members contributed to user interviews, usability tests, and wireframe ideation.


To learn about the existing app’s features, its competitors, and users’ responses to the app, we did the following:

  • Heuristic analyses: We each performed a heuristic analysis of the current SideChef mobile app while cooking a recipe from it
  • Comparative analysis: We compared SideChef’s features with other cooking apps as well as kitchen assistant “skills” for Amazon Alexa
  • User reviews: I read reviews of the current SideChef mobile app and identified patterns in what users liked and disliked

We also did the following to get to know SideChef’s target demographic, millennials:

  • Online survey: I designed a survey incorporating questions from all team members. We gathered responses from 18 millennials (See survey in new tab)
  • User interviews: We conducted 5 in-person interviews with millennials with different levels of cooking experience (See interview questions in new tab)
  • Affinity map: We synthesized interview data by looking for patterns and organizing them in an affinity map
    affinity map of interview data

Our Findings

  • Millennials do cook (88% of survey respondents cook at home at least once a week)
  • Millennials don’t spend much time cooking (66% of survey respondents spend an hour or less when they cook)
  • Millennials use technology to help them cook (94% of survey respondents use their phone or computer to find recipes)

Pain Points

  1. Users said in interviews that they had to interrupt their cooking to wash their hands before handling their tech – turning their phone screen back on or scrolling down on their laptop to continue reading the recipe.
  2. Users also mentioned sometimes running into unfamiliar terms or poor descriptions, forcing them to guess what to do or search the internet for a video or a better explanation.



We distilled the characteristics of our users into a single persona, helping us stay focused on those characteristics and maintain empathy with an individual.

User persona

User persona (click to enlarge). Artifact by Tucker Adelman.

User flow

I diagrammed the user flow for following an online recipe to cook a meal, both without and with our SideChef for Apple TV app. (See user flow without SideChef for Apple TV) (See user flow with SideChef for Apple TV)

User journey map

I created a journey map of following an online recipe to cook a meal, based on our interview data.

User journey map


Why Apple TV?

The Apple TV requirement was given by the client, but we still wanted to validate it.


Potential benefits of the TV format:

  • Larger format for instructional cooking videos and helpful images
  • Potentially more space to show more content at once

Our findings:

  • None of our interviewees had a TV viewable from their kitchen.
  • Even if someone hasn’t explicitly installed a TV in their kitchen, surely some people have a TV conveniently viewable from their kitchen… right? All I had to do was browse through a trove of apartment/home layouts and find such a setup. How? Airbnb.

Examples of TV setups viewable from the kitchen. None were ideal.

Shown above are the best-case scenarios I found – none were ideal. Even when there’s line of sight, the TV is too far away or not facing the kitchen.

We found no evidence that a significant audience exists with an appropriate kitchen/TV setup AND uses Apple TV.

Because some research was still being conducted concurrently with design work, these discoveries came late in the game, when design was almost complete. As a next step, we would strongly consider designing for tablets instead of Apple TV – still a larger viewing format, but a far more accessible technology and physical setup. For this particular sprint, we maintained our focus on an Apple TV solution.


To help encourage busy millennials to cook more, we decided to focus on ways our app could increase users’ confidence and convenience.

How do we help busy millennials explore new recipes without interruptions from handling their technology or confusion from unclear terms and directions?

Initial vision

Our original ambitions were to push the user to go hands-free the moment they were dropped into the app. This was a paradigm shift – think Alexa plus a display, instead of a standard app plus some voice commands.

The user would use the Apple TV’s remote for voice input and find the ideal recipe through a conversation, with SideChef prompting the user about what’s in their refrigerator, whether they want something quick or light, or whether they’re in the mood for a particular type of cuisine.

Initial concepts

My initial concepts of using voice to iteratively search and filter recipe results

While navigating the recipe, the user would be able to use voice commands to step forward and back, start and stop how-to videos for steps involving cooking techniques or terminology possibly foreign to the user, and control timers for steps mentioning time.

Deeper Research

To get an idea of how someone might interact with a virtual kitchen assistant, I performed additional research – part contextual inquiry, part task analysis (and part none-of-the-above?). What help would a user want from a kitchen assistant? What different ways might they ask to find recipes containing chicken?

I had a user plan and cook a meal as if they had a “digital kitchen assistant” – I did this by placing certain limitations and providing certain assistance.

exploratory research


30-year-old Asian female, recently employed as a pharmacist, recently moved into her own home, and getting used to cooking for herself. She currently knows enough recipes to count on one hand.

User Task

Your friends (my wife and I) are coming over for dinner (did I mention I invited myself?). You’d like to try cooking something new.

Research Method

The user (the “cook”) cannot touch the technology. I act as the kitchen assistant and control the technology based on what I think the cook wants and needs.

Some questions and factors that drove my research…

  1. How do I get the subject out of the mindset that they are just using me to control the interface?
  2. We still wanted to respect the limits of current technology and stay within project scope

I limited our interactions by placing certain rules on who can interact with what, and in what way.

Because the user was not comfortable being video recorded, I instead took a screen recording of the session (including ambient audio).


Ultimately, my research ended up applying to features we decided not to implement for this MVP, but I did gain an insight: The user didn’t want just one step displayed at a time – she wanted to scan the recipe while she cooked.


Early usability test

We collectively prioritized some features by sorting them into “Must haves”, “Should haves”, “Could haves” and “Won’t haves”. We then designed a simple landing page we thought might be intuitive enough to navigate by voice with no written labels. We performed 2 usability tests with these low-def wireframes and found our interface was far from intuitive.
Usability test of hands-free landing page

The recipe search page I designed also failed to connect with users in my usability test.

usability test

Although we tossed around a couple solutions to try, ultimately…


Due to time constraints, we agreed the best path forward was to reduce features to meet just the original scope of the project: drop voice control from the rest of the app, leaving voice control features only on the recipe steps. The additional voice control would have required more onboarding, research into natural language processing, and research and design of voice user interfaces (super cool, but… time).


We realized…

  1. On Apple TV, the user must press the voice button on the remote to give voice commands. Not so hands-free.
    Apple Remote voice button
  2. In usability tests of the mockups, some users wanted to have the ingredients list available at a glance.

To address these concerns, we designed a SideChef “companion app” on mobile that communicates with the Apple TV app. The companion app displays the ingredients while the TV displays the steps, and the companion app can be “always listening”, becoming SideChef’s hands-free voice input.

An Opportunity for Future Research

We ran usability tests of the Apple TV app in tandem with the companion app. Users were able to navigate around the app and through a recipe, but some said they’d like to see the recipe steps displayed on the companion app. Recalling an insight from my earlier research, I wondered whether this stated request really reflected a desire to scan the recipe steps. We added the steps to the companion app but tabled the idea of scannability for future research.

Companion app wireframes from design studio

Low-def wireframes for my designs of the companion app during our team design studio (click to enlarge)



We validated our overall user flow by running a couple of usability tests using paper printouts of mid-fidelity mockups. Users were able to navigate the interface and recipes smoothly.

I took the high-fidelity Sketch wireframes and created a prototype in Principle. I added animations where appropriate to make transitions feel natural and to help draw the user’s attention where useful.

Animation Gallery

Prototype animation 1
Prototype animation 2
Prototype animation 3
Prototype animation 4
Prototype animation 5
Prototype companion animation
Using our prototype, we presented the user flow of finding and following a recipe. Mouse actions were used to simulate voice and Apple TV remote commands. (Kristen plays our persona Clarice, Tucker plays SideChef’s voice, and I control the prototype.)



Be careful about taking users’ feature requests at face value. Support design decisions with additional evidence, such as observations from testing.

Also be careful about taking clients’ requests at face value. We could have saved time and created a product that better fit users’ needs if we had validated the Apple TV platform request earlier.

Next Steps

Research/test whether increasing scannability of recipes (possibly by displaying more steps at once) improves user flow.

Research the availability of users’ technology and physical setup to figure out whether TV or tablet would be a more popular platform.

Implement voice UI for all other features of the app.

See previous case study: Los Angeles County Fair