ProtoPie Concept

A voice-driven smart recipe app prototype built using ProtoPie, leveraging its advanced interactions (voice, sensors, and gestures) to create a fully functional, realistic user experience without code.

Title

ProtoPie Concept

Industry

Food

Date

2025

Overview:

The goal of this project was to explore advanced interaction design using ProtoPie by creating a recipe application that goes beyond traditional touch-based interfaces. The focus was on leveraging multi-modal interactions such as touch, voice, and device sensors to create a more immersive and intuitive cooking experience.

Project Duration:

4 weeks (Exploratory Project)

Role:

UI/UX Designer

  • Designed interaction concepts and user flows

  • Built high-fidelity interactive prototypes in ProtoPie

  • Experimented with non-traditional input methods

Tools Used:

  • ProtoPie (Advanced Interactions & Prototyping)

  • Figma (UI Design)

  • Pen & Paper (Concept ideation)

Process:

1. Research & Discovery:
  • Identified limitations of traditional recipe apps:

    • Heavy reliance on touch interaction (not ideal while cooking)

    • Difficulty navigating steps with messy hands

    • Lack of contextual, hands-free assistance

  • Defined opportunity:
    → “How might we create a hands-free, context-aware cooking experience?”

2. Wireframing & Prototyping:
  • Designed key flows:

    • Recipe browsing

    • Step-by-step cooking mode

    • Hands-free navigation

  • Built interactive prototypes using ProtoPie:

    • Voice commands (e.g., next step, repeat instruction)

    • Touch gestures for quick actions

    • Sensor-based triggers (e.g., tilt, proximity)

3. Visual Design:
  • Created a clean and distraction-free interface:

    • Large, readable typography for cooking scenarios

    • Minimal UI to keep focus on instructions

    • High contrast for visibility in different environments

  • Designed UI to support an interaction-first experience rather than screen-heavy navigation

4. Usability Testing:
  • Conducted informal testing with 4–5 users

Key Observations:

  • Voice interactions were highly preferred during cooking scenarios

  • Users appreciated reduced dependency on touch

  • Some confusion around the discoverability of advanced interactions

Iterations Made:

  • Added visual cues for available interactions

  • Simplified interaction patterns

  • Improved feedback for voice and sensor actions

5. Implementation & Launch:
  • Delivered a fully interactive prototype demonstrating advanced interaction capabilities

  • Showcased how multi-modal inputs can enhance real-world usability

(Note: development was not in scope; the focus was on interaction exploration.)
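Although the prototype was built entirely with ProtoPie's no-code triggers, the step-navigation logic behind the voice, gesture, and sensor inputs in step 2 can be sketched in code. The event names and the `CookingMode` class below are purely illustrative assumptions, not part of the actual prototype:

```python
class CookingMode:
    """Illustrative model of the step-by-step cooking mode: several input
    modalities (voice, gesture, sensor) map onto the same navigation actions."""

    def __init__(self, steps):
        self.steps = steps
        self.index = 0  # current instruction step

    def handle(self, event):
        """Route a voice, gesture, or sensor event to a navigation action
        and return the instruction the user should now see/hear."""
        if event in ("voice:next step", "gesture:swipe_left", "sensor:tilt_right"):
            # Advance, clamped at the last step
            self.index = min(self.index + 1, len(self.steps) - 1)
        elif event in ("voice:previous step", "gesture:swipe_right", "sensor:tilt_left"):
            # Go back, clamped at the first step
            self.index = max(self.index - 1, 0)
        elif event == "voice:repeat instruction":
            pass  # stay on the current step and re-announce it
        return self.steps[self.index]
```

For example, with the steps `["Chop onions", "Heat pan", "Add oil"]`, the event `"voice:next step"` advances to `"Heat pan"`, while `"voice:repeat instruction"` keeps the current step. In the actual prototype, each of these routes corresponds to a ProtoPie trigger and response rather than code; the point is that all three modalities converge on one small set of actions.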

Results:

(Exploratory prototype outcomes)

  • Increased usability in hands-busy scenarios (qualitative feedback)

  • Users showed a strong preference for voice-assisted navigation

  • Demonstrated feasibility of multi-modal interaction in everyday apps

Key Takeaways:

  • Context-aware interactions significantly improve usability

  • Voice and sensor inputs can reduce friction in real-world scenarios

  • Discoverability is critical when introducing non-traditional interactions

  • Advanced prototyping tools like ProtoPie enable realistic experience testing

Next Steps:

  • Improve onboarding for interaction discovery

  • Expand voice command capabilities

  • Test in real kitchen environments for validation

Challenge

Traditional recipe apps rely heavily on touch interactions, which can be inconvenient and impractical during cooking scenarios.

This creates friction for users who need hands-free or quick-access interactions while actively preparing meals.

Approach

Explored advanced interaction design using ProtoPie to create a more immersive and context-aware experience.

  • Implemented voice commands for hands-free navigation

  • Utilised touch gestures for quick interactions

  • Integrated sensor-based inputs for contextual actions

  • Designed a minimal UI to support an interaction-first experience

  • Added visual cues to improve discoverability of interactions

The project demonstrates how multi-modal interactions can enhance usability in real-world scenarios.

By reducing reliance on touch and enabling intuitive inputs, the app creates a more seamless and practical cooking experience.
