EviCare

EviCare is an AI-first exploratory app that allows women to record and revisit their physical experiences, creating structured data for pattern recognition. The focus of the project is on human–AI collaboration and process experimentation, rather than providing medical advice or a finalized product.

Project Type

University Group Project, MSc Interaction Design, Amsterdam University of Applied Sciences

Collaborators: Ève Bérard, Emily Sonnier
Collaborators: Daniël Bolt, Ève Bérard, Halszka Barczuk

Duration

2 weeks (22 September - 5 October 2025)
8 weeks (27 October - 18 December 2025)

Methods

AI Prompting, Desk Research, Survey, Prototyping, UI Design, Interaction Design, User Testing

Tools

Figma Make, Claude, Copilot, ChatGPT, Google Stitch, Gemini, v0 by Vercel, Canva

Context

EviCare is an AI-first design exploration conducted during a master’s studio, investigating how human–AI collaboration can shape design workflows. The project focused on creating an application where women can log and track their bodily sensations over time, enabling the identification of recurring patterns through AI-assisted analysis. Rather than delivering a finalized product or providing medical advice, the exploration emphasized experimentation, iterative prompting, and workflow-driven learning to understand how AI can act as a collaborative, non-authoritative design partner.

Exploration

The project investigated how prompting, iteration, and AI-generated outputs influence design thinking. Instead of targeting a production-ready solution, AI was treated as a creative collaborator, and the project examined how design decisions can emerge through human–AI interaction.

Approach

An AI-first workflow was adopted, in which ideation, branding, interaction concepts, and visual assets were co-created using multiple AI tools. The process was shaped by experimentation, rapid iteration, and critical evaluation of AI-generated outcomes rather than by traditional linear UX methods.

PROCESS

Early Prototyping with v0 by Vercel
  • Early prototypes were generated using v0 to explore the capabilities of different AI tools.

  • The experimentation highlighted that AI tools vary greatly in output style and control, providing insight into which tools support rapid ideation versus detailed refinement.

  • Key learning: Initial exploration helps define which AI tools are suitable for specific stages of the design workflow.

Google Stitch Interface Exploration
  • Interface prototypes were explored using Google Stitch to gain more control over layout and interaction concepts.

  • The experimentation revealed the need for a tool that allows precise design refinement while still leveraging AI generation.

  • Key learning: Balancing automation with manual control is essential when designing interactive workflows.

Copilot Visual Experiments
  • Multiple logo and visual concepts were generated in Copilot based on the initial project name “Clinic Card”.

  • Outputs were highly variable due to limited input, revealing the importance of structured prompting for consistent results.

  • Key learning: AI can produce diverse ideas quickly, but refining prompts and providing clear constraints improves alignment with project goals; an illustrative prompt structure follows below.
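
To make this concrete, the sketch below shows the kind of constraint-rich prompt structure that tends to produce more consistent visual outputs than a bare one-line request. It is illustrative only; the wording and constraints are assumptions, not the team's actual prompt.

    # Hypothetical example of a structured image-generation prompt.
    # The specific constraints are illustrative, not the project's real ones.
    LOGO_PROMPT = (
        "Design a logo for 'EviCare', an app where women log bodily sensations over time. "
        "Constraints: flat vector style, soft rounded forms, a limited two-colour palette, "
        "a simple mark plus wordmark only, white background, calm and trustworthy tone."
    )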

Final Logo with Copilot
  • Using the refined color palette and the project name EviCare, final logos were generated in Copilot.

  • The outputs reflected the curated guidance and iterative refinement provided through structured prompts.

  • Key learning: Focused input allows AI to produce visually coherent and project-aligned designs.

AI Functionality Prompting
  • Iterative prompting and back-and-forth experimentation with Claude were used to define the behavior of the AI embedded within the project.

  • This step was critical to ensure that the AI component could respond accurately and reliably while maintaining the intended collaborative, non-authoritative role.

  • Key learning: Effective AI-driven design requires not only visual exploration but also careful crafting of functional prompts for interactive systems; a sketch of such a prompt appears below.
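
As a concrete illustration, a system prompt of roughly the following shape could pin the embedded AI to its neutral, pattern-summarising role. This is a minimal sketch in Python using the Anthropic SDK; the prompt wording, function name, and model identifier are assumptions, not the project's actual implementation.

    import anthropic

    # Hypothetical system prompt constraining the assistant to a
    # non-authoritative role: describe patterns, never advise.
    SYSTEM_PROMPT = (
        "You summarise symptom logs that the user has entered. "
        "Describe only patterns present in the data (frequency, timing, co-occurrence). "
        "Never diagnose, interpret causes, or recommend treatments. "
        "If asked for medical advice, suggest consulting a healthcare professional."
    )

    def summarise_logs(log_text: str) -> str:
        client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
        response = client.messages.create(
            model="claude-sonnet-4-20250514",  # assumption: any current Claude model
            max_tokens=512,
            system=SYSTEM_PROMPT,
            messages=[{"role": "user", "content": log_text}],
        )
        return response.content[0].text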

HOW IT WORKS

Figma & Figma Make

Claude

Outcome

As part of the exploration, a speculative application concept was developed addressing the delayed recognition of women’s health symptoms. The application does not diagnose or provide medical advice; instead, it enables users to log bodily sensations and recurring discomfort over time.

AI was used solely to structure and analyse user-entered data, generating neutral pattern-based summaries without interpretation or recommendation. The concept functioned as a design probe, demonstrating how AI can support reflection and pattern recognition while maintaining a non-authoritative role.

EXPERIMENT

Design directions were explored through iterative prompting and AI-generated variations, enabling rapid divergence beyond traditional ideation methods.

STRUCTURE

Personal experiences were translated into structured data through logging, transforming subjective sensations into trackable patterns.
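
As a minimal sketch of what such structured logging could look like, each free-form sensation becomes a typed record that can be aggregated later. The schema below is hypothetical, not the project's actual data model.

    from dataclasses import dataclass, field
    from datetime import datetime

    # Hypothetical log-entry schema: one record per reported sensation.
    @dataclass
    class LogEntry:
        timestamp: datetime                  # when the sensation was recorded
        body_area: str                       # e.g. "lower abdomen"
        sensation: str                       # e.g. "cramping", "fatigue"
        intensity: int                       # subjective 1-5 scale
        tags: list[str] = field(default_factory=list)  # optional context

    entry = LogEntry(datetime(2025, 11, 3, 21, 15), "lower abdomen", "cramping", 3, ["evening"])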

IDENTIFY

Recurring patterns were surfaced through analytical summaries generated solely from user input, supporting awareness through observation rather than guidance.
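
Building on the hypothetical LogEntry above, surfacing recurrences can be as simple as counting co-occurring fields and reporting the counts verbatim, with no interpretive layer on top. Again a sketch under stated assumptions, not the app's actual analysis.

    from collections import Counter

    def recurring_patterns(entries: list[LogEntry], min_count: int = 3) -> list[str]:
        # Count how often each (body area, sensation) pair was logged.
        counts = Counter((e.body_area, e.sensation) for e in entries)
        # Report only the counts: observation, not guidance.
        return [
            f"'{sensation}' in the {area} was logged {count} times."
            for (area, sensation), count in counts.most_common()
            if count >= min_count
        ]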

FINAL PRODUCT

NEXT STEPS & REFLECTION
  1. The EviCare project demonstrated how AI can actively participate in the design process, shaping ideation, visual exploration, and interaction concepts. Iterative prompting allowed diverse possibilities to emerge rapidly, highlighting the ways AI can expand creative workflows without replacing human decision-making.

  2. Future iterations could focus on refining the AI–human dialogue by establishing clearer prompting strategies and visualizing AI contributions throughout the design process. Exploring integration across multiple AI tools could further enhance iteration speed and concept diversity.

  3. The project also emphasized the importance of structured curation: while AI accelerates exploration, human judgment remains critical in selecting, combining, and refining generated outputs. Ongoing experimentation could investigate best practices for balancing autonomy and control in AI-assisted design.

  4. Finally, this exploration reinforced that AI-first design is as much about process as it is about output. Continued research could explore workflow reproducibility, ethical AI usage, and ways to support designers in developing consistent, meaningful interactions with generative systems.