Checkit
Conscious Food Intelligence — AI-powered health ratings for what you eat
Checkit is an AI-powered food health app designed to help people make genuinely informed choices — not just count calories. It was born out of a personal frustration during my exchange semester at San José State University in Silicon Valley: coming from Germany, I found the opacity of US food safety standards striking. A lot slips under the radar, and most people simply don't have the knowledge to understand what certain ingredients do to their bodies over time.
The project started as coursework in a university UI design class, then grew into something real when an external startup founder joined our three-person team and adopted our designs and concepts as the foundation for his company.
The challenge
Nutrition labels tell you calories and macros. They don't tell you whether an ingredient has been linked to long-term health risks, whether it conflicts with your specific allergies or conditions, or whether the product in your hand is simply a bad choice for you. People are left to navigate this alone — usually without the knowledge to do so.
Existing apps either offer generic traffic-light ratings that ignore individual health context, or bury users in detail that takes nutritional expertise to parse. Neither works at the moment that matters: standing in a supermarket aisle with thirty seconds to make a decision.
How we approached it
The project began with field research in grocery stores — observing how people actually make purchase decisions, where they look, and how long they spend. This was paired with interviews focused on health-conscious shoppers and people managing specific dietary conditions or allergies.
Key insights:
- The in-store decision takes under 30 seconds — any solution requiring more time will not be used
- Personalisation is the key differentiator — a rating that accounts for your conditions is trusted; a generic rating is ignored
- Two contexts, two different needs — scanning in-store and researching at home require fundamentally different interfaces
- Transparency builds trust — users want to understand why a product got its rating, not just what it is
- The barcode scan is the natural trigger — it fits existing behaviour without introducing a new habit
What we built
Checkit works in two modes, each designed around a different context.
In Supermarket Mode, a barcode scan delivers an instant, personalised AI health rating — a single clear score with a one-line explanation tailored to the user's profile. Not "this product is high in sugar" but "this exceeds your daily goal and may affect your blood pressure management." The entire UI fits above the fold, designed for one-handed use. In Home Mode, users can go deeper: full ingredient breakdowns, references for flagged additives, pattern insights across their scan history, and preparation for the next shopping trip.
- Supermarket Mode: Instant barcode scan → personalised AI health rating in under three seconds
- Home Mode: Full ingredient analysis, scientific references, cumulative scan pattern insights
- Personalisation engine: Accounts for allergies, dietary preferences, health conditions, and personal goals
- Transparent ratings: Every score explained in plain language — no black-box verdicts
- Long-term damage flagging: Identifies ingredients that accumulate risk over time, not just immediate red flags
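The rating behaviour described above can be illustrated with a minimal sketch. Everything here is a hypothetical assumption for illustration — the `UserProfile` fields, the penalty values, the additive list, and the 0–100 scale are invented for this example and are not Checkit's actual engine:

```python
from dataclasses import dataclass

# Hypothetical user profile -- illustrative fields, not Checkit's real data model.
@dataclass
class UserProfile:
    allergies: set              # e.g. {"peanut"}
    conditions: set             # e.g. {"hypertension"}
    daily_sugar_limit_g: float = 50.0

# Hypothetical set of additives flagged for long-term risk (illustrative only).
LONG_TERM_RISK_ADDITIVES = {"E250", "E621"}

def rate_product(ingredients: set, sugar_g: float, profile: UserProfile):
    """Return a 0-100 score plus plain-language reasons, personalised to the profile."""
    score, reasons = 100, []

    # Hard conflict: any ingredient matching the user's allergies.
    allergen_hits = ingredients & profile.allergies
    if allergen_hits:
        score -= 60
        reasons.append(f"contains your allergens: {', '.join(sorted(allergen_hits))}")

    # Personal goal check: sugar relative to the user's own limit, not a generic one.
    if sugar_g > profile.daily_sugar_limit_g:
        score -= 25
        reasons.append("exceeds your daily sugar goal")
        if "hypertension" in profile.conditions:
            reasons.append("may affect your blood pressure management")

    # Long-term flagging: additives that accumulate risk over time.
    risky = ingredients & LONG_TERM_RISK_ADDITIVES
    if risky:
        score -= 10 * len(risky)
        reasons.append(f"flagged additives with long-term risk: {', '.join(sorted(risky))}")

    return max(score, 0), reasons
```

The key design point the sketch tries to capture is that every deduction carries its own reason string, so the score is never a black-box verdict — each explanation maps to one rule applied against this user's profile.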
The fully designed Figma prototype was adopted by an external startup founder as the foundation for his product — an outcome none of us planned at the start, and probably the most validating thing a university project can produce. He identified the two-mode structure as the core insight: food decisions happen in two fundamentally different contexts, and each demands its own interface.
The project was presented at San José State University as part of the HCI programme.