
Aeri
AI-powered preventative care for diabetes with AR integration
AI Hackathon
AR
Timeline
24 hours (Feb 27-28, 2026)
Role
Product Design — Interaction Design, Visual Design, Motion Graphics, System Architecture
Team
Melody Ekbatani (Product Design)
Isabella Mixton-Garcia (Research)
Tools

Overview
Aeri was created for the Parsons x University of Arizona 24-hour AI Hackathon, where teams were assigned. It is an AI-powered preventative care concept designed for people managing Type 2 diabetes.
The project explores a more proactive alternative to today’s reactive health tools by helping users anticipate changes earlier. Aeri combines continuous sensing, predictive AI, and an augmented reality interface to deliver clearer, more actionable guidance throughout the day.

Design Challenge
What if diabetes care could become more preventative?

Problem Space
Diabetes care today is largely reactive
Where current tools fall short
Health tools provide constant data, but diabetes management still depends heavily on the individual. Users must interpret readings, track patterns, and repeatedly check devices throughout the day.
With sensors that require routine replacement, care becomes a cycle of monitoring, maintenance, and reaction. This revealed an opportunity for a more preventative and supportive experience.
1.0 Current devices show the data, but users are left to carry the cognitive load.

Solution
Shifting toward proactive care by anticipating changes early
Our goal
We set out to design a system that helps people act before issues intensify. We focused on the management of Type 2 diabetes, approaching the problem through earlier awareness, clearer guidance, and more confident decision-making.
Our solution, Aeri, is an AI-powered preventative care concept that utilizes an AR-integrated contact lens to help users anticipate changes earlier throughout the day.

Approach
A more supportive experience
Turning metabolic data into everyday guidance
Aeri’s core experience is built around four connected parts: live interventions, daily overviews, metric breakdowns, and user controls. Together, they make health insights more useful, timely, and actionable.
User controls also give people more choice over what is surfaced, how prominently it appears, and when they are notified.
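As an illustrative sketch only, these user controls could be modeled as a small settings object. Every name, field, and default below is an assumption for demonstration, not the actual product schema:

```python
from dataclasses import dataclass, field
from datetime import time

# Hypothetical settings: which metrics surface, how prominently they
# appear, and when notifications are allowed (all fields assumed).
@dataclass
class AeriControls:
    surfaced_metrics: set = field(default_factory=lambda: {"glucose"})
    prominence: str = "widget"       # "widget" | "side_panel" | "full_view"
    quiet_start: time = time(22, 0)  # suppress notifications from 10 pm...
    quiet_end: time = time(7, 0)     # ...until 7 am

    def may_notify(self, now: time) -> bool:
        """Notifications are suppressed during the quiet window."""
        if self.quiet_start <= self.quiet_end:
            return not (self.quiet_start <= now < self.quiet_end)
        # Quiet window wraps past midnight (e.g. 22:00 -> 07:00).
        return self.quiet_end <= now < self.quiet_start

controls = AeriControls()
controls.may_notify(time(23, 30))  # inside quiet hours -> False
```

A settings object like this would let the prominence choice feed directly into which of the three lens modes is shown by default.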





Key Features
Aeri is structured across three integrated layers
Biosensing
Continuously tracks metabolic signals
Captures glucose data in real time
Sends readings to a connected system
Predictive AI
Detects patterns across glucose and behavior
Predicts shifts before they escalate
Turns data into timely recommendations
AR interface
Delivers support through an AR-integrated lens
Interact with insights in real time, hands-free
Adapts to your focus with three modes: widget, side panel, and full view
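As a rough sketch of how the three layers might hand off to one another, the flow below uses a naive trend projection and illustrative thresholds. None of this is a clinical algorithm; the function names, numbers, and projection method are all assumptions:

```python
# Illustrative Biosensing -> Predictive AI -> AR interface hand-off.

def predict_next(readings: list[float], horizon: int = 3) -> float:
    """Predictive AI layer: naive linear extrapolation of recent glucose."""
    if len(readings) < 2:
        return readings[-1]
    slope = readings[-1] - readings[-2]  # mg/dL per reading interval
    return readings[-1] + slope * horizon

def choose_ar_mode(predicted: float) -> str:
    """AR interface layer: escalate display prominence with predicted risk."""
    if predicted > 180 or predicted < 70:    # outside an assumed target range
        return "full_view"
    if predicted > 160 or predicted < 80:    # approaching the range edges
        return "side_panel"
    return "widget"

# Biosensing layer: a stream of recent glucose readings (mg/dL).
readings = [110.0, 118.0, 127.0]
predicted = predict_next(readings)   # 127 + 9 * 3 = 154.0
mode = choose_ar_mode(predicted)     # "widget"
```

The design idea this captures is that prediction, not the raw reading, drives how prominent the AR guidance becomes.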

Context
More than 1 in 10 adults worldwide live with diabetes
A growing global health reality
This scale reflects a growing need for tools that better support the ongoing decisions, adjustments, and mental effort of daily care. Managing diabetes is not limited to isolated moments of checking a number. It is shaped by repeated choices around food, activity, sleep, stress, and routine, all of which can influence glucose levels throughout the day.
As a result, care often becomes a continuous process of monitoring, interpreting, and responding. This creates an opportunity to design tools that do more than report data by offering guidance that feels more supportive, timely, and easier to live with.

Market Research
Reactive systems create reactive behavior
Where the gap emerged
Conversations and user interviews point to a system that interrupts daily life rather than supporting it: frequent sensor failures, constant replacements, and the need to repeatedly check numbers create frustration and fatigue. Instead of feeling supported, many feel tethered to their devices, managing alerts, troubleshooting issues, and trying to make sense of inconsistent readings.
2.0 Research revealed the emotional and mental strain of reactive diabetes management.

Competitive Research
Tracking diabetes data isn't easy
The limits of today’s tools
Across the category, products have improved in sensing accuracy, connectivity, and real-time display. However, reliability issues, short sensor lifecycles, and fragmented device ecosystems continue to introduce friction.
As conditions become more complex, the operational overhead increases, highlighting an opportunity to reduce system friction and create a more seamless experience that requires less active management from the user.






Iterations
Translating ideas into a working vision

Exploration
Mapped early system directions
Explored sensing, prediction, and interaction flows
Tested multiple concepts to identify viable pathways

Prompting
Generated interface behaviors and system responses
Explored interaction patterns at higher fidelity
Evaluated clarity, usability, and feasibility

Asset Creation
Built interface and visual system outputs
Developed scenes to show the product in context
Refined consistency across touchpoints

Pitch Video

Reflection
My takeaways
This hackathon strengthened my skills in time management, prioritization, and decision-making under pressure. In a 24-hour sprint, I had to stay focused, move quickly, and make strategic choices about scope.
It also reinforced the value of team collaboration, delegation, and execution under constraint. By dividing responsibilities early based on each person’s strengths, we worked more efficiently and built a stronger final story.
As my first AI hackathon, it expanded how I think about AI in the design process. I learned how to use prompting, rapid asset generation, and concept development more intentionally, while also considering scalability, feasibility, and product strategy.

Next Steps
Paths for further development

User Research
Expand user research with individuals managing Type 2 diabetes to ground the system in lived experience.

Model Accuracy
Refine predictive modeling with clinical datasets to improve accuracy and reliability.

Biomedical Feasibility
Further evaluate the biomedical feasibility of the sensing approach and underlying hardware.

AR Validation
Test AR guidance interactions in real-world contexts to validate usability and behavior.

