Part of a 10-week startup incubator with two devs and a PM. We pitched every week: new idea, new slides, hundreds of user interviews, piles of scrapped concepts. Super fun to be that scrappy and messy.
Younger family members spend a lot of time worrying about older loved ones who live on their own. Keeping an eye on them means watching 24/7, which causes constant stress and anxiety.
So we designed a vision-based in-home monitoring system that detects falls and other emergencies, helping older adults maintain their autonomy while giving their loved ones peace of mind.
Visited a senior center. Interviewed in English and Korean.
Autonomy is one of the most important priorities for older adults aged 60–70. Any solution that visibly compromises independence will be rejected, no matter how well it works.
Current solutions like wearables or caregivers create friction. Most older adults forget to wear the devices or find them uncomfortable, and caregivers infringe on their independence.
Jasmita's grandma wants to continue to live alone, but her health is deteriorating. She was recently diagnosed with a neurodegenerative disease. They set up baby monitors but those need to be watched 24/7, which means Jasmita's parents are constantly worried. This is where Oasis steps in…
Multiple cameras cover danger zones around the house, connecting to a central node that processes the visual data to determine if an incident happened.
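The camera-to-central-node flow above can be sketched in code. This is a minimal, hypothetical illustration: the `Frame`, `CentralNode`, and `FALL_DROP` names are assumptions for this sketch, and the stand-in heuristic (a sudden downward jump in a tracked person's bounding box) is not Oasis's actual detection model.

```python
# Hypothetical sketch: every camera pushes frames to one central node,
# which keeps per-zone state and flags possible incidents. The "detector"
# here is a placeholder heuristic, not the real vision model.
from dataclasses import dataclass

@dataclass
class Frame:
    camera_id: str          # which danger zone this camera covers
    person_box_top: float   # top of the person's bounding box (0 = ceiling, 1 = floor)

class CentralNode:
    """Aggregates frames from all cameras and flags possible incidents."""
    FALL_DROP = 0.4  # assumed threshold: how far the box top drops between frames

    def __init__(self):
        self.last_top = {}  # camera_id -> previous frame's box top

    def process(self, frame: Frame) -> bool:
        """Return True if this frame looks like a fall in that zone."""
        prev = self.last_top.get(frame.camera_id)
        self.last_top[frame.camera_id] = frame.person_box_top
        if prev is None:
            return False  # first frame from this zone: nothing to compare
        # A sudden downward jump of the person's position suggests a fall.
        return (frame.person_box_top - prev) >= self.FALL_DROP

node = CentralNode()
node.process(Frame("kitchen", 0.30))          # baseline frame, no alert
alert = node.process(Frame("kitchen", 0.85))  # sudden drop toward the floor
```

Keeping all per-zone state on the central node is what lets the cameras stay simple: they only ship frames, while one place decides whether an incident happened.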
Initially designed the older adult's interface to live in the companion app on their phone.
But many older adults don't have a phone or don't keep it close by. Moved every important function onto the physical hardware; older adults who prefer the companion app can still use it.
Added a small screen on the camera box to display messages from family.
The small message wasn't something anyone really noticed. Switched to soft lights instead. Tried sound first, but it startled some people, so erred on the side of aesthetics: light is calm, ambient, and doesn't startle.
With recent advances in computer vision and edge processing, algorithms are now much more accurate and can detect incidents in real time, on-device.
"Understanding a user" is just a buzzword for listening to someone tell their story. My interviews yielded a lot more when I treated them like actual conversations — just trying to learn as much as I could, being as curious as possible, going deep.
Designing for hardware is genuinely different from software. Having a hardware interface meant I could think beyond text, using sound, light, and color as ways of communicating. Other forms of interfacing entirely.