The plane, the train, the car, and now what? Insulin pumps?! Yes! Finally? What do these things have in common? They are becoming increasingly automated. A few of us at Tidepool have spent the last 4 months designing the interface of the iLet™, Boston University's Bionic Pancreas (BP) device created by Ed Damiano and Firas El-Khatib. My hope is that by sharing our process and what we learned, your devices, understandings, and designs can benefit. Please share, use, and iterate on it all.
This is the first of four posts on the design process of the iLet that we went through. The posts will cover...
1) Kick off and goals
2) Mental models and user research
3) Details from the interface
4) FAQ (or what I think your FAQ will be)
If you have diabetes, I'm sure you come across this too: most people assume that CGMs and pumps are connected; well, of course the pump is responding to the CGM value, right? Well, no, not yet. On one level it's an incredibly simple problem in the hypothetical: sugar goes up, give insulin; sugar goes down, give sugar. On another level, getting a system like this in place in the real world, when you get down to the nitty-gritty of it all, is pretty complicated. The margin of error is so, so tiny, the variables are many, the risk is high, the consequences are severe, the resources are slim, the pace is slow, there are patents to avoid, there is red tape to get around, and the regulations are serious (for good reason). All of these things become real barriers to good design and fast iteration of medical devices, but we are pushing the boundaries as best we can, and things are moving and improving in meaningful ways. I'd like to share our process because part of what makes the development of these devices so slow is the closed-ness of processes and learning. We started the project with a kick-off meeting in person in Boston with the two documents below.
Big Questions
We put our biggest questions down on stickies, brainstormed through them, and then had a time-boxed discussion for each one.
- How can we help people using the BP take care of it through using relatable measurements?
- How might we build trust through communicating device status to people?
- What are good BP habits and how can we encourage them?
- What things aren't supposed to happen but will happen anyway?
- What models or frameworks for alerts do we want to implement?
- What do we believe safety means?
- What happens when the CGM is not connected?
- What hardware are we designing for (speakers, vibration, screen)?
Feature mapping
We put every action and element on a sticky and talked through the backend of it, the limitations, behaviors, and needs, and then had a time-boxed discussion for each one. This helped us make a complete map of the features we were working with.
(Phone connection, alerts, BG entry, carb entry, target, system settings, first experience, CGM, G-burst, hormone [cartridge] replacement, basic controls, stats, weight entry, current status)
Design tenets
What will guide our decisions?
- Be reliable and transparent, show cause and effect.
- Communicate respectfully - people with diabetes are smart; they are experts in their own care.
- Simplify and take away as much interaction as possible.
- Design for safety. Safety means intentional entry, customizable alarms, education, and transparency.
Control and collaboration! How much control does the person using the bionic pancreas have? How does the relationship we have with current insulin pumps shift with the bionic pancreas? Control is a pretty big issue - for blood sugars, exercise, food, and insulin dosing. Pre-BPers have spent their diabetes years trying to have as much control as possible; to imagine relinquishing all of it to an algorithm, or a system of many connected parts, can be terrifying. Some responsibilities will be passed off to the device, and some will still be on the BPer. The relationship with the bionic pancreas will be a back and forth, a give and take. Together, things will be easier.
Humans are good at living, and through the intimate journey with food and insulin, people with diabetes develop an innate sense of what those things do and how they make you feel. This extra sense and understanding is valuable day to day, in relationships, in work, and in the world in general. The other thing that is uniquely human is that we are unpredictable, and we change and grow. We also know when we are going to exercise, or if it is Thanksgiving and we will be sitting and eating all day. These things we need to tell the iLet™; it can't know them without our help. It is, however, SO good at monotonously checking, correcting, estimating, and adjusting. It is good at repetition and being meticulous.
The interaction with the iLet™ is not about passing off everything. It is about collaborating in the management of blood sugar levels. The better I take care of it, the better it will take care of me. This mental model of the relationship as a collaboration guided our design decisions. What could we do in the interface to nudge people to do 'good' things for the device? And what repetitive tasks can the device do for the person in return?
We started with two kinds of goals for the design of the user interface...
Functional goals: What does it do?
- Tells person if device needs help
- Tells person if device is working
- Tells person how body is doing
- Lets person influence and inform device and therapy
Experience goals: How does it feel?
- Safe / effective
- Not intrusive / considerate
- Honest / transparent
- Liberating / control
- Easy to use / simple / basic
- Enabling / independent
- Regular / everyday
While it's easy to go straight into feature details, limitations, and worst- and best-case scenarios, it is so important for everyone to agree on guiding principles and goals first. They set a foundation for the whole team to align on a design direction from the beginning, and they guide decision making the rest of the way. The next post will be on our user research process and understanding mental models.