Advice for Designing Home Healthcare Devices
A human-centered approach is key to survival in the digital age.
As recently as five years ago, medical devices for in-home use were mostly analog, as opposed to digital, and characteristically ugly, as opposed to sleek. This is no longer the case, as digital disruption in the home healthcare market and technological advances now imbue these products with the kind of agility and elegance more commonly associated with consumer electronics.
In-home medical devices constitute a wide range of products, from prosthetic and orthotic appliances to home monitoring devices like blood glucose meters to medical wearables. But with all the progress and excitement that digital healthcare provides, medical device designers face a new challenge: how to infuse products redolent of disease and decline with intuitiveness, comfort, and even delight.
appliance DESIGN spoke with two experts in this growing field about the role of human factors in medical device design. Bryce Rutter, PhD, is the founder and CEO of Metaphase Design Group, which specializes in handheld and ergonomic product design. Tor Alden is the CEO of HS Design, a product development firm for medical, life science, and digital health products.
Circumvent Usability Issues
Bryce Rutter, Metaphase Design Group: There are a couple of things that have happened that run counter to usability and good, thoughtful design. One is that our hands are not shrinking, so stop making these devices so small that I can’t even get my hands around them, or I’m dropping them too much, or they’re too slippery, or any of the buttons or switches that I have to touch are so tiny that it slows me down, and I have to go miserably slow to operate the device. Size matters. The size of the device must be scaled for optimal usability, which means it should be scaled to fit the 95th percentile of hands and hand function.
Another thing I see happening is that a lot of the practice of care and procedures are being pushed out on the users, people like you and me. We are given a set of instructions, and there are about 27 steps that are poorly illustrated, if illustrated at all; so it’s extremely difficult for the users to understand the correct way to use the product and teach themselves. It goes back to the emotional component of design: If it needs this many instructions, this must be really serious, and it must be really dangerous if I get it wrong. It sends so many negative emotional cues for usability back to the user that it directly impacts compliance. When people are confronted with complicated interfaces, it really dampens the level of compliance, which is counterproductive to the whole process of taking care of yourself.
HSD’s fundamentals of good user experience (UX) design. Source: HS Design
And now we have medical devices that are connected to the Web, with digital health records and automated alarms going back to the clinician, who can call the user and say: ‘I’ve been monitoring your status and your heart rate is up. Is everything okay? I think you should come in.’ It allows for a lot of proactive healthcare, but there is a significant portion of the population that didn’t grow up on technology—specifically the Baby Boomers, who, to a large extent, are not as technically literate as younger users, Gen-Xers and Millennials, who grew up in a digital environment—and networking these home devices with your wireless router, say, is not the easiest task for many users. In this clash between the Internet of Things and the Healthcare of Things, older users can get bogged down in just trying to set up the device and get it connected.
Tor Alden, HS Design: The digital and physical, UI and UX, have to be combined; they have to be as intuitive as possible; and they have to be based on their environment.
The only difference between consumer and medical, really, is that the FDA is controlling the safety and efficacy of the medical product for the user. With a medical device, you need to go through a certain number of steps to understand how you get to your solution: documenting, through both contextual inquiry and formative studies, then doing a validation study to prove that the device actually works well and that the majority of people you recruited and tested complied—went through the series of tasks without use error. You have to demonstrate that a user can navigate the device on their own, with the proper training.
Rutter: Now, some people have gone so far as to take medical devices and make them electronic devices, so you really don’t even know it’s a medical device. I think that’s an interesting strategy; but in my research, I’ve found that, routinely, consumers of these healthcare products that have been pushed into this consumer electronics space say, ‘That’s not really working for me. I still want it to look like a medical device; I just don’t want it to look like a torture device.’
So, be honest and true, and don’t try to camouflage it. Be authentic in your design. That way, you come back to the concept of functional aesthetics. By sculpting the product in certain ways, you can indicate, for example, where the user’s fingertips should grasp, and that can be reinforced with textures, and so on. By manipulating these human factors, you can develop a preconceived user experience that is going to be consistent with what you want to convey, and that is also going to be representative of what your brand stands for: that brand’s design DNA.
Track the User’s Journey
Alden: With a medical device, you need to worry about reducing the cognitive load; and the best way to do that is to promote an intuitive, natural reaction to the interface. However, when you do that, you have to decide how complex the user interface is, and how far the user can go off the specific task.
With certain interfaces, you want to keep them extremely linear, so they only go one way until you complete the task; and if you need to go back, you only go back one way. In other cases, you can develop a user interface that’s more agile, where if you want to go from one spot to another, you can go off that path and move, in a similar way to an iPhone app. The challenge with that is you might lose your way in the navigation; and if you have a critical task to finish, it may not get completed.
We need to understand the use cases that are involved, as far as how users are supposed to complete the task. That comes into the workflow, of understanding if there are primary and secondary paths, and all of that is based on creating what we would call a ‘good foundation.’ For example, if you’re designing for a double-gloved surgical nurse in a surgical operation, and they have to interact with a touch screen, can they even do that, or do they have to remove the glove? Would voice or gesture control be a better approach? When does the user get a defined completion of the task? Is it a haptic response, or a visual one? All of these things need to be considered within their environment.
Example of a product storyboard. Source: HS Design
Rutter: First, we start to model the user’s journey. How do they begin their journey when they are passed the device in the hospital or their clinician recommends that they go out and buy one? That is the beginning of the ecosystem that they’re going to be living in with that particular product. It’s also the first step of looking at a wide variety of different types of users.
With a lot of healthcare devices, the market skews to the Boomers more than any other cohort. So, it is not uncommon that in this cohort, because it’s an older cohort, we may have some cognitive issues that are beginning to settle in, like reduced memory and attention span issues. Taking into account all of those factors, on top of the physical design of the product, I think is key to success.
The journey map is a tool that we use to analyze what’s going on—how does the user experience differ from a small person to a large person, male to female, young to old—and we’ll run those journey maps across different types of people. Then we run conceptual designs that start to address the pain points that we’ve identified through the journey mapping, and we weave in the ergonomic factors like strength, range of motion, and any kind of cognitive or physical impairment that the user may be dealing with as well. Then we test with these users, without instructions—because the best designs don’t need instructions—and we’re able to determine how the learning curve is working, and how the daily use interactions are unfolding. From there, you can get a good handle on the overall user experience, which allows you to narrow down and then select one design, which is typically a hybrid of more than one of the concepts that’s been tested, and then do another round of design validation testing with the device, with the same intended user groups.
That’s the path to go down, to tease out the risk for not getting it right. It’s very iterative, very user-centered, and, fundamentally, it’s all based on human factors, right down to the sensory level. What can you feel? What can you hear? Does it feel cheap? Does it sound cheap? The haptic signature, the acoustic signature, and the visual signature of the product—these are the key factors.
Create a Strong Value Proposition
Alden: Similar to any other product, you have an innovation adoption lifecycle with medical devices, where you have your early adopters all the way to the laggards. Nurses tend to be laggards because they understandably don’t want change. They’ve got a million things to do, and the last thing they want is to learn a new protocol and a new system—unless it saves them time. That gets back to your value proposition: You need to be providing a gain or removing a pain in order for the user to appreciate the technology.
Rutter: The first challenge in this space is that there are many companies that are engineering-driven. They use an engineering-driven innovation strategy, not a user-centered innovation strategy. With the engineering-driven strategy, it’s all about the cool tech that they have figured out, worked out, and finessed, and how they can manufacture at the line speeds they need in order to make money. It is the most common innovation strategy that I see in the industry, especially in healthcare design. But as we’ve been talking about here, they forgot about you and me. Minimal consideration is given to the users’ capabilities, both physical and emotional. Is there a better way to do the same task by redesigning the device or instrument? I think that is the biggest challenge: trying to expose engineering-driven companies to the value of design and the value of talking to end-users. And as I say this, you’re probably thinking, ‘Well, why wouldn’t you do that? That’s so obvious. Go talk to the people you want to sell your stuff to.’ But it’s not done. Surprisingly, it’s not done.
Another challenge that is persistent in healthcare design is not designing for the human form. When you look at the human body, there’s not a flat spot or corner on it. Everything is three-dimensional and amorphous. Yet we take mobile devices and make these flat, rectilinear slabs for our hands that are so slippery that we are now forced to buy cases to add non-slip tactility and to protect them from the inevitable drop. The designs of our cell phones are so bad that a whole other industry for protecting bad designs has grown out of it with cell phone cases. That’s a remarkable statement about how bad their design is, when a whole new industry crops up.
To my point: Don’t let the technology package dictate the form of the product. If it is a handheld device, look at the shape of the hand and figure out what kind of tactile controls are needed, and where the finger naturally falls in space that would be comfortable, and then scale the form and size of the product to fit that negative space in your hand when you just hold it out. This will mean getting away from traditional, rectilinear designs and rigid circuit boards to flex circuits that can be fitted into nonlinear and amorphous shapes. We are at a point now with new technology that we have foldable and rollable display screens, which will give designers a lot more freedom to not follow the slab mentality.
Another thing that I’m continually astonished by is that most brands design the device and then forget about all of the other stuff that goes along with the user experience. They do a really nice job on the product, hopefully, and then they put it in a box that’s hard to open: a clam shell that you need an X-ACTO knife to cut your way into and try not to slash yourself. And once you get into the box you have these lousy instructions; and when you can’t figure those out, you go to the website, and the interface on the website is not intuitive. So, all of these touchpoints in the ecosystem of that product—the packaging, the instructions, the web portal, the app on your smartphone—all of these are part of that consumer product experience that needs to be deliberately planned and designed. By manipulating human factors—through form, through color, through sound—the designer can enable the user to have the most seamless experience possible.
Make the Right User-Technology Match
Rutter: There’s an interesting thing happening, and it’s being amplified right now with small medical devices, wearables and in-home devices. The technology has shrunk to a point where the shape of the device can be anything it needs to be.
One of the reliable design strategies when you’re developing products for the home healthcare market is to use legacy experiences that people have from other mobile devices. Pinching and squeezing, swiping—all of these gesture commands and voice entry capabilities—they’re all commonplace, and they do have a place in home healthcare products, because the user is familiar with them, which takes away stress from having to learn something new.
Alden: If you look at the actual users of these medical devices, they’re the same users who are wearing Apple watches, driving high-end cars, and experiencing the design touches that are given to consumer products. Currently, we’re seeing a lot more of that design value being added to medical.
The reason I’d say the medical device industry wasn’t previously following consumer trends was the obsolescence curve: medical devices typically had to live up to 10 years in their environment, so by the end of their 10-year cycle, they had become dated. But now, with new technology being so influential, we’re seeing that new medical devices are not aging toward obsolescence in the same way; they’re actually integrating with newer technology, like Apple and Android devices. For the most part, medical device users are interfacing with touch screens or interactive displays. And if you go into the consumer healthcare space, there is a huge pattern of patients being given more access to their healthcare.
We just developed a new device called gammaCore, which stimulates the vagus nerve to reduce chronic headaches. It’s a first-of-its-kind product; it’s never been done before. The user is given a prescription of 30 doses, they buy them, they’re sent an RFID card, they can reload their device, and then they’re allowed to stimulate their vagus nerve for a series of sessions, in a way that would be very similar to a pharmaceutical delivery device. In this case, you have untrained users, new technology, and an unknown interface; and in these types of cases, it’s really important to understand the environment that the users are in.
For example, when we first developed this product, we did some very early concept designs, and we thought it would be very similar to an iPhone, in that the user would tell an app how they wanted to dose, change the volume, and stimulate the vagus nerve. But what we quickly found out in the formative interviews was that patients were very averse to displays when they had migraines. It seems obvious now, that bright lights would cause problems, but what we had to do was pivot and change the whole device to make the interface less like an iPhone’s and much more subdued, while still giving the same amount of information. We used in-mold decorating to provide a backlit display that created more of an icon approach, with more subtle light, to give users the information without the bright light. So, this was a case of using new technology and moving into an unknown area, with a new environment and a new patient characteristic, to create a preferred design platform.
How Can Designers Prepare for the Future?
Alden: Augmented reality is probably the newest and most uncharted Wild West we’re in right now. The stage has been set by the dominant players, like Microsoft with the HoloLens, and they’ve created templates and beginning structures. But for every new use and product that we’re encountering, it’s a new environment, it’s potentially different users, and the users are all basically working with a computer. So, you have the human-computer interaction, you have the perception of the computer in the environment, and on the third side of the triangle, you have the conventional reality, if you will, between the human and the environment—all of that leads to a mixed-reality component. And when you start looking at those three elements, how human, computer, and environment overlap, it can be challenging.
Each user interface, now and looking forward, has its own factors of success. So, when we start bringing it back to the FDA, and demonstrating that it’s meeting IEC 62366—which is the holy grail for usability for the FDA—it’s really important to provide the documentation of how you got to a solution. It’s almost to say that you can do any solution as long as you prove that you tested it and it meets the environmental conditions. But because of its unknown use cases, it’s just much more challenging to create the formative studies to document how we got there.
For example, if we’re developing an augmented reality interface, we start with paper models in the beginning and work through by trying to understand the user needs. One study might be through the paper models, the next study might be through an animated model or a similar prototype model, and then the final model might actually be with something like the HoloLens. And when you’re testing real users with a HoloLens, you also have to consider that most of the users have never used a HoloLens before. Not only are you testing your new interface, but you’re bringing users up to speed in a whole new environment, so these challenges probably need a little more time and longer [adoption periods] for the users to understand what you’re trying to do.
Rutter: First of all, I think that telemedicine is here today, and it’s probably going to have the most profound impacts on taking care of yourself at home. We can find many examples where a user who’s had a stroke or a heart attack is now at home with a tablet that has their healthcare provider’s portal already loaded on it, which gives a handshake to their Wi-Fi. They’re wearing some kind of wearable, like a heart-rate monitor, and the signal from the device Bluetooths to the tablet, which goes back into a monitoring center, where they’ve got nurses monitoring the patient. That’s going to change healthcare quite a bit, and I think that’s tremendous. But I also think that the requirement for success there is the ease and intuitiveness of that user interface.
I’ve asked so many people about, and I’ve done some lectures on, the dignity of living. Everyone talks about the dignity of dying, but nobody is talking about the dignity of living. I think there is an enormous opportunity, and probably the biggest one in my career, to bring dignity to living by humanizing medical devices with great design, taking the stigma away from how these things look, and to make money doing it. Usually, those two things are not in lockstep with each other; one usually comes at the cost of the other. But the first brands that really get there and nail it are going to have astronomical financial results, because this is the richest cohort to go into retirement, they have more disposable income than any other age group going into retirement, and they’ve all had exposure to great designs in other areas of their lives.
Our cars are amazingly sophisticated now. We have automated appliances in our kitchens. We have smartphones. And then we look at the healthcare side, and we say, ‘Really? The walkers with the tennis balls at the bottom are the status quo? That’s ridiculous.’
Many medical devices that we use outside the hospital we carry with us, or wear on our bodies, or use in our bathrooms or kitchens. They have been engineered, but they have not been designed. They definitely have not taken into account human factors that have a direct impact on how the use of that product makes a person feel. We’ve always heard about ‘form follows function,’ and I think that is the price of entry, and that takes into account the physical interactions between the person and the product. But now, there’s a form follows emotion side, and this is where the designer really needs to pay attention to functional aesthetics.
You need to tell the user a story about how to pick up, use, and interact with the product. You have to do this through the manipulation of form, color, and audible feedback, and do it in a way that does not remind the user of how sick they are or whatever state their condition is in. Rather, the product should bring an uplifting, positive lifestyle to the person as they use it.