After iteratively designing and developing the Family Snack Buddy application, we wanted to get it into the hands of users and see how they would use it in their everyday lives. It's one thing to get users' feedback on the application design and see how they use it in a controlled setting to complete tasks. It's another thing entirely to understand how people use it "in the wild."
A field trial, in which users engage with the application for a prolonged period of time, was an ideal methodology for exploring "in the wild" use. We were interested in learning how people used the application and what their experiences were interacting with it on a regular basis. We also wanted to understand how the application affected their beliefs about healthy eating and their eating behaviors.
The field trial, illustrated in the image above, was 12 weeks long. However, participants only used the Family Snack Buddy application for four of those weeks. The last six weeks of the study were a follow-up period in which we wanted to see whether any changes to participants' beliefs about healthy eating and their behaviors persisted after they stopped using the application. During that time we only collected data on participants' eating behaviors and beliefs.
In this study, we randomly assigned families (each of which had one parent and one child) to one of two groups. The first group, the intervention group, used the application during the four-week "intervention period." The second group, the control group, did not use the application during that period and instead went about their daily lives as normal. The goal was to control for external factors (e.g., changing seasons and weather) when looking at the effects of the application on participants' beliefs and behaviors.
In total, we had five families in the intervention group (each with one parent and one child, N=10) and five families in the control group (each with one parent and one child, N=10).
As illustrated in the overview diagram, we used a photo-elicitation interview (PEI) methodology to collect information about participants' eating behaviors before and after the study. For the PEI, we asked participants to take pictures of all the food they ate over a one-week period. After they had taken pictures for a week, we met with them to discuss the photos and explore their perceptions of healthy eating.
With the intervention group, we also conducted semi-structured interviews about their experiences using the application, both at the midpoint of their four-week use period and again at the end. We interviewed at two time points because in previous studies we found that participants often forgot about their earlier experiences if we only interviewed them at the end. Interviewing at the midpoint allowed us to learn more about their experiences learning the application and integrating it into their lives.
During this study, we collected a rich dataset of qualitative and quantitative data related to participants’ use of the application, their experiences using it, the interactions between family members surrounding the application, their health beliefs and behaviors and the application’s effects on them, and more.
One of the main focuses of this research was understanding how participants used the application. Below, I discuss a few of our findings about how participants used the application and how they evaluated the individual features we included in it.
The graph above shows the average number of days per week that participants used the application. We logged every interaction that users had with the application so that we could understand exactly how they were using it. We counted a "day of use" if the participant used at least one major feature in the application that day (i.e., entering a snack, viewing their snack history, viewing their family's snack history, creating a new snack, or reviewing the snack suggestions they had previously received).
This graph was particularly interesting because previous research has consistently found a well-defined "wow factor" in using health applications, where participants use an application heavily when first starting out but then use it significantly less after the second week and beyond. We did not observe this effect. Instead, we saw consistent engagement across the four-week study, with participants continuing to use the application almost every day.
We dug into this more deeply and broke down use by family, as shown in the second graph. One family (F5) had significantly lower use during the third week and accounted for the drop during that week in the aggregated view. During our interviews we discussed this drop with the family and found that they had a family member in the hospital, which made it more difficult for them to stay engaged.
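The "day of use" metric described above is straightforward to compute from interaction logs. The sketch below illustrates one way to do it; the log format and feature names are hypothetical, chosen only to mirror the major features listed earlier, and the dates are illustrative rather than study data.

```python
from datetime import date

# Major features as described in the study's day-of-use definition.
# These identifiers are assumptions for illustration.
MAJOR_FEATURES = {
    "enter_snack", "view_own_history", "view_family_history",
    "create_snack", "review_suggestions",
}

def days_of_use(log, participant_id):
    """Count distinct days on which the participant used at least one
    major feature -- the study's definition of a 'day of use'."""
    return len({
        day for pid, day, feature in log
        if pid == participant_id and feature in MAJOR_FEATURES
    })

# Hypothetical log: (participant_id, date, feature) tuples.
log = [
    ("P1", date(2015, 3, 2), "enter_snack"),
    ("P1", date(2015, 3, 2), "view_own_history"),  # same day counts once
    ("P1", date(2015, 3, 3), "messaging"),         # not a major feature
    ("P1", date(2015, 3, 4), "create_snack"),
]
print(days_of_use(log, "P1"))  # → 2
```

Using a set of dates means multiple interactions on the same day collapse to a single day of use, matching the definition above.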
What did participants like about the application that kept them engaged during the entire study? During the interviews, family interaction surrounding the application emerged as one of the primary factors keeping users engaged. One participant summarized this well during their interview:
“And you know, kind of looking at [my kids’] snacks and looking at the stuff on the application and thinking: ‘well gee, [my kids] could have this instead of that’ and you know. It kind of helped me monitor I guess, what the kids were eating, and I mean, even my grandson, I didn’t realize he was having unhealthy snacks either. So now we have gotten to the habit where we look at everything, like sugar content and stuff like that and making sure it is, you know … not chips”
We also asked participants to rate the usefulness of each feature in the application from 1 (Very useful) to 5 (Not useful at all). The star feedback system and the snack history (where participants could review the snacks they had entered) were the most useful features for participants, followed by the core ability to enter snacks and to review family members' snack history.
The two features identified as not useful (i.e., they had an average rating above the scale midpoint) were the messaging feature and the ability to switch between the two interfaces we built into the application (see the overview of Family Snack Buddy for more about the interface). This was consistent with what we heard during the interviews and with the frequency with which participants used those two features. More details about the study and additional findings can be found in our CSCW publication.
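The "not useful" criterion above (an average rating past the scale midpoint, since higher numbers mean less useful on this scale) can be sketched in a few lines. The ratings below are made up for illustration and are not the study's data.

```python
# Hypothetical ratings on the 1 (Very useful) .. 5 (Not useful at all)
# scale described above; values are illustrative, not study data.
ratings = {
    "star_feedback": [1, 2, 1, 2],
    "snack_history": [2, 1, 2, 2],
    "messaging":     [4, 5, 3, 4],
}

SCALE_MIDPOINT = 3  # midpoint of a 1-5 scale

def mean(xs):
    return sum(xs) / len(xs)

# A feature counts as "not useful" when its mean rating falls above
# the midpoint, because higher ratings indicate lower usefulness.
not_useful = [f for f, rs in ratings.items() if mean(rs) > SCALE_MIDPOINT]
print(not_useful)  # → ['messaging']
```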
Findings from the research study were used to inform the design of the Individual Snack Buddy application, which focused on individuals instead of families. The individual application allowed us to explore some different research questions around health beliefs and mechanisms of behavior change, which we were unable to explore in depth during the Bridge Field Trial.
Although Individual Snack Buddy was significantly different in its approach, it retained many of the core features, which we improved based on the findings from this field trial. These improvements included new features (a daily snacking highlight that showcased positive behaviors for the week), usability enhancements to make it easier to input snacks, and the ability to favorite or delete items from the snack suggestion list so that the feature was more useful.
One specific example, pictured above, was adding a passive reminder system. We found that the most common reason participants did not enter their snacks was that they forgot. A number of participants suggested a reminder system, but emphasized that it should not be invasive (no buzzing, beeping, or flashing) and that they be able to set the time when the reminder would go off. We implemented this feature in the Individual Snack Buddy application based on this feedback. In the subsequent field trial of that application, the feature received overwhelmingly positive feedback.