Move It!

Exploring engagement in a mobile app that helps you move more
at work by connecting you with your colleagues.


Prolonged sitting is bad for your health.

But adults spend ~66% of their workday sitting, which can increase the risk of chronic disease, including diabetes, cardiovascular disease, cancer, and musculoskeletal disorders. It’s clearly a significant workplace health issue.

In my work, I’ve used digital health applications to provide employee health education. But there’s a problem: attrition rates for digital health promotion initiatives vary wildly.

If we can’t keep people engaged, it will be harder to change their health behaviors. And it’s worth trying to tackle sitting behavior because health markers like blood pressure and body weight can improve when sedentary behavior is interrupted by short bursts of activity.

How can we keep people engaged?

Interestingly, employees are more likely to display healthy behaviors if they have positive relationships with peers who display healthy behaviors.

So, emulating a social network digitally may contribute to greater engagement in these applications. 

What’s the best way to leverage social support in a digital health app to change workplace sedentary behavior?

Implementing persuasive design frameworks, like the software design patterns for social influence, can help foster the social presence needed to support social connections online.

I designed a mobile app to reduce sitting behavior at work using a persuasive design framework focused on social influence. I then explored engagement using quantitative and qualitative measures to discover how to best apply this framework.


Introducing Move It!

Move It! helps people become more aware of their sedentary behavior and use their social networks to move more at work. Users set goals to stretch, stand or stroll for up to 5 minutes every 30 or 60 minutes, and the app prompts them to take those breaks.

People track their activity and can view daily, weekly or monthly historical data. Finally, they can participate in challenges, join communities and learn more about why they should sit less.

Check out a walkthrough of the mobile app below.
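To make the goal mechanics concrete, here is a minimal sketch of the break-prompt logic described above. The function name and structure are my own assumptions for illustration, not the app's actual implementation; only the 30/60-minute intervals and up-to-5-minute breaks come from the design.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of Move It!'s prompt schedule: names and structure
# are assumptions, not the actual implementation.
def next_prompts(start: datetime, interval_min: int, workday_hours: int = 8):
    """Yield the times a user would be prompted to stretch, stand or
    stroll, given a 30- or 60-minute prompt interval."""
    assert interval_min in (30, 60)  # the two intervals Move It! offers
    end = start + timedelta(hours=workday_hours)
    t = start + timedelta(minutes=interval_min)
    while t <= end:
        yield t
        t += timedelta(minutes=interval_min)

# A 9:00 start with 30-minute prompts yields 16 break prompts in an 8-hour day.
prompts = list(next_prompts(datetime(2020, 3, 2, 9, 0), 30))
```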



How can the persuasive software design patterns for social influence be best leveraged to promote engagement in a digital health promotion application to reduce workplace sedentary behavior?


This solo project was completed as my master's thesis.

I gathered user research, designed low and high-fidelity prototypes, conducted usability tests, surveyed participants to explore measures of engagement, and determined recommendations for designing a digital health mobile application while leveraging social influence.


Interviewing, qualitative thematic coding (using MaxQDA software), affinity mapping, low-fidelity paper prototyping, high-fidelity prototyping (using software), quantitative surveying, mixed-methods analysis


February - May 2020

Data collection


I went directly to prospective users to determine the user requirements.

Using voluntary response sampling, I recruited 15 participants (12 women, 3 men, median age 44.6 years) to help with the 3 phases of the study. The initial screening criteria included being at least 18 years old and self-reporting sitting at least 5 out of 8 hours.

I quantified their sitting time using the Occupational Sitting and Physical Activity Questionnaire. Overall, they sat a median of 6.75 out of 8 hours, or 84.4% of the working day.

Study participants

Semi-structured interviews for user requirements

To learn what strategies would resonate with users to foster behavior change, I interviewed 8 employees to identify their pain points and needs related to their own sedentary behavior at work.

They spoke of their aches and pains and frustrations around how to manage them. They all recognized how their work environment influenced their discomfort. I also explored if and how they used technology to manage their behavior. The personas below were derived from the interviews.

Personas derived from semi-structured user interviews

A large number of studies and existing apps incorporate behavior change techniques from the Behavior Change Techniques Taxonomy. These 93 behavior change techniques provide “a method for specifying, interpreting and implementing the active ingredients of interventions to change behaviours” and include recommendations for how to implement them in practice.

I transcribed and coded each interview using the taxonomy for my predetermined themes. Because multi-component behavior change interventions tend to be more successful, I incorporated the top 5 techniques into the prototype. The table below describes each technique and their corresponding tasks and features.

Behavior change techniques and corresponding tasks and features
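The selection step above — ranking coded interview segments by technique and keeping the five most frequent — can be sketched as follows. The technique labels are drawn from the Behavior Change Techniques Taxonomy, but the segment counts are invented for illustration; the actual coding was done in MaxQDA.

```python
from collections import Counter

# Illustrative sketch of ranking coded interview segments by behavior
# change technique (BCT). The counts below are invented, not study data.
coded_segments = (
    ["goal setting (behavior)"] * 14 +
    ["self-monitoring of behavior"] * 12 +
    ["prompts/cues"] * 11 +
    ["social support (unspecified)"] * 9 +
    ["feedback on behavior"] * 8 +
    ["information about health consequences"] * 5
)

# The five most frequently coded techniques drive the feature set, since
# multi-component interventions tend to be more successful.
top_five = [bct for bct, _ in Counter(coded_segments).most_common(5)]
```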

Functional requirements

The beauty of the persuasive software design patterns for social influence is that they provide a framework of features that leverage social influence to motivate users. Some of these patterns have been incorporated into popular apps, including Fitocracy and Endomondo.

In Move It!, the design patterns were represented by social affordances, artifacts that frame a sense of community and enable social interaction. These include images of human faces, discussion boards, activity streams, groups, leaderboards, and badges. The affordances chosen for inclusion in the app are shown in the table below. 

Low-fidelity prototype

Paper prototype

The prototype was mapped out using the architecture below, highlighting the social affordances and behavior change support functions and features of the app.

Paper prototype information architecture

A paper prototype was created on card stock to represent the major flows of the app. A few of the screens are shown below.

Paper prototype screens

Usability testing

Four participants completed a moderated usability test with a talk-aloud protocol. The goal was to look for blocks or stumbles in the flows, capture their opinions, impressions and frustrations, and gather feedback on any potentially missing features.

Three themes stood out:

  1. Unclear how to send a message to other users
  2. Data on the profile dashboard were confusing; participants wanted to see historical data
  3. Unclear how to modify goals when needed; participants wanted to learn more about the recommended activities
Low-fidelity paper prototype usability testing

Low- to high-fidelity prototype

The revised information architecture included launch screens and connected Settings and Goals to show users where to modify goals.

High-fidelity prototype information architecture

Other notable iterations from low- to high-fidelity prototype included:

  • adding launch screens to provide more information around the app’s purpose
Launch screens added to high-fidelity prototype
  • amending the visuals associated with sending a message
Send message screen (before and after)
  • expanding resources to include articles about the types of activities
Resources screen (before and after)
  • expanding the profile dashboard across several tabs
Profile screen (before and after)

High-fidelity prototype

Because of the focus on social influence, it was important for users to “see” themselves in the app. I created variables, such as the participant’s name and their selected goals, that passed through to multiple screens.

Screens displaying stored user-defined variables

Remote usability testing

Everything was humming along beautifully up until this point; I had set up all my usability testing sessions for the high-fidelity prototype and was right on schedule.

Then the COVID-19 pandemic hit. 

And I had to pivot – FAST.

I had to find a way to replicate my testing sessions remotely. I settled on a remote testing platform because:

  • participants could use their own phones (for safety);
  • there were minimal installation needs on their end; and
  • I could record their screen, voice and face (with permission). 

And best of all, my schedule was only pushed back by a week!

This time, 8 participants went through the moderated usability test. Because the testing was remote, I was able to expand my reach to participants outside my local area.

Additional issues were identified that could be addressed in a future iteration (e.g., enhancing the intra-device communication flow, highlighting newly tracked activity, and giving users control over snoozing prompts).

Remote usability testing using LiveShare

Measuring engagement

Qualitative: Social influence questions

After the usability testing sessions, participants were asked four questions addressing the social influence design patterns:

  1. If you used an app like this, would you ever join or create a community? (COOP)
  2. What, if anything, would influence you to participate in a challenge? (COMP)
  3. Would you like to be publicly recognized for your efforts? (REC)
  4. Does seeing all the other people inside the app influence your interest in using it yourself? (SFL)

They were prompted to explain their responses while further exploring the screens. I recorded their answers using the categorical scale of yes/maybe/no (Y/M/N) and color-coded them as green for yes, yellow for maybe and red for no.

Color-coded responses to social influence questions.

Quantitative: User Engagement Scale-Short Form (UES-SF)

The UES-SF questionnaire was used to gather quantitative data on the participants’ thoughts and perceptions around the prototype’s appeal, usefulness, and enjoyability. It measures a variety of dimensions related to engagement:

  • Focused Attention – feeling absorbed in the interaction and losing track of time
  • Aesthetic Appeal – the visual attractiveness of the interface
  • Perceived Usability – affect from the interaction and the amount of effort expended
  • Reward – interest in the experience and the sense of involvement and success with the interaction

The 12 questions were answered on a 5-point Likert scale ranging from Strongly Disagree to Strongly Agree. For this study, high engagement was defined as an overall score of 4 or higher.
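A minimal sketch of the scoring, with invented responses: each UES-SF subscale is the mean of its three items, and the overall score is the mean across all 12. In the published scale, the Perceived Usability items are negatively worded and reverse-scored before averaging.

```python
# Sketch of UES-SF scoring: 12 items on a 1-5 Likert scale, three per
# subscale. The responses below are invented for illustration only.
responses = {
    "Focused Attention":   [4, 3, 4],
    "Perceived Usability": [2, 1, 2],   # raw (negatively worded) responses
    "Aesthetic Appeal":    [4, 4, 5],
    "Reward":              [5, 4, 4],
}

def score(subscale, items):
    if subscale == "Perceived Usability":
        items = [6 - i for i in items]  # reverse-score negative items
    return sum(items) / len(items)

subscale_scores = {s: score(s, items) for s, items in responses.items()}
overall = sum(subscale_scores.values()) / len(subscale_scores)
highly_engaged = overall >= 4  # this study's threshold for high engagement
```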

Six of the eight participants (75%) expressed high engagement with the prototype. The mean overall engagement score was 4.13 (median = 4.08), with subscale scores ranging from 3.71 (Focused Attention) to 4.54 (Perceived Usability).

Results of the UES-SF questionnaire

Mixed methods analysis

I completed a cross-tabulation to explore a possible relationship between engagement levels and participants’ responses to the social influence questions, focusing on the highly engaged participants.

Cross-tabulation results between UES-SF scores and social influence responses

Results showed that:

  • highly engaged participants were evenly split about cooperating with others in this context
  • half of them favored a competitive environment
  • the allure of public recognition was mixed, with an equal number either seeking it or wanting to avoid it, and
  • the majority were motivated to use the app because “others” were engaged within it.
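The cross-tabulation itself is a simple count of yes/maybe/no responses split by engagement level. The sketch below shows the shape of that analysis; the response data are invented and do not reflect the study’s actual results.

```python
# Sketch of cross-tabulating engagement level against Y/M/N responses to
# the social influence questions. All response data here are invented.
participants = [
    {"engaged": "high", "COOP": "Y", "COMP": "Y", "REC": "Y", "SFL": "Y"},
    {"engaged": "high", "COOP": "N", "COMP": "M", "REC": "N", "SFL": "Y"},
    {"engaged": "high", "COOP": "Y", "COMP": "N", "REC": "Y", "SFL": "M"},
    {"engaged": "low",  "COOP": "N", "COMP": "Y", "REC": "M", "SFL": "N"},
]

def crosstab(rows, question):
    """Count Y/M/N responses to one question, split by engagement level."""
    table = {}
    for row in rows:
        cell = (row["engaged"], row[question])
        table[cell] = table.get(cell, 0) + 1
    return table

coop = crosstab(participants, "COOP")
# e.g. coop[("high", "Y")] == 2: two highly engaged participants said yes
```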

So how could these results be used to inform the design of this type of app to enhance engagement?


1. Harness the influence of social learning and facilitation.

Participants paid attention to what was “said” in the discussion streams and “done” in the activity streams. 

"I've gone from activity stream, where I'm looking at what everybody else can do, then messages, where people are posting about what they did, and that's interesting."

"I also like the camaraderie that goes along with it...encouraging people to be active."

2. Include competitions or challenges, but make participation optional.

Though some participants were unaffected by how other people “performed” in the app, others found the intra-challenge messaging motivating. Allow users to create their own challenges and incorporate collaboration by encouraging users to sign up in teams.

"112 flights...good on ya. But my 40’s just fine if last week was 35 and this is something I’m working on."

"The messages are pretty encouraging; motivating, too...saying "I started using the bathroom two floors up" just to get moving."

3. Publicly recognize achievements while giving users control over what details are shared.

Highly engaged participants were either totally interested or completely uninterested when it came to being publicly recognized. They liked the external motivation of tokens, but made it very clear that privacy should be respected. 

"Heck, yeah! If I do something good, why not? I mean, it's not public shaming; it's public praise."

"People who are more sensitive to sharing their information should have that ability to have private accounts, private groups..."

4. Include online communities, but be cognizant of the social norms and attitudes embedded within the physical workplace.

Interestingly, more than half of the participants said they would be more likely to use the communities to engage with people outside of work.  

Workplace culture may affect the types of relationships developed among colleagues; if the culture is one where taking breaks goes against the social norm, employees may not want to draw attention to their activities by stating their intentions on a public forum.

"Is this a way to meet up with co-workers or just anybody in the group? This could be a new way to meet people in the community or around your neighborhood."

"It probably wouldn’t be with colleagues at work. It would be with people who aren’t in the same building, or on different floors."


  • Move It! could leverage existing social networking platforms for future iterations; for example, as a Facebook app. Online social networking sites typically have higher engagement levels and lower attrition rates than digital health interventions.
  • The power of workplace culture can’t be underestimated. One participant even stated that the appeal of participating in a challenge is that it would provide “an opportunity to stand, to move around” in a culture where they felt that behavior was not the norm. 
  • It would have been great to use the same participants throughout all phases of the study, but that wasn’t possible due to scheduling constraints. Two participants did complete all 3 phases; in the initial interviews, both noted they weren’t affected by their colleagues’ behavior, yet said they would move more if others around them did.
  • Finally, the high engagement scores measured in this study may not necessarily predict long-term adherence or engagement, so a longer-term study would be necessary to truly determine if they would continue to engage with the app for any meaningful length of time.