Move It!

Exploring engagement in a mobile app that helps you move more at work by connecting you with your colleagues.

Image for Move It mobile app


Prolonged sitting is detrimental to your health, yet adults spend roughly 66% of their workday sitting. The health problems that result make sedentary behavior a significant workplace health issue.

Using digital health apps to educate employees can be helpful. But it’s challenging to keep people engaged, making it harder to influence their health behaviors. 

Interestingly, employees are more likely to display healthy behaviors if they have positive relationships with peers who display healthy behaviors. So, supporting social online connections may contribute to greater engagement in these apps. 

I designed a mobile app to reduce sitting behavior at work using a persuasive design framework focused on social influence. I then explored engagement to discover how to best apply this framework.


But first…introducing Move It! Move It! helps people become more aware of their sedentary behavior and use their social networks to move more at work. Employees set goals to stretch, stand or stroll and the app prompts them to take those breaks every 30 or 60 minutes.
People track their activity and can view daily, weekly or monthly historical data. Finally, they can participate in challenges, join communities and learn more about why they should sit less. Watch a video walkthrough of the app below.


Research Questions

Which behavior change techniques are best included in an app to foster positive health in the workplace?
How can social influence be best leveraged to promote engagement in a digital health promotion app?

My Role

This solo project was completed as my master's thesis.

I gathered user research, designed low and high-fidelity prototypes, conducted usability tests, surveyed participants to explore measures of engagement, and determined recommendations for designing a digital health mobile application while leveraging social influence.

Methods & Tools

Interviewing, qualitative thematic coding (with MaxQDA), affinity mapping, low-fidelity paper prototyping, high-fidelity prototyping, usability testing, quantitative surveying, mixed-methods analysis


Timeline

February - May 2020

Data Collection

Personas derived from semi-structured user interviews
I recruited 15 participants to help with the 3 phases of the study. The initial screening criteria included self-reporting sitting for at least 5 of 8 working hours. Their sitting time was quantified using the Occupational Sitting and Physical Activity Questionnaire. Overall, they sat a median of 6.75 out of 8 hours, or 84.4% of the working day.
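The sitting-time figure above is a simple proportion; as a quick sketch (using the study's reported median):

```python
# Sketch of the sitting-time calculation using the study's reported values.
median_sitting_hours = 6.75
workday_hours = 8

share_of_day = median_sitting_hours / workday_hours
print(f"{share_of_day:.1%}")  # → 84.4%
```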
Semi-structured interviews
To learn what strategies would resonate with users to foster behavior change, I interviewed 8 participants to identify their pain points and needs related to their own sedentary behavior at work.

They spoke of their aches and pains and frustrations around how to manage them. They all recognized how their work environment influenced their discomfort. I also explored if and how they used technology to manage their behavior. 


I transcribed and coded each interview using the Behaviour Change Techniques Taxonomy as themes. The taxonomy provides “a method for specifying, interpreting and implementing the active ingredients of interventions to change behaviours”.

Because multi-component behavior change interventions tend to be more successful, I incorporated the top 5 techniques into the prototype.

Behavior change techniques and corresponding tasks and features
Functional requirements

The persuasive software design patterns for social influence provide a framework of features that leverage social influence to motivate users. These patterns include Social Learning and Facilitation (SLF), Competition (COMP), Cooperation (COOP), and Recognition (REC). 

In Move It!, the design patterns were represented by social affordances, artifacts that frame a sense of community and enable social interaction. These include images of human faces, discussion boards, activity streams, groups, leaderboards, and badges. 

Low-fidelity prototype

The prototype was mapped out using the architecture below, highlighting the social affordances and behavior change support functions and features of the app.

Paper prototype information architecture

I used card stock to create a paper prototype representing the major user flows of the app. 

Paper prototype screens
Usability testing

The participants completed a moderated usability test with a talk-aloud protocol. The goal was to look for blocks or stumbles in the flow, to capture their opinions, impressions and frustrations and to gain feedback on any potential missing features. 

Three themes stood out:

  1. Unclear how to send a message to other users 
  2. Data on the profile dashboard were confusing; participants wanted to see historical data
  3. Unclear how to modify goals when needed; participants wanted to learn more about the recommended activities.
Low-fidelity paper prototype usability testing
Low to high-fidelity prototype

The results of the usability testing informed changes to the high-fidelity prototype, which included:

  • adding launch screens to provide more information around the app’s purpose
Launch screens added to high-fidelity prototype
  • amending the visuals associated with sending a message
Send message screen (before and after)
  • expanding resources to include articles about the types of activities
Resources screen (before and after)
  • expanding the profile dashboard across several tabs
Profile screen (before and after)

High-fidelity prototype

Because of the focus on social influence, it was important for users to “see” themselves in the app. I created variables, such as the participant’s name and their selected goals, that passed through to multiple screens.

Screens displaying stored user-defined variables
Remote usability testing

Everything was humming along beautifully up until this point; I had set up all my testing sessions and was right on schedule.

Then the COVID-19 pandemic hit. 

And I had to pivot — FAST.

After exploring a number of options, I settled on a platform that allowed me to conduct my testing sessions remotely.

Additional issues were identified which could be addressed in a future iteration (e.g. enhancing the intra-device communication flow, highlighting newly tracked activity and giving control over snoozing prompts). 

Usability testing using LiveShare

Measuring engagement

Qualitative: Social influence questions

After the usability testing sessions, participants answered four questions addressing the social influence design patterns:

  1. If you used an app like this, would you ever join or create a community? (COOP)
  2. What, if anything, would influence you to participate in a challenge? (COMP)
  3. Would you like to be publicly recognized for your efforts? (REC)
  4. Does seeing all the other people inside the app influence your interest in using it yourself? (SLF)

They were prompted to explain their responses while further exploring the screens. I recorded their answers using the categorical scale of yes/maybe/no (Y/M/N) and color-coded them as green for yes, yellow for maybe and red for no.
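The Y/M/N coding scheme above maps directly to a lookup; a minimal sketch, using hypothetical responses rather than the study's actual data:

```python
# Sketch of the yes/maybe/no color coding described above.
# The responses below are hypothetical example data, not study results.
COLOR = {"yes": "green", "maybe": "yellow", "no": "red"}

responses = {
    "COOP": "maybe",  # join or create a community
    "COMP": "yes",    # participate in a challenge
    "REC": "no",      # public recognition
    "SLF": "yes",     # influenced by seeing others in the app
}

coded = {pattern: (answer, COLOR[answer]) for pattern, answer in responses.items()}
print(coded["SLF"])  # → ('yes', 'green')
```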

Color-coded responses to social influence questions
Color-coded responses to social influence questions
Quantitative: User Engagement Scale, Short Form (UES-SF)

The UES-SF questionnaire explored participants’ thoughts and perceptions around the prototype’s appeal, usefulness, and enjoyability. The questions were answered on a 5-point Likert scale; high engagement was defined as an overall score of 4 or higher.

Six of the 8 participants (75%) expressed high engagement with the prototype, with an overall engagement score of 4.13.
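The scoring rule above (mean of 5-point Likert items, with a threshold of 4 for "high engagement") can be sketched as follows; the ratings are hypothetical example data, not the study's responses:

```python
# Sketch of the UES-SF scoring rule: average a participant's 5-point
# Likert ratings; "high engagement" means a mean score of 4 or higher.
# The ratings below are hypothetical example data.
def ues_sf_score(ratings):
    return sum(ratings) / len(ratings)

participant_ratings = [5, 4, 4, 3, 5, 4, 4, 5]
score = ues_sf_score(participant_ratings)
print(score, score >= 4)  # → 4.25 True
```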

Results of the UES-SF questionnaire
Results of the UES-SF questionnaire
Mixed methods analysis

Cross-tabulation results helped me explore any possible relationships between engagement and the participants’ responses to the social influence questions.
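A cross-tabulation of this kind is just a count over paired categories; a minimal sketch with hypothetical (engagement, response) pairs, not the study's data:

```python
# Sketch of cross-tabulating engagement level against a social influence
# response. The pairs below are hypothetical example data.
from collections import Counter

# (engagement level, response to the SLF question) per participant
pairs = [
    ("high", "yes"), ("high", "yes"), ("high", "maybe"),
    ("high", "yes"), ("low", "no"), ("high", "yes"),
    ("low", "maybe"), ("high", "yes"),
]

crosstab = Counter(pairs)
print(crosstab[("high", "yes")])  # → 5
```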

Cross-tabulation results between UES-SF scores and social influence responses
Cross-tabulation results between UES-SF scores and social influence responses
  • They were evenly split about cooperating with others in this context
  • Half of them favored a competitive environment
  • The allure of public recognition was mixed, with an equal number either seeking it or wanting to avoid it, and
  • The majority were motivated to use the app because “others” were engaged within it.
So how could these results inform the app’s design to enhance engagement?


1. Harness the influence of social learning and facilitation.

Participants paid attention to what was “said” in the discussion streams and “done” in the activity streams. 

Social learning and facilitation

I've gone from activity stream, where I'm looking at what everybody else can do, then messages, where people are posting about what they did, and that's interesting.

I also like the camaraderie that goes along with it...encouraging people to be active.

2. Include competitions or challenges, but make participation optional.

Though some participants were unaffected by how other people “performed” in the app, others found the intra-challenge messaging motivating. Allow users to create their own challenges and incorporate collaboration by encouraging users to sign up in teams.


112 flights...good on ya. But my 40’s just fine if last week was 35 and this is something I’m working on.

The messages are pretty encouraging; motivating, too...saying "I started using the bathroom two floors up" just to get moving.

3. Publicly recognize achievements while giving users control over what details are shared.

Highly engaged participants were either totally interested or completely uninterested when it came to being publicly recognized. They liked the external motivation of tokens, but made it very clear that privacy should be respected. 


Heck, yeah! If I do something good, why not? I mean, it's not public shaming; it's public praise.

People who are more sensitive to sharing their information should have that ability to have private accounts, private groups...

4. Include online communities, but be cognizant of the social norms and attitudes embedded within the physical workplace.

Interestingly, more than half of the participants said they would be more likely to use the communities to engage with people outside of work.  

Workplace culture may affect the types of relationships developed among colleagues; if the culture is one where taking breaks goes against the social norm, employees may not want to draw attention to their activities by stating their intentions publicly.

Is this a way to meet up with co-workers or just anybody in the group? This could be a new way to meet people in the community or around your neighborhood.

It probably wouldn’t be with colleagues at work. It would be with people who aren’t in the same building, or on different floors.


  • It would have been ideal to conduct the study in context because the power of workplace culture shouldn’t be underestimated. One participant was interested in the challenges because they would provide “an opportunity to stand, to move around” in a culture where they felt that behavior wasn’t seen favorably. If I could go back in time, I would have added diary studies to capture those moments.
  • Move It! could leverage popular existing social networking platforms for future iterations; for example, as a Facebook app.
  • It would have been great to use the same participants throughout all phases of the study, which wasn’t possible due to scheduling constraints (and the pandemic). Those who participated in all 3 phases noted they weren’t affected by their colleagues’ behavior, yet later revealed they would move more if others around them did.
  • Finally, a longer-term study would be necessary to truly determine if they would continue to engage with the app for any meaningful length of time.

Next project

Using a design sprint to create a mobile app that uses AR technology to
foster citizen participation in urban planning

Previous project


Re-imagining the car buying experience at the dawn of COVID-19