
Usability comparison study


the challenge

Clemson Online provides Clemson University faculty and students with online learning technology that integrates into the university's learning platform, Canvas. They proposed a comparative study to assess the user experience of students and faculty with two video discussion tools, Flipgrid and VoiceThread. The university had purchased VoiceThread on a limited site license, while Flipgrid was free through the existing Microsoft license.

 

Because there was little usability research on either tool, and no user research on how Clemson Online lecturers and students use video tools, the objectives of this study were to build an inclusive understanding of users' needs, assess the usability of both products, and recommend which software the university should officially integrate.

at a glance

Client:         Clemson Online

My role:       Usability Researcher

Duration:      5 months

Tools:


Camtasia, Adobe InDesign, Microsoft PowerPoint, Microsoft Excel, Microsoft Word

Research questions

  • How do faculty respond to the software?

  • Which platform do lecturers find easiest to use?

  • Which platform do lecturers prefer to use?

  • What are the needs/interests/concerns of using this type of platform?

  • How do students behave while using the software to complete real tasks?

  • Which platform do students find easiest to use?

  • How integrated are the platforms within Canvas?

  • Which platform do students prefer to use?

  • Overall, which platform is the most user-friendly and best meets both users' sets of needs?

the approach

I collected four types of data:​

  • Focus Groups with faculty, both novice and experienced, who were shown the software and discussed its functionality alongside their own preferences, needs, and concerns

  • Think-Aloud Protocol with students to articulate their actions, thoughts, and observations as they completed real tasks using Flipgrid and VoiceThread

  • SUS Analysis in which students rated each software on the System Usability Scale, responding to each item on a 1-5 scale

  • User Interviews in which both the focus group and think-aloud protocol participants answered questions about, and elaborated further on, their observations and opinions


The data analysis portion of the project involved summarizing participants' comments and observations, alongside my own, to prepare a PowerPoint recommendation report. To see the calculations of the SUS scores from student participants, click View SUS Analysis to the right. Due to IRB confidentiality, participant recordings were not used; instead, screenshots of the software were taken after the tests, comments were transcribed, and participants were referred to as Users 1-12 to ensure anonymity.
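For readers unfamiliar with how SUS scores are computed, the standard System Usability Scale scoring procedure can be sketched as below. This is an illustrative sketch only, not the actual analysis spreadsheet; the function name and the example ratings are hypothetical.

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 item ratings.

    Odd-numbered items (1, 3, 5, 7, 9) are positively worded: contribution = rating - 1.
    Even-numbered items (2, 4, 6, 8, 10) are negatively worded: contribution = 5 - rating.
    The summed contributions (0-40) are multiplied by 2.5 to scale to 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, rating in enumerate(responses):
        if i % 2 == 0:          # items 1, 3, 5, ... (0-based even index)
            total += rating - 1
        else:                   # items 2, 4, 6, ...
            total += 5 - rating
    return total * 2.5

# Hypothetical example: one participant's ratings for one tool
print(sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2]))  # -> 85.0
```

Per-participant scores like this are typically averaged per product to compare the two tools.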

Note: The Recommendation Report combines the focus group findings and the think-aloud protocol findings into one document, as requested by Clemson Online. The focus group findings had already been discussed with the main stakeholders upon completion of those sessions early in the project; the conference presentation served to bring additional staff up to speed and focused mainly on the student think-aloud protocol tests.


tl;dr: I would never, under any other circumstance, give a client a 40+ page report.

 

Carry on!

constraints & solutions

The major challenge I encountered was recruitment for the faculty focus groups. While recruitment was open to any faculty member, regardless of experience with either software, I believe the lack of incentives for participation was a major factor in the low turnout (the goal was 8-12 participants per session). Even with personal outreach from Clemson Online staff and a full schedule of participants, turnout was inconsistent, with no-shows and less-than-ideal group sizes across the three sessions (six participants in the first, two in the second, and one in the third).

 

As a result, I decided to move forward with the focus group sessions, valuing the hour faculty had set aside to participate and gathering as much data as I could. While this skewed the focus group methodology and left the sessions with less group dynamic (conversation, thought exchange, etc.), it allowed me to probe individual participants more deeply on their thoughts and opinions of the software. I do recognize, however, that the two sessions with only one or two participants are not a representative sample of the population in a methodologically sound qualitative study.


Serving as the user advocate for both faculty and student users was also a challenging aspect of this project; in my previous experiences, I typically had only one end-user group. Having both faculty and students, and data from focus groups (where faculty were only shown the products) versus think-aloud tests (where students performed tasks with the products), made for a more complex and enriching data analysis procedure and, ultimately, a more well-rounded recommendation to the client.

the outcome

In the report, I framed the delivery around an actionable recommendation to choose one software, rather than recommendations to improve certain usability issues. Staying focused on determining which video software would best meet the needs of Clemson Online's lecturers and students enabled me to provide a recommendation report that met the exigency of the project.


To summarize, the usability tests and user research showed that:

  • Faculty need more instruction on how and why to use these tools in their courses,

  • Because faculty don't see the need for video discussions in their classes, much of the functionality that VoiceThread provides isn't something faculty are looking for,

  • Flipgrid's cleaner, faster, and simpler experience makes it the better option, particularly since students find it easier to use.


This project taught me a great deal about my persona as an empathetic usability researcher and strengthened my assertive communication skills, as I had to communicate confidently with participants of varying ages, professions, and skill levels. While the recommendation was important for officially integrating one product and potentially saving thousands of dollars, any usability issues noted in the report would never be seen or considered by the companies for further product refinement; this left me more eager to test in design-sprint setups rather than testing completed products.

