
CASE STUDY

Stringent usability testing ensures new mobile app mirrors existing PC test-taking experience for at-home English proficiency test.

|Anonymous | Mobile app | Ed-tech | New Product | UX Research |

Executive summary

  • Company: A well-known provider of learning materials and qualifications.

  • Product: A mobile app for mid-stakes English proficiency testing, expanding upon their existing desktop application.

  • Challenge: Adapting the test for mobile devices posed usability challenges due to the smaller screen size, risking inconsistencies in user experience and potentially affecting test scores.

  • Goal: To launch a mobile testing app that ensured a secure, consistent testing experience with comparable scores between desktop and mobile users.

  • Approach: A three-phase research plan focusing on resolving usability issues before conducting score comparability testing. This included in-person and unmoderated testing with global participants to gather comprehensive feedback.

  • Impact: The research identified critical usability issues early on, allowing for design improvements before score comparability testing. This ensured that any differences in test outcomes were due to user performance rather than app design flaws, supporting a successful mobile app launch.


Background
A well-known learning materials and qualifications provider (Company X) set out on a mission to maintain their leading status through innovative technology. Part of this mission was to make mid-stakes English proficiency test-taking more accessible for those without access to PCs.

 

Goals
Company X already offered a desktop app that enabled users to take an English proficiency test from home, featuring innovative anti-cheating measures. To expand their reach, their goal was to launch a mobile app version with an equally secure and accessible testing environment. This included ensuring consistency in how information and questions were presented, as well as maintaining robust security to prevent cheating. The aim was to ensure that the test scores from the mobile app were equivalent to those from the desktop version (i.e. neither PC nor mobile experience should provide an easier way for test-takers to obtain a higher score).

Challenges
The smaller screen size of mobile devices meant that the test design needed a different structure and layout compared to the desktop app, raising concerns about potential usability issues that could impact test scores.

 

Approach
Before I joined the project, the research plan combined usability testing and score comparability into a single round of research. However, an initial assessment showed that this approach posed a major risk: if significant usability issues were present, they would negatively impact test scores, ultimately invalidating any comparison of mobile and desktop results.

 

I redesigned the research into three phases to optimise outcomes and minimise risks:

  • Phase 1: Usability Testing - Focused on identifying and resolving key usability issues that could impact users’ test scores. 

  • Phase 2: Redesign - Addressing the major usability issues identified in Phase 1 through design changes.

  • Phase 3: Score Comparability Testing - A complex process of comparing desktop and mobile test scores, involving different participant groups and varying the order of test-taking to avoid conditioning or familiarity affecting the results (the Test Development team led this phase).

I led Phase 1 of the research, which involved two parts:

  •  In-person testing using equipment to remotely observe participants’ behaviours and reactions, minimising interruptions and distractions. Follow-up interviews provided deeper insights into their experiences and helped to explore specific behaviours observed during testing.  

  • Unmoderated testing with a follow-up survey (translated into 5 different languages) with participants in 5 different regions of the world. This ensured we adequately covered our target market and gained reliable data by working in participants' local languages.

 


Impact
Phase 1 uncovered critical usability issues. The most significant related to a tab feature on an information-based question, where users read a paragraph and then answered multiple-choice questions. Surprisingly, 40% of participants misinterpreted the tab feature, believing they couldn’t navigate back to the related text. A deeper analysis revealed two key reasons for this confusion:

  • Colour Perception: The brand’s light blue colour used for the tab led participants to interpret it as inactive.

  • Inconsistent Expectations: In previous sections of the test, access to supporting information had been restricted. This caused participants to assume that the same restriction applied to this particular question, even though it did not.

As a result, participants couldn’t engage with the question as effectively as they could have if they had understood the tab functionality, ultimately affecting their test scores.

 

Conclusion
This case study highlights that, despite the app’s expert design and multiple rounds of internal testing, critical usability issues still emerged during user research. The findings from Phase 1 allowed the digital team to address these issues before moving on to Phase 3, ensuring that usability problems wouldn’t distort score comparability.
By uncovering these hidden challenges, the team could proceed with confidence, knowing that any differences in test scores would be due to genuine performance rather than usability barriers in the app's design and functionality.


More past projects


Developing a mobile app for an English proficiency test that mirrors the existing PC experience.

|Anonymous | Mobile app | Ed-tech | New Product | UX Research |

Reducing major user frustration with Innovate UK’s portal through a minor design change.

|Innovate UK | SaaS | Optimise UX | UX research |


Embedding a system for collecting continuous UX insights to create a clear strategy for product improvements for a subsea tech scale-up.

|Vaarst | Startup | Physical & digital product | UX research |

Validating new My Dyson app features to enhance adoption and engagement.

|Dyson | Mobile app | Optimise engagement | UX research |


Enhancing the UX to increase organic growth of an award-winning children’s diary app.

|DiaryZapp | Mobile, tablet & PC app | Optimise engagement | UX research |

Identifying and addressing the cause of low engagement with a fitness app.

|SoSweat | Mobile app | New feature development | UX research |


Multi-stage UX research to generate insights for a CRM to unite 9 different research councils.

|UKRI | SaaS | CRM | New product development | UX research |

Creating a knowledge sharing platform to accelerate efficacy of local councils' response to Covid-19.

|NHS T&T | SaaS | New product development | UX research |


Using customer insights to drive website redesign to optimise target audience engagement.

|Clarion Insight | Agency | Website | UX research |

Creating a new website to market i3Works’ new digital services.

|i3Works | Agency | Website | UX research |


Boosting VCA’s digital team’s user-centricity to create better user experiences.

|VCA | UX knowledge share |

Optimising the UX to encourage more sign-ups for data collection.

|HCI Connect | Mobile app | Health-tech | Existing Product | UX Review |
