OpenSRS is a leading domain reseller that offers its customers a platform for managing and reselling domain names and custom email names worldwide.
Despite the company's success, the OpenSRS website was outdated and needed a redesign to speak to a more current and larger audience.
Among the redesigned sections of the website, the pricing section was a big unknown: it had never been tested before, and the product's pricing plans differ from the rest of the market. Additionally, since high churn rates had been identified during the sign-up flow, the redesigned sign-up experience also needed to be evaluated.
As a UX Researcher, my role was to test the redesigned pricing section and answer a few questions the Design and Marketing teams raised.
UX Outcomes: Defining success
Validate the marketing and design solutions for the pricing page and the sign-up flow.
Highlight important insights, issues, or miscommunication points that need to be addressed by the team.
Target users and recruitment
The target audience was the segment identified by the Marketing team as priority potential new clients.
The platform UserTesting.com was used for user recruitment. The screener questions were tailored to recruit users who fit the demographics below:
Due to the nature of the questions we had to answer, a mix of user interviews and usability testing was chosen as the research method.
The objective of the interviews was to understand how users come to trust companies and how they choose to partner with them, without biasing them by showing our prototypes.
The usability testing allowed us to validate marketing and design ideas early in the process.
The interview guide covered four distinct sections of the redesigned website: Main page, Why OpenSRS, Pricing section, and Sign-up flow.
Our usability testing was conducted with five users. We gave users tasks to complete on our prototype, and after each task we asked targeted questions to gather as much information as possible about their needs, opinions, and impressions of the website.
Notes from the sessions, taken by me and designated notetakers, were compiled and grouped in FigJam through affinity mapping.
The findings were divided into four groups: Confidence and Trustworthiness, Pricing section, Sign-up flow, and User understanding of the Brand.
We successfully identified:
Presenting the findings
The findings were presented to all stakeholders, and it was decided that the first priority was reworking the pricing section.
We offered to host an ideation session to assist the design team with potential solutions.
The ideation sessions were conducted with designers from this project and others. Two concepts stood out and were selected for testing.
We decided that preference testing with the two selected concepts would be the best way to choose which path to move forward with.
The platform UserTesting.com was chosen for recruiting participants and conducting the preference test. We used the same screener questions as before, and a total of 20 users participated in the test.
Tasks and questions were tailored to understand which of the concepts better informed our users and which specific elements helped them comprehend the pricing plans.
Over 80% of the users preferred concept A over concept B. However, many users mentioned certain elements of concept B as helpful.
When presenting the results, we recommended to the design team that a mixed concept combining the best-performing elements from each concept would be the best solution to our problem.
The design team agreed with our recommendation and moved forward with the mixed concept, which was approved by all stakeholders and published with the new website launch.