PRICING RECOMMENDATION TOOL
Discovery research, User flow/decision tree, wireframing, prototyping, user testing, final UI design, A/B testing
I was tasked with designing a pricing recommendation tool to help users choose the right plan for their needs.
Empathise and define: Because this project was initiated by business stakeholders, I approached it with a critical mindset and carried out research to validate whether it addressed a real user need. I leveraged existing research from the product team, whose reports yielded useful insights, and interviewed the head of direct sales, who had valuable first-hand knowledge of user problems. I also conducted a competitive analysis of other websites.
Ideate: The research gave me evidence of a real user problem and clear desirability for a pricing recommendation tool. I then ran ideation sessions with my teammates, building on How Might We statements to address other problems uncovered in the research.
The next step involved thinking through the business needs, the product ladder, and how the features and plans were structured. The challenge was to balance business requirements while keeping the quiz simple and quick for users to complete, and to weigh generating more revenue for the business against recommending the plan genuinely appropriate for each user.
I created a decision logic tree so that users are only presented with questions relevant to them, keeping the quiz as short and simple as possible.
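The branching behaviour can be sketched in a few lines. The questions, answers, and plan structure below are hypothetical placeholders, not the real product ladder; the point is only that each answer determines the next node, so irrelevant questions are never shown:

```python
# Minimal sketch of the branching quiz logic. Question keys, prompts,
# and answer values are invented for illustration.
QUESTIONS = {
    "team_size": {
        "prompt": "How many people will use the product?",
        # Large teams get asked about API access; small teams skip to budget.
        "next": lambda answer: "needs_api" if answer == "10+" else "budget",
    },
    "needs_api": {
        "prompt": "Do you need API access?",
        "next": lambda answer: None,  # end of quiz
    },
    "budget": {
        "prompt": "What is your monthly budget?",
        "next": lambda answer: None,  # end of quiz
    },
}

def run_quiz(answers):
    """Walk the tree, collecting only the questions relevant to this user."""
    asked = []
    node = "team_size"
    while node is not None:
        asked.append(node)
        node = QUESTIONS[node]["next"](answers[node])
    return asked

# A small team never sees the API question:
small_team_path = run_quiz({"team_size": "1-9", "budget": "low"})
# -> ["team_size", "budget"]
```

In the real tool the leaf of each path mapped to a recommended plan; the structure above shows only the routing.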
Prototype: After getting buy-in from the rest of my team, I created some wireframes which I turned into prototypes.
I created the wireframes in Sketch and prototyped them in InVision, then conducted one-on-one moderated user testing, which produced valuable feedback.
Test and Implement: Following on from the user testing and more feedback from stakeholders, I iterated on the designs, leading to the final UI designs.
I handed the designs over to the front-end developer to implement for A/B testing. On the last page we included a short survey - a "Was this helpful?" question that, when selected, lets the user give further feedback. We also measured sign-up and drop-off rates.
Results: The new design resulted in a 16% increase in web sign-ups compared to the default, with 99% statistical significance. 90% of users who entered the tool completed it.
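For readers unfamiliar with how such a result is judged significant: a two-proportion z-test is one common way to compare conversion rates between control and variant. The traffic numbers below are hypothetical, chosen only to illustrate a 16% relative lift; they are not the figures from this experiment:

```python
from math import sqrt, erf

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """One-sided two-proportion z-test: is variant B's conversion
    rate higher than control A's?  Returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both rates are equal.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # One-sided p-value from the standard normal CDF (via erf).
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))
    return z, p_value

# Hypothetical: control converts 500/5000 (10%), variant 580/5000
# (11.6%, i.e. a 16% relative lift).
z, p = z_test_two_proportions(500, 5000, 580, 5000)
# 99% significance requires p < 0.01
```

With these illustrative numbers the test clears the 99% threshold; with smaller samples the same relative lift would not, which is why the experiment ran until significance was reached.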