Paul Kim • Product Designer

Personalized Onboarding

Personalized Onboarding Hero.png

OVERVIEW

Problem

How might RetailMeNot provide a personalized experience to generate the most relevant shopping deals based on explicit and implicit user feedback?

Solution

The experience leverages machine learning algorithms to highlight content that is contextual to shoppers' habits and inputs into the system. The application encourages users to evaluate their preferences and curates the experience around those preferences on every visit.

See it live today (iOS | Android)

My Role

UX Designer, UX Researcher

Team Members

John Cathey, Anna Hofmanova

Tools

Optimal Workshop, Usertesting.com, Sketch, Principle

 

Approach

Method.png
 

Research

When we set out to create a personalized experience, we had a plethora of data points we could pivot on, but no real data on what would provide the most value to shoppers. We evaluated the existing personas and found that personalization would help across all of them. We then focused our efforts through generative research and an unconventional use of card sorting to build a framework for prioritization.

 

Competitive analysis

The team deliberately explored personalization among competitors in the digital e-commerce space. We didn't want to assume we could simply copy and paste every personalized experience and expect it to perform for our users. We looked at the various methods, contexts, and tactics being used, and even expanded our scope to general personalization efforts in other domains.

Products we analyzed in competitive research

Top level findings included:

  • Personalization interactions fall into two main categories:

Explicit and Implicit.png
  • Direct competitors did not incorporate personalization experiences into their products; the only notable source of personalization was using the shopper's geo-location to surface deals nearby

  • Other domains provided more case studies of successful personalization efforts, such as Netflix's "Because you watched" rows

This inspired our strategy into two approaches:

  • The first was implicit feedback, which we internally referred to as a passive personalization experience

  • The second was explicit feedback, a proactive and interactive experience from shoppers on our platform

 

Qualitative Interviews

We knew we needed to talk to shoppers who use deals online to better understand their context and align our solutions. So we conducted generative research to unpack their current behaviors and coded the data to identify opportunities to personalize the experience. We were able to synthesize their narratives into user stories that helped define our journey:

User stories.png

In addition, we used the data to create an affinity map that identified 14 prospective variables we could potentially influence to craft the personalized experience.

Word cloud.png

Card Sorting

Pref Matrix.jpg

We had great data from shoppers, but we found ourselves unable to prioritize where to go with it. So we turned to card sorting in an unconventional way, letting our customers tell us what was important. We gave them each variable on a card and asked them to sort the cards based on two parameters: 1) How important is this to having a personalized shopping experience? 2) How willing are you to share this information?

We split the exercise along these two dimensions because we didn't want to build an experience around important information that shoppers were unwilling to share with us. We then mapped the aggregated results onto a matrix and set out to start with the variables that were high in both importance and willingness to share.
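The aggregation step above can be sketched in a few lines of code. The variables, participant ratings, and 5-point scales below are illustrative assumptions, not the study's actual data; the point is only to show how per-participant ratings roll up into the four quadrants of the prioritization matrix.

```python
# Hypothetical sketch of aggregating card-sort results into a 2x2 matrix.
# Assumes each participant rated every variable on two 1-5 scales:
# importance to personalization, and willingness to share.
from statistics import mean

# variable -> list of (importance, willingness) pairs, one per participant
ratings = {
    "favorite stores": [(5, 5), (4, 5), (5, 4)],
    "household income": [(3, 1), (4, 2), (2, 1)],
    "shopping categories": [(5, 4), (4, 4), (5, 5)],
}

def quadrant(importance, willingness, midpoint=3.0):
    """Place a variable into one of the four matrix quadrants."""
    imp = "high importance" if importance >= midpoint else "low importance"
    will = "high willingness" if willingness >= midpoint else "low willingness"
    return f"{imp} / {will}"

for variable, pairs in ratings.items():
    avg_imp = mean(p[0] for p in pairs)
    avg_will = mean(p[1] for p in pairs)
    print(f"{variable}: {quadrant(avg_imp, avg_will)}")
```

Variables landing in the "high importance / high willingness" quadrant are the ones the team would tackle first.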

 

Design

Experience Strategy

Aggregating all this information, the team decided on two contexts to focus on for personalizing the experience. One was the onboarding experience, where the explicit inputs would reside; the other was the home page, where the passive or implicit personalization would occur. Stitching the two experiences together reflected the logic that personalization has to be woven throughout the entire experience, and that once users went through onboarding, they could validate their responses against the content the algorithm generated for them on the home page.
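A toy scoring function illustrates how explicit onboarding selections and implicit browsing signals might combine to rank home-page content. All names, weights, and structure here are assumptions for illustration; this is not RetailMeNot's actual algorithm.

```python
# Illustrative sketch only: combine explicit onboarding selections with
# implicit browsing signals to rank deal categories for the home page.
from dataclasses import dataclass, field

@dataclass
class ShopperProfile:
    selected_categories: set        # explicit: chosen during onboarding
    viewed_categories: dict = field(default_factory=dict)  # implicit: view counts

def score_deal(profile, deal_category,
               explicit_weight=2.0, implicit_weight=0.5):
    """Higher score means more prominent placement on the home page."""
    score = 0.0
    if deal_category in profile.selected_categories:
        score += explicit_weight          # reward explicit preference
    score += implicit_weight * profile.viewed_categories.get(deal_category, 0)
    return score

profile = ShopperProfile(selected_categories={"shoes", "electronics"},
                         viewed_categories={"shoes": 3, "travel": 1})
deals = ["shoes", "travel", "groceries"]
ranked = sorted(deals, key=lambda c: score_deal(profile, c), reverse=True)
print(ranked)  # "shoes" ranks first: explicit selection plus repeated views
```

The design implication is the same as in the prose: explicit inputs from onboarding set the initial ranking, and implicit signals keep refining it as the shopper browses.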

We first set out to unpack the onboarding flow while ensuring that the existing business requirements around the flow were retained, including keeping the notification prompts for geo-location and push in the experience. We then used the matrix from the research card sort to craft a few wireframes based on the user flow we determined by brainstorming several ideas together.

Initial Flow and Frames of MVP

Final Experience Flow

We then developed iconography and final UI for the concepts using the brand's existing design guidelines, and stitched everything into an interactive prototype to test with users in the Evaluation stage of the process.

 

Evaluation

The first iteration of the designs incorporated selection of categories and stores while maintaining the notification enablement prompts. Engineering built out the data sets to ensure that signals from the onboarding experience would be reflected when shoppers landed on the homepage. We worked closely with the data science and engineering teams to craft a real set of data and content for the study; using relevant content was crucial to validating the experience, so there were no shortcuts on this end.

We built these two experiences into a prototype and tested with users for interaction quality, desirability, and alignment with the user stories we generated in research. We were keen to understand the successes and pain points of the experience, and even more keen to learn whether the selections users explicitly input were generating the content they wanted. This qualitative information not only helped the front end but heavily influenced the back-end algorithms that our data team iterated on both during and after implementation.

Key Results

Some key results we saw upon implementation included:

  • 5% lift in successful sessions (sessions in which a shopper used an offer within the app)

  • Retention increased by 5%

  • Reduced bounce rate on the mobile app homepage by 5% when personalized content is present

 

Future considerations

In order to communicate to the user that the selection of categories has implications for the stores that are presented, future designs animate the category selections onto the subsequent screen.

In this experience, selections from previous screens persist on following screens through micro-interactions and content strategy, reinforcing that every selection matters, so that we can deliver the deals that matter.