While reviewing data for issues in the experience, we noticed students were taking a long time to start their coursework. We had just discovered that students who take more than 30 days to start are significantly less likely to complete any courses, so we knew we needed to get students engaged much sooner in their journey.
Working with a cross-functional team, I gathered research, insights, and data to identify gaps in the onboarding experience and design a guide to welcome students onto the platform, deliver important program context, and help them start their first course.
With this solution, we saw a 50% decrease in time to start and early indicators of long-term success metrics.
I was not able to do user research in this phase; however, we already had some research on this population to pull from, as well as a whole team who spoke with students regularly. I gathered research and data insights, conducted internal interviews, and reviewed the findings to better understand user needs and identify any gaps in the user journey.
After gathering the research, I pulled together the notes to identify patterns and relationships.
One of the themes I noticed is that students really wanted more support and direction. The platform was originally set up to give students full access to all courses in their degree plan. They could complete the courses in whatever order they wanted and even take multiple at a time. However, this seemed to be something students either did not want to do or didn't know how to approach.
I also discovered that there were information gaps in the user journey causing students to miss important details around expectations and policies.
The navigation issue was something we were already aware of, but we weren't able to tackle it at this time. So, I needed to help students understand the current navigation better.
For a couple of the issues, we already had resources to support the solution. We just needed to call attention to them so students didn't miss them.
Platform tour: It was already created, but it popped up immediately after registration. If a student skipped it, they couldn't get back to it.
Orientation: This explains all the policies and context students need to know, but many students skip it or click through it quickly. I needed to communicate its value and importance better.
Based on the insights from the discovery, I mapped out an updated user flow for the onboarding phase. I focused on building up information from a foundational level—each step equipping the student to access, understand, or retain information shared in the next step.
With an understanding of what needed to be accomplished during onboarding, I started pulling the ideas together so I could get feedback from stakeholders and begin technical feasibility discussions. I pulled together a quick wireframe to share the idea of an onboarding guide, the steps that should be included, and how we could help students choose their first course.
I shared this with stakeholders, then met to go over the feedback.
The idea was very well received and got many of our stakeholders excited.
However, there was a big issue we had to deal with. Giving students the three course choices would be a huge development undertaking, requiring a logic engine and additional data points for every course we offer.
A two-phase approach
Phase 1 would be a simpler solution that did not require any new data architecture or definitions. This would give engineering time to work on the core structure and allow us to test providing course recommendations manually while enrollments were low.
While that was being developed, our academics team built a tool to rank all of our courses on a common scale based on the course data we had: effort, time to complete, and course NPS rating. This allowed me to design a solution that recommends an easy first course so students can get familiar with the learning experience, develop good habits, and grow in confidence.
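To illustrate the ranking idea, here is a minimal sketch of scoring courses on a common scale from effort, time to complete, and NPS. The field names and equal weighting are my assumptions; the academics team's actual tool and weights are not described here.

```python
def normalize(values):
    """Scale a list of numbers to the 0-1 range (min-max normalization)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.5] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

def rank_courses(courses):
    """Return course names sorted easiest-first.

    Each course is a dict with 'name', 'effort', 'weeks', and 'nps'.
    Lower effort and fewer weeks score as easier; higher NPS scores as easier.
    """
    effort = normalize([c["effort"] for c in courses])
    weeks = normalize([c["weeks"] for c in courses])
    nps = normalize([c["nps"] for c in courses])
    scored = []
    for i, c in enumerate(courses):
        # Equal weights are an assumption, not the real tool's formula.
        ease = (1 - effort[i]) + (1 - weeks[i]) + nps[i]
        scored.append((ease, c["name"]))
    return [name for ease, name in sorted(scored, reverse=True)]

# Hypothetical course data for illustration only.
courses = [
    {"name": "Intro to Communication", "effort": 2, "weeks": 3, "nps": 60},
    {"name": "Calculus I", "effort": 8, "weeks": 10, "nps": 20},
    {"name": "Study Skills", "effort": 1, "weeks": 2, "nps": 70},
]
print(rank_courses(courses)[0])  # → Study Skills
```

Min-max normalization keeps each metric comparable regardless of its original units, which is what "a common scale" requires before the scores can be combined.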
Now that I had a more solid understanding of the technical constraints and possibilities, I could design something more realistic. I spent time sketching to try out different ideas for reducing the size, organizing the content, and identifying where it could live.
I had a difficult time finding a place where the guide felt like it fit. I wanted a sidebar, but that wouldn't work with our current layout. I considered making it a modal, but that came with extra accessibility and discoverability issues. I landed on the space between the profile details and the course area. Those sections were already clearly defined by their white backgrounds, so I could make the guide feel like another section on the page and use their proportions to define its size.
The progress bar served as more than just a list of steps; it was also an opportunity to engage students early by recognizing the completion of these smaller steps and helping them see that they were making progress.
For the phase 1 design, we didn't have a decision engine built yet, but I wanted to make sure users took the time to make a course choice.
For this step, I wanted to encourage the user to make that choice now, give some context, and make it super easy for them to ask for help if they didn't want to make the decision themselves.
For phase 2, we now had a way to recommend a "first best course". Rather than making them choose, we decided we would assign each student a first course. This accomplished a few things:
I also tweaked the copy to accommodate the data model we had to work with. In the current system, a "Course Start" was defined as "the completion of the course's first required assignment". This definition controlled the completion status of this step. While I didn't like jumping straight to first-assignment completion (there are important steps that happen in between), it helped focus the student on a very tangible milestone in their course journey.
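The completion logic that fell out of this definition can be sketched in a few lines. The data structures here are hypothetical; only the rule itself comes from the system's "Course Start" definition: the step completes once the course's first required assignment is done.

```python
def first_required_assignment(course):
    """Return the first assignment flagged as required, in course order."""
    return next(a for a in course["assignments"] if a["required"])

def course_start_step_complete(course):
    """The onboarding step is complete once 'Course Start' has occurred,
    i.e. the first required assignment is completed."""
    return first_required_assignment(course)["completed"]

# Hypothetical course record for illustration.
course = {
    "assignments": [
        {"name": "Syllabus quiz", "required": False, "completed": True},
        {"name": "Essay 1", "required": True, "completed": False},
    ],
}
print(course_start_step_complete(course))  # → False
```

Note that finishing optional items (like the syllabus quiz above) doesn't complete the step; only the first required assignment counts, which is exactly why the copy needed to point students at that specific milestone.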
Annotating interaction, navigation, and completion definitions.
The final prototype served several purposes for my design process.
Measuring the success of this solution was essential to proving that we understood our users' needs. I worked closely with our analyst to define how we would measure and calculate results. All data was compiled in a dashboard, and our analyst reported back every two weeks until we hit our minimum threshold.
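The headline metric can be sketched as a cohort comparison: days from registration to "Course Start", summarized per cohort and compared against the pre-launch baseline. The sample values below are hypothetical, not our actual data.

```python
from statistics import median

def pct_change(baseline, current):
    """Percent change in median days-to-start from baseline to current cohort."""
    return (median(current) - median(baseline)) / median(baseline) * 100

# Hypothetical days-to-start samples for illustration only.
before = [34, 40, 28, 45, 31]   # pre-launch cohort
after = [16, 20, 14, 22, 18]    # post-launch cohort
print(round(pct_change(before, after), 1))  # → -47.1
```

A median (rather than a mean) keeps the summary robust to the long tail of students who take months to start, which is exactly the tail this project was trying to shrink. Confirming statistical significance would require a separate test on the full cohort data.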
Since I did not get to test this solution with students, I wanted to make sure the guide wasn't introducing new issues or roadblocks. I worked with our student-facing teams to develop an incident reporting plan, ensuring that issues, feedback, and details were compiled and that I was notified. Fortunately, we did not encounter any issues.
The final solution performed better than we anticipated. We were able to launch the phase 2 designs right before a large influx of new students, which gave us plenty of data to evaluate their effectiveness.
Reduced time to start by nearly 50% (statistically significant)
Early data showed students were completing their course faster