AI-Powered Guided Practice

Web Design

Education

Gen-AI

Company

Chegg

2024.05-2024.06

Team

1 Product designer (me)

1 Product manager

1 UX researcher

1 Content designer

2 Software engineers

1 Learning experience designer

1 Machine learning engineer

My Role

Product Strategy

Product Design

What's the background?

Chegg is a popular homework-help service with 7.7 million subscribers.

In 2023, Chegg began offering AI-powered instant solutions and restructured its product around a chat UI.

However, the reality is not ideal…

More than 70% of students leave right after getting an answer in the chat with Chegg.

Our goal was straightforward yet challenging —

How might we deepen engagement beyond just providing instant answers?

Design solution

Built a more interactive and guided practice experience on Chegg to address learner needs and deepen engagement.

Due to a focus shift within the company, this project will launch later this year. But we have already received very positive feedback from earlier UXR and marketing sessions!

Scroll down to see how I made it happen

Overall process

Gained conviction with a funnel process.

It was challenging to narrow down the scope from a broad starting point. Therefore, we decided to implement a funnel process, conducting a series of tests and learning from each iteration. This approach ensured we gathered sufficient signals and built confidence as we progressed.

Approaching problems

Identified unmet needs beyond answer-seeking.

After brainstorming…

I narrowed down the scope with cross-functional (XFN) partners.

After aligning on the target users and their high-level needs, I kicked off the process of narrowing the scope. Starting from 14 initial ideas drawn from previous discovery projects and the brainstorming session, I worked closely with XFN to choose the top 6 ideas based on impact, capabilities, and learning outcomes.

Early validation

The [Guided practice] concept emerged as the winner of the early UXR.

I then collaborated with a UX researcher and a content designer to run the test with 30 college students, focusing on feature attractiveness rather than the UI.

The 6 concepts we tested

The decision making process

This process helped me arrive at a clearer, more user-centered problem:

How might we provide a practice experience to address learner needs and deepen engagement?

Identifying the learning experience

Embedded learning principles into feature ideation.

To better guide the ideation, I worked with the learning experience designer to define these principles:

📚 ✨
Relevant & Contextual

Deliver learning materials and information that are contextually aligned with students' current needs and goals, enhancing their engagement and motivation to learn.

🤝 💬
Collaborative & Interactive

Foster opportunities for students to interact with peers or tools, promoting active collaboration that strengthens understanding and engagement.

📊🏆
Evaluative

Provide mechanisms for students to assess their skills and track progress, creating a sense of achievement and reinforcing positive learning behaviors.

Exploring solutions

To ensure our approach was both comprehensive and impactful, I first aligned with our PM on the two main questions we needed to explore about guided practice.

#1 HMW motivate students to engage with the practice experience?

Relevant

Contextual

I generated 3 different concepts that aligned with students' learning needs.

#2 HMW provide an engaging practice experience?

Collaborative

Interactive

Evaluative

Next, I created an interactive practice prototype with guidance and support:

Instant feedback ⚡️

Reinforces learning by providing real-time insights, helping students correct mistakes immediately.

Flexible Options 💡

Offers adaptability to different learning paces and styles, ensuring students feel supported while maintaining challenge levels.

Varied Question Formats 🧩

Keeps the experience engaging and dynamic, catering to diverse learning preferences and preventing monotony.


Validate and iterate!

We ran a second round of UXR with 20 participants using concept wireframes and prototypes to understand students' preferences.

Based on those insights, I made several iterations:

Iteration 1: Improve discoverability while accounting for students' time crunch

  • Students preferred practicing a similar question right after the original one (Concept 2), but its discoverability could be improved.

  • Students sometimes felt reluctant to practice because of the perceived time cost.

Iteration 2: Add more delight to provide more support

Students liked the step-by-step structure, and they wanted clearer, more personalized feedback while working on the practice question.

Aligned on the MVP scope

Chegg has a vast and constantly expanding database of questions, making it challenging to generate practice questions that cover all question types.

As the design evolved, I collaborated with machine learning engineers and learning experience designers to explore using AI prompts to generate practice questions. We faced challenges such as Chegg’s ever-expanding question pool, diverse query types requiring unique formats, and limited time.

To address this, we focused on an MVP, prioritizing procedural questions (the most common) and multiple-choice formats (widely applicable), delivering a functional solution with plans for future expansion.
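
To make this concrete, here is a minimal sketch of how a generation prompt for the MVP scope (procedural questions, multiple-choice format) could be assembled. Every name and the template wording are my illustration, not Chegg's actual prompts or types.

```typescript
// Hypothetical sketch of a practice-question generation prompt.
// Types, names, and wording are illustrative, not Chegg's system.

interface PracticeRequest {
  originalQuestion: string; // the question the student just solved
  subject: string;          // e.g. "Calculus"
}

function buildPracticePrompt(req: PracticeRequest): string {
  return [
    `You are a tutor creating practice material in ${req.subject}.`,
    `Write ONE new procedural question similar to, but distinct from:`,
    `"""${req.originalQuestion}"""`,
    `Format it as multiple choice with 4 options and step-by-step hints.`,
    `Respond as JSON: {"stem": string, "choices": string[],`,
    ` "answerIndex": number, "stepHints": string[]}`,
  ].join("\n");
}

// Usage:
console.log(
  buildPracticePrompt({
    originalQuestion: "Find the derivative of f(x) = 3x^2 + 2x.",
    subject: "Calculus",
  })
);
```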

Dealt with technical constraints

A 7-10s loading time for content generation, without streaming

This meant users would have to wait up to 10 seconds to see any content—clearly not an ideal user experience.

Recognizing this constraint, I aimed to improve the experience before streaming was enabled. I proposed adding a loading UI to help set user expectations upfront. Additionally, I suggested distributing the content load, where step-by-step content would be loaded incrementally. This approach not only shortened perceived wait times but also saved tokens. The solution was eventually adopted, optimizing the experience despite technical limitations.
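
A rough sketch of that distributed-load idea, under assumed endpoints and response shapes (none of this is Chegg's real API): render a loading state immediately, fetch the question stem first, then generate each step only when the student reaches it.

```typescript
// Illustrative sketch of incremental ("distributed") content loading.
// Endpoint paths and response shapes are hypothetical.

// UI hooks, stubbed so the sketch is self-contained.
const showSkeletonLoader = () => console.log("loading…");
const renderStem = (stem: string) => console.log("stem:", stem);
const renderStep = (step: string) => console.log("step:", step);

async function loadPracticeIncrementally(questionId: string): Promise<void> {
  // 1. Show a loading UI right away to set expectations upfront.
  showSkeletonLoader();

  // 2. Fetch only the stem first, so students see content well before
  //    a full 7-10s generation would have finished.
  const { stem, stepCount } = (await (
    await fetch(`/api/practice/${questionId}/stem`)
  ).json()) as { stem: string; stepCount: number };
  renderStem(stem);

  // 3. Generate each step lazily as the student advances; steps that
  //    are never reached are never generated, which also saves tokens.
  for (let i = 0; i < stepCount; i++) {
    const { content } = (await (
      await fetch(`/api/practice/${questionId}/steps/${i}`)
    ).json()) as { content: string };
    renderStep(content);
  }
}
```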

Priority shift

New hypothesis: Chat UI is not the best paradigm for Q&A search

While developing the practice experience, the team analyzed student behavior and feedback. Only 5% of follow-ups were genuine; many students simply repeated their questions.

Additionally, UXR revealed confusion with the chat UI, which hindered students' primary task of searching for answers.

This prompted us to reassess the UX paradigm—search, chat, or hybrid—to better meet student needs. Traditional search suited initial queries, while chat excelled for follow-ups.

Given this insight, the team paused our initial plan of integrating the practice experience into a single chat thread and chose to validate a hybrid mode instead. We are now testing the hybrid mode with users, and it has already shown positive signals in cancel rate and follow-up percentage. The latest plan is to ship the practice experience later this year as follow-up content appearing in the right panel.
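
For illustration only, the hybrid split could be expressed as a simple router: fresh queries go to search, anything inside an open answer thread goes to chat. The heuristic below is my own sketch, not the logic the team is testing.

```typescript
// Hypothetical sketch of hybrid search/chat routing.

type Mode = "search" | "chat";

interface Query {
  text: string;
  activeThreadId?: string; // set when the student has an answer open
}

function routeQuery(q: Query): Mode {
  // Initial queries behave like traditional search, which suits the
  // primary task of finding answers.
  if (!q.activeThreadId) return "search";
  // Follow-ups stay in chat, where conversational context excels;
  // practice would surface here as follow-up content in the right panel.
  return "chat";
}

// Usage:
console.log(routeQuery({ text: "Solve 2x + 3 = 11" }));                          // "search"
console.log(routeQuery({ text: "Why is step 2 valid?", activeThreadId: "t1" })); // "chat"
```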

Although the project didn’t launch as planned, pausing allowed us to validate a better-suited approach for the learning journey. I learned the value of flexibility, continuous feedback, and reassessing assumptions. It reinforced balancing ambition with practicality, highlighting how setbacks can lead to a more user-centered, refined solution.


Vera Li. © 2024

Currently designing the AI-powered learning experience @ Chegg.

Previously @ Alipay, Phillips, & TikTok.

Thx for visiting ☺︎
