Context
Chegg pivoted from a traditional Q&A website to an AI-powered conversational experience.
As a popular homework-help service with 7.7 million subscribers, Chegg transformed into a conversational AI learning platform in 2023.

Challenge
Over 70% of students were not staying engaged after receiving an answer, limiting their learning.
Despite the chat-based learning experience, most students still used Chegg for homework help rather than concept mastery, leaving as soon as they got an answer.
Problem statement
How might we deepen student engagement beyond just viewing solutions?
Final Solution
Introduced a structured step-by-step guided practice feature that allows students to reinforce their learning progressively.
Impact
Positive Feedback, Awaiting Launch
Although not yet launched due to shifting priorities, early UXR and marketing feedback has been highly positive.

Ideate
Identified the Learning Gap
After analyzing students' learner journey, we discovered that getting a quick answer ≠ mastering the knowledge. We saw an opportunity to help students bridge that gap and stay engaged.
I led a cross-functional ideation session with PMs, engineers, and learning experts, generating 14 initial concepts. I then worked with the core working team to refine the concepts based on feasibility, impact, and learning effectiveness, narrowing them down to six.

Working with UXR, we conducted an early round of testing with 30 college students to evaluate interest, usefulness, and motivation, which led us to select guided practice as the primary direction.

Explore
Entry Point - Caught the Right Moment
Working with a learning experience designer, I designed multiple practice entry points to capture students at the right learning moment: immediate practice, practice variations, and topic review practice.
Pattern - Balanced Engagement with Ease of Use
For the interaction part, I explored different modes:
Standalone practice module (structured, clear but rigid)
Conversational practice (interactive but requires high engagement)
Whiteboard-style open practice (flexible but harder to guide)
After evaluating feasibility with engineering and learning experience designers, we aligned on a structured step-by-step approach, balancing engagement with ease of use.
Features - Guided, Interactive
Based on that direction, I developed interactive prototypes.
Instant feedback ⚡
Reinforces learning by providing real-time insights, helping students correct mistakes immediately.

Flexible options 💡
Offers options like hints and revealing the answer so students feel supported and in control while maintaining an appropriate challenge level.

Varied question formats 🧩
Caters to diverse question types like multiple choice, fill-in-the-blank, and drag-and-match to keep the experience engaging and dynamic.

Validate
Ensure We're on the Right Track
I worked with UXR on moderated testing with 20 subscribers to understand their feedback and preferences. Meanwhile, the product marketing team launched an in-product survey to gather more signal from students.

Based on those insights, I made several iterations:
Iteration 1: Improved discoverability while accounting for time-crunched students
More prominent UI - improved discoverability
“Save for later” option + estimated-time tag - addressed the time-crunch situation
Iteration 2: Added more delight to provide better support
Optimized the UI
Provided more emotional support by showing empathy when students get a wrong answer and celebrating their successes
Execute
As the design evolved, I collaborated with Machine Learning Engineers and learning experience designers to explore using AI prompts for generating practice questions.
Technical challenge 1: Chegg’s expanding question pool, diverse query types requiring unique formats, and limited time
To address this, we focused on an MVP, prioritizing procedural questions (the most common) and multiple-choice formats (widely applicable), delivering a functional solution with plans for future expansion.
Technical challenge 2: A 7–10 second wait for content generation, with no streaming

That’s definitely not an ideal experience. Even if AI-generated practice content is incredibly smart, students won’t use it if it takes too long to load.
So, I worked with the engineering team to help them see the user impact of slow loading times, hoping to find feasible technical optimizations.
At the time, engineers proposed a few ideas:
Optimize AI computation efficiency to make it run faster.
Adjust the prompt to generate shorter content, reducing processing time.
Preload certain types of problems to reduce computation demands.
However, these solutions either didn’t fundamentally solve the issue or required high engineering costs.
So, I posed a question:
💡 “If we can’t make AI faster, can we change the way content is presented to make users feel like it’s loading faster?”
I collaborated with engineers to break down the prompt and found that it consisted of three parts:
This led us to explore whether we could generate content parts incrementally instead of waiting for everything to generate together.
To communicate this idea, I created a flow diagram of the logic:
Later, we realized this approach would also save token costs, since we only generate the next step when students take action. A win-win for product and business.
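The incremental-generation idea can be sketched in a few lines. This is a minimal illustration, not Chegg's actual implementation: `call_model` is a hypothetical stand-in for the real LLM call, and the three parts (question, hint, explanation) are assumed for the example. The key point is that each part is generated lazily, only when the student asks for it, so the question appears quickly and unrequested parts cost no tokens.

```python
def call_model(prompt: str) -> str:
    """Placeholder for the real LLM call (hypothetical)."""
    return f"generated<{prompt}>"


class PracticeItem:
    """Generates each part of a practice question on demand,
    instead of waiting for one large prompt to produce everything."""

    def __init__(self, topic: str):
        self.topic = topic
        self._cache = {}       # part name -> generated text
        self.model_calls = 0   # how many generations actually ran

    def _get(self, part: str) -> str:
        # Generate on first request, then serve from cache.
        # A part the student never requests is never generated.
        if part not in self._cache:
            self.model_calls += 1
            self._cache[part] = call_model(f"{part} for {self.topic}")
        return self._cache[part]

    def question(self) -> str:
        return self._get("question")

    def hint(self) -> str:
        return self._get("hint")

    def explanation(self) -> str:
        return self._get("explanation")
```

In this sketch, showing the question triggers one small generation rather than the full batch, which is what makes the experience feel faster even though the model itself is no quicker.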
Future scalability
Meanwhile, our team was re-evaluating the product architecture based on user feedback—whether to go with an all-conversational UI or a hybrid model. To adapt to both possibilities, I created two design versions to fit each approach.
Learnings
Navigating Ambiguity
Continuous validation and iteration are crucial to refining solutions and ensuring they address both user needs and business goals effectively.
Proactive Collaboration with Engineering
I realized the value of staying proactive by working closely with engineers to explore feasible alternatives together. It helped ensure the design aligned with technical capabilities.