AI-Powered Career Platform for Interview Preparation
Responsibility:
Time & Status:
Identifying Friction Across the Mock Interview Journey
High Drop-Off During Setup Step

42%
of users dropped off at the onboarding page.
Low Engagement After the Interview
40%
of users never practiced beyond their first mock interview.
27%
decline in retention from Day 7 to Day 30.
After users completed a mock interview, low retention and return rates suggested they didn't receive enough feedback or direction to continue practicing.

How might we design an interview experience that helps users prepare better, improve continuously, and feel more confident?
Analyzing user feedback across the end-to-end user flow surfaced four recurring pain points:

Overwhelming Setup
No Post-Interview Insight
Limited Practice Options
Low Confidence in Their Performance
Motivated Beginners Seeking Practical Growth

Focus 1: Setup
A Guided and Customizable Setup Experience
Many users struggled with an overwhelming setup process: unclear structure, no guidance, and no sense of progress. To solve this, I introduced a guided, customizable setup flow that breaks onboarding into focused steps and captures the context needed for personalized interview practice.
BEFORE
Overwhelming Single-Page Setup

All onboarding tasks (job info, resume, setup instructions) were packed into a single long page.

High cognitive load — users didn’t know where to start or what was required.

No guiding structure, no sense of progress.
Before: Annotated Interface

I reorganized scattered information into a 3-step flow and added key inputs to enable more customized interview practice with less cognitive load.
During the setup flow design, I explored two structural approaches.
Multi-page stepper
Option 1
Reduced cognitive load
Clear sense of progress and momentum
Stronger perceived guidance
Single-page accordion
Option 2
Fast access for experienced users
High cognitive load for first-time users
Unclear progress and completion state
I ultimately chose the multi-page stepper to reduce complexity, guide users through critical decisions sequentially, and improve the quality of the inputs that power personalized interview feedback.
AFTER
I broke the long form into a 3-step guided flow

Introduced a 3-step stepper that breaks information into focused chunks.

Added a Background Questionnaire to capture deeper context.

Introduced customizable interview settings for a tailored experience.



Focus 2: Get Feedback
A Centralized Space to Review, Reflect, and Improve
BEFORE
There was no place to review results or track improvement.

No interview record to review after the interview.

The system offered no insight into performance strengths and weaknesses.

No learning feedback or recommendations to help users improve their interviews.
Before: Annotated Interface

Design Exploration: Question Navigation in Interview Feedback
Sidebar to Select Questions
Option 1
High visibility of all questions
Fast to skim and compare feedback across multiple questions
Competes with feedback content
Less scalable on smaller screens
Top Dropdown Question Selector
Option 2
Keeps focus on the feedback itself
Reduced cognitive load
Scales better
Less immediate visibility of all questions
I explored both a sidebar and a dropdown approach for navigating interview questions. While the sidebar offered faster access, it competed with the feedback content and encouraged shallow scanning. I ultimately chose a top dropdown to keep users focused on one question at a time, supporting deeper reflection and a calmer review experience.
AFTER
Comprehensive Interview Record

The Interview History Dashboard allows users to revisit all past interviews.

The AI-Powered Q&A Analysis provides detailed insights for each question

While the interview record allowed users to revisit individual mock interviews, users struggled to understand what patterns existed across sessions and what to focus on improving next.
Interview preparation is not about fixing a single answer; it's about recognizing recurring strengths and weaknesses over time. Users needed a way to step back and see the bigger picture. To help them track their progress, I designed a Performance Feedback Dashboard that synthesizes interview data across sessions to help users understand strengths, identify recurring issues, and take targeted actions to improve future interviews.
Breaking Down Product Goals into User-Centered Questions
Design Exploration: Skill Breakdown Visualization
To present interview skill performance in the dashboard, I explored two visualization approaches:
Option 1 - Bar Chart

High clarity and readability
Easy comparison across skills
Scales well with data changes
Less visually expressive
Option 2 - Radar Chart
Holistic visual snapshot
Harder to compare precise values
Increased cognitive load
Poor scalability
I ultimately chose a bar chart because it offers clearer comparisons, scales better, and more directly supports actionable decision-making.
AFTER
Actionable Performance Feedback Dashboard

Clear Skill Insights — See strengths and gaps at a glance.

Pattern-Based Feedback — Identify recurring strengths and issues.

Actionable Next Steps — Get focused recommendations to improve.

Focus 3: Practice & Improve
A Better Way to Practice and Get Feedback
BEFORE
Unstructured Question Library

Users could not practice questions, only read them passively.

Users had no answer analysis, so they couldn't gauge performance.

Question cards were hard to skim, with unclear labels and structure.
Before: Annotated Interface
Designing the Question Card for Fast Problem Identification
When browsing interview practice questions, users struggled to quickly determine whether a question matched their current skill level.
I designed the question card using layered labels to help users quickly identify which problems a question solves, reducing decision friction and supporting targeted practice.
AFTER
A Better Question Bank for Practice and Feedback

Question cards are clearly labeled and filterable, making browsing easier.

Practice mode supports audio and text, enabling flexible, anytime practice.

AI provides answer analysis with strengths, weaknesses, and tips, giving actionable feedback.
Defining the Guiding Principles
Icons

Typography

Colors

Components
