AI-Powered Career Platform for Interview Preparation

Time: 2024

Duration: 6 months

My role:

I worked as a Product Designer on Datalynn’s first AI-powered career platform, leading a key design iteration focused on improving user retention and engagement across the interview experience.

Responsibility:

Led the redesign of the interview setup flow, interview report dashboard, and guided practice questions to improve clarity, personalization, and repeat engagement. Collaborated closely with PMs and engineers to iterate based on user feedback and usage insights.

Time & Status:

May 2024 - December 2024






CONTEXT

Identifying Friction Across the Mock Interview Journey

High Drop-Off During Setup Step

42%

of users dropped off at the onboarding page.

By tracking how users navigated the product, we found major friction in onboarding and early setup, reducing activation, confidence, and long-term engagement.


Low Engagement After the Interview

40%

of users never practiced beyond their first mock interview.

27%

retention declined from Day 7 to Day 30.

After completing a mock interview, low retention and return rates suggested that users didn’t receive enough feedback or direction to continue practicing.

Business Goal

Increase retention after the first mock interview by encouraging continued practice and sustained engagement.

How might we design an interview experience that helps users prepare better, improve continuously, and feel more confident?

RESEARCH

Analyze user feedback across the end-to-end user flow
Methodology

We conducted moderated usability testing sessions with 12 participants to understand how users set up, completed, and reviewed a mock interview. Sessions were conducted remotely via Zoom, where participants completed interview tasks and provided qualitative feedback afterward.

Task List & Completion Rates

We measured task completion time and success rates across the end-to-end mock interview flow to identify where users experienced the most friction, and found that interview setup took significantly longer than other steps.

Insights from user feedback

Through moderated user interviews, we gathered qualitative feedback and synthesized recurring patterns into key pain points and user-driven needs.

Identify the problems

Overwhelming Setup

Users felt confused because all onboarding fields were packed into a single, cluttered page.

No Post-Interview Insight

Users had no way to review answers or understand how they performed.

Limited Practice Options

There was no structured question bank for users to practice or reinforce skills.

Low Confidence in Their Performance

Without feedback or guidance, users were unclear about their performance.

Rebuilding the User Flow to Find Opportunities
During research, I learned that users didn’t just need a mock interview—they needed a clear path to improvement. The original product operated as a one-off interaction, offering no structured setup, no performance insights, and no way to continue practicing.
I reframed the experience as a positive reinforcement loop where clearer feedback builds confidence, encouraging more practice and continuous improvement, and focused the redesign on three key stages of the flow: setup, feedback, and practice & improvement.

TARGET USERS

Motivated Beginners Seeking Practical Growth

To ensure our solution effectively meets user needs, we conducted extensive research to better understand our target audience.

Focus 1: Setup

A Guided and Customizable Setup Experience

Many users struggled with a cluttered single-page setup, unclear requirements, and no sense of progress. To solve this, I redesigned setup as a guided, customizable flow that provides structure at each step and collects the context needed to personalize every interview.

BEFORE

Overwhelming Single-Page Setup

All onboarding tasks (job info, resume, setup instructions) were packed into a single long page.

High cognitive load — users didn’t know where to start or what was required.

No guiding structure, no sense of progress.


Reorganize information

I reorganized scattered information into a 3-step flow and added key inputs to enable more customized interview practice with less cognitive load.

Design Exploration

During the setup flow design, I explored two structural approaches.

Multi-page stepper

Option 1

Reduced cognitive load

Clear sense of progress and momentum

Stronger perceived guidance

Single-page accordion

Option 2

Fast access for experienced users

High cognitive load for first-time users

Unclear progress and completion state

I ultimately chose the multi-page stepper to reduce complexity, guide users through critical decisions sequentially, and improve the quality of the inputs that power personalized interview feedback.

AFTER

We broke the long form into a 3-step guided flow

Introduced a 3-step stepper that breaks information into focused chunks.

Added a Background Questionnaire to capture deeper context.

Introduced customizable interview settings for a tailored experience.

We added customizable questions to collect detailed background


Focus 2: Get Feedback

A Centralized Space to Review, Reflect, and Improve

BEFORE

There was no place to review results or track improvement.

No interview record to review after the interview

The system offered no insight into performance strengths and weaknesses

No learning feedback or recommendations to help users improve their interviews


Design Exploration: Question Navigation in Interview Feedback

Sidebar to Select Questions

Option 1

High visibility of all questions

Fast skim or compare feedback across multiple questions.

Competes with feedback content

Less scalable on smaller screens

Top Dropdown Question Selector

Option 2

Keeps focus on the feedback itself

Reduced cognitive load

Scales better

Less immediate visibility of all questions

I explored both a sidebar and a dropdown approach for navigating interview questions. While the sidebar offered faster access, it competed with the feedback content and encouraged shallow scanning. I ultimately chose a top dropdown to keep users focused on one question at a time, supporting deeper reflection and a calmer review experience.

AFTER

Comprehensive Interview Record

The Interview History Dashboard allows users to revisit all past interviews.

The AI-Powered Q&A Analysis provides detailed insights for each question

While the interview record allowed users to revisit individual mock interviews, users struggled to understand what patterns existed across sessions and what to focus on improving next.

Interview preparation is not about fixing a single answer; it is about recognizing recurring strengths and weaknesses over time. Users needed a way to step back and see the bigger picture. To support this, I designed a Performance Feedback Dashboard that synthesizes interview data across sessions, helping users understand their strengths, identify recurring issues, and take targeted actions to improve future interviews.

Breaking Down Product Goals into User-Centered Questions

Design Exploration: Skill Breakdown Visualization

To present interview skill performance in the dashboard, I explored two visualization approaches:

Option 1 - Bar Chart 

High clarity and readability

Easy comparison across skills

Scales well with data changes

Less visually expressive

Option 2 - Radar Chart

Holistic visual snapshot

Harder to compare precise values

Increased cognitive load

Poor scalability

I ultimately chose a bar chart because it offers clearer comparisons, scales better, and more directly supports actionable decision-making.

AFTER

Actionable Performance Feedback Dashboard

Clear Skill Insights — See strengths and gaps at a glance.

Pattern-Based Feedback — Identify recurring strengths and issues.

Actionable Next Steps — Get focused recommendations to improve.

Focus 3: Practice & Improve

A Better Way to Practice and Get Feedback

BEFORE

Unstructured Question Library

Users could not practice questions, only read them passively.

Users had no answer analysis, so they couldn't gauge performance.

Question cards were hard to skim, with unclear labels and structure.


Designing the Question Card for Fast Problem Identification

When browsing interview practice questions, users struggled to quickly determine whether a question matched their current skill level.

I designed the question card with layered labels so users can quickly see what each question covers and whether it matches their skill level, reducing decision friction and supporting targeted practice.

AFTER

A Better Question Bank for Practice and Feedback

Question cards are clearly labeled and filterable, making browsing easier.

Practice mode supports audio and text, enabling flexible, anytime practice.

AI provides answer analysis with strengths, weaknesses, and tips, giving actionable feedback.

Impacts and Results

Successful MVP Launch

Contributed to the design of DataLynn's MVP, resulting in a successful product launch.

Accelerated Design Efficiency

Contributed to developing a scalable design system that reduced design-to-development handoff times by 40%, significantly speeding up feature releases and product iterations.

DESIGN SYSTEM

Define the guiding principle

During the MVP launch, we developed a design system to establish a cohesive and consistent design language for Datalynn.

Icons

Typography

Colors

Components

© Jinlan Huang 2024
