Class Catalyst

In-depth UX research for a K-12 social-emotional learning platform, uncovering insights to support student well-being and engagement.

UX Research · Education · Academic Partner: Class Catalyst

ROLE

UX Researcher

TIMELINE

Jan – Apr 2025

TEAM

4 members (Amulya, Sophia, Vanshika, Zoe)

METHODS

Interviews, Surveys, Heuristic Evaluation


✦ OVERVIEW

Evaluating a K–12 SEL platform used by thousands of teachers

Class Catalyst is a social-emotional learning platform that helps K–12 teachers run emotional check-ins, monitor student well-being, and track engagement over time. Our team was engaged to conduct a full UX research evaluation — identifying where the product was failing its users and delivering prioritized, actionable recommendations directly to the product team.


✦ RESEARCH PROCESS

Five methods, one converging picture

1. Stakeholder Interviews (n=4)

Semi-structured interviews with teachers and administrators surfaced three recurring frustrations: missing real-time alerts for urgent student updates, no ability to customize check-in prompts, and an analytics dashboard that was hard to read under time pressure.

2. Survey Pilot (n=5)

We designed and piloted a survey instrument with five educators, refining questions for neutrality and identifying how the survey could be deployed at scale for broader data collection.

3. Competitive Analysis

Benchmarked against ClassDojo, Otus, and Sown to Grow. Class Catalyst led on emotional check-in depth, but lagged significantly in notification systems and LMS integration.

4. Heuristic Evaluation

Applied Nielsen's 10 heuristics to the live product. Key violations: notifications that didn't reflect actual urgency, unclear navigation across dashboards, and student check-in replies that couldn't be edited after sending.

5. Usability Testing (n=4, via Zoom)

Moderated think-aloud sessions revealed that the analytics dashboard consistently caused confusion — users couldn't determine whether class trends were improving or declining without significant effort. Navigation to individual student data was also a recurring breakdown point.


✦ KEY INSIGHTS

What teachers actually need

Real-time alerts are non-negotiable

Teachers check the platform between classes, not continuously. When a student flags distress, delayed notifications mean missed intervention windows.

Customization builds buy-in

Teachers felt disconnected from generic prompts. Writing their own check-in questions was the most-requested feature — and the highest-impact for adoption.

Analytics need to answer "so what?"

Charts existed but didn't surface meaning. Teachers need to know at a glance whether engagement is trending up or down — not decode raw data.

Navigation doesn't match mental models

Teachers think: Class → Student → This week. The current information architecture (IA) forced multiple detours to reach individual student data during active class periods.


✦ RECOMMENDATIONS

What we delivered to the product team

Redesign analytics with trend-first layout

Surface "improving / declining / stable" signals at a glance, with drill-down detail available on demand rather than as the default view.

Customizable check-in prompts

Allow teachers to create, save, and reuse their own emotional check-in questions alongside platform defaults.

Tiered notification system

Differentiate urgent alerts (student flagged distress) from routine weekly summaries, with push notifications for the former.

Restructure navigation with breadcrumbs

Align IA with teacher workflow: Class → Student → Time period. Persistent breadcrumbs reduce disorientation across dashboards.


✦ IMPACT

Formal research handoff to Class Catalyst

Findings were compiled into a research report and presented directly to Class Catalyst's product team, with recommendations prioritized by severity and implementation effort — giving them a clear roadmap, not a wish list.

5 · Research methods
12 · Participants across all methods
4 · Prioritized recommendations delivered

This project reinforced that research value comes from the handoff — findings that don't reach the right people in a usable format don't change anything. Synthesizing across five methods also taught me when to trust qualitative signals over quantitative ones, and how to frame trade-offs for a product team rather than just listing problems.