Redesigning Participation & Discussion Experience for Clarity and Fairness

Transforming the participation experience with clearer navigation, real-time tracking, and fairer workflows for students and instructors.

At-A-Glance

This redesign improved the participation model of GCU's online learning platform. Discussions drive engagement, and participation grades account for 10% of overall scores, yet students struggled to track their progress and instructors faced inefficient grading workflows.

Role

UX Designer

Team

Product Owner

Frontend, Backend and QA Team

Duration

6 Months

The new design enhanced visibility and grading clarity, resulting in a 12% increase in weekly student participation, and a 19% reduction in instructor grading time.

Design Highlights

Redefined Entry Point

Introduced the Participation Assessment Card in Calendar/Syllabus as the new entry point for participation, with a participation progress tracker that shows real-time progress toward weekly requirements.

Participation-Only Subtype

Created a distinct subtype for discussions that count only toward participation, reducing instructor workload and clarifying expectations for students.

Transparent Credit Indicators

Added tags, icons, and tooltips to differentiate graded vs. participation discussions, while keeping the experience consistent to encourage active engagement.

Background

Why Participation Matters

At Grand Canyon University (GCU), participation is a graded requirement worth 10% of a student’s overall score. In online classrooms, participation is earned exclusively through substantive, quality posts made by students in the discussion forum.

This carries particular weight because 87% of GCU students are online learners. For them, participation is not only a formal grading component but also the primary way to engage with peers and instructors, share perspectives, and foster meaningful learning in a virtual environment.

10%

of a student’s overall score comes from participation

All

online participation comes from discussion responses

87%

of GCU students are online learners

Before Redesign: The Full Participation Journey

Discover

Understanding Our Users

Students

Background

Working adult with a full-time job and a part-time bachelor's program, balancing coursework with family responsibilities.

Learning Mode: 100% online.

Goals

Stay on top of assessments and participation to maintain GPA.

Earn participation credit consistently (10% of overall grade).

Instructors

Background

Experienced instructor teaching 4 online courses each term, handling 250+ students at once.

Responsible for facilitating learning, supporting students.

Goals

Support students in successfully completing their courses.

Manage teaching, grading, and communication efficiently across multiple classes.

Understanding the Problems

Through satisfaction surveys and IT support ticket analysis, we developed a comprehensive understanding of our users' pain points:

Students

Unclear Participation Tracking

Lack of transparency in tracking participation requirements

Confusing Workflow

Confusing workflow led to posting in wrong locations

Instructors

High Monitoring Overhead

Excessive time spent monitoring and communicating with students about incorrect submissions

Manual Grading Workarounds

Manual grading workarounds due to system limitations

“How might we create a participation experience that gives students transparency into their progress while reducing instructor workload?”

Rethinking the Workflow — Calendar as the Entry Point

What We Observed

Standard Assessment Workflow

Participation Workflow

Participation was the only assessment type that didn't follow the familiar, straightforward submission pattern.

What If…

What if we simplified the participation submission workflow to match other assessments?

Design Shift

Instead of redesigning the discussion list, we focused on the Participation Assessment Card in the Calendar/Syllabus, aligning it with other assessments. This led to Concept 3.

Concept 3: Participation Card as the New Entry Point

  1. Link Related Discussions
    Direct links let students jump into the correct discussion and reply to earn credit

  2. Add a Progress Tracker
    Display how many replies a student has made that count toward participation credit

Student Workflow Refinement

Before

After

When Redesign Means Rethinking the Requirement

I proposed three concepts to the Product Owner and shared them with instructors. The feedback was positive. Instructors believed the new approach aligned with students’ natural workflow and would reduce confusion and communication overhead.

However, the original requirement had been to redesign the discussion list. Concept 3 redefined the entry point and redesigned the Participation Assessment Card instead, a shift away from that requirement. Because changing how participation was submitted raised concerns for leadership, the project was paused until the Product Owner secured leadership approval.

Leadership ultimately agreed with this direction, thanks to:

Observation of student behavior and reliance on Calendar/Syllabus as their checklist.

Instructor feedback validating that the new design reduced confusion.

Edge case evidence showing that the existing list made it impossible for students to tell which discussions counted for which participation requirement.

With leadership approval, we advanced into detailed design of the Participation Assessment Card.

Designing Progress Tracker

Understanding the Complexity

GCU’s Participation Policy

According to the policy, to earn full participation credit, students must post at least 2 qualified responses per day on 4 different days within a week.

Grading Flexibility

Instructors grade with different approaches:

Strict policy followers: Require posts distributed across specific days

Flexible graders: Count total posts regardless of distribution

Example

| Scenario | Participation days | Days with ≥2 qualified posts | Total substantive posts | Strict policy followers | Flexible graders |
| --- | --- | --- | --- | --- | --- |
| A | 4 | 4 | ≥8 | Full credit | Full credit |
| B | 4 | 2 | ≥8 | Partial credit | Full credit |
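The two grading styles above can be sketched as a small computation. This is an illustrative sketch only; the function names and credit fractions are assumptions for clarity, not the platform's actual grading implementation.

```python
from collections import Counter

# Policy as described above: full credit requires at least 2 qualified
# posts per day on 4 different days within a week.
REQUIRED_DAYS = 4
POSTS_PER_DAY = 2

def strict_credit(post_days):
    """Strict policy followers: only days with >= 2 qualified posts count."""
    per_day = Counter(post_days)
    qualifying_days = sum(1 for n in per_day.values() if n >= POSTS_PER_DAY)
    return min(qualifying_days / REQUIRED_DAYS, 1.0)

def flexible_credit(post_days):
    """Flexible graders: total qualified posts count, regardless of spread."""
    return min(len(post_days) / (REQUIRED_DAYS * POSTS_PER_DAY), 1.0)

# Scenario B from the example: 8 posts across 4 days, but only 2 days
# reach the 2-post minimum, so the two grading styles diverge.
week = ["Mon", "Mon", "Mon", "Tue", "Tue", "Tue", "Wed", "Thu"]
assert strict_credit(week) == 0.5    # partial credit
assert flexible_credit(week) == 1.0  # full credit
```

This divergence is exactly why the progress tracker had to present posting activity without committing to either grading interpretation.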

Design Iterations

Iteration 1: Dual-view Progress Tracker

The design included a dropdown to switch between two views:

Daily post count view - track posting patterns

Qualified post count view - estimate real-time performance

Instructor Pushback

"The qualified replies count would pressure us to grade immediately, increasing unnecessary workload"

Iteration 2: Minimum Requirement Highlighting

Highlight days with at least 2 replies (minimum requirement)

University Concerns

"This might discourage students from exceeding minimum requirements"

Final Design Solution

Addressing Instructor Pain Points

The Edge Case Problem

The university distinguished between two types of credit within the discussion forum:

Discussion credit: Responding to the main Discussion prompt

Participation credit: Replying to others' responses

However, several edge cases created confusion:

Beginning of term: The first 2 introduction discussions were worth 0 discussion points, but replies to them still counted for participation

Subject-specific scenarios: In subjects like math, instructors asked students to respond to the prompt and then reply to their own responses, counting only the replies as participation

General Cases

Earn discussion credit

Earn participation credit

Edge Cases: Participation Only Discussion

Earn discussion credit

Earn participation credit

Design Solution: Participation-Only Discussion

To address these challenges, we proposed introducing a new discussion assessment type that would count only toward participation credit.

We believe this solution will also reduce instructor workload by making expectations clearer for students and removing the need for manual grading fixes.
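The credit-routing rule behind the subtype can be sketched as follows. This is a hypothetical illustration; the field names (`subtype`, `is_reply`) are assumptions for clarity, not the platform's actual data model.

```python
from dataclasses import dataclass

@dataclass
class Discussion:
    subtype: str  # "standard" or "participation_only" (assumed values)

def credit_for(is_reply: bool, discussion: Discussion) -> set:
    """Return which credit buckets a single post counts toward."""
    if discussion.subtype == "participation_only":
        # In a participation-only discussion, replies earn participation
        # credit and nothing earns discussion credit.
        return {"participation"} if is_reply else set()
    # Standard discussion: responding to the prompt earns discussion
    # credit; replying to others' responses earns participation credit.
    return {"participation"} if is_reply else {"discussion"}
```

Encoding the edge cases in the discussion's type, rather than in per-course manual overrides, is what removes the need for instructors' grading workarounds.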

Backend constraint

At the time, I wasn’t sure which solution would be easier for the backend team to implement. Our backend team was offshore with limited bandwidth, so minimizing development effort was critical.

Two options proposed

To help them evaluate, I proposed two alternative concepts. For each concept, I showcased how it would work and demonstrated the impact on related pages, giving the backend team enough context to assess implementation complexity.

Concept 1: New Assessment Type

University class editor view

Student calendar view

Student discussion list view

Concept 2: Discussion Subtype

University class editor view

Student calendar view

Student discussion list view

Backend Team’s Feedback

The backend team confirmed that the development complexity of both options was roughly the same. Once feasibility was clear, our focus shifted to usability and scalability.

Decision Factors After Feasibility

| Criteria | C1: New Assessment Type | C2: Subtype of Existing DQ |
| --- | --- | --- |
| Usability | Risk that students see it as "not gradable" and lower engagement; requires students to learn a new type | Consistent experience; supports active participation |
| Scalability | Each new need = a new object type | New subtypes can be added easily |

Decision

We chose C2: Subtype of Existing DQ, as it offered simpler usability for students and stronger scalability for the platform.

Final Design Solution

Discussion Creation View

Participation Only Discussion Workflow

Workflow Refinement

Instructor Workflow Refinement

Before (Current State)

After (With Participation-Only Discussion)

Student Workflow Refinement

Before (Current State)

After (With Entry Point Redesign and Participation-Only Discussion)

Impacts & Reflections

Impacts

  1. Increase in Student Transparency and Engagement

12% improvement in average weekly participation rate (number of qualified replies per student).

  2. Reduction in Instructor Workload

19% faster grading completion time.

Reflections

User feedback needs probing

Instructors strongly suggested adding a progress tracker in the discussion list, since the old system had one. At first, we thought this was the answer. But digging deeper showed they were reacting to a familiar pattern, not necessarily the best solution. By questioning why they wanted it, we uncovered the real need: students needed a clear, simple way to track participation progress in context, not another busy list view.

Constraints matter

We didn’t have direct access to students, so we relied on instructor feedback, IT tickets, and student quotes surfaced by the university. This made it critical to interpret pain points carefully and triangulate across sources rather than accept any one perspective.

Balancing policy and user needs

Designing for participation required balancing consistency with flexibility. The university’s policies ensured rules applied across all courses, but instructors often had their own teaching practices. For example, while the policy allowed any post to count toward participation, many instructors preferred guiding students to specific discussions. Similarly, when grading, some instructors applied rules loosely, while others followed them strictly. To address these variations, I designed flexibility into the system so it could adapt to different instructor styles while keeping the student experience simple and transparent.