Discovering Problems

For students at National Tsing Hua University in Taiwan, eLearn is the learning management system that allows information to be shared between students and faculty. However, eLearn has frequently been a target of complaints in student forums. We therefore conducted UX research on this campus-wide tool, which has the largest user base on campus, and analyzed the results of user testing.

Defining the Problem from the Perspective of Human Information Processing (HIP)

From the perspective of human information processing, the original page exhibits several key issues.

Understanding Target Users through Interviews

In the early phase of this project, we conducted user interviews to identify the core problems and to get a sense of how university students perceive the eLearn website.

Students use eLearn primarily for two tasks:

(1) Check the latest announcements for a course

(2) Check the Grade Report for a course

Six UI Issues of the System

By interviewing students and collecting online reviews from student forums, we identified six recurring interface issues.

Three UX Issues of the system for First-Time Student Users

We believe that the school’s system architecture should be intuitive and easy to understand, sparing students from spending unnecessary time adapting to it. We therefore focused on first-time users as our target group. To reduce bias, participants were recruited from different departments, and we conducted in-depth interviews about their feelings and negative experiences with the system.

From these interviews, we distilled three main reasons why users resisted the system.

Reducing Redundancy by Analyzing the Information Architecture

We found that the developers intentionally designed two pathways leading to the calendar, which suggests they believed students would use this feature most frequently. However, this assumption was never validated through surveys or interviews. Even worse, having two distinct pathways to the calendar can confuse users and may keep them from becoming familiar with the overall system structure.

Duplicated features add significant overhead to both scanning and comprehension: because users cannot tell at a glance whether a feature is duplicated, they have to spend additional time and effort figuring out whether the second entry is a new feature or simply another path to one they have already seen.
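To make the redundancy check concrete, below is a minimal sketch that models the navigation as a simple tree and flags any label reachable from more than one parent; the node labels are hypothetical and do not reproduce eLearn's actual menu structure.

```ts
// Hypothetical sketch: detect duplicate pathways in a sitemap modelled as a tree.
// Labels are illustrative, not the real eLearn navigation.

interface IANode {
  label: string;
  children?: IANode[];
}

// Hypothetical slice of the original sitemap: "Calendar" is reachable from two parents.
const originalIA: IANode = {
  label: "Home",
  children: [
    { label: "My Courses", children: [{ label: "Calendar" }, { label: "Announcements" }] },
    { label: "Tools", children: [{ label: "Calendar" }, { label: "Grade Report" }] },
  ],
};

// Collect labels that appear under more than one parent.
function findDuplicates(root: IANode): string[] {
  const counts = new Map<string, number>();
  const walk = (node: IANode): void => {
    counts.set(node.label, (counts.get(node.label) ?? 0) + 1);
    node.children?.forEach(walk);
  };
  walk(root);
  return [...counts].filter(([, n]) => n > 1).map(([label]) => label);
}

console.log(findDuplicates(originalIA)); // ["Calendar"] -> candidate for consolidation
```

Any label flagged this way is a candidate for consolidation into a single pathway in the revised information architecture.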

Market Research

E-learning platforms usually look clean, accessible, and informative, giving students several easy ways to reach course information. The spacing between components on these websites is generous, emphasizing the distinctions between different functional areas. For an educational platform filled with many types of information (text, images, links, and videos), the top priority is to minimize cognitive load as much as possible and reduce reading difficulty.

Design Principles and Deliverables

Design System

Revised Information Architecture

Lo-Fi Wireframe

I believe simplicity is the most important usability guideline. For the color scheme, I selected a vibrant, lively blue and yellow palette. To retain the school's unique identity, the panda mascot appears in the notification section.

My design guidelines are based primarily on the following three principles:

1. Information Visualization:

Use color labels to categorize and classify items (including classes and deadlines), and represent raw numerical data in graphical formats.

2. Enhancing Consistency:

Establish a cohesive design system that aligns styles across the interface (including buttons, fonts, and color schemes); a minimal token sketch follows this list.

3. Pinning Key Features to Side Panels:

Based on the earlier user interviews, identify the key features (the calendar and the to-do list) and pin these two major functions to the left and right side panels, so that users can interact with them at any time.
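To illustrate principles 1 and 2, here is a minimal design-token sketch, assuming a TypeScript front end; the token names, hex values, and category labels are hypothetical and only show how color labels and shared component styles could be kept in one place.

```ts
// Hypothetical design tokens: one source of truth for colors, typography,
// and category labels, so buttons, fonts, and labels stay consistent.

export const palette = {
  primary: "#2563EB",   // vibrant blue (illustrative hex value)
  accent: "#FACC15",    // lively yellow (illustrative hex value)
  surface: "#FFFFFF",
  text: "#1F2937",
} as const;

export const typography = {
  heading: { fontFamily: "Noto Sans TC, sans-serif", fontSize: 20, fontWeight: 700 },
  body: { fontFamily: "Noto Sans TC, sans-serif", fontSize: 14, fontWeight: 400 },
} as const;

// Color labels used to categorize items such as classes and deadlines.
export type ItemCategory = "lecture" | "assignment" | "exam";

export const categoryColor: Record<ItemCategory, string> = {
  lecture: palette.primary,
  assignment: palette.accent,
  exam: "#DC2626",
};

// Components read from the same tokens instead of hard-coding styles,
// which keeps the interface visually consistent.
export function labelStyle(category: ItemCategory) {
  return {
    backgroundColor: categoryColor[category],
    color: palette.surface,
    ...typography.body,
  };
}
```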

Validation of the Redesigned UI through a High-Fidelity Prototype

With the high-fidelity prototype, we tested workflows, specific UI components (e.g., mega menus and accordions), graphical elements such as affordances, page hierarchy, type legibility, and engagement. Because the prototype felt close to a real system, participants behaved realistically during the sessions, and the accurate feedback they gave allowed us to address potential usability issues.

Design of the Usability Testing

We defined two main tasks based on feedback from the user interviews and conducted usability testing on both the original and the redesigned user interface.
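As one possible way to organize the study, here is a minimal sketch of the two tasks and the measures recorded per participant; the exact prompts, the NASA-TLX measure, and the metric names are assumptions for illustration rather than the actual study protocol.

```ts
// Hypothetical structure for the usability test: two tasks drawn from the
// interviews, with the measures logged for each participant and condition.

type Condition = "original" | "redesigned";

interface Task {
  id: string;
  prompt: string; // wording is illustrative, not the actual test script
}

const tasks: Task[] = [
  { id: "T1", prompt: "Find the latest announcement for a given course." },
  { id: "T2", prompt: "Open the Grade Report for a given course." },
];

interface TrialRecord {
  participant: string;
  condition: Condition;
  taskId: string;
  completed: boolean;
  timeSeconds: number;   // task completion time
  tlxScore: number;      // subjective cognitive load (e.g., NASA-TLX, 0-100)
}

// Mean completion time per condition for one task, used to compare the
// original UI against the redesign once real session data is collected.
function meanTime(records: TrialRecord[], condition: Condition, taskId: string): number {
  const times = records
    .filter((r) => r.condition === condition && r.taskId === taskId && r.completed)
    .map((r) => r.timeSeconds);
  if (times.length === 0) return NaN;
  return times.reduce((sum, t) => sum + t, 0) / times.length;
}
```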

Results of User Testing

The improved design indeed resulted in shorter task completion times and lower cognitive load scores, making the tasks subjectively easier. Additionally, in the post-experiment survey, participants provided more positive feedback on the new design.