The research process involved three phases: comparative analysis, generative interviews and observations, and a validation survey.
Comparative Analysis
Research Questions
- What design elements are presented in a typical Online BBA course?
- How do these design elements differ between courses?
- Which design patterns do we need to test further with student participants?
Process
Our team identified a sample of 16 Online BBA building sections to include in the comparative analysis. The sample included at least one course from each Online BBA major, and each course was designed by a different instructor and learning experience designer between 2020 and 2022.
After generating the sample, Si and Victoria drew on their experience as learning experience designers to create a list of 24 design elements to examine. Each team member then reviewed these elements across 3-5 courses, posting screenshots and observations of the different iterations of each element to a collaborative Miro board.
Our team then met to discuss our findings from the comparative analysis. Our discussion focused on two questions:
- Which elements were already consistent across the majority of Online BBA building sections?
- Which elements were presented in varying formats, requiring more user research to determine the best design pattern?
Elements that fell under the first question were identified as Online BBA standards and required no further follow-up. Elements that fell under the second question formed the basis for the next phase of user research.
In addition to the in-depth course review, we compared the use of intelligent agents, icons, and the mid-semester survey across the 16 selected courses.
Generative Interviews
Research Questions
- What is the optimal design for a content module? Is it better to use submodules to organize course items, or to link to various content items and assessments through HTML pages?
- How do students use the course homepage, and what kinds of widgets are most appropriate in this space?
- What kind of information do students expect to see in the Welcome module, and how can we organize this module to best serve our students?
- How often are mid-semester surveys used in courses?
- What kind of icon standards can we create to ensure consistent icon use across courses?
- Can we create intelligent agent standards to avoid inundating students with automated messages?
Process
The next phase of user research involved gathering qualitative data from students. We set up video interviews with two Online BBA students and four GSU undergraduates. In the first half of these interviews, our researchers asked about the students’ general experience with online learning. Then, the researchers asked the students to navigate through a prototype iCollege course that featured a homepage design, a Welcome module design, and two content module designs that we generated from the comparative analysis findings.
Now that our team had some initial user feedback on course designs, we needed to validate that feedback with a larger participant pool.
Validation Survey
Research Questions
- How do students prefer to receive announcements?
- Do students use the Course Navigator widget or the Content tool to access course content?
- How do students find instructor contact information?
- How do students find upcoming assessments?
- Where do students expect to find assistance with external learning tools?
- Where do students expect to find a to-do list for each module?
- Do students prefer the Checklist tool or an HTML page to list deliverables?
Process
Our team created a survey to validate the findings from the interview phase. The survey asked participants how they use and navigate courses in iCollege. It also displayed screenshots of a sample iCollege course and asked participants what they thought of each page and how they would interact with it. Finally, the survey asked demographic questions.
[Survey screenshot]
We distributed the survey to all Online BBA students in Spring 2023 and to students who signed up for the UX Email List; additionally, some Robinson instructors forwarded the survey to students enrolled in their classes.
The validation survey gathered 12 complete responses and 2 incomplete responses. Respondent demographics indicated that our participants were slightly older and less diverse than the Online BBA student body. They were also highly engaged with their courses, a self-selection pattern typical of surveys without incentives.
Looking Forward
The guidance on this website is not intended to be the authoritative word on course interface design, and the results described on this site should not be generalized beyond these specific use cases. The User Experience team plans to continue researching aspects of course design and refining the practices described here.
Contributors
Victoria Patterson – Project Manager
Si Zhang – LX Consultant
Irmak Su Tutuncu – Graduate Research Assistant
Barbara Boone – Graduate Research Assistant
Emma Hugonnet – Graduate Research Assistant
Karah Hagins – Online BBA LX Lead
Please contact userexperience@gsu.edu with questions about this website.
Icons from Flaticon by Freepik.