"Quality Can’t Be Done in a Checklist”: Applying Faculty Perspectives about Online Course Reviews

Streamed Session

Brief Abstract

Online course reviews have been in place for five years at our institution. To assess their effectiveness on an ongoing basis, we asked participating faculty to describe their review experience via surveys and focus groups. In this session, we will identify factors that relate to review success, as well as obstacles we plan to overcome.

Presenters

In the summer of 2011, Aimee joined the Instructional Design team at the Center for Distributed Learning (CDL). That same year, she graduated with a Doctor of Education degree in Curriculum and Instruction, with a specialization in Instructional Design and Technology, from the University of Cincinnati. Her research interests include quality online course design, textbook affordability, online discussion strategies, and technology and gender. Dr. deNoyelles has published in several journals, including Computers & Education, Journal of Asynchronous Learning Networks, Journal of Applied Research in Higher Education, and Journal of Special Education Technology.
Nancy Swenson has an M.A. in Educational Technology from the University of Central Florida and a B.S. in Business Education from Florida International University. She has worked as an Instructional Designer at the Center for Distributed Learning at UCF since 2000. Prior to working at UCF, she taught business education classes in the public school system for 13 years. She has also worked as an adjunct with Florida Virtual School, Valencia Community College, and the University of Central Florida. Her online teaching and learning research interests include usability, accessibility of online education, quality of online courses, and universal design for learning. Nancy has presented on similar topics at a variety of conferences, including EDUCAUSE, EDUCAUSE Southeast, the Sloan Consortium International Conference on Online Learning, an EDUCAUSE Learning Initiative (ELI) webinar, the Assistive Technology Industry Association (ATIA) Annual Conference, Accessing Higher Ground, and the Annual International Technology and Persons with Disabilities Conference (CSUN).

Extended Abstract

Despite years of evidence suggesting no significant difference in learning between online and face-to-face environments, questions about online course quality remain. As more institutions offer online courses and programs, there is increasing pressure to demonstrate that those courses are high quality. Online course reviews are becoming more commonplace in higher education as a way to provide evidence of a quality online learning experience, and there is evidence that online courses that go through a formal review process do show improvement for students (see FIU Online, 2016).

At the authors' institution, a state initiative established a certification process for online courses within the state system. A faculty member who is credentialed to design online courses opts in to a review of their online course, which an instructional designer conducts. A consultation follows in which the faculty member and instructional designer discuss the feedback and collaboratively generate a plan for revisions. Once satisfactory revisions are made, the course earns a certification (a Quality designation). Since the initiative's inception in 2017, over 700 online course designations have been earned by over 300 faculty members. Despite this success, challenges remain, namely an abundance of courses awaiting review, faculty reluctance, and concerns about strategic impact.

Our goal is to satisfy the state requirement in a way that makes the experience rewarding for faculty and results in actual improvement of courses for students. To do this, we must incorporate feedback from faculty members. In 2021, every faculty member who had participated in an online course review was invited to take a survey about their perceptions of the review experience. In all, 314 faculty were invited and 110 participated, a 35% response rate. The survey asked about their motivation to participate in reviews, how well they understood aspects of the review process, their relationship with the instructional designer, the review items themselves, and how they thought the process could be made more valuable. After the survey results were analyzed, two focus group sessions took place to better understand the survey results and to provide an open-ended space for faculty to share their feedback and perceptions of the review experience. Survey respondents were randomly selected and invited to participate, and two sessions were conducted via Zoom, with six and eight participants respectively (14 faculty in total). The sessions were transcribed, and the researchers then met to discuss the transcripts and settle on the main themes that emerged.

When asked about their motivations to participate in the online course review, the majority of survey respondents (74%) said the main reason was to improve the learning experience for students. Overall, 85% were extremely satisfied with the course review process, and 90% agreed or strongly agreed that the process improved the design of their course. Specific design improvements mentioned included better organization, accessibility, and student performance. The relationship with the instructional designer was positively regarded: 94% agreed or strongly agreed that the collaboration helped generate strategies to improve course design.

Three points stood out as deserving more attention: (1) understanding certain concepts regarding the review process, (2) refining the instructional designer-faculty relationship, and (3) clarifying the student perspective. We asked faculty how well they understood aspects of the review process, and several stood out as not well understood, namely “when the designation expires,” “what course modalities are eligible for a review,” and “how a High Quality designation is achieved.” Another point had to do with the nature of the review itself, which is focused on design rather than teaching. As one faculty member declared in the survey, “Quality can’t be done in a checklist.” A further concern arose when an instructional designer recommended actions the faculty member did not personally agree with; as one faculty member noted in the focus group, “I did things my ID told me to do to get the badge even though I didn’t really want to….” Knowing the status of the review also emerged as a theme, with some faculty not knowing when they would hear about the review’s next steps. Finally, there was confusion about how students understand the certification (“No students ask about it,” “How do they know about it?” “Can they see it when they sign up for the course?”).

Based on this feedback, we plan to:

  • More clearly articulate the intent of the reviews and address the points that were not well understood, which will require modifications to the institutional website and to instructional designer training.

  • Explicitly connect faculty members' motivations to participate with the benefits of completing a review.

  • Leverage existing relationships. The collaborative element of course reviews between instructional designers and faculty is critical. We will place more emphasis on understanding what the faculty member is trying to achieve through the review process, rather than simply telling them what they must do to earn the certification.

Presentation Style

This is an asynchronous virtual session. We plan to present these findings for around 10 minutes and ask participants to interact virtually along the way through poll questions and discussion. To allow for continued conversation, we will provide our contact information and a link to resources such as the survey. Our intention is that attendees will walk away with solid recommendations to enhance their own online course quality assurance plans.