Looking Behind the Curtain: A Cooperative Approach to Reviewing Online Courses

Concurrent Session 9

Brief Abstract

Teaching faculty the value of course design is harder than it seems. By establishing a cooperative approach, we recruited faculty from across the university to train together and provide feedback to one another using the Quality Matters Rubric. We will share the process, obstacles, and outcomes of our QM Co-Op.

Extended Abstract

Background

This presentation will describe my institution’s process for assessing the need for a high-quality and collaborative approach to implementing best practices in online course design. As summarized in this proposal, I will share our experiences related to our identified problem, the process for planning and implementing our program, and the practices we instilled to maintain quality and multidisciplinary cooperation throughout the program. 

My institution began a partnership with Quality Matters in 2019. Quality Matters (QM) is a nationally recognized, faculty-driven peer-review process used to ensure the quality of online and blended course design. At the onset of our institutional membership with QM, over 30 faculty participated in the “Applying the Quality Matters Rubric” (APPQMR) workshop. This surge quickly dwindled the following year as pandemic-related stressors took center stage. Although faculty were busy and burnt out, the need for high-quality online courses was more important than ever. Through workshops led by the university’s Center for Online Learning, 50 faculty members completed the APPQMR workshop between 2019 and 2021. 

About the Rubric

The Quality Matters rubric consists of 43 specific review standards focusing on best practices in online course design and alignment. The rubric groups the standards into eight categories - course overview and introduction, learning objectives, assessment and measurement, instructional materials, learning activities and learner interaction, course technology, learner support, and accessibility and usability. Across these eight categories, instructors can curate a learning experience in which content, activities, and assessments are aligned to course outcomes, all of which is accessible to all types of users. The APPQMR workshop’s purpose is to guide new users through the QM Rubric standards and their application in an online course. 

Problem

Parallel to the Quality Matters training, the Center for Online Learning conducted an internal audit of all online courses, analyzing each course against an abbreviated list of criteria deemed high-impact. We noticed little consistent correlation between participation in Quality Matters training and the results of the audit. Whether this was due to learning a new LMS - we migrated from Moodle to Canvas during this period - or to the many responsibilities that come with online teaching during a pandemic, we cannot say. What we were able to identify is that QM training alone was not the “magic solution” to administering successful online courses.

Alongside QM-sponsored training, the Center for Online Learning hosts regular “open lab” training on various concepts of course design and Canvas-related skills. Additionally, our instructional design team consults with faculty on an as-needed basis. We decided to compare the audit data to the list of training and consultation participants. We found that faculty were more likely to perform better on the audit if they had participated in synchronous training or consultation sessions with the Instructional Designer. Why was this? Our training focused on the implementation of the learning management system, Canvas, while blending the principles of QM into the skills instruction. For example, when training faculty on the use of Canvas Discussion forums, we also discussed and shared the benefits of learner-to-learner interactions in an online course, alongside specific strategies for embedding Discussion forums within a course. Additionally, our training offered faculty the opportunity to share their own experiences, which led to greater depth of conversation and grappling with the concepts on which we focused. 

These data led us to question the way we were enrolling faculty in the APPQMR training. The training is not a “one and done” event; rather, it should serve as the foundation for best practices in online course design. After faculty complete the training, how can we best implement a system of support and accountability for implementing those best practices?

The second layer of our discussions concerned systematically certifying courses for Quality Matters distinction. In our first iteration, after taking the APPQMR training, we hoped faculty would then prepare their courses for a self-review, an internal review, and an eventual formal QM course review. This turned out to be a bigger hurdle than anticipated. How can our team support faculty throughout the training, preparation, and certification of their online courses? How can we use what worked in our earlier training to supply opportunities for interaction and discourse between faculty?

Process

We assessed and sorted our needs into three categories: effective training, ongoing collaboration, and course preparation and certification. Across these three components, we wanted to see not only whether the results of our internal audit changed, but also the effect on students' performance and course outcomes. We plan to use Spring 2021 audit and course evaluation data as the baseline for this measurement. 

We developed a program for faculty called the Quality Matters Cooperative. The mission of the cooperative is to foster multidisciplinary collaboration to assess, support, and certify online courses based on the Quality Matters Rubric. The program comprises five steps: Opt-In, Training, a Self-Review, the Internal Reviews, and a Formal Review. Through the QM Cooperative, we plan to submit six courses for QM-managed Course Reviews by the end of 2022, seeking QM recognition for courses that are taught frequently and to large numbers of students.

Faculty Opt-In

Recruitment, often the hardest part of any voluntary professional development program, was our first step in launching the QM Co-Op. We decided to reach out to several groups of people to invite faculty to participate in the first cohort of the cooperative. First, we asked Deans throughout the university to recommend a small list of faculty members - these were the first faculty to whom we reached out. After receiving acceptances from that group of recommendations, we identified faculty who had shown interest in Quality Matters in the past, either by having taken the APPQMR training already or by expressing interest in the review process. Our goal is to have six to ten participants in the cooperative in order to form a well-rounded group of faculty from across the university. 

Training 

Once identified, participants will take two Quality Matters training workshops. First, the Applying the Quality Matters Rubric training provides the foundational knowledge of course design that we strive to incorporate into all online courses. Second, participants will take the Peer Reviewer training, which qualifies them to conduct internal and formal reviews of online courses in higher education. Throughout this step, participants will meet with one another and with our instructional designer to discuss challenges and the connections they are making to their own courses and subsequent reviews. 

Self-Review 

Using the Quality Matters Self-Review tool, Co-Op faculty can examine the different design components of their online course and begin to apply the best practices that create quality courses. The Self-Review tool allows faculty to confidentially evaluate their own course against the QM Rubric. Upon completion of a Self-Review, faculty have access to a Final Report that may be emailed to whomever they choose. 

Internal Review

Following the Self-Review, an internal review is conducted by the Internal Review Team, consisting of two other participants of the co-op. Courses that meet all 21 essential (“must-have”) standards of the 43 on the Quality Matters rubric are deemed to have the QM Essentials and will be designated a certified Essentials QM Course. Feedback on the remaining standards is meant to guide and support the instructor in making improvements prior to the Formal Course Review. 

Formal Review 

The formal Quality Matters-managed Course Review follows the same steps as the Internal Review, except that it is conducted by trained reviewers from outside the instructor’s institution. QM Co-Op faculty will submit their courses for formal review at the end of the academic year. 

Practice

In our planning conversations, we determined the importance of sustaining communication and collaboration among participants of the cooperative. To that end, we plan to schedule regular meetings to discuss ongoing training, updates, and reviews. These meetings will serve the same need our previous open lab trainings met - a space for faculty to ask questions and grapple with the challenges they may be facing in their course design. Meeting agendas will also include opportunities to build resources required in a Quality Matters-certified course that may not already exist, such as course maps, technical requirement documents, and accessibility services guides. 

The collaborative nature of the Quality Matters Cooperative is meant to serve as the driving force behind its participants' success. We plan to adapt meetings and schedules to meet the needs of the ongoing reviews, as well as support individual participants with consultations and guidance along the way. 

Conclusion

While this program is still in its early stages of implementation, we are eager to share our experiences, both good and bad, with other institutions. Our goal in this presentation is to offer a skeletal program that other institutions can adapt to implement Quality Matters or a similar rubric-based program for their online courses.