Taking the Long View: The Benefits of Iterative Course Design—Susan Marshall (2019)



 

A faculty member continues to improve an undergraduate psychology statistics course that is part of a graduate student teacher training program in the psychology department. The impact of these current changes highlights the importance of iterative course design efforts that model good teaching practices for novice teachers.

 


Portfolio Overview

PSYC 210, Statistics in Psychological Research, is an undergraduate course designed to introduce statistical concepts and methods as they relate to the analysis and interpretation of psychological data. Students in PSYC 210 have four unit exams each semester. Performance is very high on the first two exams. However, performance declines on exams three and four as the course content becomes more challenging. The goal of the redesign was to better scaffold student learning and to improve performance on these more difficult exams.

During this course redesign, new homework assignments were created to scaffold the types of questions asked on exams. A graduate student in the Department of Psychology was hired to help re-write ten homework assignments for PSYC 210 using backward design. Throughout this process, we used the exams as our guide to recreating homework. These new homework assignments required students to practice and engage with course content in the same way that they would on the exams. We also implemented pre-class quizzes (rather than post-class quizzes) to better prepare students for the active learning classroom. We implemented these changes in Fall 2017.

Overall, exam scores increased post-redesign, and low-performing students benefitted the most. All students, including top-performing students, improved their exam four scores by at least half a letter grade. The redesigned homework assignments likely contributed to this improvement: students' homework and exam grades were more strongly correlated post-redesign than pre-redesign, suggesting that the homework assignments better matched (or scaffolded) what was expected of students on the exams. DFW rates also continued to drop.

Iterative changes to courses and the way they are taught can have meaningful results for the students who take them. With the redesigned homework and quizzes, we were able to help more students succeed in this course: based on our typical enrollment, approximately 30 additional students now complete the course each semester compared to before. Even better, we are accomplishing these levels of success with novice graduate student instructors who are part of our teaching development program. We are not only helping our undergraduates as they complete their coursework; we are also training our graduate students to teach effectively and to approach teaching and course design in a way that is based on good course design practices and gets results.

 

PSYC 210, Statistics in Psychological Research, is an undergraduate course designed to introduce statistical concepts and methods as they relate to the analysis and interpretation of psychological data.

PSYC 210 is a required course for psychology majors and is also utilized by other departments (e.g., nursing, human biology, behavioral neuroscience). All PSYC 210 students must have taken PSYC 104, General Psychology, as a prerequisite regardless of major. Although students are coming into PSYC 210 with various backgrounds and experiences, they all have some basic and introductory knowledge of psychology.

The goal of PSYC 210 is to introduce the statistical methodology used to analyze and interpret psychological data. The course covers visual displays of data, central tendency and variability, correlations, sampling and probability, z-scores and the normal distribution, hypothesis testing, confidence intervals, effect size, power, z- and t-tests, and one-way analysis of variance. The course objectives are to:

  • Gain an understanding of the basic concepts underlying statistics, especially those utilized in the social sciences
  • Conduct different statistical procedures in order to test various research questions
  • Organize, summarize, interpret, and present results obtained from research studies
  • Become a more knowledgeable consumer of behavioral research through understanding of statistical methods used both in research and in everyday life

These learning objectives are assessed via online pre-class activities, in-class contribution, homework assignments, quizzes, unit exams, and a final group research project and poster presentation. These assessments are opportunities for students to demonstrate what they have learned.

PSYC 210 has been modified several times since 2015. Pre-redesign, PSYC 210 was taught in a traditional lecture format by graduate teaching assistants (GTAs), each teaching multiple sections of 45 students. The semester concluded with a traditional final exam. GTAs were independently responsible for making their own decisions about the textbook, syllabus, topics covered, course schedule, assignments, and exams. During the initial redesign phase, PSYC 210 was converted to a flipped format in which some course content was completed online. Eventually, PSYC 210 was fully transformed to a hybrid format in which student learning occurs both outside of class using online materials and in class through active engagement.

In the hybrid format, PSYC 210 students and the instructor meet face-to-face two to three times a week for 50 minutes. GTAs utilize in-person class time for active learning techniques as students are expected to come to class prepared and having already interacted with course material through online, pre-class activities. The curriculum was standardized across all PSYC 210 sections to ensure that the textbook, syllabus, topics covered, course schedule, assignments, and exams are common across GTAs. Rather than having an individual final exam, an alternative exam was implemented in which students work as a group to develop research questions, analyze a common data set, and present their results in the format of a poster presentation during finals week.

As part of the course transformation, the novice GTAs are now supported through a teaching development program which consists of weekly teaching team meetings. All course components are built, maintained, and updated by the faculty coordinator, and novice GTAs can focus on learning to teach, including what they will do to engage their students each class, how to teach the content effectively, and how best to give feedback to students to further their understanding of the material.

In the most recent course transformation project, we aimed to address two issues:

  1. Students were not performing well on the most difficult exams and
  2. Students were not coming to class prepared for active learning group work.

This portfolio highlights the most recent transformation efforts in detail.

Before the most recent course redesign, PSYC 210 homework assignments were constructed by utilizing publisher question pools which did not scaffold exam questions well. There was a mismatch between what students were asked to do for homework and what they were asked to do on the exams. As course material became more difficult, student exam performance decreased. The median scores for exams one and two were a B, while median scores for exams three and four, as well as for the overall average across exams, were in the C range (see graph below).

To improve student learning and exam performance, we created new homework assignments that better scaffold what would be expected of students on each exam. We hired a graduate student who had experience with the course materials and who would be teaching PSYC 210 in the upcoming semester to help develop ten new homework assignments during Summer 2017. It is important to note that exams did not change between the pre- and post-redesign periods.

Course quizzes were also changed during this redesign. Post-class quizzes that came from publisher question pools were dropped, and new pre-class quizzes were created. Post-class quizzes had originally been included to give students feedback on their learning AFTER they had been to class and had engaged with classmates and with content. However, these publisher-generated questions, again, did not really model what we wanted students to do. Instead, we realized that students needed more pre-class exposure to content, so that they would be ready for active learning during class time. The new pre-class quizzes tested students on lower-level knowledge (e.g., Remembering and Understanding levels of Bloom’s taxonomy), and students were encouraged to take quizzes as many times as necessary to achieve basic mastery so they would be well prepared for class. The pre-class quizzes also facilitated a common knowledge base among students that class activities could then build upon.

After redesigning these course components during the summer of 2017, the redesigned course was taught for the first time in Fall 2017.

To assess the effects of our redesign efforts, we planned the following analyses:

  • Evaluate changes in homework and exam scores as a result of the redesigned materials
  • Correlate homework scores with exam scores pre- and post-redesign
  • Assess exam grade distributions pre- and post-redesign
  • Monitor the rate of students who earned a D or F or withdrew from the course (the DFW rate)

Student data from all PSYC 210 sections taught in Fall 2016 represented student performance pre-redesign, while data from Fall 2017 represented student performance post-redesign. Fall sections of PSYC 210 are generally taught by novice GTAs who are just learning to teach. We chose to omit spring grades from our analyses because by spring GTAs have a semester of teaching under their belts, and this experience has a positive influence on student grades. For all analyses reported here, there were 137 students in the pre-redesign stage (Fall 2016) and 191 students in the post-redesign stage (Fall 2017).

Mean homework scores post-redesign were lower (81%) compared to pre-redesign (91%). Thus, post-redesign homework assignments were more challenging, which was our intention as we wanted to scaffold a deeper understanding of the material using homework questions that would better prepare students for challenging exam questions. There was a stronger correlation between homework and exam averages post-redesign (r = .71) than between homework and exam averages pre-redesign (r = .45), indicating that the redesigned homework assignments better resembled or matched the exam questions and the knowledge and skills that students were expected to use on the exams. This same pattern emerged when looking at the correlation between the homework average and each exam score, as well. See the figure below for a depiction of these correlations.

We were particularly interested in how the distribution of exam scores might change after implementing the redesigned materials. These data are presented in the box-and-whisker plot below. Exam one scores did not change from pre- to post-redesign, probably due to a general ceiling effect for this exam. Exam one covers introductory material, and students do well regardless of whether the associated homework assignment properly scaffolds exam questions. However, as course material becomes more difficult, we expected that the redesigned homework assignments would improve exam scores. This is exactly what we observed: median scores on exams two, three, and four increased post-redesign. Additionally, students scoring in the bottom quartile benefitted the most on the later exams. This is very exciting, because exam performance increased among the students likely most at risk of not passing the class (i.e., students who might receive a D or F or withdraw from the course). In other words, this redesign especially supported the learning of the bottom half of the class. On exam four, the most challenging exam, all students improved their scores by at least half a letter grade, including the strong students at the top of the class distribution.
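The quartile comparison behind a box-and-whisker plot can be sketched as follows. Again, the score lists are hypothetical stand-ins for pre- and post-redesign exam four scores, not the actual course data:

```python
# Sketch of the quartile comparison behind a box-and-whisker plot.
# Scores are hypothetical stand-ins for exam four, pre- and post-redesign.
from statistics import quantiles

pre_exam4 = [45, 52, 58, 63, 68, 72, 75, 80, 84, 90]
post_exam4 = [55, 60, 66, 70, 74, 78, 81, 85, 88, 93]

for label, scores in (("pre-redesign", pre_exam4), ("post-redesign", post_exam4)):
    q1, median, q3 = quantiles(scores, n=4)  # quartile cut points
    print(f"{label}: Q1={q1:.1f}, median={median:.1f}, Q3={q3:.1f}")
```

A rise in Q1 from pre to post would reflect the reported pattern: the bottom quartile of students gained the most.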

The analyses presented thus far use only data from the first semester of teaching (fall semester) for the novice GTAs who typically teach the class. During the spring semester, however, I teach one section, so I would like to bring in some data from my own classes pre- and post-redesign. I have been teaching statistics for over 20 years; my experience is essentially constant from semester to semester, and I teach the class in a consistent way each time. Even among my students, who tend to do somewhat better than those taught by the novice GTAs, the effects of the redesigned homework and quizzes on exam scores are impressive. These data show the boost that results from having students genuinely practice what you want them to know and do. As can be seen in the graph below, exams one and two both show a ceiling effect, but there are much larger gains on exams three and four when the redesigned materials were implemented.

As noted previously, redesign efforts in PSYC 210 began in earnest in Spring 2015 with the implementation of a fully hybrid class format, a consistent curriculum across sections, and a teaching development program for the GTAs who teach the class. Prior to this time, DFW rates were 22%, although there was also improvement in the DFW rate during Fall 2014 when we briefly implemented a flipped format and added online content for students to access prior to coming to class. Since the redesign efforts began, DFW rates have continuously decreased. With the redesigned homework and pre-class quizzes that were detailed in this portfolio, DFW rates decreased from 14% to 10%. In essence, with each change we have implemented over the last five years, we have seen a 4% drop in the DFW rate. We currently enroll around 250 students in this class each semester. Decreasing the DFW rate from 22% to 10% amounts to an additional 30 students passing the class each semester. See graph below.
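The enrollment arithmetic above is easy to verify; the figures are the ones reported in the text:

```python
# Back-of-the-envelope check of the enrollment impact described above,
# using the figures reported in the text.
enrollment = 250   # approximate students enrolled per semester
dfw_before = 0.22  # DFW rate before the redesign efforts began
dfw_after = 0.10   # DFW rate after the latest redesign

additional_passing = enrollment * (dfw_before - dfw_after)
print(f"Additional students passing each semester: {additional_passing:.0f}")  # prints 30
```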

Through this course redesign, I am reminded that we can make large differences in student learning and performance by approaching teaching and course design iteratively. We successfully increased student performance on exams and have continued to lower the PSYC 210 DFW rate, a change that is very meaningful when you consider its impact on the learning and educational experience of our students. Moreover, we are seeing the benefits of these redesign efforts even when novice graduate students are teaching most of our students. It is critical to acknowledge that the success of this redesign process cannot be attributed to having extremely skilled faculty members with extensive training in teaching pedagogy. Rather, these benefits were achieved with graduate student instructors who are brand new to the classroom and who must divide their time between teaching, research, coursework, and myriad other obligations. I believe this speaks highly to the sustainability of these efforts.

The redesign of this class and the analyses included here highlight the impact on the undergraduate students who take PSYC 210. However, it is important to note that the successes we are seeing with this course can also be attributed to the graduate student instructor training program that we have implemented in our department. In this program, our novice instructors work on a team led by me, an experienced faculty member. Graduate students are typically passionate about teaching and have a lot of energy and willingness to try new things, which is essential for course redesign efforts in which aspects of the course are constantly being re-worked to improve student performance. Being part of these efforts is valuable experience for our GTAs; they will leave KU with teaching experience that will help them in their professional careers. We are not only teaching them how to teach effectively and giving them examples of course materials that produce results in student learning; we are also showing them that there are always meaningful tweaks we can make to our courses, and that we can use a scientific approach when implementing and evaluating those changes.
