Out of the Comfort Zone and Into an Engaging Experience: Flipping a Course to Improve Students’ Critical Thinking
A professor flips a large, lecture-based environmental studies class, asking students to learn the fundamental terms and concepts outside the classroom and reserving class time for interactive learning exercises.
—Kathleen Nuckolls (2014)
Portfolio Overview
Scientific Principles of Environmental Studies (EVRN 148) is an introductory environmental science course for non-majors. When I first began teaching this course in 2008, I taught with a more traditional class format of lectures and exams, but I found that students relied heavily on rote memorization and therefore could not extend or apply course content. In order to help students acquire deeper critical thinking skills and better achieve learning outcomes, I decided to use a more student-centered course structure. With these goals in mind, I began to “flip” my course in the fall of 2014 – asking students to learn the fundamental terms and concepts outside the classroom and spending class time on active learning exercises.
I used a variety of activities throughout the semester. In most classes, I used two-minute papers or think-pair-share activities, with the occasional worksheet used to emphasize big ideas from the class period. I also had a lot of success using Sli.do, a live, interactive tool that allows me to ask questions in class and do real-time polling. I extended each of these activities with whole-class discussions. For most activities, I chose concepts that are difficult for students to understand or those where students generally hold on to substantial misconceptions. I found these activities to be much more effective than lecturing over the same material, and was pleased to find that the class would sometimes think of novel answers and approaches that went beyond my expectations.
In order to assess student learning and long-term retention from the in-class activities, I compared student answers on specific exam questions from fall 2013 (pre-intervention) to fall 2014 (post-intervention). I also compared exam means and final course grades. While the exam performance showed slight improvement, the final course grades dropped slightly from 2013 to 2014, with the mean course score dropping from 86.13 to 79.38. This change was likely due to a change in the number of items used to calculate final grades. Although the exams were worth the same number of points each year, in 2014 I reduced the number of points available for the online homework and replaced those points with credit for completing the in-class activities. Students receiving D’s and F’s in 2014 were disengaged from most of the course, participating in only 40-45% of class activities, and scoring poorly on exams. In contrast, A and B students completed over 95% of the homework and 87% or more of the in-class activities. In future iterations of the course, I hope to increase student engagement by spending more class time on active learning projects, increasing peer-to-peer instruction, and helping students to discover the relevance of course topics to their own areas of interest.
I found the change I made to be very rewarding and much easier to implement than I had expected. I enjoyed the more spontaneous feel of the class sessions, and I looked forward to the energy and insights that students brought to the active learning exercises. One of my biggest challenges turned out to be pacing. It was very difficult for me to predict how long each activity would take, and I almost always underestimated the amount of time that even the simplest activity would require. Overall, I was able to identify several areas that I hope to improve for future iterations of the course. For example, I plan to expand active learning opportunities and drastically reduce or eliminate the amount of time spent lecturing, reserving this time to address problem areas, target difficult concepts, and help students place ideas and activities into the larger context of the course.
Scientific Principles of Environmental Studies EVRN 148 (pdf) is an introductory environmental science course for non-majors, and it usually draws around 100 students from diverse disciplines. Approximately two-thirds of the students are upperclassmen from the Engineering or Architecture programs. The remainder of the students are generally freshmen or sophomores who are taking the course to fulfill a natural science requirement. One of the most obvious challenges, therefore, is to keep these diverse students engaged with course material that is outside of their respective majors.
Despite their diverse backgrounds, most of my students are genuinely concerned about environmental issues and are looking for concrete solutions to environmental problems. Students are dismayed to find that definitive answers to complex problems are a rarity, and even the issues themselves are constantly evolving. My goal for this course is not just to inform my students about the current state of the environment, but also to teach them the fundamentals of environmental science and provide them with the skills they need to investigate and critically evaluate information in an ever-changing environment.
When I first began teaching this course in 2008, I used a more “textbook centered” approach, initially constructing my lectures to follow the chapters of the text rather than using concrete objectives to guide my use of instructional time. As a result, I found that students relied heavily on rote memorization and were unable to extend or apply course content. However, it has always been a larger goal of mine, even within this more traditional structure, to help students move away from rote memorization and toward applying course information to new situations.
In order to help students reach this higher level of understanding, I realized that I needed to significantly re-think my course structure from the ground up. I began with the course objectives, and I refocused the course to emphasize these critical thinking skills. My revised objectives expect students to:
- Understand the scientific concepts and methods that underlie environmental studies.
- Understand the way that human choices and natural processes combine to create current environmental conditions.
- Critically evaluate and synthesize new information and relate that information to course concepts and to novel situations.
- Identify and investigate important environmental issues and assess potential solutions to environmental problems.
In order to help students best acquire these skills, I recognized that significant changes in both in-class and out-of-class expectations were necessary. I decided to use a more student-centered course structure to target these learning outcomes and to enhance student engagement with the course. I also hoped that asking students to complete more team-based in-class activities would encourage more peer-teaching among students from diverse academic backgrounds.
With these goals in mind, I began to “flip” my course in the fall of 2014. This required asking students to learn the fundamental terms and concepts outside the classroom and spending class time on active learning exercises that would enhance students’ understanding of difficult concepts and provide opportunities to practice critical thinking skills. I replaced portions of my lectures with in-class exercises that would give students ample practice applying, evaluating, and synthesizing material, and which would require students to have read the text and mastered basic vocabulary and concepts prior to class.
Mindful of colleagues’ advice to flip courses gradually, I decided to retain the basic structure of my existing lectures and intersperse a few active learning exercises into each class session. In addition to the added activities, I continued to provide PowerPoint slides and lecture note handouts (pdf) to supplement my oral coverage of the material. Because I was not certain which activities would best accomplish my goals, I used a variety of activities throughout the semester. In most classes, I used two-minute papers or think-pair-share exercises, with the occasional worksheet (pdf) to emphasize big ideas from the class period. I also had a lot of success using Sli.do, a live, interactive tool that allows me to ask questions in class and do real-time polling. I extended each of these activities with whole-class discussions. Although I imagine these activities would be easier to implement with a smaller group of students, I was pleasantly surprised at how much discussion they generated in a 100-seat classroom. I found these activities to be much more effective than lecturing over the same material, and was pleased to find that the class would sometimes think of novel answers and approaches that went beyond my expectations. (Doug Ward wrote about one such exercise in Vol. 8, no. 1 of Teaching Matters.)
For most activities, I chose concepts that are difficult for students to understand or those where students generally hold on to substantial misconceptions. There are a number of well-known misconceptions in the physical and biological sciences that were obvious choices for in-class activities and discussions. For example, students often struggle to understand the difference between evolutionary adaptation and short-term acclimation. On several past exams, I had asked a question that required students to apply their knowledge of these concepts to an example involving dogs, and I have been consistently surprised by how poorly students do on this question. It was relatively simple to construct an activity that would help students discover and correct this misconception.
I decided to do a short exercise allowing students to explore these two concepts using a very different example, hoping to test whether practicing this application skill would allow students to transfer their knowledge. The example I chose was an unusual one: I showed the class a YouTube video of “Snowball the dancing cockatoo.” This video may not be the best illustration of adaptation and acclimation, but it was quite memorable to students, included some interesting behaviors, and worked very well for our purposes. After the students watched the video, I asked them to work together to decide whether the observed behaviors represented an adaptation or acclimation, and then to construct an argument defending their position. Although most students appeared completely mystified by the entire exercise, they gamely attempted to answer the questions. I then asked for volunteers to share their ideas and opened the question up to a larger discussion. Their curiosity was piqued, and I had no problem maintaining a lively discussion, even with 100 students. We were able to discuss the question in enough depth to cover several related topics as well (e.g., Can a behavior be an adaptation? What is required to determine whether something is an adaptation? Are behaviors genetically determined?).
An additional type of activity came directly from my lecture notes. In almost every lecture, I illustrate important points using stories and examples to highlight the connections among core concepts and to model problem solving and prediction. It was very simple to turn these examples into short think-pair-share exercises by making the stories interactive. For example, instead of simply listing the energy transitions involved in the production of chocolate, I gave a few examples, modeling the exercise; then I allowed students to practice this thought process themselves, listing as many energy transitions as they could in collaborative small groups. We finished the exercise with a whole-class discussion. It was gratifying, and a little embarrassing, to realize how much more the students learned on their own or working with peers than they had listening to my examples. In some cases, I even found that entire lectures could be converted into worksheets, and that students not only enjoyed working on these but also retained the material much better than they had when I had merely lectured over it.
As I spent more class time on interactive activities, I didn’t have to make many changes to the out-of-class work. This is largely because I had already begun using online adaptive learning modules for homework. Students who used the materials generally learned the basic terms and concepts quite well, but I found that I had to offer a fairly large number of points for each module to motivate students to complete them. Since all students should be able to earn 100% on these modules, this policy resulted in considerable grade inflation, a situation that I was eager to remedy. Under the new course format, students would need to complete the online modules in order to fully participate in the in-class activities, so I decided to reduce the number of points each module was worth and replace those points with scores for in-class exercises and discussions. I hoped that this would solve the grade-inflation problem while simultaneously improving student learning.
Lastly, in fall 2014 (pdf) I made alterations to the course exams (pdf). I wanted to keep much of the exam similar to past exams to facilitate pre- and post-intervention comparisons, but I also felt it was important to add a few open-ended, short-answer questions to each exam. I feel that it is easier to discover exactly what students do and do not understand from open-ended questions, but I had previously been limited in my ability to add these questions due to the large size of the class. With the addition of a teaching assistant who could assist with grading, I was able to add a few of these questions to each exam.
In order to assess student learning and long-term retention from the in-class activities, I compared student answers on specific exam questions from fall 2013 (pre-intervention) to fall 2014 (post-intervention). For example, on the first exam, I included the question stem mentioned above about acclimation and adaptation in dogs. However, because I wanted more insight into students’ thought processes, I changed the question from multiple-choice to short answer. I was gratified by how much better the students were able to answer this question than they had in previous years, even though the question stem was very different from the example we had discussed and the questions I had asked in class.
Although the questions were in different formats over the two years, it is still possible to make some comparisons between the results from the fall 2013 and fall 2014 classes. Because multiple-choice questions are either right or wrong, I felt that the most conservative test would be to compare the proportion of students who correctly answered the multiple-choice question in 2013 with the proportion who received full credit for their response to the short-answer question in 2014.
The data suggest that the interventions I made in the fall of 2014 increased the proportion of students answering the question correctly, from 33.71% in 2013 to 49.48% in 2014. This is a conservative comparison, since it excludes students whose short-answer responses were nearly correct (an additional 14.4% of students received 4.5 out of 5 points, and 16.5% received 4 out of 5). Even when considering only the perfect scores, the difference between 2013 and 2014 is statistically significant (p = 0.015). There was also a slight trend toward higher scores for students who were present during the cockatoo activity, but the sample size for absentees was low and the trend was not statistically significant.
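For anyone who wants to run this kind of comparison, the minimal sketch below shows one common way to test for a difference between two proportions. Since the underlying student counts and the exact test are not reported above, the two-proportion z-test and the counts in the sketch (chosen only to match the reported percentages) are illustrative assumptions; the computed p-value will match the reported 0.015 only if those assumptions match the actual analysis.

```python
# Minimal sketch of a two-proportion comparison like the 2013 vs. 2014 one
# above. The counts below are hypothetical values chosen only to reproduce
# the reported percentages (33.71% and 49.48%); the actual class sizes and
# the test behind the reported p = 0.015 are not stated above.
from statsmodels.stats.proportion import proportions_ztest

full_credit = [30, 48]   # hypothetical counts of fully correct answers (2013, 2014)
test_takers = [89, 97]   # hypothetical numbers of students taking the exam

z_stat, p_value = proportions_ztest(full_credit, test_takers)
print(f"2013: {full_credit[0] / test_takers[0]:.2%} correct")
print(f"2014: {full_credit[1] / test_takers[1]:.2%} correct")
print(f"two-sided z-test: z = {z_stat:.2f}, p = {p_value:.3f}")
```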
The overall exam means were remarkably similar in 2013 and 2014 (averaged across the three exams, the overall mean was 72.76% in 2013 and 74.84% in 2014). However, there was a slight shift in the types of questions missed and in the overall distributions of the grades. The 2014 students scored better on the more difficult application and synthesis questions, but missed some easier terminology questions that most students had gotten correct the previous year. This was partially due to a faulty assessment strategy in 2014. Because I wanted to compare exam performance across years, I kept most of the exam the same from 2013 to 2014. However, the in-class activities placed a large emphasis on some portions of the material while de-emphasizing others. This likely influenced the students’ study habits, causing them to skim material that previous classes had spent more time studying. This was not an entirely undesirable result, however, as students appeared to be spending less time memorizing less important key words and more time deepening their understanding of core concepts. In the future, I will change my assessment strategy so that it better reflects students’ progress toward the course objectives.
It was also interesting to note that, although the overall mean was very similar in 2014, the distribution of grades was noticeably different. The grade distribution was fairly flat in 2013, with 13% of the students earning an A and 20-25% of students falling into each of the remaining grade categories. The distribution in 2014 more closely resembled a normal distribution; the number of students receiving either A’s or F’s dropped closer to 10%, and the majority of students fell in the middle with B’s, C’s, or D’s. The number of D’s dropped slightly, from 23% to 20%, and, most strikingly, the number of F’s dropped from 20% in 2013 to 12% in 2014.
Total number of students receiving each grade, summed across the three exams
Despite the improved exam performance, the final course grades dropped from 2013 to 2014, with the mean course score falling from 86.13 to 79.38. As was seen with exam scores, the 2014 grades more closely approximated a bell-shaped curve, with a peak in the “B” range. This change was likely due to a change in how points were allocated in the final grade. Although the exams were worth the same number of points each year, in 2014 I reduced the number of points available for the online homework and replaced those points with credit for completing the in-class activities.
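Some back-of-the-envelope arithmetic shows how this kind of reallocation can lower the mean course score even when exam performance holds steady: points that nearly everyone earned (homework) were exchanged for points tied to attendance and participation, which varied much more. The weights and class averages in the sketch below are invented for illustration; the actual point breakdown for the course is not given above.

```python
# Hypothetical illustration of how moving points from near-universal homework
# credit to participation credit lowers the mean course score. All weights and
# class-average scores below are invented; they are not the actual breakdown.
exam_avg = 0.74       # roughly in line with the exam means reported above
homework_avg = 0.99   # adaptive modules: nearly everyone earns full credit
activity_avg = 0.70   # in-class participation varies far more

mean_2013_style = 0.70 * exam_avg + 0.30 * homework_avg
mean_2014_style = 0.70 * exam_avg + 0.10 * homework_avg + 0.20 * activity_avg

print(f"homework-heavy weighting: {mean_2013_style:.1%}")       # 81.5%
print(f"participation-heavy weighting: {mean_2014_style:.1%}")  # 75.7%
```

Under these made-up numbers, the mean drops by roughly six points, the same direction and rough magnitude as the reported drop from 86.13 to 79.38.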
There was also an increase in D’s and F’s in 2014. Although this is concerning, I feel that the grades more accurately reflect the amount of effort that the students put into their learning, as well as the amount of course material they were able to master. In 2013, students could get by with lackluster exam performance by completing all the assigned homework and using those points to offset their exam scores. In contrast, students in 2014 had to attend class and participate regularly, as well as finish homework assignments, in order to earn points outside of exams.
Students receiving D’s and F’s in 2014 were disengaged from most of the course, participating in only 40-45% of class activities and scoring poorly on exams. Students receiving D’s put slightly more effort into the course, completing about twice as many homework assignments and scoring slightly better on the exams than the students who earned F’s. In contrast, A and B students completed over 95% of the homework and 87% or more of the in-class activities. In future iterations of the course, I hope to increase student engagement by spending more class time on active learning projects, increasing peer-to-peer instruction, and helping students to discover the relevance of course topics to their own areas of interest. In addition, I will add assessment strategies that better measure progress toward course objectives and give students more avenues to demonstrate their learning.
Final posted grade distributions in 2013 and 2014
Although I was only able to flip portions of each class session during this first semester, I found the change to be very rewarding and much easier to implement than I had expected. I enjoyed the more spontaneous feel of the class sessions, and I looked forward to the energy and insights that students brought to the active learning exercises. One of my biggest challenges turned out to be pacing. It was very difficult for me to predict how long each activity would take, and I almost always underestimated the amount of time that even the simplest activity would require. As the course progressed, I found myself increasingly behind schedule, and I had to push back dates several times. I also found myself rushing through material as we got closer to each exam to be sure that we had covered enough for the test. Because I was concerned that some material would be glossed over or missed entirely, I posted my traditional lecture notes to the Blackboard site for students to review. I’m not certain how many students took advantage of the posted notes, but their availability seemed to calm any fears that the students may have had about studying for exams.
I tried to be very transparent about my teaching goals, because I was concerned that the frequent changes to the schedule and the adjustment to more activities would cause students to lose patience with the course. In order to give my students more insight into the course design, I explained both the experimental nature of the course and the reasons that I thought the new structure would benefit them. I also made it very clear that students would not be penalized because of any schedule or format changes, and I was proactive in seeking out and incorporating student feedback during the semester. Contrary to my initial expectations, I was pleased to find that the students embraced the experimental nature of the course and demonstrated very positive attitudes, even in the face of the inevitable misfires and apparent disorganization that occur with this type of redesign. Students expressed enthusiasm about helping to improve the course, felt more in control of their learning experience, and genuinely appeared to enjoy class sessions and the active learning exercises. This was borne out by a low course dropout rate and high reported levels of student satisfaction on the end-of-semester course surveys. I found the course evaluations particularly surprising, since I had been warned to expect a large drop in evaluation scores for the first semester of flipping a course.
Nevertheless, I was able to identify several areas that I hope to improve for future iterations of the course. Although I still feel a certain amount of trepidation about fully abandoning a lecture format, I plan to expand active learning opportunities and drastically reduce or eliminate the amount of time spent lecturing. In 2014, I made a good start at addressing my revised, skills-based learning objectives, and I plan to further expand the in-class activities that give students practice acquiring these skills and that increase student engagement. With improved support from teaching assistants, I plan to increase the number and diversity of short in-class writing assignments and to improve the feedback on those assignments. I have also applied to have writing fellows assist with the course, so that I can assign one or two longer writing assignments that students will have the opportunity to revise. These assignments will allow students to research an environmental issue of their choice, critically evaluate possible solutions, and bridge the gap between the subject matter and their own disciplines. I will expect students to discuss their research and assess different solutions in a written report. In addition, I plan to have students summarize their findings in a short online presentation that will be made available to the entire class.
Lecture time will be further restricted and will only be used to address problem areas that are uncovered in students’ homework performance, target difficult concepts, and help students place ideas and activities into the larger context of the course. I would also like to record these mini-lectures and make them available online for students to review. Of course, replacing the majority of lecture time with more time-consuming hands-on activities will necessitate a reduction in the number of topics that I am able to cover, but I believe that this reduction will better reflect my overall philosophy for this course. I feel that it is less important that students memorize the exact facts and figures related to current environmental problems, and more important to help them acquire the skills that they will need to navigate the ever-changing landscape of environmental issues, potential solutions, and emerging technologies. A thoughtful selection of content and activities will give my students the foundations and skills that they need to grapple with complex environmental issues.
Finally, I would like to spend more time looking through all the data I have acquired from student exams. Student responses to test questions can provide tremendous insight into the threshold concepts in the field; as I identify more of the questions students frequently miss on the exams, I can better understand the larger ideas with which they struggle. I would like to spend more time uncovering these fundamental misunderstandings and designing specific activities to address them.