Grading Notes
1. PowerPoint Slides. During the student design presentations, I evaluated the students' visual
aids in several ways. First, I assessed whether the PowerPoint slides were
easy to read and understand. For example, in 2006 there were several instances
when student slides were simply screen captures from the roadway design program
used to design their highway. These typically were poor visual aids, as
they were oftentimes covered with obscuring lines or had so much detail that
the important information on the slide was lost amid the "background
noise." Also, students tended to place so much text placed on a
slide that it was hard to read (due to the small font size). In these instances, slides did not
perform their intended purpose of effectively conveying information.
In 2007, I presented students with examples of these poor slides and then showed them
higher-quality slides that conveyed only the necessary information in a manner
the audience could read and understand. That year, there were very few instances
where poor-quality slides were used in the final presentation, reflecting
improvement from the previous year.
Second, I evaluated each group's actual use of their
slides. Did they ignore the slides and proceed with their talk in a
different direction from the slides? Did they rely on their slides so
much that it appeared to be a crutch without which they could not have
continued? Or were the slides simply supporting tools that meshed well
with what they were actually saying? For the most part, the groups did a
good job of integrating their slides with their spoken remarks, so overall I
had few concerns in this regard.
Third, I evaluated whether the slides were interesting in
and of themselves. While I did not deduct points from teams whose slides
were easy to understand but otherwise "boring," students were
much more likely to get full credit if their visual material included
interesting and pertinent pictures, graphs, etc. that made the information more
visually appealing.
2. Presentation Skills. I used several criteria to evaluate student performance during
the actual presentation. First, I assessed the ability of each team member
to discuss the technical aspects of their portion of the design. Students
who efficiently discussed their portion of the design task, how they
approached the problem, and their final recommendations were likely to receive
most or all of the presentation points.
Second, an important criterion was how the team answered questions
from the panel. Was one person dominating
the responses to the exclusion of the other team members? Was there a team
member who appeared shy, practically hiding behind his colleagues? There
were several instances in 2006 of students dominating the question-and-answer
portion of the presentation, and fewer such instances in 2007.
Third, I also evaluated the basics of an oral presentation.
Was the speaker loud enough to be heard? Was there a
reasonable amount of eye contact with the review panel? Did the speaker
appear knowledgeable about the subject matter, or did they seem
unprepared?