Could Better Teaching Be Partly Responsible for Grade Inflation?

How do you know one teaching method is better than another?

How do you know when you’ve successfully developed more effective ways for students to learn the material in your courses?

If students are learning better, does that mean that they are earning better grades?

[Image from: “The Year in Education: Seven Innovations Changing the Way the World Learns”]

How do we tell better learning from grade inflation?

I wish I had some answers to these questions. I don’t.

In evaluating teacher effectiveness and quality, we tend to rely on course evaluation questionnaires. These questionnaires are a complex measure of teacher effectiveness, teacher enthusiasm and empathy, teacher reputation, students’ satisfaction with their grades, students’ preconceptions about the material covered in the course, and a host of other intangibles. At my college, and likely many others, we find ourselves grappling with the meaning of our student evaluation results as we decide whether or not someone should get tenure, keep a job, or get a pay raise. The stakes are super high. (No, this isn’t a discussion of course evaluations!)

To measure student mastery and learning, we look at grades as the major outcome assessment. My suspicion is that other types of outcome assessment would correlate pretty strongly with grades. But we are also concerned about grade inflation: that perhaps the grades don’t really reflect mastery of the course material. We wonder if the pressure to score well on our course evaluations makes us, as faculty, afraid to give grades lower than a B. We wonder if pressure from students telling us they’ll never get into med school or land that job if they don’t earn A’s is making us grade more generously.

One thing I don’t hear us talking about, though, is whether perhaps we are teaching better, leading our students to earn those better grades. There are abundant resources, particularly online, on strategies to help students learn: hands-on experiences; a focus on writing in most of our courses; more engaging technologies like course websites, online interactive textbooks, colorful slides, and videos; and students as peer teachers, working in small groups to approach core concepts more actively.

I’ve found that the first assignment I give to my classes reveals that my students seem to have the same types of difficulties and misconceptions as ever at the beginning of a course. But over the course of the semester, many transform their thinking about the course material. Their science writing improves, their use of scientific language becomes more sophisticated, and their ability to understand the primary experimental literature grows. In short, they learn and master the material for the course. Shouldn’t their grades reflect this mastery?

If I’m doing a better job teaching my students to think critically, to use the language of science in their writing and speaking, and to master the core concepts of the discipline, shouldn’t the evidence of that good teaching be seen in the grades my students earn?

Which leads me back ’round to: How do we tell grade inflation from better learning?


2 thoughts on “Could Better Teaching Be Partly Responsible for Grade Inflation?”

  1. Thank you! This issue of whether or not we are teaching better really should be central when we discuss grade inflation.

  2. Some disciplines have “standardized tests” that can be given to your students pre- and post-instruction to measure their learning gains more objectively. For example, physics has the Force Concept Inventory. I use assessments like this in many semesters so that I can see whether “better” or “worse” classes are really learning more or less, or whether it’s just that my tests (or other circumstances) varied significantly. You could compare the results of such testing to your course grades.
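
[Editor’s note: the pre/post comparison the commenter describes is often summarized with Hake’s normalized gain, (post − pre) / (max − pre) — the fraction of possible improvement a student actually realized. Below is a minimal Python sketch of comparing those gains against course grades; all scores and grades are made-up illustrative numbers, not data from the post.]

```python
# Sketch: Hake's normalized gain for pre/post concept-inventory scores,
# and a simple correlation of those gains with course grades.
# All numbers below are hypothetical, for illustration only.

def normalized_gain(pre, post, max_score=100.0):
    """Fraction of the available improvement realized between pre and post."""
    return (post - pre) / (max_score - pre)

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient (no external libraries)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical concept-inventory scores (percent) and course grades (GPA).
pre_scores  = [30, 45, 50, 25, 60]
post_scores = [60, 70, 85, 55, 80]
grades      = [2.7, 3.0, 3.7, 2.3, 3.3]

gains = [normalized_gain(p, q) for p, q in zip(pre_scores, post_scores)]
print("normalized gains:", [round(g, 2) for g in gains])
print("gain-grade correlation:", round(pearson_r(gains, grades), 2))
```

A strong gain–grade correlation would suggest grades track measured learning; high grades alongside flat gains would look more like inflation.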

Comments are closed.