Time to Play the Game


I Know Why College Grades Are Going Up. It’s Definitely Not Wokeism. (The New Republic)

While I was never one of those kids who took apart my toys and electronics when I was younger so I could try to figure out how they worked, I definitely enjoyed more than my fair share of puzzles back in the day. Part of the reason I latched onto computer programming when I was allowed on my family’s ZX-80 is that it let me do these amazing things on a television screen, and see how the commands I gave the computer would change what I saw in front of me. Even my five-year-old brain could understand that this was kind of unprecedented for the time: I had seen just enough television shows by then to make me really want to be “on television,” and early computers like the ZX-80 and TRS-80 Color Computer let me put at least some facsimile of myself on a television screen. Thinking about how to go beyond the first “Hello World” and craps game programming I did made me eager to keep figuring out how these computers worked, and what skill I could learn next to do something even cooler on the screen the next time I tried programming.

The downside to this, of course, is that it can breed a tendency to see things only as their components and how they work together, and to lose the ability to shut off that analytical part of the brain so you can appreciate the whole for the breadth of its merits. This is often most visible in things like media criticism, but it can apply to other areas of life as well. Instead of experiencing something on its own terms, you start to see it as just so many components, a number of things to be understood that may add up to something larger in the aggregate, but that larger whole doesn’t concern you, because you’re always looking at how things work and never appreciating how things are.

Academic classes are much the same way. Even when I was still an undergrad, I noticed how many students around me were basically treating each of their classes as a puzzle to be solved, a game to be played, where they could reduce everything down to a checklist of “things I need to do to get the grade I want in this course,” whether that grade was an A or a C or a D, accomplish those things in order, and otherwise be mentally and emotionally absent from the class, forgetting anything that was learned while completing those tasks to make mental room for the new stuff to learn while working on the next checklist. It’s not something I want to criticize — especially on the other side of the classroom now, it’s easy for me to see the complex and horrific lives my students are leading, and to understand why they simply lack the time and/or energy to engage with their classes as much as I engaged with mine — but it’s also something that I could never see myself doing as a student. Especially after the difficulties I had earlier in my schooling, and after I regained my love of learning once I got to college, I can’t imagine not throwing myself headlong into any class I might take in the future, even if the subject matter doesn’t interest me, just so I can make the most of the opportunities that class provides.

I’ve always hypothesized that the rise of video games is a big culprit in this phenomenon, and in why an increasing percentage of students seem to be “gaming the system” year over year. It’s not just that video games, and the systems they’re on, have become more explicit about the steps necessary to get those all-important achievements, to get the best-looking skin for your character or a platinum trophy or what have you, so gamers can treat the game as its own series of tasks to be completed. As more classes (in college and elsewhere) began utilizing course management software like Blackboard and Canvas, it became much easier for students to conceptualize each of their classes as a series of tasks to be done, rather than as an organic whole. These structures were already in place long before classes went online, of course, but the nature of course management software makes them far more evident to students: Do these things, and all the percentages you get on them will add up to your final grade at the end of the term.

When we were all forced online at the start of the COVID-19 pandemic, of course, that accelerated this phenomenon to a whole new degree. Especially as young people dealt with the stresses of a world that was seemingly crumbling around them, and they lacked the coping mechanisms that older people had been given the time to develop, it was far too easy for them to stop thinking about taking a class as an organic learning experience, a chance to better themselves in ways both intended and unintended by the instructor, and instead to just see a bunch of tasks on their online to-do lists in the course management software, get them done, and then move on to all the other things that were on their minds, most of which they experienced as (and probably were) a lot more important to them at a time of such deep crisis.

I’ve been pondering this problem for years, since long before the start of the pandemic, and I’ve always tried to focus on the issue of motivation. Maybe because of my own past as a student, I’ve tried to figure out how to cultivate in my students not necessarily a love for learning, but an understanding that learning is about more than just accomplishing a series of tasks your instructor gives you, and that looking at learning as a whole can help you get more out of it, especially when it comes to learning the things you actually want to learn, which might not be what your instructor planned the course around. That New Republic piece is making me think about how this gaming-out of classes that so many students are doing may be playing a part in grading, and in how grades are perceived.

On the one hand, this ties directly back into issues of what grades measure, and how much those grades should be determined by “book learning” and quizzes and tests and the like, and how much “real-world applications” and things like practical projects should play a role in the final grades that students get. Those projects, especially when graded subjectively (and without an explicit rubric that students can game out), often do a lot to help students get out of that checklist mindset. At the same time, though, given that complaints of grade inflation almost always come from the same forces that want to turn colleges and universities into glorified trade schools with their own sports teams, it’s easy to dismiss accusations of grade inflation as another form of vapid crankiness from the same people who have been railing against higher education since the sixties.

I can safely say that I have never consciously engaged in any kind of grade inflation in the eighteen and a half years I’ve been teaching, and I believe that the institutions at which I’ve taught have rigorous evaluation systems that would sniff out any grade inflation I was engaging in if I were somehow doing it unconsciously. This isn’t to say that grade inflation doesn’t exist, or that it isn’t a problem, but the people being allowed to define “grade inflation” in popular discourse are often using it as a bludgeon to beat colleges and universities into submission to their will, in the same way that they’ve used terms from “the New Left” to “woke-ism” to try to force their ideologies into even more of our lives. We absolutely need to be talking about these issues, but just as we don’t want our students to see our classes as just a checklist of tasks to complete, we can’t let ourselves conceptualize arguments about grade inflation as just a series of points to refute. This debate sits inside a larger, organic whole, a broader conflict over the purpose and practice of colleges and universities, and we need to prioritize that whole if we don’t want our arguments about grade inflation to be forgotten as soon as we get them done.