ChatGPT Exposes the Dubious Economics of Learning Systems
Tom Haymes
—
ChatGPT challenges the systems of industrial education by undermining the accepted economics of learning. I’m not talking about whether college is worth it, but about how we exchange value for effort at all levels of our educational systems. In Learn at Your Own Risk, I describe this as “transactional teaching,” but its impact goes far beyond any specific interactions between student and teacher.
In brief, transactional teaching is the idea that students exchange work for a grade. Grades lead to degrees and certifications, but none of this shows the true value (or lack thereof) of what the student takes home.
Transactional teaching cheapens education. It exchanges valueless currency for meaningless experiences. ChatGPT exposes this reality because it threatens to provide students with a means of exchange potentially as valueless as the grades they receive in return.
Transactional education is susceptible to the same kinds of theft and fraud that occur in any economic system. Transparency of transactions is the only remedy to illicit activity. Most educational transactions, as well as their underlying logic, are far from transparent.
Defense is no answer here either. Efforts to crack down on and centralize an economic system produce the same outcome they do anywhere else: a black market.
ChatGPT is not the first fake ID to emerge in the educational landscape. It is merely the most elaborate of them. As I pointed out in a recent blog post, the Internet threatens the logic of a transactional educational system. It significantly expands the resources available to students as they navigate the game that has been set up for them.
Until now, students have been wealthy in information but poor in the application of that information. ChatGPT reduces that poverty of application to the point where, under traditional assessment methods, it is much harder to perceive.
Like healthcare, the economics of education have never made sense because we do such a poor job valuing the intangibles of what it means to get a college education. Completion is an easy metric and grades are the building blocks of completion in the current system.
We now have much better technology for communicating achievement than these crude metrics. These tools make possible new ways of documenting learning that are far richer, and harder to falsify, than grades or any other unidimensional measure.
However, we can’t simply ignore the extensive systems and cultural practices we have built around anachronistic assessment methods. Most faculty are not trained in anything beyond summative assessment based on tests and essays, which alone is a huge barrier to pivoting quickly to richer approaches. On top of that, a vast credentialing network depends on grade-based course outcomes.
Academic freedom has turned most classes into what are essentially black boxes that simply spit out a grade at the end of the process. There are exceptions, but most classes work this way, mine included.
I have used this freedom in my class to upend notions of grading. I am not naïve about how well this works. Swimming against the cultural systems of grading and “achievement” makes it hard for students to wrap their heads around different approaches to assessment.
I have considered carefully how ChatGPT might enter the workflow of my class. I am less interested in how well my students write than in how writing disciplines their minds to break down problems and analyze them. It helps to have a “student” who is worse at this process than they are. ChatGPT provides an infinite variety of poor students for my live students.
My approach to teaching is unusual among my colleagues. Those who engage in transactional teaching often build walls around eroding kingdoms of practice. I still see courses in our faculty development portal on Respondus LockDown Browser and other “defensive” tactics designed to preserve meaningless and outdated assessment practices.
However, it’s the institutions themselves that put pressure on already overburdened faculty to stay the course. The ultimate metric for a class is a “grade” and this is true even in my class.
This reality perverts the focus of learning in my class, and it is something I cannot get around. I have spent countless hours trying to game out how to pull my students’ focus away from these systemic factors, but it’s really tough.
Most faculty have neither the time nor the inclination to engage in similar reflections and, ultimately, my quest may be quixotic. Institutions need to create pathways that lead to non-graded outcomes if we want to get away from transactional teaching. It’s not fair to put this burden on the shoulders of faculty alone.
ChatGPT is the product of a collectivization of learning. It skims vast amounts of data and mashes it all together to create its outputs. That’s essentially what we ask our students to do when we assign them generic research papers. It should come as no surprise that this unimaginative process is easy to automate.
The solution to this is to value individual learning over conformity. We should encourage students to apply their uniqueness to their learning products and journeys. ChatGPT fails miserably when we ask it to do this, for it is not human. AI can only hoodwink us if we lose sight of the human in the learning process. Grades are a way of automating humans.
There are many ways that institutions could devalue grades in their internal processes, but this involves embracing individualistic learning and the enabling technology that allows us to scale that to a viable level. These systems need to be built and implemented.
Institutions have a responsibility to both faculty and students to train faculty to think differently about how they structure their classes. This is not hard from a content perspective; much of it is merely common sense. However, common sense is often difficult to implement, especially in the face of cultural and systemic barriers.
The character of this training is just as important as the techniques being taught. We need to get away from increasingly futile defensive tactics and reimagine the kingdom. We need to create a culture of responsive teaching, not one of reactionary teaching. This will involve some tough conversations.
Throughout history, but particularly in the last century, technology has challenged humanity’s capacity for adaptation. For instance, thoughtful predictions of doom accompanied the dropping of the atomic bomb. Humanity seemed to be too immature to wield the Sword of Damocles.
In the end, it was a combination of technology with the careful reconstruction of human systems that gradually built up our ability to turn data into sound decisions and avoid Armageddon. The human-technology systems that emerged made it easier to avoid brinkmanship as a tactic and ultimately made the world a safer place. We slowed time down to a human pace.
AI is going to force a similar reckoning of our human processes and the creation of new human-technology systems. This will take time. Human systems are slow to change.
Compared to the Cold War, the stakes are lower in the immediate future (AI won’t blow up the planet) but higher in the long term. Humans need to stand on the shoulders of AI, and we need to learn how to do that.
How we respond to ChatGPT will be a good marker, and a lesson, for the next technology that comes down the pike. Education must develop a new flexibility to pivot and grow. Diving into a bunker will not save us.