How to Evaluate Critical Thinking in the Age of AI
- Many campuses have taken steps to curtail the use of generative AI tools, often over fears of plagiarism—but these fears overshadow AI’s potential as a pedagogical tool.
- Because GenAI’s responses are immediate and nonjudgmental, learners can develop their critical thinking processes as they freely reflect on thoughts, responses, and concepts.
- GenAI has not supplanted the role of instructors in the classroom. Rather, it has become a tool that we can use to teach, inspire, and guide our learners.
Learners have embraced generative artificial intelligence (GenAI), but academic administrators and faculty appear to be more apprehensive about using this emerging technology. Since GenAI began taking hold, administrators and faculty have set policies to restrict its use. They have used AI detectors to police plagiarism (despite the inconsistent capabilities of these tools), while their offices of integrity have doled out punishments.
But as educators have learned over the past year, these interventions won’t curtail the use of generative AI by learners. Moreover, we believe there are many reasons that educators should stop resisting this technology and start enjoying the benefits GenAI has to offer.
Before we offered anything close to a salve, we wanted to know: What are some of the sources of apprehension among our colleagues? The three of us have had productive conversations on this question with professors from various institutions. Through these conversations, we learned that most faculty were concerned about the same thing: plagiarism.
As we listened, we realized that plagiarism is merely an administrative term used by academic cultures. When we set rules prohibiting plagiarism, we create a policy safety net that allows us to teach and evaluate our students’ critical thinking. We want to know of our students: Is this your own thinking? Are these your own written words?
These questions lie at the heart of our anxiety. How can we evaluate a learner’s critical thinking if the content is AI-generated? While this is a fair question, we should be asking a different one: How can we use generative AI to develop our students’ critical thinking skills?
The Limitation of Traditional Teaching
Our answer here may surprise you. Prior to having chatbots in our own classrooms, we provided learners with short scenarios focused on ethical dilemmas that entry-level professionals might encounter. Each learner would take 20 minutes to think through the dilemma, generate an overview, identify stakeholders, and decide what course of action to take. We would then spend the rest of the class time in discussion.
Our students enjoyed these thought challenges. As instructors, we recognized their effectiveness in getting future business leaders to think, write, and discuss potential moments of ethical fading. We never graded these interactions, and learners never asked for points for their participation. Socrates would have been proud. With these class discussions, we had transcended transactional coursework.
But these assignments had a significant limitation: It was difficult to measure whether all learners had pushed themselves to think critically and reflect deeply about the dilemma. As in any group discussion, some were more vocal than others. Even if called on, some learners would simply parrot previous responses. Moreover, these assignments were not designed to provide students with additional instructional feedback after the in-class discussions were over.
How could we address this limitation? How could we ascertain every learner’s depth of critical thinking through this exercise? Enter ChatGPT.
Conversing With the Bots
In an October 2023 article in AACSB Insights, Anthony Hié and Claire Thouary write that “the better students are at communicating with AI, the more likely it is that they will have seamless and rewarding learning experiences as they use AI to deepen their understanding of complex concepts, find solutions to problems, or explore new areas of knowledge.”
Yes, ChatGPT creates content; it can write essays, blogs, and even novels based on a simple prompt. But at the J. Whitney Bunting College of Business and Technology (CoBT) at Georgia College & State University in Milledgeville, we use it differently. Rather than worrying about how it might replace our teaching, we wanted to figure out how it could improve student learning.
After all, chatbots are, at their core, dialogical. With this in mind, we guide our learners to engineer effective prompts. We encourage them to learn how to engage in conversations with the bots. In our classes, learners discover that GenAI can serve as a tutor, an intellectual sparring partner, and a personal instructor.
Learning Through Repetition
Let’s look, for instance, at how we now ask students to think through ethical dilemmas in an in-class assignment in our undergraduate business communications course. Before the class session starts, we send students a specific prompt. We instruct them to copy and paste the entire prompt into their own ChatGPT accounts.
It’s important to note that the prompt’s rules and steps tell the bot how to behave. When we write in the prompt, “Now, please follow these steps,” we are instructing the bot to follow those exact steps. The learner is identified as the “user” in this context.
Once the learner submits the prompt, ChatGPT will create an ethical dilemma for the learner, along with the three discussion questions and the required list of components the learner must address. Until the learner has answered the questions and provided the information itemized on the list, the system will continue to request that the learner satisfy all components. (These components are listed a through d, as noted below.)
Once the learner gives the required responses, ChatGPT will act as an expert debater and present a response that questions the learner’s stance by offering the opposite perspective. The student will then respond to that “debate,” and ChatGPT will evaluate the learner’s final response.
Here is the prompt we use for this assignment:
Act as an expert professor in business ethics. Create an ethical dilemma that involves an entry-level finance employee.
Rule: The dilemma should be complex. Right versus wrong should not be explicit. Please do NOT provide analysis.
Now, please follow these steps:
1. Create three discussion questions.
2. After the user’s response, create three more questions, UNLESS the response does NOT include all the following components:
a) An overview of the ethical situation
b) A list of options
c) A list of stakeholders
d) A recommended action
3. If the responses are missing any of the components, please ask the user to provide the missing component.
4. If all the components are provided, then act as an expert debater and present an opposite perspective.
5. Wait for a final statement from the user.
6. Once the user provides the final statement, evaluate the quality of the responses based on the detail of the user’s responses, the user’s use of evidence, and ethical validity.
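For readers who would rather script this exchange than have every student paste the prompt into the ChatGPT web interface, the sketch below shows the same conversational loop driven through an API. It is only an illustration of the workflow, not part of our assignment: the OpenAI Python client, the gpt-4o model name, the "done" sentinel, and the run_session helper are our assumptions, and our students need nothing more than their own ChatGPT accounts.

```python
# Minimal sketch: run the instructor prompt as a system message and alternate
# model replies with typed student responses. Illustrative only; model name,
# client library, and the "done" sentinel are assumptions, not the assignment.
from openai import OpenAI  # requires the `openai` package and an API key


# Paste the full instructor prompt shown above (the rule plus the six
# numbered steps) into this string verbatim.
INSTRUCTOR_PROMPT = """Act as an expert professor in business ethics. Create an
ethical dilemma that involves an entry-level finance employee. ..."""


def run_session(model: str = "gpt-4o") -> list[dict]:
    """Run one dilemma session in the terminal and return the full transcript."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    messages = [
        {"role": "system", "content": INSTRUCTOR_PROMPT},
        # An opening user turn nudges the model to generate the dilemma
        # and the first three discussion questions.
        {"role": "user", "content": "Please begin."},
    ]

    while True:
        reply = client.chat.completions.create(model=model, messages=messages)
        assistant_text = reply.choices[0].message.content
        messages.append({"role": "assistant", "content": assistant_text})
        print(f"\nChatGPT:\n{assistant_text}\n")

        student_text = input("Your response (or type 'done' to finish): ").strip()
        if student_text.lower() == "done":
            break
        messages.append({"role": "user", "content": student_text})

    return messages


if __name__ == "__main__":
    transcript = run_session()
    # The transcript (a list of role/content messages) can be saved and
    # submitted for review, much as our students submit their ChatGPT
    # conversations to the learning management system.
```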
The prompt creates an individual dilemma, and learners must work through that dilemma step by step. This prompt focuses on finance, but we can modify it to focus on any industry. The benefit of these in-class conversations with ChatGPT is that learners often go beyond initial levels of thinking about the ethicality of the dilemma.
In fact, learners reach secondary and tertiary levels of thinking. They ask themselves more nuanced questions: Why does a particular response matter? What are the implications of good or bad decisions? What learned concepts can be applied to making ethical decisions and acting ethically?
The point of working through these dilemmas in writing is not to bring about ethical epiphanies, since such epiphanies are hard to sustain. Instead, by regularly assigning these writing exercises, we want students to build muscle memory, as Brooke Deterline describes in her 2012 TED talk on creating ethical cultures in business. Through such repetition, learners are more likely to acquire ethical reflexes that guard against the potential risks of ethical fading.
Learning Without Judgment
Another important benefit of using generative AI tools for critical thinking in the classroom is that each tool acts as a nonjudgmental collaborator. This means learners can converse with the tool, asking any question they want without the collaborator judging that question as “stupid” or “unworthy.”
GenAI’s nonjudgmental, in-depth responses ultimately help learners develop their own critical thinking processes, because the platform allows them to play with and reflect upon a variety of thoughts, responses, and concepts. They feel free to ask questions, challenge their own perspectives, and allow the bot to help shape and organize their thinking. We cannot overstate the value to learners of playing with questions, thoughts, and concepts.
In a September 2023 article published by Harvard Business Publishing–Education, Ethan Mollick and Lilach Mollick note that the instantaneous feedback from ChatGPT adds significant educational benefits. Learners often have attention and distraction issues, but AI tools can instantly generate feedback, which means learners don’t have to wait to see if their responses can be better developed.
Revolutionized, Not Replaced
As we have found, GenAI has not supplanted the role of instructors. On the contrary, after our students’ initial independent conversations with the bots, our facilitated class discussions are much more focused and informed. We can select one dilemma from the group and discuss it in detail, and those discussions are lively and provocative. Now, everyone has self-developed perspectives. We still find room to teach, inspire, and guide our learners.
To further ensure accountability, we require students to submit their conversations to our learning management system, a process that requires just the click of a button. We then can review and evaluate each learner’s response.
At the end of the day, generative AI isn’t going away. As Microsoft and Google integrate generative AI into their word processing software, instructors need to adapt. This is why the CoBT continues to expand its work with this technology. For instance, we have established an AI Lab, and we offer GenAI workshops for the campus and broader communities in Georgia. We continue to bring in industry leaders to engage our campus community on the topic, and we collaborate on AI projects with students and faculty outside the CoBT.
We must continue to innovate to make the best use of all AI has to offer. Let’s use this revolutionary tool to help ourselves and our learners become better thinkers—and better people.