Overcoming the Stresses of AI in Education

Monday, July 1, 2024
By Lynn Anderson Davy
Illustration by iStock/Ivan Sherstiuk
AI experts at EDHEC discuss how schools can integrate tech into teaching, learning, and administrative functions in ways that are comfortable for all.
  • While AI allows schools to create more personalized learning journeys for students, professors still must focus on the fundamentals of education.
  • Chatbots can be used to provide students with answers to basic questions, allowing faculty to spend more time providing deeper feedback on student work.
  • Professors need to experiment with AI tools to reduce their own anxiety about the technology and realize how useful it can be in the classroom.

 
While artificial intelligence (AI) is ubiquitous in the business world, it’s becoming just as common in the educational sector, as students rely on it to complete assignments and teachers use it to design tests and grade papers. Yet the technology is new enough that both students and professors remain uncertain about when and how to employ it.

At EDHEC Business School in France, we have long sought to reflect current technology in our programs. We were among the earliest members of the Future of Management Education (FOME) Alliance, a group of international business schools that are reimagining digital learning. Our 2020–25 strategic plan includes an emphasis on tech and the humanities, with the goal of ensuring that students can make “managed and well-considered use of technology and data science.” As part of our strategy, this fall students in our Master in Management program will have the option of pursuing a track that focuses on using data science and AI in business contexts.

This commitment to technology has led many of EDHEC’s professors and program directors to speculate on the future of AI in education. These include Michelle Sisto, a former associate dean of graduate studies and current head of EDHEC’s AI initiative; Benoît Arnaud, dean of programs and the leader of EDHEC’s technology, AI, and sustainability strategies; and Arne De Keyser, a professor of marketing whose research focuses on understanding how people react to and experience new technologies.

Recently, these three AI specialists sat down for an unstructured discussion about how to incorporate AI into the classroom in ways that reduce anxiety for students and teachers—and improve learning outcomes across the board. Below is a transcript of their conversation, edited and condensed for clarity.

Benoît Arnaud: We are all feeling a bit stressed about AI, but should we be? I believe AI will bring professors and students closer together, enabling business schools to offer even more personalized learning journeys—an excellent outcome. However, there is also concern about becoming overreliant on AI. Are we going to rely on it like we rely on GPS? Does anyone use maps anymore?

Michelle Sisto: I appreciate the GPS analogy, because it reminds us of tech innovations’ impact on our everyday lives. However, I believe we can avoid overreliance on AI by showing students how to use it to boost productivity and to serve as a partner in the reasoning process, without expecting it to fully replace hard skills and subject expertise.

Arne De Keyser: I agree that we should teach students to see AI as a useful tool, not a replacement for critical thinking. I think that’s the way most business schools are handling this transition. At any rate, we don’t have much choice—AI is here, and it’s not going away. At some point in the future, just like with GPS, we’ll look back and wonder how we ever got along without it.

Arnaud: You’re right. So, if AI is here to stay, what do we want to do with it in management education? We need to reframe the situation. We shouldn’t ask, “What can AI do for me?” but “What can I do with AI?”

“With AI amplifying disinformation and social media creating bubbles, it is ever more important for schools to help our students become informed and participative citizens. This means we must change the way we teach.”—Michelle Sisto

De Keyser: I guess this question of how to use AI to improve business education and the student experience is part of what makes some of us anxious.

Arnaud: Yes. We’re still trying to figure out what social interactions and immersive learning will look like in an AI-enhanced environment. We’re also trying to calculate the actual cost of keeping all this technology running, including its impact on the climate. Will the carbon footprint be too large, and if so, will we need to reserve AI for only the most vital activities?

Sisto: Another thing that keeps me up at night is the lack of research on the best uses of AI in the classroom. How can we use it as a sparring partner that will debate with us and give us alternative explanations for why something is happening? We don’t have the answers yet; we’re still waiting to see how students will use it and what the cognitive impact will be over time. Will it make them lazier? Will it be used differently by different types of students?

De Keyser: I have a lot of the same questions. AI has the potential to revolutionize higher education, but it’s still imperative for educators to think about the fundamentals of learning and what students want and need to get out of education. Will classes go fully online and digital, for instance? That’s a longstanding debate.

But we do see that campus life is still vital for students. This is where they make connections for life. They want to be part of a community of peers and connect with their professors. Things like student associations and clubs are integral to the student experience and can’t be replicated online.

Sisto: I agree that higher education is about more than acquiring a list of competencies. It’s also about learning to interact with and collaborate with others. With AI amplifying disinformation and social media creating bubbles, it is ever more important for schools to help our students become informed and participative citizens. So, this means we must change the way we teach.

I recently used a chatbot that was trained to help students with online course material. Students appreciated the immediate feedback they received, because the chatbot could answer simple questions wherever and whenever they were working on homework and projects.

De Keyser: So, did you have fewer questions after class? Did you feel like the chatbot was stealing the show?

Sisto: I still had plenty of people ask questions at the end of each class—so, no, I never felt like I was being replaced. However, the chatbot did free me from having to answer many emails, and I spent that time providing deeper feedback on student work. AI can be very valuable in that sense, freeing up time that professors can devote to high-value activities. In a course of 80 students, the time needed for individual feedback is significant.

“I’m amazed when colleagues tell me they tried a tool like ChatGPT once and didn’t get the answer they wanted, so they never used it again. ChatGPT is not a one-magical-click solution. It requires having a conversation of sorts to get meaningful output.”—Arne De Keyser

Arnaud: OK, let’s switch gears and talk about business school faculty and AI. How do we help professors think entirely differently about pedagogy in the face of AI?

Sisto: That’s a crucial question. Here, we’ve run small workshops with several teams. We introduce faculty to the tools of generative AI and explain how these are different from other tools. When we show faculty how to properly interact with AI and get the help they need, they say, “Oh, this is pretty useful,” and their anxiety levels decrease quickly.

De Keyser: Practice is key. I’m amazed when colleagues across different schools and countries tell me they tried a tool like ChatGPT once and didn’t get the answer they wanted, so they never used it again. People need to learn how to work with it. ChatGPT is not a one-magical-click solution. It requires having a conversation of sorts to get meaningful output.

Sisto: Yes, I’ve heard that too, which is why it’s crucial to designate AI ambassadors—people ready to use this new technology to resolve even bigger problems. Ambassadors can help other team members feel more confident about AI.

For example, we have a professor working closely with our Pedagogical Innovation Lab to evaluate student work in her taxation course. She has gone through several iterations of prompts and styles of rubrics. At each stage, she compared AI feedback to professor feedback to understand what AI does well and what it does poorly and to evaluate the consistency of its answers.

In parallel, she and another professor are conducting a study across institutions to understand the degree of “acceptability” of AI feedback from the student perspective. Both have shared these experiences with our faculty at large, allowing others to learn from their work and inspiring further experimentation.

Arnaud: I’d like to hear more about critical thinking in the era of AI because I see it as paramount to solving some of the biggest problems of our century, such as addressing climate change, ensuring diversity and inclusion, and reducing social disparities. How can we use AI and curriculum restructuring to help our students contribute positively to society and the planet?

De Keyser: I like that. A professor’s role with students is to push back and demand more from them—a deeper analysis, a more critical look. So, as AI becomes a greater source of information in the classroom, I think we professors should demand even more of our students. The message should be: “You have AI supporting you, and if you bring your critical thinking, you can go even farther than before and make a real difference where it matters.”

“I see AI as paramount to solving some of the biggest problems of our century. How can we use AI and curriculum restructuring to help our students contribute positively to society and the planet?”—Benoît Arnaud

Sisto: I agree. There’s an opportunity with AI to ramp up learning, not dumb it down. And I think the same opportunity exists for the administrative side of business schools. AI will push us to do better in terms of campus operations, including the overall governance and structuring of our data. We need to think about how we’re going to organize everything from the policy documents we create to the enterprise resource planning software we use for student data. With better organization, preparing documentation for accreditations might be easier—at least, that would be my dream outcome.

Arnaud: That’s a nice dream, and I think it could become a reality. We need to give ourselves the time to reimagine the way we do things.

De Keyser: My hope is precisely that: that AI gives us more time to think. AI could reduce the time we spend focused on administrative duties—any repetitive paperwork or analyses—giving us the freedom to reimagine business education. That would be a considerable improvement, in my opinion.

Arnaud: Any final thoughts as we wrap up?

Sisto: One thing we haven’t touched on yet is the value systems behind various AI tools and how those values could impact our students. For example, Google’s Gemini AI tool recently depicted German Nazis and American Founding Fathers as racially diverse. These were explicit misrepresentations of history, made with the objective of being more inclusive. So, precisely whose values are these AI tools replicating, and do we want to delegate this to a small number of private companies?

De Keyser: Well, that’s something that really worries me, especially since we live in an increasingly polarized world.

Sisto: Right. And the Google example is just the beginning. Most of the AI tools we’re using now were created in Western Europe and the U.S., but there will soon be models from other countries. We don’t know what the underlying values of these tools will be or how they might clash with other AI value systems down the road.

Arnaud: Excellent point. The ethics of AI are tricky. Millions of people around the globe can’t even imagine what we’re talking about today because they don’t have access to reliable internet, which seems like an even more significant challenge to tackle. So, how can we ensure these new tools are transparent and accessible?

Sisto: These are excellent questions that we should tackle together—“we” meaning the global business community.

Authors
Lynn Anderson Davy
Manager of International Press Relations, EDHEC Business School
The views expressed by contributors to AACSB Insights do not represent an official position of AACSB, unless clearly stated.