Published: Sept. 16, 2024

As AI enters the classroom, Leeds faculty balance ethics with innovation.


From Google search results to social media, artificial intelligence (AI) is everywhere, and it is here to stay. AI is the Wild West of technology, with few regulations and seemingly limitless applications.

“Since its release in November 2022, ChatGPT and similar language models have transformed nearly every aspect of business,” said Shannon Blankenship (Econ’98), Leeds Advisory Board member and principal at Deloitte Tax. “The technology has dramatically accelerated interest and investment in digital transformation from nearly every industry, the likes of which we haven’t seen since the internet became mainstream in the late 1990s.”

Like any new technology, AI has the potential to revolutionize the way we teach students in higher education, but there are also legitimate concerns about its accuracy and its ethical use. Still, preparing students for a world with AI means equipping them to use it, and at Leeds, faculty and staff are forging ahead to make this happen experientially and ethically.

“It’s going to change things dramatically,” said Dan Zhang, associate dean for research and academics. “I think people, once they realize the potential, are going to be so excited about it.”


Meghan Van Portfliet, teaching assistant professor in the Social Responsibility and Sustainability Division, plans to let students in her fall 2024 World of Business courses use ChatGPT for an in-class debate. Split into two teams, students will craft prompts for ChatGPT. Then, once the students feel they have created a prompt strong enough to generate a robust argument, ChatGPT will debate itself using the responses generated by the students’ prompts.

“Because it’s hands-on in the classroom, it’s really transparent about what they’re doing, and it gets them to engage with it in a way that’s not risky from a standpoint of ‘Are we assessing what we want to assess?’” Van Portfliet said.

Jeremiah Contreras, teaching assistant professor in accounting, received the 2024 David B. Balkin, Rosalind, and Chester Barnow Endowed Innovative Teaching Award in part due to his adoption of AI technology in the classroom. In his Ethics in Accounting class, he uses ChatGPT to create a custom chatbot based on the Sarbanes-Oxley Act of 2002, which students can then use to learn about the law. He has also used ChatGPT to help students create team contracts. The result is a more immersive approach to learning.

Zhang and David Kohnke, senior IT director at Leeds, are collaborating on an initiative to incorporate AI into not only the Leeds curriculum but also operations and research. One major aspect of this initiative is the implementation of an AI grant proposal process, which will award faculty stipends for implementing AI tools in their courses. On the research side, Zhang plans to organize training workshops and information exchange sessions so faculty can learn how others are using AI in research. He stressed that the goal of the initiative is to coordinate efforts across all areas of Leeds.

“The trick here is not to do this just in one class or as one person, but rather the charge is really to mobilize our faculty and staff,” Zhang said.

As head of the initiative鈥檚 education committee, Contreras is working to incorporate AI into the Business Core curriculum. Contreras is also developing training seminars to help other faculty integrate this new technology into their lessons. The goal, he says, is for all Business Core classes to teach students how AI is being used in industry and how to use it ethically.

“It’d be irresponsible not to address it,” Contreras says. “It would be like not showing students Word and Excel when those first came into play.”

There are reasons for caution when using generative AI in education. As a spring 2023 report from Cornell University explains, introducing generative AI tools without setting guidelines can prevent students from developing foundational skills. If a student is asking ChatGPT to draft their essays, they are not practicing necessary critical thinking skills. To Contreras, this is one reason AI skills should be taught. He emphasizes that “generative AI is most effective as a partner” and that we should teach students that even when leveraging AI, final decisions and outcomes should remain our responsibility.

An April report from the National Institute of Standards and Technology also notes that AI models can reproduce systemic and individual biases, particularly when the datasets used to train them are themselves biased, for example, by lacking data from marginalized groups.

But Van Portfliet, whose research centers on ethical business practices, notes that bias is not an issue limited to AI. “The issue of bias within AI is not any more dangerous or risky than the bias in what material we select,” she says. “Bias is everywhere, and it’s something we have to be conscious of and try to overcome.”

She added that questioning AI output and crafting more intentional prompts can help counteract bias in the technology.

When it comes to other ethical issues surrounding AI use in the classroom, such as students using ChatGPT to plagiarize, Van Portfliet believes students should be encouraged to use AI as a tool rather than a replacement for critical thinking. In some cases, instructors might need to rethink methods of assessment, such as essays, which can be fully completed by generative AI.

Overall, Contreras and Van Portfliet believe AI should be discussed openly with students, not demonized.

“There are right ways to use AI, and there are wrong ways to use AI,” Van Portfliet says. “It’s important to acknowledge that they both exist, but it’s not a black-and-white issue. It’s not right or wrong to use AI on assignments full stop. It’s right or wrong to use it on a specific assignment or in a specific way.”