
Editorial: Brown should prioritize its foundational learning goals when addressing AI

This semester, there is a new phrase haunting professors' syllabi: artificial intelligence. Each course seems to have a different statement on AI: For some classes, the use of AI is defined as plagiarism; for others, it is permitted as long as students cite how the technology was used. Some syllabi don't mention AI at all; some enthusiastically encourage students to utilize it. Brown professors should have discretionary power over AI regulation, but the University should also offer guiding principles for the non-negotiable elements of learning that must be preserved as part of a Brown education.

Over the summer, the University released a statement regarding the impact of AI on Brown's academic mission. The communication encouraged faculty to outline clear rules about what is and is not permitted in their classrooms. Other universities have released similar guidance, including guides on how to write AI policies for syllabi. Given Brown's wide range of disciplines and their unique learning goals, it makes sense to let professors implement their own course-specific policies instead of a blanket University regulation. But if we want to safeguard the fundamental features at the heart of student learning, Brown must recognize both the threat and potential of these technologies.

There is significant cause for concern. Chatbots like ChatGPT deprive students of a learning process that is central to academic growth and skill building. By using ChatGPT, students don't have to struggle through the challenging, iterative process of writing a research paper, solving a problem set or completing a coding project. These tools allow students to bypass the process of using and developing their critical thinking skills. Rather than prioritizing learning, AI systems maximize productivity and efficiency, which explains why many industries have embraced them with open arms, but also why academia should proceed with caution.

This doesn't mean that Brown should ban AI completely. Instead, the University should consider how it can integrate AI as a complementary feature of its academic mission. In doing so, it can continue to safeguard its foundational learning goals while also equipping students with the skills to enter an increasingly digital job market. For example, just like there is a writing requirement, perhaps Brown could create a "technology" requirement. Courses under this requirement might include classes on computer science, AI prompt engineering, technology literature, cybersecurity and more. This way, students can engage with AI in a way that complements rather than conflicts with a liberal education, and develop a better ability to think critically about the uses of AI.


Two of Brown's liberal learning goals most threatened by AI models like ChatGPT are 1) Improving speech and writing, and 2) Enhancing aesthetic sensibility. AI reduces writing and art, time-intensive creative practices, to near-instant processes of creation. The integrity of literary and visual arts courses, as well as any courses that require some mode of generative writing or creativity, is at risk. If Brown truly believes in the importance of these two learning goals, then the University should ensure that they cannot be compromised by easy access to AI creative generation. This could be accomplished, for example, by having more in-class creative assignments that eliminate the possibility of reliance on AI tools.

Similarly, with their ability to write quick solutions for almost any homework problem, AI chatbots offer an easy way for students to bypass the work that goes into quantitative or coding-based courses. Brown professors should respond to this threat with the same philosophy they hold toward using the internet to search for answers, since ChatGPT's answers also come only from information that already exists online. While online tools can solve the problems presented to us as practice, uncovering and addressing new problems requires a human brain capable of critical reasoning, something AI is not yet capable of emulating.

These are just a few examples of the ways that AI can compromise the integrity of Brown's learning goals. As the capabilities of artificial intelligence grow, so too will the possibilities for action that the University should consider to protect its academic mission. We call on Brown to strike a balance between the drawbacks and benefits of AI within its academic philosophy in order to protect students' potential to learn, grow and contribute to the world around them.

Editorials are written by The Herald's editorial page board and aim to contribute informed opinions to campus debates while remaining mindful of the group's past stances. The editorial page board and its views are separate from The Herald's newsroom and the 134th Editorial Board, which leads the paper. This editorial was written by the editorial page board's members Paul Hudes '27, Paulie Malherbe '26, Laura Romig '25, Alissa Simon '25, and Yael Wellisch '26.
