Maryland Today

Produced by the Office of Marketing and Communications


Let’s Chat About AI on Campus

Rise of Eerily Convincing Language Models Spurs Hopes, Fears About Impact on Education

By Maryland Today Staff

[Animation: hands emerging from a computer screen and typing while a woman looks on]

The latest chatbots, including ChatGPT and Bing Chat, have an uncanny ability to converse with human-seeming ease. While increasingly powerful artificial intelligence opens up enticing new possibilities, a cross section of students, faculty and staff have found it has plenty of limitations as well.

Animation by Valerie Morgan

Is AI a next-gen calculator—a handy tool to crush drudgery in our studies and research so we can concentrate on the important stuff? Will it instead tempt us to let it do all the thinking? Is it a potential agent of chaos and misinformation? Is it something we have to understand for future jobs? Is it all of these?

In a recent email to all faculty, Senior Vice President and Provost Jennifer King Rice addressed the inevitable yet unpredictable effect such technology will have on college campuses, and encouraged a “nuanced and intentional” approach to the risks and rewards of systems like advanced chatbots that churn through mountains of data to converse and write in a seemingly human manner.

The new AI-based large language models like Google’s LaMDA and OpenAI’s ChatGPT are becoming increasingly available for everyday use and are expected to grow in sophistication as they become more integrated with the technologies we use daily, Rice wrote.

“The reality is that AI is here to stay, and we will need to adjust to and, as appropriate, integrate these new technologies into our instructional and assessment practices,” she told faculty. “We will also need to determine when the use of AI tools should be deterred and how to respond to inappropriate uses.”

In his state of the campus address last month, UMD President Darryll J. Pines struck a similar forward-looking note—and even had ChatGPT write the first few paragraphs of his speech. He then disclosed the author, noting that while higher education faces a challenge from this emerging technology, Terps have a history of adapting to innovation—and using it to solve problems.

“So while ChatGPT and other AI systems may at first seem to pose threats to our work, I’d encourage everyone to remember how other new tools each became integral to our enterprise,” Pines said—from calculators to laptops to Zoom.

Maryland Today spoke to a range of other Terps familiar with AI—through recent experimentation or years of study—to understand how it’s being used on campus now, and how it needs to develop to become a powerful educational tool.

Environmental science and technology (ENST) major Neil Gomes ’23 employed ChatGPT in a class taught by ENST Associate Professor David Tilley, but also uses the software to learn new skills.

Dr. Tilley suggested we use ChatGPT to brainstorm ideas and topics, because we write a lot in his class. But I also use it sometimes to learn new things. When I wanted to learn Photoshop, I used ChatGPT to teach me the basics of the software. But I don’t think it’s at the level of specificity needed to be completely reliable; sometimes it’s wrong and sometimes it seems like it’s just making things up. You have to take it all with a grain of salt.

For instance, I asked ChatGPT to help me figure out which NBA players shot at least five free throws in a particular season using Python, which I was trying to learn for fun. It spit out a block of code but didn’t define the function. So, it was pointing me toward what to do, just not very well. That being said, it’s a pretty big game changer; I could see it almost acting as a virtual tutor in the future, particularly as the technology gets better.
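
(For the curious, a working version of the kind of script Gomes describes might look something like the minimal Python sketch below. The player names and stats are made up for illustration, and the function and field names are hypothetical; a real script would pull actual season data from a file or a sports-statistics service.)

# Minimal sketch: find players with at least a given number of free throw
# attempts in a season. The stats below are placeholder values, not real data.
def players_with_min_free_throws(season_stats, minimum=5):
    """Return the names of players whose free throw attempts meet the minimum."""
    return [row["player"] for row in season_stats
            if row["free_throw_attempts"] >= minimum]

season_stats = [
    {"player": "Player A", "free_throw_attempts": 7},
    {"player": "Player B", "free_throw_attempts": 3},
    {"player": "Player C", "free_throw_attempts": 5},
]

print(players_with_min_free_throws(season_stats))  # ['Player A', 'Player C']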

Hal Daumé III is a professor of computer science with an appointment in UMD’s Institute for Advanced Computer Studies and the Language Science Center who researches AI language models.

In the last five years, AI development has accelerated at an absurd rate and is touching people in ways we wouldn’t have guessed not long ago. A lot of that progress has been made in a very “tech-first” way. You know, the attitude that “I’m going to do it because I can do it”—not because it solves a real problem or addresses the needs of a community. ChatGPT and others fall into this category.

There’s so much possibility, but if we’re going to have AI technology that does all the things we hope—increase economic development, increase well-being both in the U.S. and around the world, reduce rather than increase inequality—we need to make AI development people-first. In the big picture, that’s what needs to change. It’s not easy, because it’s the tech-first AI world that has most of the resources.

Nat McGartland is a Ph.D. candidate in English focusing on digital studies and the history of the physical book.

I know a ton of people are afraid of AI chatbots like ChatGPT, but I think there are absolutely ways we can work with the technology innovatively. I teach English 101 some semesters, and when I teach it again, one way I might use it is to have it write standard 101-style essays, pass them out to students and have them edit them. That gives them a way to work on their peer review skills without having to start from scratch with their own ideas.

I personally have used it for a bunch of stuff. I helped organize a conference, and we needed to name all the panels people were presenting on. We sent ChatGPT the paper titles and general themes, and we actually used a good number of the panel names it suggested.

Jessica Vitak, an associate professor in the College of Information Studies, focuses her research on the ethical and privacy ramifications of big data, including its use by AI applications.

There was a time people worried Wikipedia was going to ruin education. But like Wikipedia, or Stack Overflow for programming, or Grammarly for writing, chatbots can be useful tools in the early stages of a homework assignment or group project. Still, you have to be careful. We know that these models often have inherent biases, because the people building the models aren’t necessarily doing a good job of identifying and mitigating potential biases in datasets.

The challenge is that AI algorithms in general are black boxes; we know the inputs and we know the outputs, but we don’t know much about what goes on in between. If you built your model on bad training data, the outputs will be unreliable. And so here’s the problem with ChatGPT: It’s based on internet data, not highly vetted data. That’s why you get the weird or bad results you sometimes hear about. It’s not even “trust but verify”—you should never inherently trust the results from chatbots.

Mary E. Warneka, associate director of learning experience at the Teaching and Learning Transformation Center, manages instructional designers and educational developers with a learner-centered approach.

We ask instructors to consider, “What competencies are you asking your students to demonstrate? Is that a necessary skill to develop? Does AI rob the student of the opportunity to develop that skill?” If you think that skill is valuable for them to practice, then absolutely restrict AI, and be really explicit as to why. For some courses, when the students are properly motivated, AI could be used for first drafts, to develop counterarguments, to create an outline, to generate search terms for research, or to clean up, rephrase or simplify sentences.

There are so many features of large language models that could decrease anxiety or help non-native English speakers and students with disabilities. There’s a lot of “leveling of the playing field” that AI offers in ways that can be exciting.

Chris Carroll, Maggie Haslam, Annie Krakower and Sala Levin ’10 contributed to this article.

