MarylandToday

Produced by the Office of Marketing and Communications


UMD Team Creates AI Tool to Give Memory a Hand

Team Found the Mnemonics We Prefer May Not Match Those That Actually Help

By Aleena Haroon M.P.P. ’25

[Image: digitized silhouette of a person with letters coming out of their mouth]

A UMD doctoral student and collaborators recently developed an AI system that can generate high-quality keyword mnemonics to aid memory.

Photo by iStock

What is melancholy, really?

Consider a hardworking collie after a day of herding sheep. She’ll be hungry, but all she has to eat is a melon. The pup will be sad—melancholy, even.

Whether it’s memorizing vocabulary for a test or learning new foreign-language phrases, we’ve all used such “keyword mnemonics,” which help us learn and remember new terms or phrases by associating them with something that’s easy to recall. But crafting mnemonics can be almost as hard as memorization itself. What if there were a way to make using this classic technique easier?

Enter SMART—an AI-driven keyword mnemonic generator created by University of Maryland researchers in the Computational Linguistics and Information Processing (CLIP) Lab. Designed to simplify mnemonic generation for students, SMART was showcased last week at the Conference on Empirical Methods in Natural Language Processing in Miami.

For Nishant Balepur, a second-year Ph.D. student in computer science who is leading the project, memorizing hundreds of unfamiliar terms and concepts for the GRE graduate school entrance exam stopped being “painfully tedious,” thanks to mnemonics, which he finds “fun and effective.”

Although Balepur recalls struggling to come up with some of them on his own, when large language models (LLMs) like ChatGPT entered the picture not long after he took the GRE, he realized they might be the perfect tool to help students produce mnemonics.

Working with collaborators, including researchers from Yale University and George Washington University, Balepur developed SMART, which led to a fascinating insight about how people assess the usefulness of what’s generated.

“We found there was a discrepancy between expressed preferences, what students thought helped them learn better, and observed preferences, which are the actual learning outcomes,” Balepur said.

To bridge this gap, he and his team used Bayesian modeling, an advanced statistical approach, to account for and balance both types of preferences. This shift, from focusing on what students said helped them to discovering which mnemonics actually did, distinguishes the team’s work from prior research in the field.
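The article doesn’t detail the team’s actual model, but the idea of balancing the two signals can be illustrated with a minimal sketch. Here, each mnemonic’s helpfulness gets a simple Beta-Bernoulli posterior from student ratings (expressed preference) and from later recall-test outcomes (observed preference), and the two estimates are blended; the function names, counts, and 70/30 weighting are all hypothetical choices for illustration, not SMART’s implementation.

```python
# Illustrative sketch only (not the SMART team's actual model): combining
# expressed preferences (ratings) with observed preferences (recall outcomes)
# using simple Beta-Bernoulli posteriors.

def beta_posterior_mean(successes: int, failures: int,
                        alpha: float = 1.0, beta: float = 1.0) -> float:
    """Posterior mean of a Bernoulli rate under a Beta(alpha, beta) prior."""
    return (alpha + successes) / (alpha + beta + successes + failures)

def combined_helpfulness(liked: int, disliked: int,
                         recalled: int, forgotten: int,
                         weight_observed: float = 0.7) -> float:
    """Blend the two posterior means, weighting actual recall more heavily."""
    expressed = beta_posterior_mean(liked, disliked)      # what students say helps
    observed = beta_posterior_mean(recalled, forgotten)   # what actually helps
    return weight_observed * observed + (1 - weight_observed) * expressed

# A mnemonic students like (8 of 10 thumbs-up) but that rarely aids recall
# (3 of 10 later quiz successes) scores lower than ratings alone suggest.
score = combined_helpfulness(liked=8, disliked=2, recalled=3, forgotten=7)
print(round(score, 3))  # → 0.458
```

The point of the toy model is the one the researchers describe: a mnemonic that students merely like can be ranked below one that measurably improves recall.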

As a result, Balepur said, SMART can design mnemonics that are both engaging and educationally effective while being cheaper and more efficient than GPT-4, the LLM that powers ChatGPT.

But as powerful as SMART and other LLMs are, they’re not without limitations. In its published work, the team reported that a human language expert it worked with outperformed both SMART and GPT-4 at crafting high-quality mnemonics.

The team is determined to address its model’s blind spots and hopes to build on these findings to make SMART’s mnemonics even more memorable.

Balepur is co-advised by computer science Professor Jordan Boyd-Graber and computer science Assistant Professor Rachel Rudinger, who both have appointments in the University of Maryland Institute for Advanced Computer Studies (UMIACS) and are core members of the CLIP Lab.


Maryland Today is produced by the Office of Marketing and Communications for the University of Maryland community on weekdays during the academic year, except for university holidays.