Do a baby’s adorable dimples make you click “share”? Or perhaps you’re motivated by disgust at what politicians have pulled THIS time. Maybe you just want to pass along a meme that made you chuckle.

To understand how emotion helps online messages go viral—central to information campaigns that have targeted governments and elections worldwide in recent years—the U.S. Department of Defense (DOD) has awarded a team of University of Maryland researchers $1.5 million to examine how posts and videos give you the feels or stoke your fury.

Based out of UMD’s Applied Research Laboratory for Intelligence and Security (ARLIS), the team will collect and annotate a sample of 1,000 real-world public Facebook posts and 300 YouTube videos from Poland and Lithuania that were shared by social and political influencers from those countries.

Both countries are NATO allies as well as the frequent focus of Russian information warfare. Once complete, these annotations will be used to explore how eliciting specific emotions can help a narrative—truthful or not—go viral.

“Whether using outright disinformation or manipulating public opinion with accurate stories, information warfare involves stories shared on social media platforms with specific embedded narratives designed to provoke, enrage, excite and change behavior,” said Susannah Paletz, the project’s principal investigator, a research professor in the UMD College of Information Studies and an ARLIS affiliate.

A social psychologist, Paletz has been studying social media for five years for the Office of Naval Research and, in a project last summer, developed an innovative coding scheme to annotate emotions that inspired this new effort.

The annotations extend beyond the so-called six basic emotions: anger, disgust, fear, happiness, sadness and surprise. Paletz and her colleagues’ annotation scheme includes humor, wonder, nostalgia, relief, love, hate and others—currently more than 20, and the list is still being refined.

“We also included something called kama muta, which is an emotion of feeling heartwarmed when you see something infantile—in other words, the ‘awww!’ feeling you get when you see something cute,” Paletz said.

She and her team will work with native speakers at universities in Poland and Lithuania to complete the annotation. Members of small groups will first individually use the annotation scheme to judge each of the collected social media posts for each emotion, rating them from 0 to 100, both for the content itself and for their own reactions to the post, with the groups then settling on consensus ratings.

The project will also address critical gaps in research about how information travels through populations and across national boundaries and languages. The researchers will develop methods for detecting and tracking how narratives and other memes spread within and across languages.

Information efforts probably aren’t being led by shadowy spies in dark suits, but by young people familiar with the workings of social media, said team member Cody Buntain, a UMD computer scientist soon to join the faculty at the New Jersey Institute of Technology.

“Rather than being experts at propaganda or disinformation, I think many of these individuals are using marketing tools exactly as they were meant to be used, but with an unanticipated intent—are the tools Coca-Cola or ExxonMobil uses to market its product all that different?—and these people find what works to get engagement, followers and clicks,” he said.

Paletz’s research team also includes co-PI Anton Rytting, a computational linguist at ARLIS; Devin Ellis, a policy expert at the National Consortium for the Study of Terrorism and Responses to Terrorism (START); Ewa Golonka, a Russian linguist and social scientist with ARLIS; and Egle Murauskaite, a START expert on unconventional security threats.