MarylandToday

Produced by the Office of Marketing and Communications


To Build Better Robots, UMD Researchers Are Making Them Think More Like Us

$2.8M DARPA Award Aims for Symbolic Reasoning in Autonomous Systems

By Melissa Brachfeld


Backed by a Department of Defense grant, UMD researchers are working to give autonomous systems the ability to use symbolic reasoning.

Photo by iStock

They’re flipping burgers, navigating vehicles, sorting parcels and much more—things that seemed like science fiction only a few years ago. As autonomous systems come closer to being part of our daily lives, these artificial intelligence (AI)-driven platforms need more human-like “thinking” to smooth the path toward humans and robots teaming up.

Now, supported by a $2.8 million grant from the Defense Advanced Research Projects Agency (DARPA), researchers in the University of Maryland Institute for Advanced Computer Studies (UMIACS) are leading a multi-institutional effort aimed at enhancing how robots reason, plan and engage in complex, real-world scenarios.

The UMIACS team—Associate Professor Furong Huang and Professor Tom Goldstein, both of the Department of Computer Science—is collaborating with researchers at the University of Chicago and the University of Texas at Austin.

Their ultimate aim is robust autonomous robotic systems based on symbolic reasoning, which uses symbols and abstract concepts to reason and solve problems. This approach allows the machine’s AI neural networks to mimic human thought processes, vastly increasing their versatility and capability.

“Our project aims to transcend the limitations of current robotic systems, which are often siloed and task-specific,” said Huang. “Instead, we envision developing a versatile robotics foundation model—a scalable brain capable of powering a diverse range of robots—from household assistants and industrial machines to next-generation autonomous vehicles and medical devices.”

Goldstein said the DARPA-funded project emphasizes the need for autonomous robotic systems that not only understand their surroundings—from people to complex built environments—but can also interact with them effectively and safely.

“Currently, while foundation models can interpret static scenes and images well, they struggle with the dynamic aspects of real-world environments,” he said.

The researchers plan to develop perceptual systems that can transform sensor data into clear representations of the environment, such as scene graphs and knowledge graphs. Such computational methods mimic the human ability to separate the tasks of sensing and reasoning; people sense objects in their environment and think about their properties, then plan actions using logical reasoning about the objects they have identified.
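To illustrate the idea of separating sensing from reasoning, here is a minimal toy sketch: a scene graph built from hypothetical perception output, queried with a simple symbolic rule. All object names, relations and the `can_pick_up` rule are invented for illustration; this is not the team's actual system.

```python
# Toy scene graph: perception output as objects with symbolic properties.
# In a real system these would come from sensor data, not be hand-written.
objects = {
    "cup":   {"graspable": True},
    "table": {"graspable": False},
    "robot": {"graspable": False},
}

# Edges of the scene graph: symbolic relations between detected objects.
relations = [
    ("cup", "on", "table"),
    ("robot", "near", "table"),
]

def can_pick_up(agent, target):
    """Reason over the graph: the agent can pick up a graspable object
    if the agent is near a surface the object rests on."""
    if not objects[target]["graspable"]:
        return False
    supports = {o for s, r, o in relations if s == target and r == "on"}
    nearby = {o for s, r, o in relations if s == agent and r == "near"}
    return bool(supports & nearby)

print(can_pick_up("robot", "cup"))    # True
print(can_pick_up("robot", "table"))  # False: a table is not graspable
```

The point of the sketch is the division of labor the article describes: perception produces a clear symbolic representation (the dictionaries and relation tuples), and planning then reasons logically over those symbols rather than over raw sensor data.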

Ensuring the robotic systems can operate effectively across various scenarios—including unfamiliar terrain or adverse weather conditions—requires a great diversity of datasets, dedicated lab space and UMIACS’ powerful computational resources, Goldstein said.

Zikui Cai, a UMD postdoctoral researcher with expertise in computer vision, computational photography and foundation model training, will assist Huang and Goldstein on the project.

The team plans to release both the robotics software it develops and its findings in an open-source format, allowing others to contribute to advancing the technology.

