Produced by the Office of Marketing and Communications
UMD Human-Computer Interaction Expert Says Faulty Machine Learning Algorithms Threaten Safety, Risk Bias
The twin crashes of Boeing 737 MAX airliners in 2018 and 2019 illustrate how bad algorithms—processes computers use to solve problems—can lead to deadly outcomes, writes computer science Professor Emeritus Ben Shneiderman in a new essay in The Hill.
Even if they don’t kill you, malfunctioning algorithms can cause you to lose a job, fail to qualify for a loan or be falsely accused of a crime—potentially because machine-learning artificial intelligence systems have developed a bias, racial or otherwise. Algorithms could also be intentionally deployed by terrorists or political oppressors to further their ends—suggesting national action is needed, writes Shneiderman, a pioneer in human-computer interaction.
In the United States, the National Transportation Safety Board is widely respected for its prompt responses to investigate plane, train and boat accidents. Its independent reports have done much to promote safety in civil aviation and beyond. Could a National Algorithms Safety Board have a similar impact in increasing safety for algorithmic systems, especially the rapidly proliferating artificial intelligence applications based on unpredictable machine learning? Alternatively, could agencies such as the Food and Drug Administration, Securities and Exchange Commission or Federal Communications Commission take on the task of increasing safety of algorithmic systems?
In addition to federal agencies, could the major accounting firms provide algorithmic audits as they do in auditing financial statements of publicly listed companies? Could insurance companies provide guidance for the tech community as they do for the construction industry in making safer buildings? We already see civil society and business groups such as Underwriters Laboratories developing a Digital Safety Research Institute to provide the evidence-based research for safer algorithmic systems.
Read the rest of the essay in The Hill.