MarylandToday

Produced by the Office of Marketing and Communications

Research

Expanding Data Science Beyond Visualization

$2M in State Funding Supports Work Expanding Access to Field for the Visually Impaired

By Maria Herd M.A. ’19


A UMD researcher and expert in big data visualization is working to make the field's tools available to those who are blind or visually impaired.

Photo by iStock

Data scientists rein in vast, unruly datasets, often using colorful, dynamic computer graphics that clarify “big data” for users—but not for all of them. Data visualization expert Niklas Elmqvist discovered this while teaching a 2018 class on the subject in which one of the students was blind.

“It was illuminating, and took me completely by surprise,” said Elmqvist, a professor in the College of Information Studies with an appointment in the University of Maryland Institute for Advanced Computer Studies. “Only then did I see the elephant in the room, and it had always been there.”

He’s now helping to lead a project that addresses this inequality, developing and evaluating new tools that can provide visually challenged high schoolers in Maryland with better access to data visualization. His work is supported by a $2 million contract with the state’s Department of Education.

Computer scientists Andreas Stefik and Sina Bahram, from the companies Data11y and Prime Access Consulting, respectively, are creating the new data visualization materials and coursework, while Elmqvist and the iSchool’s fourth-year doctoral student Pramod Chundury are in charge of evaluating the developed materials.

Many data science materials for sighted users can’t be easily replicated for blind users, Elmqvist said. For the coursework to be accessible, all students must be able to read the data in charts, interpret and communicate workflows, facilitate interactions with the data and describe the visualizations.

While screen readers—software programs that read text aloud with a speech synthesizer—can navigate tables, they’re befuddled by large datasets. Braille displays fare no better at the task.

One approach is “sensory substitution,” using general assistive technology to functionally replace one sense with another—perhaps conveying visual information through touch or sound. In an earlier study, Elmqvist developed a method that incorporates the sense of smell to help navigate huge datasets.

Another method is sound representation; a pitch might represent the height of a bar in a chart—the higher the pitch, the taller the bar. Yet, as Elmqvist discussed at a community TEDx talk at Montgomery-Blair High School in Silver Spring, Md., many current sensory substitution techniques are not scalable because they take too much time to create, are too costly, or are not widely available.
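The pitch-based mapping described above can be illustrated with a short sketch. This is not the project's actual software—just a minimal, hypothetical example of how a bar's height might be linearly mapped onto an audible frequency range, with taller bars producing higher tones:

```python
def value_to_pitch(value, v_min, v_max, f_low=220.0, f_high=880.0):
    """Linearly map a data value onto a frequency range in Hz.

    f_low and f_high default to the octaves A3-A5, a comfortable
    audible range; these bounds are illustrative, not prescribed.
    """
    if v_max == v_min:
        return f_low  # flat data: every bar sounds the same
    fraction = (value - v_min) / (v_max - v_min)
    return f_low + fraction * (f_high - f_low)

# A bar chart's heights become a left-to-right sequence of tones,
# much like the sweeps described for the iSonic tool.
bar_heights = [3, 7, 12, 5]
pitches = [value_to_pitch(v, min(bar_heights), max(bar_heights))
           for v in bar_heights]
```

Playing the resulting frequencies in order lets a listener "hear" the shape of the chart: rising pitch for rising bars, falling pitch for falling ones.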

Now his team is building upon work undertaken more than a decade ago by UMD faculty members Ben Shneiderman, Catherine Plaisant, Jonathan Lazar and others. A tool they built, called iSonic, creates an audio version of a map that allows a user to execute sweeps from left to right to hear various pitches that represent the data.

The new project will scale this idea to mobile devices and include more types of data representation. One focus will be to take data interactions available to sighted people—like zooming in, selecting points or filtering the data—and find ways to translate them for blind users through other means like sound and haptic (or touch) feedback, allowing for input via keyboards and touchscreens. The team hopes its work will eventually be distributed nationwide.

This story was adapted from an iSchool news release written by Liz Zogby.

Topics:

Research

Maryland Today is produced by the Office of Marketing and Communications for the University of Maryland community on weekdays during the academic year, except for university holidays.