Innovative Graphs Aid Blind and Low-Vision Readers

Massachusetts Institute of Technology

Bar graphs and other charts provide a simple way to communicate data, but as inherently visual formats they are inaccessible to readers who are blind or have low vision. Designers have developed methods for converting these visuals into "tactile charts," but the guidelines for doing so are extensive (for example, the Braille Authority of North America's 2022 guidebook is 426 pages long). The process also requires familiarity with multiple kinds of software, as designers often draft a chart in a program like Adobe Illustrator and then translate it into Braille using another application.

Researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have now developed an approach that streamlines the design process for tactile chart designers. Their program, called "Tactile Vega-Lite," can take data from something like an Excel spreadsheet and turn it into both a standard visual chart and a touch-based one. Design standards are hardwired as default rules within the program to help educators and designers automatically create accessible tactile charts.
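
For readers curious what the underlying chart grammar looks like, here is a minimal sketch (in Python, writing the specification as a plain dictionary) of a standard Vega-Lite bar chart built from spreadsheet-style rows. The row values are illustrative placeholders, and the tactile defaults described above are applied by Tactile Vega-Lite itself rather than appearing in this standard spec.

    import json

    # Spreadsheet-style rows (illustrative placeholder values, not real data).
    rows = [
        {"state": "State A", "minimum_wage": 7.25},
        {"state": "State B", "minimum_wage": 12.00},
        {"state": "State C", "minimum_wage": 15.50},
    ]

    # A minimal standard Vega-Lite specification for a bar chart of these rows.
    # Tactile Vega-Lite, as described above, takes this kind of input and
    # additionally applies tactile defaults (spacing, textures, Braille labels)
    # when producing the touch-based version of the chart.
    spec = {
        "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
        "data": {"values": rows},
        "mark": "bar",
        "encoding": {
            "x": {"field": "state", "type": "nominal"},
            "y": {
                "field": "minimum_wage",
                "type": "quantitative",
                "title": "Minimum wage (USD/hour)",
            },
        },
    }

    print(json.dumps(spec, indent=2))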

The tool could make it easier for blind and low-vision readers to understand many graphics, such as a bar chart comparing minimum wages across states or a line graph tracking countries' GDPs over time. To produce a physical chart, you can tweak your design in Tactile Vega-Lite and then send the file to a Braille embosser, which renders it as raised dots that can be read by touch.

This spring, the researchers will present Tactile Vega-Lite in a paper at the Association for Computing Machinery (ACM) Conference on Human Factors in Computing Systems (CHI). According to lead author Mengzhu "Katie" Chen SM '25, the tool strikes a balance between the precision that design professionals want for editing and the efficiency educators need to create tactile charts quickly.

"We interviewed teachers who wanted to make their lessons accessible to blind and low-vision students, and designers experienced in putting together tactile charts," says Chen, a recent CSAIL affiliate and master's graduate in electrical engineering and computer science and the Program in System Design and Management. "Since their needs differ, we designed a program that's easy to use, provides instant feedback when you want to make tweaks, and implements accessibility guidelines."

Data you can feel

The researchers' program builds on their 2017 visualization tool Vega-Lite, automatically generating both a flat, standard chart and a tactile one. Senior author and MIT postdoc Jonathan Zong SM '20, PhD '24 points out that the program makes intuitive design decisions so users don't have to.

"Tactile Vega-Lite has smart defaults to ensure proper spacing, layout, and texture and Braille conversion, following best practices to create good touch-based reading experiences," says Zong, who is also a fellow at the Berkman Klein Center for Internet and Society at Harvard University and an incoming assistant professor at the University of Colorado. "Building on existing guidelines and our interviews with experts, the goal is for teachers or visual designers without a lot of tactile design expertise to quickly convey data in a clear way for tactile readers to explore and understand."

Tactile Vega-Lite's code editor allows users to customize axis labels, tick marks, and other elements. Different features within the chart are represented by abstractions, or summaries of a longer body of code, that can be modified. These shortcuts allow you to write brief phrases that tweak the design of your chart. For example, if you want to change how the bars in your graph are filled in, you could change the code in the "Texture" section from "dottedFill" to "verticalFill" to replace small circles with vertical lines.
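
As a rough illustration of that kind of edit, the sketch below (continuing the Python convention from the earlier snippet) swaps a texture setting from "dottedFill" to "verticalFill." Only those two identifiers come from the article; where a texture abstraction actually lives in a Tactile Vega-Lite specification is assumed here, so treat the structure as hypothetical.

    # Hypothetical sketch: the placement of a "texture" section in a Tactile
    # Vega-Lite spec is assumed for illustration; only the dottedFill and
    # verticalFill names come from the article.
    tactile_spec = {
        "mark": "bar",
        "encoding": {
            "x": {"field": "state", "type": "nominal"},
            "y": {"field": "minimum_wage", "type": "quantitative"},
        },
        # Assumed tactile abstraction: bars rendered with a dotted fill texture.
        "texture": {"fill": "dottedFill"},
    }

    # The brief tweak described above: replace the small-circle texture with
    # vertical lines by editing the abstraction's value.
    tactile_spec["texture"]["fill"] = "verticalFill"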

To help users understand how these abstractions work, the researchers added a gallery of examples, each pairing a code phrase with the change it produces in the chart. Still, the team is looking to refine Tactile Vega-Lite's user interface to make it more accessible to users who are less familiar with coding; instead of editing abstractions, they could make the same changes by clicking buttons.

Chen says she and her colleagues are hoping to add machine-specific customizations to their program. This would allow users to preview how their tactile chart would look before it's fabricated by an embossing machine and make edits according to the device's specifications.

While Tactile Vega-Lite can streamline the many steps it usually takes to make a tactile chart, Zong emphasizes that it doesn't replace an expert doing a final check-over for guideline compliance. The researchers are continuing to incorporate Braille design rules into their program, but caution that human review will likely remain the best practice.

"The ability to design tactile graphics efficiently, particularly without specialized software, is important for providing equal access of information to tactile readers," says Stacy Fontenot, owner of Font to Dot, who wasn't involved in the research. "Graphics that follow current guidelines and standards are beneficial for the reader as consistency is paramount, especially with complex, data-filled graphics. Tactile Vega-Lite has a straightforward interface for creating informative tactile graphics quickly and accurately, thereby reducing the design time in providing quality graphics to tactile readers."

Chen and Zong wrote the paper with Isabella Pineros '23, MEng '24 and MIT Associate Professor Arvind Satyanarayan. The researchers' work was supported by a National Science Foundation grant.

The CSAIL team also incorporated input from Rich Caloggero from MIT's Disability and Access Services, as well as the Lighthouse for the Blind, which let them observe technical design workflows as part of the project.
