How do you convey financial charts to users without sight? Bloomberg’s UX designers explore visual data through sound

August 28, 2018

Over the last two years, Bloomberg’s UX team has partnered with Carnegie Mellon University and visually impaired users to prototype next generation accessibility tools.

At Bloomberg, there’s a mantra that speaks to the design philosophy and values of every UX team member: “I am not the user.” It’s a phrase you will invariably hear recited daily by team members – a constant reminder that users have unique needs different from those doing the designing.

No project better embodies this idea than an ongoing partnership between Bloomberg and the Human-Computer Interaction Institute at Carnegie Mellon University to explore ways to make financial data more accessible to those with visual impairments. Over the last two years, the master’s students working with Bloomberg have been tasked with using UX design and technology to explore how a gradual or congenital loss of eyesight impacts people’s ability to find employment in the financial sector – and to create tech-driven accessibility prototypes that help users “see” financial data non-visually.

For Richard Ram, a senior UX designer at Bloomberg and mentor for the student teams, the opportunity to work on accessibility is a passion project. For him, it’s more than a unique design challenge; it’s personal. His interest in accessibility started when his mother was diagnosed with Parkinson’s five years ago. He noticed how difficult it was for her to precisely tap buttons when using FaceTime with his kids.

“It got me thinking about how design can simplify technology interactions for those with disabilities,” said Richard. “For instance, a lot of the content in finance needs to be interpreted from charts and graphs. But if you can’t see those charts and graphs, how do you get at that content and analyze the patterns they surface?”

It’s a question that has implications for the entire finance industry, where competition for top talent is fierce. Struggling to see shouldn’t push somebody out of the talent pool or out of the workforce later in their career. It simply means they need different tools to do the job. A recent article in The Lancet estimates that 253 million people worldwide live with some form of vision impairment; 36 million are blind and 217 million have moderate to severe vision impairment. That’s a lot of potential talent not being given the opportunity or tools to succeed in finance.

 

For conditions like glaucoma, there are a range of experienced limitations and corresponding technological solutions. The teams started their research by understanding how technology is used today — and where it falls short for complex data analysis. (Source: Sonify)

Historically, one of the tools that has made digital content accessible to people with visual impairments is the screen reader, an assistive technology that reads everything on a screen out loud. Screen readers used to be niche; now they are increasingly baked into computing platforms, such as Apple’s VoiceOver and Windows 10’s Narrator. But while they are improving, these tools remain difficult to use and are ineffective when trying to understand information displayed in a graph.

“That’s where sonification came in,” said Richard. “The Carnegie Mellon students focused on using sound as a way to overcome the limitations of screen readers and help people experience historical price movements non-visually.”

Building empathy, not pity

Getting into the mindset of someone with a visual impairment was one of the biggest challenges the teams encountered. On a sensory level, the students couldn’t design for users with visual impairments without first understanding what it’s like to work with complex data and not be able to see it.

To get into the mindset of a user with some form of visual impairment, such as blindness, glaucoma, color blindness, or macular degeneration, the teams conducted a series of empathy building exercises, ranging from using common screen reader tools for an entire day to physically wearing an eye covering that simulated pinhole vision.

 

The teams used empathy exercises to simulate a range of disabilities that make using everyday technology difficult—from eye covers to simulate blurred vision to taping fingers together to simulate advanced arthritis. (Source: Sonify)

Nora Tane and Emily Saltz were members of the student team that conducted this research in 2017. Both now work as UX designers at Bloomberg. They recalled the empathy exercises being helpful, but warned they were no replacement for talking to actual users.

“There’s a lot of criticism of empathy exercises you need to be aware of when designing for accessibility,” cautioned Emily. “So we tried to be really thoughtful about how we used them. As a sighted person, you don’t want to do an exercise that inspires you to feel pity or even awe for the experience of someone with a disability. That kind of emotion simply doesn’t line up with the day-to-day experience lived by someone who has been blind from birth or had their vision decline over time.”

“In our simulations, we tried to focus on completing a specific financial task like checking a bank account or looking at a chart using a screen reader,” said Nora. “We then used the insights we got to create experiments with different tactile, non-visual ways to display chart data. In the end, empathy exercises aren’t a replacement for talking to actual users, but they can help.”

Images: a tactile chart with pegs and rubber bands; a tactile chart with discrete points; a simulation of a haptic system.

The teams also talked with dozens of working professionals who have a visual impairment to better understand not just how it affected them on the job, but also to understand the emotional toll that losing one’s sight takes over time.

“One accountant we met had macular degeneration. He had been trying to hide his impairment from his colleagues for as long as he could,” said Emily. “He told us about how he would go to the most isolated cubicle in the building where nobody could see him and press his face against the computer monitor. He was going to extreme lengths to avoid people knowing about his impairment. He was the same person with the same skillset. He just needed different tools to access information and do his work.”

Getting to know the community of people with visual impairments gave the team a sense of mission and created a network of users for testing the prototypes they designed. Chancey Fleet, an Assistive Technology Coordinator at a library in New York, is one of many blind users the Carnegie Mellon teams worked with. In her opinion, the only way designers can successfully create products for this community is to actively engage it, focusing on user diversity and not viewing people with visual impairments as a monolith.

“Sometimes people have this idea that, because they know how to close their eyes, they know what a non-visual workflow looks like. Nothing could be further from the truth,” said Chancey. “We spend years cultivating different approaches and our workstyles are as diverse as our community. To design a product that is delightful and responsive to this community, it’s so important to include not just one or two users, but lots of members of the community who work with non-standard technologies.”

It’s this focus on engaging a diversity of users with visual impairments that makes all the difference, she said. “The Carnegie Mellon teams working with Bloomberg both years recruited a broad range of non-visual users with varying levels of tech proficiency, platform usage, and comfort levels understanding spatial information by sound and touch,” said Chancey. “The quality of their work speaks to the effort that they took to recruit diverse perspectives.”

Hearing is believing

Over the last two years, two different Carnegie Mellon teams have created two unique prototypes, each tackling a different user need related to helping someone with vision loss read charts.

In 2016, one of the teams created Sonify, an iPhone application that enables individuals with visual impairments to quickly understand line graphs. The app makes use of sonification techniques to help users “listen” to the curve of a stock trendline and get a sense of change in stock prices over time. This was an attempt to enable users of all visual capabilities to get the gist of a stock’s performance.
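The core idea behind that kind of sonification, using pitch as a stand-in for price so a rising trendline is heard as a rising tone, can be sketched in a few lines. The function name and frequency range below are illustrative assumptions, not Sonify’s actual implementation:

```python
# Minimal sketch of price-to-pitch sonification: each price in a series is
# mapped linearly onto an audible frequency range, so higher prices sound
# higher. Names and the 220-880 Hz range are hypothetical choices.

def prices_to_pitches(prices, low_hz=220.0, high_hz=880.0):
    """Linearly map a price series onto a frequency range (in Hz)."""
    lo, hi = min(prices), max(prices)
    span = (hi - lo) or 1.0  # guard against a perfectly flat series
    return [low_hz + (p - lo) / span * (high_hz - low_hz) for p in prices]

closes = [101.2, 103.5, 99.8, 104.1, 108.7]
pitches = prices_to_pitches(closes)
# The lowest close maps to 220 Hz, the highest to 880 Hz; a rise in price
# between two days is heard as a rise in pitch.
```

In a real app, each frequency would drive a short tone (for example via an audio synthesis API), played left to right across the time axis.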

 

The team behind Sonify chose to include a visual representation of each graph so that users could communicate about the graphs with people who are sighted, helping sighted and non-sighted users collaborate. Learn more here.

Building on Sonify, the team in 2017 focused on creating a tool to help non-sighted users get information about stock prices in context. That team created Stockgrok, an accessible web-based chart analysis tool for making buy or sell decisions through sound. With Stockgrok, the team focused on using auditory counterparts to visual cues in charts, such as helping a user hear the distance between two lines, the intersection points and price positions above or below a 50-day moving average.
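One of the cues described above, hearing where a price crosses its moving average, boils down to detecting a sign change in the price-minus-average difference. The sketch below is a hypothetical illustration of that idea, not Stockgrok’s actual code; in the tool, each detected crossing would trigger a distinct sound:

```python
# Hypothetical sketch: find the indices where a price series crosses its
# trailing simple moving average. Each crossing is a candidate for an
# auditory cue. Function names are illustrative.

def moving_average(prices, window):
    """Trailing simple moving average; None until a full window exists."""
    out = []
    for i in range(len(prices)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(prices[i + 1 - window:i + 1]) / window)
    return out

def crossings(prices, window):
    """Indices where the price moves from one side of its average to the other."""
    ma = moving_average(prices, window)
    idx = []
    for i in range(1, len(prices)):
        if ma[i] is None or ma[i - 1] is None:
            continue
        before = prices[i - 1] - ma[i - 1]
        after = prices[i] - ma[i]
        if before * after < 0:  # sign flipped: price crossed the average
            idx.append(i)
    return idx

closes = [10, 11, 12, 11, 9, 8, 9, 12, 13]
cross = crossings(closes, window=3)
```

Stockgrok’s 50-day version is the same computation with `window=50`; the “distance between two lines” cue would similarly map the `before`/`after` difference to a sound parameter such as volume or pitch offset.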

 

Stockgrok was built to be accessible to people with varying levels of sight. Because the audio and visual elements are linked, sighted users can follow along as users with visual impairments jump around the interface. Learn more here.

“We were surprised how audio made some things even easier to pick out than visuals alone,” said Emily. “If you’re looking at it visually, it’s just squiggly lines. Audio was actually a powerful extra layer on top of the visuals.”

Both teams’ work has inspired new efforts to make the Bloomberg Terminal more accessible to people of all abilities.

“There’s a huge interest in designing for accessibility at Bloomberg,” said Richard. “When these teams presented their projects, we had more than 50 designers crammed into a conference room to see the results. While visually impaired users are a relatively tiny part of our active user base, designing for accessibility is about more than business. It’s about ensuring people don’t get left behind. For example, we’re making sure commonly used screen readers, like JAWS, play well with our next-gen UI toolkits. These efforts are being driven by our Chairman Peter Grauer and our CTO Shawn Edwards, on down.”

The 2018 Carnegie Mellon team recently wrapped up a design project looking at ways to improve the hiring experience for people with disabilities, marking the third year in a row that a team has partnered with Bloomberg to explore the intersection of UX and accessibility.

But, if you talk to designers at Bloomberg and users in the non-sighted community, they’ll be quick to tell you the work is very far from over.

“We’re at an interesting moment where we’ve just about convinced everyone that basic web accessibility is a right. We should have the right to conduct basic tasks like bill pay and understand basic elements on a web page,” said Chancey. “That baseline is getting better, but as a culture we’re moving away from textual data and toward interactive, rich data sets that users can explore and filter at will. The accessibility field barely has a toehold there.”

For those interested in designing for accessibility, the teams hope new designers can build on the work they’ve started.

“In a lot of ways, we’re standing on the shoulders of giants,” said Nora. “But, it’s still the wild west in the world of accessibility. As long as you’re actually talking to users with disabilities, these communities are super welcoming of designers who are eager to bring attention to their underserved needs.”

If you want to get involved in designing for accessibility, there are many ways to engage with the community, like following #A11Y on Twitter, going to meetups, and attending conferences. And there’s a ton of work to do to level the playing field and ensure access to employment in finance is open to all, regardless of disability.

“There’s this whole iceberg of content that is interesting and vital to our success as professionals, heads of households, and members of our communities–and the development work to make that content accessible is still in the very early stages,” said Chancey. “So, every team project or interim group that does a little something to advance understanding in the field is really helping the cause. We need many more developers and designers advancing the cause of accessibility for spatial information and complex data.”

To learn more about each team’s work and Bloomberg’s ongoing partnership with Carnegie Mellon’s Masters in Human-Computer Interaction program, see their project portfolios below:

2016 Project Sonify

Designers: Monali Agarwal, Felicia Alfieri, Safinah Ali, Jacob Jorgensen, Laya Muralidharan

2017 Project Stockgrok

Designers: Conrad Bassett-Bouchard, Emily Saltz, Nora Tane, Clare Marie Carroll, Jayanth Prathipati

Interested in learning more about designing for accessibility? Check out the #A11Y hashtag on Twitter and The A11Y Project to learn more.