

Ever since getting the Apple Watch in November, I’ve been looking at my Apple Health & Activity data on a daily basis. After being disappointed by all the inaccessible graphs around the US Presidential Elections, I wanted to find out how Apple’s visualizations are handled by VoiceOver. They’re an inspiration to many when it comes to visual design, so I was especially wondering if we can use them as an example for accessible data visualizations as well.

The Health app has a page dedicated to activity, which visualizes movement, exercise and standing hours per day, week, month and year. In this view, VoiceOver allows us to navigate through the graph by the hour. Each hour is a column, which includes the movement, exercise and standing values. All the data is communicated clearly with pretty straightforward navigation. While the labels on the axes aren’t read, that information does get added at the start of each column’s label.

One thing I personally found inconvenient was having to keep moving along hour by hour even when there’s no data. On the other hand, Apple’s current solution is consistent: VoiceOver reads three values for each hour of the day, no matter what. For example, between 1AM and 9AM the values for moving, exercising and standing are almost always zero. Whether or not consecutive blocks with no activity should be grouped is more of a screen reader usability question, for which I recommend doing user tests with blind or visually impaired folks.

The Apple Watch activity breakdown provides a similar experience. The only difference is that the values aren’t grouped: movement, exercise and standing are read as three different graphs. Each of them is broken down by the hour, and includes the timeslot and value in the label.

The activity rings display how far you’ve gotten towards achieving your daily goals. VoiceOver pronounces the (visually hidden) label of each bar, followed by the percentage that’s visualized. In the Activity app, the rings seem to have a bug (or feature?!) that causes the activity to be read twice, but overall I feel like this one works quite well.

The goal of the graph is to get a quick summary or highlight of the data, which isn’t easily solved by swiping through a list of numbers. The graphs in the cards were read differently for different types of graphs as well, but seemed to follow a pattern: the graph wasn’t read when it served as an illustration of what was already described in the summary, but numbers bringing more detail to the summary were included. For me personally, the summary makes it quicker to get the information I need, since I don’t have to find and read labels or process any visuals at all. But it’s also a nice addition to the graphs when listening to them with VoiceOver.

Apple’s graphs set a good example, so what can we learn from them to make our own visualizations more screen reader accessible? Whether Apple’s solution would work well for our own graphs depends entirely on what we’re trying to visualize, who our visualizations are for and how they want to use them.
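To make the grouped-column pattern concrete, here is a minimal sketch in TypeScript of how it could be reproduced on the web. All names here (`HourlyActivity`, `columnLabel`) are my own illustrative inventions, not Apple’s implementation: the idea is simply that the axis information leads the label and all three values are folded into one announcement per column.

```typescript
// Sketch of the grouped-per-hour labeling pattern for a web chart.
// Names and units are assumptions for illustration, not an Apple API.

interface HourlyActivity {
  hour: string;     // timeslot from the x-axis, e.g. "9AM"
  movement: number; // minutes of movement in this slot
  exercise: number; // minutes of exercise in this slot
  standing: number; // stand hours achieved in this slot
}

// One label per column: the axis label comes first, followed by the
// three grouped values, so a screen reader user hears everything at
// a single stop instead of flicking through three separate bars.
function columnLabel(a: HourlyActivity): string {
  return `${a.hour}: ${a.movement} minutes movement, ` +
    `${a.exercise} minutes exercise, ` +
    `${a.standing} hour${a.standing === 1 ? "" : "s"} standing`;
}

// The resulting string would then be attached to the column's markup,
// e.g. as an aria-label on the <g> element wrapping the column's bars.
console.log(columnLabel({ hour: "9AM", movement: 40, exercise: 12, standing: 1 }));
// → "9AM: 40 minutes movement, 12 minutes exercise, 1 hour standing"
```

Whether hours with all-zero values should produce a label at all, or be skipped or grouped, is exactly the usability question raised above; this sketch keeps Apple’s consistent behavior of labeling every hour.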
