As one of my first projects at Unboxed, I’m working as part of the team delivering the Alpha phase with Public Health England to develop the UK National Screening Committee’s new online presence.
As part of this project, we visited the GDS (Government Digital Service) Empathy Lab — a space in their Aldgate building set up to help people understand how those using assistive technology interact with websites.
In preparation for our upcoming Alpha phase assessment, we went to test the prototypes created during this phase, as well as to find out more about accessibility.
Upon arriving at the Empathy Lab, we were given a short introduction and a tour. As our guide spoke about the topic of accessibility, a few things struck me.
#1: Accessibility issues can affect everyone at different stages in their lives and for different amounts of time
When speaking about accessibility, it's easy to think only of permanent impairments. I've realised that impairments can be grouped by how long they affect someone. For example, a person who has lost an arm has a permanent impairment. A person with a broken arm has a temporary impairment. A person holding a child has a situational impairment. The effect when using technology, though, is the same.
#2: Designing with accessibility in mind benefits many more people than I originally thought
Some website features originally designed for those with accessibility issues have become mainstream. Subtitles, for example, were initially created for people who are hard of hearing, but they are now used by a much wider audience. I often see commuters watching programmes on their phones without headphones, reading the subtitles instead.
#3: Wealth plays a factor when thinking about accessibility
Having seen several screen readers of varying quality, it was clear that the best is also the most expensive. Many people will have no option but to use the versions that come native on their computers, which may be of lower quality.
Following this, our guide demonstrated the various accessibility tools that are on offer in the lab. These included screen readers, programmes that enlarge what is shown on the screen, and a voice command tool.
A tool that I found particularly interesting was a large button plugged into a phone. It is designed for people with limited physical movement, who don't have the full range of mobility that a touchscreen phone may require. Instead, the programme scans the phone screen, breaking it into chunks. The user presses the button when the chunk they want is highlighted. They are then shown several actions and press the button again to select one. This makes navigating websites fairly easy, but typing anything can become a long and arduous task.
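The scanning interaction described above can be sketched in a few lines of code. This is a minimal simulation of the general "switch access" pattern, not any real assistive-technology API — the chunk names and the `scan_and_select` helper are my own illustrative inventions:

```python
def scan_and_select(chunks, press_on_step):
    """Cycle the highlight through `chunks` one step at a time;
    return whichever chunk is highlighted when the switch is pressed."""
    step = 0
    while True:
        highlighted = chunks[step % len(chunks)]
        if step == press_on_step:  # the user presses the button here
            return highlighted
        step += 1

# First pass: the screen is broken into coarse chunks...
screen_chunks = ["header", "navigation", "main content", "footer"]
chunk = scan_and_select(screen_chunks, press_on_step=2)

# ...then the chosen chunk offers a second round of actions.
actions = ["open link", "scroll", "go back"]
action = scan_and_select(actions, press_on_step=0)

print(chunk, "->", action)  # main content -> open link
```

Every selection takes two or more rounds of waiting for the highlight to reach the right place, which is why typing — one scanned selection per character — becomes so arduous.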
The screen reader built into iPhones is also a popular option. The user swipes their finger in certain directions to activate the reader in different ways. This appears intuitive and flexible to use, and quicker than using a screen reader on a laptop.
On reflection, this was a very insightful visit, leading me to think differently and more carefully about accessibility. There is no single way that people use assistive technology. Each person has their own situation and their own preferences, and finds different approaches more intuitive.
To conclude, it’s important to keep this in mind and test with as many different types of users as possible. The Empathy Lab felt like a good start in understanding some of the issues that surround accessibility for the UK National Screening Committee’s new online presence.