During the last weeks, we've discussed different users who might not necessarily be able-bodied, in good health, or free to choose. Johanna Kaipio (2020) introduced patient experience, a rather complicated field with multiple stakeholders that reaches beyond satisfaction with a service or its facilities. Paula Valkonen focused on firefighters and visually impaired users (Laarni & Valkonen, 2020) – in both cases, the main users cannot rely on visual information and must therefore use all their other senses. I feel that the solutions we've discussed tend to focus on visual communication, which might not be accessible to visually impaired users. For more inclusive design, however, acknowledging that not all users are able-bodied is essential, and accessibility does not automatically follow from usability (Rosin, 2020).
On designing for multiple senses and beyond visibility, Paula Valkonen highlighted that the lack of a sense can be a temporary or a long-term condition (Laarni & Valkonen, 2020) – the firefighters' case reflects a temporary situation, while blind or visually impaired users experience the world through their other senses constantly. On the Miro board, I tried to map both user groups' needs and what should be considered when designing beyond visual solutions.

It could be said that navigating dangerous environments and relying on audible solutions describes both firefighters and visually impaired users.
As firefighters work in extreme conditions, smoke often limits their visibility, and most communication relies on two-way portable radios ('walkie-talkies'). Denef et al. (2008) propose that wearable technology and sensors could help gather more data about the environment and position the firefighter, but these ideas are currently a bit too novel. E-textiles and wearables would have to withstand extreme heat as well as water, and current solutions are not durable enough to be reliable. Moreover, positioning firefighters inside buildings poses many challenges – simulations are run on test sites that have been 3D-rendered in advance, while real on-site conditions aren't mapped and the actual situation is impossible to predict. Therefore, since 'brick-and-mortar' style analog solutions are hard to replace, new solutions have to be co-designed and tested with firefighters (Denef et al., 2008).

Turning to visually impaired users, apps and digital solutions could be significant helpers – provided developers take into account that nice visuals are not enough. Jakob Rosin, a tech journalist and the head of the blind association in Estonia, pointed out in his presentation at the UX Tartu 2020 conference that smart devices have screen readers and voiceovers, but the UI is built around a visual interface and is not intuitive for a non-sighted user. Moreover, to convey the text, screen readers do not interpret the rendered layout but read the underlying code, which is often badly written and therefore makes no sense to a visually impaired user. In addition, although users can customise some solutions via voice commands, updating the app or system often deletes those changes. There are, of course, solutions like Voxmate or Be My Eyes that focus directly on visually impaired users' needs, but downloading and setting them up might require help from a sighted person. As an interesting recommendation, Rosin suggested focusing (in addition to visuals) on audible identity – apps or platforms could have custom-made tones for notifications, texts, signatures, etc. to distinguish different sections of content (Rosin, 2020).
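Rosin's point that screen readers read the underlying code rather than the rendered visuals can be illustrated with a small sketch. This is a deliberately simplified, hypothetical model (not a real screen-reader API): it only shows how an element with no accessible label leaves a non-sighted user with nothing meaningful to hear, while a labelled element makes sense without seeing the icon.

```python
# Simplified sketch of how a screen reader derives its announcement
# from the markup attributes rather than from the visual rendering.
# (Hypothetical model for illustration, not a real screen-reader API.)

def announce(element: dict) -> str:
    """Return what a screen reader would plausibly read for an element."""
    role = element.get("role", "text")
    # Accessible name: an explicit label wins, then visible text; else nothing.
    name = element.get("aria_label") or element.get("text") or ""
    return f"{name}, {role}".strip(", ")

# An icon-only button with no accessible name: the user hears only "button".
unlabeled = {"role": "button", "icon": "magnifier.svg"}
# The same button with a label makes sense without seeing the icon at all.
labeled = {"role": "button", "icon": "magnifier.svg", "aria_label": "Search"}

print(announce(unlabeled))  # "button"
print(announce(labeled))    # "Search, button"
```

This is the gap "badly written code" creates: the visual design may look polished, yet the announced output carries no information for the listener.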

Therefore, it could be said that the Miro board shows just a fraction of the directions that design for limited senses could take. Designing for firefighters would need their direct contribution and more analog tools, while voice-based interfaces could make blind users' navigation significantly easier. It would therefore be interesting to work with voice-based rather than visual solutions, as this takes into account users without sight and creates more inclusivity.
REFERENCES:
Denef, S., Ramirez, L., Dyrks, T., & Stevens, G. (2008, February). Handy navigation in ever-changing spaces: An ethnographic study of firefighting practices. In Proceedings of the 7th ACM Conference on Designing Interactive Systems (pp. 184–192).
Kaipio, J. (2020, June 23). Designing for Healthcare Experiences from a Multistakeholder Perspective. https://hcrdcrisis2020.wordpress.com/2020/06/22/designing-for-healthcare-experiences-from-a-multi-stakeholder-perspective/
Laarni, J., & Valkonen, P. (2020, June 30). Designing Multi-modal Interfaces for Demanding Conditions in Crisis. https://hcrdcrisis2020.wordpress.com/2020/06/29/multi-modal-interfaces-crisis/
Rosin, J. (2020, May 29). How UX shapes the world of blind and visually impaired users. UX Tartu 2020. https://www.youtube.com/watch?v=6emfy_Z5Njk