Suzanne Buckley

Suzanne Buckley is an AAC Consultant at the ACE Centre, Abingdon.

“Can I say a HUGE thank you for allowing me the opportunity to attend this virtual conference. It is truly amazing what technology can help us achieve with the right support and features in place. It is time now we help spread the word!”



Restoration and recovery of services using technological solutions and supports. Working with adults with acquired communication disorders post-stroke.

It is so exciting to be working in this field at a time when technology, and the general population’s access to it, has advanced significantly in the last ten years. Moreover, the pandemic has sped up the move towards better use of technology, particularly in the NHS, that would in normal times have happened slowly and incrementally (Ford et al., 2020). Neurorehabilitation services have had to make radical adaptations quickly so that the quality and deliverability of essential and timely rehab is no longer disrupted, as it was in the beginning. Community teams have made a rapid move towards working remotely, with families better able to support and set up video calls, even for those who see themselves as ‘not tech savvy’.

This article will touch on some of the practical ideas I have gained from attending the ATIA 2021 conference that can be applied to working with individuals following a stroke. This will encompass ideas for therapy as well as adaptations to support a life participation approach (Chapey et al., 2000).

News flash! “Assistive technology can support communication, facilitate the reacquisition of language, support orientation and engagement and even health and safety – all these benefits and more…and many are free!”

It’s great to see the big tech companies finally tackling accessibility in a meaningful way. Microsoft’s Immersive Reader is one such tool, with a wide range of applications for people with acquired communication difficulties. Word documents can be read aloud with the rate of speech slowed down for those with receptive language difficulties; syllable segmentation can be turned on as a practice tool for those with apraxia of speech or non-fluent aphasia; grammar options can highlight different parts of speech for those working at the sentence level of language processing; and pictorial supports above words can be turned on to aid reading comprehension. All have excellent practical uses in supporting the reacquisition of language, or participation in work and life activities that were once the norm.

Changes to the way the page looks, for example applying the Line Focus feature, can reduce visual crowding, which is enormously helpful for those with cognitive or visual processing difficulties who need support to attend or to cut through visual clutter. Even web browsing has become accessible: by right-clicking anywhere in the Microsoft Edge browser you can select ‘Read aloud’ and it will read out any website, anywhere in the world. Microsoft’s free Office Lens app, available on iOS or Android, lets you take a picture of a page in a book, or an article in a once-loved but now impossible-to-read magazine, then send that photo to Immersive Reader for the text to be read aloud. An individual could use the voice output to provide samples for speech practice: if they were aiming for a goal of reading a bedtime story to the grandchildren, for example, Immersive Reader could be used to practise towards that meaningful goal, independently and in their own time. The Dictate feature in Word can give immediate feedback to those with expressive language difficulties, who can use a strength in reading to self-monitor their spoken output. For those with cognitive communication difficulties, the Editor feature can provide feedback on verbose or tangential language, supporting them to be more concise. There is also a lightweight, free word prediction feature that you can switch on in Windows 10, worth exploring for those who have difficulty writing before purchasing higher-grade word prediction software such as Claro’s or Don Johnston’s.

Microsoft Teams has all the accessibility features described above, which can help support real-life conversations and chats with family and friends. You can turn on aphasia-friendly features, for example pictorial supports, or have words read aloud to support understanding. This opens up a whole world of accessibility to the individual with acquired difficulties: ‘Chats’ between them and their therapist or stroke consultant, for example, can be saved so they can refer to them another time. Another feature that could be a huge help when attending complex medical discussions remotely is real-time transcription. Conversations can be recorded and the transcription downloaded and read through afterwards. This takes the pressure off family or carers trying to write down important points while also being involved and ‘present’ in the conversation. They can then copy and paste the transcription into Word, turn on Immersive Reader, and have a record that can be read aloud and returned to time and time again. For those who are bilingual and have ‘lost’ their second language but retained their first, or whose family members speak little English, you can apply a translate feature to selected words or indeed translate a whole document. Indeed, the free Microsoft Translator app can translate what you say into multiple languages for multiple people, in real time!

PowerPoint’s Presenter Coach can be turned on and used as a therapeutic tool for those who need to ‘rehearse’ a speech and then receive the Presenter Coach’s ‘feedback’ report. Again, this is brilliant for those with aphasia or cognitive communication difficulties, letting them practise in a ‘safe’ space and taking away the fear factor of it being ‘real life’. Sticking with PowerPoint, another feature that can help engage family and friends in remote therapy sessions is PowerPoint Live, a free feature built into PowerPoint for the web. A practical use could be for the individual to put the word-finding strategies they have learnt in a therapy session into practice by giving a mini ‘speech’ to friends and family. Their friends and family can follow it live and provide their loved one with immediate feedback, sending emojis of hearts or thumbs-up to encourage and praise. Microsoft’s free Seeing AI app for iOS is another excellent tool that opens up the visual world and can describe and ‘name’ the world around the user. Having the audio description alongside the visual support, i.e. a picture of a room or object, can help cue in language for those with word-finding difficulties.

Google AI’s Project Euphonia is another example of how the big tech companies are working to help people access speech recognition software that was previously unavailable to them because of their impaired speech. The team needs us to spread the word so that people living with dysarthria know about the project. Interested participants can sign up via the interest form at g.co/euphonia; they are particularly looking for people with moderate to severe dysarthric speech. Gmail’s Smart Compose is another example of Google’s AI at work and can be switched on as a writing support tool. Google’s free Look to Speak app is also a fantastic tool, allowing people to use eye gestures to communicate key messages from their mobile phone. It can be used offline, in direct sunshine or rain, even in the wet room. Furthermore, it can serve as a vital communication tool for those in critical situations, for example in ICU on ventilation, and it is incredibly easy for medical professionals to programme and edit.

Moving on to Google’s free accessibility apps, Action Blocks allows people to condense commands down into a single tap. There are so many practical uses, including providing a rudimentary but free AAC solution for people who don’t otherwise have access to one. For example, PCS symbols and ‘speaking’ blocks placed on the user’s home screen can tell others their name, provide ‘In Case of Emergency’ information, or text a family member to say they are ready for a video call, or that they are home. One tap of a button on the home screen can trigger environmental controls such as turning on a fan or playing a favourite radio station. Another possibility would be to easily launch videos made in a previous therapy session, role-playing a scenario to support real-life participation, for example going to a pub or café and ordering a favourite drink. By capturing those dynamic routines on video, we as therapists can now capitalise on video prompting to support communication and life participation.

It is truly amazing what technology can help us achieve with the right supports and features in place. It is time now we help spread the word!

References

Chapey, R. et al. (2000) Life Participation Approach to Aphasia: A Statement of Values for the Future. ASHA Wire.

Enderby, P. & Petheram (2002) Has Aphasia Therapy Been Swallowed Up? Clinical Rehabilitation.

Ford, G. et al. (2020) Restoration and recovery of stroke services during the Covid-19 pandemic. Oxford Academic Health Science Network.