Thirty-Six Years in Assistive Technology


This month, the blogging team at Sound For Schools was fortunate enough to get in touch with SEN education technology specialist Martin Littler. We asked him to write a piece for us on his vast experience in the field and his outlook on the industry. This is what he had to say…


In May I was lucky enough to be invited to give a keynote presentation at the Inclusive Learning Technologies Conference in Brisbane, Australia. My talk was on the history of Assistive Technology – perhaps because I’m old enough to have lived through all of it!


I was also required to look into the future and see if what had gone before cast any light on what is to come. I think it does. For me, Assistive Technology arrived with the Micro Computer in 1978. It’s been around for 36 years. For my money, the next 36 months will be as exciting as the last 36 years!

Martin Littler and Steve Gensler.


Back in 1976, pioneers like Steve Gensler in San Francisco and Paul Schwejda in Seattle were experimenting with build-it-yourself kits that would become the Apple, Commodore and Tandy computers released late the following year. That is how, as soon as these “micro-computers” arrived, Steve had an early “Unicorn Board” (later called “IntelliKeys”) ready for the Apple and Paul had cracked switch access for the Tandy. There were also switches and overlay keyboards ready for the Apple via the “Adaptive Firmware Card”.


Tim Kitchen’s early overlay keyboard on the cover of a 1980 edition of Educational Computing.


Meanwhile in the UK, an old colleague of mine, Tim Kitchen of Walsall, had a working overlay keyboard paving the way for the enormously successful Concept Keyboard on the BBC computers of the 80s and early 90s which even outsold America’s IntelliKeys.


These devices were important because they allowed direct access to words, pictures, concepts, communication and learning: see the thing you want, point to it and it’s yours! No keyboards, no mice – just point at what you mean.


 Direct access to learning back in 1989.


You can see overlay keyboards, heavily disguised as farmers’ fields on the video of the 1989 “Micros for Special Needs” Exhibition in Oldham.


Shortly after the video opens, you see a young girl using an early monitor adapted for touch – another direct access device. It was another ten years before integrated touch monitors appeared in schools, with special schools again taking the lead. Then, in 2010, the iPad arrived and now everyone has direct access to what they want to know or do.


Point at what you want with the Robin Light Pen. 


Back in 1988, there was a pointing device called the Robin Light Pen, invented by an old lecturer of mine, Mike Robinson, in Liverpool. Mounted on your glasses or helmet, it worked as a hands-free pointing device – but only with a very limited range of special software, perhaps the half-a-dozen programs on the cassette in the picture! A number of head-pointing devices have appeared since, using the movement and position of your head to move the mouse cursor around.


However, the breakthrough has come with eye gaze devices. These use a camera to track reflections from your eyes precisely, keeping a record of exactly where you look and where you linger. Eye gaze has enabled some very young children with profound and multiple learning difficulties to make choices on screen and communicate for the first time.


Eye Gaze is becoming more affordable.


With special software, eye gaze also allows specialist teachers and therapists to record what an individual child can see: whether they can follow movement on screen, where they look and how long they fixate. This information is used to plot ‘heat maps’ showing which parts of the screen got the most attention. Collected across many children, such data is already revealing different patterns of looking in learners with dyslexia and those without. Prepare for some really interesting research on looking and learning.
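The ‘heat map’ idea described above is, at heart, a simple aggregation: each fixation has a screen position and a dwell time, and the software bins those dwell times into regions of the screen. As a rough illustration only (the function name, grid size and data format here are my own assumptions, not any particular eye gaze product’s API), it might look like this:

```python
from collections import Counter

def gaze_heatmap(fixations, screen_w=1920, screen_h=1080, cols=8, rows=6):
    """Bin (x, y, duration_ms) fixations into a coarse grid of dwell times.

    Returns a Counter mapping (row, col) grid cells to total milliseconds
    of attention - the raw data behind a visual heat map.
    """
    heat = Counter()
    for x, y, duration_ms in fixations:
        # Map the pixel position to a grid cell, clamping to the edges.
        col = min(int(x * cols / screen_w), cols - 1)
        row = min(int(y * rows / screen_h), rows - 1)
        heat[(row, col)] += duration_ms
    return heat

# Example: two fixations near the top-left, one near the bottom-right.
fixations = [(100, 90, 400), (150, 120, 250), (1700, 1000, 300)]
heat = gaze_heatmap(fixations)
# The top-left cell accumulates 650 ms of attention in total.
```

A real eye gaze system does far more (calibration, filtering raw samples into fixations), but this is the essence of turning looking into measurable data.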


Steve Gensler and Paul Schwejda’s first experiments were to connect a single switch to a computer. Via such a switch, with the right software, people who can make only one voluntary movement or sound can speak, write, learn and control their environment. The best known of these switch users is Stephen Hawking, who can write books, lecture and debate with just a slight movement of his cheek (though even this is fading).


I expect eye gaze (and over time, thought control too) to not only liberate the considerable talents of people with disabilities, but to enter the lives of us all and be as ubiquitous, useful and easy to use as the iPad. Watch this space!


About the author: Martin Littler

Martin is Chairman of Inclusive Technology Ltd, a Trustee of the ACE Centre and was Founding Chairman of the British Assistive Technology Association. He has been involved with educational computing since 1979 – as a Liverpool Deputy Head, Lancashire Advisory Teacher, and Director of Manchester SEMERC from 1986 to 1996.

