Evidence

Evidence From Schools


The Professional Learning Community has gathered evidence about the effectiveness of the gesture-based software and hardware it is using. Most of the schools have contributed evidence, mainly in narrative form given the nature of our pupils.

Trinity Fields School and Resource Centre


Somantics Post-Graduate Action Research Project

A post-graduate action research project for a Post-Graduate Diploma in Professional Development in SLD/PMLD at Swansea Metropolitan University introduced a teenage boy with ASD, working at around P5 National Curriculum level, to Somantics using the Kinect in March 2012 in the school ICT suite. He was very disengaged from most other activities in school, so before the research he was observed during four activities and given a baseline score for each on the Leuven Scale of Involvement and Well-Being. The Leuven scale is a standard measure of Involvement and Well-Being, usually used in mainstream early years settings, but it fitted this research well because the aim was to increase his engagement in school. Link: http://www.kindengezin.be/img/sics-ziko-manual.pdf
He scored very low for the four baseline activities, around 2 on the scale. He was then introduced to Somantics during short one-to-one sessions, around four a week over a month, and video evidence was taken of every session for later analysis. His first interaction with the system scored an average of 3-4 on the Leuven scale. By the end of the sessions he was scoring 4-5 for both Well-Being and Involvement while he interacted with his favourite programs on the system, mainly Painter and Sparkles. The score for each assessed session (the control activities, the initial session and the final session) was calculated by recording the best-fit scale descriptor for each 10-second period over two minutes.
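The scoring method described above amounts to averaging per-window descriptor scores. A minimal sketch, assuming a 2-minute clip split into 10-second windows (the scores below are purely illustrative, not the study's data):

```python
# Hypothetical sketch of the Leuven-scale scoring described above: each
# assessed 2-minute clip is split into 10-second periods, a best-fit
# descriptor score (1-5) is recorded for each period, and the session
# score is the mean of those period scores.

def session_score(period_scores):
    """Average the per-10-second Leuven scores for one 2-minute clip."""
    if len(period_scores) != 12:  # 2 minutes / 10 seconds = 12 periods
        raise ValueError("expected 12 ten-second periods")
    return sum(period_scores) / len(period_scores)

# Illustrative only: a low-scoring baseline clip and a later clip.
baseline = session_score([2, 2, 1, 2, 2, 2, 2, 1, 2, 2, 2, 2])
final = session_score([5, 4, 5, 4, 5, 5, 4, 4, 5, 5, 4, 4])
print(round(baseline, 2), round(final, 2))
```

This keeps the observer's judgement (the best-fit descriptor per window) but makes the aggregation step explicit and repeatable across sessions.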
He showed a marked improvement in his physical movements during the project. Normally he exhibits only stereotypical movements and transitions from place to place walking quickly with his head down; he had also recently begun refusing to attend PE lessons. With Somantics he began to move his upper body, then move around the area, waving his arms and kicking his feet out to interact with the system. Verbally he is echolalic, both delayed and instant, and will label pictures he knows when shown them and asked 'what is this?'. During the trial, however, there were three occasions when he named the colour on the screen independently and with no prompting. While he had shown all of these behaviours at some point previously in his life, none of them had occurred in school for at least 12 months. The action research project was submitted for a Post-Graduate Diploma in SLD/PMLD course at Swansea Metropolitan University and was graded A. The essay is here:
kniect1.png
increased movement.png
These two photographs are an attempt to show the pupil's static movement at first, increasing to him using the whole space after a number of sessions.

choosing activity independently.png
Eventually he would choose which program he wanted from the main menu screen.
See this video for a full, detailed description:

Ben's Story, Trinity Fields School, SHAPE project, funded by ESRC, 17 July 2013 (copy with titles, from Leah McLaughlin on Vimeo).



Update June 2014:
This pupil continues to use Kinect programs in interaction sessions at least once a week. He still uses Somantics, but we have extended this with the Processing Kinect interactive programs as well. He varies his interaction and movements according to which program he is using, showing more engagement and different movement patterns. We are now focusing on furthering his choice-making using photographs of the different programs.

Other Findings

Somantics has proven very popular with our students with ASD. Initially they are hesitant about using the system, as they have not experienced anything like it before, except perhaps a mirror, but mirrors only copy you and don't produce sparkles or start painting the room around you! Some ASD students respond well to the programs that show the camera image of themselves; others respond more to the programs that react to movement in an abstract way. Not all pupils respond well to it, however; like anything else, some like it and some don't. There are also issues in that our visually impaired pupils cannot use the system. It will track every user, though, and so is very adaptable for wheelchair users.

Also watch this video about what Callum gets from Somantics:

Callum's story, Trinity Fields School, SHAPE project, funded by ESRC, 17 July 2013 (copy with SHAPE titles, from Leah McLaughlin on Vimeo).



And this one on Charys' different interaction with it.

Charys' story, Trinity Fields School, SHAPE project, funded by ESRC, 17 July 2013 (copy with titles, from Leah McLaughlin on Vimeo).



And how Jordan interacts with the system- an excellent case study:

Jordan's story, Trinity Fields School, SHAPE project, funded by ESRC, 17 July 2013 (copy with SHAPE titles, from Leah McLaughlin on Vimeo).



All these videos were compiled as part of the SHAPE project at Birmingham University alongside CARIAD. Click here for some more impact evidence from The Hollies School in Cardiff.

Visikord has proved popular with our pupils too; pupils with ASD, Down Syndrome and GDD all enjoy interacting with the system and watching the changes it makes to their alternative 'graphic bodies'. It has been used in PE lessons and works especially well as a movement motivator with music blaring! It will only track active wheelchair users, though.

Anthony Rhys

Oak Grove College, West Sussex


In terms of gesture-based technology, we have used Po-motion with a webcam, and the Kinect with different types of software including Visikord and Somantics.

We have set Po-motion up in the sensory room, and teachers using it report that the students are engaged and love exploring the cause and effect that occurs when they move their bodies. Here is the link to the blog post I wrote about Po-motion: http://senclassroom.wordpress.com/2012/05/09/po-motion-interactive-wall-in-the-sensory-room/
Though we use this with a webcam, the developers have created the software to work with the Kinect as well. We are exploring different applications with the developers, including different scenes etc.

Visikord is one piece of software that we have used quite a bit with our students, and they love to explore the different effects it produces to the music. We have also used it for physio with one student, as it encourages him to reach up to the top of the screen to make things happen. Here is the link to the blog post I wrote about Visikord: http://senclassroom.wordpress.com/2012/06/04/kinect-and-sen-visikord-turning-music-into-visuals/

James Winchester

Heronsbridge School, Bridgend.


Heronsbridge have started their Kinect journey. I saw some evidence today (12-12-12) from Heini of the class using the Snowbells program with pupils with ASD for cause and effect work. The footage was perfect as a starting point for the interaction work: although the pupils were enjoying the program, it was not yet clear whether they had made the link between their actions and the sound of the bells, so there is good potential there.
The school have also been trialling Visikord and the lovely Davor Magdic from the company has been in contact with them too!
I also showed Allison, the ICT co-ordinator for the main school body, the Kinect programs that are available, and the school is hoping to get more programs up and running ready for the new year. Sylvia from the Autistic Unit is also getting on with the Somantics installation. Roll on 2013!

Exeter House Special School, Salisbury.


Context

At Exeter House we are involved in a Wiltshire wide project called Inspire Create Teach. The project as a whole is about using new technologies in creative ways to raise attainment for Wiltshire pupils and to disseminate good practice to colleagues.
In particular, the Exeter House Inspire Create Teach project is attempting to address the following question:
How do you engage pupils with Complex Learning Difficulties and Disabilities (CLDD) and how do you motivate them to communicate?
Pupils have been selected to take part in the project who meet the following criteria:
  1. Not currently making expected progress (according to Progression Guidance).
  2. The reason for lack of progress is lack of engagement.
  3. The pupil is likely to be motivated by technology (cause & effect, visual, auditory, movement etc.).
  4. The pupil is likely to make significant progress through intervention within one year.
The pupils are being baselined in three areas.
  1. Attainment as routinely measured by the school and recorded in the BSquared assessment system.
  2. Communication assessment (Pre Verbal Communication Schedule) conducted jointly with our Speech & Language Therapy colleagues.
  3. Engagement as measured in Engagement Profile and Scale.
For some of the pupils selected, the chosen technology is iPads running a range of apps thought likely to engage them. However, we have some pupils for whom this is a less promising avenue, and for whom we are very excited about the possibilities of engagement through gestural interaction, in particular the Kinect sensor and the work of colleagues involved in this wiki.

Engagement

What we hope to see in these three measures is an initial rise in levels of engagement, followed sometime later by an improvement in levels of communication and followed after that by improved learning in other areas.
It seems to us that the most obvious benefit of using gestural interaction with the computer and in particular the Kinect Sensor is its ability to engage pupils.
With this in mind I recommend looking at the DfE Complex Needs training module on Engagement.
http://www.education.gov.uk/complexneeds/modules/Module-3.2-Engaging-in-learning---key-approaches/All/m10p010a.html
(There are a huge amount of training materials here and I would also recommend using them in an organised way for professional development.)
If you follow the training module you will learn about the Engagement Profile and the Engagement Scale. These tools would be really useful with students who are accessing apps through gesture: they help in understanding the nature of a student's engagement and offer a realistic way of assessing the changes brought about by the gesture work. This will greatly strengthen the case we can make to senior managers when arguing for time and resources to carry out this work.

Progress so far


In terms of gestural interaction via the Kinect Sensor, we have made a few faltering but very encouraging steps using Visikord software. It has been used both with pupils involved in the project and some who are not a part of it.
D. is confined to a wheelchair but has some good movement in his arms. He generally operates at about P6. He is not part of the project and is not generally difficult to engage; however, it has been fascinating to watch him using Visikord. He makes lots of very large arm movements, which are not easy for him. He engages with it in response to the music and likes to dance through the field of view in his powered wheelchair. He has also explored holding one hand up as still as possible and carefully observed the effects of making small movements with one or two fingers.
L. is also not part of the project; he is able-bodied but emotionally very troubled. His engagement with the Visikord is total, and he looks forward to further sessions as soon as he has finished. It has been possible for an adult to move alongside him, either mirroring his movements or in sympathy with them. He seems to be on the verge of being able to follow an adult in their movements as well as being able to lead them, and I hope this will help with his turn taking and sense of exchange in a conversation. Although he is not one we would normally identify as a candidate for Intensive Interaction, his work has shown me that this sort of work via the Visikord display is a distinct possibility for some pupils (probably those on the Autistic Spectrum) who may engage with a person indirectly through shared dance interpreted on the screen by a Kinect sensor.
I. is a member of the project and he unsurprisingly has so far been the least engaged of the pupils who have used Visikord. However he has been willing to sit on a chair angled sideways and has happily swung his legs to the music. His feet have then taken on the hand powers and he watches with a smile as they light up the screen. There are many variables in terms of music, loudness, environment, visual characteristics etc. that we have to work with to make this more appealing to him. In this we will be guided by information gleaned in his Engagement Profile.
So far we have only dipped our toes in the waters of gestural interface by using Visikord. The possibilities of Somantics and other programs identified elsewhere on this wiki are very exciting and we look forward to exploring them further in the New Year.

Andrew Walker


Reactickles and SoundBeam Session


Exeter House School, 9 February 2013

Inspiration

Wendy Keay-Bright's talk about Somantics, Reactickles etc. at the PLC, and the role of body movement in self-esteem, suggested to me the following idea to try with a pupil of mine who has cerebral palsy (CP).

Problem

Access has long been a challenge for S. We have tried a range of switches in a range of positions and have had some success, but it is always hard work for S. He comes to Kinect / VisiKord sessions and is always engaged in what the other students can do, but he is unable to make sufficiently large movements of his head, arms or legs to be effectively detected by the VisiKord camera outside the space of his wheelchair.

Solution

S. is, however, an effective user of SoundBeam, because the beam can be varied so much in its direction, length, divisions etc. Using the Reactickles suite set to receive microphone input, S. could suddenly exercise huge control over the software through the following chain of events:
  • His movements are detected by the beam
  • Soundbeam generates musical notes related to his movements
  • The microphone detects the notes
  • Reactickles uses the input from the microphone to activate the onscreen animations
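The chain above can be modelled as a simple pipeline of stages. A minimal sketch, with all function names hypothetical (the real setup is SoundBeam hardware plus the Reactickles software, not code we wrote):

```python
# Illustrative model of the access chain described above: a movement's
# position within the beam maps to a MIDI note, the microphone hears the
# note, and Reactickles animates whenever the microphone detects sound.
# All names here are hypothetical, for illustration only.

def beam_to_note(position, divisions=12, low_note=60):
    """Map a position along the beam (0.0-1.0) to a MIDI note number,
    one note per beam division (12 divisions by default)."""
    step = min(int(position * divisions), divisions - 1)
    return low_note + step

def mic_detects(note_playing):
    """The microphone simply registers whether any note is sounding."""
    return note_playing is not None

def reactickles_active(mic_input):
    """Reactickles drives its on-screen animation from microphone input."""
    return mic_input

# A small head movement partway along the beam produces a note,
# which in turn drives the on-screen animation.
note = beam_to_note(0.4)
animating = reactickles_active(mic_detects(note))
print(note, animating)
```

The point of the model is the design choice it makes visible: because each stage only consumes the previous stage's output, the pupil's small, achievable movements are translated into full control of the screen without any direct contact with a camera or switch.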

Reactickles Soundbeam 1.jpg

Features of the setup we used are:

  • The beam is positioned such that it is possible for the student to rest outside the beam and so have silence, and therefore no onscreen activity. This is recommended good practice for any SoundBeam session and helps to make it clear to the student that it is his movement that is causing the sound.
  • A short, compressed beam (i.e. long minimum length and short maximum length) with a relatively large number of divisions (in this case 12-18) allows a small range of movements (mostly turns or nods of the head) to produce a wide range of notes.
  • The beam pitch sequence is set so that the notes played are fragments of a tune, which makes it more appealing to the ear.
  • We altered the Midi Instrument associated with the beam to be different with each Reactickles animation, to help keep a clear distinction between them.
  • Staff kept very quiet so that almost all noises (and consequent animation) were generated by the student. This is also important I feel because students who have so little control over events in their life will tend to expect the member of staff or carer to be the active participant. I didn’t want this session to be about interaction with me, but rather him being an active learner making music and animations take place, not needing me to work it for him.

Outcomes

This session lasted about 30 minutes after setting up and the student was highly engaged throughout. He appeared relaxed but active and able to control the music and animation with much less effort than he is used to through switches. There was also some sign of the phenomenon (described by Hector in the eyegaze presentation at the PLC) that pupils who are used to having such restricted control over their field of view develop greater attention to peripheral vision. However for the most part S. was very focussed on the screen. There was the additional benefit from the setup that when S. was particularly excited, he did vocalise and this was detected by the microphone as well, so effectively he could use voice and movement to explore the music and visuals.
This simple connection, i.e. accurate movement detection from the SoundBeam allied to the sound input of Reactickles and other software, offers us fantastic new avenues to explore with this pupil and many others. I hope to post more on this in the next few weeks.
Andrew Walker

Crug Glas Special School, Swansea.


Kinect:

Over the last year or so we have bought a Kinect and created a 'Movement room' to give children a space where they can interact with the Kinect on a one-to-one basis without other distractions. Initial teething problems with the software, shadows and removing all distractions made it a slow start, but we have seen some children engaging with themselves on screen where they would not have been interested otherwise.

Pupil K particularly enjoyed the Somantics package, especially sparkles and kaleidoscope. Pupil K is 12 years old. He is the second child of twins and has severe developmental delay with features compatible with ASD. He is generally a passive child who does not seek interaction with his peers. He has no speech but his vocalisations tend to be a continuous humming.


Pupil K shows some key areas of development when using Sparkles from September to December 2013. He initially wanders around and tries to touch the projected screen, but soon adjusts his movements to move out of the projector shadow and view his own sparkling reflection on the screen. He claps and watches the sparkles, and intentionally stills his movement to stop the sparkles and then watch them restart as he moves again. He also explores Kaleidoscope by moving intentionally around the responsive area, looking at when his body image appears and disappears. His general engagement and interaction levels improve over the three-month period.

Eye gaze:

Eye gaze technology has taken off at quite a pace within the school. Once we had the assessment day we were able to get funding for a Tobii eye gaze system almost straight away and quickly did some fundraising to back that up with another camera. A number of parents have got on board and have bought systems for their children, some of which get sent back and forth between school and home at the weekends.



The eye gaze technology is opening up a world for our children that they otherwise have been unable to access or have found very difficult to access. It is aiding our children in cause and effect type programs right through to developing communication.

After taking one child (William) for an assessment session, I wrote up the session and created a tick sheet based on it and on ideas from the Look to Learn checklists. These enable us to keep track of progression and use, and are quick and easy for staff to fill in at least twice a term (see Word docs in the evidence file).
Annie