As robots become ever more sophisticated, what future role could they play in the early years – and do practitioners need to fear for their jobs? Charlotte Goddard reports
In nurseries and schools across the world, the robots are coming. Some are furry, some are futuristic. Some look like people and some look like machines. Some use artificial intelligence (AI), while others are controlled by humans.
Last year, the Department for Education published its EdTech strategy, setting out the Government’s vision for the use of technology across education. The department is working with charity Nesta to fund 15 innovative products, several of which are underpinned by AI, and has set up an AI Horizon Scanning Group to explore how AI might impact education policy, as well as the benefits it could bring.
Not all robots use artificial intelligence, and not all devices that do use AI are robots – some may be apps, or Alexa-type gadgets. Some robots and other devices can personalise their interactions with children, tailoring content and delivery depending on a child’s behaviour, pace of learning or interests. In Sweden, for example, there is a robot that can tell if a child in the group is not interacting with others, and will draw that child into the game.
Researchers at the University of Plymouth found primary school children had a better knowledge of a new subject when they learned about it from a personalised robot, compared with a non-personalised one. However, the approach is not without controversy, with concerns about privacy, potential bias and the marginalisation of teachers (see ‘ethics’ box).
A growing use for robots in the early years is supporting children with SEND, especially those on the autistic spectrum. Many studies show robots have a positive impact on children with autism, but there are some concerns that robot therapy risks further isolating such children. However, Tony Belpaeme, professor of robotics at the University of Plymouth, says this has not been his experience. ‘When a child has successful interactions with the robot, usually parents, therapists and teachers will report other positive effects,’ he says. ‘It seems to be a catalyst for other interactions.’
The University of Hertfordshire’s humanoid Kaspar robot has been around since 2013 and is used by early years centre Tracks Autism in Stevenage. Kaspar was developed to help children interact, communicate and explore basic emotions, and can engage children in play to help them learn social skills such as imitation, collaboration and turn-taking.
Kaspar uses his sensors to help children learn about socially acceptable touching – for example, if a child hits him he will move away and say ‘ouch’.
‘I thought there should be some response because if a child slaps the robot and nothing happens, then slaps another child in the playground and they slap back, they wouldn’t necessarily know why,’ says the University of Hertfordshire’s Dr Ben Robins.
It is important for interaction with Kaspar to be mediated by a teacher or therapist. ‘A child might continue hitting because it makes the robot say something, so we need to teach the consequence of the action – “look at Kaspar’s face, he is sad, why is that?”.’
‘One very quiet child in the first year of primary school would not look at the teacher’s face,’ says Dr Robins. ‘He loved exploring Kaspar’s face, controlling his eye blinking, for example. Then he started to explore the teacher’s face; it was the first time he had looked at her. Something in the robot allowed this to happen.’
Dr Robins thinks a robot’s predictability is one factor. ‘Human behaviour can be subtle, and here is something very simple and predictable – if they press the button, the same thing will happen – they can explore and feel safe,’ he explains.
Fine Art Miracles (see box) chief executive Tess Lojacono says humans instinctively want to communicate with robots. ‘The reason social robotics works is that when humans see something their brains perceive to be humanoid but not exactly human, they will have an irresistible urge to communicate with it. So if you put a robot with children who have difficulty communicating, that urge will help them to reach out.’
A robot grabs the children’s attention, and because it is non-judgemental it enables them to interact with it on their level, as a peer, a pet, or even as a pupil.
However, while in the future there may be a robot in every classroom, roboticists say a classroom will never be staffed entirely by robots. A robot is seen as an assistant, not a replacement teacher.
There are some things a robot cannot or should not be used for, says Goren Gordon, head of Tel-Aviv University’s Curiosity Lab. ‘Robots should not be involved with anything related to discipline, and emotional stuff,’ he says. ‘If a child is disruptive, the robot shouldn’t handle it – at most it should call the teacher.’
Right now cost is a factor, as most robots are made or programmed by researchers rather than being mass-produced. The Curiosity Lab robots, however, are 3D-printed, with raw materials costing around $150. ‘Some researchers say let’s first make it work and then let’s make it cheap,’ says Gordon. ‘My agenda is to make the stupidest, cheapest robot and see if it works.’
Local secondary school children are building the robots for pre-schools as part of their science lessons. ‘It’s totally scalable, they don’t need us,’ says Gordon.
PEERbot in the USA
Fine Art Miracles (FAM) has created a robot designed to appeal to pre-schoolers. The not-for-profit organisation has drawn on its experience using robots to interact with older people with dementia and children with autism to come up with a robot for children having difficulty with issues such as socialising, making choices and taking turns.
‘We went to a professor of early childhood development and asked “what is the biggest development issue for nought to three-year-olds?”,’ explains Tess Lojacono, FAM chief executive. ‘Without hesitation she said “joint attention” – the ability to share a focus on something with another person through gaze, pointing or other interactions. It’s important in social and emotional development and the development of language, providing children with information about their environment, and is the first indication of communication among humans.’
FAM decided to combine joint attention with literacy and social and emotional learning, designing short sessions involving members of its robotic crew. Each robotic interaction lasts around 15 minutes and involves two or three two-year-olds. The PEERbot’s face is a tablet, allowing for changes in colour and expression, and an adult instructor controls the PEERbot with another tablet, which has buttons that make the robot speak pre-programmed phrases and words. The teacher can also quickly add phrases if a child says something for which the robot has no programmed response. The presence of a professional operating the robot is vital: the robot is a tool, not a replacement teacher.
Each session is based around a particular book, and PEERbot’s appearance can be customised accordingly. Interactions centre on a learning mat. Children pick up animals from the mat and place them in a cart, with the robot encouraging them to work together and asking them questions.
‘Children are able to connect with the robot, create relationships, and communicate more easily than with a human because it is non-judgemental and really cute and fuzzy,’ Ms Lojacono says. ‘They want to please the robot so they do what he asks, making their task completion skyrocket. This changes their attitude from “I can’t” to a resounding “I can”.’
The ethics of robots in the early years
Undue influence: The University of Plymouth recently published research that found children were significantly influenced by robots, and tended to trust their answers even when they were wrong. Problems could arise from robots that recommend particular products, as children can be highly susceptible to this form of advertising.
Privacy: Robots and AI learning systems generate huge amounts of data on individual children. There are questions around who owns this data and what it might be used for.
Security: Stored data can be hacked, but robots with cameras and internet access can also allow hackers to view children in the classroom. ‘I know several robots in classrooms right now that have that vulnerability,’ says Professor Belpaeme. ‘Security isn’t necessarily high on the agenda of the people making that technology.’
Social and emotional development: Should a child be allowed to be rude to a robot? Telling them to be polite to a device which has no feelings could lead to confusion, but allowing rudeness could impact a child’s interactions with humans. Programmers are already building devices that aim to discourage rudeness. ‘Colleagues carried out a study in Japan, where they put a robot in a mall,’ says Professor Belpaeme. ‘Kids tended to bully the robot, standing in front of it so it couldn’t move.’ Some people also fear that children will become more comfortable with robots than with humans, and even start to imitate their robotic behaviour.
Unconscious bias: Devices built on algorithms trained with data from one set of children may be less able to predict the needs of another demographic group, much as some facial-recognition systems have trouble identifying people with darker skin. There are also concerns that some personalised learning devices may work against disadvantaged children, taking away opportunities to progress by only offering work at a certain level.
Replacing teachers: There are concerns robots may eventually replace teaching assistants or even teachers themselves. In the current recruitment crisis, using robots to ease the workload may seem attractive but in the long run could lead to less effort put into recruiting and training teachers. Also, control over the content of lessons and the curriculum may pass from schools and government to the manufacturers of robots and learning devices.