Understanding how the size of the Baby Boom generation compares with the age groups before and after it is essential to anticipating future demand for long-term services and supports, as well as the potential availability of family care.
And then I learned about Linda.
At first, Linda doesn’t seem like the sort of carer I would want in my old age. She’s completely silent, never smiles, is rather cold and hasn’t got any arms.
But experts believe that Linda, a $40,000 robot who resembles a human-sized chess pawn, could be the perfect solution to one of the biggest hazards facing elderly residents in care homes: falling with no one on hand to help quickly.
As noted above, Baby Boomers will face a shortage of caregivers, so if an injured person is not in a care home, it may be a long time before anyone checks on them, perhaps too long. Even in care homes, nurses are typically so busy that when residents fall and injure themselves in their own rooms, it can be several hours until the accident is discovered – when they fail to appear for breakfast, for example.
Continuously sweeping the building in search of distressed residents would be far too demanding on a nurse’s time, and asking a relative caregiver, who presumably has a job of their own, to carry out the constant check-ups that would be needed is equally unrealistic. It is, however, exactly the kind of repetitive task to which robots are ideally suited.
Not only could robots like Linda patrol corridors 24 hours a day, providing far more continuous surveillance than any human, but they could save nurses valuable time by performing additional tasks such as carrying messages or escorting patients to appointments.
There’s just one snag – how does a robot tell the difference between an elderly and vulnerable patient who has collapsed, and a similarly shaped object – such as a large duffel bag – lying on the floor?
The problem of teaching machines to distinguish between an everyday situation and a possible emergency is now being tackled by a £7m ($11m) EU-funded project being conducted at six universities in Britain and abroad.
The project, known as STRANDS (Spatio-Temporal Representations and Activities for Cognitive Control in Long-term Scenarios) is focused on programming robots to learn about their environment and recognize when something is amiss.
The first major phase of the study took place this summer at the University of Lincoln, where researchers from Birmingham and Aachen universities gathered for a week of intensive programming.
Within five years, the scientists hope that a robot placed in an unfamiliar care home will be able to learn about its surroundings and recognize patterns of everyday activity, such as doors opening and closing or furniture being moved.
Operating without any input from humans for up to three months at a time, the robots should be able to tell the difference between a normal situation, such as someone leaving their room during the day, and an abnormal one, such as doing so in the middle of the night.
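The article does not describe how STRANDS actually models “normal” activity, but the general idea of learning time-of-day patterns can be illustrated with a minimal, hypothetical sketch: the robot counts how often it has seen each kind of event at each hour of the day and flags events that are rare for the current hour. All class names, event labels and thresholds below are illustrative assumptions, not the project’s code.

```python
from collections import defaultdict

class ActivityModel:
    """Toy spatio-temporal activity model (hypothetical, not STRANDS code):
    counts how often each event type is seen in each hour of the day and
    flags events occurring at hours where they are rarely observed."""

    def __init__(self):
        self.counts = defaultdict(int)        # (event, hour) -> count
        self.event_totals = defaultdict(int)  # event -> total count

    def observe(self, event, hour):
        """Record one observation, e.g. ('resident_leaves_room', 14)."""
        self.counts[(event, hour)] += 1
        self.event_totals[event] += 1

    def is_anomalous(self, event, hour, threshold=0.05, min_total=20):
        """An event is anomalous if it has been seen often enough overall
        but almost never at this hour of the day."""
        total = self.event_totals[event]
        if total < min_total:
            return False  # not enough experience yet to judge
        return self.counts[(event, hour)] / total < threshold

# After weeks of patrols, leaving a room at 14:00 is routine,
# but the same behaviour at 03:00 has never been seen and is flagged.
model = ActivityModel()
for _ in range(50):
    model.observe("resident_leaves_room", hour=14)
print(model.is_anomalous("resident_leaves_room", hour=14))  # False
print(model.is_anomalous("resident_leaves_room", hour=3))   # True
```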
A second arm of the project will see the robots deployed as security guards in office buildings, patrolling and picking up signs of unusual activity such as open windows or people moving around at night.
“When a human security guard is working, they learn about their environment – where things usually are, what people do – they get this common sense understanding of the environment over a long period of time,” Dr. Nick Hawes, coordinator of the project from the University of Birmingham, said. “The intellectual challenge is, could you enable a robot to operate the same way?”
The major difficulty will be teaching the machines which environmental changes they should consider normal, preventing them from interpreting a repositioned chair or a missing stapler as a security threat.
Professor Tom Duckett, Director of the Lincoln Centre for Autonomous Systems Research, explained: “What we are trying to do is enable robots to learn from their long-term experience.
“If you were to go into a busy restaurant at lunchtime and see a chair that was out of place, that would not concern you at all because it would be what you would expect. But furniture moved in an office late at night would be suspicious.”
Prof Duckett’s role in the project is to oversee the creation of “four dimensional” mapping software, which will allow the robots not just to navigate their way around, but also to recognize how the environment changes during the day and over longer periods.
The machines have a 360-degree laser at floor level that tells them where they are in relation to walls and doors, and a camera similar to a Microsoft Kinect on their head, allowing them to recognize objects they have seen before and spot when something is out of its usual place.
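Again purely as an illustration (the article does not describe the mapping software’s internals), a “four dimensional” map can be thought of as associating each recognized object with the places it has been seen at each hour of the day, so the robot can ask whether an object is somewhere it usually is at this time. Everything below, from the class names to the one-metre radius, is a hypothetical sketch.

```python
from collections import defaultdict
from dataclasses import dataclass
import math

@dataclass(frozen=True)
class Pose:
    x: float  # metres in the map frame
    y: float

class TimeIndexedObjectMap:
    """Hypothetical sketch of a '4D' map: for each recognized object it
    stores the poses at which the object has been seen, bucketed by hour
    of day, so the robot can ask whether an object is where it usually
    is at this time."""

    def __init__(self):
        # object name -> hour of day -> list of observed poses
        self.sightings = defaultdict(lambda: defaultdict(list))

    def record(self, obj, hour, pose):
        self.sightings[obj][hour].append(pose)

    def is_out_of_place(self, obj, hour, pose, radius=1.0):
        """True if the object has a history at this hour but has never
        been seen within `radius` metres of its current pose."""
        history = self.sightings[obj][hour]
        if not history:
            return False  # no experience for this hour yet
        return all(math.hypot(p.x - pose.x, p.y - pose.y) > radius
                   for p in history)

# A chair seen near the window every afternoon is unremarkable there,
# but finding it across the room at the same hour is flagged.
m = TimeIndexedObjectMap()
for _ in range(10):
    m.record("chair_12", hour=15, pose=Pose(4.2, 1.1))
print(m.is_out_of_place("chair_12", hour=15, pose=Pose(4.0, 1.3)))  # False
print(m.is_out_of_place("chair_12", hour=15, pose=Pose(9.5, 7.0)))  # True
```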
While coping with the variation of furniture in an empty office is hard enough, getting to grips with normal daily activities in a care home with hundreds of residents poses the tougher of the two challenges.
Some hospitals already use basic robots for jobs like delivering medicines, but developing machines that can react to different situations and perform a variety of tasks is a large step forward.
“In the care home, we want the robot to detect when people have fallen over, but also do things like passing messages and assisting around the place,” Dr. Hawes said. “Learning that, for example, residents all have a cup of tea at 3pm and being around to help at that time.
“If someone tries to leave with her shopping bag at 3am, the robot should be able to say, ‘this isn’t what should be going on’, and raise an alarm.”
But despite the technical difficulties, perhaps the most difficult part of the project will be encouraging the residents of the Haus der Barmherzigkeit care home in Austria, where the technology will be tested, to accept their new staff.
The robots have a head with two blinking “eyes” on top of their conical bodies, not for technical reasons but to make them more approachable and to help humans interact with them.
“If it’s just a box on wheels, it’s a lot harder for people to understand what’s happening and how to interact with the robot,” Dr. Hawes explained.
“The head gives them a focal point and the eyes will indicate where the robot is looking. If the camera points somewhere the eyes should look in the same direction, so a human should have an intuitive understanding of what it is doing.”
Although the project is purely for research purposes, the scientists behind it intend to form a spin-out company and market the software before others have the same idea.
“The security aspect I think is easier because they can get a lot more value out of limited functionality, so I would say after the end of the project you would maybe have another year or two of development to commercialize these things,” Dr. Hawes said.
When it comes to nursing homes, assisted living centers, or eventually in-home care, Hawes has a longer outlook. “Care homes I think would be another four to six years of work, so maybe in 10 years’ time you might start seeing these things.” In the 10 years’ time that he speaks of, however, the Baby Boomer population will be rapidly approaching, if not already in, immediate need of more care than their loved ones will be able to provide.