Los Angeles:
The boy, a dark-haired 6-year-old, is playing with a new companion.
The two hit it off quickly -- unusual for the 6-year-old, who has autism -- and the boy is imitating his playmate's every move, now nodding his head, now raising his arms.
"Like Simon Says," says the autistic boy's mother, seated next to him on the floor.
Yet soon he begins to withdraw; in a video of the session, he covers his ears and slumps against the wall.
But the companion, a three-foot-tall robot being tested at the University of Southern California, maintains eye contact and performs another move, raising one arm up high.
Up goes the boy's arm -- and now he is smiling at the machine.
In a handful of laboratories around the world, computer scientists are developing robots like this one: highly programmed machines that can engage people and teach them simple skills, including household tasks, vocabulary or, as in the case of the boy, the elements of play: imitation and taking turns.
So far, the teaching has been very basic, delivered mostly in experimental settings, and the robots are still works in progress, a hackers' gallery of moving parts that, like mechanical savants, each do some things well at the expense of others.
Yet the most advanced models are fully autonomous, guided by artificial intelligence software like motion tracking and speech recognition, which can make them just engaging enough to rival humans at some teaching tasks.
Researchers say the pace of innovation is such that these machines should begin to learn as they teach, becoming the sort of infinitely patient, highly informed instructors that would be effective in subjects like foreign language or in repetitive therapies used to treat developmental problems like autism.
Several countries have been testing teaching machines in classrooms. South Korea, known for its enthusiasm for technology, is "hiring" hundreds of robots as teacher aides and classroom playmates and is experimenting with robots that would teach English.
Already, these advances have stirred dystopian visions, along with the sort of ethical debate usually confined to science fiction. "I worry that if kids grow up being taught by robots and viewing technology as the instructor," said Mitchel Resnick, head of the Lifelong Kindergarten group at the Media Laboratory at the Massachusetts Institute of Technology, "they will see it as the master."
Most computer scientists reply that they have neither the intention, nor the ability, to replace human teachers. The great hope for robots, said Patricia Kuhl, co-director of the Institute for Learning and Brain Sciences at the University of Washington, "is that with the right kind of technology at a critical period in a child's development, they could supplement learning in the classroom."
Lessons From RUBI
"Kenka," says a childlike voice. "Ken-ka."
Standing on a polka-dot carpet at a preschool on the campus of the University of California, San Diego, a robot named RUBI is teaching Finnish to a 3-year-old boy.
RUBI looks like a desktop computer come to life: its screen-torso, mounted on a pair of shoes, sprouts mechanical arms and a lunchbox-size head, fitted with video cameras, a microphone and voice capability. RUBI wears a bandanna around its neck and a fixed happy-face smile, below a pair of large, plastic eyes.
It picks up a white sneaker and says kenka, the Finnish word for shoe, before returning it to the floor. "Feel it; I'm a kenka."
In a video of this exchange, the boy picks up the sneaker, says "kenka, kenka" -- and holds up the shoe for the robot to see.
In person they are not remotely humanlike, most of today's social robots. Some speak well, others not at all. Some move on two legs, others on wheels. Many look like escapees from the Island of Misfit Toys.
They make for very curious company. The University of Southern California robot used with autistic children tracks a person throughout a room, approaching indirectly and pulling up just short of personal space, like a cautious child hoping to join a playground game.
The machine's only words are exclamations ("Uh huh" for those drawing near; "Awww" for those moving away). Still, it's hard to shake the sense that some living thing is close by. That sensation, however vague, is enough to facilitate a real exchange of information, researchers say.
In the San Diego classroom where RUBI has taught Finnish, researchers are finding that the robot enables preschool children to score significantly better on tests than they do with less interactive instruction, such as learning from tapes.
Preliminary results suggest that these students "do about as well as learning from a human teacher," said Javier Movellan, director of the Machine Perception Laboratory at the University of California, San Diego. "Social interaction is apparently a very important component of learning at this age."
Like any new kid in class, RUBI took some time to find a niche. Children swarmed the robot when it first joined the classroom: instant popularity. But by the end of the day, a couple of boys had yanked off its arms.
"The problem with autonomous machines is that people are so unpredictable, especially children," said Corinna E. Lathan, chief executive of AnthroTronix, a Maryland company that makes a remotely controlled robot, CosmoBot, to assist in therapy with developmentally delayed children. "It's impossible to anticipate everything that can happen."
The RUBI team hit upon a solution that was one part mechanical and two parts psychological. The engineers programmed RUBI to cry when its arms were pulled. Its young playmates quickly backed off at the sound.
If the sobbing continued, the children usually shifted gears and came forward -- to deliver a hug.
Re-armed and newly sensitive, RUBI was ready to test as a teacher. In a paper published last year, researchers from the University of California, San Diego, the Massachusetts Institute of Technology and the University of Joensuu in Finland found that the robot significantly improved the vocabulary of nine toddlers.
After testing the youngsters' knowledge of 20 words and introducing them to the robot, the researchers left RUBI to operate on its own. The robot showed images on its screen and instructed children to associate them with words.
After 12 weeks, the children's knowledge of the 10 words taught by RUBI increased significantly, while their knowledge of 10 control words did not. "The effect was relatively large, a reduction in errors of more than 25 percent," the authors concluded.
Researchers in social robotics -- a branch of computer science devoted to enhancing communication between humans and machines -- at Honda Labs in Mountain View, Calif., have found a similar result with their robot, a three-foot character called Asimo, which looks like a miniature astronaut. In one 20-minute session the machine taught grade-school students how to set a table -- improving their accuracy by about 25 percent, a recent study found.
At the University of Southern California, researchers have had their robot, Bandit, interact with children with autism. In a pilot study, four children with the diagnosis spent about 30 minutes with this robot when it was programmed to be socially engaging and another half-hour when it behaved randomly, more like a toy. The results are still preliminary, said David Feil-Seifer, who ran the study, but suggest that the children spoke more often and spent more time in direct interaction when the robot was responsive, compared with when it acted randomly.
Making the Connection
In a lab at the University of Washington, Morphy, a pint-size robot, catches the eye of an infant girl and turns to look at a toy.
No luck; the girl does not follow its gaze, as she would a human's.
In a video the researchers made of the experiment, the girl next sees the robot "waving" to an adult. Now she's interested; the sight of the machine interacting registers it as a social being in the young brain. She begins to track what the robot is looking at, to the right, the left, down. The machine has elicited what scientists call gaze-following, an essential first step of social exchange.
"Before they have language, infants pay attention to what I call informational hotspots," where their mother or father is looking, said Andrew N. Meltzoff, a psychologist who is co-director of university's Institute for Learning and Brain Sciences. This, he said, is how learning begins.
This basic finding, to be published later this year, is one of dozens from a field called affective computing that is helping scientists discover exactly which features of a robot make it most convincingly "real" as a social partner, a helper, a teacher.
"It turns out that making a robot more closely resemble a human doesn't get you better social interactions," said Terrence J. Sejnowski, a neuroscientist at University of California, San Diego. The more humanlike machines look, the more creepy they can seem.
The machine's behavior is what matters, Dr. Sejnowski said. And very subtle elements can make a big difference.
The timing of a robot's responses is one. The San Diego researchers found that if RUBI reacted to a child's expression or comment too fast, it threw off the interaction; the same happened if the response was too slow. But if the robot reacted within about a second and a half, child and machine were smoothly in sync.
Physical rhythm is crucial. In recent experiments at a day care center in Japan, researchers have shown that having a robot simply bob or shake at the same rhythm a child is rocking or moving can quickly engage even very fearful children with autism.
"The child begins to notice something in that synchronous behavior and open up," said Marek Michalowski of Carnegie Mellon University, who collaborated on the studies. Once that happens, he said, "you can piggyback social behaviors onto the interaction, like eye contact, joint attention, turn taking, things these kids have trouble with."
One way to begin this process is to have a child mimic the physical movements of a robot and vice versa. In a continuing study financed by the National Institutes of Health, scientists at the University of Connecticut are conducting therapy sessions for children with autism using a French robot called Nao, a two-foot humanoid that looks like an elegant Transformer toy. The robot, remotely controlled by a therapist, demonstrates martial arts kicks and chops and urges the child to follow suit; then it encourages the child to lead.
"I just love robots, and I know this is therapy, but I don't know -- I think it's just fun," said Sam, an 8-year-old from New Haven with Asperger's syndrome, who recently engaged in the therapy.
This simple mimicry seems to build a kind of trust, and increase sociability, said Anjana Bhat, an assistant professor in the department of education who is directing the experiment. "Social interactions are so dependent on whether someone is in sync with you," Dr. Bhat said. "You walk fast, they walk fast; you go slowly, they go slowly -- and soon you are interacting, and maybe you are learning."
Personality matters, too, on both sides. In their studies with Asimo, the Honda robot, researchers have found that when the robot teacher is "cooperative" ("I am going to put the water glass here; do you think you can help me by placing the water glass on the same place on your side?"), children 4 to 6 did much better than when Asimo lectured them, or allowed them to direct themselves ("place the cup and saucer anywhere you like"). The teaching approach made less difference with students ages 7 to 10.
"The fact is that children's reactions to a robot may vary widely, by age and by individual," said Sandra Okita, a Columbia University researcher and co-author of the study.
If robots are to be truly effective guides, in short, they will have to do what any good teacher does: learn from students when a lesson is taking hold and when it is falling flat.
Learning From Humans
"Do you have any questions, Simon?"
On a recent Monday afternoon, Crystal Chao, a graduate student in robotics at the Georgia Institute of Technology, was teaching a five-foot robot named Simon to put away toys. She had given some instructions -- the flower goes in the red bin, the block in the blue bin -- and Simon had correctly put away several of these objects. But now the robot was stumped, its doughboy head tipped forward, its fawn eyes blinking at a green toy water sprinkler.
Ms. Chao repeated her query, perhaps the most fundamental in all of education: Do you have any questions?
"Let me see," said Simon, in a childlike machine voice, reaching to pick up the sprinkler. "Can you tell me where this goes?"
"In the green bin," came the answer.
Simon nodded, dropping it in that bin.
"Makes sense," the robot said.
In addition to tracking motion and recognizing language, Simon accumulates knowledge through experience.
Just as humans can learn from machines, machines can learn from humans, said Andrea Thomaz, an assistant professor of interactive computing at Georgia Tech who directs the project. For instance, she said, scientists could equip a machine to understand the nonverbal cues that signal "I'm confused" or "I have a question" -- giving it some ability to monitor how its lesson is being received.
To ask, as Ms. Chao did: Do you have any questions?
This ability to monitor and learn from experience is the next great frontier for social robotics -- and it probably depends, in large part, on unraveling the secrets of how the human brain accumulates information during infancy.
In San Diego, researchers are trying to develop a human-looking robot with sensors that approximate the complexity of a year-old infant's abilities to feel, see and hear. Babies learn, seemingly effortlessly, by experimenting, by mimicking, by moving their limbs. Could a machine with sufficient artificial intelligence do the same? And what kind of learning systems would be sufficient?
The research group has bought a $70,000 robot, built by a Japanese company, that is controlled by a pneumatic pressure system that will act as its senses, in effect helping it map out the environment by "feeling" in addition to "seeing" with embedded cameras. And that is the easy part.
The much steeper challenge is to program the machine to explore, as infants do, and build on moment-to-moment experience. Ideally its knowledge will be cumulative, not only recalling the layout of a room or a house, but using that stored knowledge to make educated guesses about a new room.
The researchers are shooting for nothing less than capturing the foundation of human learning -- or, at least, its artificial intelligence equivalent. If robots can learn to learn, on their own and without instruction, they can in principle make the kind of teachers that are responsive to the needs of a class, even an individual child.
Parents and educators would certainly have questions about robots' effectiveness as teachers, as well as ethical concerns about potential harm they might do. But if social robots take off in the way other computing technologies have, parents may have more pointed ones: Does this robot really "get" my child? Is its teaching style right for my son's needs, my daughter's talents?
That is, the very questions they would ask about any teacher.