To help spur interest in science, technology, engineering, and mathematics, many schools have begun to integrate robotics into the curriculum—but are younger students and their teachers ready for a new wave of robotic teaching assistants?
Many researchers and robotics experts agree that robot teachers are no longer the stuff of science fiction: they’re part of a new workforce designed to lend a helping hand to classroom teachers, whose jobs aren’t in jeopardy any time soon, experts say.
Although the technology is still in its developmental stage, and Apple has yet to develop an iRobot, assistant teaching robots already are being piloted in preschool classrooms from Korea to San Diego, Calif. The benefits, say researchers, are that robots not only provide infinite patience but also handle simple tasks that take up teachers’ valuable classroom time.
When asked why it would be desirable to have a robotic assistant in the classroom, instead of another teacher with more skills, Javier Movellan, founder of the UC San Diego Machine Perception (MP) Lab, said robots could be a cost-effective way for schools to get early education teachers the help they deserve.
“Why not both?” asked Movellan. “Think about the operating room of the 19th century and the operating room of the 21st century. The difference is doctors with more skills and much better technology. The technology shaped the skills, and the skills shaped the technology.”
Movellan, whose team at MP Lab developed RUBI, a robot tutor for toddlers, said he only wants early childhood educators of the 21st century to have access to cutting-edge technology tools like other professionals have in their fields.
“I want for them to shape our educational technology so they can have the best tools at their disposal. This is not about reducing the number of teachers. This is about giving the teachers an opportunity to do what they can do best, with the best possible technology,” he said.
RUBI, which stands for Robot Using Bayesian Inference, measures 22 by 24 by 8 inches and is a low-cost sociable robot designed to interact safely with 18- to 25-month-old toddlers.
The idea began when Movellan took a four-month visit to Japanese roboticist Dr. Hiroshi Ishiguro’s lab in Kyoto, Japan. As part of the visit, Movellan took a robot to a nearby preschool to conduct a series of experiments on how children responded to robots.
Movellan said it was clear that the robot got the children’s attention, and after returning to the U.S., he started working on prototypes in his garage with the help of his four-year-old son. Movellan’s daughter insisted the finished robot was a girl, and so RUBI was born.
After four years, RUBI is now in her fourth and latest version, thanks to funding from a collaboration between the University of California and Sony through a program called UC Discovery, as well as bridge funding from the National Science Foundation’s Science of Learning Centers.
RUBI is part of MP Lab’s Intelligent Tutoring Systems (ITS), or computer systems that can simulate a human teacher by providing direct, customized instruction or feedback to students without the intervention of human beings. These systems assess each learner’s actions and develop a customized “learner model” of each child’s knowledge, skills, and expertise.
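To make the idea of a “learner model” concrete, here is a minimal, hypothetical sketch in the spirit of RUBI’s Bayesian-inference approach (this is not MP Lab’s actual code, and the guess/slip rates are invented for illustration): the system tracks a probability that the child knows each word and updates it after every observed answer.

```python
# Hypothetical learner-model sketch: Bayesian update of P(child knows a word).
# Guess/slip rates are assumed values, not figures from MP Lab.

P_GUESS = 0.25  # chance of answering correctly without knowing the word
P_SLIP = 0.10   # chance of answering wrongly despite knowing the word

def update_knowledge(p_known: float, answered_correctly: bool) -> float:
    """Bayes' rule: revise P(child knows the word) given one observed answer."""
    if answered_correctly:
        likelihood_known = 1.0 - P_SLIP
        likelihood_unknown = P_GUESS
    else:
        likelihood_known = P_SLIP
        likelihood_unknown = 1.0 - P_GUESS
    numerator = likelihood_known * p_known
    return numerator / (numerator + likelihood_unknown * (1.0 - p_known))

# The learner model is simply one such probability per vocabulary item.
learner_model = {"red": 0.5, "three": 0.5}
learner_model["red"] = update_knowledge(learner_model["red"], True)
print(round(learner_model["red"], 3))  # belief rises after a correct answer
```

Customized instruction then follows naturally: the tutor can drill the items whose probabilities remain low and skip the ones the child has evidently mastered.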
RUBI, like most other education-based robots currently in production, can assist human teachers with mundane tasks such as drilling vocabulary memorization or other rote exercises.
The bandana-wearing RUBI interacts with preschoolers at UCSD’s Early Childhood Education Center and teaches them basic concepts such as colors, numbers, and some vocabulary words.
“I think of robot teachers as ‘exercise machines’ and of teachers as ‘coaches,’” said Movellan. “A good coach provides you with the motivation and the vision for why you are exercising. A good exercise machine takes care of the low-level details so you achieve your fitness goals.”
RUBI also has smile-detection technology, made possible with a Computer Expression Recognition Toolbox (CERT), which allows the robot to giggle and encourage a child to continue with the lesson when it sees the child smiling. This smile-recognition technology led to a smile-learning algorithm, now used in Sony’s Shutter Smile Technology, found in a line of the company’s digital camera products.
Movellan says this research is part of MP Lab’s “early inreach” philosophy—inreach as opposed to outreach, meaning that instead of waiting for the basic science to be developed in laboratory environments, MP Lab embeds scientists, engineers, and robots in classrooms very early on in the development process.
“We strive for the ecology of the classrooms to influence the scientific questions we ask. This is in contrast with the more traditional ‘outreach’ approach that emphasizes developing basic science in controlled laboratory conditions and then outreach to the classroom environment and telling teachers what they should do to improve education,” he said.
So far, Movellan said, the most challenging aspect of RUBI’s effectiveness in the classroom is her fatality rate.
“The children would shake RUBI’s head, poke her eyes, pull her arms, and bite her hands,” he said. “Some of our versions of RUBI died after two hours of interacting with the children. In the end, a critical part of RUBI’s survival was to provide her with an ‘emotion engine.’”
According to Movellan, RUBI now can detect when her well-being is at stake and respond appropriately. For example, when she feels threatened, she cries, and this gives the children a clear message as to what is appropriate and what is not. When she is surrounded by children who are playing with her, she is happy and giggles.
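A rough, hypothetical sketch of how such an “emotion engine” might work (this is illustrative only, not MP Lab’s implementation, and the sensor names and threshold are invented): crude sensor readings are mapped to an emotional response that signals to the children what treatment is appropriate.

```python
# Illustrative "emotion engine" sketch (not MP Lab's actual code):
# map rough sensor readings to a response the children can understand.

def emotional_response(tug_force: float, num_children_playing: int) -> str:
    """Return the robot's reaction to its current situation."""
    FORCE_THRESHOLD = 5.0  # assumed limit, in newtons, before "distress"
    if tug_force > FORCE_THRESHOLD:
        return "cry"      # well-being at stake: discourage rough handling
    if num_children_playing > 0:
        return "giggle"   # friendly play: encourage the interaction
    return "idle"

print(emotional_response(8.0, 2))  # rough handling
print(emotional_response(1.0, 3))  # gentle group play
```

The point of the design is social, not mechanical: crying gives toddlers immediate, legible feedback about which behaviors the robot can tolerate.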
RUBI is also modeled after typical child behavior, with an interactive program that allows her to take objects offered to her from children, say “thank you,” and then give the objects back.
By the end of RUBI’s first phase, two randomized, pre-test/treatment/post-test trials found that RUBI had a significant effect on vocabulary learning, both in English and in a foreign language (Finnish).
“Most importantly, we learned about the factors that were more predictive of learning,” Movellan said. “Turns out that the total amount of time a child spent with RUBI was not a very good predictor of learning.”
As with other classroom technology, Movellan explained, what matters most is not total exposure but the duration of each interaction. Children who interacted with RUBI often during the day, but only in short bursts, did not learn much; children who spent less total time with the robot, but whose episodes were more sustained (about four minutes in all), learned a lot.
MP Lab is now programming RUBI to get four minutes of sustained interaction with each child per day. The researchers also learned that it’s useful to involve more than one child in the interactions. For example, children who typically interact with the robot in short bursts have longer, more sustained interactions when they play with RUBI together.
Beginning in September, MP Lab will receive funding from the National Science Foundation to experiment with the development of a social network of robots (RubiNet) for early childhood education. Movellan hopes RUBI-style educational robots will act as interfaces between toddlers across the nation and across the world, the same way computers act as interfaces between humans and the internet.
Robots all over the world
In Seoul, South Korea, robots are establishing a presence in education and soon will move beyond basic testing to full classroom implementation.
Engkey (a contraction of “English Jockey,” as in disc jockey), a robot created by the Center for Intelligent Robotics (CIR) at the Korea Institute of Science and Technology (KIST), is a short, penguin-like robot designed to help South Korean students learn English.
CIR’s focus on the development of robot technologies for daily human assistance is what inspired the team to create Engkey. CIR’s technologies are managed by a government-funded program, 21C Frontier, which is hosted by the Korea Ministry of Knowledge and Economy (MKE). The program is a 10-year initiative that began in 2003, with a total budget of about $130 million.
MKE and the city of Masan in Korea provided an additional $500,000 specifically for the development of Engkey.
According to KIST’s Intelligent Robotics team leader, Dr. Mun-taek Choi, the development of Engkey started in early 2009, and the first version took slightly less than a year to create by applying some of the original technologies from CIR. Choi and his team expect the second version of Engkey, with better software, to be ready by the end of this year.
In a pilot project to test the applicability of the robot as a teaching tool, the first version of Engkey was installed in classrooms for two months, from late Dec. 2009 to early Feb. 2010. Engkey helped teach students in two Masan schools.
Engkey has the option of using either a synthesized female or male voice and can follow students around the room, asking them basic programmed questions in English, such as “How can I help you today?”
The robot also is programmed to give a series of set responses, such as “Wow, very good!” or “Not good this time. You need to focus more on your accent.”
However, Engkey can recognize only a fixed list of responses; if a student’s reply deviates from that list, the robot cannot process it.
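Engkey’s limitation can be illustrated with a hypothetical sketch (the phrases and feedback strings below are invented for the example, not taken from KIST’s software): the recognized utterance is matched against a pre-programmed list, and anything off-script simply cannot be handled.

```python
# Illustrative sketch of fixed-response matching (not KIST's actual code).
# The robot only knows how to react to utterances in its programmed list.

EXPECTED_RESPONSES = {
    "i would like a book": "Wow, very good!",
    "can you help me": "Wow, very good!",
}

def react(recognized_utterance: str) -> str:
    key = recognized_utterance.strip().lower()
    if key in EXPECTED_RESPONSES:
        return EXPECTED_RESPONSES[key]
    # Anything outside the programmed list cannot be processed.
    return "(no response: utterance not in programmed list)"

print(react("I would like a book"))        # on-script: robot responds
print(react("My dog ate my homework"))     # off-script: robot is stuck
```

This is why Choi points to recognition technology as the bottleneck: the matching step above is trivial, but reliably turning a child’s speech into one of those keys is not.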
“To make a robot have ‘good’ interaction skills, first we need to develop good recognition technologies to understand humans, environments, and situations,” said Choi. “Those are very limited with current sensor technologies.”
After more tests in schools this year, Choi hopes to commercialize Engkey and to reduce the price from the current $24,000 to $8,000.
South Korea, a leader in robotics, soon will deploy hundreds of robotic teaching aides as part of a plan to have the country’s 8,400 kindergartens work with robots by 2013, thanks to the efforts of the Education Ministry.
“In Korea, English education is very important to students and their parents, and in many cases it takes substantial costs to have native speakers teaching,” explained Choi. “Although robots cannot supplant human teachers, we believe that we can at least provide a cost-effective way of teaching; and robots are quite effective in teaching as long as they are carefully designed with predefined teaching materials using current technologies.”
Whether Engkey will be among these teaching aides remains uncertain, owing to the robot’s inability to handle off-script responses and to what some critics have called a lack of basic human abilities, such as recognizing emotions or responding organically.
“In reality, many unexpected incidents happen all the time [in the classroom], and humans have a good capability of adapting to these incidents and even learning from them. Although roboticists are doing hard work, [robots] might not have that capability in the short term,” Choi said.
Because robots can’t yet recognize individual students and respond promptly and properly, he said, they cannot be used to replace teachers.
However, one robot currently being developed in the United States might be the most promising development in robotic teaching assistants.
Simon, the creation of Dr. Andrea Thomaz, assistant professor in the School of Interactive Computing at the Georgia Institute of Technology’s Socially Intelligent Machines (SIM) Lab, can learn simply by socializing.
According to the SIM Lab’s website, Simon was partly developed from the realization that if robots are to be effective in the classroom, developers will not be able to preprogram the robots with every skill needed; the robots will need to interact and learn new things “on the job” from ordinary people.
Simon, an upper-torso robot with a “socially expressive head” and the body proportions of a 5-foot, 7-inch woman, learns from social attention and interactive task learning. For example, say Simon is handed an object. If Simon recognizes the object, it will drop that object into the appropriate color-coded bucket. If Simon has not learned where to put the object, it can be told, and it will remember that object and its designation in the future.
Simon is also a proactive learner. If Simon is asked whether it has any questions, it will scan the environment to identify any objects it might not know.
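The interactive task learning described above can be sketched in a few lines. This is a hypothetical illustration of the idea, not the SIM Lab’s code: the robot keeps a mapping from objects to color-coded buckets, asks when it does not recognize an object, and remembers the teacher’s answer from then on.

```python
# Hypothetical sketch of Simon-style interactive task learning
# (illustrative only, not the SIM Lab's implementation).
from typing import Optional

class TaskLearner:
    def __init__(self) -> None:
        self.bucket_for = {}  # object name -> bucket color

    def handle_object(self, obj: str, teacher_answer: Optional[str] = None) -> str:
        if obj in self.bucket_for:
            # Already learned: sort the object without help.
            return f"dropping {obj} into {self.bucket_for[obj]} bucket"
        if teacher_answer is None:
            # Unknown and no teacher input: ask proactively.
            return f"where does {obj} go?"
        # Learn "on the job" from the teacher's answer.
        self.bucket_for[obj] = teacher_answer
        return f"dropping {obj} into {teacher_answer} bucket"

simon = TaskLearner()
print(simon.handle_object("ball"))         # unknown object: Simon asks
print(simon.handle_object("ball", "red"))  # taught once by a person
print(simon.handle_object("ball"))         # remembered thereafter
```

The key design point is that nothing here is preprogrammed per object: the mapping is built entirely from interactions with ordinary people.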
[Video: Simon performing simple tasks]
Simon is also programmed to recognize the most important aspects of its environment through visual and auditory stimuli, and to assign value to those stimuli.
New features for Simon include voice recognition, facial recognition, sound localization, and an overall increase in processing speed.
“I want to see robots successfully helping people in human environments, and in particular, I want those robots to be easy for people to adapt and use in whatever way they see fit,” said Thomaz in an interview with SmartPlanet. “You shouldn’t have to learn how to program your robot. It should be intuitive to teach it what you want it to do for you.”
Thomaz said her lab is currently working on Simon’s interactive learning skills, focusing on nonverbal gestures for natural and intuitive turn-taking, which she hopes will improve the learning interaction.
Movellan said his lab is also working on the theory of machine learning, or algorithms that allow machines to learn from examples.
“Up to recently, robotics focused on applications to very structured conditions, like industrial fabrication plants. In such conditions, you can solve most of the theoretical problems—inverse kinematics, inverse dynamics, trajectory generation, trajectory control—analytically,” he said.
“That’s not the case when you need to operate in unstructured environments, like a classroom. In order to solve robot control problems analytically, I would have to have a perfect mathematical theory of how children behave. That is never going to happen. Instead, we need robots that learn to adapt to the environment in which they operate. This includes learning to interact with children, learning to teach, learning to move around a cluttered room, and much more.”
Although robotics might be taking off in many service sectors, including classrooms, developers and researchers say it’s important to understand that robots might never have the full capabilities of a human.
“Putting your heart into teaching, wanting to help your students and make them feel good about learning—that is not easily replicable by any kind of hardware,” said Jacob Whitehill, an MP Lab researcher, during an interview with UCSD News. “… I don’t think humans have to fear for their jobs just yet.”