Wednesday, March 5, 2008

First Humanoid Robot That Will Develop Language May Be Coming Soon


iCub, a one metre-high baby robot which will be used to study how a robot could quickly pick up language skills, will be available next year.

Professor Chrystopher Nehaniv and Professor Kerstin Dautenhahn at the University of Hertfordshire’s School of Computer Science are working with an international consortium led by the University of Plymouth on ITALK (Integration and Transfer of Action and Language Knowledge in Robots), which begins on 1 March.

ITALK aims to teach the robot to speak by employing the same methods used by parents to teach their children. Professor Nehaniv and Professor Dautenhahn, who are European leaders in Artificial Intelligence and Human Robot Interaction, will conduct experiments in human and robot language interaction to enable the robot to converse with humans.

Typical experiments with the iCub robot will include activities such as inserting objects of various shapes into the corresponding holes in a box, seriating nested cups and stacking wooden blocks. Next, the iCub will be asked to name objects and actions so that it acquires basic phrases such as "robot puts stick on cube".
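To make the idea of pairing actions with phrases concrete, here is a minimal, hypothetical Python sketch of that kind of action-to-phrase mapping; the action format, vocabulary and template are invented for illustration and are not the ITALK system's actual representation.

    # Hypothetical sketch: turning a structured action record into a basic phrase,
    # the kind of action/language pairing the ITALK experiments describe.
    # The record format and template below are invented for illustration only.
    def describe_action(agent, verb, obj, target=None):
        """Produce a simple phrase describing an observed or performed action."""
        phrase = f"{agent} {verb}s {obj}"
        if target is not None:
            phrase += f" on {target}"
        return phrase

    # Example: the robot performs a stacking action and pairs it with a phrase.
    print(describe_action("robot", "put", "stick", "cube"))  # robot puts stick on cube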

Professor Nehaniv said: “Our approach is that the robot will use what it learns individually and socially from others to bootstrap the acquisition of language, and will use its language abilities in turn to drive its learning of social and manipulative abilities. This creates a positive feedback cycle between using language and developing other cognitive abilities. Like a child learning by imitation of its parents and interacting with the environment around it, the robot will master basic principles of structured grammar, like negation, by using these abilities in context.”

The scientific and technological research developed during the project is expected to have a significant impact on the next generation of interactive robotic systems over the coming ten years, and to strengthen Europe's leadership role in this area.

Speaking about the research, Professor Dautenhahn said: “iCub will take us a stage forward in developing robots as social companions. We have studied issues such as how robots should look and how close people will want them to approach, and now, within a year, we will have the first humanoid robot capable of developing language skills.”

Boys' And Girls' Brains Are Different: Gender Differences In Language Appear Biological


Although researchers have long agreed that girls have superior language abilities to boys, until now no one has clearly provided a biological basis that may account for the difference.

For the first time -- and in unambiguous findings -- researchers from Northwestern University and the University of Haifa show both that areas of the brain associated with language work harder in girls than in boys during language tasks, and that boys and girls rely on different parts of the brain when performing these tasks.

"Our findings -- which suggest that language processing is more sensory in boys and more abstract in girls -- could have major implications for teaching children and even provide support for advocates of single sex classrooms," said Douglas D. Burman, research associate in Northwestern's Roxelyn and Richard Pepper Department of Communication Sciences and Disorders.

Using functional magnetic resonance imaging (fMRI), the researchers measured brain activity in 31 boys and in 31 girls aged 9 to 15 as they performed spelling and writing language tasks.

The tasks were delivered in two sensory modalities -- visual and auditory. In the visual condition, the children read certain words without hearing them; in the auditory condition, they heard words read aloud but did not see them.

Using a complex statistical model, the researchers accounted for differences associated with age, gender, type of linguistic judgment, performance accuracy and the method -- written or spoken -- in which words were presented.
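As an illustration of what such adjustment typically involves (not the researchers' actual model), a linear model with these covariates might look like the following Python sketch; the column names and data file are hypothetical.

    # Illustrative sketch only: a generic linear model that adjusts for covariates
    # such as age, sex, task modality, judgment type and accuracy when examining
    # activation. Column names and the data file are hypothetical, and this is not
    # the statistical model the researchers actually used.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("fmri_language_study.csv")  # one row per child per task (hypothetical)

    model = smf.ols(
        "activation ~ C(sex) + age + C(modality) + C(judgment_type) + accuracy",
        data=df,
    ).fit()
    print(model.summary())  # the C(sex) coefficient estimates the girl/boy difference after adjustment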

The researchers found that girls still showed significantly greater activation in language areas of the brain than boys. The information in the tasks got through to girls' language areas of the brain -- areas associated with abstract thinking through language. And their performance accuracy correlated with the degree of activation in some of these language areas.

To their astonishment, however, this was not at all the case for boys. In boys, accurate performance when reading words depended on how hard visual areas of the brain worked; when hearing words, their performance depended on how hard auditory areas of the brain worked.
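The accuracy-activation relationships described here are, in essence, correlations computed within groups. A hedged sketch of that kind of analysis, again with hypothetical column names, might look like this:

    # Illustrative sketch: correlate each child's task accuracy with activation in a
    # region of interest, separately by sex and sensory modality. Column names and
    # the data file are hypothetical.
    import pandas as pd
    from scipy.stats import pearsonr

    df = pd.read_csv("fmri_language_study.csv")  # hypothetical data set

    for sex in ("girl", "boy"):
        for modality in ("visual", "auditory"):
            sub = df[(df["sex"] == sex) & (df["modality"] == modality)]
            r, p = pearsonr(sub["accuracy"], sub["activation"])
            print(f"{sex}, {modality}: r={r:.2f}, p={p:.3f}")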

If that pattern extends to language processing that occurs in the classroom, it could inform teaching and testing methods.

Given boys' sensory approach, they might be more effectively evaluated with oral tests on knowledge gained from lectures and with written tests on knowledge gained from reading. For girls, whose language processing appears more abstract in approach, these matched testing methods would appear unnecessary.

"One possibility is that boys have some kind of bottleneck in their sensory processes that can hold up visual or auditory information and keep it from being fed into the language areas of the brain," Burman said. This could result simply from girls developing faster than boys, in which case the differences between the sexes might disappear by adulthood.

Alternatively, boys may create visual and auditory associations such that the meaning of a word is brought to mind simply from seeing or hearing it.

While the second explanation puts males at a disadvantage in more abstract language function, those kinds of sensory associations may have provided an evolutionary advantage for primitive men whose survival required them to quickly recognize danger-associated sights and sounds.

If the pattern of females relying on an abstract language network and of males relying on sensory areas of the brain extends into adulthood -- a still unresolved question -- it could explain why women often provide more context and abstract representation than men.

Ask a woman for directions and you may hear something like: "Turn left on Main Street, go one block past the drug store, and then turn right, where there's a flower shop on one corner and a cafe across the street."

Such information-laden directions may be helpful for women because all information is relevant to the abstract concept of where to turn; however, men may require only one cue and be distracted by additional information.