© Universität Bielefeld
uni.news
Published on
11 July 2018
Category
General
Following nature's example: animals are prototypes for robotics
The natural ease with which living beings orient themselves in their environment, move about, avoid obstacles, and find their way home again, all without their brains consuming much energy, is a model for scientists who want to equip robots with similarly complex abilities, for in this field there is still a wide gulf between brains and electronics. The exchange of expertise between behavioural neuroscientists and roboticists is the focus of a symposium at the FENS Forum 2018 in Berlin (7-11 July 2018).
Bumblebees have small brains, yet they cover considerable distances in their search for food. Depending on the species, their foraging radius can be up to three kilometres. The flight route is teeming with enemies and obstacles, and changing wind speeds and directions add to the hazards. The insects have to steer their flight through a changeable environment, navigate over long distances, learn where to find a good source of food, and find their way back to their nest.
"When bumblebees leave their nest for the first time, they take flights to learn their surroundings so that they can find their way back," says Dr. Olivier Bertrand from the Department of Neurobiology at Bielefeld University, Germany. "These flights have a loop-like pattern, whereby the pattern varies from animal to animal, as our studies show. We assume that the bumblebees store snapshots of their environment in their brain, the usefulness of which is checked on subsequent flights."
When flying through a complex, cluttered environment, bees constantly need to evaluate environmental features and make decisions that influence their flight course. Dr. Shridar Ravi from RMIT University in Melbourne, Australia, used bumblebees to gain insights into the mechanisms of gap identification when the bees are confronted with an obstacle in their flight path and have to assess the properties of a gap in it. The bees spend considerable time in the near vicinity of the gap, performing rapid lateral manoeuvres while looking at it, as if scanning the gap to collect important information. In doing so, the bees could detect the edges of the gap by exploiting the difference in relative motion between the gap edges and the background: a closer object appears to move faster across the visual field than objects farther away.
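A toy version of this edge-detection idea (illustrative only, not Dr. Ravi's analysis) is sketched below: across the visual field, the magnitude of image motion jumps wherever a nearby surface such as a gap edge borders the distant background, so searching for large local changes in measured image motion localises the edges.

```python
import numpy as np

def find_motion_edges(image_motion, jump_threshold):
    """Return indices where the measured image motion (e.g. in deg/s per
    viewing direction) changes abruptly; such jumps occur where a near
    surface, like a gap edge, borders the far background."""
    jumps = np.abs(np.diff(image_motion))
    return np.flatnonzero(jumps > jump_threshold)

# Hypothetical 1-D motion profile: slow background, fast-moving nearby gap edges
motion = np.array([2, 2, 3, 40, 42, 41, 3, 2, 2], dtype=float)
print(find_motion_edges(motion, jump_threshold=20.0))  # -> [2 5], the two gap borders
```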
As long as the capabilities of robots remain limited, coupling the abilities of animals with those of robots can be helpful. The team led by Prof. Dr. Noriyasu Ando from the Research Center for Advanced Science and Technology in Tokyo has taken this path: they have developed an insect-driven mobile robot. "A male silkmoth sits in a cockpit and his walking controls the robot, steering it towards a female moth as soon as he detects her sex pheromone and reacts to it," is how Professor Ando describes the principle.
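The article does not describe the interface between moth and machine; a common approach in insect-driven robots is to let the insect walk on an instrumented ball and to translate the measured walking into wheel commands. The sketch below is a hypothetical illustration of such a mapping for a differential-drive robot, not Professor Ando's actual implementation; all parameter names and gains are made up.

```python
def walking_to_wheel_speeds(forward_speed, turn_rate, wheel_base=0.2, gain=1.0):
    """Map the insect's measured walking (forward speed in m/s and turn rate
    in rad/s, e.g. read from an instrumented trackball) to the left and right
    wheel speeds of a differential-drive robot. Values are illustrative."""
    left = gain * (forward_speed - 0.5 * wheel_base * turn_rate)
    right = gain * (forward_speed + 0.5 * wheel_base * turn_rate)
    return left, right

# Example: the moth walks forward at 0.02 m/s while turning left at 0.5 rad/s
print(walking_to_wheel_speeds(0.02, 0.5))  # -> (-0.03, 0.07)
```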
"From a technical point of view, this hybrid robot’s performance matches our goal: the future insect mimetic robot will have the model of the insect brain." The hybrid robot also provides scientists with insights into the behaviour of insects. By changing the sensory input and/or the motor output of the robot, the team was able to uncover the sensory-motor control of the reactions of silkmoths to odours. "The hybrid robot enables us to compare an insect brain with an electronic model," said Professor Ando. "Now the robot is controlled directly by a real silkmoth. If the insect is replaced by a robot model of this insect, we can directly compare the performance of the insect brain with that of the model brain on this robot platform. It's still a conceptual idea, but we're working on it.”
Neuromorphic systems mimic the neuronal networks of the brain; their hardware is highly specialised and densely interconnected. A team led by Prof. Dr. Elisabetta Chicca of Bielefeld University has developed a neuromorphic model intended to enable autonomous mobile systems to navigate better and avoid obstacles in complex environments. The "laser eyes" (laser rangefinders) of autonomous cars detect obstacles, but despite many years of development they remain very expensive, consume a lot of energy and, as recent incidents have shown, misinterpret certain situations. The system developed by the Bielefeld scientists could bring progress in this area.
"We have developed a new electronic motion detector, the “Spiking Elementary Motion Detector”, which can detect the relative motion of objects”, says Professor Chicca. Every car or train driver knows what a "relative movement" is: the church tower in the distance glides slowly past, while the tree at the roadside rushes very quickly past. Insects use such information during navigation in the terrain to avoid collisions.
The new motion detector, sEMD for short, is an artificial nerve cell with an artificial synapse. It picks up incoming signals and emits a signal of its own when two pulses arrive within a certain time window, hence the term "spiking" in its name. Depending on the experiment, a chip can carry thousands of these detectors.
The detectors receive their input from novel neuromorphic cameras developed by a company in Switzerland. In contrast to normal cameras, each pixel of these sensors produces a signal independently, and only when something changes in its "field of vision". These signals are picked up by the motion detector's receptive fields. Each detector has two receptive fields, each receiving signals from nine pixels. If more than half of the pixels in a receptive field are activated, the receptive field produces a signal that the detector processes further. From the time interval between the signals of two adjacent receptive fields, the detector can calculate the relative speed at which an object moves in front of the camera. "Our experiments show that it is possible to generate information for the navigation of collision-avoiding robots," explains Professor Chicca. "Our results pave the way for the construction of compact, low-power systems for autonomous navigation. In addition, the sEMD is a universally applicable element for computing time differences and can therefore also be used for processing other sensory stimuli, for example for locating the source of a sound."
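Based on the description above, the processing chain can be sketched in a few lines of Python. This is a simplified software illustration of the principle (event pixels, pooling in receptive fields, time difference between neighbouring fields, speed estimate), not the actual analogue neuromorphic circuit; the pixel count and threshold follow the article, while the timings and field spacing are assumed values.

```python
def receptive_field_spike_time(pixel_event_times, threshold=5):
    """A receptive field pools nine event-camera pixels and fires as soon as
    more than half of them (here at least 5) have produced an event. Returns
    the time at which the fifth event arrives, or None if too few were active."""
    active = sorted(t for t in pixel_event_times if t is not None)
    return active[threshold - 1] if len(active) >= threshold else None

def semd_speed_estimate(t_field_a, t_field_b, field_spacing_deg):
    """Estimate relative angular speed (deg/s) from the delay between the
    signals of two adjacent receptive fields."""
    if t_field_a is None or t_field_b is None or t_field_b == t_field_a:
        return None
    return field_spacing_deg / (t_field_b - t_field_a)

# Hypothetical example: an edge crosses field A around 10 ms and field B around 30 ms
events_a = [0.010, 0.011, 0.010, 0.012, 0.011, None, None, 0.010, None]  # 6 of 9 pixels active
events_b = [0.030, 0.031, 0.030, None, 0.032, 0.031, None, None, 0.030]  # 6 of 9 pixels active
t_a = receptive_field_spike_time(events_a)
t_b = receptive_field_spike_time(events_b)
print(semd_speed_estimate(t_a, t_b, field_spacing_deg=2.0), "deg/s")  # -> ~100 deg/s
```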
Contact
Dr. Olivier Bertrand
Universität Bielefeld, Neurobiologie
https://ekvv.uni-bielefeld.de/pers_publ/publ/PersonDetail.jsp?personId=34427052
Prof. Dr. Elisabetta Chicca
Universität Bielefeld, AG Neuromorphic Behaving Systems
https://ekvv.uni-bielefeld.de/pers_publ/publ/PersonDetail.jsp?personId=26461080