Research
Everything is connected. The world resembles a dense network of interconnected nodes, whether it is molecules that interact, humans who communicate, or weather systems that influence one another.
The Landgraf lab studies complex social systems - collectives made of individual living units that each observe their environment, learn what is good and bad, and interact with conspecifics. These interactions (and how they change over time) often create emergent properties at the group level: from the coordinated motion patterns of flocks and shoals to democratic decision-making processes in humans. The whole is greater than the sum of its parts - but exactly how?
Models of Individual Behavior
To understand how collectives can be smart and how group patterns emerge, we often simplify the system, creating models that represent how we think each individual works: Does it have a memory of past events? How is an observation of the environment translated into behavior? Rather than using the human brain to come up with a model, a part of the Landgraf lab is concerned with (machine) learning models of individual behavior. How can we leverage large amounts of data to extract a model that is not only as accurate but also as generic as possible? How can we control the level of abstraction, and how can we make sure we humans learn something about the actual living system and not just about our model?
An important part of our work is tracking individuals in groups and using this data to model their behavior. On the left you can see a simulation of two neural networks interacting in a virtual tank. The networks were trained on tracking data to imitate the observed fish trajectories as closely as possible.
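As an illustration of what such an imitation model might look like, here is a minimal sketch in PyTorch. It is not the lab's actual architecture: the input format (a short pose history of the focal fish and one neighbor), the layer sizes, and the output (turn angle and speed for the next time step) are all assumptions made for the example.

```python
# Minimal sketch (not the lab's actual model): a small network that predicts a
# fish's next step from a short history of its own pose and a neighbor's pose.
import torch
import torch.nn as nn

class FishPolicy(nn.Module):
    def __init__(self, history_len=8, pose_dim=3, hidden=64):
        super().__init__()
        # Input: flattened history of (x, y, orientation) for the focal fish and one neighbor.
        in_dim = history_len * pose_dim * 2
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),  # output: (turn angle, speed) for the next time step
        )

    def forward(self, history):
        return self.net(history.flatten(start_dim=1))

# Supervised imitation: fit the policy to recorded trajectories.
model = FishPolicy()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# In practice these tensors would come from the tracking pipeline;
# random data stands in for them here.
histories = torch.randn(256, 16, 3)   # 8 time steps x 2 fish, each with (x, y, orientation)
next_actions = torch.randn(256, 2)    # observed (turn angle, speed) at the next time step

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(histories), next_actions)
    loss.backward()
    optimizer.step()
```

After training, two copies of such a policy can be rolled out together in a simulated tank, each receiving the other's simulated trajectory as its neighbor input - similar in spirit to the simulation shown here.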
Models of Group Behavior
Similarly, we can use machine learning to come up with a model that reflects the general properties of the group dynamics (rather than using the individual models to simulate a group). Looking at many hundreds or thousands of honeybees, for example, we have been able to study life-long social networks and learn about the different interaction patterns bees produce when performing one of the many roles in a colony.
Now, we can invert this idea: knowing whom you meet tells us something about your task in the collective!
With our bee tracking system "BeesBook", we obtained lifetime tracking data for several thousand bees. From the original images and those tracks, we can detect a variety of behaviors: waggle dances and dance-following movements, food exchanges and many more. These behaviors are then used to construct social interaction networks. For a given day, these networks represent how long, or how many times, a pair of individuals engaged in a specific interaction. This "descriptor" of a bee's day contains several thousand numbers. In a recent paper, we showed how to compress these into a single number that we call "network age" (Wild et al. 2021). Network age turns out to be a good predictor of the role a bee will take in the future: bees with a high network age are more likely to be foraging, while bees with a low network age are more likely to be actively engaged in nursing.
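To make the descriptor-to-scalar idea concrete, here is an illustrative sketch. The published method is more involved (see Wild et al. 2021); the code below simply builds a per-bee daily descriptor from hypothetical interaction matrices and compresses it to one dimension, using plain PCA as a stand-in.

```python
# Illustrative sketch only: compress per-bee daily interaction descriptors into
# a single scalar. The actual "network age" (Wild et al. 2021) is computed
# differently; this just demonstrates the descriptor -> single-number idea.
import numpy as np
from sklearn.decomposition import PCA

n_bees, n_interaction_types = 1000, 4
rng = np.random.default_rng(0)

# Hypothetical input: one weighted adjacency matrix per interaction type for one
# day, e.g. total contact duration between each pair of bees.
networks = rng.random((n_interaction_types, n_bees, n_bees))

# Per-bee descriptor: concatenate each bee's row across all interaction types
# (whom it interacted with, and how much) -- several thousand numbers per bee.
descriptors = networks.transpose(1, 0, 2).reshape(n_bees, -1)

# Compress the descriptor to one number per bee and day.
network_age = PCA(n_components=1).fit_transform(descriptors).ravel()
print(network_age.shape)  # (1000,)
```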
Robots that interact with the living system
In the scientific literature, there are countless mathematical models that describe the behavior of individual members of a group. How can we be sure whether a given model is realistic, or under which circumstances it may not be accurate? A large portion of our work is creating physical animal models that look and act like their living counterparts. These biomimetic robots can observe the real environment and act upon it. We can then use these robots to test our models in the real world. Our "RoboFish" is an open-source platform that can be used to study collective behavior in fish. We have used it, for example, to find that fish anticipate the movement of their shoal mates (Bierbach et al. 2022) and that fish prefer socially competent (robot) leaders (Maxeiner et al. 2023). With "RoboBee" we have studied the honeybee dance to better understand which cues bees use to follow and decode a dance (Landgraf et al. 2018).
Robots can be used for a much wider variety of questions and applications. We have used quadrocopters to record the neural activity of bees that were flown over real-world scenes, and we have applied a mechanism observed in honeybees to electric cars, allowing them to exchange energy with one another.
Explainability and Interpretability
Modern machine learning models (i.e. deep neural networks, DNNs) are powerful. They can learn virtually any mapping between input-output pairs. Whether we use DNNs to learn individual or collective behavior from data, or to predict cardiovascular risk, we would like some explanation as to why the model does what it does. Did the model rely on spurious correlations, and which features did it actually use?
To understand which input regions were relevant for the decision of an image classification network, we recently proposed restricting the amount of information that is allowed to flow through the network with so-called information bottlenecks. We inject as much random noise into the activations of early network layers as possible without impairing the classification result. Regions that tolerate a lot of noise are irrelevant to the decision!
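In condensed form, the idea can be sketched as follows. This is a simplified stand-in, not the published implementation: the mask penalty below is only a crude proxy for the information term, and the split of the classifier into features_fn and head_fn is an assumption of the example.

```python
# Simplified sketch of a per-sample information bottleneck for attribution:
# blend an intermediate activation with Gaussian noise via a learnable mask and
# keep only as much signal as the classification needs.
import torch
import torch.nn.functional as F

def bottleneck_attribution(features_fn, head_fn, x, target, beta=10.0, steps=300):
    """features_fn: input -> early-layer activation; head_fn: activation -> logits."""
    with torch.no_grad():
        a = features_fn(x)                    # activation at an early layer
        mu, std = a.mean(), a.std()           # statistics for the noise model
    alpha = torch.zeros_like(a, requires_grad=True)  # pre-sigmoid mask
    opt = torch.optim.Adam([alpha], lr=0.1)

    for _ in range(steps):
        lam = torch.sigmoid(alpha)            # mask in (0, 1): 1 = keep signal, 0 = pure noise
        eps = mu + std * torch.randn_like(a)  # noise matched to the activation statistics
        z = lam * a + (1 - lam) * eps         # noisy activation fed to the rest of the network
        logits = head_fn(z)
        ce = F.cross_entropy(logits, target)  # keep the prediction intact ...
        info = lam.mean()                     # ... while injecting as much noise as possible
        loss = ce + beta * info               # (crude proxy for the information term)
        opt.zero_grad()
        loss.backward()
        opt.step()

    return torch.sigmoid(alpha).detach()      # high values mark relevant regions
```

Splitting an existing classifier into features_fn (everything up to an early layer) and head_fn (the rest), passing a single image and its class label, and upsampling the returned mask to the input resolution then yields a saliency map.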
Funding and Sponsors
We would like to thank the following funding agencies and sponsors for their generous support of our research:
| Time Frame | Acronym | Title of Project | Funding Body |
|---|---|---|---|
| 2023 - 2026 | Sublethal FX | Using Novel Tools to Resolve an Old Issue in Bee Ecotoxicology - Application of Digital Methods to Understand Sublethal Effects of Pesticides to Bees and Their Significance in Regulatory Risk Assessment | Bayer Crop Science |
| 2023 - 2026 | Open.Make II | Implementing Open and FAIR Hardware | Berlin University Alliance |
| 2022 - 2024 | IMPACT | Implementation of AI-Based Feedback and Assessment with Trusted Learning Analytics in Higher Education | BMBF |
| 2021 - 2024 | PetraKIP | Personal, Transparent, AI-Based Portfolio for Teacher Education | BMBF |
| 2021 - 2023 | Open.Make I | Towards Open and FAIR Hardware | BMBF |
| 2020 - 2023 | ElektroFish | Robots that Communicate with Fish: Investigating the Role of Electric Signals and Movement Patterns Following Episodes of Mutual Attention in Weakly Electric Fish | DFG |
| 2019 - 2021 | BrainRL | Learning the Language of the Brain: Adaptive Brain-Machine Interfaces Maximize Information Transfer Through Autonomous Interaction with Brain Tissue | Volkswagen Foundation |
| 2019 - 2024 | Hiveopolis | Futuristic Beehives for a Smart Metropolis | EU H2020 FET |
| 2017 - 2021 | RoboFish | Mixed Shoals of Live Fish and Interactive Robots for the Analysis of Collective Behavior in Fish | DFG |
| 2016 - 2021 | NeuroCopter | Robotics in Neurobiology: Finding Targets with a Tiny Brain. The Neural Basis of Honeybee Navigation | Dr.-Klaus-Tschira-Stiftung |