Agents can generalize to novel levels of abstraction by using adaptive linguistic strategies
Submitted to ACL Rolling Review, 2024
Recommended citation: Kobrock, K., Ohmer, X., Bruni, E., & Gotzner, N. (2024). “Agents can generalize to novel levels of abstraction by using adaptive linguistic strategies.” submitted to ARR. https://doi.org/10.31234/osf.io/km28h_v2
We study abstraction in an emergent communication paradigm, in which two artificial neural network agents develop a language while solving a communicative task. In this study, the agents play a concept-level reference game: a speaker agent has to describe a concept to a listener agent, who has to pick the target objects that satisfy the concept. Concepts consist of multiple objects and can be either more specific (the target objects share many attributes) or more generic (the target objects share fewer attributes). We test two directions of zero-shot generalization to novel levels of abstraction: when generalizing from more generic to very specific concepts, agents use a compositional strategy; when generalizing from more specific to very generic concepts, agents use a more flexible linguistic strategy that involves reusing many messages from training. Our results provide evidence that neural network agents can learn robust concepts and generalize from them using adaptive linguistic strategies. We discuss how this research generates new hypotheses about abstraction and informs linguistic theories of efficient communication.