Dancing With Robots: Expressivity in Natural and Artificial Systems
Animal movement encodes information that is meaningfully interpreted by counterpart natural systems in the environment, a behavior that roboticists are trying to replicate in artificial systems. Yet prevailing mathematical models for movement are continuous, e.g., Newton's laws of motion, while those for information are discrete, e.g., Shannon's information theory. This talk presents an information-theoretic measure of the motion complexity capacity of articulated platforms (both natural and artificial) and shows that this measure has stagnated, remaining unexpectedly limited across robotic systems of the last 15 years. The talk points out fundamental limitations on mechanization, leveraging known limits on computation and challenging the idea that human and robot motion are comparable. Applied to a variety of natural and artificial systems, the proposed measure reveals increasing capacity in both internal and external complexity for natural systems, whereas artificial, robotic systems have increased significantly in the capacity of computational (internal) states but remained roughly constant in mechanical (external) state capacity. This work presents a way to analyze trends in animal behavior and shows that robots may not be capable of the same multi-faceted behavior in rich, dynamic environments as natural systems. The talk will also outline work that aims to develop more expressive robotic systems and highlight how embodied movement analysis – and dancing with robots – has facilitated this process.
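As a hypothetical illustration (not the talk's actual measure) of the kind of counting an information-theoretic capacity involves: if each actuated joint of a platform can be resolved into a finite number of distinguishable positions, Shannon's log measure gives an external (mechanical) state capacity in bits, which can be contrasted with the internal (computational) capacity of onboard memory. The joint counts, resolutions, and memory size below are invented for the example.

```python
import math

def external_capacity_bits(joint_resolutions):
    """Bits needed to specify one mechanical (external) state:
    log2 of the number of distinguishable joint configurations."""
    states = 1
    for k in joint_resolutions:
        states *= k
    return math.log2(states)

def internal_capacity_bits(memory_bytes):
    """Bits of computational (internal) state available in memory."""
    return memory_bytes * 8

# Hypothetical 6-DOF arm, each joint resolved into 1024 positions:
# log2(1024**6) = 60 bits of external state capacity.
ext = external_capacity_bits([1024] * 6)

# Hypothetical controller with 1 GiB of RAM: ~8.6e9 bits of internal state.
internal = internal_capacity_bits(2**30)
```

Under these assumptions, internal capacity dwarfs external capacity by orders of magnitude, mirroring the asymmetry the abstract describes between computational and mechanical state growth in robotic systems.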