 

Keynote Lectures

Keynote Lecture
Michael Beetz, University of Bremen, Germany

Bio-Inspired AI for Autonomous Systems
Jan Seyler, Festo SE & Co. KG, Germany

Interacting with Socially Interactive Agents
Catherine Pelachaud, CNRS/University of Pierre and Marie Curie, France

 

Keynote Lecture

Michael Beetz
University of Bremen
Germany
 

Brief Bio
Available soon.


Abstract
Available soon.



 

 

Bio-Inspired AI for Autonomous Systems

Jan Seyler
Festo SE & Co. KG
Germany
 

Brief Bio
Jan Seyler likes to spark the fire of inspiration in people. He is 35 years old and studied mathematics with a focus on scientific computing at Heidelberg University, and subsequently worked on his PhD in cooperation with Daimler and the University of Erlangen in the area of real-time communication systems. In 2015 he started at Festo as an embedded software developer. Within two years, he became a semantic data engineer, and in 2019 Lead AI Algorithm Developer. Since 2020 he has been leading the Festo AI competence team and the department for AI, controls and embedded software within Festo research.
Additionally, Jan teaches Real-Time Systems and IoT at the Baden-Wuerttemberg Cooperative State University (DHBW) Stuttgart as well as Applied AI and Advanced Data Models at Esslingen University.
In his free time, Jan likes to spend time with his family, read and go exploring.


Abstract
Automation is a fundamental part of industry and of our private lives. At Festo, our vision is to free humans from harmful tasks, whether mentally harmful (boring and repetitive) or even physically harmful. Today, an automation solution is typically engineered for a specific problem and environment and often does not interact actively with humans. For humans to interact naturally with machines, the machines must be able to react to changing tasks and environments. This is where AI-based solutions come into play. Mechanisms inspired by biology offer efficient solutions to many questions arising in this field. This keynote will present concrete challenges from industrial automation and the solutions Festo research has developed.



 

 

Interacting with Socially Interactive Agents

Catherine Pelachaud
CNRS/University of Pierre and Marie Curie
France
 

Brief Bio
Catherine Pelachaud received the Ph.D. degree in computer science from the University of Pennsylvania, Philadelphia, PA, USA, in 1991. She is currently a CNRS director of research in the laboratory ISIR, Sorbonne University, where her research encompasses socially interactive agents and the modeling of nonverbal communication and expressive behaviors. She has authored more than 200 articles. She is or has been an associate editor of several journals, among which IEEE Transactions on Affective Computing, ACM Transactions on Interactive Intelligent Systems and the International Journal of Human-Computer Studies. She has co-edited several books on virtual agents and emotion-oriented systems. She has participated in the organization of international conferences such as IVA, ACII and the virtual agent track of AAMAS. She has received four best paper awards at IVA. She is the recipient of the ACM SIGAI Autonomous Agents Research Award 2015 and was honored with the title of Doctor Honoris Causa by the University of Geneva in 2016. Her Siggraph’94 paper received the Influential Paper Award of IFAAMAS (the International Foundation for Autonomous Agents and Multiagent Systems).


Abstract
In this talk, I will present our work toward building a Socially Interactive Agent (SIA), that is, an agent able to be socially aware, to interact with human partners, and also to adapt its behaviors to favor the user’s engagement during the interaction.
During an interaction, partners adapt their behaviors to each other. Adaptation can happen at different levels, such as the linguistic level (choice of vocabulary, grammatical style) or the behavioral level (change of posture). It can involve the choice of a given conversational strategy to manage the user’s impression, or synchronization mechanisms (e.g., imitating a smile). Recently, we have developed three adaptation mechanisms that work at different levels: conversational strategy, nonverbal behaviors and multimodal signals. I will present the general framework on which these models have been built. As it interacts with a human partner, the agent learns to adapt in order to maximize the quality of the interaction. These models have been evaluated through studies involving visitors of a science museum. I will discuss the results of these studies.
Touch is a modality that has received little attention in human-agent interaction. Social touch serves many functions during an interaction: a caress can convey comfort, a tap can attract a human’s attention, and so on. We have been working on endowing an agent with the ability to respond to social touch from a human and to touch the human to convey different intentions and emotions. I will describe the decision model we have developed, which is based on the emotional model FAtiMA.


