Cognition and Artificial Life 2025 – TERAIS Webinar


Date
22 May 2025, 09:00–10:40
Location
Hotel Park, Piešťany, Slovakia, with an MS Teams meeting for online participation

About Conference

The conference creates a space for experts from the Czech and Slovak Republics to present their work on various aspects and methods of cognition research, in both living and artificial systems. Its aim is to stimulate professional discussion, to enable the exchange of knowledge in a pleasant atmosphere, and to attract the younger generation to this dynamically developing interdisciplinary field, important for the 21st century.

The TERAIS section of the KUŽ / CAL conference features one keynote and three regular talks on robotics and human–robot interaction (HRI).

Program

Time  | Title                                                              | Speaker(s)
09:00 | Keynote: Human-Guided Task Specification for Flexible Robotics     | Karla Štěpánová (CIIRC CTU, Prague)
09:40 | How to See Anything                                                | Andrej Lúčny (UKBA)
10:00 | Prediction of Observed Motor Trajectories in a Simulated Environment | Radovan Gregor, Igor Farkaš, Kristína Malinovská (UKBA)
10:20 | Towards Spatial Memory of a Humanoid Robot                         | Laxmi R. Iyer, Lukáš Gajdošech, Branislav Zigo (UKBA)

Invited Speaker

Karla Štěpánová

Mgr. Karla Štěpánová, PhD., Czech Institute of Informatics, Robotics and Cybernetics (CIIRC), CTU in Prague

Karla Štěpánová is a researcher at CIIRC CTU and, since 2024, head of the Robotic Perception Group. Her research focuses on developing AI-driven systems that enable robots to learn from human instructions and demonstrations. She is particularly interested in how robots can understand human intent and context by integrating data from multiple modalities using probabilistic models and multimodal neural networks. She earned her PhD in Artificial Intelligence and Biocybernetics from the Faculty of Electrical Engineering at CTU in Prague in 2017 and a Master's degree in Condensed Matter Physics from the Faculty of Mathematics and Physics at Charles University in 2010.

Talk Abstract: As robotics continues to expand into dynamic and small-batch production settings, the need for intuitive and flexible task specification is becoming increasingly important. This talk presents a novel approach to natural task specification, enabling rapid task definition and deployment without the burden of extensive programming. By integrating human demonstrations, language, and gestures, we create more accessible and adaptable ways to define task parameters and constraints. Additionally, we explore robust task representation methods that structure these specifications for easy transfer across different robotic setups and enable fluent transformation into executable robot plans. This approach also enhances adaptability to new environments and represents a significant step toward more human-centric, flexible robotic systems, bringing us closer to truly natural collaborative automation.