Towards artificial systems: what can we learn from human perception
* SPEAKERS
Name
Affiliation
E-mail
Prof. Dr. Heinrich Hermann Bülthoff
MPI for Biological Cybernetics / Korea Univ.
sec(at)apctp.org
* HOST (Applicant)
Name
Affiliation
E-mail
-
* DATE / TIME
2010-09-13, 16:00
* PLACE
512 Seminar Room, APCTP Headquarters, Pohang
* ABSTRACT
The question of how we perceive and interact with the world around us has been at the heart of cognitive science and neuroscience research for decades. Despite tremendous advances in the field of computational vision – made possible by the development of powerful learning techniques as well as the availability of large amounts of labeled training data – artificial systems have yet to reach human performance levels and generalization capabilities. In this contribution we highlight some recent results from perceptual studies that could help bring artificial systems a few steps closer to this grand goal. In particular, we focus on the issue of spatio-temporal object representations (dynamic faces), face synthesis, and the need to take multisensory data into account in models of object categorization. Having understood the important role of haptic feedback in human perception, we also explored new ways of exploiting it to help humans (pilots) solve difficult control tasks. This recent work on human-machine interfaces extends naturally to autonomous or intelligent machines such as robots, which are envisioned to become pervasive in our society and to cooperate closely with humans on their tasks. Across all of these lines of perceptual research, the underlying philosophy has been to combine the latest tools in computer vision, computer graphics, and virtual reality technology in order to gain a deeper understanding of biological information processing. Conversely, we discuss how the perceptual results can feed back into the design of better and more efficient tools for artificial systems.