Stop Thinking, Just Do!

Sungsoo Kim's Blog

Towards Lifelong Learning Machines (L2L)


28 July 2020


Abstract

One-shot learning is a paradigm in learning theory that explores the ability of machines to recognize a certain class or category of objects after observing only a single instance of it. The system must generalize well enough to correctly categorize future observations of the same “thing” based on the fundamental commonalities the new observation shares with the previously observed example. Classical machine learning approaches treat this problem as a purely numerical challenge, in which the “better” algorithm is judged solely on classification accuracy. But this trivializes the power of the technique in real-world applications, where the context of the observation is absolutely critical to efficient generalization. In fact, without a context-dependent model of what is to be observed, one-shot learning would be an oxymoron. One-shot gesture recognition is one such challenge, in which teams compete to recognize hand/arm gestures after only one training instance.
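
The abstract does not spell out an algorithm, but the core mechanic of one-shot recognition can be sketched as nearest-neighbor matching in an embedding space: store one embedded example per class, then label a query by its closest stored example. This is a minimal sketch under that assumption; the `embed` placeholder and the toy vectors are illustrative, not part of the talk.

```python
# Minimal one-shot classification sketch (illustrative, not the speaker's method).
# Idea: keep one embedded example ("shot") per class; classify a new observation
# by its nearest stored example under cosine similarity.
import numpy as np

def embed(x: np.ndarray) -> np.ndarray:
    # Placeholder embedding: in practice this would be a pretrained network
    # (e.g. a Siamese / metric-learning model). Here we just L2-normalize.
    v = x.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-12)

class OneShotClassifier:
    def __init__(self):
        self.support = {}  # class label -> single embedded example

    def add_shot(self, label: str, example: np.ndarray) -> None:
        self.support[label] = embed(example)

    def predict(self, query: np.ndarray) -> str:
        q = embed(query)
        # Cosine similarity of unit vectors reduces to a dot product.
        return max(self.support, key=lambda c: float(self.support[c] @ q))

# Usage: one training instance per gesture class, then classify a new gesture.
clf = OneShotClassifier()
clf.add_shot("wave", np.array([0.9, 0.1, 0.0]))
clf.add_shot("point", np.array([0.1, 0.9, 0.2]))
print(clf.predict(np.array([0.8, 0.2, 0.1])))  # -> "wave"
```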

My work proposes a novel solution to the problem of one-shot recognition applied to human action, and specifically to human gestures, using an integrative approach: a method to capture the variance of a gesture by looking at both the process of human cognition and the execution of the movement, rather than just at the outcome (the gesture itself). We achieve this from the perspectives of neuroscience and linguistics by placing EEG sensors on human observers to capture what they actually remember from the gesture. Further, we propose to leverage one-shot learning (OSL) approaches coupled with conventional zero-shot learning (ZSL) approaches to address and solve the problem of Hard Zero-Shot Learning (HZSL), whose main aim is to recognize unseen classes (zero examples) given only limited training information (one or a few examples per class).
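
To make the HZSL setting concrete, here is a hedged sketch in the same spirit as the one-shot example above: zero-shot classes are represented only by a semantic attribute vector, while one-shot classes fuse that description with their single example. The shared embedding space and the simple averaging fusion rule are my assumptions for illustration, not the speaker's method.

```python
# Hedged sketch of the Hard Zero-Shot Learning setting (illustrative only).
# Some classes have a single visual example; others have none and are known
# only through a semantic description (e.g. an attribute vector). We assume
# attributes and visual features can be compared in one shared space.
import numpy as np

def normalize(v: np.ndarray) -> np.ndarray:
    v = v.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-12)

class HZSLClassifier:
    def __init__(self):
        self.prototypes = {}  # class label -> unit prototype in the shared space

    def add_semantic_class(self, label: str, attributes: np.ndarray) -> None:
        # Zero-shot class: the prototype is its semantic attribute vector.
        # A real system would learn a visual->semantic projection instead.
        self.prototypes[label] = normalize(attributes)

    def add_one_shot_class(self, label: str, attributes: np.ndarray,
                           example: np.ndarray) -> None:
        # One-shot class: fuse the semantic description with the single
        # example (a plain average here; the fusion rule is an assumption).
        self.prototypes[label] = normalize(normalize(attributes) + normalize(example))

    def predict(self, query: np.ndarray) -> str:
        q = normalize(query)
        return max(self.prototypes, key=lambda c: float(self.prototypes[c] @ q))

# Usage: "clap" is known only by description; "wave" has one example.
clf = HZSLClassifier()
clf.add_semantic_class("clap", attributes=np.array([1.0, 0.0, 1.0]))
clf.add_one_shot_class("wave", attributes=np.array([0.0, 1.0, 0.0]),
                       example=np.array([0.2, 0.8, 0.1]))
print(clf.predict(np.array([0.9, 0.1, 0.8])))  # -> "clap" (zero examples seen)
```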

About the Speaker

Juan Wachs is the James A. and Sharon M. Tompkins Rising Star Associate Professor in the School of Industrial Engineering at Purdue University and an Adjunct Associate Professor of Surgery at the Indiana University School of Medicine. He directs the Intelligent Systems and Assistive Technologies (ISAT) Lab at Purdue and is affiliated with the Regenstrief Center for Healthcare Engineering. Dr. Wachs received his B.Ed.Tech in Electrical Education from ORT Academic College, at the Hebrew University of Jerusalem campus, and his M.Sc. and Ph.D. in Industrial Engineering and Management from Ben-Gurion University of the Negev, Israel. He completed postdoctoral training at the Naval Postgraduate School’s MOVES Institute under a National Research Council Fellowship from the National Academies of Sciences. He pioneered the field of gesture interaction in healthcare, for applications in the operating room and in austere environments. He co-authored a Best Paper Award finalist at the IEEE International Conference on Systems, Man, and Cybernetics; was awarded the 2012 Air Force Summer Faculty Fellowship Program (SFFP); and received an IEEE Appreciation Award for outstanding contribution to the success of the Spring 2012 Section Conference. He is the recipient of the 2013 Air Force Young Investigator Award and co-authored the poster presentation award winner at AAAI 2015. He was also named a 2015 Helmsley Senior Scientist Fellow, a 2016 Fulbright Scholar, and the 2017 Rising Star Professor. He is the technical advisor to “prehensile technologies”, which develops technologies to improve the wellbeing of people with disabilities. His research interests include human-machine interaction, gesture recognition, and assistive robotics.

