Cristian Sminchisescu (Google, Lund University)
Talk: GHUM, Interactions, and Active Human Sensing
Bio: Cristian Sminchisescu is a Research Scientist leading a team at Google and a Professor at Lund University. He obtained a doctorate in computer science and applied mathematics, with a focus on imaging, vision, and robotics, at INRIA, under an Eiffel excellence fellowship of the French Ministry of Foreign Affairs, and did postdoctoral research in the Artificial Intelligence Laboratory at the University of Toronto. He has held a Professor-equivalent title at the Romanian Academy and a Professor rank (status appointment) at Toronto, and has advised research at both institutions. During 2004-07, he was a faculty member of the Toyota Technological Institute at the University of Chicago, and later served on the faculty of the Institute for Numerical Simulation in the Mathematics Department at Bonn University. Cristian Sminchisescu regularly serves as an Area Chair for computer vision and machine learning conferences (CVPR, ECCV, ICCV, AAAI, NeurIPS), served as a Program Chair for ECCV 2018, and is an Associate Editor of the IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI) and the International Journal of Computer Vision (IJCV). Over time, his work has been funded by the US National Science Foundation, the Romanian Science Foundation, the German Science Foundation, the Swedish Science Foundation, the European Commission under a Marie Curie Excellence Grant, and the European Research Council under an ERC Consolidator Grant. Cristian Sminchisescu's research interests are in the areas of computer vision (3D human sensing, reconstruction, and recognition) and machine learning (optimization and sampling algorithms, kernel methods, and deep learning). The visual recognition methodology developed in his group was a winner of the PASCAL VOC object segmentation and labeling challenge in 2009-12, as well as of the Reconstruction Meets Recognition Challenge (RMRC) in 2013-14. His work on deep learning of graph matching received the Best Paper Award Honorable Mention at CVPR 2018.