Prof. Ioannis (Yiannis) Patras


Title: Professor in Computer Vision and Human Sensing
Tel: +44 20 7882 7523
Room: E211, School of Electronic Engineering and Computer Science
Research Group: Multimedia and Vision Research Group

Address: School of EECS, Queen Mary University of London, Mile End Road, London E1 4NS, UK

My research is in the area of "Looking at / Sensing People": using Machine Learning, Computer Vision and Signal Processing methodologies to interpret and predict the actions, behaviour, emotions and cognitive states of people by analysing their images, videos and neuro-physiological signals. This includes the detection, tracking and recognition of facial and body gestures in unconstrained environments. I have more than 200 publications in the most selective journals and conferences in the field of Computer Vision, an h-index of 33, and more than 5,000 citations. I am an associate editor of the journals Pattern Recognition, Computer Vision and Image Understanding, and Image and Vision Computing, and serve as area chair or programme committee member for all the major conferences in the field. My research has been funded by the EPSRC, the EU FP7 programme, and direct bilateral collaborations with research institutes and industry.

Google Scholar profile


A number of PhD positions are open in the fields of Computer Vision, Machine Learning and Affective Computing. The emphasis is on the development and application of Machine Learning methods (in particular Deep Learning) to the analysis of visual (images/video), social, and neuro-physiological (e.g. EEG) signals for understanding and interpreting human actions, activity and affective states. A Masters degree in Computer Science or Electronic/Electrical Engineering is required. The positions offer the possibility of close collaboration with industry and with top academic institutions in Europe.

For the PhD positions, contact me with a copy of your CV and a transcript of your grades at i.patras at

Recent News

Professional activities

Associate editor of Pattern Recognition, Computer Vision and Image Understanding, and Image and Vision Computing.