Lecture: "Learning from Where We Look: AI in Eye-Tracking Research"

Start date: 23 December 2025 Start time: 13:15 Location: Lecture room LST-A4

As part of the course Image and Video Processing (OSV) in the master's programme in Information and Communication Technologies (ICT), the following invited lecture, given in English, will take place on Tuesday, 23 December 2025, at 13:15 in lecture room LST-A4.

Naslov: Learning from Where We Look: AI in Eye-Tracking Research

Lecturer: Assoc. Prof. Bulat Ibragimov, Ph.D. (University of Copenhagen, Department of Computer Science & University of Ljubljana, Faculty of Electrical Engineering)

Abstract

In recent years, eye-tracking has emerged as a powerful tool for enriching computer vision and human-centered AI. Gaze data encodes not only where we look, but how we interpret, miss, or focus on visual information. This talk explores how eye-tracking can be leveraged for a range of visual tasks, including error prediction in decision-making, gaze-assisted image segmentation, expertise modeling, and even real-time human-in-the-loop interaction. I will present recent work on building learning pipelines from raw gaze to attention heatmaps, using eye-tracking to annotate large datasets faster, and communicating with computers.
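As one illustration of the pipeline step the abstract mentions (turning raw gaze into an attention heatmap), here is a minimal sketch that accumulates fixation points into a Gaussian-smoothed saliency map. The function name, parameters, and the use of plain NumPy are illustrative assumptions, not the speaker's actual implementation.

```python
import numpy as np

def gaze_to_heatmap(fixations, shape, sigma=10.0):
    """Accumulate gaze fixations into a normalised attention heatmap.

    fixations: iterable of (x, y) pixel coordinates
    shape:     (height, width) of the output map
    sigma:     standard deviation (in pixels) of the Gaussian
               placed at each fixation point
    """
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]          # pixel coordinate grids
    heat = np.zeros(shape, dtype=float)
    for x, y in fixations:
        # Add an isotropic Gaussian bump centred on the fixation.
        heat += np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * sigma ** 2))
    if heat.max() > 0:
        heat /= heat.max()               # normalise to [0, 1]
    return heat

# Three example fixations on a 64x96 image.
heatmap = gaze_to_heatmap([(20, 30), (25, 28), (70, 10)], shape=(64, 96), sigma=8.0)
```

In practice, duration-weighted fixations and a KDE or separable Gaussian filter would be more common, but the idea is the same: gaze samples become a dense attention map that can supervise or guide a vision model.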

Eye tracking and artificial intelligence (image created with the help of AI).
