The major goal of this project is to leverage eye-tracking data, including gaze positions and pupil-diameter changes, as an implicit annotation for video interestingness prediction. Tracking users' eyes as they watch videos yields a rich data stream that captures patterns of attention and emotion: gaze positions reveal where users focused their attention, and pupil-diameter changes index their arousal state. Toward this goal, the research funded by this award has developed pupillary light response models that account for brightness-related pupil-diameter changes, as well as methods to measure and visualize interesting regions in images and videos, including omnidirectional (360°) content. Because the project's premise is the abundant availability of eye-tracking data, the activities supported by this award have additionally investigated security and privacy threats to eye-tracking data and associated mitigations.
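To illustrate the idea behind a pupillary light response model, the sketch below fits a simple linear baseline relating screen brightness to pupil diameter and treats the residual as the brightness-corrected signal that may index arousal. This is a minimal illustration under simplifying assumptions, not the project's actual model: real pupillary light responses are nonlinear and time-lagged, and the function and variable names here are hypothetical.

```python
import numpy as np

def arousal_residual(brightness, pupil_diameter):
    """Fit a linear brightness-to-pupil-diameter baseline and return the
    residual diameter. Under this (oversimplified) model, the residual is
    the component of pupil-diameter change not explained by brightness.
    """
    # Least-squares linear fit: predicted diameter as a function of brightness.
    slope, intercept = np.polyfit(brightness, pupil_diameter, deg=1)
    predicted = slope * brightness + intercept
    # Measured minus brightness-predicted component.
    return pupil_diameter - predicted

# Synthetic example: pupil constricts as brightness rises, with one
# transient dilation that is unrelated to brightness.
brightness = np.linspace(0.0, 1.0, 100)
pupil = 5.0 - 2.0 * brightness
pupil[40:60] += 0.5  # brightness-independent dilation (e.g., arousal)
residual = arousal_residual(brightness, pupil)
```

In this toy example, the residual is elevated over samples 40-59, where the dilation was not caused by brightness, which is the kind of signal a light-response correction is meant to expose.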