Gaze-guided Cinematic Editing of Wide-Angle Monocular Video Recordings

This repository contains an independent implementation of the paper: https://dl.acm.org/doi/pdf/10.1145/3313831.3376544

Dataset

Download the dataset from: https://iiitaphyd-my.sharepoint.com/:f:/g/personal/pratikkumar_bulani_alumni_iiit_ac_in/EsyJYpTHMrlHvjq7TV98RVwBjfLs2lymKt0l8-fafL1CPA?e=GrrfXM

Outputs

Sample edited videos are available in this YouTube playlist: https://www.youtube.com/playlist?list=PL7i-RSp-AiPPUben2DGYi1BRabi3tr0aO

Execution

Set the global variables in the GazEd.ipynb notebook, then run the notebook.

Citation

```bibtex
@inproceedings{10.1145/3313831.3376544,
author = {Moorthy, K. L. Bhanu and Kumar, Moneish and Subramanian, Ramanathan and Gandhi, Vineet},
title = {GAZED– Gaze-Guided Cinematic Editing of Wide-Angle Monocular Video Recordings},
year = {2020},
isbn = {9781450367080},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3313831.3376544},
abstract = {We present GAZED– eye GAZe-guided EDiting for videos captured by a solitary, static, wide-angle and high-resolution camera. Eye-gaze has been effectively employed in computational applications as a cue to capture interesting scene content; we employ gaze as a proxy to select shots for inclusion in the edited video. Given the original video, scene content and user eye-gaze tracks are combined to generate an edited video comprising cinematically valid actor shots and shot transitions to generate an aesthetic and vivid representation of the original narrative. We model cinematic video editing as an energy minimization problem over shot selection, whose constraints capture cinematographic editing conventions. Gazed scene locations primarily determine the shots constituting the edited video. Effectiveness of GAZED against multiple competing methods is demonstrated via a psychophysical study involving 12 users and twelve performance videos.},
booktitle = {Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems},
pages = {1–11},
numpages = {11}
}
```
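The abstract above frames editing as an energy minimization over shot selection: at each frame one candidate shot is chosen, balancing how well it covers the gazed scene location against cinematographic penalties for cutting. The sketch below is an illustrative, simplified version of that idea (not the paper's actual formulation): a per-frame gaze cost plus a constant cut penalty, solved exactly with Viterbi-style dynamic programming. All names and cost values are hypothetical.

```python
import numpy as np

def select_shots(gaze_cost, cut_penalty=1.0):
    """Minimize total energy over a shot sequence.

    gaze_cost: (T, S) array, gaze_cost[t, s] = cost of showing shot s
               at frame t (lower = better coverage of gazed location).
    cut_penalty: cost added whenever consecutive frames use different shots.
    Returns a list of T shot indices with minimum total energy.
    """
    T, S = gaze_cost.shape
    dp = np.zeros((T, S))            # dp[t, s] = min energy ending in shot s at t
    back = np.zeros((T, S), int)     # backpointers for path recovery
    dp[0] = gaze_cost[0]
    for t in range(1, T):
        # Transition cost: 0 to stay on the same shot, cut_penalty to switch.
        trans = dp[t - 1][:, None] + cut_penalty * (1 - np.eye(S))
        back[t] = trans.argmin(axis=0)
        dp[t] = trans.min(axis=0) + gaze_cost[t]
    # Backtrack the optimal shot sequence.
    path = [int(dp[-1].argmin())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

With a low cut penalty the edit follows the cheapest shot per frame; raising the penalty suppresses rapid cutting, which is how editing conventions like minimum shot duration can be softly encoded.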
