Visual Attention-based Small Screen Adaptation for H.264 Videos
We develop a framework that uses visual attention analysis combined with temporal coherence to detect the attended region in an H.264 video bitstream and display it on a small screen. A visual attention module based on Walther and Koch's model identifies the attended region in I-frames. We propose a temporal coherence matching framework that uses the motion information in P-frames to extend the attended region across the H.264 video sequence. Evaluations show encouraging results, with a successful detection rate of over 80% for objects of interest and 85% of respondents rating the output satisfactory.
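To illustrate the temporal coherence idea at a high level, the sketch below propagates an attended bounding box across P-frames by averaging the macroblock motion vectors it covers. This is a hypothetical illustration only: the function name, the box representation, and the 16x16 macroblock grid are assumptions, not the thesis's actual implementation.

```python
# Hypothetical sketch (not the thesis's implementation): shift an attended
# region using P-frame macroblock motion vectors, in the spirit of the
# temporal coherence matching described above.

def propagate_region(box, motion_vectors, mb_size=16):
    """Shift an attended box (x, y, w, h) by the mean motion vector of the
    macroblocks it overlaps. motion_vectors maps (mb_col, mb_row) -> (dx, dy)."""
    x, y, w, h = box
    covered = [
        motion_vectors[(c, r)]
        for c in range(x // mb_size, (x + w - 1) // mb_size + 1)
        for r in range(y // mb_size, (y + h - 1) // mb_size + 1)
        if (c, r) in motion_vectors  # skip macroblocks with no motion data
    ]
    if not covered:  # no motion information: keep the region in place
        return box
    dx = round(sum(v[0] for v in covered) / len(covered))
    dy = round(sum(v[1] for v in covered) / len(covered))
    return (x + dx, y + dy, w, h)
```

For example, a 32x32 box whose covered macroblocks all move 4 pixels right would itself shift 4 pixels right, keeping the attended object framed on the small screen.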
Cite this version of the work
Abir Mukherjee (2008). Visual Attention-based Small Screen Adaptation for H.264 Videos. UWSpace. http://hdl.handle.net/10012/3929