Can social tagged images aid concept-based video search?
| | |
|---|---|
| Authors | |
| Publication date | 2009 |
| Book title | 2009 IEEE International Conference on Multimedia and Expo, ICME 2009: Proceedings: June 28-July 3, 2009, Waldorf-Astoria Hotel, New York, New York, U.S.A. |
| ISBN | |
| Event | 2009 IEEE International Conference on Multimedia and Expo (ICME 2009), New York, USA |
| Pages (from-to) | 1460-1463 |
| Publisher | Piscataway, NJ: IEEE |
| Organisations | |
| Abstract | This paper seeks to unravel whether commonly available social tagged images can be exploited as a training resource for concept-based video search. Since social tags are known to be ambiguous, overly personalized, and often error-prone, we place special emphasis on the role of disambiguation. We present a systematic experimental study that evaluates concept detectors based on social tagged images, and their disambiguated versions, in three application scenarios: within-domain, cross-domain, and together with an interacting user. The results indicate that social tagged images can indeed aid concept-based video search, especially after disambiguation and when used in an interactive video retrieval setting. These results open up interesting avenues for future research. |
| Document type | Conference contribution |
| Published at | https://doi.org/10.1109/ICME.2009.5202778 |
