Interactive content-based visualizations for multimedia search
| Authors | |
|---|---|
| Supervisors | |
| Award date | 18-10-2017 |
| ISBN | |
| Number of pages | 117 |
| Organisations | |
| Abstract | Finding images or videos in multimedia collections is a difficult task. Many collections only have metadata such as filenames or timestamps, and no other information is available. To augment this, we can employ content-based analysis techniques that provide extra content-based metadata. This provides a good starting point, but the accuracy is often insufficient to automate full collection categorization. A human in the loop is essential to aid with search and categorization. In this thesis we evaluate how to retrieve elements from multimedia collections for a variety of retrieval tasks. We investigate different user interfaces that extend content-based retrieval methods with novel user interface techniques. In one interface, MediaTable, we focus on categorization tasks by combining table-style user interfaces with images, so users can investigate both the multimedia content and the associated metadata at the same time. Users can categorize elements by placing them in buckets, and we experiment with aiding the user in this categorization task by automatically categorizing similar elements. We find that MediaTable provides an efficient categorization process and gives users valuable insight into the collection. In the RotorBrowser and ForkBrowser, we focus on retrieval tasks by enhancing search results: similar results are linked together as threads which the user can follow. This allows users to browse not only through results from the original query, but also to find visually, semantically, or temporally related results. This was shown to improve performance, as it encourages people to explore new parts of the collection. |
| Document type | PhD thesis |
| Language | English |
