Exquisitor at the Lifelog Search Challenge 2024: Blending Conversational Search with User Relevance Feedback
| Authors | |
|---|---|
| Publication date | 2024 |
| Book title | Proceedings of the 2024 ACM Workshop on the Lifelog Search Challenge (LSC'24) |
| Book subtitle | 10th June 2024, Phuket, Thailand |
| ISBN (electronic) | |
| Event | 7th Annual ACM Workshop on the Lifelog Search Challenge, LSC 2024, held during the ACM ICMR 2024 |
| Pages (from-to) | 117-121 |
| Number of pages | 5 |
| Publisher | New York, New York: The Association for Computing Machinery |
| Organisations | |
| Abstract | The past decade has seen a rapid expansion of personal and interpersonal multimedia collections. These collections offer a wealth of information about individuals, including their interests, health, and significant life events. While automated techniques can assist in structuring and organizing these collections, they often have limitations in helping users effectively navigate and find relevant items within such large datasets. The Lifelog Search Challenge (LSC) provides a valuable benchmark for evaluating interactive retrieval systems designed for personal multimedia collections. Exquisitor utilizes a large-scale user relevance feedback (URF) approach for searching through large collections. To address challenges in highly descriptive retrieval tasks where the relevance feedback model may fail to identify essential elements, we have enhanced Exquisitor with conversational search capabilities powered by a Vision Language Model (VLM) and refined the features underlying the URF model. Furthermore, Exquisitor has been updated with a streamlined user interface that enables seamless switching between conversational search and URF modes. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.1145/3643489.3661132 |
| Other links | https://www.scopus.com/pages/publications/85197895795 |
| Downloads | 3643489.3661132-1 (Final published version) |
