Minimizing Hyperbolic Embedding Distortion with LLM-Guided Hierarchy Restructuring

Open Access
Authors
Publication date 2025
Book title K-CAP '25
Book subtitle Proceedings of the 13th Knowledge Capture Conference 2025: Dayton, Ohio, USA
ISBN (electronic)
  • 9798400718670
Event 13th International Conference on Knowledge Capture, K-CAP 2025
Pages (from-to) 123-130
Number of pages 8
Publisher New York, New York: Association for Computing Machinery
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract

Hyperbolic geometry is well suited to embedding hierarchical data structures. Hyperbolic learning has therefore become increasingly prominent in machine learning applications where data is hierarchically organized or governed by hierarchical semantics, ranging from recommendation systems to computer vision. The quality of hyperbolic embeddings is tightly coupled to the structure of the input hierarchy, which is often derived from knowledge graphs or ontologies. Recent work has shown that a high branching factor and single inheritance are key to an optimal hyperbolic embedding, while embedding algorithms are robust to imbalance and hierarchy size. To assist knowledge engineers in reorganizing hierarchical knowledge, this paper investigates whether Large Language Models (LLMs) can automatically restructure hierarchies to meet these criteria. We propose a prompt-based approach that transforms existing hierarchies using LLMs, guided by known desiderata for hyperbolic embeddings. Experiments on 16 diverse hierarchies show that LLM-restructured hierarchies consistently yield higher-quality hyperbolic embeddings across several standard embedding quality metrics. Moreover, we show how LLM-guided hierarchy restructuring enables explainable reorganizations, providing justifications to knowledge engineers.
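The two structural desiderata named in the abstract, a high branching factor and single inheritance, can be checked directly on a hierarchy given as parent-child edges. The sketch below is illustrative only and not taken from the paper; the function name `hierarchy_stats` and the example edge list are hypothetical.

```python
from collections import defaultdict

def hierarchy_stats(edges):
    """Compute two structural metrics relevant to hyperbolic embedding
    quality: the average branching factor over internal nodes, and
    whether every child has exactly one parent (single inheritance).

    `edges` is an iterable of (parent, child) pairs.
    """
    children = defaultdict(set)
    parents = defaultdict(set)
    for parent, child in edges:
        children[parent].add(child)
        parents[child].add(parent)

    # Internal nodes are those that appear as a parent at least once.
    internal = list(children)
    avg_branching = sum(len(children[n]) for n in internal) / len(internal)

    # Single inheritance holds iff no child has more than one parent.
    single_inheritance = all(len(p) == 1 for p in parents.values())
    return avg_branching, single_inheritance

# Hypothetical example: "dog" has two parents, violating single inheritance.
edges = [("animal", "dog"), ("animal", "cat"), ("animal", "bird"),
         ("pet", "dog")]
print(hierarchy_stats(edges))  # (2.0, False)
```

A restructuring step would aim to raise the first metric while making the second one `True`, for instance by merging sparse intermediate categories and resolving multi-parent nodes.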

Document type Conference contribution
Language English
Published at https://doi.org/10.1145/3731443.3771357
Other links https://www.scopus.com/pages/publications/105024936133