Elastoformer: Enabling Dynamic Adaptivity via Elastic Model Transformation
| Authors | |
|---|---|
| Publication date | 2025 |
| Book title | SEC '25 |
| Book subtitle | Proceedings of the Tenth ACM/IEEE Symposium on Edge Computing: December 3-6, 2025, Arlington, VA, USA |
| ISBN (electronic) | |
| Event | 10th Symposium on Edge Computing |
| Number of pages | 14 |
| Publisher | New York, New York: Association for Computing Machinery |
| Organisations | |
| Abstract | EdgeAI systems are increasingly deploying computer vision applications to enable intelligent, on-device decision-making in real time. However, these deployments face highly dynamic operational conditions, with fluctuating constraints on latency, power availability, and memory resources. Deep neural networks (DNNs), which follow fixed computational execution flows, lack the flexibility to adapt to such variability, resulting in inefficient and suboptimal performance in edge scenarios. This underscores the need for architectures that are not only efficient but also dynamically scalable at runtime. In this paper, we propose Elastoformer: a framework that transforms conventional deep learning models into elastic models capable of real-time dynamic adaptivity. Unlike the conventional bag-of-models approach, which requires maintaining multiple independent models for different operating conditions, Elastoformer offers a single, modular solution that dynamically switches between multiple modes of operation at runtime, adapting efficiently to the changing computational budgets of edge devices without the overhead of managing separate models. |
| Document type | Conference contribution |
| Language | English |
| DOI | https://doi.org/10.1145/3769102.3770612 |
| Other links | https://github.com/sudaksh14/Elastoformer |
| Downloads | Elastoformer_preprint (Submitted manuscript); 3769102.3770612 (Final published version) |
| Permalink to this page | |
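The abstract contrasts a bag-of-models deployment with a single elastic model that switches operating modes at runtime. The sketch below illustrates that idea only at a conceptual level; the class name, mode names, and layer-prefix scaling rule are illustrative assumptions, not the paper's actual API or method.

```python
# Hedged sketch: one "elastic" model exposing several operating modes,
# in contrast to maintaining separate models per operating condition.
# All names and mode fractions are illustrative, not from the paper.

class ElasticModel:
    """A single network whose runtime cost scales by switching modes.

    Each mode keeps only a prefix of the layer stack active, so a lighter
    mode runs fewer layers (lower latency/energy) at some accuracy cost,
    while all modes share one set of weights.
    """

    # Fraction of layers kept active per mode (illustrative values).
    MODES = {"full": 1.0, "balanced": 0.5, "low_power": 0.25}

    def __init__(self, layers):
        self.layers = layers          # shared across all modes
        self.mode = "full"

    def set_mode(self, mode):
        if mode not in self.MODES:
            raise ValueError(f"unknown mode: {mode}")
        self.mode = mode              # switching loads no new model

    def active_layers(self):
        keep = max(1, round(len(self.layers) * self.MODES[self.mode]))
        return self.layers[:keep]

    def forward(self, x):
        for layer in self.active_layers():
            x = layer(x)
        return x


# Toy usage: four "layers", each a simple transformation.
model = ElasticModel([lambda v: v + 1, lambda v: v * 2,
                      lambda v: v + 3, lambda v: v * 4])
model.set_mode("low_power")           # only 1 of 4 layers runs
light = model.forward(10)             # -> 11
model.set_mode("full")                # all 4 layers run
heavy = model.forward(10)             # -> ((10 + 1) * 2 + 3) * 4 = 100
```

The point of the sketch is the deployment shape: a scheduler reacting to latency or power pressure calls `set_mode` on one resident model, rather than evicting one model and loading another.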
