Energy-Aware Adaptive Edge Intelligence Model for Resource-Constrained IoT Networks

John Grasias S
K. Rajeswari
L. R. Sujithra

Abstract

The rapid proliferation of Internet of Things (IoT) devices has intensified the demand for intelligent data processing at the network edge. However, resource-constrained IoT nodes suffer from limited battery capacity, restricted memory, low computational power, and dynamic workload variations, making the deployment of conventional deep learning models inefficient and energy-intensive. To address these challenges, this paper proposes an Energy-Aware Adaptive Edge Intelligence (EA-AEI) model designed specifically for resource-constrained IoT environments. The proposed framework integrates dynamic model scaling, energy-aware inference control, adaptive pruning and quantization, and context-driven task offloading to optimize computational efficiency while maintaining predictive accuracy. Unlike traditional static TinyML and fixed edge inference frameworks, the EA-AEI model continuously monitors residual energy levels and system load conditions to adaptively select the most suitable model configuration in real time. This adaptive mechanism significantly reduces energy consumption, minimizes latency, and prolongs network lifetime without compromising performance. Experimental validation conducted on representative IoT datasets demonstrates that the proposed model outperforms existing edge intelligence approaches in terms of average energy consumption, inference latency, throughput, and system sustainability. The results confirm that integrating adaptive intelligence with energy-aware decision-making enables scalable and efficient deployment of AI models in next-generation IoT networks.
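The core adaptive mechanism described above (monitoring residual energy and system load, then selecting the most suitable model configuration in real time) can be illustrated with a minimal sketch. All names, thresholds, energy figures, and the budget-scaling rule below are illustrative assumptions for exposition, not the paper's actual EA-AEI algorithm or its measured parameters.

```python
# Minimal sketch of an energy-aware configuration selector, assuming a set of
# pruned/quantized model variants with profiled per-inference energy costs.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModelConfig:
    name: str                     # variant produced by pruning/quantization
    energy_per_inference: float   # assumed profiled cost in millijoules
    accuracy: float               # assumed validation accuracy of the variant

# Hypothetical candidate configurations (values are placeholders).
CONFIGS = [
    ModelConfig("full",      energy_per_inference=12.0, accuracy=0.95),
    ModelConfig("pruned",    energy_per_inference=6.0,  accuracy=0.92),
    ModelConfig("quantized", energy_per_inference=3.0,  accuracy=0.90),
]

def select_config(residual_energy: float, battery_capacity: float,
                  load: float) -> Optional[ModelConfig]:
    """Pick the most accurate variant that fits the current energy budget.

    residual_energy / battery_capacity is the battery fraction remaining;
    load in [0, 1] is current system utilisation. Both shrink the
    per-inference budget (this linear scaling rule is an assumption).
    Returning None stands in for context-driven task offloading.
    """
    budget = 12.0 * (residual_energy / battery_capacity) * (1.0 - 0.5 * load)
    affordable = [c for c in CONFIGS if c.energy_per_inference <= budget]
    if not affordable:
        return None  # no local variant fits: offload the task instead
    return max(affordable, key=lambda c: c.accuracy)
```

For example, a node at full charge and idle would run the full model, while a node at 30% charge under heavy load would fall below every variant's cost and offload; re-evaluating this decision per inference is what lets the node trade accuracy for lifetime as its battery drains.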
