Incremental vs batch learning: A comparative analysis of efficiency and energy consumption
Authors: Fahad Alkamli, Morched Derbali, and Tariq Mohamed Ahmed
e-ISSN: 1819-6608
Pages: 1836-1846
Volume No.: 20
Issue No.: 20
Issue Date: January 20, 2026
DOI: https://doi.org/10.59018/1025207
Keywords: incremental learning, batch learning, data streams, River framework, online machine learning, concept drift, real-time processing
Abstract
Given the increasing demand for machine learning algorithms that can be deployed on devices with limited memory and power resources, there is a clear need for frameworks that prioritize power and memory efficiency. A prominent candidate is the River framework. This study compares incremental and batch learning models, with a focus on efficiency and energy consumption. We evaluate the performance of three representative machine learning algorithms (Multinomial Naive Bayes, Logistic Regression, and Adaptive Random Forest) using the River framework, which enables instance-based learning on data streams. The findings indicate that while incremental learning can reduce memory footprint and training time, it incurs higher energy consumption during the early stages of training. Conversely, batch learning benefits from training on aggregated data but faces scalability challenges in resource-constrained environments. This paper examines the trade-offs between the two approaches and offers guidance on their applicability in domains such as cybersecurity and IoT.
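The instance-based stream learning the abstract describes can be illustrated with a minimal sketch. The class below is a pure-Python online logistic regression trained one instance at a time, mimicking the learn-one/predict-one pattern that stream-learning frameworks such as River expose; it is an illustrative sketch, not River's actual implementation, and the class and parameter names here are the author's own.

```python
import math
import random

class OnlineLogisticRegression:
    """Minimal incremental logistic regression trained by per-instance SGD.

    Memory stays bounded by the number of features seen so far, not by
    the length of the stream, which is the key efficiency property of
    incremental learning discussed in the abstract."""

    def __init__(self, lr=0.1):
        self.lr = lr
        self.weights = {}  # feature name -> weight, grown lazily
        self.bias = 0.0

    def predict_proba_one(self, x):
        # Sigmoid of the linear score for a single instance.
        z = self.bias + sum(self.weights.get(f, 0.0) * v for f, v in x.items())
        return 1.0 / (1.0 + math.exp(-z))

    def predict_one(self, x):
        return self.predict_proba_one(x) >= 0.5

    def learn_one(self, x, y):
        # One SGD step on the log loss for this single instance.
        err = self.predict_proba_one(x) - float(y)
        for f, v in x.items():
            self.weights[f] = self.weights.get(f, 0.0) - self.lr * err * v
        self.bias -= self.lr * err

# Simulated data stream: the label is 1 exactly when feature a > b.
random.seed(0)
model = OnlineLogisticRegression()
correct = 0
n = 2000
for _ in range(n):
    x = {"a": random.random(), "b": random.random()}
    y = x["a"] > x["b"]
    correct += (model.predict_one(x) == y)  # prequential: test, then train
    model.learn_one(x, y)
accuracy = correct / n
print(f"prequential accuracy: {accuracy:.2f}")
```

The test-then-train (prequential) loop shown here is the standard evaluation scheme for data streams: each instance is first used to score the model and only then to update it, so no separate held-out set is needed.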