| dc.contributor.author | Mehboob, Talha | |
| dc.contributor.author | Bashir, Noman | |
| dc.contributor.author | Iglesias, Jesus Oma?a | |
| dc.contributor.author | Zink, Michael | |
| dc.contributor.author | Irwin, David | |
| dc.date.accessioned | 2026-01-14T21:59:07Z | |
| dc.date.available | 2026-01-14T21:59:07Z | |
| dc.date.issued | 2025-12-03 | |
| dc.identifier.isbn | 979-8-4007-2238-7 | |
| dc.identifier.uri | https://hdl.handle.net/1721.1/164535 | |
| dc.description | SEC ’25, Arlington, VA, USA | en_US |
| dc.description.abstract | Federated Learning (FL) distributes machine learning (ML) training across edge devices to reduce data transfer overhead and protect data privacy. Since FL model training may span hundreds of devices and is thus resource- and energy-intensive, it has a significant carbon footprint. Importantly, since energy's carbon-intensity differs substantially (by up to 60×) across locations, training on the same device using the same amount of energy, but at different locations, can incur widely different carbon emissions. While prior work has focused on improving FL's resource- and energy-efficiency by optimizing time-to-accuracy, it implicitly assumes all energy has the same carbon intensity and thus does not optimize carbon efficiency, i.e., work done per unit of carbon emitted.
To address this problem, we design EcoLearn, which minimizes FL's carbon footprint without significantly affecting model accuracy or training time. EcoLearn achieves a favorable tradeoff by integrating carbon awareness into multiple aspects of FL training, including i) selecting clients with high data utility and low carbon, ii) provisioning more clients during the initial training rounds, and iii) mitigating stragglers by dynamically adjusting client over-provisioning based on carbon. We implement EcoLearn and its carbon-aware FL training policies in the Flower framework and show that it reduces the carbon footprint of training (by up to 10.8×) while maintaining model accuracy and training time (within ~1%) compared to state-of-the-art approaches. | en_US |
| dc.publisher | ACM|The Tenth ACM/IEEE Symposium on Edge Computing | en_US |
| dc.relation.isversionof | https://doi.org/10.1145/3769102.3770625 | en_US |
| dc.rights | Creative Commons Attribution | en_US |
| dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | en_US |
| dc.title | EcoLearn: Optimizing the Carbon Footprint of Federated Learning | en_US |
| dc.type | Article | en_US |
| dc.identifier.citation | Talha Mehboob, Noman Bashir, Jesus Omaña Iglesias, Michael Zink, and David Irwin. 2025. EcoLearn: Optimizing the Carbon Footprint of Federated Learning. In Proceedings of the Tenth ACM/IEEE Symposium on Edge Computing (SEC '25). Association for Computing Machinery, New York, NY, USA, Article 4, 1–16. | en_US |
| dc.contributor.department | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory | en_US |
| dc.identifier.mitlicense | PUBLISHER_POLICY | |
| dc.eprint.version | Final published version | en_US |
| dc.type.uri | http://purl.org/eprint/type/ConferencePaper | en_US |
| eprint.status | http://purl.org/eprint/status/NonPeerReviewed | en_US |
| dc.date.updated | 2026-01-01T08:55:43Z | |
| dc.language.rfc3066 | en | |
| dc.rights.holder | The author(s) | |
| dspace.date.submission | 2026-01-01T08:55:43Z | |
| mit.license | PUBLISHER_CC | |
| mit.metadata.status | Authority Work and Publication Information Needed | en_US |