UT Austin’s SuperSFL Revolutionizes Federated Learning for Energy Efficiency

Researchers from the University of Texas at Austin, including Abdullah Al Asif, Sixing Yu, Juan Pablo Munoz, Arya Mazaheri, and Ali Jannesari, have developed a novel approach to improve federated learning in diverse edge environments. Their work, titled “SuperSFL: Resource-Heterogeneous Federated Split Learning with Weight-Sharing Super-Networks,” was recently published in the IEEE Internet of Things Journal.

Federated learning enables multiple edge devices to collaboratively train a machine learning model without sharing their raw data. However, in heterogeneous environments where devices have varying computational and communication capabilities, slower clients can stall every training round and repeated exchanges of large model updates strain limited bandwidth. The researchers propose SuperSFL, a framework that combines federated learning and split learning to address these issues.
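To make the split-learning half of this combination concrete, the minimal sketch below shows the basic idea: the client keeps the early layers and its raw data, the server holds the later layers, and only cut-layer activations and their gradients cross the network. The layer sizes, optimizer settings, and helper names here are illustrative assumptions, not the paper's architecture.

```python
# Illustrative split-learning step (not the authors' code): raw data x never
# leaves the client; only cut-layer activations and gradients are exchanged.
import torch
import torch.nn as nn

client_net = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 256), nn.ReLU())
server_net = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 10))
client_opt = torch.optim.SGD(client_net.parameters(), lr=0.01)
server_opt = torch.optim.SGD(server_net.parameters(), lr=0.01)

def split_training_step(x, y):
    """One collaborative step across the client/server cut."""
    client_opt.zero_grad()
    server_opt.zero_grad()

    # Client-side forward pass up to the cut layer.
    activations = client_net(x)
    smashed = activations.detach().requires_grad_(True)  # what gets transmitted

    # Server-side forward and backward pass on the received activations.
    loss = nn.functional.cross_entropy(server_net(smashed), y)
    loss.backward()
    server_opt.step()

    # The cut-layer gradient is sent back so the client can finish backprop.
    activations.backward(smashed.grad)
    client_opt.step()
    return loss.item()

loss = split_training_step(torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,)))
```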

SuperSFL leverages a weight-sharing super-network to dynamically generate client-specific subnetworks tailored to each device’s resources. This lets every client train a model sized to its own capacity while still contributing to one shared set of weights, reducing the drag that slower devices would otherwise place on training. The framework also introduces a Three-Phase Gradient Fusion (TPGF) optimization mechanism that coordinates local updates, server-side computation, and gradient fusion to speed up convergence.
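The sketch below illustrates the weight-sharing idea in its simplest form, not SuperSFL’s actual architecture or its TPGF mechanism: a “super” layer whose weight matrix can be sliced to a narrower sub-layer, so a weaker client runs and updates only a slice of the shared parameters. The width choices and the capacity-to-width mapping are hypothetical.

```python
# Minimal weight-sharing super-layer sketch (assumed, simplified illustration).
import torch
import torch.nn as nn

class SuperLinear(nn.Module):
    """Linear layer whose output width can be shrunk per client."""
    def __init__(self, in_features, max_out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(max_out_features, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(max_out_features))

    def forward(self, x, width):
        # Use only the first `width` rows of the shared weight matrix;
        # gradients flow back into that same slice of the super-network.
        return nn.functional.linear(x, self.weight[:width], self.bias[:width])

super_layer = SuperLinear(in_features=64, max_out_features=256)

def width_for_client(capacity, max_width=256, min_width=32):
    # Hypothetical mapping from a client's relative capacity to a subnetwork width.
    return max(min_width, int(max_width * capacity))

x = torch.randn(4, 64)
out_weak = super_layer(x, width_for_client(0.25))   # constrained device: 64 units
out_strong = super_layer(x, width_for_client(1.0))  # capable device: 256 units
```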

Moreover, SuperSFL includes a fault-tolerant client-side classifier and collaborative client-server aggregation. These features allow training to continue even when communication failures occur. The researchers tested SuperSFL on the CIFAR-10 and CIFAR-100 datasets with up to 100 heterogeneous clients. They found that SuperSFL converges 2 to 5 times faster, measured in communication rounds, than baseline split federated learning (SFL) methods while reaching higher accuracy. This translates into up to 20 times lower total communication cost and 13 times shorter training time.
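The following sketch shows one plausible reading of the fault-tolerance idea; the authors’ exact mechanism may differ. The client keeps a small local classifier head, so when a round’s server exchange fails it can still compute a loss and keep updating its own layers rather than idling until connectivity returns. All layer sizes and names are assumptions.

```python
# Hedged sketch of a client-side fallback classifier (illustrative only).
import torch
import torch.nn as nn

client_net = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 256), nn.ReLU())
local_head = nn.Linear(256, 10)  # fallback classifier kept on the client
optimizer = torch.optim.SGD(
    list(client_net.parameters()) + list(local_head.parameters()), lr=0.01
)

def local_fallback_step(x, y):
    """Used when the server is unreachable: train against the local head."""
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(local_head(client_net(x)), y)
    loss.backward()
    optimizer.step()
    return loss.item()

# Example: a round in which communication with the server fails.
loss = local_fallback_step(torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,)))
```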

In terms of practical applications for the energy sector, SuperSFL could be particularly useful for smart grid management and energy forecasting. Edge devices like smart meters and sensors often have varying capabilities and intermittent connectivity. SuperSFL’s ability to handle heterogeneity and communication failures could enable more efficient and accurate collaborative learning among these devices, leading to better energy management and forecasting.

The researchers also demonstrated that SuperSFL improves energy efficiency compared to baseline methods, making it a practical solution for federated learning in heterogeneous edge environments. This could translate to energy savings and improved performance for energy-related applications that rely on distributed data from edge devices.

In summary, SuperSFL offers a promising approach to enhance federated learning in diverse edge environments, with potential benefits for the energy sector. By improving the efficiency and accuracy of collaborative learning among edge devices, SuperSFL could contribute to more effective smart grid management and energy forecasting.

This article is based on research available at arXiv.
