Power Transform Revisited: Numerically Stable, and Federated
Abstract
Power transforms are made numerically stable and extended to the federated learning setting, improving robustness.
Power transforms are popular parametric techniques for making data more Gaussian-like, and are widely used as preprocessing steps in statistical analysis and machine learning. However, we find that direct implementations of power transforms suffer from severe numerical instabilities, which can lead to incorrect results or even crashes. In this paper, we provide a comprehensive analysis of the sources of these instabilities and propose effective remedies. We further extend power transforms to the federated learning setting, addressing both numerical and distributional challenges that arise in this context. Experiments on real-world datasets demonstrate that our methods are both effective and robust, substantially improving stability compared to existing approaches.
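As a rough illustration of the kind of instability the abstract refers to (not the authors' implementation), consider the Yeo-Johnson power transform: evaluating the naive form ((1 + y)^λ − 1)/λ loses precision through cancellation as λ → 0 and can overflow for large |y|, whereas rewriting it with `expm1`/`log1p` is one standard remedy. The sketch below is a minimal, hypothetical example assuming NumPy and a scalar λ.

```python
import numpy as np

def yeo_johnson_stable(y, lmbda, eps=1e-8):
    """Illustrative Yeo-Johnson transform evaluated in log space.

    Hypothetical sketch (not the paper's method): the naive expression
    ((1 + y)**lmbda - 1) / lmbda suffers catastrophic cancellation for
    lmbda near 0 and overflow for large |y|; expm1/log1p avoid both.
    """
    y = np.asarray(y, dtype=float)
    out = np.empty_like(y)

    pos = y >= 0
    if abs(lmbda) < eps:                      # lambda ~ 0 branch: limit is log1p(y)
        out[pos] = np.log1p(y[pos])
    else:
        out[pos] = np.expm1(lmbda * np.log1p(y[pos])) / lmbda

    neg = ~pos
    if abs(lmbda - 2.0) < eps:                # lambda ~ 2 branch: limit is -log1p(-y)
        out[neg] = -np.log1p(-y[neg])
    else:
        out[neg] = -np.expm1((2.0 - lmbda) * np.log1p(-y[neg])) / (2.0 - lmbda)
    return out

# Usage: near lambda = 0 the stable form stays close to log1p(y),
# where the naive formula would already have lost most of its digits.
x = np.array([0.5, 3.0, -1.2])
print(yeo_johnson_stable(x, 1e-12))
```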
Community
TL;DR: Numerically stable power transforms with an extension to federated learning.