Abstract:
Federated learning has emerged as a powerful paradigm for decentralized model training, preserving privacy by allowing clients to collaboratively learn a shared model without exchanging raw data. Quantum federated learning (QFL) extends this approach by leveraging quantum computing to enhance computational efficiency and security. However, existing QFL frameworks struggle to handle temporal inconsistencies and to ensure model robustness across time-evolving datasets. Recent findings in quantum physics suggest the emergence of two opposing arrows of time in quantum systems, indicating that time-reversal symmetry can be harnessed for computational processes. This work introduces dual-timeline quantum federated learning (DT-QFL), a novel framework that integrates time-reversal symmetry into QFL. DT-QFL employs quantum memory kernels to encode temporal correlations in client updates, ensuring that both past and future data distributions contribute to the learning process. In addition, we introduce a quantum temporal-invariant neural network, which enables federated models to learn patterns invariant to time flow, improving generalization and reducing catastrophic forgetting in decentralized environments.
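To make the federated setting described above concrete, the following is a minimal sketch of classical federated averaging (FedAvg), the standard baseline workflow that QFL frameworks build on: clients train locally on private data and a server aggregates only the model weights. This illustrates the general collaborative-learning loop, not the paper's DT-QFL algorithm; the toy model, client data, and learning rate are all hypothetical.

```python
import random

def local_update(w, data, lr=0.1):
    """One step of local gradient descent on a 1-D least-squares model y ~ w*x."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fed_avg(global_w, client_datasets, rounds=20):
    """Each round: every client trains locally, then the server averages
    the resulting weights. Raw data never leaves a client; only the
    scalar model weight is communicated."""
    for _ in range(rounds):
        local_ws = [local_update(global_w, d) for d in client_datasets]
        global_w = sum(local_ws) / len(local_ws)  # server-side aggregation
    return global_w

random.seed(0)
# Two clients, each holding private noisy samples of the line y = 3x.
clients = [[(x, 3 * x + random.gauss(0, 0.1)) for x in (1, 2)] for _ in range(2)]
w = fed_avg(0.0, clients)
print(w)  # converges near the true slope 3 without sharing any samples
```

DT-QFL would replace these plain weight averages with aggregation informed by quantum memory kernels, so that temporal correlations across rounds shape the global model rather than each round being treated independently.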
For more about this article, see the link below.
https://ieeexplore.ieee.org/document/11168277

