Multi-Task Federated Split Learning Across Multi-Modal Data with Privacy Preservation
With the advancement of federated learning (FL), there is a growing demand for schemes that support multi-task learning on multi-modal data while ensuring robust privacy protection, especially in applications like intelligent connected vehicles. Traditional FL schemes often struggle with the complexities introduced by multi-modal data and diverse task requirements, such as increased communication overhead and computational burdens. In this paper, we propose a novel privacy-preserving scheme for multi-task federated split learning across multi-modal data (MTFSLaMM).
Our approach leverages the principles of split learning to partition models between clients and servers, employing a modular design that reduces computational demands on resource-constrained clients. To ensure data privacy, we integrate differential privacy to protect intermediate data and employ homomorphic encryption to safeguard client models. Additionally, our scheme employs an optimized attention mechanism guided by mutual information to achieve efficient multi-modal data fusion, maximizing information integration while minimizing computational overhead and preventing overfitting.
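To make the split learning workflow above concrete, the following PyTorch sketch shows a single forward step: a client runs its local data through a bottom model up to the cut layer, perturbs the resulting intermediate activations ("smashed data") with clipped Gaussian noise before transmission, and the server completes the forward pass. The layer sizes, clipping norm, and noise multiplier are illustrative assumptions rather than the paper's calibrated differential privacy settings.

```python
import torch
import torch.nn as nn

# Stand-in client-side "bottom" model (up to the cut layer) and
# server-side "top" model; the paper does not fix these architectures.
client_bottom = nn.Sequential(nn.Linear(128, 64), nn.ReLU())
server_top = nn.Sequential(nn.Linear(64, 10))

def dp_gaussian(smashed: torch.Tensor, clip_norm: float = 1.0,
                noise_multiplier: float = 0.5) -> torch.Tensor:
    """Clip each activation vector and add Gaussian noise before it leaves the client.

    clip_norm and noise_multiplier are illustrative values, not
    calibrated (epsilon, delta) parameters.
    """
    norms = smashed.norm(dim=1, keepdim=True).clamp(min=1e-12)
    clipped = smashed * (clip_norm / norms).clamp(max=1.0)
    noise = torch.randn_like(clipped) * noise_multiplier * clip_norm
    return clipped + noise

# One split-learning forward step.
x = torch.randn(32, 128)             # a client's local mini-batch
smashed = client_bottom(x)           # client-side computation up to the cut layer
protected = dp_gaussian(smashed)     # differential privacy on the intermediate data
logits = server_top(protected)       # server-side computation
```

During training, gradients with respect to the noisy activations flow back from the server to the client in the same split fashion, so raw inputs never leave the client.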
Experimental results demonstrate the effectiveness of the proposed scheme in addressing the challenges of multi-modal data and multi-task learning while offering robust privacy protection, with MTFSLaMM achieving a 15.3% improvement in BLEU-4 and an 11.8% improvement in CIDEr scores compared with the baseline.
We propose MTFSLaMM, a privacy-preserving multi-task federated split learning scheme designed to handle diverse data modalities efficiently while supporting multiple tasks. By leveraging split learning, the scheme reduces the computational overhead on clients by partitioning the model and delegating part of the computation to the server. To ensure robust privacy protection, it integrates differential privacy to secure the intermediate data exchanged during split learning and employs homomorphic encryption to safeguard client models during server-side aggregation. Experimental results demonstrate that the proposed scheme effectively enhances privacy while maintaining competitive performance in multi-modal, multi-task learning scenarios.
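As a rough illustration of how homomorphic encryption can protect client models during server-side aggregation, the sketch below uses the Paillier cryptosystem via the python-paillier (`phe`) package as an assumed additively homomorphic scheme; the paper does not prescribe a particular HE library, and the tiny four-parameter "updates" stand in for full model tensors.

```python
import numpy as np
from phe import paillier  # python-paillier; an assumed choice of additively homomorphic scheme

# Key pair held on the client side (or by a key authority); the server only sees ciphertexts.
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

# Toy "model updates" from three clients (flattened parameter vectors).
client_updates = [np.random.randn(4) for _ in range(3)]

# Each client encrypts its update element-wise before upload.
encrypted_updates = [[public_key.encrypt(float(w)) for w in update]
                     for update in client_updates]

# The server aggregates by homomorphic addition without ever decrypting.
aggregated = encrypted_updates[0]
for enc in encrypted_updates[1:]:
    aggregated = [a + b for a, b in zip(aggregated, enc)]

# A key-holding client decrypts and averages the aggregated model.
averaged = [private_key.decrypt(c) / len(client_updates) for c in aggregated]
print(np.allclose(averaged, np.mean(client_updates, axis=0)))  # expected: True
```

Encrypting every parameter individually is expensive in practice, which is one reason the communication overhead discussed below remains a key concern.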
Our proposed MTFSLaMM scheme is particularly well suited to intelligent connected vehicle (ICV) systems and other applications that require the simultaneous execution of multiple tasks, such as object detection, traffic prediction, path planning, and driver behavior analysis, and that rely on multi-modal data from diverse sensors such as cameras, LiDAR, radar, and GPS. The modular split learning design not only reduces the computational burden on resource-constrained clients but also ensures that sensitive data remain protected through robust privacy mechanisms. Communication overhead remains a bottleneck for privacy-preserving, multi-task, multi-modal federated learning, and future research could incorporate more efficient ciphertext packing or compressed transmission techniques to reduce it further.