Article

Multi-task peer-to-peer learning using an encoder-only transformer model

Publisher

Elsevier
DOI: 10.1016/j.future.2023.11.006

Keywords

Peer-to-peer; Gossip averaging; Decentralized learning; BERT; NLP


Abstract

Peer-to-peer (P2P) learning is a decentralized approach to organizing collaboration between end devices known as agents. Agents hold heterogeneous data, and this heterogeneity disrupts the convergence and accuracy of the collectively learned models. A common technique to mitigate the negative impact of heterogeneous data is to arrange the learning process in a multi-task setting, where each task, although it shares the same learning objective, is learned separately. The multi-task technique can, however, also be applied to solve distinct learning tasks. This paper presents and evaluates a novel approach that utilizes an encoder-only transformer model to enable collaboration between agents learning two distinct Natural Language Processing (NLP) tasks. The evaluation revealed that collaboration among agents, even when working towards separate objectives, can yield mutual benefits, particularly when the connections between agents are carefully considered. The multi-task collaboration led to a statistically significant increase of 11.6% in mean relative accuracy compared to the baseline results for the individual tasks. To our knowledge, this is the first study demonstrating a successful and beneficial collaboration between two distinct NLP tasks in a peer-to-peer setting.
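The keywords name gossip averaging as the aggregation mechanism in this P2P setting. A minimal sketch of a pairwise gossip step is shown below; this is an illustration of the general technique, not the paper's code, and all names (`gossip_average`, `mix`) are hypothetical.

```python
def gossip_average(params_a, params_b, mix=0.5):
    """One pairwise gossip step: each agent moves its parameters toward
    a convex combination of its own and its peer's parameters.
    Parameters are represented as name -> value dictionaries."""
    new_a = {k: (1 - mix) * v + mix * params_b[k] for k, v in params_a.items()}
    new_b = {k: (1 - mix) * v + mix * params_a[k] for k, v in params_b.items()}
    return new_a, new_b

# Two agents with a single scalar "weight" each, for illustration.
a = {"w": 0.0}
b = {"w": 1.0}
a, b = gossip_average(a, b)
# After one symmetric step with mix=0.5, both agents hold the average 0.5.
```

In a multi-task arrangement like the one the abstract describes, agents learning distinct tasks would presumably exchange only the shared encoder parameters in this way, while keeping their task-specific heads local.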


