Mixture of Experts for Rationalization
Description:
Mixture of Experts (MoE) is a technique in which several expert models are trained on the same data, with a gating network routing inputs so that each expert specializes in a certain subset of the input space.
MoE models have proven successful in a variety of applications, and their original formulation dates back to the early 1990s.
The idea of this project is to understand whether we can develop an MoE model for selective rationalization to address interlocking, i.e., the tendency of the rationale selector and the predictor to reinforce each other's sub-optimal choices during training.
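
As a rough illustration of the MoE pattern mentioned above (not a proposed design for this project), here is a minimal sketch assuming a PyTorch setting with soft gating: a gating network produces a distribution over experts, and the layer output is the weighted sum of the expert outputs. All names (MixtureOfExperts, num_experts, etc.) are hypothetical and chosen for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F


class MixtureOfExperts(nn.Module):
    """Soft MoE layer: a gating network produces a distribution over
    experts, and the output is the weighted sum of the expert outputs."""

    def __init__(self, input_dim: int, hidden_dim: int, output_dim: int, num_experts: int = 4):
        super().__init__()
        # Each expert is a small feed-forward network over the same input.
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(input_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, output_dim),
            )
            for _ in range(num_experts)
        ])
        # The gate scores each expert given the input.
        self.gate = nn.Linear(input_dim, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Gating distribution over experts: (batch, num_experts)
        weights = F.softmax(self.gate(x), dim=-1)
        # Stacked expert outputs: (batch, num_experts, output_dim)
        outputs = torch.stack([expert(x) for expert in self.experts], dim=1)
        # Weighted combination of the expert predictions: (batch, output_dim)
        return torch.einsum("be,beo->bo", weights, outputs)


if __name__ == "__main__":
    moe = MixtureOfExperts(input_dim=16, hidden_dim=32, output_dim=2)
    x = torch.randn(8, 16)
    print(moe(x).shape)  # torch.Size([8, 2])

In a selective rationalization setting, one possible (but by no means the only) instantiation would pair each expert with its own rationale selector and predictor, with the gate choosing among them.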
Contact: Federico Ruggeri
References:
W. Cai, J. Jiang, F. Wang, J. Tang, S. Kim, and J. Huang. "A Survey on Mixture of Experts in Large Language Models." IEEE Transactions on Knowledge and Data Engineering, vol. 37, no. 7, pp. 3896–3915, July 2025.