A Single Loop EM Algorithm for the Mixture of Experts Architecture
Yang, Yan; Ma, Jinwen
2009
Keywords: The mixture of experts (ME) architecture; The EM algorithm; Gating network; Single loop; Least mean square regression; Multiclass classification; Maximum likelihood; Gaussian mixtures; Convergence
Abstract: The mixture of experts (ME) architecture is a powerful neural network model for supervised learning, which contains a number of "expert" networks plus a gating network. The EM algorithm can be used to learn the parameters of the ME architecture. In fact, several methods already exist to implement the EM algorithm, such as the IRLS algorithm, the ECM algorithm, and an approximation to the Newton-Raphson algorithm. The differences among these implementations lie in how the gating network is trained, which results in a double-loop training procedure, i.e., an inner loop training procedure within the general or outer loop training procedure. In this paper, we propose a least mean square regression method to learn or compute the parameters of the gating network directly, which leads to a single-loop (i.e., no inner loop training) EM algorithm for the ME architecture. Simulation experiments demonstrate that our proposed EM algorithm outperforms the existing ones in both speed and classification accuracy.
Subject categories: Computer Science, Artificial Intelligence; Computer Science, Theory & Methods
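To illustrate the idea described in the abstract, the sketch below runs EM for a toy two-expert ME model with linear experts. The E-step and the weighted least-squares expert updates are standard; the gating update replaces the usual inner IRLS loop with a single least-squares regression of the (centered log) responsibilities onto the inputs. This closed-form gating step is an assumption standing in for the paper's least mean square regression method, not the authors' exact algorithm; all variable names and the toy data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two linear regimes, one on each half of the input space.
n = 200
x = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.where(x[:, 0] < 0, 2.0 * x[:, 0] + 1.0, -1.5 * x[:, 0]) + 0.05 * rng.normal(size=n)

X = np.hstack([x, np.ones((n, 1))])   # inputs with a bias column
K, d = 2, X.shape[1]

W = 0.1 * rng.normal(size=(K, d))     # expert (linear regression) parameters
V = np.zeros((K, d))                  # gating (softmax) parameters
sigma2 = np.full(K, 1.0)              # expert noise variances

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

for _ in range(50):
    # E-step: posterior responsibility of each expert for each point.
    g = softmax(X @ V.T)                                  # gating probabilities, (n, K)
    resid = y[:, None] - X @ W.T                          # per-expert residuals, (n, K)
    lik = np.exp(-0.5 * resid**2 / sigma2) / np.sqrt(2 * np.pi * sigma2)
    h = g * lik + 1e-300                                  # tiny floor avoids 0/0
    h /= h.sum(axis=1, keepdims=True)

    # M-step for the experts: one weighted least-squares solve per expert.
    for j in range(K):
        Xw = X * h[:, j:j + 1]
        W[j] = np.linalg.solve(X.T @ Xw + 1e-8 * np.eye(d), Xw.T @ y)
        sigma2[j] = (h[:, j] * (y - X @ W[j])**2).sum() / h[:, j].sum()

    # Single-loop gating update: no inner IRLS iterations -- fit the gating
    # weights by one least-squares regression of the centered log
    # responsibilities onto the inputs (an illustrative approximation).
    T = np.log(h)
    T = T - T.mean(axis=1, keepdims=True)                 # center: softmax is shift-invariant
    V = np.linalg.solve(X.T @ X + 1e-8 * np.eye(d), X.T @ T).T
```

Because each iteration is a fixed number of linear solves, the per-iteration cost is constant, in contrast to double-loop schemes where the inner gating optimization runs until its own convergence.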
Language: English
Indexed in: SCI; EI; CPCI-S (ISTP)
Content type: Other
Source URL: [http://hdl.handle.net/20.500.11897/315403]
Collection: School of Mathematical Sciences, Peking University
Recommended citation (GB/T 7714):
Yang, Yan,Ma, Jinwen. A Single Loop EM Algorithm for the Mixture of Experts Architecture. 2009-01-01.