Author: 琰琰
MLPs rival Transformers: is inductive bias superfluous?
Seven papers strike hard; can the Transformer hold up?
"Beyond Self-attention: External Attention using Two Linear Layers for Visual Tasks" - Tsinghua University (a minimal sketch of this idea follows the list)
"RepMLP: Re-parameterizing Convolutions into Fully-connected Layers for Image Recognition" - School of Software, Tsinghua University
"Do You Even Need Attention? A Stack of Feed-Forward Layers Does Surprisingly Well on ImageNet" - University of Oxford
"ResMLP: Feedforward networks for image classification with data-efficient training" - Facebook AI
"Are Pre-trained Convolutions Better than Pre-trained Transformers?" - Google Research
"FNet: Mixing Tokens with Fourier Transforms" - Google Research
"Pay Attention to MLPs" - Google Research
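To make the first title concrete: the Tsinghua paper replaces self-attention with two small linear layers that act as learnable external memories shared across all samples. Below is a minimal PyTorch sketch of that idea, assuming a memory size of 64 and the double-normalization step described in the paper; the class name and dimensions are illustrative choices of this article, not the authors' released code.

```python
import torch
import torch.nn as nn


class ExternalAttention(nn.Module):
    """Sketch of external attention: self-attention replaced by two
    linear layers acting as learnable external memories (M_k, M_v).
    Hypothetical names and sizes, not the paper's official code."""

    def __init__(self, d_model: int, memory_size: int = 64):
        super().__init__()
        self.mk = nn.Linear(d_model, memory_size, bias=False)  # memory M_k
        self.mv = nn.Linear(memory_size, d_model, bias=False)  # memory M_v

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_tokens, d_model)
        attn = self.mk(x)                                       # (batch, n, memory_size)
        attn = torch.softmax(attn, dim=1)                       # normalize over tokens
        attn = attn / (attn.sum(dim=-1, keepdim=True) + 1e-9)   # double normalization
        return self.mv(attn)                                    # (batch, n, d_model)


# Usage example: 14x14 patch tokens with 128-dimensional features.
x = torch.randn(2, 196, 128)
out = ExternalAttention(d_model=128)(x)
print(out.shape)  # torch.Size([2, 196, 128])
```

Because the memory size is fixed, the cost grows linearly with the number of tokens, instead of quadratically as in self-attention.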
What research questions do these papers reflect?