공부공간


Paper Review) Attention Is All You Need

개발자가될수있을까? 2020. 2. 23. 22:22

https://arxiv.org/abs/1706.03762

 

Attention Is All You Need

The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. The best performing models also connect the encoder and decoder through an attention mechanism. We propose a new
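The new architecture the abstract alludes to is the Transformer, whose core building block is scaled dot-product attention: Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. As a minimal sketch (NumPy, with toy shapes chosen purely for illustration):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # query-key similarities, scaled by sqrt(d_k)
    # numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # attention-weighted mixture of values

# toy example: 3 queries attending over 4 key/value pairs, model dim 8
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8): one output vector per query
```

The √d_k scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients; the paper stacks several such heads in parallel (multi-head attention).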


 
