NMT(3)
-
The Illustrated Transformer: Translation and Summary
https://jalammar.github.io/illustrated-transformer/
1) Encoder - only the first (bottom) Encoder receives word embed..
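(The excerpt's point, sketched minimally in PyTorch: word embeddings enter only the bottom encoder layer, and each layer feeds the next. The sizes and the use of nn.TransformerEncoderLayer are illustrative assumptions, not taken from the post; positional encoding is omitted.)

import torch
import torch.nn as nn

# Illustrative sizes (assumptions, not from the post)
vocab_size, d_model, n_heads, n_layers = 10000, 512, 8, 6

embedding = nn.Embedding(vocab_size, d_model)
encoder_stack = nn.ModuleList(
    nn.TransformerEncoderLayer(d_model, n_heads) for _ in range(n_layers)
)

tokens = torch.randint(0, vocab_size, (7, 1))  # (seq_len, batch)
x = embedding(tokens)        # word embeddings go into the first (bottom) layer only
for layer in encoder_stack:  # every later layer consumes the previous layer's output
    x = layer(x)
print(x.shape)               # torch.Size([7, 1, 512])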
2020.02.10 -
[seq2seq + Attention] Implementing a French-English Translation Model in PyTorch
https://pytorch.org/tutorials/intermediate/seq2seq_translation_tutorial.html
NLP From Scratch: Translation with a Sequence to Sequence Network and Attention (PyTorch Tutorials 1.4.0 documentation). Author: Sean Robertson. This is the third and final tutorial on doing "N..
2020.02.10 -
Attention Model: Translation and Summary
Sources: 1) Neural Machine Translation by Jointly Learning to Align and Translate 2) Attention: Illustrated Attention 3) Attention and Memory in Deep Learning and NLP
Problems with the existing Encoder-Decoder RNN/LSTM model - however long the input sentence is, it must be compressed into a fixed-length vector - the Decoder receives only the Encoder's final hidden state → a very long sentence gets largely forgotten. Solving these problems - no fixed-length vector - the input sentence as multiple vectors..
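(A minimal sketch of the fix the excerpt describes, assuming Bahdanau-style additive attention from source 1: at each decoding step the context vector is a softmax-weighted sum over all encoder hidden states, not just the last one. The tensor sizes and the random weights W, v are illustrative assumptions.)

import torch
import torch.nn.functional as F

hidden = 256                          # illustrative size
enc_states = torch.randn(10, hidden)  # one hidden state per source token
dec_state = torch.randn(hidden)       # current decoder hidden state

# Additive scoring: score_i = v^T tanh(W [s; h_i])
W = torch.randn(hidden, 2 * hidden)
v = torch.randn(hidden)
scores = torch.stack([v @ torch.tanh(W @ torch.cat([dec_state, h])) for h in enc_states])

weights = F.softmax(scores, dim=0)    # how much to attend to each source token
context = weights @ enc_states        # context vector: weighted sum of ALL encoder states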
2020.02.10