Translation (60)
[2021-12-27] Today's NLP
Are E2E ASR models ready for an industrial usage? The Automatic Speech Recognition (ASR) community is experiencing a major turning point with the rise of fully-neural (End-to-End, E2E) approaches. At the same time, the conventional hybrid model remains the standard choice for practical ASR usage. According to previous studies, the adoption of E2E ASR in real-world applications was hindere..
2021.12.27
[2021-12-24] Today's NLP
How Should Pre-Trained Language Models Be Fine-Tuned Towards Adversarial Robustness? The fine-tuning of pre-trained language models has achieved great success in many NLP fields. Yet, it is strikingly vulnerable to adversarial examples; e.g., word-substitution attacks using only synonyms can easily fool a BERT-based sentiment analysis model. In this paper, we demonstrate that adversarial training, the ..
2021.12.24
[2021-12-23] Today's NLP
Mixed Precision DNN Quantization for Overlapped Speech Separation and Recognition Recognition of overlapped speech has been a highly challenging task to date. State-of-the-art multi-channel speech separation systems are becoming increasingly complex and expensive for practical applications. To this end, low-bit neural network quantization provides a powerful solution to dramatically reduce their ..
2021.12.23
The Illustrated Transformer: Translation and Summary
https://jalammar.github.io/illustrated-transformer/ The Illustrated Transformer 1) Encoder - only the first (bottom-most) Encoder receives the word embed..
2020.02.10
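The post summarized above walks through the Transformer encoder step by step. As a reminder of its core operation, here is a minimal NumPy sketch of scaled dot-product attention; the variable names, sizes, and random weights are illustrative and not taken from the post:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)   # each row is a distribution over tokens
    return weights @ V, weights

# toy example: 3 tokens, model dimension 4
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))                             # token embeddings
Wq, Wk, Wv = (rng.normal(size=(4, 4)) for _ in range(3))  # projection matrices
out, w = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)
```

In the full Transformer this runs once per head, with separate learned projections per head, and the per-head outputs are concatenated and projected again.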
[seq2seq + Attention] Implementing a French-to-English Translation Model in PyTorch
https://pytorch.org/tutorials/intermediate/seq2seq_translation_tutorial.html NLP From Scratch: Translation with a Sequence to Sequence Network and Attention — PyTorch Tutorials 1.4.0 documentation Author: Sean Robertson This is the third and final tutorial on doing "N..
2020.02.10
Implementing a seq2seq Model in PyTorch: Translation and Summary
https://github.com/bentrevett/pytorch-seq2seq/blob/master/1%20-%20Sequence%20to%20Sequence%20Learning%20with%20Neural%20Networks.ipynb bentrevett/pytorch-seq2seq: tutorials on implementing sequence-to-sequence (seq2seq) models with PyTorch and TorchText. Implementing a German-to-English translation model in PyTorch - The Encoder-Decoder LSTM (= seq2seq) model uses an RNN to turn the input into a feature vector..
2020.02.09
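The preview above describes the seq2seq idea: an encoder RNN compresses the source sentence into a fixed-size feature (context) vector, which conditions the decoder. A minimal NumPy sketch of the encoder half, using a vanilla RNN instead of an LSTM for brevity; all weights and dimensions here are illustrative:

```python
import numpy as np

def rnn_encode(inputs, Wx, Wh, b):
    """Run a vanilla RNN over the source sequence and return the
    final hidden state -- the fixed-size context vector that a
    seq2seq decoder would be conditioned on."""
    h = np.zeros(Wh.shape[0])
    for x in inputs:
        h = np.tanh(Wx @ x + Wh @ h + b)  # h_t = tanh(Wx x_t + Wh h_{t-1} + b)
    return h

rng = np.random.default_rng(0)
d_in, d_h = 5, 8                       # embedding size, hidden size
Wx = rng.normal(size=(d_h, d_in))      # input-to-hidden weights
Wh = rng.normal(size=(d_h, d_h))       # hidden-to-hidden weights
b = np.zeros(d_h)
src = rng.normal(size=(6, d_in))       # a 6-token source "sentence"
context = rnn_encode(src, Wx, Wh, b)   # one vector summarizing the sequence
```

The attention mechanism covered in the other posts exists precisely because this single fixed-size vector becomes a bottleneck for long sentences.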