Backpropagation (4)
-
Illustrated RNN/LSTM and GRU: Translation and Summary
https://towardsdatascience.com/illustrated-guide-to-recurrent-neural-networks-79e5eb8049c9 Initialize the RNN and its hidden state → feed the input word and the hidden state into the RNN → it outputs that word's output and a new hidden state → feed this output back into the RNN → repeat until the words run out → pass the output to a feed-forward layer → emit the final result (prediction). The hidden state is the RNN's memory (the previous output is handed on as the next input). The tanh function keeps the output in [-..
2020.02.07 -
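The RNN loop summarized in the excerpt above can be sketched in plain Python. This is a minimal illustration with toy scalar "words" and made-up weights, not the article's actual code:

```python
import math

def rnn_step(x, h, w_x=0.5, w_h=0.8, b=0.0):
    """One RNN step: combine input x with hidden state h, squash with tanh.
    The weights are toy scalars chosen for illustration."""
    return math.tanh(w_x * x + w_h * h + b)

# Initialize the hidden state, then feed each input together with the
# previous hidden state back into the RNN, as the post describes.
h = 0.0
for x in [1.0, 0.5, -0.3]:        # one scalar "word" per step
    h = rnn_step(x, h)

# The final hidden state goes to a feed-forward layer for the prediction.
prediction = 2.0 * h + 0.1        # toy linear output layer
```

Because tanh squashes each step's output, the hidden state always stays in (-1, 1), which is the "memory" the post refers to.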
[DL Wizard] Derivative, Gradient and Jacobian: Translation and Summary
https://www.deeplearningwizard.com/deep_learning/boosting_models_pytorch/derivative_gradient_jacobian/ Simplified Equation: this is the simplified equation we have been using to update our parameters toward good values (a good local or global minimum): \theta = \theta - \eta \cdot \nabla_\theta. \theta:..
2020.02.05 -
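The update rule quoted in the excerpt, θ = θ − η·∇_θ, can be demonstrated on a toy one-parameter loss. The loss function and numbers below are my own for illustration, not DL Wizard's code:

```python
def grad(theta):
    """Gradient of the toy loss L(theta) = (theta - 3)^2."""
    return 2.0 * (theta - 3.0)

theta = 0.0   # initial parameter
eta = 0.1     # learning rate

# Repeatedly apply theta := theta - eta * gradient
for _ in range(100):
    theta = theta - eta * grad(theta)

# theta converges toward the minimizer 3.0
print(round(theta, 4))  # → 3.0
```

Each step moves θ against the gradient, so it settles at the minimum; too large an η would overshoot instead.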
[DL Wizard] Forwardpropagation, Backpropagation and Gradient Descent with PyTorch: Translation and Summary
https://www.deeplearningwizard.com/deep_learning/boosting_models_pytorch/forwardpropagation_backpropagation_gradientdescent/ Transiting to Backpropagation: let's go back to our simple FNN to put things in perspective. Let ..
2020.02.04 -
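The post excerpted above walks through forward- and back-propagation for a small from-scratch FNN. A minimal sketch of the same idea, reduced to a single linear neuron with squared-error loss and toy numbers of my own (not the tutorial's code):

```python
# One linear neuron trained on a single point with squared-error loss.
x, y = 2.0, 7.0          # training example (toy values)
w, b = 0.0, 0.0          # parameters
eta = 0.05               # learning rate

for _ in range(200):
    # Forward pass
    y_hat = w * x + b
    loss = (y_hat - y) ** 2
    # Backward pass: chain rule through the squared error
    dloss = 2.0 * (y_hat - y)   # dL/dy_hat
    dw = dloss * x              # dL/dw
    db = dloss                  # dL/db
    # Gradient descent update
    w -= eta * dw
    b -= eta * db
```

After training, the forward pass w*x + b reproduces the target y almost exactly; the same forward/backward/update cycle scales up to the multi-layer FNN in the tutorial.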
Deep Learning from Scratch, Vol. 2: Chapter 1-2 2020.01.24