Gradient Descent (2)
[DL Wizard] Derivative, Gradient and Jacobian: Translation and Summary
https://www.deeplearningwizard.com/deep_learning/boosting_models_pytorch/derivative_gradient_jacobian/
Simplified Equation: this is the simplified equation we have been using for how we update our parameters to reach good values (a good local or global minimum): $\theta = \theta - \eta \cdot \nabla_\theta J(\theta)$ ..
2020.02.05
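The update rule in the snippet above can be sketched in a few lines. This is a minimal illustration, not code from the linked post: the quadratic cost J and its gradient are assumptions chosen so the minimum is easy to verify.

```python
# Sketch of the parameter update rule theta <- theta - eta * grad_J(theta).
# J(theta) = (theta - 3)^2 is a hypothetical cost with its minimum at theta = 3;
# its gradient is 2 * (theta - 3).

def grad_J(theta):
    """Gradient of the illustrative cost J(theta) = (theta - 3)**2."""
    return 2.0 * (theta - 3.0)

theta = 0.0  # initial parameter value
eta = 0.1    # learning rate

for _ in range(100):
    theta = theta - eta * grad_J(theta)  # one gradient descent step

print(theta)  # approaches the minimum at theta = 3
```

With eta = 0.1, each step shrinks the distance to the minimum by a factor of 0.8, so after 100 steps theta is effectively at 3.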
Deep Learning from Scratch, Vol. 1: Chapter 5-6 [Backpropagation and Training] 2019.12.27