[DL Wizard] Derivative, Gradient and Jacobian: Translation and Notes

2020. 2. 5. 16:53 | nlp


https://www.deeplearningwizard.com/deep_learning/boosting_models_pytorch/derivative_gradient_jacobian/

 

Simplified equation from the linked article:

\theta = \theta - \eta \cdot \nabla_\theta J(\theta)

\theta: parameters, \eta: learning rate, \nabla_\theta J(\theta): gradient of the loss J with respect to the parameters

 

parameters = parameters - learning_rate * parameters_gradients 

→ This update can be broken into two steps:

1) Backpropagation: computing the gradients

2) Gradient descent: updating the parameters with those gradients (a minimal sketch follows this list)
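
A minimal PyTorch sketch of these two steps. The toy 1-D regression data, the single parameter w, and the learning rate below are assumptions for illustration, not from the original article:

import torch

# Toy data: learn w so that x * w ≈ y (the true w is 3)
x = torch.randn(10, 1)
y = 3 * x
w = torch.zeros(1, requires_grad=True)
learning_rate = 0.1

for _ in range(100):
    loss = ((x * w - y) ** 2).mean()    # scalar loss
    loss.backward()                     # 1) backpropagation: compute d(loss)/dw
    with torch.no_grad():
        w -= learning_rate * w.grad     # 2) gradient descent: parameter update
    w.grad.zero_()                      # reset the accumulated gradient

After a few iterations w converges to roughly 3, which is exactly the update rule parameters = parameters - learning_rate * parameters_gradients at work.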

 

Gradient, Jacobian, Generalized Jacobian

 

- Gradient: (input) vector → (output) scalar

- Jacobian: (input) vector → (output) vector

- Generalized Jacobian: (input) tensor → (output) tensor
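
The first two cases can be checked directly with autograd. A small sketch, assuming a PyTorch version that provides torch.autograd.functional.jacobian; the functions below are made-up examples:

import torch
from torch.autograd.functional import jacobian

# Gradient: vector in -> scalar out, so the derivative is a vector
x = torch.randn(3, requires_grad=True)
s = (x ** 2).sum()        # f: R^3 -> R
s.backward()
print(x.grad)             # shape (3,): the gradient, equal to 2x

# Jacobian: vector in -> vector out, so the derivative is a matrix
def f(v):                 # f: R^3 -> R^2
    return torch.stack([v[0] * v[1], v[1] + v[2]])

print(jacobian(f, torch.randn(3)))   # shape (2, 3)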

 

Jacobian Matrix
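
For a function f: R^n → R^m, the Jacobian matrix stacks the partial derivatives of every output with respect to every input (standard definition, consistent with the vector → vector case above):

J = \frac{\partial f}{\partial x} =
\begin{bmatrix}
\frac{\partial f_1}{\partial x_1} & \cdots & \frac{\partial f_1}{\partial x_n} \\
\vdots & \ddots & \vdots \\
\frac{\partial f_m}{\partial x_1} & \cdots & \frac{\partial f_m}{\partial x_n}
\end{bmatrix}

Each row i is the gradient of the scalar output f_i, so the gradient is simply the m = 1 special case of the Jacobian.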
