[DL Wizard] Derivative, Gradient and Jacobian: Translation and Summary
codlingual
2020. 2. 5. 16:53
Source: Derivative, Gradient and Jacobian - Deep Learning Wizard (www.deeplearningwizard.com)

Simplified equation: this is the update rule we have been using to move the parameters toward good values (a good local or global minimum):

\theta = \theta - \eta \cdot \nabla_\theta J(\theta)

\theta: parameters, \eta: learning rate, \nabla_\theta J(\theta): gradients of the loss J with respect to the parameters
parameters = parameters - learning_rate * parameters_gradients
→ This update can be broken into two steps (sketched in code after this list):
1) Backpropagation: compute the gradients
2) Gradient descent: update the parameters using the gradients
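A minimal sketch of these two steps in PyTorch. The toy linear model, random data, and learning rate below are my own illustration, not from the source:

import torch

torch.manual_seed(0)
x = torch.randn(8, 3)                      # toy inputs
y = torch.randn(8, 1)                      # toy targets
w = torch.randn(3, 1, requires_grad=True)  # parameters (theta)
eta = 0.1                                  # learning rate (eta)

loss = ((x @ w - y) ** 2).mean()           # J(theta): mean squared error

# 1) Backpropagation: compute the gradients of the loss w.r.t. the parameters
loss.backward()                            # fills w.grad with dJ/dw

# 2) Gradient descent: update the parameters using the gradients
with torch.no_grad():
    w -= eta * w.grad  # parameters = parameters - learning_rate * parameters_gradients
    w.grad.zero_()     # reset gradients before the next iteration

In practice an optimizer such as torch.optim.SGD wraps step 2, but the manual update above is exactly what the simplified equation describes.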
Gradient, Jacobian, Generalized Jacobian
- Gradient : (input) vector → (output) scalar; the derivative is a vector with the same shape as the input
- Jacobian : (input) vector → (output) vector; the derivative is a matrix of shape (output dim) × (input dim)
- Generalized Jacobian : (input) tensor → (output) tensor; the derivative is a tensor whose shape stacks the output dims followed by the input dims
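A small sketch of how the three cases differ in the shape of the derivative, using PyTorch autograd (the example functions are my own, not from the source):

import torch
from torch.autograd.functional import jacobian

# Gradient: vector input (3,) -> scalar output; gradient has the input's shape.
x = torch.randn(3, requires_grad=True)
(x ** 2).sum().backward()
print(x.grad.shape)                                 # torch.Size([3])

# Jacobian: vector input (3,) -> vector output (2,); Jacobian is a 2x3 matrix.
W = torch.randn(2, 3)
print(jacobian(lambda v: W @ v, x.detach()).shape)  # torch.Size([2, 3])

# Generalized Jacobian: tensor input (2, 3) -> tensor output (2,);
# the shape stacks output dims then input dims -> (2, 2, 3).
M = torch.randn(2, 3)
print(jacobian(lambda t: t.sum(dim=1), M).shape)    # torch.Size([2, 2, 3])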