Hyperparameters (4)
[DL Wizard] Learning Rate Scheduling: Translation and Summary
https://www.deeplearningwizard.com/deep_learning/boosting_models_pytorch/lr_scheduling/
Optimization Algorithm: Mini-batch Stochastic Gradient Descent (SGD). We will be using mini-batch gradient descent in all our examples here when scheduling our learning rate, a combination of batch gradient descent and stochastic gradient descent.
2020.02.05
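The linked post schedules the learning rate on top of mini-batch SGD. As a minimal sketch (not the post's exact code), step decay multiplies the initial rate by a factor `gamma` once every `step_size` epochs, mirroring the behavior of PyTorch's `torch.optim.lr_scheduler.StepLR`:

```python
def step_decay(initial_lr, epoch, step_size=30, gamma=0.1):
    """Step-decay schedule: shrink the learning rate by `gamma`
    once every `step_size` epochs (StepLR-style)."""
    return initial_lr * gamma ** (epoch // step_size)

# With initial_lr=0.1, step_size=30, gamma=0.1:
# epochs 0-29 use 0.1, epochs 30-59 use 0.01, and so on.
```

The same idea generalizes to the other schedules the post covers (exponential decay, reduce-on-plateau); only the decay rule changes.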
[Hyper-parameter Tuning] Hyper-parameter Tuning
https://towardsdatascience.com/hyper-parameter-tuning-techniques-in-deep-learning-4dad592c63c8
Hyper-parameters: 1) learning rate 2) momentum 3) batch size 4) weight decay. What is momentum? - The existing ..
2020.02.05
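Three of the four hyper-parameters listed above meet in a single SGD update. A minimal sketch of one update on a scalar weight, following PyTorch's convention of folding weight decay into the gradient before the momentum buffer accumulates it (the function and argument names here are illustrative, not from the post):

```python
def sgd_momentum_step(w, grad, velocity, lr=0.01, momentum=0.9, weight_decay=1e-4):
    """One PyTorch-style SGD step on a scalar weight:
    add L2 weight decay to the gradient, update the momentum
    buffer, then move the weight against the buffer."""
    grad = grad + weight_decay * w          # weight decay as L2 regularization
    velocity = momentum * velocity + grad   # momentum buffer
    w = w - lr * velocity                   # parameter update
    return w, velocity
```

The batch size enters indirectly: it determines how noisy `grad` is, which is why it is usually tuned jointly with the learning rate.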
Machine Learning Basics
https://towardsdatascience.com/machine-learning-basics-part-1-a36d38c7916
Types of Machine Learning:
1. Supervised Learning
   1.1. Classification → Logistic Regression (response variable y is discrete)
   1.2. Regression → Linear Regression (response variable y is continuous)
2. Unsupervised Learning
3. Reinforcement Learning ..
2020.02.02
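The outline's split between the two supervised settings can be shown with a tiny sketch (illustrative code, not from the post): linear regression fits a continuous response directly, while logistic regression passes the same kind of linear score through a sigmoid to get a probability for a discrete class.

```python
import math

def fit_linear(xs, ys):
    """Ordinary least squares for one feature: y ≈ w*x + b, continuous y."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    w = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return w, my - w * mx

def logistic_predict(w, b, x):
    """Sigmoid of the linear score: P(y = 1 | x) for a discrete y."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))
```

Thresholding `logistic_predict` at 0.5 turns the probability into a class label, which is exactly the discrete-vs-continuous distinction in the outline.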
Deep Learning from Scratch, Vol. 1: Chapters 5-6 [Backpropagation and Training] 2019.12.27