nlp(62)
- CNN for NLP: Translation and Summary
http://www.wildml.com/2015/11/understanding-convolutional-neural-networks-for-nlp/ (Understanding Convolutional Neural Networks for NLP, wildml.com) CNN as used in NLP: the input data is a word-embedding matrix, and each row is one word… (see the sketch below)
2020.02.06
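A minimal PyTorch sketch of the idea described above (not the post's own code): a convolutional filter slides over a word-embedding matrix in which each row is one word, followed by max-over-time pooling. All names and sizes here are illustrative assumptions.

```python
import torch
import torch.nn as nn

# A sentence as a word-embedding matrix: one row per word (sizes are assumptions).
vocab_size, embed_dim, num_filters, kernel_words = 1000, 128, 100, 3

embedding = nn.Embedding(vocab_size, embed_dim)
# Each filter spans `kernel_words` rows and the full embedding width,
# so it slides over windows of whole words.
conv = nn.Conv2d(1, num_filters, kernel_size=(kernel_words, embed_dim))

tokens = torch.randint(0, vocab_size, (4, 20))  # batch of 4 sentences, 20 tokens each
x = embedding(tokens).unsqueeze(1)              # (4, 1, 20, 128): add a channel dim
features = torch.relu(conv(x)).squeeze(3)       # (4, 100, 18): one value per word window
pooled = features.max(dim=2).values             # max-over-time pooling -> (4, 100)
```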
- CNN: Convolutional Neural Network Summary
(Image source: Deep Learning from Scratch, vol. 1) http://taewan.kim/post/cnn/#fn:1 A network of fully connected (Affine) layers, e.g. input → Affine → ReLU → Affine → ReLU → Affine → ReLU → Affine → Softmax, versus a CNN-based network, e.g. input → Conv → ReLU → Pooling → Conv → ReLU → Pooling → Conv → ReLU → Affine → ReLU → Affine → Softmax = (1) the part that extracts the image's features… (see the sketch below)
2020.02.06
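As a rough illustration of the two layer patterns listed above, a sketch assuming a 28×28 single-channel input (sizes are assumptions, not the book's or post's code):

```python
import torch.nn as nn

# Fully connected (Affine) stack: input -> Affine -> ReLU -> ... -> Affine (-> Softmax).
affine_net = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 256), nn.ReLU(),
    nn.Linear(256, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 10),  # Softmax is applied by the loss (e.g. CrossEntropyLoss)
)

# CNN stack: Conv -> ReLU -> Pooling blocks extract image features,
# then Affine layers classify.
cnn_net = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.Flatten(),
    nn.Linear(64 * 7 * 7, 128), nn.ReLU(),  # 28 -> 14 -> 7 after two 2x2 poolings
    nn.Linear(128, 10),
)
```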
- [DL Wizard] Weight Initializations & Activation Functions: Translation and Summary
https://www.deeplearningwizard.com/deep_learning/boosting_models_pytorch/weight_initialization_activation_functions/ Weight initializations & activation functions: recap of logistic regression and the feedforward neural network, then the activation functions, starting with the sigmoid (logistic), $\sigma(x) = \frac{1}{1 + e^{-x}}$, which maps an input number to $[0, 1]$… (see the sketch below)
2020.02.05
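A small sketch of the two themes in that post, assuming PyTorch: the sigmoid squashes any input into (0, 1), and weight-initialization schemes such as Xavier/Glorot and Kaiming/He are applied to layers before training. Layer sizes are arbitrary.

```python
import torch
import torch.nn as nn

# Sigmoid squashes any real input into (0, 1).
x = torch.linspace(-6.0, 6.0, 5)
print(torch.sigmoid(x))  # tensor of values between 0 and 1

# Weight initialization applied to a layer before training.
sigmoid_layer = nn.Linear(128, 64)
nn.init.xavier_uniform_(sigmoid_layer.weight)  # Xavier/Glorot: suited to sigmoid/tanh
nn.init.zeros_(sigmoid_layer.bias)

relu_layer = nn.Linear(128, 64)
nn.init.kaiming_uniform_(relu_layer.weight, nonlinearity='relu')  # He init: suited to ReLU
```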
- [DL Wizard] Optimization Algorithms: Translation and Summary
https://www.deeplearningwizard.com/deep_learning/boosting_models_pytorch/optimizers/ Optimization algorithms: introduction to gradient-descent optimizers; model recap (a 1-hidden-layer feedforward neural network with ReLU activation); Step 1: load the dataset, Step 2: make the dataset iterable, Step 3: create the model class, Step 4: instantiate the model class… (see the sketch below)
2020.02.05
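For context, a hedged sketch of the training-step pattern such gradient-descent optimizers plug into; the model, sizes, and data below are assumptions, not taken from the post:

```python
import torch
import torch.nn as nn

# 1-hidden-layer feedforward network with ReLU, as in the recap (sizes assumed).
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

inputs = torch.randn(16, 10)             # a dummy mini-batch
labels = torch.randint(0, 2, (16,))

optimizer.zero_grad()                    # clear previously accumulated gradients
loss = criterion(model(inputs), labels)  # forward pass + loss
loss.backward()                          # backward pass: compute gradients
optimizer.step()                         # update parameters
```

Swapping SGD for another optimizer (e.g. `torch.optim.Adam`) changes only the optimizer line; the step pattern stays the same.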
- [DL Wizard] Learning Rate Scheduling: Translation and Summary
https://www.deeplearningwizard.com/deep_learning/boosting_models_pytorch/lr_scheduling/ Learning rate scheduling: the optimization algorithm is mini-batch stochastic gradient descent (SGD), used in all the examples when scheduling the learning rate; it is a combination of batch gradient descent and stochastic gradient descent… (see the sketch below)
2020.02.05
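A minimal sketch of learning-rate scheduling on top of SGD, assuming PyTorch's StepLR; the model and schedule values are illustrative:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Halve the learning rate every 2 epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=2, gamma=0.5)

for epoch in range(6):
    # ... mini-batch SGD training steps would run here ...
    scheduler.step()                      # advance the schedule once per epoch
    print(epoch, scheduler.get_last_lr())
```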
- [DL Wizard] Derivative, Gradient and Jacobian: Translation and Summary
https://www.deeplearningwizard.com/deep_learning/boosting_models_pytorch/derivative_gradient_jacobian/ Derivative, gradient and Jacobian: the simplified equation used throughout for how we update our parameters to reach good values (a good local or global minimum), $\theta = \theta - \eta \cdot \nabla_\theta J(\theta)$, where $\theta$… (see the sketch below)
2020.02.05
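A short sketch of the derivative/gradient machinery behind that update rule, using PyTorch autograd (the function and values are illustrative):

```python
import torch

# Derivative of a scalar function: d(x^2)/dx = 2x.
x = torch.tensor(3.0, requires_grad=True)
(x ** 2).backward()
print(x.grad)  # tensor(6.)

# Gradient of a scalar loss w.r.t. a parameter vector, then one update step.
theta = torch.randn(5, requires_grad=True)
loss = (theta ** 2).sum()
loss.backward()
eta = 0.1
with torch.no_grad():
    theta -= eta * theta.grad  # theta = theta - eta * grad of J(theta)
```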