Deep Learning 01 - Machine Learning Basics
Cristina De Castro
2018
Abstract
This lecture is an introduction to the basics of Machine Learning and belongs to a wider series covering neural networks, convolutional neural networks, Bayesian probability, and Big Learning with Bayesian Methods. First, what learning means is explained, followed by the concepts of supervised and unsupervised learning and the difference between optimization and machine learning. An apparent stop follows, the no free lunch theorem, but distributions and parameter estimation, in particular Maximum Likelihood Estimation, help to recover from it. Then another concept is introduced, Stochastic Gradient Descent. Finally, what machine learning lacks that motivates Deep Learning is discussed.