<![CDATA[Deep Learning 2019/2020 (QHD 1920)]]>
http://www.video.uni-erlangen.de
Last updated: Tue, 03 Dec 2019
Thumbnail: https://cdn.video.uni-erlangen.de/Images/Maier_1400_thumb.png
Prof. Dr. Andreas Maier
Deep Learning (DL) has attracted much interest from both academia and industry in a wide range of applications such as image recognition, speech recognition and artificial intelligence. This lecture introduces the core elements of neural networks and deep learning; it comprises:
(multilayer) perceptron, backpropagation, fully connected neural networks
loss functions and optimization strategies
convolutional neural networks (CNNs)
activation functions
regularization strategies
common practices for training and evaluating neural networks
visualization of networks and results
common architectures, such as LeNet, AlexNet, VGG, GoogLeNet
recurrent neural networks (RNN, TBPTT, LSTM, GRU)
deep reinforcement learning
unsupervised learning (autoencoder, RBM, DBM, VAE)
generative adversarial networks (GANs)
weakly supervised learning
applications of deep learning (segmentation, object detection, speech recognition, ...)
The accompanying exercises will provide a deeper understanding of the workings and architecture of neural networks.
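As a small taste of the first two topics listed above (the perceptron and gradient-based training), the following is a minimal sketch of a single-layer perceptron with a sigmoid output trained by gradient descent on a toy logical-OR dataset. All names and the dataset are illustrative and not taken from the lecture materials.

```python
# Minimal single-layer perceptron with a sigmoid activation, trained by
# plain gradient descent on a toy OR problem (illustrative sketch only).
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_perceptron(samples, labels, lr=0.5, epochs=2000, seed=0):
    rng = random.Random(seed)
    w = [rng.uniform(-0.5, 0.5) for _ in samples[0]]
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            # For a sigmoid output with binary cross-entropy loss, the
            # gradient w.r.t. the pre-activation reduces to (y - t).
            g = y - t
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5 else 0

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [0, 1, 1, 1]  # logical OR, a linearly separable toy task
w, b = train_perceptron(X, T)
preds = [predict(w, b, x) for x in X]
```

Because OR is linearly separable, a single perceptron suffices here; the multilayer perceptron and backpropagation covered in the lecture are needed once the decision boundary is no longer linear (e.g. XOR).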
Lecture 1 - Deep Learning 2019/2020 (Tue, 15 Oct 2019)
Lecturer: Dipl.-Inf. Vincent Christlein
Video: /data/2019/10/15/FAU_W19_DL_ClipID_11995/20191015-DL-Maier-OC-1920x1080.m4v
Keywords: introduction, network, perceptron, neural, pattern, recognition
Lecture 1 - Deep Learning 2019/2020 (combined recording, Tue, 15 Oct 2019)
Lecturer: Dipl.-Inf. Vincent Christlein
Video: /data/2019/10/15/FAU_W19_DL_ClipID_11995/20191015-DL-Maier-OC-combined-1920x1080.m4v
Keywords: introduction, network, perceptron, neural, pattern, recognition
Lecture 2 - Deep Learning 2019/2020 (Tue, 29 Oct 2019)
Lecturer: Prof. Dr. Andreas Maier
Video: /data/2019/10/29/FAU_W19_DL_ClipID_12101/20191029-DL-Maier-OC-1920x1080.m4v
Video (combined): /data/2019/10/29/FAU_W19_DL_ClipID_12101/20191029-DL-Maier-OC-combined-1920x1080.m4v
Keywords: network, analytic, function, gradient, regression

Lecture 3 - Deep Learning 2019/2020 (Tue, 05 Nov 2019)
Lecturer: M. Sc. Katharina Breininger
Video: /data/2019/11/05/FAU_W19_DL_ClipID_12147/20191105-DL-Breininger-OC-1920x1080.m4v
Video (combined): /data/2019/11/05/FAU_W19_DL_ClipID_12147/20191105-DL-Breininger-OC-combined-1920x1080.m4v
Keywords: descent, problems, iteration, rate, vesal, convex, learning, function, subgradients, functions, optimization, gradient, entropy, batch, local

Lecture 4 - Deep Learning 2019/2020 (Tue, 12 Nov 2019)
Lecturer: M. Sc. Katharina Breininger
Video: /data/2019/11/12/FAU_W19_DL_ClipID_12213/20191112-DL-Breininger-OC-1920x1080.m4v
Video (combined): /data/2019/11/12/FAU_W19_DL_ClipID_12213/20191112-DL-Breininger-OC-combined-1920x1080.m4v
Keywords: activation, function, model, image, gradients, networks, gradient, breininger, convolution, sigmoid, padding, features, pooling, vesal

Lecture 5 - Deep Learning 2019/2020 (Tue, 19 Nov 2019)
Lecturer: Prof. Dr. Andreas Maier
Video: /data/2019/11/19/FAU_W19_DL_ClipID_12278/20191119-DL-Maier-OC-1920x1080.m4v
Video (combined): /data/2019/11/19/FAU_W19_DL_ClipID_12278/20191119-DL-Maier-OC-combined-1920x1080.m4v
Keywords: loss, var, task, different, detection, convolution, source, input, auxiliary, tasks, activations, network, model, normalization, regularization, training, data, learning, transformations, dataset, landmark, independent, max, facial

Lecture 6 - Deep Learning 2019/2020 (Tue, 26 Nov 2019)
Lecturer: Prof. Dr. Andreas Maier
Video: /data/2019/11/26/FAU_W19_DL_ClipID_12345/20191126-DL-Maier-OC-1920x1080.m4v
Video (combined): /data/2019/11/26/FAU_W19_DL_ClipID_12345/20191126-DL-Maier-OC-combined-1920x1080.m4v
Keywords: gradient, classifiers, breininger, performance, hyperparameters

Lecture 7 - Deep Learning 2019/2020 (Tue, 03 Dec 2019)
Lecturer: Prof. Dr. Andreas Maier
Video: /data/2019/12/03/FAU_W19_DL_ClipID_12413/20191203-DL-Maier-OC-1920x1080.m4v
Video (combined): /data/2019/12/03/FAU_W19_DL_ClipID_12413/20191203-DL-Maier-OC-combined-1920x1080.m4v