Deep Learning 2018/2019 (HD 1280)
http://www.video.uni-erlangen.de
Prof. Dr. Andreas Maier

Deep Learning (DL) has attracted much interest in a wide range of applications such as image recognition, speech recognition, and artificial intelligence, both from academia and industry. This lecture introduces the core elements of neural networks and deep learning. It comprises:
(multilayer) perceptron, backpropagation, fully connected neural networks
loss functions and optimization strategies
convolutional neural networks (CNNs)
activation functions
regularization strategies
common practices for training and evaluating neural networks
visualization of networks and results
common architectures, such as LeNet, AlexNet, VGG, GoogLeNet
recurrent neural networks (RNN, TBPTT, LSTM, GRU)
deep reinforcement learning
unsupervised learning (autoencoder, RBM, DBM, VAE)
generative adversarial networks (GANs)
weakly supervised learning
applications of deep learning (segmentation, object detection, speech recognition, ...)
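As a small taste of the first topic on the list, the classic perceptron learning rule can be sketched in a few lines of NumPy. This is an illustrative sketch only, not material from the lecture itself; all function names here are our own.

```python
import numpy as np

def step(z):
    """Heaviside step activation: 1 if z >= 0, else 0 (elementwise)."""
    return (np.asarray(z) >= 0).astype(float)

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Classic perceptron learning rule on inputs X (n, d), labels y in {0, 1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = step(np.dot(w, xi) + b)
            # Weights change only when the sample is misclassified.
            w += lr * (yi - pred) * xi
            b += lr * (yi - pred)
    return w, b

# Logical AND is linearly separable, so the perceptron converges on it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)
w, b = train_perceptron(X, y)
preds = step(X @ w + b)
print(preds)  # -> [0. 0. 0. 1.]
```

A single perceptron can only represent linearly separable decisions; the multilayer networks and backpropagation covered in the lectures lift exactly this limitation.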
Contact: FAUitunes@uni-erlangen.de

1 - Deep Learning 2018/2019 | 01:27:00 | Tue, 16 Oct 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/10/16/FAU_W18_DL_ClipID_9548/20181016-DL-Maier-OC-1280x720.m4v
Keywords: DeepDream, Go, Detection, Siri, Alexa, machine, neuronal, pattern, perceptron, learning, network, recognition
2 - Deep Learning 2018/2019 | 01:24:57 | Tue, 23 Oct 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/10/23/FAU_W18_DL_ClipID_9595/20181023-DL-Maier-OC-1280x720.m4v
Keywords: multi-layer, abstraction, layer, softmax, decision, feedback, activation, perceptron, problem, output, example, neural, network, learning, layers, function, ravikumar, analytic, loss, breininger, propagation, input, forward, universal, gradient
2 - Deep Learning 2018/2019 (combined recording) | 01:24:57 | Tue, 23 Oct 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/10/23/FAU_W18_DL_ClipID_9595/20181023-DL-Maier-OC-combined-1280x720.m4v
3 - Deep Learning 2018/2019 | 01:38:06 | Tue, 30 Oct 2018 | Tobias Würfl
Video: /data/2018/10/30/FAU_W18_DL_ClipID_9631/20181030-DL-Maier-OC-1280x720.m4v
Keywords: optimization, likelihood, momentum, Nesterov, adam, entropy, subgradient, probability, gradients, subgradients, regression, estimation, loss, function, classification, bernoulli, gradient, network
3 - Deep Learning 2018/2019 (combined recording) | 01:38:06 | Tue, 30 Oct 2018 | Tobias Würfl
Video: /data/2018/10/30/FAU_W18_DL_ClipID_9631/20181030-DL-Maier-OC-combined-1280x720.m4v
4 - Deep Learning 2018/2019 | 01:17:15 | Tue, 06 Nov 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/11/06/FAU_W18_DL_ClipID_9679/20181106-DL-Maier-OC-1280x720.m4v
Keywords: networks, learning, neural, convolution, representation, linear, layers, function
4 - Deep Learning 2018/2019 (combined recording) | 01:17:15 | Tue, 06 Nov 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/11/06/FAU_W18_DL_ClipID_9679/20181106-DL-Maier-OC-combined-1280x720.m4v
5 - Deep Learning 2018/2019 (combined recording) | 01:04:36 | Tue, 13 Nov 2018 | MA Katharina Breininger
Video: /data/2018/11/13/FAU_W18_DL_ClipID_9729/20180509-DL-Breininger-OC-combined-1280x720.m4v
5 - Deep Learning 2018/2019 | 01:04:36 | Tue, 13 Nov 2018 | MA Katharina Breininger
Video: /data/2018/11/13/FAU_W18_DL_ClipID_9729/20180509-DL-Breininger-OC-1280x720.m4v
6 - Deep Learning 2018/2019 | 01:04:15 | Tue, 20 Nov 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/11/20/FAU_W18_DL_ClipID_9736/20181120-DL-Maier-OC-1280x720.m4v
Keywords: practices, training, data, gradient, performance, function, classification, accuracy, positives, network, multiple, classifiers, measures, model, architecture, comparing, validation, ravikumar
6 - Deep Learning 2018/2019 (combined recording) | 01:04:15 | Tue, 20 Nov 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/11/20/FAU_W18_DL_ClipID_9736/20181120-DL-Maier-OC-combined-1280x720.m4v
7 - Deep Learning 2018/2019 | 01:07:41 | Tue, 27 Nov 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/11/27/FAU_W18_DL_ClipID_9787/20181127-DL-Maier-OC-1280x720.m4v
Keywords: network, layer, kernel, convolution, inception, bottleneck, deep, learning, label-smoothing, regularization, architecture, block, reinforcement
7 - Deep Learning 2018/2019 (combined recording) | 01:07:41 | Tue, 27 Nov 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/11/27/FAU_W18_DL_ClipID_9787/20181127-DL-Maier-OC-combined-1280x720.m4v
8 - Deep Learning 2018/2019 (combined recording) | 01:00:59 | Tue, 04 Dec 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/12/04/FAU_W18_DL_ClipID_9828/20181204-DL-Maier-OC-combined-1280x720.m4v
Keywords: recurrent, neural, network, RNN, backpropagation, BPTT, long, short, term, gradient, LSTM, GRU
8 - Deep Learning 2018/2019 | 01:00:59 | Tue, 04 Dec 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/12/04/FAU_W18_DL_ClipID_9828/20181204-DL-Maier-OC-1280x720.m4v
9 - Deep Learning 2018/2019 (combined recording) | 01:06:17 | Tue, 11 Dec 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/12/11/FAU_W18_DL_ClipID_9873/20181211-DL-Maier-OC-combined-1280x720.m4v
Keywords: visualization, network, neural, architecture, backpropagation, inversion, confound
9 - Deep Learning 2018/2019 | 01:06:17 | Tue, 11 Dec 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/12/11/FAU_W18_DL_ClipID_9873/20181211-DL-Maier-OC-1280x720.m4v
10 - Deep Learning 2018/2019 | 01:07:23 | Tue, 18 Dec 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/12/18/FAU_W18_DL_ClipID_9912/20181218-DL-Maier-OC-1280x720.m4v
Keywords: atari, game, state, network
10 - Deep Learning 2018/2019 (combined recording) | 01:07:23 | Tue, 18 Dec 2018 | Prof. Dr. Andreas Maier
Video: /data/2018/12/18/FAU_W18_DL_ClipID_9912/20181218-DL-Maier-OC-combined-1280x720.m4v
11 - Deep Learning 2018/2019 (combined recording) | 01:22:46 | Tue, 08 Jan 2019 | MA Katharina Breininger
Video: /data/2019/01/08/FAU_W18_DL_ClipID_9958/20190108-DL-Breininger-OC-combined-1280x720.m4v
Keywords: object, detection, segmentation, region-based, single-shot, upsampling, convolution, network, recognition, localization, classification, label, CNN
11 - Deep Learning 2018/2019 | 01:22:46 | Tue, 08 Jan 2019 | MA Katharina Breininger
Video: /data/2019/01/08/FAU_W18_DL_ClipID_9958/20190108-DL-Breininger-OC-1280x720.m4v