2017-02-09

Basics of Modern Image Analysis by Fred Hamprecht (Summer 2016, Universität Heidelberg)


source: Universität Heidelberg, January 18, 2017
This course was held at the Universität Heidelberg in the summer term 2016 by Prof. Fred Hamprecht. It covers the foundations of image analysis, such as the Fourier representation, the characterization of linear filters, separability and steerability of filters, and wavelets. It then moves on to the currently hot topic of neural networks: the course introduces multi-layer perceptrons and how to train them, convolutional neural networks, the recently discovered tricks that allow the training of deep networks, and advanced architectures.
Lecture description: https://hci.iwr.uni-heidelberg.de/mip/teaching/ipintro
Exercises: http://tiny.cc/bmia16  

Filters  1:29:28
* Singular Value Decomposition
* Separable Filters 00:16:00
* Steerable Filters 00:32:30
* Structure Tensor 01:20:20
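
A rough illustration of the separability idea from this lecture (my own sketch, not course code): a 2D kernel is separable exactly when it has rank 1, which the singular value decomposition exposes directly. The Gaussian kernel below is a hypothetical example.

```python
import numpy as np

# Hypothetical example: a 2D Gaussian kernel has rank 1, so its SVD yields a
# single non-zero singular value and the two 1D filters that reproduce it.
x = np.arange(-3, 4)
g = np.exp(-x**2 / 2.0)
g /= g.sum()
kernel_2d = np.outer(g, g)                    # separable by construction

U, s, Vt = np.linalg.svd(kernel_2d)
print("singular values:", np.round(s, 6))     # only s[0] is non-zero

col = U[:, 0] * np.sqrt(s[0])                 # vertical 1D filter
row = Vt[0, :] * np.sqrt(s[0])                # horizontal 1D filter
assert np.allclose(np.outer(col, row), kernel_2d)
```

Filtering with `col` and `row` in two 1D passes costs O(k) per pixel instead of O(k²) for the full 2D convolution, which is the practical payoff of separability.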
Machine Learning for Semantic Segmentation 1:35:54
* Features: linear and nonlinear
* Overview of Classifiers 00:30:45
* Training Random Forests 00:50:50
* Cascaded Classifiers (Auto-Context) 01:02:50
* Hough Forests for Object Detection 01:10:15
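
To make the "filter features plus random forest" pipeline from this lecture concrete, here is a minimal per-pixel classification sketch on synthetic data; the filter bank, image, and labels are all my own placeholders, not the course's exercise setup.

```python
import numpy as np
from scipy import ndimage as ndi
from sklearn.ensemble import RandomForestClassifier

# Hypothetical sketch: filter-bank features + a random forest for
# per-pixel (semantic) segmentation on a synthetic image.
rng = np.random.default_rng(0)
image = rng.random((64, 64))
labels = (ndi.gaussian_filter(image, 3) > 0.5).astype(int)   # fake ground truth

def pixel_features(img):
    # A tiny filter bank: raw intensity, smoothed intensity, gradient magnitude
    feats = [img,
             ndi.gaussian_filter(img, 1),
             ndi.gaussian_gradient_magnitude(img, 1)]
    return np.stack([f.ravel() for f in feats], axis=1)

X, y = pixel_features(image), labels.ravel()
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
prediction = clf.predict(X).reshape(image.shape)
```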
Introduction to Neural Networks 1:21:08
* Linear Classification from a Loss Perspective - Stochastic Gradient Descent - Perceptron
* Multi-Class Classification with the Softmax 00:41:20
* Hidden Layers and their Feature Space Transformation 00:54:25
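
A minimal sketch of multi-class (here two-class) classification with the softmax and plain stochastic gradient descent, assuming a synthetic dataset; this is illustrative only and not taken from the lecture.

```python
import numpy as np

# Hypothetical softmax classifier trained with SGD on random data;
# W has one row of weights per class.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))               # 200 samples, 2 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)         # 2 classes for simplicity
W = np.zeros((2, 2))
b = np.zeros(2)
lr = 0.1

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)        # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

for epoch in range(20):
    for i in rng.permutation(len(X)):           # one sample at a time: SGD
        p = softmax(X[i:i+1] @ W.T + b)[0]
        p[y[i]] -= 1.0                          # grad of cross-entropy w.r.t. logits
        W -= lr * np.outer(p, X[i])
        b -= lr * p
```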
Making Neural Networks practical for Computer Vision 1:33:46
* Stochastic Gradient Descent
* Optimization Tricks: Add Momentum 00:16:20
* How to deal with the high number of parameters 00:23:25
* Convolutional Neural Networks 00:50:00
* Tricks & Best Practices 01:25:05
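
The momentum trick mentioned above fits in a few lines; this is a generic heavy-ball update on a toy quadratic loss, not the lecture's notation or code.

```python
import numpy as np

# Hypothetical sketch of SGD with momentum: the update accumulates a
# velocity instead of following the raw gradient at each step.
def sgd_momentum_step(w, grad, velocity, lr=0.01, mu=0.9):
    velocity = mu * velocity - lr * grad    # classical (heavy-ball) momentum
    return w + velocity, velocity

w = np.array([1.0, -2.0])
v = np.zeros_like(w)
for _ in range(100):
    grad = 2 * w                            # gradient of the toy loss ||w||^2
    w, v = sgd_momentum_step(w, grad, v)
```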
Designing a Rotation Invariant CNN Layer 1:21:01
* Choosing a Steerable Filter Basis
* Using it as Steerable Convolution Layer 00:33:00
* How to put this into a larger CNN 01:05:00
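
A small sketch of the steerability property this layer design relies on: the first derivative of a Gaussian at any orientation θ is just cos(θ)·G_x + sin(θ)·G_y, so two basis kernels suffice. The 7×7 grid and unit σ are arbitrary choices of mine, not the lecture's parameters.

```python
import numpy as np

# Hypothetical steerable-filter basis: derivative-of-Gaussian kernels along
# x and y; any orientation is a linear combination of the two.
x, y = np.meshgrid(np.arange(-3, 4), np.arange(-3, 4))
g = np.exp(-(x**2 + y**2) / 2.0)
gx = -x * g                                  # derivative of Gaussian along x
gy = -y * g                                  # derivative of Gaussian along y

def steered(theta):
    return np.cos(theta) * gx + np.sin(theta) * gy

kernel_45deg = steered(np.pi / 4)
```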
Natural Gradients 1:25:36
* Proximal Gradient Descent
* Fisher Information Matrix 00:24:10
* Natural Gradients 00:51:00
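
For reference, the natural gradient update covered here preconditions the ordinary gradient with the inverse Fisher information matrix; this is the standard textbook form, written from memory rather than taken from the slides.

```latex
\theta_{t+1} = \theta_t - \eta\, F(\theta_t)^{-1} \nabla_\theta L(\theta_t),
\qquad
F(\theta) = \mathbb{E}_{x \sim p_\theta}\!\left[ \nabla_\theta \log p_\theta(x)\, \nabla_\theta \log p_\theta(x)^{\top} \right]
```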
Modern Neural Network Architectures 1:07:35
* Deep CNNs with max-pooling and upscaling
* Holistically Nested Edge Detection 00:36:15
* Going deeper with Inception Layers 00:51:00
* Pruning Connections in the Network 00:58:55
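
To illustrate the pruning topic at the end of this lecture: a common magnitude-based scheme simply zeroes the smallest weights and keeps a mask for retraining. This is a generic sketch, not the specific method presented in the lecture.

```python
import numpy as np

# Hypothetical magnitude-based connection pruning: drop the smallest weights.
def prune_smallest(weights, fraction=0.5):
    threshold = np.quantile(np.abs(weights), fraction)
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))
W_pruned, mask = prune_smallest(W, fraction=0.5)
```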
