General Information on the Lecture, Term WS 16/17:

EDV Nr.  143104a
Date:  Wed, 11:45-13:15
Room:  133 / S202
SWS/ECTS:  4/5


October 5th, 2016: First lecture of the term.

Overlap with other lectures: The contents of the lecture are described below. Please note that the overlap between this lecture and the MIB lecture Einführung in die künstliche Intelligenz (Introduction to Artificial Intelligence) is kept as small as possible.


The science of Machine Learning seeks to answer the question

How can we build computer systems that automatically improve with experience, and what are the fundamental laws that govern all learning processes?

In contrast to conventional computer systems, adaptive systems that integrate Machine Learning algorithms not only process data but also try to extract knowledge from the available data and apply this knowledge for self-correction and self-optimisation.
Machine Learning can be considered an intersection of Computer Science and Statistics. Whereas Computer Science has focused primarily on how to manually program computers, Machine Learning focuses on the question of how to get computers to program themselves. Whereas Statistics deals with the problem of how to infer knowledge from a given set of data and some modelling assumptions, Machine Learning incorporates additional questions about which computational architectures and algorithms can be used to most effectively capture, store, index, retrieve and merge these data. Machine Learning also applies results and models from neuroscience and psychology.
Machine Learning is an essential part of Artificial Intelligence. Data-/Web-Mining and Pattern Recognition can be considered Machine Learning applications. A more concrete, though incomplete, list of applications is:

  • Robotics
  • Autonomous Vehicles
  • Marketbasket analysis, customer analysis and recommender systems
  • Object Recognition (including character, speech and face recognition) and Computer Vision
  • Search Engines
  • Sentiment Analysis, Opinion Mining
  • Gaming
  • Speech and Image Compression
  • ...

The lecture aims to provide the necessary theory as well as insights into the most important applications of Machine Learning. In the exercises, students learn to program algorithms and small applications. Python is used for this, since it is easy to learn and provides a wide range of helpful packages.

The lecture is structured as follows:

  • Introduction, Overview and Categorisation
  • Bayesian Learning Theory and parametric learning models
  • Neural Networks I (conventional Feed Forward Networks)
  • Kernel Methods: Support Vector Machines; Gaussian Processes
  • Unsupervised Learning I
  • Neural Networks II: Deep Belief Nets
  • Feature Selection and Feature Extraction
  • Learning with Sequential Data / Time-Series Prediction
  • Reinforcement Learning
  • Ensemble Learning

Exercises are now realised using IPython notebooks. The notebooks can be viewed and downloaded from the links below. After downloading, the notebooks can be used for experiments. This requires that Python, IPython, Pylab and the corresponding modules are installed. Comprehensive Python distributions such as Python(xy) or Enthought already contain all of the basic modules needed to run IPython notebooks. In order to run the provided notebooks:

  1. Download the notebook into an arbitrary directory
  2. Open a command shell and go to the directory which contains the notebook
  3. Type: ipython notebook --pylab inline

Exercise IPython notebooks:

1.) Introductory example to classification
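
The flavour of such an introductory classification example can be sketched in a few lines. The toy data and the nearest-centroid decision rule below are illustrative assumptions, not the actual content of the notebook:

```python
import numpy as np

# Two small, deterministic 2-d point clouds standing in for two classes.
X0 = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.3]])
X1 = np.array([[2.0, 2.0], [2.2, 1.9], [1.9, 2.1]])

# "Training" here is just computing the class centroids.
c0, c1 = X0.mean(axis=0), X1.mean(axis=0)

def predict(x):
    """Assign x to the class whose centroid is nearest (Euclidean distance)."""
    return 0 if np.linalg.norm(x - c0) < np.linalg.norm(x - c1) else 1
```

Even this trivial classifier already exhibits the basic pattern of the lecture's supervised methods: fit parameters from labelled data, then map new inputs to class labels.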

2.) Introductory example to common notions, procedures and problems in machine learning (regression)

3.) Maximum Likelihood Estimation of 1-dimensional distributions
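
As a rough sketch of what this notebook covers: for a 1-dimensional Gaussian, setting the derivative of the log-likelihood to zero gives closed-form ML estimates, namely the sample mean and the biased sample variance (divide by N, not N-1). The toy sample below is made up for illustration:

```python
import numpy as np

# Illustrative sample; the notebook would use real or generated data.
data = np.array([2.1, 1.9, 2.3, 2.0, 1.7, 2.2])

# Closed-form ML estimates for a Gaussian N(mu, sigma^2):
mu_ml = data.mean()                    # sample mean
var_ml = ((data - mu_ml) ** 2).mean()  # biased variance (divide by N)
```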

4.) Parametric classification of 1-dimensional input

5.) Generalized Linear Regression of 1-dimensional input data
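
The core idea of generalized linear regression, as a minimal sketch: apply fixed basis functions to the 1-dimensional input, build a design matrix, and solve the resulting linear least-squares problem. The quadratic toy target below is an assumption made for illustration:

```python
import numpy as np

# Noiseless quadratic target so the fit is exact and deterministic.
x = np.linspace(-1, 1, 20)
y = 1.0 + 2.0 * x + 3.0 * x**2

# Design matrix with polynomial basis functions 1, x, x^2.
Phi = np.column_stack([x**0, x, x**2])

# Least-squares solution for the weights: w = argmin ||Phi w - y||^2
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
```

The model is linear in the weights w, even though it is nonlinear in the input x; that is what makes the closed-form least-squares solution possible.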

6.) Generative and Discriminative Classification Models

7.) Recognition of handwritten digits using a Multi-Layer Perceptron (MLP)


8.) Recognition of handwritten digits with Lasagne/Theano Neural Networks (CNN)

9.) Recognition of handwritten digits using an SVM classifier
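
One common way to set up such an experiment, sketched with scikit-learn's bundled digits dataset (the parameter choices below are illustrative assumptions, not necessarily those of the notebook):

```python
from sklearn import datasets, svm
from sklearn.model_selection import train_test_split

# Load the small 8x8 handwritten-digit dataset shipped with scikit-learn.
digits = datasets.load_digits()

# Hold out half of the data for evaluation.
Xtr, Xte, ytr, yte = train_test_split(
    digits.data, digits.target, test_size=0.5, random_state=0)

# SVM with an RBF kernel; gamma is an illustrative choice.
clf = svm.SVC(gamma=0.001)
clf.fit(Xtr, ytr)
acc = clf.score(Xte, yte)  # accuracy on the held-out half
```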

10.) Prediction of future temperature increase using SVR

11.) Gaussian Process Regression (Example from lecture)
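
The essence of GP regression can be sketched directly in NumPy: with a squared-exponential kernel, the posterior mean and variance at test inputs follow from standard linear-algebra identities. The training data and kernel length scale below are illustrative assumptions:

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel matrix between 1-d point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

def gp_predict(X, y, Xs, noise=1e-4):
    """GP posterior mean and variance at test inputs Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))   # kernel matrix + noise term
    Ks = rbf(Xs, X)                          # test/train cross-covariance
    Kss = rbf(Xs, Xs)                        # test covariance
    mean = Ks @ np.linalg.solve(K, y)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# Toy regression problem: noiseless observations of sin(x).
X = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.sin(X)
mean, var = gp_predict(X, y, np.array([0.0, 0.5]))
```

At the training input x=0 the posterior mean nearly interpolates the observation and the posterior variance collapses towards the noise level, which is the characteristic behaviour discussed in the lecture.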

12.) DBSCAN Cluster demo
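
The algorithm behind this demo can be sketched in compact form: DBSCAN grows clusters from core points (points with at least min_pts neighbours within radius eps) and marks everything unreachable as noise. This is a minimal educational implementation, not the notebook's code; the toy data are made up:

```python
import numpy as np

def dbscan(X, eps, min_pts):
    """Minimal DBSCAN; returns an array of cluster labels, -1 meaning noise."""
    n = len(X)
    # Pairwise Euclidean distances and eps-neighbourhoods (a point is its own neighbour).
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    neigh = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neigh[i]) < min_pts:
            continue                       # already assigned, or not a core point
        labels[i] = cluster                # seed a new cluster at core point i ...
        stack = [i]
        while stack:                       # ... and expand it
            j = stack.pop()
            if len(neigh[j]) >= min_pts:   # only core points propagate the cluster
                for k in neigh[j]:
                    if labels[k] == -1:
                        labels[k] = cluster
                        stack.append(k)
        cluster += 1
    return labels

# Two tight blobs plus one isolated point that should come out as noise.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1], [5.1, 5.1],
              [10.0, 10.0]])
labels = dbscan(X, eps=0.5, min_pts=3)
```

Unlike k-means, the number of clusters is not fixed in advance: it emerges from the density parameters eps and min_pts.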