# General Information on the Lecture, Term WS 16/17:

- EDV Nr.: 143104a
- Date: Wed 11.45h-13.15h; 14.15h-15.45h
- Room: 133 / S202
- SWS/ECTS: 4/5

## Announcements:

Overlap with other lectures: The contents of the lecture are described below. Please note that the overlap between this lecture and the MIB lecture Einführung in die künstliche Intelligenz (Introduction to Artificial Intelligence) is kept as small as possible.

# Content

The science of Machine Learning seeks to answer the question

*How can we build computer
systems that automatically improve with experience, and what
are the fundamental laws that govern all learning processes?*

In contrast to conventional computer systems, adaptive systems that
integrate Machine Learning algorithms do not only process data: they try
to extract knowledge from the available data and apply this knowledge
for self-correction and self-optimisation.

Machine Learning can be considered as an intersection of Computer
Science and Statistics. Whereas Computer Science has
focused primarily on how to manually program computers, Machine
Learning focuses on the question of
how to get computers to program themselves. Whereas Statistics already
deals with the problem of how to infer knowledge out of a given set of
data and some modelling assumptions, Machine Learning incorporates
additional questions about what computational architectures and
algorithms can be used to most effectively capture, store, index,
retrieve and merge these data. Machine Learning also applies results
and models from neuroscience and psychology.

Machine Learning is an essential part of Artificial Intelligence.
Data-/Web-Mining and Pattern Recognition can be considered as Machine
Learning applications. A more concrete but not complete list of
applications is:

- Robotics
- Autonomous Vehicles
- Market basket analysis, customer analysis and recommender systems
- Object Recognition (including character, speech and face recognition) and Computer Vision
- Search Engines
- Sentiment Analysis, Opinion Mining
- Gaming
- Speech and Image Compression
- ...

The lecture aims to provide the necessary theory as well as insights into the most important applications of Machine Learning. In the exercises, students learn to program algorithms and small applications. Python is used for this programming since it is easy to learn and provides many helpful packages.
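As a small illustration of the kind of exercise programmed in Python, the sketch below fits a straight line by least squares using NumPy. The data and package choice here are illustrative assumptions, not taken from the actual course material:

```python
import numpy as np

# Illustrative sketch: fit a line y = w0 + w1*x by least squares.
# The data is synthetic; the exercises work with real data sets.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 + 3.0 * x + 0.1 * rng.standard_normal(x.size)

# Design matrix with a bias column, solved with np.linalg.lstsq.
X = np.column_stack([np.ones_like(x), x])
w, *_ = np.linalg.lstsq(X, y, rcond=None)

print("intercept ~ %.2f, slope ~ %.2f" % (w[0], w[1]))
```

This is the flavour of exercises 2 and 5 (regression): build a design matrix, solve for the weights, inspect the fit.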

The lecture is structured as follows:

- Introduction, Overview and Categorisation
- Bayesian Learning Theory and parametric learning models
- Neural Networks I (conventional Feed Forward Networks)
- Kernel Methods: Support Vector Machines; Gaussian Processes
- Unsupervised Learning I
- Neural Networks II: Deep Belief Nets
- Feature Selection and Feature Extraction
- Learning with Sequential Data / Time-Series Prediction
- Reinforcement Learning
- Ensemble Learning

To run an exercise notebook:

- Download the notebook into an arbitrary directory
- Open a command shell and change to the directory which contains the notebook
- Type: `ipython notebook --pylab inline`

Exercise IPython notebooks:

1.) Introductory example to classification

2.) Introductory example to common notions, procedures and problems in machine learning (regression)

3.) Maximum Likelihood Estimation of 1-dimensional distributions

4.) Parametric classification of 1-dimensional input

5.) Generalized Linear Regression of 1-dimensional input data

6.) Generative and Discriminative Classification Models

7.) Recognition of handwritten digits using a Multi Layer Perceptron

8.) Recognition of handwritten digits with Lasagne/Theano Neural Networks (CNN)

9.) Recognition of handwritten digits using an SVM classifier

10.) Prediction of future temperature increase using SVR

11.) Gaussian Process Regression (Example from lecture)

12.) DBSCAN Cluster demo
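To give a taste of notebook 3 (maximum likelihood estimation of 1-dimensional distributions), the following sketch estimates the parameters of a 1-dimensional Gaussian. It uses the standard closed-form ML estimates (sample mean and biased sample variance); the synthetic data is an assumption for illustration:

```python
import numpy as np

# ML estimation for a 1-d Gaussian:
# mu_ML = sample mean, sigma2_ML = biased sample variance (divisor N).
rng = np.random.default_rng(1)
data = rng.normal(loc=5.0, scale=2.0, size=10_000)

mu_ml = data.mean()
sigma2_ml = ((data - mu_ml) ** 2).mean()  # same as data.var(ddof=0)

print("mu_ML ~ %.2f, sigma_ML ~ %.2f" % (mu_ml, np.sqrt(sigma2_ml)))
```

With enough samples, both estimates converge to the true parameters of the generating distribution, which is exactly the behaviour explored in the notebook.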