Course Outline
DAY 1 - ARTIFICIAL NEURAL NETWORKS
Introduction and ANN Structure.
- Biological neurons versus artificial neurons.
- Structural model of an ANN.
- Activation functions employed in ANNs.
- Common classes of network architectures.
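To make the structural model and activation functions above concrete, here is a minimal sketch of a single artificial neuron; the weights, inputs, and choice of activation functions are illustrative assumptions, not prescribed course material.

```python
import numpy as np

# A single artificial neuron: a weighted sum of inputs passed through an
# activation function (all names and values here are illustrative).
def neuron(x, w, b, activation):
    return activation(np.dot(w, x) + b)

sigmoid = lambda z: 1 / (1 + np.exp(-z))
relu = lambda z: np.maximum(z, 0)

x = np.array([0.5, -1.0, 2.0])   # inputs
w = np.array([0.4, 0.3, -0.2])   # synaptic weights
b = 0.1                          # bias

for f in (sigmoid, np.tanh, relu):
    print(neuron(x, w, b, f))   # same neuron, three common activations
```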
Mathematical Foundations and Learning Mechanisms.
- Review of vector and matrix algebra.
- State-space concepts.
- Optimization principles.
- Error-correction learning.
- Memory-based learning.
- Hebbian learning.
- Competitive learning.
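As one concrete instance of the learning mechanisms listed above, the sketch below applies a Hebbian-style update, specifically Oja's rule, a stabilized variant; the toy data and constants are our own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(5)
# Toy 2-D data with most variance along the first axis.
X = rng.normal(0, 1, (1000, 2)) @ np.array([[2.0, 0.0], [0.0, 0.5]])

w = rng.normal(0, 0.1, 2)
lr = 0.01
for x in X:
    y = w @ x
    # Oja's rule: Hebbian term (lr * y * x) minus a decay term that
    # keeps the weight vector bounded.
    w += lr * y * (x - y * w)

print(w)  # tends toward the principal component direction, here roughly [±1, 0]
```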
Single Layer Perceptrons.
- Structure and learning processes of perceptrons.
- Introduction to pattern classifiers and Bayes' classifiers.
- Application of perceptrons as pattern classifiers.
- Perceptron convergence criteria.
- Limitations inherent to perceptrons.
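A minimal sketch of the perceptron error-correction rule on a toy linearly separable problem; the data, learning rate, and epoch count are illustrative assumptions.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=50):
    """Train a single-layer perceptron; y must contain labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Error-correction update: fires only on a misclassified sample.
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Illustrative linearly separable data: logical AND with {-1, +1} labels.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
print(np.sign(X @ w + b))  # signs should match y: [-1 -1 -1 1]
```

By the perceptron convergence theorem, the loop above terminates with a separating hyperplane whenever one exists, which is exactly why it fails on non-separable problems such as XOR.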
Feedforward ANNs.
- Structures of multi-layer feedforward networks.
- The backpropagation algorithm.
- Training and convergence in backpropagation.
- Functional approximation using backpropagation.
- Practical considerations and design issues in backpropagation learning.
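The sketch below trains a small multi-layer feedforward network on XOR with plain backpropagation; the architecture, learning rate, and iteration count are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: XOR, which a single-layer perceptron cannot separate.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with sigmoid activations (sizes are arbitrary choices).
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the mean squared error via the chain rule.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;  b1 -= lr * d_h.sum(axis=0)

print(out.round(2).ravel())  # should approach [0, 1, 1, 0]
```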
Radial Basis Function (RBF) Networks.
- Pattern separability and interpolation.
- Regularization theory.
- Integration of regularization with RBF networks.
- Design and training of RBF networks.
- Approximation capabilities of RBF networks.
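A compact sketch of RBF network training with a ridge (regularization) term, on toy 1-D data of our own choosing; the centers, width, and regularization strength are illustrative assumptions.

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian radial basis functions evaluated at X for the given centers."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width**2))

# Illustrative 1-D regression: fit noisy samples of sin(x).
rng = np.random.default_rng(1)
X = np.linspace(0, 2 * np.pi, 30)[:, None]
y = np.sin(X).ravel() + rng.normal(0, 0.1, 30)

centers = X[::3]          # subsample the inputs as centers
Phi = rbf_design(X, centers, width=0.5)

lam = 1e-3                # ridge term, as in regularized RBF training
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
print(np.abs(Phi @ w - y).mean())  # mean training error of the fit
```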
Competitive Learning and Self-Organizing ANNs.
- General clustering procedures.
- Learning Vector Quantization (LVQ).
- Algorithms and architectures for competitive learning.
- Self-organizing feature maps.
- Properties of feature maps.
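A minimal 1-D self-organizing map sketch showing the competition, cooperation, and adaptation steps; map size, decay schedules, and data are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative 1-D self-organizing map with 10 units over 2-D data.
n_units, n_steps = 10, 2000
W = rng.uniform(0, 1, (n_units, 2))          # codebook vectors
data = rng.uniform(0, 1, (500, 2))           # toy input distribution

for t in range(n_steps):
    x = data[rng.integers(len(data))]
    # Competition: find the best-matching unit (BMU).
    bmu = np.argmin(((W - x) ** 2).sum(axis=1))
    # Cooperation: neighborhood width and learning rate both decay over time.
    sigma = 3.0 * (0.05 ** (t / n_steps))
    lr = 0.5 * (0.05 ** (t / n_steps))
    dist = np.abs(np.arange(n_units) - bmu)   # distance on the map lattice
    h = np.exp(-dist**2 / (2 * sigma**2))
    # Adaptation: pull each unit toward x, weighted by the neighborhood.
    W += lr * h[:, None] * (x - W)

print(W.round(2))  # codebook should spread in an ordered fashion over the data
```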
Fuzzy Neural Networks.
- Neuro-fuzzy systems.
- Foundations of fuzzy sets and logic.
- Design of fuzzy systems.
- Design of fuzzy ANNs.
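A small sketch of fuzzy set membership, the basic building block of the fuzzy systems above; the linguistic terms and value ranges are invented for illustration.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Illustrative linguistic terms for a temperature variable (ranges are made up).
temp = 22.0
cold = triangular(temp, 0, 10, 20)
warm = triangular(temp, 15, 22, 30)
hot = triangular(temp, 25, 35, 45)
print(cold, warm, hot)  # degrees of membership in each fuzzy set
```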
Applications.
- Discussion of various neural network application examples, highlighting their advantages and challenges.
DAY 2 - MACHINE LEARNING
- The PAC Learning Framework
  - Guarantees for finite hypothesis sets – consistent case
  - Guarantees for finite hypothesis sets – inconsistent case
- General Considerations
  - Deterministic vs. stochastic scenarios
  - Bayes error and noise
  - Estimation and approximation errors
  - Model selection
- Rademacher Complexity and VC Dimension
- Bias-Variance Tradeoff
- Regularization
- Overfitting
- Validation techniques
- Support Vector Machines
- Kriging (Gaussian Process regression)
- PCA and Kernel PCA
- Self-Organizing Maps (SOM)
- Kernel-induced vector spaces
- Mercer Kernels and kernel-induced similarity metrics
- Reinforcement Learning
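To connect SVMs with kernel-induced vector spaces, here is a short sketch using scikit-learn's SVC; the library choice, toy data, and hyperparameters are our own assumptions, not mandated by the course.

```python
import numpy as np
from sklearn.svm import SVC

# Illustrative non-linearly-separable data: points inside vs. outside a circle.
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, (200, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 0.5).astype(int)

# An RBF (Gaussian) Mercer kernel implicitly maps the data into a feature
# space where a linear separator exists; C controls the regularization tradeoff.
clf = SVC(kernel="rbf", C=1.0, gamma=2.0)
clf.fit(X, y)
print(clf.score(X, y))  # training accuracy; near 1.0 for this toy problem
```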
DAY 3 - DEEP LEARNING
This module will be taught in conjunction with the topics covered on Days 1 and 2.
- Logistic and Softmax Regression
- Sparse Autoencoders
- Vectorization, PCA, and Whitening
- Self-Taught Learning
- Deep Networks
- Linear Decoders
- Convolution and Pooling
- Sparse Coding
- Independent Component Analysis
- Canonical Correlation Analysis
- Demos and Applications
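A minimal softmax regression sketch in NumPy, illustrating the first Day 3 topic above; the synthetic data and hyperparameters are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)    # shift logits for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Illustrative 3-class problem: one Gaussian blob per class.
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(m, 0.5, (50, 2)) for m in ([0, 0], [3, 0], [0, 3])])
y = np.repeat(np.arange(3), 50)
Y = np.eye(3)[y]                            # one-hot targets

W = np.zeros((2, 3)); b = np.zeros(3)
lr = 0.1
for _ in range(500):
    P = softmax(X @ W + b)
    # Gradient of the cross-entropy loss with respect to the logits is P - Y.
    G = (P - Y) / len(X)
    W -= lr * X.T @ G
    b -= lr * G.sum(axis=0)

print((P.argmax(axis=1) == y).mean())  # training accuracy on the toy blobs
```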
Requirements
A solid understanding of mathematics is essential.
Familiarity with fundamental statistics is required.
While basic programming skills are not strictly mandatory, they are highly recommended.
Testimonials
Working from first principles in a focused way, and moving to applying case studies within the same day
Maggie Webb - Department of Jobs, Regions, and Precincts
Course - Artificial Neural Networks, Machine Learning, Deep Thinking
It was very interactive and more relaxed and informal than expected. We covered lots of topics in the time, and the trainer was always receptive to talking more in detail or more generally about the topics and how they were related. I feel the training has given me the tools to continue learning, as opposed to it being a one-off session where learning stops once you've finished, which is very important given the scale and complexity of the topic.