Module Availability
Spring

Assessment Pattern

Components of Assessment | Method(s) | Percentage Weighting
Closed-book examination | Written, 2-hour unseen examination | 60%
Assignment | 5-10 page Pattern Recognition Assignment Report | 20%
Laboratory experiment | 5-10 page Laboratory Experiment Report | 20%

Note: Students may choose to complete the laboratory experiment if their time allows; if they decide not to do it, their assignment mark will count for 40% of the module assessment.

Qualifying Condition(s)
A weighted aggregate mark of 50% is required to pass the module.

Module Overview
The module will introduce students to advanced techniques of signal processing and interpretation.

Prerequisites/Co-requisites
Contact the School for details.

Module Aims
- To provide students with advanced analytical tools for solving statistical and adaptive signal processing problems encountered in communications, telematics, and related engineering areas.
- To introduce statistical and adaptive techniques for detection, filtering and matching of signals in noise.

Learning Outcomes
On successful completion of this module, students should:
- understand how the concepts of statistical and adaptive techniques for detection, filtering and matching of signals in noise are expressed mathematically;
- be able to manipulate mathematical models to solve problems and predict effects;
- appreciate the relevance of the presented material to applications in machine perception, and its engineering significance.

Module Content
Lecture Component: Adaptive Digital Filtering (JI). Hours: 10 lecture hours + 2 problem classes
[1] Introduction - Approaches to adaptive filters. State-space model. Cost functions.
[2-3] Correlation matrix, autoregressive and moving-average models.
[4] Spectral analysis.
[5-6] Linear Prediction.
[7-8] Mean Square Estimation - Conditional expectation and orthogonality. Wiener filtering.
[9-10] FIR Adaptive Filters (see the illustrative sketch after this topic list).
[11-12] Problem Classes.
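
As a purely illustrative sketch (not part of the official module materials), the following Python/NumPy fragment shows the kind of algorithm covered under FIR adaptive filters: a least-mean-squares (LMS) weight update that adapts the filter towards the Wiener solution. The function name, filter length, step size and test signal below are hypothetical choices made for this example.

    import numpy as np

    def lms_filter(x, d, num_taps=8, mu=0.01):
        """Adapt an FIR filter so that its output tracks the desired
        signal d from the input x, using the LMS weight update."""
        w = np.zeros(num_taps)            # filter weights
        y = np.zeros(len(x))              # filter output
        e = np.zeros(len(x))              # estimation error
        for n in range(num_taps, len(x)):
            u = x[n - num_taps:n][::-1]   # most recent input samples, newest first
            y[n] = w @ u                  # FIR filter output
            e[n] = d[n] - y[n]            # error against the desired signal
            w = w + 2 * mu * e[n] * u     # stochastic gradient-descent step
        return y, e, w

    # Hypothetical usage: track a sinusoid observed in additive noise.
    rng = np.random.default_rng(0)
    t = np.arange(2000)
    d = np.sin(0.05 * t)                        # desired (clean) signal
    x = d + 0.5 * rng.standard_normal(len(t))   # noisy observation
    y, e, w = lms_filter(x, d)
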
Lecture Component: Statistical Pattern Recognition (JK). Hours: 20 lecture hours + 4 problem classes
[1-2] Elements of Statistical Decision Theory - Model of pattern recognition system. Decision theoretic approach to pattern classification. Bayes decision rule for minimum loss and minimum error rate. Optimum error acceptance trade-off. Learning algorithms.
[3-4] Nearest Neighbour (NN) Technique - 1-NN, k-NN pattern classifiers. Error bounds. Editing techniques.
[5-6] Discriminant Functions: Discriminant functions and learning algorithms. Deterministic learning. The least square criterion and learning scheme. Perceptron. Multilayer Perceptron. Neural nets. Stochastic approximation.
[7] Probability Density Function Estimation - Parzen estimator, k-NN estimator.
[8] Classification Error Rate Estimation - re-substitution method, leave-one-out method, error estimation based on unclassified test samples (a combined k-NN/leave-one-out sketch follows this topic list).
[9-10] Applications - Examples of successful applications of pattern recognition systems. (Reading week)
[11-12] Feature Selection - Concepts and criteria of feature selection. Algorithms for selecting optimal and sub-optimal sets of features. Recursive calculation of parametric separability measures.
[13-14] Feature Extraction - Probabilistic distance measures in feature extraction. Properties of the Karhunen-Loeve expansion, feature extraction techniques based on the Karhunen-Loeve expansion. Discriminant analysis.
[15-16] Cluster Analysis - Concepts of a cluster, dissemblance and resemblance measures, globally sensitive methods, global representation of clusters by pivot points and kernels, locally sensitive methods (methods for seeking valleys in probability density functions), hierarchical methods, minimum spanning tree methods, clustering algorithms.
[17-18] Contextual Classification Methods - The role of context in pattern recognition. Heuristic approaches to contextual pattern recognition. Labelling of objects arranged in networks (chains, regular and irregular lattices). Neighbourhood systems. Elements of compound decision theory.
[19-20] Classifier Fusion - Fusion system architecture. Fusion rules and their properties.
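
Again as a hedged illustration only (the function names and synthetic data are hypothetical, not drawn from the module), this Python/NumPy sketch combines two of the listed topics: a k-nearest-neighbour classifier (topics [3-4]) and the leave-one-out method of classification error rate estimation (topic [8]).

    import numpy as np

    def knn_predict(train_x, train_y, query, k=3):
        """Classify a query point by majority vote among its k nearest
        training samples (Euclidean distance)."""
        dists = np.linalg.norm(train_x - query, axis=1)   # distance to every training sample
        nearest = train_y[np.argsort(dists)[:k]]          # labels of the k closest samples
        labels, counts = np.unique(nearest, return_counts=True)
        return labels[np.argmax(counts)]                  # majority-vote label

    def leave_one_out_error(x, y, k=3):
        """Estimate the error rate by classifying each sample with that
        sample removed from the training set, then counting mistakes."""
        errors = 0
        for i in range(len(x)):
            mask = np.arange(len(x)) != i                 # leave sample i out
            if knn_predict(x[mask], y[mask], x[i], k) != y[i]:
                errors += 1
        return errors / len(x)

    # Hypothetical two-class Gaussian data for demonstration.
    rng = np.random.default_rng(1)
    x = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    print("Leave-one-out error rate:", leave_one_out_error(x, y))
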
Methods of Teaching/Learning
Lectures: 10 weeks, 3 hours per week
Labs: Pattern Recognition Experiment - set and marked by JK, starting in week 8, due in week 11; preparation is via the assignment
Assignment(s): Pattern recognition system design (linked to the Pattern Recognition Lab), issued in week 5, due in week 8

Selected Texts/Journals
Lecture Component: Adaptive Filtering
Poularikas, A. and Ramadan, Z., Adaptive Filtering Primer with MATLAB, Taylor & Francis, ISBN 0849370434
Lecture Component: Statistical Pattern Recognition
Webb, A., Statistical Pattern Recognition, Arnold, ISBN 0340741643 (B)
Therrien, C. W., Decision, Estimation & Classification, Wiley, ISBN 0-471-50416-5 (B)
Devijver, P. A. and Kittler, J., Pattern Recognition: A Statistical Approach, Prentice Hall (B)

Last Updated
29th July 2009