
Module Specifications

Current Academic Year 2023 - 2024

Please note that this information is subject to change.

Module Title Advanced Topics in Machine Learning
Module Code EE613
School School of Electronic Engineering
Module Co-ordinator Semester 1: Kevin McGuinness
Semester 2: Kevin McGuinness
Autumn: Kevin McGuinness
Module Teacher No Teacher Assigned
NFQ level 9 Credit Rating 7.5
Pre-requisite None
Co-requisite None
Compatibles None
Incompatibles None
Repeat the module
GTE modules have no resit opportunity. Students who do not successfully complete the module may retake the module in another year.
Description

This module is aimed at Ph.D. students who are studying topics related to artificial intelligence and machine learning and who need a deep, up-to-date understanding of the state of the art in a fast-moving field. The module will provide students with a broad understanding of the latest theoretical and experimental results in the field, the ability to mathematically formulate machine learning problems, the foundations needed to develop novel approaches, and experience with the peer-review process from the perspective of both author and reviewer. This is an advanced module and requires that students have already taken a first course in machine learning.

Learning Outcomes

1. Demonstrate a critical understanding of a wide variety of topics in machine learning research
2. Formulate and analyze machine learning problems in a mathematically rigorous way
3. Develop novel research in machine learning and document the results in a research paper of publication standard
4. Critically appraise a research paper and productively engage in the peer review process as both an author and a reviewer



Workload: Full-time hours per semester
Type                    Hours   Description
Lecture                 36      Classroom or online lectures
Assignment Completion   12      Online Loop quizzes
Assignment Completion   60      Research Paper
Assignment Completion   8       Peer review
Assignment Completion   12      Rebuttal and final paper
Independent Study       59      Paper reading and background research
Total Workload: 187 hours

All module information is indicative and subject to change. For further information, students are advised to refer to the University's Marks and Standards and Programme Specific Regulations at: http://www.dcu.ie/registry/examinations/index.shtml

Indicative Content and Learning Activities

ML Foundations and Fundamentals
A formal mathematical introduction to the foundations of modern machine learning and neural networks, covering notational conventions, matrix calculus, and stochastic optimization principles.
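
As a concrete instance of the stochastic optimization principles listed above, the mini-batch gradient update can be written as follows; the notation is a standard textbook presentation rather than the specific convention used in lectures.

    % Mini-batch stochastic gradient descent update (standard notation):
    % \theta are the model parameters, \eta the learning rate, and B_t the
    % mini-batch of examples sampled at step t.
    \theta_{t+1} = \theta_t - \eta \, \frac{1}{|B_t|} \sum_{i \in B_t}
        \nabla_{\theta}\, \ell\bigl(f_{\theta}(x_i),\, y_i\bigr)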

Deep Neural Architectures
Provide students with an in-depth understanding of the principles of deep neural architecture engineering, including: vanishing gradients, skip connections, residual connections, separable convolutions, group convolutions, upsampling techniques, neural architecture search, stochastic depth, etc.
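
A minimal PyTorch-style sketch of a residual block may help fix ideas; the framework choice, layer sizes, and layer ordering here are illustrative assumptions rather than material from the module.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ResidualBlock(nn.Module):
        """Two 3x3 convolutions with an identity skip connection."""
        def __init__(self, channels: int):
            super().__init__()
            self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            self.bn1 = nn.BatchNorm2d(channels)
            self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            self.bn2 = nn.BatchNorm2d(channels)

        def forward(self, x):
            # The identity shortcut lets gradients bypass the convolutions,
            # which mitigates vanishing gradients in very deep stacks.
            out = F.relu(self.bn1(self.conv1(x)))
            out = self.bn2(self.conv2(out))
            return F.relu(out + x)

    x = torch.randn(2, 16, 32, 32)           # (batch, channels, height, width)
    print(ResidualBlock(16)(x).shape)        # torch.Size([2, 16, 32, 32])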

Neural Network Optimization and Generalization
Analysis of current trends in optimization and regularization, including transfer learning, domain adaptation, stochastic optimization techniques, and the theory and practice of deep network optimization.
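
The transfer-learning pattern mentioned above can be sketched as follows: freeze a pretrained feature extractor and optimize only a new task head. The tiny stand-in backbone, layer sizes, and hyperparameters are illustrative assumptions; in practice the backbone would be a large pretrained network.

    import torch
    import torch.nn as nn

    # Stand-in for a pretrained feature extractor (its weights would normally
    # be loaded from a model trained on a large source dataset).
    backbone = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 64))
    for p in backbone.parameters():
        p.requires_grad = False              # freeze: no gradient updates

    head = nn.Linear(64, 10)                 # new task-specific classifier

    # Only the head's parameters are optimized; weight_decay is the usual
    # L2 regularizer discussed under regularization.
    opt = torch.optim.SGD(head.parameters(), lr=1e-2, weight_decay=1e-4)

    x, y = torch.randn(32, 128), torch.randint(0, 10, (32,))
    loss = nn.functional.cross_entropy(head(backbone(x)), y)
    loss.backward()
    opt.step()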

Kernel Methods
Discussion of kernel methods and their interaction with modern deep learning. Topics covered will include: the kernel trick, Kernel Principal Component Analysis (KPCA), Kernel Support Vector Machines (SVM), Kernel Ridge Regression, and Neural Tangent Kernels (Jacot et al. 2018).
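
To make the kernel trick concrete, the short NumPy sketch below fits kernel ridge regression with an RBF kernel: the model is trained and evaluated entirely through Gram-matrix entries, never through an explicit feature map. The bandwidth, regularization strength, and toy data are illustrative assumptions.

    import numpy as np

    def rbf_kernel(A, B, gamma=0.5):
        # K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(50, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)

    lam = 1e-2                                            # ridge regularization
    alpha = np.linalg.solve(rbf_kernel(X, X) + lam * np.eye(len(X)), y)

    X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
    y_pred = rbf_kernel(X_test, X) @ alpha                # predict via kernel values
    print(y_pred)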

Probabilistic Unsupervised Models
Discussion of recent developments in unsupervised probabilistic neural models. Topics covered will include: autoregressive models, Masked Autoencoder for Distribution Estimation (MADE) (Germain et al. 2015), Pixel Recurrent Neural Networks (Pixel-RNN) (Van Den Oord et al. 2016) and Pixel-CNN (Salimans et al. 2016), Variational Autoencoders (Kingma et al. 2014), variational lower bounds, and Normalizing Flows (Kobyzev et al. 2016).
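
For orientation, the variational lower bound referred to above takes the following standard form for a variational autoencoder with encoder q_phi(z|x), decoder p_theta(x|z), and prior p(z); the notation follows the usual presentation rather than any specific lecture convention.

    % Evidence lower bound (ELBO) maximized when training a VAE:
    \log p_\theta(x) \;\ge\;
        \mathbb{E}_{q_\phi(z \mid x)}\!\left[ \log p_\theta(x \mid z) \right]
        \;-\; \mathrm{KL}\!\left( q_\phi(z \mid x) \,\|\, p(z) \right)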

Generative Models
Overview of the principles and recent developments in generative neural models. Specific topics covered will include: Generative Adversarial Networks (GANs) (Goodfellow et al. 2014), Wasserstein GAN (Arjovsky et al. 2017), Least Squares-GAN (Mao et al. 2017), BigGAN (Brock et al. 2019), StyleGAN (Karras et al. 2019), ProGAN (Karras et al. 2018), InfoGAN (Chen et al. 2016), Conditional GANs, pix2pix (Isola et al. 2017), CycleGAN (Zhu et al. 2017), Spectral Normalization (Miyato et al. 2018), and methods and metrics for the evaluation of GANs.
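
As a reference point for the GAN variants listed above, the original two-player objective of Goodfellow et al. (2014) is reproduced below in standard notation; D is the discriminator, G the generator, p_data the data distribution, and p_z the prior over latent codes.

    % Minimax objective of the original GAN (Goodfellow et al. 2014):
    \min_G \max_D \;
        \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[ \log D(x) \right]
        + \mathbb{E}_{z \sim p_z}\!\left[ \log\bigl(1 - D(G(z))\bigr) \right]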

Representation Learning
Introduction to recent developments in self-supervised representation learning, semi-supervised techniques, label noise, pseudo-labeling, consistency regularization, and contrastive learning.
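
A simplified sketch of the contrastive objective mentioned above is given below: an InfoNCE-style loss over paired embeddings of two augmented views, where matching pairs sit on the diagonal of the similarity matrix. The temperature, batch size, and the omission of within-view negatives (used by full NT-Xent formulations) are simplifying assumptions.

    import torch
    import torch.nn.functional as F

    def info_nce(z1, z2, temperature=0.1):
        # z1[i] and z2[i] are embeddings of two augmentations of the same input.
        z1 = F.normalize(z1, dim=1)               # unit-length embeddings
        z2 = F.normalize(z2, dim=1)
        logits = z1 @ z2.t() / temperature        # pairwise cosine similarities
        targets = torch.arange(z1.size(0))        # positives on the diagonal
        return F.cross_entropy(logits, targets)

    z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
    print(info_nce(z1, z2))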

Applications
Analysis of the state-of-the-art in various practical applications of machine learning including: image retrieval, visual saliency, semantic segmentation, object detection, machine translation, speech-to-text, and language models.

Assessment Breakdown
Continuous Assessment 100%   Examination Weight 0%
Course Work Breakdown
Type             Description                                                              % of total   Assessment Date
Loop Quiz        Loop quizzes (a mixture of free-form and multiple-choice questions)
                 that test students' understanding of the topics covered                  40%          Every week
Research Paper   Group project (groups of 3): 6-page IEEE-format conference paper
                 describing theoretical or experimental research results                  40%          Week 10
Report(s)        A structured critical peer review of the initial conference papers
                 of 5 other teams, including the provision of reviewer feedback to
                 the authors                                                              10%          Week 11
Research Paper   A response to peer review in the form of a rebuttal and an updated
                 research paper                                                           10%          Week 12
Reassessment Requirement Type
Resit arrangements are explained by the following categories:
1 = A resit is available for all components of the module
2 = No resit is available for 100% continuous assessment module
3 = No resit is available for the continuous assessment component
This module is category 2
Indicative Reading List

  • Ian Goodfellow, Yoshua Bengio, Aaron Courville: 2016, Deep Learning, 1st ed., MIT Press, ISBN 0262035618
  • Christopher M. Bishop: 2006, Pattern Recognition and Machine Learning, Springer-Verlag, ISBN 0387310738
Other Resources

None
Programme or List of Programmes
AMPT   PhD-track
CAPD   PhD
CAPM   MSc
CAPT   PhD-track
EEPD   PhD
EEPM   MEng
EEPT   PhD-track
MEPD   PhD
MEPM   MEng
MEPT   PhD-track
