
Latest Module Specifications

Current Academic Year 2025 - 2026

Module Title: Advanced Topics in Machine Learning
Module Code: EEN1015 (ITS: EE613)
Faculty: Electronic Engineering    School: Engineering & Computing
NFQ Level: 9    Credit Rating: 7.5
Description

This module is aimed at PhD students who are studying topics related to artificial intelligence and machine learning and who need a deep, up-to-date understanding of the state of the art in a fast-moving field. The module will provide students with a broad understanding of the latest theoretical and experimental results in the field, the ability to formulate machine learning problems mathematically, the foundations needed to develop novel approaches, and experience with the peer-review process from the perspectives of both author and reviewer. This is an advanced module and requires that students have already taken a first course in machine learning.

Learning Outcomes

1. Demonstrate a critical understanding of a wide variety of topics in machine learning research
2. Formulate and analyze machine learning problems in a mathematically rigorous way
3. Develop novel research in machine learning and document the results in a research paper that meets publication standards
4. Critically appraise a research paper and productively engage in the peer review process as both an author and a reviewer


Workload: Full-time hours per semester

Type                   Hours   Description
Lecture                36      Classroom or online lectures
Assignment Completion  12      Online Loop quizzes
Assignment Completion  60      Research paper
Assignment Completion  8       Peer review
Assignment Completion  12      Rebuttal and final paper
Independent Study      59      Paper reading and background research

Total Workload: 187 hours
Section Breakdown

CRN: 20131                         Part of Term: Semester 2
Coursework: 0%                     Examination Weight: 0%
Grade Scale: PASS/FAIL             Pass Both Elements: Y
Resit Category: RC1                Best Mark: N
Module Co-ordinator: Noel Murphy   Module Teacher:
Assessment Breakdown

Loop Quiz (40%, every week): Loop quizzes (a mixture of free-form and multiple-choice questions) that test students' understanding of the topics covered.
Research Paper (40%, Week 10): Group project (groups of 3): a 6-page IEEE-format conference paper describing theoretical or experimental research results.
Report(s) (10%, Week 11): A structured critical peer review of the initial conference papers of 5 other teams, including the provision of reviewer feedback to the authors.
Research Paper (10%, Week 12): A response to peer review, in the form of a rebuttal and an updated research paper.
Reassessment Requirement Type
Resit arrangements are explained by the following categories:
RC1: A resit is available for both* components of the module.
RC2: No resit is available for a 100% coursework module.
RC3: No resit is available for the coursework component where there is a coursework and summative examination element.

* ‘Both’ is used in the context of the module having a coursework/summative examination split; where the module is 100% coursework, there will also be a resit of the assessment.

Pre-requisite: None
Co-requisite: None
Compatibles: None
Incompatibles: None

All module information is indicative and subject to change. For further information, students are advised to refer to the University's Marks and Standards and Programme Specific Regulations at: http://www.dcu.ie/registry/examinations/index.shtml

Indicative Content and Learning Activities

ML Foundations and Fundamentals
A formal mathematical introduction to the foundations of modern machine learning and neural networks, covering notational conventions, matrix calculus, and the principles of stochastic optimization.
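
As a minimal worked example of the formalism involved, the stochastic gradient descent update for a loss $L(\theta) = \frac{1}{N}\sum_{i=1}^{N} \ell(x_i; \theta)$ replaces the full gradient with a minibatch estimate:

    \theta_{t+1} = \theta_t - \eta \cdot \frac{1}{|B_t|} \sum_{i \in B_t} \nabla_\theta \ell(x_i; \theta_t)

where $\eta$ is the learning rate and $B_t$ is the minibatch sampled at step $t$; taking $B_t$ to be the full training set recovers ordinary gradient descent.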

Deep Neural Architectures
An in-depth treatment of the principles of deep neural architecture engineering, including vanishing gradients, skip and residual connections, separable and group convolutions, upsampling techniques, neural architecture search, and stochastic depth.
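
For illustration only (a hedged sketch, not module-supplied code), a minimal residual block in PyTorch, assuming the torch package is available; the identity path is what counteracts vanishing gradients in very deep stacks:

    import torch
    import torch.nn as nn

    class ResidualBlock(nn.Module):
        """Two 3x3 convolutions wrapped by an identity skip connection."""
        def __init__(self, channels: int):
            super().__init__()
            self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            self.relu = nn.ReLU()

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # The "+ x" identity path gives gradients a direct route around
            # the convolutional stack, easing optimization of deep networks.
            return self.relu(self.conv2(self.relu(self.conv1(x))) + x)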

Neural Network Optimization and Generalization
Analysis of current trends in optimization and regularization, including transfer learning, domain adaptation, stochastic optimization methods, and the theory and practice of deep network optimization.
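
A hedged sketch of the basic transfer-learning recipe (the backbone below is a randomly initialized stand-in; in practice its weights would come from pretraining on a source task):

    import torch
    import torch.nn as nn

    # Stand-in feature extractor; a real setting would load pretrained weights.
    backbone = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU())
    for p in backbone.parameters():
        p.requires_grad = False            # freeze the transferred features

    head = nn.Linear(128, 10)              # new task-specific classifier
    model = nn.Sequential(backbone, head)

    # Only the head's parameters are optimized on the target task.
    optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)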

Kernel Methods
Discussion of kernel methods and their interaction with modern deep learning. Topics covered will include: the kernel trick, Kernel Principal Component Analysis (KPCA), Kernel Support Vector Machines (SVMs), Kernel Ridge Regression, and Neural Tangent Kernels (Jacot et al. 2018).
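
To make the kernel trick concrete, a small illustrative NumPy sketch of kernel ridge regression (not module-supplied code); note that the data enter only through pairwise kernel evaluations, never through an explicit feature map:

    import numpy as np

    def rbf_kernel(A, B, gamma=1.0):
        # K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)
        sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-gamma * sq_dists)

    def fit(X, y, lam=1e-2, gamma=1.0):
        # Dual solution: alpha = (K + lam * I)^{-1} y
        K = rbf_kernel(X, X, gamma)
        return np.linalg.solve(K + lam * np.eye(len(X)), y)

    def predict(X_train, alpha, X_new, gamma=1.0):
        # f(x) = sum_i alpha_i * k(x_i, x)
        return rbf_kernel(X_new, X_train, gamma) @ alpha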

Probabilistic Unsupervised Models
Discussion of recent developments in unsupervised probabilistic neural models. Topics covered will include: autoregressive models, Masked Autoencoder for Distribution Estimation (MADE) (Germain et al. 2015), Pixel Recurrent Neural Networks (Pixel-RNN) (Van Den Oord et al. 2016) and Pixel-CNN (Salimans et al. 2016), Variational Autoencoders (Kingma et al. 2014), variational lower bounds, and Normalizing Flows (Kobyzev et al. 2020).
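
The variational lower bound (ELBO) referred to here is the objective maximized by a Variational Autoencoder (Kingma et al. 2014): for an encoder $q_\phi(z|x)$, decoder $p_\theta(x|z)$, and prior $p(z)$,

    \log p_\theta(x) \ge \mathbb{E}_{q_\phi(z|x)}[\log p_\theta(x|z)] - \mathrm{KL}(q_\phi(z|x) \,\|\, p(z))

where the gap in the inequality equals $\mathrm{KL}(q_\phi(z|x) \,\|\, p_\theta(z|x))$, so maximizing the bound simultaneously fits the data and pulls the approximate posterior toward the true posterior.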

Generative Models
Overview of the principles and recent developments in generative neural models. Specific topics covered will include: Generative Adversarial Networks (GANs) (Goodfellow et al. 2014), Wasserstein GAN (Arjovsky et al. 2017), Least Squares-GAN (Mao et al. 2017), BigGAN (Brock et al. 2019), StyleGAN (Karras et al. 2019), ProGAN (Karras et al. 2018), InfoGAN (Chen et al. 2016), Conditional GANs, pix2pix (Isola et al. 2017), CycleGAN (Zhu et al. 2017), Spectral Normalization (Miyato et al. 2018), and methods and metrics for the evaluation of GANs.
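
The training criterion underlying several of these variants is the original two-player minimax game of Goodfellow et al. (2014):

    \min_G \max_D \; \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log D(x)] + \mathbb{E}_{z \sim p(z)}[\log(1 - D(G(z)))]

in which the discriminator $D$ learns to separate real from generated samples while the generator $G$ learns to fool it; Wasserstein GAN, for example, replaces this objective with an estimate of the earth mover's distance between the real and generated distributions.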

Representation Learning
Introduction to recent developments in self-supervised representation learning, semi-supervised techniques, label noise, pseudo-labeling, consistency regularization, and contrastive learning.
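
As an illustrative, deliberately simplified sketch of the contrastive objective behind methods such as SimCLR (Chen et al. 2020), the InfoNCE-style loss below treats matching rows of two augmented views as positives and all other in-batch rows as negatives (this is a sketch, not the exact NT-Xent loss):

    import torch
    import torch.nn.functional as F

    def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.1):
        # z1[i] and z2[i] are embeddings of two augmentations of sample i.
        z1 = F.normalize(z1, dim=1)
        z2 = F.normalize(z2, dim=1)
        logits = z1 @ z2.T / tau              # cosine similarities / temperature
        targets = torch.arange(z1.size(0))    # positives lie on the diagonal
        return F.cross_entropy(logits, targets)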

Applications
Analysis of the state-of-the-art in various practical applications of machine learning including: image retrieval, visual saliency, semantic segmentation, object detection, machine translation, speech-to-text, and language models.

Indicative Reading List

Books:
  • Ian Goodfellow, Yoshua Bengio, Aaron Courville: 2016, Deep Learning, 1st ed., MIT Press, 775 pp., ISBN 0262035618
  • Christopher M. Bishop: 2006, Pattern Recognition and Machine Learning, Springer, 738 pp., ISBN 0387310738


Articles:
  • Goodfellow, Ian, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, and Yoshua Bengio: 2014, Generative Adversarial Nets, Advances in Neural Information Processing Systems, p. 2672, http://papers.nips.cc/paper/5423-generative-adversarial-nets
  • Arjovsky, Martin, Soumith Chintala, and Léon Bottou: 2017, Wasserstein Generative Adversarial Networks, Proceedings of the 34th International Conference on Machine Learning, Vol. 70, p. 214, https://arxiv.org/abs/1701.07875
  • Kingma, Diederik P., and Max Welling: 2014, Auto-Encoding Variational Bayes, International Conference on Learning Representations, https://arxiv.org/abs/1312.6114
  • Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin: 2017, Attention Is All You Need, Advances in Neural Information Processing Systems, https://arxiv.org/abs/1706.03762
  • Zhang, Chiyuan, Samy Bengio, Moritz Hardt, Benjamin Recht, and Oriol Vinyals: 2017, Understanding Deep Learning Requires Rethinking Generalization, International Conference on Learning Representations
  • Jacot, Arthur, Franck Gabriel, and Clément Hongler: 2018, Neural Tangent Kernel: Convergence and Generalization in Neural Networks, Advances in Neural Information Processing Systems
  • Belkin, Mikhail, Daniel Hsu, Siyuan Ma, and Soumik Mandal: 2019, Reconciling Modern Machine-Learning Practice and the Classical Bias-Variance Trade-Off, Proceedings of the National Academy of Sciences, Vol. 116, No. 32
  • Karras, T., Laine, S., and Aila, T.: 2019, A Style-Based Generator Architecture for Generative Adversarial Networks, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition
  • Isola, Phillip, Jun-Yan Zhu, Tinghui Zhou, and Alexei A. Efros: 2017, Image-to-Image Translation with Conditional Adversarial Networks, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition
  • Zhu, Jun-Yan, Taesung Park, Phillip Isola, and Alexei A. Efros: 2017, Unpaired Image-to-Image Translation Using Cycle-Consistent Adversarial Networks, Proceedings of the IEEE International Conference on Computer Vision
  • Kobyzev, Ivan, Simon J. D. Prince, and Marcus A. Brubaker: 2020, Normalizing Flows: An Introduction and Review of Current Methods, IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Van Den Oord, Aäron, Nal Kalchbrenner, and Koray Kavukcuoglu: 2016, Pixel Recurrent Neural Networks, International Conference on Machine Learning, Vol. 48, pp. 1747-1756
  • Salimans, Tim, Andrej Karpathy, Xi Chen, and Diederik P. Kingma: 2017, PixelCNN++: Improving the PixelCNN with Discretized Logistic Mixture Likelihood and Other Modifications, International Conference on Learning Representations
  • Chen, Ting, Simon Kornblith, Mohammad Norouzi, and Geoffrey Hinton: 2020, A Simple Framework for Contrastive Learning of Visual Representations, International Conference on Machine Learning, https://arxiv.org/abs/2002.05709
  • He, Kaiming, Haoqi Fan, Yuxin Wu, Saining Xie, and Ross Girshick: 2020, Momentum Contrast for Unsupervised Visual Representation Learning, Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition
  • Germain, Mathieu, Karol Gregor, Iain Murray, and Hugo Larochelle: 2015, MADE: Masked Autoencoder for Distribution Estimation, International Conference on Machine Learning
  • Miyato, Takeru, Toshiki Kataoka, Masanori Koyama, and Yuichi Yoshida: 2018, Spectral Normalization for Generative Adversarial Networks, International Conference on Learning Representations
  • Chen, Xi, Yan Duan, Rein Houthooft, John Schulman, Ilya Sutskever, and Pieter Abbeel: 2016, InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets, Advances in Neural Information Processing Systems
  • Karras, Tero, Timo Aila, Samuli Laine, and Jaakko Lehtinen: 2018, Progressive Growing of GANs for Improved Quality, Stability, and Variation, International Conference on Learning Representations
  • Mao, Xudong, Qing Li, Haoran Xie, Raymond Y. K. Lau, Zhen Wang, and Stephen Paul Smolley: 2017, Least Squares Generative Adversarial Networks, International Conference on Computer Vision
  • Brock, Andrew, Jeff Donahue, and Karen Simonyan: 2019, Large Scale GAN Training for High Fidelity Natural Image Synthesis, International Conference on Learning Representations
Other Resources

None
