ECE 5973-901: Artificial Neural Networks and Applications

Artificial neural networks were introduced in the 1950s. In the last decade, however, there has been a strong resurgence of neural networks as processing techniques, and they have been applied to many real-world problems. This has led to numerous breakthroughs in image, video, and natural language processing applications.

This course aims to be quite hands-on and should provide students with sufficient detail to quickly apply the techniques to their own research. In particular, applications relating to computer vision and natural language processing will be discussed. There may be some math, but we will not spend too much time on proofs. Instead, we will try to go through (not exhaustively) some of the free libraries such as Caffe and Torch, and you are definitely encouraged to explore and leverage them for your course project.


Textbook

  • Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, MIT Press.

It is not required but is a very good reference.


Some Deep Learning Toolboxes and Libraries

  • TensorFlow: From Google; probably the most popular package. Not quite optimized for a single PC

  • Caffe2: From Facebook

  • Caffe: From Berkeley

  • Torch 7: From NYU, and used by Facebook/Twitter

  • PyTorch: The Python version of Torch

  • Theano: From Bengio's group in Montreal

  • Keras: High-level layer on top of Theano/Tensorflow

  • Lasagne: High-level layer on top of Theano

  • MatConvNet: From Oxford; somewhat restricted in scope

  • mxnet: From Amazon

  • Neon: From Intel

  • Deeplearning4j

Office Hours

There are no “regular” office hours; you are welcome to catch me anytime or contact me through email.

Course Syllabus (Tentative)

  • Overview of machine learning

  • History of artificial neural networks

  • Perceptrons

  • Backpropagation algorithms

  • Regularization and dropout

  • Weight initialization

  • Optimization methods

  • Convolutional neural networks (CNN)

  • R-CNN, faster R-CNN

  • Weight visualization, deep visualization, t-SNE, DeepDream

  • Recurrent neural networks

  • LSTM networks

  • Restricted Boltzmann machines

  • Autoencoders

  • Deep belief networks


Your projects will be presented in class on May 5. You may include an additional written report if you would like to further clarify anything.


Grading

  • Activities: 30%. Quizzes, paper reviews, presentations, etc.

  • Homework: 30%. Programming assignments.

  • Final Project: 40%.


Prerequisites

Calculus (MATH 1914 or equivalent), linear algebra (MATH 3333 or equivalent), basic probability (MATH 4733 or equivalent), and intermediate programming skills (experience with Python/NumPy is preferred).

Note that we will “borrow” programming assignments from Stanford CS231n, so the ability to program in Python is expected. Python is not difficult if you are familiar with any other high-level general-purpose programming language such as C/C++/C#/Java/JavaScript/Perl/Matlab. If you don't know anything about Python, I would recommend trying out this app.
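As a rough self-check of the expected Python/NumPy level, here is a minimal sketch of the kind of vectorized code that comes up in such programming assignments (the function and the sample values are illustrative, not taken from any specific assignment):

```python
import numpy as np

def softmax(scores):
    """Numerically stable softmax over the last axis of a score array."""
    # Subtracting the row-wise max avoids overflow in np.exp
    shifted = scores - scores.max(axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=-1, keepdims=True)

# One row of class scores -> a probability distribution over classes
scores = np.array([[1.0, 2.0, 3.0]])
probs = softmax(scores)
```

If expressions like `axis=-1, keepdims=True` and array broadcasting look unfamiliar, it is worth reviewing a NumPy tutorial before the first assignment.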


Course Schedule

  • 1/16: Overview; AI, machine learning and its types; artificial neural networks and their history. Materials: overview. Further reading/watching: Andrew Ng, "Artificial Intelligence is the New Electricity".

  • 1/18: Machine learning basics; linear regression; regularization. Materials: classification and regression.