Lecture 1: Deep Learning Fundamentals (Full Stack Deep Learning Spring 2021)
In this video, we discuss the fundamentals of deep learning. We will cover artificial neural networks, the universal approximation theorem, the three major types of learning problems, the empirical risk minimization problem, the idea behind gradient descent, the practice of backpropagation, the core neural architectures, and the rise of GPUs. This should be a review for most of you; if it is not, briefly go through this online book.

Outline:
0:00 Intro
1:25 Neural Networks
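As a quick refresher on two of the ideas listed above, here is a minimal sketch (not from the lecture) of empirical risk minimization with gradient descent on a toy linear-regression problem; the dataset, learning rate, and step count are illustrative choices, not anything prescribed by the course.

```python
# Minimal sketch: minimize the empirical risk (mean squared error) of a
# linear model y = w*x + b with plain gradient descent. Toy data and
# hyperparameters are illustrative assumptions, not from the lecture.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: y = 3x + 1 plus a little noise.
x = rng.uniform(-1, 1, size=(100, 1))
y = 3 * x + 1 + 0.1 * rng.normal(size=(100, 1))

w, b = 0.0, 0.0   # model parameters
lr = 0.1          # learning rate

for step in range(200):
    y_hat = w * x + b                  # model predictions
    residual = y_hat - y
    risk = np.mean(residual ** 2)      # empirical risk (MSE over the dataset)
    # Gradients of the empirical risk with respect to w and b.
    grad_w = 2 * np.mean(residual * x)
    grad_b = 2 * np.mean(residual)
    # Gradient descent update: step against the gradient.
    w -= lr * grad_w
    b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}, final risk={risk:.4f}")  # w near 3, b near 1
```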