
Probabilistic Deep Learning with TensorFlow 2
Imperial College London

The focus of this course is the TensorFlow Probability library. Spoiler alert! Probability distributions are important factors you need to consider. From now on, building a model is no longer as simple as adding layers and squeezing your GPU to calculate weights.

This is a challenging course. From time to time, you may feel that you need to revisit the basics of probability and statistics. There is also a gap between the math equations and the Python code that you need to jump over. Monte Carlo is more than a name; it is actually used quite often.



The course is split into 4 parts. The first part is about distribution objects in the TensorFlow Probability library. Distributions are the basic building blocks that we are going to train to obtain approximations. On one hand, you need basic knowledge of various commonly used distributions. On the other hand, you need to get familiar with some conventions the library adopts, for instance the sample, batch, and event shapes, as in the sketch below.
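Here is a minimal sketch of those shapes, assuming TensorFlow 2.x and the tensorflow-probability package; the particular distributions and parameter values are just illustrative.

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# A batch of 3 independent univariate normals: batch_shape=[3], event_shape=[].
normals = tfd.Normal(loc=[-1., 0., 1.], scale=[0.5, 1., 2.])
print(normals.batch_shape, normals.event_shape)  # [3] []

# A single 2-dimensional multivariate normal: batch_shape=[], event_shape=[2].
mvn = tfd.MultivariateNormalDiag(loc=[0., 0.], scale_diag=[1., 1.])
print(mvn.batch_shape, mvn.event_shape)          # [] [2]

# Drawing 5 samples prepends a sample dimension to the shape.
print(normals.sample(5).shape)  # (5, 3)
print(mvn.sample(5).shape)      # (5, 2)
```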

The second part is about building models. Aleatoric and epistemic uncertainty are the problems we want to address. To be honest, these are not simple topics: I often paused the lecture video and pondered for a while. The beauty of the TensorFlow Probability library is that it makes these abstract concepts concrete and tangible. However, without solid skills in both math and programming, you are unlikely to make progress.
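As one example of the idea, here is a minimal sketch of capturing aleatoric uncertainty by letting the network output a distribution rather than a point estimate, using a DistributionLambda layer. The layer sizes and input shape are assumptions for illustration, not the course's exact model.

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions
tfpl = tfp.layers

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(1,)),
    # Two outputs per example: one for the mean, one (after softplus) for the scale.
    tf.keras.layers.Dense(2),
    tfpl.DistributionLambda(
        lambda t: tfd.Normal(loc=t[..., :1],
                             scale=tf.math.softplus(t[..., 1:]))),
])

# Train by maximizing the likelihood of the data under the predicted distribution,
# i.e. minimizing the negative log-likelihood.
negloglik = lambda y, p_y: -p_y.log_prob(y)
model.compile(optimizer='adam', loss=negloglik)
```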

The third part teaches bijector objects. They are used to transform tensor objects, and they form the basis of normalizing flow models. You probably need to revisit the concepts of the determinant and the Jacobian. Don't miss any of the readings in the course; they are very valuable.
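A minimal sketch of the mechanism, assuming the same tensorflow-probability setup as above: pushing a standard normal through the Exp bijector yields a log-normal, and the log-probability automatically accounts for the Jacobian of the transformation.

```python
import tensorflow_probability as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

# A log-normal built as a transformed distribution.
log_normal = tfd.TransformedDistribution(
    distribution=tfd.Normal(loc=0., scale=1.),
    bijector=tfb.Exp())

x = log_normal.sample(3)
# log_prob applies the inverse transform and the log-Jacobian determinant internally.
print(log_normal.log_prob(x))
```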

The last part, the final challenge, is variational autoencoders. Congratulations in advance! You are about to complete the adventure, provided you neither give up nor give in. The Kullback-Leibler divergence is the most important concept; I recommend everyone firmly grasp it before starting to read the Python code.
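For a concrete feel, here is a minimal sketch of the analytic KL divergence between two distributions, the quantity that appears in the VAE's ELBO loss; the specific parameter values are made up for illustration.

```python
import tensorflow_probability as tfp

tfd = tfp.distributions

q = tfd.Normal(loc=0.5, scale=1.2)   # e.g. the encoder's approximate posterior
p = tfd.Normal(loc=0., scale=1.)     # the prior
print(tfd.kl_divergence(q, p))       # closed-form KL(q || p) for two normals
```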

What do you think about the fact that the decoder corresponds to the likelihood in Bayesian inference and the encoder corresponds to the posterior? Encoding information into another space and then decoding it back? It sparks my desire to pay information theory a visit.

Quick Recap



My Certificate

Certificate Probabilistic Deep Learning with TensorFlow 2
Probabilistic Deep Learning with TensorFlow 2
https://coursera.org/share/4e610bd8c7f1d024f744828cdc4d1d31

More in the Specialization


I am Kesler Zhu, thank you for visiting my website. Check out more course reviews at https://KZHU.ai

Don't forget to sign up for the newsletter, and don't miss any chance to learn.

Or share what you've learned with friends!
