Certificate Probabilistic Deep Learning with TensorFlow 2

My #118 certificate from Coursera

Posted on November 10, 2022 (updated November 20, 2022) by keslerzhu

Probabilistic Deep Learning with TensorFlow 2
Imperial College London

The focus of this course is the TensorFlow Probability library. Spoiler alert: probability distributions become first-class citizens of your models. From now on, building a model is no longer as simple as stacking layers and squeezing your GPU to fit the weights.

This is a challenging course. From time to time you may feel the need to revisit the basics of probability and statistics. There is also a gap between the math equations and the Python code that you need to jump over. Monte Carlo is more than a name; it is actually used quite often.



The course is split into 4 parts. The first part is about the distribution objects in the TensorFlow Probability library. Distributions are the basic building blocks that we train to approximate data. On one hand, you need basic knowledge of the commonly used distributions; on the other hand, you need to get familiar with some conventions the library adopts, for instance the sample, batch, and event shapes.

The second part is about building models. Aleatoric and epistemic uncertainty are the problems we want to capture. To be honest, these concepts are not simple or easy; I often paused the lecture video and pondered for a while. The beauty of the TensorFlow Probability library is that it makes the abstract concepts concrete and tangible. However, without solid skills in both math and programming, you are unlikely to make progress.

The third part teaches bijector objects. They are used to transform tensors, and they form the basis of normalizing flow models. You will probably need to revisit the concepts of the determinant and the Jacobian. Don't skip any reading in the course; they are very valuable.

In the last part, the final challenge is variational autoencoders. Congratulations in advance! You are about to complete the adventure, if you neither give up nor give in. The Kullback-Leibler divergence is the most important concept here; I recommend everyone firmly grasp it before starting to read the Python code.

What do you think about the fact that the decoder corresponds to the likelihood in Bayesian inference, and the encoder to the posterior? Encoding information into another space and then decoding it back? It sparks my desire to pay information theory a visit.

Quick Recap

Distribution Objects in TensorFlow Probability
TensorFlow: Probabilistic Deep Learning Models
TensorFlow: Normalizing Flow Models
Variational Autoencoders


My Certificate

Probabilistic Deep Learning with TensorFlow 2
https://coursera.org/share/4e610bd8c7f1d024f744828cdc4d1d31

More in the Specialization

My #8 specialization certificate from Coursera

I am Kesler Zhu, thank you for visiting my website. Checkout more course reviews at https://KZHU.ai

