Probabilistic Graphical Models 1: Representation

Rating: 4.64

Course overview

Provider: Coursera
Course type: Free online course
Level: Advanced
Deadline: Flexible
Duration: 67 hours
Certificate: Paid Certificate Available
Course author: Daphne Koller

Description

Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of interacting random variables. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, and natural language processing, and they are a foundational tool for formulating many machine learning problems.

This course is the first in a sequence of three. It describes the two basic PGM representations: Bayesian networks, which rely on a directed graph, and Markov networks, which use an undirected graph. The course discusses both the theoretical properties of these representations and their use in practice. The (highly recommended) honors track contains several hands-on assignments on representing real-world problems. The course also presents important extensions beyond the basic PGM representation that allow more complex models to be encoded compactly.
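
As a rough, self-contained illustration of the factorization idea behind Bayesian networks (a sketch for orientation, not taken from the course materials), the Python snippet below encodes a toy rain/sprinkler/wet-grass network as conditional probability tables and evaluates the joint distribution via the chain rule P(R, S, W) = P(R) * P(S | R) * P(W | S, R). The variable names and probability values are made up for this example.

    from itertools import product

    # Toy Bayesian network (illustrative numbers only):
    #   Rain -> Sprinkler, and both Rain and Sprinkler -> WetGrass.
    P_rain = {True: 0.2, False: 0.8}            # P(Rain)
    P_sprinkler_given_rain = {                  # P(Sprinkler | Rain)
        True:  {True: 0.01, False: 0.99},
        False: {True: 0.40, False: 0.60},
    }
    P_wet_given_sprinkler_rain = {              # P(WetGrass | Sprinkler, Rain), keyed by (sprinkler, rain)
        (True, True):   {True: 0.99, False: 0.01},
        (True, False):  {True: 0.90, False: 0.10},
        (False, True):  {True: 0.80, False: 0.20},
        (False, False): {True: 0.00, False: 1.00},
    }

    def joint(rain, sprinkler, wet):
        """Chain-rule factorization: P(R, S, W) = P(R) * P(S | R) * P(W | S, R)."""
        return (P_rain[rain]
                * P_sprinkler_given_rain[rain][sprinkler]
                * P_wet_given_sprinkler_rain[(sprinkler, rain)][wet])

    # Sanity check: the factored joint sums to 1 over all eight assignments.
    print(sum(joint(r, s, w) for r, s, w in product([True, False], repeat=3)))

    # Marginal P(WetGrass = True), computed by brute-force enumeration.
    print(sum(joint(r, s, True) for r, s in product([True, False], repeat=2)))

Enumerating every assignment like this is only feasible for tiny networks; the representational payoff is that the joint distribution is specified by small local tables over each variable and its parents, which for larger networks takes far fewer parameters than writing out the full joint. That is the compactness the description above refers to.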

Similar courses

Machine Learning
  • Flexible deadline
  • 61 hours
  • Certificate
Neural Networks and Deep Learning
  • Flexible deadline
  • 27 hours
  • Certificate
Introduction to Machine Learning in Production
  • Flexible deadline
  • 10 hours
  • Certificate