
Lecture 13: Generative Models



Transcription of Lecture 13: Generative Models

Fei-Fei Li & Justin Johnson & Serena Yeung, May 18, 2017.

Administrative: Midterm grades released on Gradescope this week. A3 due next Friday, 5/26. HyperQuest deadline extended to Sunday 5/21, 11:59pm. Poster session is June 6.

Overview: unsupervised learning; generative models: PixelRNN and PixelCNN, variational autoencoders (VAE), generative adversarial networks (GAN).

Supervised vs Unsupervised Learning

Supervised learning. Data: (x, y), where x is data and y is label. Goal: learn a function to map x -> y. Examples: classification, regression, object detection, semantic segmentation, image captioning, etc.

[Slide figures: classification (a cat image labeled "Cat"); object detection (boxes labeled "DOG, DOG, CAT"); semantic segmentation (regions labeled "GRASS, CAT, TREE, SKY"). Images are CC0 public domain.]

[Slide figure: image captioning ("A cat sitting on a suitcase on the floor"). Caption generated using neuraltalk2; image is CC0 public domain.]

Unsupervised learning. Data: x. Just data, no labels! Goal: learn some underlying hidden structure of the data. Examples: clustering, dimensionality reduction, feature learning, density estimation, etc.

[Slide figures: k-means clustering (CC0 public domain); principal component analysis reducing 3-d data to 2-d (image from Matthias Scholz, CC0 public domain); autoencoders for feature learning.]
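As a concrete instance of the clustering example above, here is a minimal k-means sketch in NumPy. This is generic textbook k-means, not code from the lecture; the function name and defaults are illustrative.

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Cluster the rows of X (n, d) into k groups; returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]  # random init
    for _ in range(n_iters):
        # Assignment step: give each point the label of its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points.
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):  # converged
            break
        centroids = new
    return centroids, labels
```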

[Slide figures: 1-d density estimation (figure copyright Ian Goodfellow, 2016, reproduced with permission); 2-d density estimation (images CC0 public domain).]

Side by side: supervised learning has data (x, y), where x is data and y is label, and the goal is to learn a function mapping x -> y (classification, regression, object detection, semantic segmentation, image captioning, etc.). Unsupervised learning has just data x, no labels, and the goal is to learn some underlying hidden structure of the data (clustering, dimensionality reduction, feature learning, density estimation, etc.). Unlabeled training data is cheap; the holy grail is to solve unsupervised learning, i.e., understand the structure of the visual world.

Generative Models

Given training data, generate new samples from the same distribution. Training data ~ p_data(x); generated samples ~ p_model(x). We want to learn p_model(x) similar to p_data(x).
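The slide leaves "similar" informal. One standard way to make it precise (added here for reference, not stated on the slide) is maximum likelihood, which is equivalent to minimizing the KL divergence from the data distribution to the model distribution:

```latex
\theta^{*}
= \arg\max_{\theta}\, \mathbb{E}_{x \sim p_{\text{data}}}\big[\log p_{\theta}(x)\big]
= \arg\min_{\theta}\, D_{\mathrm{KL}}\big(p_{\text{data}}(x)\,\big\|\,p_{\theta}(x)\big)
```

The equivalence holds because D_KL(p_data || p_theta) = E[log p_data(x)] - E[log p_theta(x)], and the first term does not depend on theta.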

Learning p_model addresses density estimation, a core problem in unsupervised learning. Several flavors:
- Explicit density estimation: explicitly define and solve for p_model(x).
- Implicit density estimation: learn a model that can sample from p_model(x) without explicitly defining it.

Why Generative Models?
- Realistic samples for artwork, super-resolution, colorization, etc.
- Generative models of time-series data can be used for simulation and planning (reinforcement learning applications!).
- Training generative models can also enable inference of latent representations that can be useful as general features.
Figures, from left to right, are copyright: (1) Alec Radford et al. 2016; (2) David Berthelot et al. 2017; (3) Phillip Isola et al. 2017. Reproduced with the authors' permission.

Taxonomy of Generative Models (figure copyright and adapted from Ian Goodfellow, Tutorial on Generative Adversarial Networks, 2017):
- Explicit density, tractable density: fully visible belief nets (NADE, MADE, PixelRNN/CNN) and change of variables models (nonlinear ICA).
- Explicit density, approximate density: variational (variational autoencoder) and Markov chain (Boltzmann machine).
- Implicit density: direct (GAN) and Markov chain (GSN).

Today we discuss the 3 most popular types of generative models: PixelRNN/PixelCNN, variational autoencoders, and GANs.

PixelRNN and PixelCNN

Fully visible belief network. Explicit density model: use the chain rule to decompose the likelihood of an image x into a product of 1-d distributions,

    p(x) = \prod_{i=1}^{n} p(x_i \mid x_1, \ldots, x_{i-1}),

where p(x) is the likelihood of image x and each factor is the probability of the i-th pixel value given all previous pixels. Then maximize the likelihood of the training data. The distribution over pixel values is complex, so express it with a neural network! This also requires defining an ordering of "previous pixels".
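To make the objective concrete, here is a hedged sketch of the training loss in PyTorch. `model` is a hypothetical autoregressive network (e.g., a PixelRNN/PixelCNN-style model) whose 256-way output at each pixel depends, by construction, only on previous pixels; this interface is an assumption, not the lecture's code.

```python
import torch
import torch.nn.functional as F

def image_nll(model, x):
    """x: (batch, 1, H, W) integer pixel values in [0, 255].
    Returns -log p(x) per image, summed over pixels."""
    logits = model(x.float() / 255.0)              # (batch, 256, H, W)
    log_probs = F.log_softmax(logits, dim=1)       # per-pixel log p(x_i | x_<i)
    ll = log_probs.gather(1, x.long()).squeeze(1)  # log-prob of the observed values
    return -ll.sum(dim=(1, 2))                     # maximize likelihood = minimize this
```

Because every conditional is evaluated against the ground-truth image (teacher forcing), all pixels are scored in one forward pass, so training is parallel even though generation is sequential.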

PixelRNN [van den Oord et al. 2016]. Generate image pixels starting from a corner; the dependency on previous pixels is modeled with an RNN (LSTM). PixelCNN [van den Oord et al. 2016] instead models the dependency with a masked convolution over the context region, which makes training faster (generation must still proceed sequentially, so it remains slow).

Improving PixelCNN performance:
- Gated convolutional layers
- Short-cut connections
- Discretized logistic loss
- Multi-scale
- Training tricks
- Etc.
See van den Oord et al. NIPS 2016 and Salimans et al. 2017 (PixelCNN++).

Pros:
- Can explicitly compute likelihood p(x)
- Explicit likelihood ...
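For reference, here is a minimal sketch of the masked convolution PixelCNN uses to enforce the pixel ordering, plus the sequential raster-order sampling loop. This is an illustrative PyTorch sketch under assumed interfaces, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedConv2d(nn.Conv2d):
    """Convolution whose kernel is zeroed at/after the center, so the output at
    (r, c) sees only pixels above or to the left. Mask type 'A' (first layer)
    also hides the center pixel itself; type 'B' (later layers) keeps it."""
    def __init__(self, mask_type, *args, **kwargs):
        super().__init__(*args, **kwargs)
        kh, kw = self.kernel_size
        mask = torch.ones_like(self.weight)
        mask[:, :, kh // 2, kw // 2 + (mask_type == 'B'):] = 0  # center row, at/after center
        mask[:, :, kh // 2 + 1:] = 0                            # every row below center
        self.register_buffer('mask', mask)

    def forward(self, x):
        return F.conv2d(x, self.weight * self.mask, self.bias,
                        self.stride, self.padding)

@torch.no_grad()
def sample(model, h, w, n=1):
    """Generate n images pixel by pixel in raster order, drawing each pixel
    from the model's 256-way softmax given the pixels generated so far."""
    x = torch.zeros(n, 1, h, w)
    for r in range(h):
        for c in range(w):
            logits = model(x)[:, :, r, c]                     # (n, 256)
            pixel = torch.multinomial(F.softmax(logits, dim=-1), 1)
            x[:, 0, r, c] = pixel.squeeze(1).float() / 255.0  # write back, in [0, 1]
    return x
```

Stacking one type-'A' layer followed by type-'B' layers keeps every output's receptive field strictly over previous pixels, which is why one forward pass suffices for training while sampling still costs H x W sequential passes.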

