Introduction to Probabilistic Programming - PyMC3 and Edward
Probabilistic programming languages differ from deterministic ones by allowing language primitives to be stochastic. In other words, instead of being restricted to deterministic assignments such as:
rent = 25000
one can specify a probability distribution from which such a rent was drawn:
rent ~ Normal(mu=25000, sigma=1000)
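The difference can be sketched in plain NumPy before bringing in a probabilistic programming framework (a minimal sketch; the variable names and the 10,000-draw sample size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# Deterministic assignment: rent is one fixed number.
rent_fixed = 25000

# Stochastic assignment: rent is a Normal(25000, 1000) random variable,
# so each draw is a plausible value rather than a single fixed one.
rent_draws = rng.normal(loc=25000, scale=1000, size=10_000)

print(rent_fixed)
print(rent_draws.mean())  # close to 25000
print(rent_draws.std())   # close to 1000
```

In PyMC3 or Edward the same statement becomes a model declaration rather than a sampling call, which is what lets inference run backwards from data to distribution.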
The expressiveness of the probabilistic programming framework, both theoretical and practical, lets us go further and replace the parameters of Machine Learning algorithms with distributions. How do we do that?
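As a small worked example of a parameter becoming a distribution, the rent prior above can be updated with observed rents using a conjugate Normal-Normal step (plain NumPy; the simulated data and the observation noise of 2000 are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Prior belief about the mean rent: Normal(25000, 1000).
prior_mu, prior_sigma = 25000.0, 1000.0

# Hypothetical observed rents with known observation noise sigma = 2000.
obs_sigma = 2000.0
obs = rng.normal(26000, obs_sigma, size=50)

# Conjugate Normal-Normal update: the posterior over the mean rent is
# again a Normal, pulled from the prior toward the observed data, and
# narrower than the prior because the data adds information.
n = len(obs)
post_var = 1.0 / (1.0 / prior_sigma**2 + n / obs_sigma**2)
post_mu = post_var * (prior_mu / prior_sigma**2 + obs.sum() / obs_sigma**2)

print(post_mu)            # between the prior mean and the sample mean
print(np.sqrt(post_var))  # smaller than the prior sigma of 1000
```

Frameworks like PyMC3 and Edward generalize exactly this step to models with no closed-form posterior, using MCMC or variational inference instead of a conjugacy formula.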
With growing concerns about trust in black-box AI, and in cases of small data, why will probabilistic programming help?
PyMC3, Edward, TensorFlow Probability: where do I start?
I’m so used to black-box ML; how do I wear a Bayesian hat?
This talk tries to answer these questions.
Technically, the talk will help you get started with coding in PyMC3 and Edward and understand their strengths and weaknesses, starting from Bayesian inference and moving on to applying the same concepts to ML. In that sense, you will get an overall idea of how and where probabilistic programming helps. Code and graphs will be shown via a Jupyter Notebook.
A basic understanding of widely used probability distributions such as Normal, Poisson, and Binomial; a basic understanding of Machine Learning and neural networks; and Python.
It would be easier if you have Jupyter, PyMC3 and Edward installed, apart from the usual suspects like numpy/pandas/seaborn.
You might want to install a TensorFlow version < 1.7 for Edward compatibility.
Following are the pip packages I have installed for this session:
I’m Hariharan, and I’m usually curious and love learning new things. I graduated from BITS – Pilani and have since been in the industry for roughly 7 years, working predominantly in the field of Machine Learning. I love watching football and cricket. I used to love playing them, not anymore 😊. I like quizzing, despite not being good at it.