Introducing Stan2tfp - a lightweight interface for the Stan-to-TensorFlow Probability compiler

TL;DR The new Stan compiler has an alternative backend that allows you to do this: stan2tfp is a lightweight interface for this …

Mr. P meets TFP - mixed effects model with post-stratification in TensorFlow Probability

TL;DR We’ll: Learn an interesting method for generalizing inferences from a biased sample to a population of interest See why …

Bayesian golf puttings, NUTS, and optimizing your sampling function with TensorFlow Probability

TL;DR We’ll: Port a great Bayesian modelling tutorial from Stan to TFP Discuss how to speed up our sampling function Use the …

Survival analysis, censoring and hacking the log_prob in TensorFlow Probability

TL;DR Survival analysis is a super useful technique for modelling time-to-event data; implementing a simple survival analysis using TFP …

Varying Slopes Models and the CholeskyLKJ distribution in TensorFlow Probability

TL;DR Covariance matrices allow us to capture parameter correlations in multivariate hierarchical models; sampling these using …

Adam Haber

Computational Neuroscience PhD Student

Weizmann Institute of Science

About me

I’m a PhD student at the Weizmann Institute of Science, under the supervision of Prof. Elad Schneidman. I have a BA in Physics and Mathematics from the Hebrew University. My work focuses on finding principles underlying structure-to-function relations in small neuronal networks. To address such questions, I combine simulation-based approaches with information-theoretic tools and maximum entropy modelling.

I also work as a data scientist at Atidot, where I focus on machine-learning approaches to actuarial challenges such as mortality risk assessment and churn prediction.

Besides my own research topics, I’m very interested in scientific computing in general, and probabilistic programming in particular. I constantly try to expand my knowledge in statistics and probability theory, mostly by following interesting people on Twitter (and refs within). I especially enjoy (but rarely truly understand) topics that lie at the intersection of statistics and geometry, such as information geometry and computational optimal transport.

I’m married to Inbar, and together we’re running a life-long longitudinal study on a pair of twins. I love hiking, rock climbing, eating, and dogs.


  • Computational Neuroscience
  • Probabilistic Programming
  • Bayesian Machine Learning


  • PhD in Computational Neuroscience, 2015-

    Weizmann Institute

  • BSc in Physics and Mathematics, 2005-2008

    Hebrew University


Learning the architectural features that predict functional similarity of neural networks

The mapping of the wiring diagrams of neural circuits promises to allow us to link structure and function of neural networks. Current …