# Bayesian Statistics using Julia and Turing

*Bayesian for Everyone*

Welcome to the repository of tutorials on how to do **Bayesian Statistics** using **Julia** and **Turing**. Tutorials are available at storopoli.io/Bayesian-Julia.

**Bayesian statistics** is an approach to inferential statistics based on Bayes' theorem, where available knowledge about parameters in a statistical model is updated with the information in observed data. The background knowledge is expressed as a prior distribution and combined with observational data in the form of a likelihood function to determine the posterior distribution. The posterior can also be used for making predictions about future events.
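The updating described above is just Bayes' theorem applied to a parameter $\theta$ and observed data $y$:

```math
\underbrace{P(\theta \mid y)}_{\text{posterior}} = \frac{\overbrace{P(y \mid \theta)}^{\text{likelihood}} \; \overbrace{P(\theta)}^{\text{prior}}}{\underbrace{P(y)}_{\text{evidence}}}
```

The prior $P(\theta)$ encodes background knowledge, the likelihood $P(y \mid \theta)$ carries the information in the data, and the evidence $P(y)$ is a normalizing constant.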

**Bayesian statistics** is a departure from classical inferential statistics, which prohibits probability statements about parameters and is based on the asymptotic behavior of hypothetically infinite samples from a theoretical population, finding the parameter values that maximize the likelihood function. Most notorious is null-hypothesis significance testing (NHST) based on *p*-values. Bayesian statistics **incorporates uncertainty** (and prior knowledge) by allowing probability statements about parameters, and the process of inferring parameter values follows directly from **Bayes' theorem**.
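To make the prior-to-posterior updating concrete, here is a small self-contained sketch in plain Julia using the conjugate Beta-Binomial model (the prior and data values are made up for illustration; they are not from the tutorials):

```julia
# Conjugate Beta-Binomial updating: with a Beta(α, β) prior on the
# probability of heads p, observing k heads in n flips yields a
# Beta(α + k, β + n - k) posterior in closed form.
α, β = 1.0, 1.0                          # Beta(1, 1): a uniform prior on p
k, n = 7, 10                             # illustrative data: 7 heads in 10 flips

post_α, post_β = α + k, β + (n - k)      # posterior: Beta(8, 4)
post_mean = post_α / (post_α + post_β)   # posterior mean of p

println(post_mean)                       # ≈ 0.667: pulled from the prior mean
                                         # (0.5) toward the sample proportion (0.7)
```

Note how the posterior mean lies between the prior mean and the observed proportion: this compromise between prior knowledge and data is the essence of Bayesian updating.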

## Julia

**Julia** is a fast, dynamically-typed language that just-in-time (JIT) compiles into native code using LLVM. It "runs like C but reads like Python", meaning that it is *blazing* fast and easy to prototype, read, and write. It is multi-paradigm, combining features of imperative, functional, and object-oriented programming. I won't cover Julia basics or data manipulation in the tutorials; instead, please take a look at the following resources, which cover most of the introduction to Julia and how to work with tabular data:

- **Julia Documentation**: the Julia documentation is a very friendly and well-written resource that explains the basic design and functionality of the language.
- **Think Julia**: an introductory, beginner-friendly book that explains the main concepts and functionality behind the Julia language.
- **Julia High Performance**: a book by Avik Sengupta and Alan Edelman that covers how to make Julia even faster with some principles and tricks of the trade.
- **An Introduction to DataFrames**: the package `DataFrames.jl` provides a set of tools for working with tabular data in Julia. Its design and functionality are similar to those of `pandas` (in Python) and `data.frame`, `data.table` and `dplyr` (in R), making it a great general-purpose data science tool, especially for those coming to Julia from R or Python. This is a collection of notebooks introducing `DataFrames.jl`, made by one of its core contributors, Bogumił Kamiński.

## Turing

**Turing** is an ecosystem of Julia packages for Bayesian inference using probabilistic programming. Models specified using Turing are easy to read and write — models work the way you write them. Like everything in Julia, Turing is fast.
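As a minimal sketch of what a Turing model looks like, here is a coin-flip model (the model name, data, and sampler settings are illustrative, not taken from the tutorials):

```julia
using Turing

# A coin-flip model: a uniform prior on the probability of heads,
# and Bernoulli-distributed observations.
@model function coinflip(y)
    p ~ Beta(1, 1)              # prior: uniform over [0, 1]
    for i in eachindex(y)
        y[i] ~ Bernoulli(p)     # likelihood for each flip
    end
end

# Condition on six observed flips and sample the posterior with NUTS.
chain = sample(coinflip([1, 0, 1, 1, 0, 1]), NUTS(), 1_000)
mean(chain[:p])                 # posterior mean of p
```

The model reads almost exactly like its mathematical statement: `~` declares a distributional assumption, and `sample` draws from the posterior with the chosen sampler.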

## Author

José Eduardo Storopoli, PhD - *Lattes* CV - ORCID - https://storopoli.io

## How to use the content?

The content is licensed under a very permissive Creative Commons license (CC BY-SA). You are most welcome to contribute with issues and pull requests. My hope is to get **more people into Bayesian statistics**. The content is aimed at social scientists and PhD candidates in the social sciences. I chose an **intuitive approach** rather than rigorous mathematical formulations, and made it the introduction to Bayesian statistics that I would have liked to have had.

To configure a local environment:

1. Download and install Julia
2. Clone the repository from GitHub: `git clone https://github.com/storopoli/Bayesian-Julia.git`
3. Access the directory: `cd Bayesian-Julia`
4. Activate the environment by typing in the Julia REPL:

```
using Pkg
Pkg.activate(".")
Pkg.instantiate()
```

## Tutorials

## What about other Turing tutorials?

This is not the only Turing tutorial out there, but it aims to introduce Bayesian inference together with how to use Julia and Turing. Here is a (non-exhaustive) list of other Turing tutorials:

- **Official Turing Tutorials**: tutorials on how to implement common models in Turing
- **Statistical Rethinking - Turing Models**: Julia versions of the Bayesian models described in *Statistical Rethinking* Edition 1 (McElreath, 2016) and Edition 2 (McElreath, 2020)
- **Håkan Kjellerstrand's Turing Tutorials**: a collection of Julia Turing models

## How to cite

To cite these tutorials, please use:

`Storopoli (2021). Bayesian Statistics with Julia and Turing. https://storopoli.io/Bayesian-Julia.`

Or in BibTeX format (LaTeX):

```
@misc{storopoli2021bayesianjulia,
  author = {Storopoli, Jose},
  title = {Bayesian Statistics with Julia and Turing},
  url = {https://storopoli.io/Bayesian-Julia},
  year = {2021}
}
```

## References

### Books

- Gelman, A., Carlin, J. B., Stern, H. S., Dunson, D. B., Vehtari, A., & Rubin, D. B. (2013). *Bayesian Data Analysis*. Chapman and Hall/CRC.
- McElreath, R. (2020). *Statistical Rethinking: A Bayesian Course with Examples in R and Stan*. CRC Press.
- Gelman, A., Hill, J., & Vehtari, A. (2020). *Regression and Other Stories*. Cambridge University Press.
- Brooks, S., Gelman, A., Jones, G., & Meng, X.-L. (2011). *Handbook of Markov Chain Monte Carlo*. CRC Press. https://books.google.com?id=qfRsAIKZ4rIC
- Geyer, C. J. (2011). Introduction to Markov Chain Monte Carlo. In S. Brooks, A. Gelman, G. L. Jones, & X.-L. Meng (Eds.), *Handbook of Markov Chain Monte Carlo*.

### Academic Papers

- van de Schoot, R., Depaoli, S., King, R., Kramer, B., Märtens, K., Tadesse, M. G., Vannucci, M., Gelman, A., Veen, D., Willemsen, J., & Yau, C. (2021). Bayesian statistics and modelling. *Nature Reviews Methods Primers*, *1*(1), 1–26. https://doi.org/10.1038/s43586-020-00001-2
- Gabry, J., Simpson, D., Vehtari, A., Betancourt, M., & Gelman, A. (2019). Visualization in Bayesian workflow. *Journal of the Royal Statistical Society: Series A (Statistics in Society)*, *182*(2), 389–402. https://doi.org/10.1111/rssa.12378
- Gelman, A., Vehtari, A., Simpson, D., Margossian, C. C., Carpenter, B., Yao, Y., Kennedy, L., Gabry, J., Bürkner, P.-C., & Modrák, M. (2020, November 3). *Bayesian Workflow*. http://arxiv.org/abs/2011.01808
- Benjamin, D. J., Berger, J. O., Johannesson, M., Nosek, B. A., Wagenmakers, E.-J., Berk, R., Bollen, K. A., Brembs, B., Brown, L., Camerer, C., Cesarini, D., Chambers, C. D., Clyde, M., Cook, T. D., De Boeck, P., Dienes, Z., Dreber, A., Easwaran, K., Efferson, C., … Johnson, V. E. (2018). Redefine statistical significance. *Nature Human Behaviour*, *2*(1), 6–10. https://doi.org/10.1038/s41562-017-0189-z
- McShane, B. B., Gal, D., Gelman, A., Robert, C., & Tackett, J. L. (2019). Abandon Statistical Significance. *The American Statistician*, *73*, 235–245. https://doi.org/10.1080/00031305.2018.1527253
- Amrhein, V., Greenland, S., & McShane, B. (2019). Scientists rise up against statistical significance. *Nature*, *567*(7748), 305–307. https://doi.org/10.1038/d41586-019-00857-9
- van de Schoot, R., Kaplan, D., Denissen, J., Asendorpf, J. B., Neyer, F. J., & van Aken, M. A. G. (2014). A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research. *Child Development*, *85*(3), 842–860. https://doi.org/10.1111/cdev.12169

### Software References

Bezanson, J., Edelman, A., Karpinski, S., & Shah, V. B. (2017). Julia: A fresh approach to numerical computing. SIAM Review, 59(1), 65–98.

Ge, H., Xu, K., & Ghahramani, Z. (2018). Turing: A Language for Flexible Probabilistic Inference. International Conference on Artificial Intelligence and Statistics, 1682–1690. http://proceedings.mlr.press/v84/ge18b.html

Tarek, M., Xu, K., Trapp, M., Ge, H., & Ghahramani, Z. (2020). DynamicPPL: Stan-like Speed for Dynamic Probabilistic Models. ArXiv:2002.02702 [Cs, Stat]. http://arxiv.org/abs/2002.02702

Xu, K., Ge, H., Tebbutt, W., Tarek, M., Trapp, M., & Ghahramani, Z. (2020). AdvancedHMC.jl: A robust, modular and efficient implementation of advanced HMC algorithms. Symposium on Advances in Approximate Bayesian Inference, 1–10. http://proceedings.mlr.press/v118/xu20a.html

Revels, J., Lubin, M., & Papamarkou, T. (2016). Forward-Mode Automatic Differentiation in Julia. ArXiv:1607.07892 [Cs]. http://arxiv.org/abs/1607.07892

## License

This content is licensed under Creative Commons Attribution-ShareAlike 4.0 International.

## Environment

This website is built with Julia 1.6.2 and the following packages:

```
BenchmarkTools 1.1.4
CSV 0.9.2
Chain 0.4.8
DataFrames 1.2.2
DifferentialEquations 6.19.0
Distributions 0.25.16
FillArrays 0.12.4
ForwardDiff 0.10.19
Franklin 0.10.53
GZip 0.5.1
HTTP 0.9.14
LaTeXStrings 1.2.1
LazyArrays 0.21.20
Literate 2.9.3
MCMCChains 5.0.1
NodeJS 1.3.0
Plots 1.21.3
StatsBase 0.33.10
StatsFuns 0.9.10
StatsPlots 0.14.27
Turing 0.18.0
```