A gentle pitch

Jose Storopoli, PhD

- speed
- ease-of-use
- composability

- Python background
- scientific computing background

- NASA uses Julia on a supercomputer to analyze the "Largest Batch of Earth-Sized Planets Ever Found," achieving a whopping **1,000x speedup** to catalog 188 million astronomical objects in 15 minutes.
- The Climate Modeling Alliance (CliMA) is using mostly Julia to **model climate on the GPU and CPU**. Launched in 2018 in collaboration with researchers at Caltech, the NASA Jet Propulsion Laboratory, and the Naval Postgraduate School, CliMA is using recent progress in computational science to develop an Earth system model that can predict droughts, heat waves, and rainfall with unprecedented precision and speed.
- The US Federal Aviation Administration (FAA) is developing an **Airborne Collision Avoidance System (ACAS-X)** using Julia. This is a nice example of the "Two-Language Problem": previous solutions used MATLAB to develop the algorithms and C++ for a fast implementation. Now the FAA uses one language to do all of this: Julia.
- A **175x speedup** for Pfizer's pharmacology models using GPUs in Julia. It was presented as a poster at the 11th American Conference on Pharmacometrics (ACoP11) and won a quality award.
- The Attitude and Orbit Control Subsystem (AOCS) of the Brazilian satellite Amazonia-1 is **written 100% in Julia** by Ronan Arraes Jardim Chagas (https://ronanarraes.com/).
- Brazil's national development bank (BNDES) ditched a paid solution in favor of open-source Julia modeling and gained a **10x speedup**.

If this is not enough, there are more case studies on the JuliaHub website.

**Julia is fast!**

Two examples:

- Data wrangling: `pandas` versus `DataFrames.jl`
- ODE solving: `scipy` versus `DifferentialEquations.jl`

A common data wrangling scenario: "split-apply-combine" operations.

- 10,000 observations
- 1 categorical variable `x` \(\in \{\mathrm{A}, \mathrm{B}, \mathrm{C}, \mathrm{D}\}\)
- 2 continuous variables:
  - `y` \(\in [0, 1]\)
  - `z` \(\sim \text{Normal}(0, 1)\)
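As a sketch of what that scenario looks like on the Julia side (column names follow the slide; the seed and the summary functions are arbitrary choices for illustration):

```julia
using DataFrames, Random, Statistics

Random.seed!(123)
n = 10_000
df = DataFrame(
    x = rand(["A", "B", "C", "D"], n),  # categorical variable
    y = rand(n),                        # uniform on [0, 1]
    z = randn(n),                       # standard Normal(0, 1)
)

# "split-apply-combine": split by x, apply summaries to each group, combine results
summ = combine(groupby(df, :x), :y => median, :z => mean)
```

`groupby` does the "split", the `:col => function` pairs do the "apply", and `combine` does the "combine", yielding one row per level of `x`.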

A second-order non-linear ODE example: the **simple pendulum**.

\[ \begin{align*} \dot{\theta} &= d\theta \\ \dot{d\theta} &= -\frac{g}{L}\sin(\theta) \end{align*} \]

```
using DifferentialEquations
using BenchmarkTools  # for @btime

# Constants
const g = 9.81  # gravitational acceleration (m/s²)
const L = 1.0   # pendulum length (m)

# Initial conditions: u = [θ, dθ]
u₀ = [0, π / 2]
tspan = (0.0, 6.3)

# Define the problem (in-place form)
function simplependulum(du, u, p, t)
    θ, dθ = u
    du[1] = dθ
    du[2] = -(g / L) * sin(θ)
end

# Pass to solvers
prob = ODEProblem(simplependulum, u₀, tspan)

# RK 4/5th-order solver (Tsitouras)
@btime solve(prob, Tsit5(); saveat=range(tspan...; length=1_000));
```

```
using PyCall, BenchmarkTools  # py"..." strings run Python via PyCall.jl

py"""
import numpy as np
from scipy.integrate import odeint

# Constants
g = 9.81
L = 1.0

# Initial conditions: u = [theta, dtheta]
u0 = [0, np.pi / 2]
tspan = np.linspace(0.0, 6.3, 1000)

def simplependulum(u, t, g, L):
    theta, dtheta = u
    dydt = [dtheta, -(g / L) * np.sin(theta)]
    return dydt
"""

# RK 4/5th-order solver (Dormand-Prince)
@btime py"odeint(simplependulum, u0, tspan, args=(g, L))";
```

- *just-in-time* compilation with the LLVM compiler
- exposes everything as *intermediate representation* code
- then LLVM does what it does best: **OPTIMIZE**, including `for` loops

output in next slide
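For instance, you can inspect the optimized LLVM IR of any Julia function with the built-in `@code_llvm` macro (`add` is a throwaway example function):

```julia
using InteractiveUtils  # provides @code_llvm (already loaded in the REPL)

# Inspect the LLVM IR that Julia generates and optimizes for this call
add(x, y) = x + y
@code_llvm add(1, 2)  # prints the optimized LLVM IR for add(::Int64, ::Int64)
```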

The syntax is quite similar to Python. But there is no significant-whitespace indentation, and every block keyword needs a closing `end`.
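A quick sketch of the block syntax (the function name is a made-up example):

```julia
# Blocks are closed with `end`, not by indentation
function mysum(xs)
    s = 0
    for x in xs
        s += x
    end
    return s
end

mysum(1:10)  # 55
```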

If you need to find where something is defined, just use the `@which` macro on a type or a function call.
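For example, `@which` on a call returns the exact method that dispatch would select:

```julia
using InteractiveUtils  # provides @which (already loaded in the REPL)

# Which method handles summing a Vector of Ints?
m = @which sum([1, 2, 3])
println(m)  # prints the method signature and the file where it is defined
```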

It is very easy to create new packages that have types and functions.

You can extend other packages' functions, including Julia's `Base` standard library, to work with your new types. And you can also create new functions for other packages' types. Think, for example, of a new `Point` type.

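A minimal sketch (this `Point` type and its methods are illustrative, not from any package):

```julia
# A new 2D point type
struct Point
    x::Float64
    y::Float64
end

# Extend Base's + for our new type via multiple dispatch
Base.:+(p::Point, q::Point) = Point(p.x + q.x, p.y + q.y)

# Extend Base.show to control how a Point prints
Base.show(io::IO, p::Point) = print(io, "Point($(p.x), $(p.y))")

Point(1.0, 2.0) + Point(3.0, 4.0)  # Point(4.0, 6.0)
```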
Suppose you are creating a new sort of graph structure that allows for differentiation and integration, i.e., you can take gradients, Jacobians, Hessians, and so on.

Imagine having to code the whole API in `libtorch` (`PyTorch`'s C++ backend), including:

- types
- constructors
- linear algebra functions
- autodiff rules

And in the end you can *only* use PyTorch. You would have to do the whole thing again for JAX or any other autodiff backend.

Now let's see how we would do this in Julia:

- Create a package `DifferentialGraph.jl`.
- Add `ChainRulesCore.jl` as a dependency.
- Create forward- and reverse-mode derivative rules: `rrule`s or `frule`s.

Now we can use your differential graphs with all of these backends:

- `ForwardDiff.jl`: forward-mode AD
- `ReverseDiff.jl`: tape-based reverse-mode AD
- `Zygote.jl`: source-to-source reverse-mode AD
- `Enzyme.jl`: Julia bindings for Enzyme, which ADs LLVM IR (low-level)
- `Diffractor.jl`: experimental mixed-mode AD meant to replace Zygote.jl
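A hedged sketch of what such a rule could look like with `ChainRulesCore.jl` (`myscale` is a made-up example function, not from any real package):

```julia
using ChainRulesCore

# Hypothetical function from our package
myscale(a, x) = a .* x

# Reverse-mode rule: return the primal value and a pullback closure
function ChainRulesCore.rrule(::typeof(myscale), a::Real, x::AbstractVector)
    y = myscale(a, x)
    function myscale_pullback(ȳ)
        # ∂a = ⟨ȳ, x⟩; ∂x = a .* ȳ; the function itself has no tangent
        return NoTangent(), sum(ȳ .* x), a .* ȳ
    end
    return y, myscale_pullback
end
```

AD backends that consume ChainRules (e.g. `Zygote.jl`) then pick this rule up automatically, with no backend-specific code.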

Since your graph has derivatives, you can use gradient-based solvers to perform optimization.

- Bayesian neural nets: a `Flux.jl` neural network inside a `Turing.jl` model.
- Bayesian COVID modeling: a `DifferentialEquations.jl` ODE inside a `Turing.jl` model.
- Quaternion ODE solving on the GPU: `Quaternions.jl` types in a `DifferentialEquations.jl` ODE running on `CuArray`s from `CUDA.jl`.

- It is fast.
- It is easy to use.
- Learning the basics of Julia will make your life much easier in all other packages: you don't need to learn package-specific syntax to be effective.
- A breeze to install on Windows, Mac, and Linux (even on clusters).
- Very good community; check the Discourse forum.
- Very "nerdy", "mathy", and "geeky" user base.
- If you are creating new stuff, like research or algorithms, you don't want to stumble upon FORTRAN or C code (`scipy`, `numpy`, `pytorch`, etc.). In Julia, everything is in Julia.
- You can easily mix and match types and functions from different packages, as you saw in the previous slide.
- Good language interop:
  - C: standard library
  - FORTRAN: standard library
  - Python: `PyCall.jl`
  - R: `RCall.jl`

- Hard to onboard people. Sometimes they don't want to learn new stuff (I mean, we still have FORTRAN around…).
- Not widely used in the marketplace (but tons of academic usage).
- Some package ecosystems are not mature enough, e.g. survival analysis. But differential equations are way more mature than in other scientific computing languages.
- From my point of view, Julia's strength is in **scientific computing**. For everything else, you might not see additional benefits.

- The whole standard library, especially the `LinearAlgebra` module
- `DifferentialEquations.jl`, `NeuralPDE.jl`, and the whole SciML package ecosystem
- `Flux.jl` and `MLJ.jl`
- `DataFrames.jl` and `DataFramesMeta.jl`
- `Makie.jl` and `AlgebraOfGraphics.jl`
- `Turing.jl`
- `Pluto.jl`
- `JuMP`
- `Distributions.jl`


- Julia is pretty darn awesome.
- Easy to get going, and you can always make it faster by just optimizing your Julia code.
- No need to drop down to C++.
- Buuuut it can’t beat Python at deep learning.
- Otherwise, it’s worth a try.
- Godspeed to you.