### Small Matrix Multiplication Performance Shootout

Here I’ll compare the speed of small matrix multiplication across a few popular libraries and PaddedMatrices.jl: Eigen, Blaze, gfortran’s built-in matmul, and MKL JIT. I will benchmark the operation $\textbf{C} = \textbf{A} \times \textbf{B}$, where $\textbf{C}\in\mathbb{R}^{M\times N}$, $\textbf{A}\in\mathbb{R}^{M\times K}$, and $\textbf{B}\in\mathbb{R}^{K\times N}$. I’ll consider every combination of $M\in(3,\ldots,32)$ and $N\in(3,\ldots,32)$ with […]
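The shape of such a benchmark can be sketched in plain Julia (this timing harness is my illustration, not the post's actual benchmark code, and it fixes $K = M$ just for brevity):

```julia
using LinearAlgebra

# Sketch: time C = A * B for each (M, N) pair, K held equal to M here.
function bench_mul(M, N, K; reps = 100)
    A = rand(M, K); B = rand(K, N); C = zeros(M, N)
    t = @elapsed for _ in 1:reps
        mul!(C, A, B)       # in-place multiply, no per-repetition allocation
    end
    return t / reps         # mean seconds per multiply
end

# one mean timing per (M, N) combination
times = [bench_mul(M, N, M) for M in 3:32, N in 3:32]
```

A real shootout would use a dedicated harness such as BenchmarkTools.jl and call into each library under test; this only shows the loop structure over the size grid.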

### Probability Models: Hierarchical Example

Now, we look at an ITP model. This model takes longer to run, so we’ll want to fit multiple chains in parallel. The easiest way to do that at the moment is with the standard library Distributed, which provides multiprocessing: we add child processes and can then have them run code. Unfortunately, […]
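That workflow looks roughly like this (a minimal sketch; the worker count and the stand-in `sample_chain` function are my placeholders, not the post's model):

```julia
using Distributed

addprocs(4)                          # launch 4 child (worker) processes

# code must be defined on every process before workers can run it
@everywhere function sample_chain(seed)
    # placeholder for fitting one MCMC chain; here it just returns seed^2
    return seed^2
end

# run one "chain" per worker in parallel; results come back in order
results = pmap(sample_chain, 1:4)
```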

### Probability Models: Logistic Regression

I am demonstrating a few examples of MCMC models using a Julia MCMC library I have been working on. This library, ProbabilityModels.jl, also depends on a host of unregistered libraries I’ve created. Being unregistered means that to add them, you’ll need the git URL rather than just the package name (like devtools::install_github versus install.packages […]
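With Julia's Pkg, that distinction looks like this (the URL below is illustrative, not taken from the post):

```julia
using Pkg

# Registered packages install by name alone:
# Pkg.add("Distributions")

# Unregistered packages need the git URL, passed via a PackageSpec:
spec = PackageSpec(url = "https://github.com/chriselrod/ProbabilityModels.jl")
# Pkg.add(spec)    # commented out here: would clone over the network
```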

### Vectorized Random Number Generator

I wanted to discuss VectorizedRNG.jl a little. This library provides two vectorized permuted congruential generators (PCGs): XSH-RR and RXS-M-XS. These produce vectors of random bits, which can be transformed into uniform random floating-point numbers, which in turn can be used to draw normal or exponential random samples. To start with, […]
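The last step of that pipeline, uniforms to exponential or normal draws, can be sketched with the standard textbook transforms (inverse CDF and Box–Muller; this is the underlying math, not VectorizedRNG.jl's actual vectorized implementation):

```julia
# Inverse CDF: if U ~ Uniform(0,1), then -log(U) ~ Exponential(1)
rand_exp(u) = -log(u)

# Box-Muller: two independent uniforms -> two independent standard normals
function box_muller(u1, u2)
    r = sqrt(-2 * log(u1))
    return r * cos(2π * u2), r * sin(2π * u2)
end

u = rand(10_000)
e = rand_exp.(u)          # 10_000 Exponential(1) draws
```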

### Optimizing a Gibbs Sampler

Now, let’s revisit the Gibbs sampler and see how much faster we can make it. My Julia code depends on a lot of packages that I’ve written or adapted from others. They also assume they can find each other in Julia’s package-development folder, so you have to install them by saying you wish to […]
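Installing into the package-development folder is what `Pkg.develop` does; a sketch (the URL is illustrative, and the actual install commands are commented out):

```julia
using Pkg

# `develop` clones into the depot's dev/ folder (typically ~/.julia/dev)
# and tracks that local copy instead of a registered release:
# Pkg.develop(url = "https://github.com/chriselrod/SIMDArrays.jl")

# Interdependent dev packages can then find each other there:
devdir = joinpath(first(DEPOT_PATH), "dev")
```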

### Vectorization Introduction

Before we try to hand-optimize the Gibbs model, let’s talk about how to actually optimize code, or at least cover a few basics. Here is a short introduction to writing fast code. For code to run fast, there are a few basic principles. First, use a more efficient algorithm. This is where the […]
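As a toy illustration of that first principle (my example, not one from the post), the same answer can cost O(n) work or O(1) work depending on the algorithm chosen:

```julia
# O(n): sum the integers 1..n by looping
function sum_loop(n)
    s = 0
    for i in 1:n
        s += i
    end
    return s
end

# O(1): Gauss's closed form gives the same result in constant time
sum_formula(n) = n * (n + 1) ÷ 2
```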

### BYOG: Building Your Own Gibbs Sampler

I will rewrite the JAGS model in Julia. The reason for this is that we have a lot of little arithmetic operations, which a compiled language (like Julia or C++) can run much more quickly. For now, I will only use code in base Julia and registered libraries that you can install simply […]

### Autocorrelation in Gibbs Sampling, and intro to Covariance Realism

Now, let’s talk about a practical reason why we may want to write our own Gibbs sampler. The number one reason is that you can get much better sampling if you come up with clever ways to group your parameters. In the last post, we saw that JAGS trounced our own Gibbs sampler […]

### Introduction to Gibbs Sampling

Markov Chain Monte Carlo (MCMC) methods are a popular way of solving the math problems in Bayesian statistics. Unfortunately, while popular packages for doing MCMC can be convenient, they can sometimes seem like opaque black boxes. So, let’s write our own Gibbs samplers to have something to play with, and try to understand what […]
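A minimal Gibbs sampler of the kind the post builds can be sketched for a standard bivariate normal with correlation ρ (my toy example; each full conditional of that target is $N(\rho \cdot \text{other}, 1-\rho^2)$):

```julia
# Gibbs sampler for (x, y) ~ bivariate normal: means 0, variances 1, correlation ρ.
# Full conditionals: x | y ~ N(ρy, 1-ρ²) and y | x ~ N(ρx, 1-ρ²).
function gibbs_bvn(n; ρ = 0.8)
    σ = sqrt(1 - ρ^2)
    xs = zeros(n); ys = zeros(n)
    x = y = 0.0
    for i in 1:n
        x = ρ * y + σ * randn()   # draw x from its conditional given y
        y = ρ * x + σ * randn()   # draw y from its conditional given the new x
        xs[i] = x; ys[i] = y
    end
    return xs, ys
end
```

Alternating draws from the full conditionals like this leaves the joint distribution invariant, which is the whole idea of Gibbs sampling.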

### SIMDArrays: fast single threaded matrix multiplication for small and moderate sizes

Right now my work has been focused on small-dimensional problems, sometimes involving billions of operations with tiny matrices. BLAS libraries like MKL and especially OpenBLAS perform poorly with small matrices, which is one of the primary motivations for “rolling my own”. StaticArrays.jl is a fairly solid library of unrolled expressions, more like my […]