itp_example_stan_julia








Now, we look at an ITP model. This model takes longer to run, so we’ll want to fit multiple chains in parallel.

The easiest way to do that at the moment is with the standard library Distributed.

This library is for multiprocessing. We add child processes, and can then have them run code.
Unfortunately, this requires prefixing some code with @everywhere, so that the children are aware of the functions and data types we’re using.
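The basic pattern is small enough to sketch on its own (a toy function `square` of my own, not anything from the packages below):

```julia
using Distributed

addprocs(2)                    # spawn two worker processes

# Workers start with a blank namespace, so definitions they need
# must be loaded with @everywhere.
@everywhere square(x) = x^2

# pmap distributes the calls across the workers.
results = pmap(square, 1:4)   # [1, 4, 9, 16]
```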

Julia also supports multithreading, but threads are still experimental. In particular, IO from base Julia is not thread safe, so something as simple as printing from two threads at the same time is likely to freeze or crash Julia. I didn’t actually print in my example here, so I will likely try threading in a future version of this document.

In [1]:
using Distributed
addprocs(14); # Create 14 children

@everywhere begin # Load the following on master and all children
using DistributionParameters, ProbabilityDistributions
using LoopVectorization, DynamicHMC, LogDensityProblems, SLEEFPirates, SIMDPirates
using LinearAlgebra, StructuredMatrices, ScatteredArrays, PaddedMatrices
using ProbabilityModels
using DistributionParameters: LKJ_Correlation_Cholesky
using ProbabilityModels: Domains, HierarchicalCentering, ∂HierarchicalCentering, ITPExpectedValue, ∂ITPExpectedValue
BLAS.set_num_threads(1)
end # everywhere

# Define the model on master and all children.
@everywhere @model ITPModel begin

    # Non-hierarchical Priors
    (0.5ρ + 0.5) ~ Beta(2, 2)
    κ ~ Gamma(0.1, 0.1) # μ = 1, σ² = 10
    σ ~ Gamma(1.5, 0.25) # μ = 6, σ² = 2.4
    θ ~ Normal(10)
    L ~ LKJ(2.0)

    # Hierarchical Priors.
    # h subscript, for highest in the hierarchy.
    μₕ₁ ~ Normal(10) # μ = 0
    μₕ₂ ~ Normal(10) # μ = 0
    σₕ ~ Normal(10) # μ = 0
    # Raw μs; non-centered parameterization
    μᵣ₁ ~ Normal() # μ = 0, σ = 1
    μᵣ₂ ~ Normal() # μ = 0, σ = 1
    # Center the μs
    μᵦ₁ = HierarchicalCentering(μᵣ₁, μₕ₁, σₕ)
    μᵦ₂ = HierarchicalCentering(μᵣ₂, μₕ₂, σₕ)
    σᵦ ~ Normal(10) # μ = 0
    # Raw βs; non-centered parameterization
    βᵣ₁ ~ Normal()
    βᵣ₂ ~ Normal()
    # Center the βs.
    β₁ = HierarchicalCentering(βᵣ₁, μᵦ₁, σᵦ, domains)
    β₂ = HierarchicalCentering(βᵣ₂, μᵦ₂, σᵦ, domains)

    U = inv′(Diagonal(σ) * L)

    # Likelihood
    μ₁ = ITPExpectedValue(t, β₁, κ, θ)
    μ₂ = ITPExpectedValue(t, β₂, κ, θ)
    AR = AutoregressiveMatrix(ρ, δₜ)
    Y₁ ~ Normal(μ₁, AR, U)
    Y₂ ~ Normal(μ₂, AR, U)

end
    Defined model: ITPModel.
    Unknowns: Y₂, domains, μₕ₂, μᵣ₁, σ, σᵦ, θ, μᵣ₂, ρ, σₕ, μₕ₁, κ, L, δₜ, Y₁, βᵣ₂, t, βᵣ₁.

This model shows off the overloading of distributions. In particular, we have the Normal with 0, 1, and 3 parameters, and HierarchicalCentering with 3 and 4 parameters.
For example, when given only one argument, the Normal distribution has mean 0.
When given 3 arguments, the Normal distribution is a matrix normal. Additionally, many of these calls are vectorized, depending on whether or not the arguments are vectors.

Also neat is that we can place expressions on the left hand side of sampling statements. The use case here is that $\rho\in(-1,1)$, so we can scale it to $(0,1)$ and assign a Beta$(2,2)$ prior.
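Concretely, since the transform $s = 0.5\rho + 0.5$ is linear, the implied prior density on $\rho$ only picks up a constant Jacobian factor (which HMC can ignore):

\begin{align*}
p(\rho) = \frac{1}{2}\,\text{Beta}\!\left(\frac{\rho + 1}{2};\,2,2\right),\qquad \rho\in(-1,1).
\end{align*}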

HierarchicalCentering transforms from a non-centered to a centered parameterization. If given a fourth argument — domains — it will break up the centering along domains. That is, we can define the domains as:

In [2]:
domains = Domains(2,2,2,3)
Out[2]:
Domains{(2, 2, 2, 3)}()

And now we have

In [3]:
K, D = sum(domains), length(domains)
Out[3]:
(9, 4)

nine total endpoints distributed across 4 domains of sizes 2, 2, 2, and 3. HierarchicalCentering will divide the 4 values of means and standard deviations among the non-centered parameters accordingly: the first 2 will have the first mean and standard deviation, the next 2 the second…
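The semantics can be sketched in plain Julia (a toy reimplementation under my own name `hierarchical_centering`, not the optimized library function; σ is a scalar here, matching its use in the model):

```julia
# Toy sketch of domain-aware centering: parameter i in domain d
# is transformed as mu[d] + sigma * raw[i].
function hierarchical_centering(raw, mu, sigma, domain_sizes)
    out = similar(raw)
    i = 0
    for (d, s) in enumerate(domain_sizes)
        for _ in 1:s
            i += 1
            out[i] = mu[d] + sigma * raw[i]
        end
    end
    out
end

# With zero raw values, each endpoint simply receives its domain mean.
centered = hierarchical_centering(zeros(9), [1.0, 2.0, 3.0, 4.0], 1.0, (2, 2, 2, 3))
# [1.0, 1.0, 2.0, 2.0, 3.0, 3.0, 4.0, 4.0, 4.0]
```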

ITPExpectedValue calculates a $T \times K$ matrix of unscaled ITP values, where $T$ is the number of times and $K$ is the number of endpoints, which is also the length of $\beta$, $\kappa$, and $\theta$. The elements of the matrix equal:

$f(t, \beta, \kappa, \theta) = \beta\left(1-e^{-\kappa t}\right) + \theta$.
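As a sanity check on the formula, a plain-Julia version (my own `itp_mean`, not the optimized ITPExpectedValue) behaves as expected: at $t = 0$ the mean equals $\theta$, and it approaches $\beta + \theta$ as $t$ grows.

```julia
# Plain-Julia sketch of the ITP mean: a T x K matrix with entries
# beta_k * (1 - exp(-kappa_k * t)) + theta_k.
itp_mean(t, beta, kappa, theta) =
    [beta[k] * (1 - exp(-kappa[k] * ti)) + theta[k] for ti in t, k in eachindex(beta)]

m = itp_mean([0.0, 1.0, 100.0], [2.0], [0.5], [1.0])
# m[1, 1] == 1.0 (theta); m[3, 1] ≈ 3.0 (beta + theta)
```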

Because these are all Julia functions, we are free to try them out interactively, or use them to help generate sample data with which to validate the model. Let’s do that:

In [4]:
κ = (1/32) * reduce(+, (@Constant randexp(K)) for i ∈ 1:8) # κ ~ Gamma(8, 32)
σd = sum(@Constant randexp(4)) / 16 # σd ~ Gamma(4,16)
θ = 2.0 * (@Constant randn(K)) # θ ~ Normal(0,2)
S = (@Constant randn(K,4K)) |> x -> x * x'
S *= (1/16)
pS = StructuredMatrices.SymmetricMatrixL(S)
L = PaddedMatrices.chol(S); U = PaddedMatrices.invchol(pS)
μₕ₁, μₕ₂ = -3.0, 9.0
μᵦ₁ = μₕ₁ + @Constant randn(D); # placebo
μᵦ₂ = μₕ₂ + @Constant randn(D); #treatment
β₁ = HierarchicalCentering((@Constant randn(K)), μᵦ₁, σd, domains); # placebo
β₂ = HierarchicalCentering((@Constant randn(K)), μᵦ₂, σd, domains); # treatment

# randexp generates Exponential(1) draws; we take the cumulative sum of the gaps for the times.
T = 24; δₜ = (1/16) * reduce(+, (@Constant randexp(T-1)) for i ∈ 1:8)
times = vcat(zero(ConstantFixedSizePaddedVector{1,Float64}), cumsum(δₜ));

μ₁ = ITPExpectedValue(times, β₁, κ, θ)
Out[4]:
24×9 ConstantFixedSizePaddedArray{Tuple{24,9},Float64,2,24,216}:
 -1.29439  -2.54654  -0.469901   0.316811   …   0.228689  -0.131786  -1.12861
 -1.6443   -2.98767  -0.614601   0.109865      -0.322197  -0.559595  -1.46973
 -1.99243  -3.41545  -0.756333  -0.0849254     -0.834959  -0.968937  -1.80805
 -2.18817  -3.65077  -0.83497   -0.189381      -1.1073    -1.19155   -1.99777
 -2.35165  -3.84425  -0.900024  -0.27372       -1.3257    -1.37311   -2.15592
 -2.56476  -4.0921   -0.983929  -0.379576   …  -1.59773   -1.60355   -2.36164
 -2.85059  -4.41618  -1.09474   -0.513984      -1.93935   -1.90094   -2.6367 
 -3.01207  -4.59477  -1.1564    -0.585923      -2.1202    -2.06268   -2.79163
 -3.08763  -4.67714  -1.185     -0.618556      -2.20173   -2.13673   -2.864  
 -3.33839  -4.94471  -1.27868   -0.721968      -2.45769   -2.3746    -3.10353
 -3.50825  -5.12057  -1.34099   -0.787576   …  -2.61795   -2.52848   -3.2652 
 -3.68897  -5.30239  -1.40614   -0.853203      -2.77627   -2.68524   -3.43659
 -3.89017  -5.49781  -1.47712   -0.920905      -2.93711   -2.85065   -3.6266 
 -3.97435  -5.57717  -1.50629   -0.947457      -2.99937   -2.91678   -3.70581
 -4.04433  -5.64197  -1.53027   -0.968693      -3.04877   -2.97028   -3.7715 
 -4.13036  -5.7201   -1.55941   -0.993726   …  -3.10652   -3.03413   -3.85208
 -4.20275  -5.78445  -1.58361   -1.01384       -3.15248   -3.08614   -3.91971
 -4.2764   -5.84852  -1.60791   -1.03337       -3.1967    -3.13734   -3.98833
 -4.3824   -5.93808  -1.64227   -1.05974       -3.25566   -3.20782   -4.08675
 -4.44798  -5.99175  -1.66312   -1.07499       -3.28926   -3.24937   -4.1474 
 -4.49596  -6.03012  -1.67815   -1.08559    …  -3.3124    -3.27871   -4.19165
 -4.51615  -6.04602  -1.68443   -1.0899        -3.32176   -3.29078   -4.21023
 -4.60468  -6.11393  -1.71148   -1.10779       -3.36008   -3.3416    -4.29147
 -4.65828  -6.15347  -1.72748   -1.11774       -3.38104   -3.37059   -4.34041
In [5]:
μ₂ = ITPExpectedValue(times, β₂, κ, θ)
Out[5]:
24×9 ConstantFixedSizePaddedArray{Tuple{24,9},Float64,2,24,216}:
 -1.29439   -2.54654    -0.469901  …  0.228689  -0.131786  -1.12861 
 -0.467603  -1.54279     0.338783     1.56576    0.957519  -0.313421
  0.35495   -0.569446    1.13088      2.8103     1.9998     0.495067
  0.817451  -0.0340122   1.57036      3.47131    2.56664    0.948449
  1.20372    0.406236    1.93393      4.0014     3.02893    1.32638 
  1.70727    0.97017     2.40284   …  4.66165    3.6157     1.81801 
  2.3826     1.70758     3.02211      5.4908     4.3729     2.47532 
  2.76416    2.11392     3.36672      5.92976    4.78474    2.84557 
  2.94269    2.30134     3.52656      6.12762    4.97328    3.01851 
  3.53518    2.91017     4.05014      6.74889    5.57897    3.59092 
  3.93654    3.31031     4.39835   …  7.13785    5.97079    3.97727 
  4.36354    3.72402     4.76242      7.52212    6.36994    4.38685 
  4.83894    4.16867     5.15916      7.9125     6.7911     4.84092 
  5.03784    4.34924     5.32216      8.06361    6.95948    5.0302  
  5.20317    4.49668     5.45618      8.18352    7.0957     5.1872  
  5.40645    4.67445     5.61903   …  8.32369    7.25829    5.37976 
  5.57749    4.82087     5.75428      8.43524    7.39071    5.54138 
  5.75151    4.96666     5.89009      8.54256    7.52109    5.70537 
  6.00198    5.17043     6.0821       8.68567    7.70054    5.94056 
  6.15693    5.29256     6.19861      8.76722    7.80634    6.0855  
  6.27029    5.37985     6.28266   …  8.82337    7.88105    6.19123 
  6.31799    5.41604     6.3177       8.84609    7.91177    6.23565 
  6.52718    5.57056     6.46891      8.93911    8.04118    6.42978 
  6.65381    5.66052     6.5583       8.98999    8.11499    6.54674 

You can see here how the mean values change over time (as we move down the rows), and that baselines differ across columns. $\mu_1$ tends to decrease over time, while $\mu_2$ increases.

Now, let’s generate the rest of the data, and specify all our unknowns.

In [6]:
ρ = 0.7
ARcorrelation = StructuredMatrices.AutoregressiveMatrix(ρ, δₜ)
ARcholesky = PaddedMatrices.chol(ConstantFixedSizePaddedMatrix(ARcorrelation))
# Create an Array of matrix-normal entries.
Y₁a = [ARcholesky * (@Constant randn(T, K)) * L' + μ₁ for n in 1:120]
Y₂a = [ARcholesky * (@Constant randn(T, K)) * L' + μ₂ for n in 1:120]
Y₁ = ChunkedArray(Y₁a) # Rearranges how the data is stored under the hood.
Y₂ = ChunkedArray(Y₂a) # This often allows for better vectorization.

ℓ_itp = ITPModel(
    domains = domains, Y₁ = Y₁, Y₂ = Y₂, t = times, δₜ = δₜ,
    L = LKJ_Correlation_Cholesky{K}, ρ = BoundedFloat{-1,1},
    κ = PositiveVector{K}, θ = RealVector{K},
    μₕ₁ = RealFloat, μₕ₂ = RealFloat,
    μᵣ₁ = RealVector{D}, μᵣ₂ = RealVector{D},
    βᵣ₁ = RealVector{K}, βᵣ₂ = RealVector{K},
    σᵦ = PositiveFloat, σₕ = PositiveFloat,
    σ = PositiveVector{K}
);
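The draws above follow the standard matrix-normal recipe: if $Z$ has i.i.d. standard normal entries, then $AZB^\top + M$ has mean $M$, row covariance $AA^\top$, and column covariance $BB^\top$. A minimal base-Julia sketch (my own `matnormal`, not part of any of the packages used here):

```julia
using LinearAlgebra, Random

# Draw from a matrix normal MN(M, A*A', B*B') given factors A and B,
# mirroring the ARcholesky * Z * L' + mu construction above.
matnormal(rng, M, A, B) = A * randn(rng, size(M)...) * B' + M

Y = matnormal(MersenneTwister(123), ones(3, 2), Matrix(1.0I, 3, 3), Matrix(1.0I, 2, 2))
# Y has the same shape as M; with identity factors the noise is just i.i.d. normal.
```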

But before fitting this model, we’ll allow Stan to set the baseline. Our equivalent Stan model:

In [7]:
using CmdStan

ProjDir = "/home/chriselrod/Documents/progwork/julia/StanProjDir"

function itp_stan_model(; T = "T", K = "K", D = "D")
    if T isa Number
        T_def = ""
        Tm1 = T - 1
        Tm1_def = ""
    else
        T_def = "int T; // Number of times"
        Tm1 = "Tm1"
        Tm1_def = "int Tm1 = T-1;"
    end
    K_def = K isa Number ? "" : "int K; // Number end points"
    D_def = D isa Number ? "" : "int D; // Number domains"
    """
    data {
      int N1; // Sample size for both categories
      int N2; // Sample size for both categories
      $T_def
      $K_def
      $D_def
      int domains[$D];
      // Dose 1
      matrix[$K,$T] Y1[N1];
      // Dose 2
      matrix[$K,$T] Y2[N2];
      row_vector[$T] times;

    }
    transformed data{
      int N = N1 + N2;
      $Tm1_def
      int ind = 0;
      row_vector[$Tm1] delta_times = times[2:$T] - times[1:$Tm1];
      matrix[$K,$D] domain_map = rep_matrix(0, $K, $D);
      for (d in 1:$D){
        for (k in 1:domains[d]){
          ind += 1;
          domain_map[ind,d] = 1;
        }
      }
    }
    parameters {
      real muh[2];
      real<lower=-1,upper=1> rho;
      vector<lower=0>[$K] kappa;
      vector[$K] theta;
      cholesky_factor_corr[$K] L;
      vector[$D] muraw[2];
      vector[$K] betaraw[2];
      real<lower=0> sigma_beta;
      real<lower=0> sigma_h;
      vector<lower=0>[$K] sigma;
    }
    model {
      real scaledrho = 0.5*rho + 0.5;
      vector[$K] mu_beta[2];
      vector[$K] beta[2];
      row_vector[$Tm1] temp;
      vector[$T] temp2;
      row_vector[$Tm1] AR_diag;
      row_vector[$Tm1] nAR_offdiag;
      matrix[$K,$T] kappa_time;
      matrix[$K,$T] delta;
      matrix[$K,$T] SLdelta;
      matrix[$K,$Tm1] SLdeltaAR;
      matrix[$K,$T] itp_expected_value[2];
      matrix[$K,$K] SL = diag_pre_multiply(sigma, L);

      scaledrho ~ beta(2,2);
      kappa ~ gamma(0.1,0.1);
      sigma ~ gamma(1.5, 0.25);
      theta ~ normal(0, 10);
      L ~ lkj_corr_cholesky(2);
      for (i in 1:2){
        muh[i] ~ normal(0,10);
        muraw[i] ~ normal(0,1);
        betaraw[i] ~ normal(0,1);
        mu_beta[i] = domain_map * (muh[i] + muraw[i] * sigma_h);
        beta[i] = mu_beta[i] + betaraw[i] * sigma_beta;
      }
      sigma_h ~ normal(0, 10);
      sigma_beta ~ normal(0, 10);

      if (rho > 0){ // temp = sign(rho) * abs(rho)^delta_times
        temp =   exp(delta_times * log(  rho)); 
      } else if (rho == 0) {
        temp = rep_row_vector(0, $Tm1);
      } else { // rho < 0
        temp =  - exp(delta_times * log(-rho)); 
      }

      AR_diag = inv_sqrt(1 - temp .* temp);
      nAR_offdiag = AR_diag .* temp;

      kappa_time = expm1(-1 * kappa * times);
      for (i in 1:2){
        itp_expected_value[i] =  rep_matrix(theta, $T) - rep_matrix(beta[i], $T) .* kappa_time; 
      }
      // covariance logdeterminant
      // AR is the inverse of the cholesky factor of a correlation matrix => +
      // SL is the cholesky factor of a covariance matrix => -
      target += N*($K*sum(log(AR_diag)) - $T*log_determinant(SL));

      for (n in 1:N1){
        delta = Y1[n] - itp_expected_value[1];
        SLdelta = mdivide_left_tri_low(SL, delta);
        target += -0.5 * dot_self(SLdelta[:,1]);
        SLdeltaAR = SLdelta[:,2:$T] .* rep_matrix(AR_diag, $K) - SLdelta[:,1:$Tm1] .* rep_matrix(nAR_offdiag, $K);
        target += -0.5 * sum(columns_dot_self(SLdeltaAR));
      }
      for (n in 1:N2){
        delta = Y2[n] - itp_expected_value[2];
        SLdelta = mdivide_left_tri_low(SL, delta);
        target += -0.5 * dot_self(SLdelta[:,1]);
        SLdeltaAR = SLdelta[:,2:$T] .* rep_matrix(AR_diag, $K) - SLdelta[:,1:$Tm1] .* rep_matrix(nAR_offdiag, $K);
        target += -0.5 * sum(columns_dot_self(SLdeltaAR));
      }
    }
    """
end

stan_itp_data_dict = Dict(
    "N1" => length(Y₁),
    "N2" => length(Y₂),
    "T" => T,
    "K" => K,
    "D" => D,
    "domains" => Array(domains),
    "Y1" => permutedims(reshape(reinterpret(Float64, Y₁a),T,K,length(Y₁)),(3,2,1)),
    "Y2" => permutedims(reshape(reinterpret(Float64, Y₂a),T,K,length(Y₂)),(3,2,1)),
    "times" => times
)
stanmodel_itp = Stanmodel(
    name = "ITP",
    Sample(
        num_samples=2000,num_warmup=900,
        adapt=CmdStan.Adapt(delta=0.99)
    ), model = itp_stan_model(), nchains = 14
);
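Both models rely on the fact that, for an AR(1) process observed at irregular times, the inverse Cholesky factor of the correlation matrix $C_{ij} = \rho^{|t_i - t_j|}$ is lower bidiagonal, with entries given by the AR_diag and nAR_offdiag vectors computed in the Stan code. A quick numerical check (my own sketch, plain Julia):

```julia
using LinearAlgebra

# Check that AR_diag / nAR_offdiag form the inverse Cholesky factor
# of the AR(1) correlation matrix at irregular times.
rho   = 0.7
times = [0.0, 0.3, 1.0, 1.4]
C  = [rho^abs(ti - tj) for ti in times, tj in times]
r  = rho .^ diff(times)          # "temp" in the Stan code (rho > 0 branch)
d  = 1 ./ sqrt.(1 .- r .^ 2)     # AR_diag
od = d .* r                      # nAR_offdiag

T = length(times)
Linv = zeros(T, T)               # lower-bidiagonal inverse Cholesky factor
Linv[1, 1] = 1.0
for i in 2:T
    Linv[i, i]   = d[i-1]
    Linv[i, i-1] = -od[i-1]
end

Linv ≈ inv(cholesky(C).L)        # true
```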

Running it:

In [8]:
@time rc_itp, chns_itp, cnames_itp = stan(stanmodel_itp, stan_itp_data_dict, ProjDir);
Inference for Stan model: ITP_model
14 chains: each with iter=(2000,2000,2000,2000,2000,2000,2000,2000,2000,2000,2000,2000,2000,2000); warmup=(0,0,0,0,0,0,0,0,0,0,0,0,0,0); thin=(1,1,1,1,1,1,1,1,1,1,1,1,1,1); 28000 iterations saved.

Warmup took (2193, 71, 2339, 103, 1939, 2422, 2350, 110, 2015, 2320, 1698, 2194, 2257, 2375) seconds, 6.8 hours total
Sampling took (5106, 217, 3256, 194, 4472, 3711, 3244, 223, 4400, 3253, 3300, 3301, 3273, 4274) seconds, 12 hours total

                    Mean     MCSE   StdDev        5%       50%       95%  N_Eff  N_Eff/s    R_hat
lp__            -2.4e+07  2.8e+07  7.5e+07  -2.9e+08  -1.3e+04  -1.3e+04    7.0  1.7e-04  6.2e+01
accept_stat__    9.7e-01  2.2e-02  1.2e-01   9.2e-01   9.9e-01   1.0e+00     32  7.6e-04  1.6e+00
stepsize__       1.1e-02  2.5e-03  6.7e-03   5.5e-06   1.3e-02   2.0e-02    7.0  1.7e-04  3.4e+14
treedepth__      7.1e+00     -nan  2.2e+00   2.0e+00   8.0e+00   9.0e+00   -nan     -nan  4.2e+00
n_leapfrog__     2.6e+02  5.6e+01  1.6e+02   3.0e+00   2.6e+02   5.1e+02    8.5  2.0e-04  2.4e+00
divergent__      2.3e-03     -nan  4.8e-02   0.0e+00   0.0e+00   0.0e+00   -nan     -nan  1.0e+00
energy__         2.4e+07  2.8e+07  7.5e+07   1.3e+04   1.3e+04   2.9e+08    7.0  1.7e-04  6.2e+01
muh[1]          -2.2e+00  4.9e-01  1.4e+00  -3.8e+00  -2.7e+00   1.3e+00    8.3  2.0e-04  2.6e+00
muh[2]           6.8e+00  1.3e+00  3.4e+00  -8.3e-01   8.4e+00   9.5e+00    7.2  1.7e-04  6.0e+00
rho              5.5e-01  1.1e-01  2.9e-01  -1.5e-02   7.0e-01   7.1e-01    7.0  1.7e-04  8.5e+01
kappa[1]         2.1e-01  1.6e-02  4.3e-02   1.8e-01   2.0e-01   3.2e-01    7.3  1.7e-04  5.1e+00
kappa[2]         8.2e-01  5.7e-01  1.5e+00   2.3e-01   2.5e-01   5.7e+00    7.0  1.7e-04  1.7e+02
kappa[3]         8.2e-01  6.7e-01  1.8e+00   2.1e-01   2.4e-01   7.1e+00    7.0  1.7e-04  1.3e+02
kappa[4]         4.4e-01  1.1e-01  3.0e-01   2.6e-01   2.9e-01   1.2e+00    7.0  1.7e-04  2.1e+01
kappa[5]         2.8e-01  7.3e-02  1.9e-01   1.7e-01   1.9e-01   8.3e-01    7.0  1.7e-04  1.7e+01
kappa[6]         1.1e+00  7.9e-01  2.1e+00   1.8e-01   2.0e-01   7.2e+00    7.0  1.7e-04  2.3e+02
kappa[7]         3.6e-01  5.6e-02  1.5e-01   2.8e-01   3.1e-01   7.7e-01    7.0  1.7e-04  1.3e+01
kappa[8]         5.5e-01  4.2e-01  1.1e+00   1.5e-01   2.5e-01   4.6e+00    7.0  1.7e-04  1.1e+02
kappa[9]         2.9e-01  1.5e-01  3.8e-01   1.6e-01   1.8e-01   1.7e+00    7.0  1.7e-04  4.4e+01
theta[1]        -7.8e-01  3.5e-01  9.3e-01  -1.4e+00  -1.2e+00   1.0e+00    7.0  1.7e-04  1.3e+01
theta[2]        -2.0e+00  4.1e-01  1.1e+00  -2.6e+00  -2.5e+00   4.5e-01    7.0  1.7e-04  1.6e+01
theta[3]        -3.9e-01  1.6e-01  4.4e-01  -8.7e-01  -4.9e-01   1.1e+00    7.3  1.7e-04  5.0e+00
theta[4]         5.1e-01  1.2e-01  3.3e-01   1.6e-01   4.3e-01   1.6e+00    7.5  1.8e-04  3.8e+00
theta[5]         5.9e-01  2.0e-01  5.3e-01  -9.2e-01   5.8e-01   1.6e+00    7.2  1.7e-04  6.5e+00
theta[6]        -8.2e-02  2.0e-01  5.3e-01  -1.8e+00   2.7e-03   7.6e-01    7.2  1.7e-04  6.7e+00
theta[7]        -1.0e-01  1.9e-01  5.0e-01  -1.3e+00   1.0e-01   2.8e-01    7.2  1.7e-04  5.8e+00
theta[8]        -3.7e-01  2.2e-01  5.7e-01  -1.9e+00  -1.3e-01   3.2e-02    7.1  1.7e-04  7.9e+00
theta[9]        -8.8e-01  2.9e-01  7.6e-01  -1.7e+00  -1.1e+00   1.5e+00    7.1  1.7e-04  1.0e+01
L[1,1]           1.0e+00     -nan  7.5e-14   1.0e+00   1.0e+00   1.0e+00   -nan     -nan  1.0e+00
L[1,2]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,1]           7.8e-02  1.1e-01  2.9e-01  -6.7e-01   9.0e-02   8.4e-01    7.0  1.7e-04  2.5e+01
L[2,2]           9.4e-01  4.9e-02  1.3e-01   5.4e-01   1.0e+00   1.0e+00    7.0  1.7e-04  1.2e+02
L[2,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,1]          -1.9e-01  9.9e-02  2.6e-01  -8.0e-01  -6.3e-02  -3.8e-02    7.0  1.7e-04  2.3e+01
L[3,2]           3.2e-01  5.4e-02  1.4e-01   2.3e-01   2.5e-01   7.1e-01    7.0  1.7e-04  1.3e+01
L[3,3]           8.5e-01  8.5e-02  2.2e-01   4.1e-01   9.7e-01   9.7e-01    7.0  1.7e-04  8.0e+01
L[3,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,1]           2.4e-01  1.2e-01  3.2e-01  -5.0e-01   1.9e-01   9.0e-01    7.0  1.7e-04  2.9e+01
L[4,2]           3.4e-02  1.2e-01  3.1e-01  -8.0e-01   1.8e-01   2.1e-01    7.0  1.7e-04  2.8e+01
L[4,3]           2.4e-01  3.9e-02  1.0e-01  -1.0e-01   2.8e-01   3.0e-01    7.1  1.7e-04  1.0e+01
L[4,4]           7.7e-01  1.1e-01  2.9e-01   1.3e-01   9.2e-01   9.3e-01    7.0  1.7e-04  7.0e+01
L[4,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,1]           4.2e-03  8.4e-02  2.2e-01  -7.8e-01   6.9e-02   1.3e-01    7.0  1.7e-04  1.9e+01
L[5,2]          -1.8e-01  1.0e-01  2.8e-01  -9.1e-01  -1.6e-01   5.4e-01    7.0  1.7e-04  2.5e+01
L[5,3]          -1.5e-01  8.8e-02  2.3e-01  -8.4e-01  -1.2e-01   3.6e-01    7.0  1.7e-04  2.1e+01
L[5,4]          -1.9e-01  1.2e-02  3.2e-02  -2.2e-01  -2.0e-01  -9.1e-02    7.8  1.9e-04  3.0e+00
L[5,5]           8.0e-01  1.2e-01  3.1e-01   5.9e-02   9.5e-01   9.6e-01    7.0  1.7e-04  9.7e+01
L[5,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,1]          -1.9e-01  1.2e-01  3.2e-01  -3.6e-01  -3.3e-01   5.6e-01    7.0  1.7e-04  3.1e+01
L[6,2]           2.2e-02  8.8e-02  2.3e-01  -6.4e-01   1.1e-01   2.1e-01    7.0  1.7e-04  2.2e+01
L[6,3]           1.4e-01  7.3e-02  1.9e-01  -3.1e-01   1.1e-01   5.5e-01    7.0  1.7e-04  1.8e+01
L[6,4]          -2.9e-01  9.3e-02  2.5e-01  -3.9e-01  -3.7e-01   5.4e-01    7.0  1.7e-04  2.7e+01
L[6,5]          -1.7e-02  6.2e-02  1.7e-01  -3.4e-01  -5.0e-02   4.4e-01    7.0  1.7e-04  1.7e+01
L[6,6]           7.3e-01  8.6e-02  2.3e-01   2.2e-01   8.5e-01   8.6e-01    7.0  1.7e-04  4.5e+01
L[6,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,1]           4.4e-02  1.2e-01  3.1e-01  -8.0e-01   5.9e-02   8.4e-01    7.0  1.7e-04  2.8e+01
L[7,2]           1.9e-01  9.9e-02  2.6e-01  -4.0e-01   1.5e-01   9.4e-01    7.0  1.7e-04  2.4e+01
L[7,3]          -8.9e-02  4.9e-02  1.3e-01  -2.8e-01  -1.2e-01   3.4e-01    7.1  1.7e-04  1.2e+01
L[7,4]          -1.8e-01  2.1e-02  5.6e-02  -2.7e-01  -1.9e-01  -5.1e-02    7.3  1.7e-04  5.2e+00
L[7,5]           1.0e-01  3.5e-02  9.4e-02  -2.2e-01   1.3e-01   1.6e-01    7.1  1.7e-04  8.7e+00
L[7,6]          -1.6e-01  3.1e-02  8.4e-02  -2.2e-01  -2.0e-01   5.6e-02    7.1  1.7e-04  8.0e+00
L[7,7]           7.7e-01  1.1e-01  3.0e-01   1.7e-01   9.3e-01   9.4e-01    7.0  1.7e-04  7.7e+01
L[7,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[8,1]          -1.0e-01  6.3e-02  1.7e-01  -5.9e-01  -7.5e-02   2.4e-01    7.0  1.7e-04  1.5e+01
L[8,2]           2.4e-01  9.9e-02  2.6e-01   9.3e-02   1.2e-01   8.6e-01    7.0  1.7e-04  2.3e+01
L[8,3]          -6.0e-02  9.3e-02  2.5e-01  -2.0e-01  -1.8e-01   5.4e-01    7.0  1.7e-04  2.3e+01
L[8,4]           1.8e-01  3.4e-02  9.1e-02  -7.9e-02   2.0e-01   3.3e-01    7.1  1.7e-04  8.5e+00
L[8,5]           2.9e-02  2.0e-02  5.3e-02  -1.4e-02   9.9e-03   1.8e-01    7.3  1.7e-04  4.9e+00
L[8,6]           3.9e-02  2.2e-02  5.8e-02  -3.4e-02   2.6e-02   2.3e-01    7.2  1.7e-04  5.4e+00
L[8,7]           2.7e-02  2.4e-02  6.5e-02  -2.0e-01   4.6e-02   6.9e-02    7.2  1.7e-04  6.0e+00
L[8,8]           7.9e-01  1.2e-01  3.1e-01   1.0e-01   9.5e-01   9.6e-01    7.0  1.7e-04  9.3e+01
L[8,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[9,1]          -1.6e-01  1.3e-01  3.5e-01  -8.8e-01  -1.6e-01   8.9e-01    7.0  1.7e-04  3.2e+01
L[9,2]          -3.4e-02  6.7e-02  1.8e-01  -5.2e-01   4.8e-02   7.2e-02    7.0  1.7e-04  1.6e+01
L[9,3]           7.7e-02  7.1e-02  1.9e-01  -3.7e-01   7.2e-02   6.1e-01    7.0  1.7e-04  1.7e+01
L[9,4]          -2.2e-03  3.2e-02  8.6e-02  -6.2e-02  -3.8e-02   2.2e-01    7.1  1.7e-04  7.6e+00
L[9,5]          -6.4e-02  2.7e-02  7.2e-02  -1.1e-01  -8.7e-02   1.7e-01    7.2  1.7e-04  6.4e+00
L[9,6]           1.5e-01  4.2e-02  1.1e-01  -1.7e-01   1.9e-01   2.2e-01    7.1  1.7e-04  1.0e+01
L[9,7]           2.5e-01  6.2e-02  1.7e-01  -1.1e-01   3.4e-01   3.6e-01    7.0  1.7e-04  1.7e+01
L[9,8]           2.7e-01  4.2e-02  1.1e-01   2.5e-02   3.2e-01   3.4e-01    7.0  1.7e-04  1.2e+01
L[9,9]           6.7e-01  1.2e-01  3.2e-01   3.7e-02   8.3e-01   8.4e-01    7.0  1.7e-04  6.0e+01
muraw[1,1]      -9.1e-01  3.3e-01  1.0e+00  -2.1e+00  -1.0e+00   1.9e+00    9.7  2.3e-04  1.9e+00
muraw[1,2]       1.2e+00  6.4e-02  6.2e-01   2.4e-01   1.1e+00   2.2e+00     93  2.2e-03  1.1e+00
muraw[1,3]       2.5e-01  3.0e-01  9.3e-01  -1.9e+00   3.6e-01   1.7e+00    9.7  2.3e-04  1.9e+00
muraw[1,4]      -5.4e-01  2.6e-01  8.6e-01  -1.6e+00  -7.8e-01   2.0e+00     11  2.5e-04  1.7e+00
muraw[2,1]       6.1e-02  1.2e-01  5.9e-01  -1.0e+00   1.1e-01   1.0e+00     23  5.4e-04  1.2e+00
muraw[2,2]      -4.5e-01  3.0e-01  9.5e-01  -1.8e+00  -6.4e-01   1.2e+00     10  2.4e-04  1.8e+00
muraw[2,3]       2.7e-01  2.2e-01  7.7e-01  -1.5e+00   4.3e-01   1.4e+00     12  2.9e-04  1.5e+00
muraw[2,4]      -1.2e-02  2.5e-01  8.1e-01  -1.9e+00   8.5e-02   1.1e+00     11  2.5e-04  1.7e+00
betaraw[1,1]    -1.6e-01  1.9e-01  8.6e-01  -1.6e+00  -2.5e-01   1.1e+00     22  5.1e-04  1.2e+00
betaraw[1,2]     2.5e-01  1.8e-01  8.6e-01  -1.2e+00   1.5e-01   1.6e+00     22  5.3e-04  1.2e+00
betaraw[1,3]     3.3e-01  2.1e-01  9.0e-01  -1.2e+00   4.0e-01   1.7e+00     19  4.4e-04  1.3e+00
betaraw[1,4]    -4.8e-03  2.4e-01  9.6e-01  -1.4e+00  -3.1e-02   1.7e+00     15  3.7e-04  1.4e+00
betaraw[1,5]     3.4e-01  1.1e-01  8.0e-01  -9.3e-01   4.4e-01   1.6e+00     50  1.2e-03  1.1e+00
betaraw[1,6]    -3.0e-01  2.3e-01  9.6e-01  -1.8e+00  -3.1e-01   1.2e+00     17  4.0e-04  1.3e+00
betaraw[1,7]    -5.6e-03  2.8e-01  9.8e-01  -1.5e+00  -1.0e-01   1.9e+00     12  2.9e-04  1.5e+00
betaraw[1,8]     4.4e-01  2.4e-01  9.0e-01  -1.8e+00   5.5e-01   1.8e+00     14  3.4e-04  1.4e+00
betaraw[1,9]    -4.5e-01  2.5e-01  9.2e-01  -1.9e+00  -4.3e-01   1.5e+00     13  3.2e-04  1.4e+00
betaraw[2,1]     3.8e-01  2.5e-01  9.8e-01  -1.4e+00   4.0e-01   2.0e+00     16  3.7e-04  1.3e+00
betaraw[2,2]    -3.7e-01  2.7e-01  1.0e+00  -1.9e+00  -5.4e-01   2.0e+00     15  3.4e-04  1.4e+00
betaraw[2,3]    -5.0e-02  1.6e-01  8.5e-01  -1.4e+00  -1.4e-01   1.3e+00     28  6.6e-04  1.2e+00
betaraw[2,4]    -1.2e-01  2.8e-01  1.0e+00  -1.9e+00  -1.6e-01   1.9e+00     14  3.3e-04  1.4e+00
betaraw[2,5]    -6.0e-01  2.8e-01  1.1e+00  -2.2e+00  -7.4e-01   1.5e+00     15  3.5e-04  1.4e+00
betaraw[2,6]     7.0e-01  2.8e-01  1.1e+00  -1.8e+00   6.9e-01   2.3e+00     15  3.5e-04  1.4e+00
betaraw[2,7]     6.7e-01  3.8e-01  1.2e+00  -2.0e+00   9.3e-01   2.2e+00     10  2.4e-04  1.8e+00
betaraw[2,8]    -4.1e-01  2.5e-01  9.4e-01  -1.6e+00  -5.4e-01   1.7e+00     14  3.4e-04  1.4e+00
betaraw[2,9]    -5.0e-01  1.4e-01  8.0e-01  -1.7e+00  -5.5e-01   7.3e-01     33  7.8e-04  1.1e+00
sigma_beta       3.0e-01  2.4e-02  1.1e-01   1.3e-01   2.8e-01   4.4e-01     23  5.5e-04  1.2e+00
sigma_h          1.1e+00  1.2e-01  5.5e-01   1.9e-01   1.1e+00   1.9e+00     20  4.8e-04  1.2e+00
sigma[1]         1.9e+00  3.9e-01  1.0e+00   1.4e+00   1.4e+00   4.4e+00    7.0  1.7e-04  7.4e+01
sigma[2]         1.3e+00  2.4e-01  6.4e-01   4.0e-01   1.3e+00   3.4e+00    7.0  1.7e-04  5.3e+01
sigma[3]         1.6e+00  1.3e-01  3.5e-01   3.4e-01   1.7e+00   1.9e+00    7.0  1.7e-04  2.2e+01
sigma[4]         2.1e+00  3.5e-01  9.3e-01   1.6e+00   1.6e+00   4.4e+00    7.0  1.7e-04  5.9e+01
sigma[5]         1.9e+00  2.7e-01  7.1e-01   1.6e+00   1.6e+00   3.6e+00    7.0  1.7e-04  4.6e+01
sigma[6]         1.8e+00  5.6e-01  1.5e+00   2.7e-01   1.6e+00   6.9e+00    7.0  1.7e-04  9.7e+01
sigma[7]         1.5e+00  1.4e-01  3.6e-01   4.3e-01   1.7e+00   1.7e+00    7.0  1.7e-04  2.2e+01
sigma[8]         1.4e+00  1.8e-01  4.8e-01   2.0e-01   1.4e+00   2.7e+00    7.0  1.7e-04  3.6e+01
sigma[9]         1.6e+00  1.5e-01  3.9e-01   1.4e+00   1.4e+00   2.8e+00    7.0  1.7e-04  2.8e+01

Samples were drawn using hmc with nuts.
For each parameter, N_Eff is a crude measure of effective sample size,
and R_hat is the potential scale reduction factor on split chains (at 
convergence, R_hat=1).

7340.985143 seconds (47.04 M allocations: 2.196 GiB, 0.01% gc time)

We see that it failed to converge. Chains 2, 4, and 8 ran much faster than the other 11, which means these two sets adapted differently. Naturally, we suspect that one of these adaptations was “correct”, allowing convergence, and the other “incorrect”, resulting in convergence failure.
Before focusing on analysis, let’s perform a second run to see if we can speed things up by making many of these arrays statically sized.

“Statically sized” here means making Stan (and C++) aware of the sizes of the arrays while compiling, instead of only while running. This is one of the “tricks” I often use in my Julia code to get massive speedups. It is also a very natural trick to exploit in Julia, because you’ll never personally have to recompile anything: it just happens automatically whenever you use arrays of different sizes. Multiple dispatch also means we will not have to define any separate functions or models to make it work. It’s all automatic.

We have to do things a little more manually in Stan. This is why I wrote the Stan model above as a function that returns a string: it lets me define array sizes either as variables or as inserted numbers.
This also makes it clear what I mean by calling this seamless in Julia: our Julia model definition looks a lot more like the first version, with variable-sized arrays, but it compiles like the second, with awareness of the array sizes. As we will see from the difference in runtime here, while Julia benefits from this, Stan apparently does not.
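The size-in-the-type idea can be illustrated with a toy type (a hypothetical `FixedVec`, not PaddedMatrices’ actual implementation): because the length is a type parameter, the compiler specializes — and can fully unroll — each method per size.

```julia
# Toy fixed-size vector carrying its length N in the type. Julia compiles a
# specialized (fully unrollable) method for each N -- no remainder loop needed.
struct FixedVec{N}
    data::NTuple{N,Float64}
end

mydot(a::FixedVec{N}, b::FixedVec{N}) where {N} =
    sum(ntuple(i -> a.data[i] * b.data[i], Val(N)))

a = FixedVec((1.0, 2.0, 3.0))
b = FixedVec((4.0, 5.0, 6.0))
mydot(a, b)  # 32.0
```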

To provide a concrete example of why knowing array bounds may help:
Without knowing the bounds of an array, the clang++ compiler will often vectorize a simple for loop of length $N$ by splitting it into two loops. The first of these loops is “vectorized”, calculating 32 iterations at a time in parallel (while still only using a single core). The second loop does a single iteration at a time, and catches the remainder of $N\div32$ that the first loop misses. This may be ideal when $N > 32$, but it will actually slow things down when $N < 32$: the vectorized code doesn’t get used, and is just bloat that gets in the way. If the compiler knows that $N < 32$, it won’t create the first loop. Instead, it will decide to do something different, perhaps not even creating a loop at all, but “unrolling” the entire thing. As a brief example:

In [9]:
using PaddedMatrices
x = @Constant randn(16); y = @Constant randn(16);
function my_dot(x, y)
    d = 0.0
    @fastmath @inbounds for i ∈ eachindex(x,y)
        d += x[i] * y[i]
    end
    d
end
@code_native my_dot(x, y)
	.text
; ┌ @ In[9]:6 within `my_dot'
; │┌ @ In[9]:4 within `mul_fast'
	vmovupd	(%rsi), %zmm0
	vmovupd	64(%rsi), %zmm1
	vmulpd	64(%rdi), %zmm1, %zmm1
; │└
; │┌ @ fastmath.jl:161 within `add_fast'
	vfmadd231pd	(%rdi), %zmm0, %zmm1
	vextractf64x4	$1, %zmm1, %ymm0
	vaddpd	%zmm0, %zmm1, %zmm0
	vextractf128	$1, %ymm0, %xmm1
	vaddpd	%zmm1, %zmm0, %zmm0
	vpermilpd	$1, %xmm0, %xmm1 # xmm1 = xmm0[1,0]
	vaddpd	%zmm1, %zmm0, %zmm0
; │└
; │ @ In[9]:8 within `my_dot'
	vzeroupper
	retq
	nopw	%cs:(%rax,%rax)
; └

What the assembly says here is: move (vmov) packed data (pd, hence vmovpd) into register zmm0 from the memory location %rsi, and into zmm1 from 64 bytes past %rsi. Registers whose names start with zmm hold 64 bytes of data, i.e. 64 bytes / 8 bytes per double = 8 double-precision numbers. These two loads move the entirety of one of the vectors into two CPU registers.
Then it multiplies packed data (vmulpd): zmm1 times the memory 64 bytes past %rdi, the second half of the other vector, overwriting zmm1.
Then it does a fused multiply-add of packed data (vfmadd231pd). This multiplies the memory starting at %rdi with register zmm0 and adds the result to zmm1, all in one instruction. The contents of zmm1 are now:

\begin{align*}
\texttt{zmm1} &= \begin{bmatrix}
x_1 \times y_1 + x_{9} \times y_{9}\\
x_2 \times y_2 + x_{10} \times y_{10}\\
x_3 \times y_3 + x_{11} \times y_{11}\\
x_4 \times y_4 + x_{12} \times y_{12}\\
x_5 \times y_5 + x_{13} \times y_{13}\\
x_6 \times y_6 + x_{14} \times y_{14}\\
x_7 \times y_7 + x_{15} \times y_{15}\\
x_8 \times y_8 + x_{16} \times y_{16}\\
\end{bmatrix}.
\end{align*}
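In plain Julia, the state of zmm1 at this point can be sketched as follows (ordinary length-16 Vectors stand in for the registers; this is a sketch of the semantics, not of the actual code generation):

```julia
x = collect(1.0:16.0); y = collect(16.0:-1.0:1.0)  # two length-16 vectors
# vmulpd: elementwise product of the second halves of x and y
zmm1 = x[9:16] .* y[9:16]
# vfmadd231pd: multiply the first halves elementwise and accumulate into zmm1
zmm1 = x[1:8] .* y[1:8] .+ zmm1
sum(zmm1) ≈ sum(x .* y)  # the 8 lanes together hold the full dot product
```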

It then proceeds to add the second half of zmm1 to the first half, leaving 4 numbers in total. It halves again, and again, until in the end it has:

\begin{align*}
\texttt{zmm0} &= \begin{bmatrix}
\sum_{i=1}^{16} x_i\times y_i\\
\text{junk}\\
\text{junk}\\
\text{junk}\\
\text{junk}\\
\text{junk}\\
\text{junk}\\
\text{junk}
\end{bmatrix}
\end{align*}
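This repeated halving (a horizontal, or tree, reduction) can be mimicked in plain Julia. A sketch, using an ordinary Vector in place of a register and assuming a power-of-two length (`hsum` is a hypothetical name for illustration):

```julia
# Tree reduction: repeatedly add the upper half of the lane vector to the
# lower half until a single value remains, as the vextract/vaddpd pairs do.
# Assumes length(v) is a power of two.
function hsum(v::Vector{Float64})
    while length(v) > 1
        half = length(v) ÷ 2
        v = v[1:half] .+ v[half+1:2half]
    end
    v[1]
end

hsum([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])  # 36.0, same as sum
```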

Overall, this involves a lot less work for the CPU than the loop we actually wrote!
If we used an x and a y of different lengths, Julia would, under the hood, compile another version of the function specialized for those new sizes, because the types of x and y encode their sizes.
For as long as that Julia session keeps running, Julia “remembers” both compiled versions and will not have to compile them again.

I'm using g++ as my C++ compiler, but the same ideas ought to apply. Perhaps I will try clang++ next instead, though g++ can vectorize functions like exp and log, while clang cannot. CmdStan does helpfully emit .hpp files showing the C++ code the Stan programs get translated into, but I haven't dug into them to understand what's going on and how I might actually be able to make Stan faster.
I think it would be very hard to get Stan to be competitive in speed with my Julia code, however.

Without further delay:

In [10]:
stanmodel_itp2 = Stanmodel(
    name = "StanITP",
    Sample(
        num_samples=2000,num_warmup=900,
        adapt=CmdStan.Adapt(delta=0.99)
    ), model = itp_stan_model(T = T, K = K, D = D), nchains = 14
);
@time rc_itp2, chns_itp2, cnames_itp2 = stan(stanmodel_itp2, stan_itp_data_dict, ProjDir);
Informational Message: The current Metropolis proposal is about to be rejected because of the following issue:
Exception: log1m: x is 1, but must be less than or equal to 1  (in '/home/chriselrod/Documents/progwork/julia/tmp/StanITP.stan' at line 33)

If this warning occurs sporadically, such as for highly constrained variable types like covariance matrices, then the sampler is fine,
but if this warning occurs often then your model may be either severely ill-conditioned or misspecified.

Inference for Stan model: StanITP_model
14 chains: each with iter=(2000,2000,2000,2000,2000,2000,2000,2000,2000,2000,2000,2000,2000,2000); warmup=(0,0,0,0,0,0,0,0,0,0,0,0,0,0); thin=(1,1,1,1,1,1,1,1,1,1,1,1,1,1); 28000 iterations saved.

Warmup took (2330, 2379, 1991, 2125, 2418, 90, 2099, 2657, 2132, 2493, 81, 2164, 2388, 6087) seconds, 8.7 hours total
Sampling took (3621, 3576, 3647, 3660, 3546, 215, 4918, 4254, 3662, 3592, 208, 3637, 4254, 6592) seconds, 14 hours total

                    Mean     MCSE   StdDev        5%       50%       95%  N_Eff  N_Eff/s    R_hat
lp__            -1.5e+11     -nan  5.3e+11  -2.1e+12  -1.3e+04  -1.3e+04   -nan     -nan  2.7e+05
accept_stat__    9.8e-01  2.0e-03  4.8e-02   9.4e-01   9.9e-01   1.0e+00    553  1.1e-02  1.0e+00
stepsize__       1.2e-02  2.6e-03  6.8e-03   6.5e-15   1.5e-02   2.0e-02    7.0  1.4e-04  3.3e+14
treedepth__      7.5e+00     -nan  2.0e+00   2.0e+00   8.0e+00   1.0e+01   -nan     -nan  4.3e+00
n_leapfrog__     3.0e+02     -nan  2.4e+02   3.0e+00   2.6e+02   1.0e+03   -nan     -nan  3.9e+00
divergent__      5.4e-04     -nan  2.3e-02   0.0e+00   0.0e+00   0.0e+00   -nan     -nan  1.0e+00
energy__         1.5e+11     -nan  5.3e+11   1.3e+04   1.3e+04   2.1e+12   -nan     -nan  2.7e+05
muh[1]          -2.2e+00  5.3e-01  1.5e+00  -3.8e+00  -2.7e+00   1.7e+00    8.1  1.6e-04  2.7e+00
muh[2]           6.8e+00  1.3e+00  3.5e+00  -1.4e+00   8.4e+00   9.5e+00    7.2  1.5e-04  6.1e+00
rho              5.5e-01  1.1e-01  2.9e-01  -2.1e-02   7.0e-01   7.1e-01    7.0  1.4e-04  8.8e+01
kappa[1]         7.6e-01  5.4e-01  1.4e+00   1.8e-01   2.0e-01   5.4e+00    7.0  1.4e-04  1.7e+02
kappa[2]         9.5e-01  7.1e-01  1.9e+00   2.4e-01   2.5e-01   7.3e+00    7.0  1.4e-04  2.1e+02
kappa[3]         4.1e-01  1.6e-01  4.2e-01   2.1e-01   2.4e-01   1.5e+00    7.0  1.4e-04  3.1e+01
kappa[4]         6.9e-01  5.4e-01  1.4e+00   2.4e-01   2.9e-01   5.8e+00    7.0  1.4e-04  1.0e+02
kappa[5]         3.3e-01  1.7e-01  4.5e-01   1.7e-01   1.9e-01   1.9e+00    7.0  1.4e-04  4.0e+01
kappa[6]         2.7e-01     -nan  2.3e-01   1.8e-01   2.0e-01   1.1e+00   -nan     -nan  2.5e+01
kappa[7]         3.1e-01  1.6e-02  4.5e-02   2.4e-01   3.0e-01   4.5e-01    7.5  1.5e-04  3.9e+00
kappa[8]         7.3e-01  4.2e-01  1.1e+00   2.3e-01   2.5e-01   4.1e+00    7.0  1.4e-04  1.1e+02
kappa[9]         3.5e-01  1.8e-01  4.9e-01   1.7e-01   1.8e-01   2.1e+00    7.0  1.4e-04  5.7e+01
theta[1]        -1.0e+00  2.6e-01  7.0e-01  -1.6e+00  -1.3e+00   9.5e-01    7.1  1.4e-04  9.5e+00
theta[2]        -1.9e+00  5.1e-01  1.4e+00  -2.6e+00  -2.5e+00   1.5e+00    7.0  1.4e-04  2.0e+01
theta[3]        -5.4e-01  1.2e-01  3.3e-01  -1.6e+00  -4.9e-01  -8.4e-02    7.5  1.5e-04  3.8e+00
theta[4]         4.1e-01  1.3e-01  3.4e-01  -6.3e-01   4.3e-01   1.0e+00    7.5  1.5e-04  3.9e+00
theta[5]         4.1e-01  1.8e-01  4.8e-01  -1.1e+00   5.6e-01   7.2e-01    7.2  1.5e-04  5.9e+00
theta[6]         1.3e-01  1.9e-01  5.0e-01  -1.5e-01   3.7e-03   1.9e+00    7.2  1.5e-04  6.3e+00
theta[7]        -1.0e-01  2.3e-01  6.2e-01  -1.9e+00   1.2e-01   2.8e-01    7.1  1.4e-04  7.1e+00
theta[8]        -4.6e-02  1.7e-01  4.5e-01  -9.0e-01  -8.6e-02   1.4e+00    7.2  1.5e-04  6.1e+00
theta[9]        -9.9e-01  2.2e-01  5.8e-01  -1.5e+00  -1.1e+00   1.0e+00    7.1  1.4e-04  7.6e+00
L[1,1]           1.0e+00     -nan  7.5e-14   1.0e+00   1.0e+00   1.0e+00   -nan     -nan  1.0e+00
L[1,2]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,1]           1.3e-01  1.4e-01  3.6e-01  -8.2e-01   9.3e-02   8.1e-01    7.0  1.4e-04  3.2e+01
L[2,2]           9.1e-01  6.3e-02  1.7e-01   5.7e-01   1.0e+00   1.0e+00    7.0  1.4e-04  1.6e+02
L[2,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,1]          -9.1e-02  8.7e-02  2.3e-01  -8.3e-01  -6.0e-02   3.4e-01    7.0  1.4e-04  2.0e+01
L[3,2]           1.5e-01  1.1e-01  3.0e-01  -6.4e-01   2.5e-01   5.1e-01    7.0  1.4e-04  2.8e+01
L[3,3]           8.9e-01  7.4e-02  2.0e-01   2.2e-01   9.7e-01   9.7e-01    7.0  1.4e-04  6.9e+01
L[3,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,1]           2.8e-01  7.3e-02  1.9e-01   1.7e-01   2.0e-01   8.5e-01    7.0  1.4e-04  1.8e+01
L[4,2]           2.1e-01  8.4e-02  2.2e-01  -3.5e-01   1.9e-01   7.5e-01    7.0  1.4e-04  2.1e+01
L[4,3]           2.7e-01  4.0e-02  1.1e-01  -2.3e-02   2.8e-01   5.1e-01    7.1  1.4e-04  1.0e+01
L[4,4]           8.1e-01  8.7e-02  2.3e-01   1.5e-01   9.2e-01   9.3e-01    7.0  1.4e-04  5.5e+01
L[4,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,1]           2.4e-01  1.3e-01  3.4e-01   5.0e-02   7.5e-02   9.6e-01    7.0  1.4e-04  2.9e+01
L[5,2]          -9.6e-02  5.4e-02  1.4e-01  -1.8e-01  -1.6e-01   3.1e-01    7.0  1.4e-04  1.3e+01
L[5,3]          -1.0e-01  6.0e-02  1.6e-01  -4.8e-01  -1.2e-01   3.4e-01    7.0  1.4e-04  1.4e+01
L[5,4]          -1.5e-01  4.6e-02  1.2e-01  -2.2e-01  -2.0e-01   2.0e-01    7.1  1.4e-04  1.1e+01
L[5,5]           7.9e-01  1.2e-01  3.3e-01   8.9e-02   9.5e-01   9.6e-01    7.0  1.4e-04  1.0e+02
L[5,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,1]          -4.6e-01  8.8e-02  2.3e-01  -9.5e-01  -3.4e-01  -3.2e-01    7.0  1.4e-04  2.3e+01
L[6,2]           3.8e-02  6.9e-02  1.8e-01  -4.6e-01   1.1e-01   1.6e-01    7.0  1.4e-04  1.7e+01
L[6,3]           5.6e-02  4.3e-02  1.1e-01  -2.4e-01   1.0e-01   1.3e-01    7.1  1.4e-04  1.1e+01
L[6,4]          -3.0e-01  5.5e-02  1.5e-01  -3.9e-01  -3.7e-01   8.9e-02    7.0  1.4e-04  1.6e+01
L[6,5]          -3.4e-02  1.7e-02  4.7e-02  -6.8e-02  -5.0e-02   1.0e-01    7.3  1.5e-04  4.8e+00
L[6,6]           6.9e-01  1.1e-01  3.0e-01   6.9e-02   8.5e-01   8.6e-01    7.0  1.4e-04  5.7e+01
L[6,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,1]          -6.9e-02  1.2e-01  3.2e-01  -8.5e-01   5.6e-02   8.1e-02    7.0  1.4e-04  2.7e+01
L[7,2]           1.3e-02  1.1e-01  3.0e-01  -9.1e-01   1.5e-01   1.7e-01    7.0  1.4e-04  2.7e+01
L[7,3]          -7.0e-02  6.1e-02  1.6e-01  -2.9e-01  -1.2e-01   3.9e-01    7.0  1.4e-04  1.4e+01
L[7,4]          -2.0e-01  1.8e-02  4.8e-02  -3.2e-01  -1.9e-01  -1.0e-01    7.4  1.5e-04  4.4e+00
L[7,5]           1.1e-01  2.8e-02  7.5e-02  -8.3e-02   1.4e-01   1.8e-01    7.1  1.4e-04  7.0e+00
L[7,6]          -1.4e-01  4.7e-02  1.3e-01  -2.2e-01  -2.0e-01   1.4e-01    7.1  1.4e-04  1.2e+01
L[7,7]           7.5e-01  1.3e-01  3.4e-01   3.9e-02   9.3e-01   9.4e-01    7.0  1.4e-04  8.6e+01
L[7,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[8,1]          -2.2e-01  1.1e-01  3.0e-01  -9.4e-01  -7.8e-02  -5.3e-02    7.0  1.4e-04  2.6e+01
L[8,2]           6.2e-02  8.1e-02  2.2e-01  -5.3e-01   1.1e-01   4.5e-01    7.0  1.4e-04  1.9e+01
L[8,3]          -1.9e-01  5.6e-02  1.5e-01  -6.4e-01  -1.8e-01   9.9e-02    7.0  1.4e-04  1.3e+01
L[8,4]           1.7e-01  2.5e-02  6.7e-02  -2.7e-03   2.0e-01   2.2e-01    7.2  1.5e-04  6.3e+00
L[8,5]          -2.2e-02  2.0e-02  5.4e-02  -1.3e-01   1.0e-03   2.5e-02    7.3  1.5e-04  4.9e+00
L[8,6]           2.0e-02  1.2e-02  3.5e-02  -5.7e-02   2.3e-02   9.5e-02    7.8  1.6e-04  3.1e+00
L[8,7]           2.8e-02  1.7e-02  4.6e-02  -9.1e-02   4.6e-02   7.0e-02    7.4  1.5e-04  4.3e+00
L[8,8]           7.5e-01  1.4e-01  3.8e-01   1.1e-02   9.5e-01   9.6e-01    7.0  1.4e-04  1.1e+02
L[8,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[9,1]          -5.8e-02  1.6e-01  4.3e-01  -8.6e-01  -1.6e-01   9.3e-01    7.0  1.4e-04  3.8e+01
L[9,2]          -4.4e-03  5.7e-02  1.5e-01  -4.0e-01   5.1e-02   1.0e-01    7.0  1.4e-04  1.3e+01
L[9,3]           2.1e-02  4.9e-02  1.3e-01  -4.0e-01   6.9e-02   9.3e-02    7.1  1.4e-04  1.1e+01
L[9,4]          -3.3e-02  3.7e-02  9.8e-02  -2.5e-01  -4.1e-02   2.6e-01    7.1  1.4e-04  8.8e+00
L[9,5]          -6.6e-02  2.0e-02  5.4e-02  -1.1e-01  -8.7e-02   6.4e-02    7.3  1.5e-04  4.8e+00
L[9,6]           1.5e-01  3.4e-02  9.0e-02  -8.2e-02   1.9e-01   2.2e-01    7.1  1.4e-04  8.3e+00
L[9,7]           2.6e-01  5.6e-02  1.5e-01  -7.4e-02   3.4e-01   3.6e-01    7.0  1.4e-04  1.5e+01
L[9,8]           2.6e-01  4.7e-02  1.2e-01   3.5e-03   3.2e-01   3.4e-01    7.0  1.4e-04  1.4e+01
L[9,9]           6.7e-01  1.2e-01  3.2e-01   1.9e-03   8.3e-01   8.4e-01    7.0  1.4e-04  6.1e+01
muraw[1,1]      -6.7e-01  2.7e-01  8.9e-01  -1.9e+00  -8.3e-01   1.2e+00     11  2.2e-04  1.7e+00
muraw[1,2]       1.1e+00  1.5e-01  7.1e-01   1.7e-02   1.1e+00   2.2e+00     21  4.3e-04  1.2e+00
muraw[1,3]       3.0e-01  2.2e-01  7.5e-01  -1.6e+00   3.6e-01   1.3e+00     12  2.4e-04  1.5e+00
muraw[1,4]      -5.7e-01  2.3e-01  7.9e-01  -1.6e+00  -6.5e-01   1.6e+00     12  2.4e-04  1.6e+00
muraw[2,1]       1.5e-01  2.5e-01  8.2e-01  -1.6e+00   1.1e-01   1.9e+00     11  2.1e-04  1.7e+00
muraw[2,2]      -7.0e-01  2.9e-01  9.3e-01  -1.8e+00  -9.3e-01   2.0e+00     11  2.1e-04  1.7e+00
muraw[2,3]       2.7e-01  2.5e-01  8.4e-01  -1.9e+00   4.3e-01   1.4e+00     11  2.2e-04  1.7e+00
muraw[2,4]       7.5e-02  3.1e-01  9.5e-01  -2.0e+00   2.1e-01   1.5e+00    9.4  1.9e-04  2.0e+00
betaraw[1,1]    -5.0e-01  1.3e-01  7.9e-01  -1.6e+00  -5.2e-01   8.5e-01     35  7.2e-04  1.1e+00
betaraw[1,2]     2.0e-01  2.0e-01  8.7e-01  -1.1e+00   1.4e-01   1.9e+00     20  4.0e-04  1.2e+00
betaraw[1,3]     1.6e-01  2.5e-01  9.5e-01  -1.3e+00   2.2e-01   1.7e+00     15  3.0e-04  1.4e+00
betaraw[1,4]    -1.8e-03  2.0e-01  8.9e-01  -1.4e+00  -3.2e-02   1.8e+00     19  3.9e-04  1.3e+00
betaraw[1,5]     2.0e-01  3.1e-01  1.1e+00  -1.6e+00   2.5e-01   2.0e+00     13  2.5e-04  1.5e+00
betaraw[1,6]     6.2e-02  2.1e-01  9.2e-01  -1.5e+00   6.9e-02   1.6e+00     18  3.7e-04  1.3e+00
betaraw[1,7]    -2.7e-01  1.8e-01  7.9e-01  -1.5e+00  -2.7e-01   8.1e-01     20  4.1e-04  1.2e+00
betaraw[1,8]     3.5e-01  2.3e-01  8.9e-01  -1.3e+00   3.8e-01   1.8e+00     15  3.0e-04  1.4e+00
betaraw[1,9]    -5.4e-01  1.7e-01  7.8e-01  -1.6e+00  -5.8e-01   6.5e-01     21  4.3e-04  1.2e+00
betaraw[2,1]     7.6e-01  8.2e-02  7.7e-01  -5.7e-01   7.8e-01   2.0e+00     90  1.8e-03  1.0e+00
betaraw[2,2]    -3.3e-01  3.0e-01  1.1e+00  -1.9e+00  -5.3e-01   2.0e+00     13  2.6e-04  1.5e+00
betaraw[2,3]     1.6e-01  2.6e-01  9.9e-01  -1.3e+00  -2.4e-02   2.0e+00     15  3.0e-04  1.4e+00
betaraw[2,4]    -2.1e-01  3.0e-01  1.1e+00  -1.7e+00  -2.5e-01   1.9e+00     13  2.6e-04  1.5e+00
betaraw[2,5]    -4.4e-01  3.4e-01  1.2e+00  -2.2e+00  -5.4e-01   1.8e+00     12  2.4e-04  1.5e+00
betaraw[2,6]     7.8e-01  3.0e-01  1.1e+00  -1.9e+00   8.9e-01   2.3e+00     14  2.8e-04  1.4e+00
betaraw[2,7]     1.0e+00  1.2e-01  7.5e-01  -1.4e-01   1.1e+00   2.2e+00     41  8.3e-04  1.1e+00
betaraw[2,8]    -5.7e-02  3.1e-01  1.1e+00  -1.6e+00  -2.2e-01   2.0e+00     11  2.3e-04  1.6e+00
betaraw[2,9]    -1.8e-01  2.4e-01  9.3e-01  -1.7e+00  -2.0e-01   1.6e+00     16  3.1e-04  1.4e+00
sigma_beta       4.8e-01  2.2e-01  6.0e-01   1.3e-01   2.9e-01   2.5e+00    7.2  1.5e-04  6.4e+00
sigma_h          1.4e+00  2.7e-01  8.6e-01   4.4e-01   1.1e+00   3.7e+00    9.9  2.0e-04  1.9e+00
sigma[1]         1.8e+00  4.9e-01  1.3e+00   1.8e-01   1.4e+00   6.0e+00    7.0  1.4e-04  9.4e+01
sigma[2]         1.1e+00  1.0e-01  2.7e-01   3.3e-01   1.3e+00   1.3e+00    7.0  1.4e-04  2.2e+01
sigma[3]         1.5e+00  1.5e-01  4.0e-01   2.1e-01   1.7e+00   1.7e+00    7.0  1.4e-04  2.4e+01
sigma[4]         1.7e+00  3.0e-01  7.8e-01   2.8e-01   1.6e+00   4.2e+00    7.0  1.4e-04  4.9e+01
sigma[5]         1.5e+00  2.8e-01  7.4e-01   1.7e-01   1.6e+00   3.6e+00    7.0  1.4e-04  4.9e+01
sigma[6]         1.6e+00  1.8e-01  4.6e-01   7.1e-01   1.6e+00   3.1e+00    7.0  1.4e-04  3.1e+01
sigma[7]         1.6e+00  2.2e-01  5.8e-01   2.3e-01   1.7e+00   2.9e+00    7.0  1.4e-04  3.6e+01
sigma[8]         2.2e+00  6.3e-01  1.7e+00   1.4e+00   1.4e+00   6.5e+00    7.0  1.4e-04  1.2e+02
sigma[9]         1.3e+00  1.3e-01  3.4e-01   2.1e-01   1.4e+00   1.5e+00    7.0  1.4e-04  2.5e+01

Samples were drawn using hmc with nuts.
For each parameter, N_Eff is a crude measure of effective sample size,
and R_hat is the potential scale reduction factor on split chains (at 
convergence, R_hat=1).

12784.739731 seconds (31.45 M allocations: 1.389 GiB, 0.00% gc time)

This was somehow slower. Weird.

Chains 6 and 11 ran much faster than the others. Overall, convergence failed.

Thankfully, CmdStan makes it easy to analyze arbitrary sets of chains: we just call stansummary on whichever batches of sample files we want.

Let's look at the fast chains of the first run:
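As an aside on the interpolation used in the next cell: splicing a collection into a Julia command literal expands it into one word per element, each combined with the surrounding text. A small sketch, with hypothetical file names:

```julia
# Interpolating an array/tuple into backticks expands Cartesian-product
# style: each element is combined with the adjacent prefix and suffix.
ids = (2, 4, 8)
cmd = `stansummary samples_$[i for i ∈ ids].csv`
# cmd.exec == ["stansummary", "samples_2.csv", "samples_4.csv", "samples_8.csv"]
```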

In [11]:
fast1 = (2,4,8)
fast2 = (6,11)
# path to the stansummary executable
stansummary = "/home/chriselrod/Documents/languages/cmdstan/bin/stansummary"
# path to where the samples were saved.
resdir = "/home/chriselrod/Documents/progwork/julia/tmp"
run(`$stansummary $(resdir)/ITP_samples_$[i for i ∈ fast1].csv`)
Inference for Stan model: ITP_model
3 chains: each with iter=(2000,2000,2000); warmup=(0,0,0); thin=(1,1,1); 6000 iterations saved.

Warmup took (71, 103, 110) seconds, 4.7 minutes total
Sampling took (217, 194, 223) seconds, 11 minutes total

                    Mean     MCSE   StdDev        5%       50%       95%  N_Eff  N_Eff/s    R_hat
lp__            -1.1e+08  1.0e+08  1.3e+08  -3.0e+08  -3.5e+07  -1.4e+07    1.5  2.4e-03  5.2e+01
accept_stat__    8.9e-01  9.3e-02  2.5e-01   1.6e-01   1.0e+00   1.0e+00    7.2  1.1e-02  1.6e+00
stepsize__       1.1e-05  4.3e-06  5.3e-06   5.5e-06   8.9e-06   1.8e-05    1.5  2.4e-03  4.0e+14
treedepth__      3.0e+00  1.1e-01  1.1e+00   2.0e+00   3.0e+00   5.0e+00     96  1.5e-01  1.0e+00
n_leapfrog__     1.2e+01  3.5e-01  2.6e+01   3.0e+00   7.0e+00   3.1e+01   5438  8.6e+00  1.0e+00
divergent__      1.1e-02  1.9e-03  1.0e-01   0.0e+00   0.0e+00   0.0e+00   3014  4.7e+00  1.0e+00
energy__         1.1e+08  1.0e+08  1.3e+08   1.4e+07   3.5e+07   3.0e+08    1.5  2.4e-03  5.2e+01
muh[1]           9.0e-02  8.1e-01  9.9e-01  -1.1e+00   8.2e-02   1.3e+00    1.5  2.4e-03  4.5e+04
muh[2]           5.4e-01  8.7e-01  1.1e+00  -8.3e-01   6.8e-01   1.8e+00    1.5  2.4e-03  4.6e+04
rho             -9.5e-03  4.9e-03  7.2e-03  -2.2e-02  -8.5e-03  -4.2e-04    2.2  3.5e-03  3.8e+00
kappa[1]         2.8e-01  4.2e-02  5.1e-02   2.1e-01   3.2e-01   3.2e-01    1.5  2.4e-03  2.6e+04
kappa[2]         2.9e+00  1.8e+00  2.2e+00   2.3e-01   2.8e+00   5.7e+00    1.5  2.4e-03  2.7e+05
kappa[3]         3.0e+00  2.4e+00  2.9e+00   7.3e-01   1.1e+00   7.1e+00    1.5  2.4e-03  5.4e+05
kappa[4]         9.7e-01  1.8e-01  2.2e-01   6.7e-01   1.1e+00   1.2e+00    1.5  2.4e-03  7.0e+04
kappa[5]         6.2e-01  1.2e-01  1.5e-01   5.1e-01   5.3e-01   8.3e-01    1.5  2.4e-03  8.6e+04
kappa[6]         4.6e+00  1.8e+00  2.2e+00   1.7e+00   5.0e+00   7.2e+00    1.5  2.4e-03  1.7e+05
kappa[7]         5.7e-01  1.7e-01  2.1e-01   2.8e-01   6.7e-01   7.7e-01    1.5  2.4e-03  6.1e+04
kappa[8]         1.6e+00  1.7e+00  2.1e+00   1.5e-01   1.7e-01   4.6e+00    1.5  2.4e-03  4.1e+05
kappa[9]         7.1e-01  5.6e-01  6.8e-01   1.6e-01   3.0e-01   1.7e+00    1.5  2.4e-03  2.2e+05
theta[1]         9.9e-01  4.7e-02  5.8e-02   9.1e-01   1.0e+00   1.0e+00    1.5  2.4e-03  1.0e+04
theta[2]         6.3e-02  4.2e-01  5.2e-01  -6.7e-01   4.1e-01   4.5e-01    1.5  2.4e-03  6.0e+03
theta[3]         1.5e-02  6.7e-01  8.2e-01  -8.7e-01  -2.0e-01   1.1e+00    1.5  2.4e-03  1.7e+04
theta[4]         8.4e-01  4.8e-01  5.9e-01   1.6e-01   7.7e-01   1.6e+00    1.5  2.4e-03  2.0e+05
theta[5]         6.6e-01  9.2e-01  1.1e+00  -9.2e-01   1.3e+00   1.6e+00    1.5  2.4e-03  9.4e+04
theta[6]        -3.5e-01  8.8e-01  1.1e+00  -1.8e+00   8.5e-03   7.6e-01    1.5  2.4e-03  3.7e+04
theta[7]        -9.6e-01  3.7e-01  4.5e-01  -1.3e+00  -1.2e+00  -3.2e-01    1.5  2.4e-03  2.5e+04
theta[8]        -1.4e+00  4.0e-01  4.9e-01  -1.9e+00  -1.4e+00  -7.4e-01    1.5  2.4e-03  1.2e+04
theta[9]        -2.0e-02  1.1e+00  1.3e+00  -1.7e+00   1.7e-01   1.5e+00    1.5  2.4e-03  1.0e+05
L[1,1]           1.0e+00     -nan  1.0e-14   1.0e+00   1.0e+00   1.0e+00   -nan     -nan  1.0e+00
L[1,2]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,1]           2.8e-02  5.1e-01  6.2e-01  -6.7e-01  -9.1e-02   8.4e-01    1.5  2.4e-03  1.1e+04
L[2,2]           7.6e-01  1.5e-01  1.9e-01   5.4e-01   7.4e-01   1.0e+00    1.5  2.4e-03  3.5e+03
L[2,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,1]          -6.9e-01  7.6e-02  9.2e-02  -8.0e-01  -6.9e-01  -5.7e-01    1.5  2.4e-03  2.2e+03
L[3,2]           5.6e-01  1.1e-01  1.3e-01   3.9e-01   5.9e-01   7.1e-01    1.5  2.4e-03  1.4e+04
L[3,3]           4.2e-01  1.8e-02  2.2e-02   4.1e-01   4.1e-01   4.5e-01    1.5  2.4e-03  2.9e+02
L[3,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,1]           4.2e-01  5.3e-01  6.5e-01  -5.0e-01   8.7e-01   9.0e-01    1.5  2.4e-03  5.3e+04
L[4,2]          -5.3e-01  1.5e-01  1.9e-01  -8.0e-01  -4.1e-01  -3.8e-01    1.5  2.4e-03  9.9e+03
L[4,3]           7.3e-02  1.0e-01  1.2e-01  -1.0e-01   1.5e-01   1.7e-01    1.5  2.4e-03  1.4e+04
L[4,4]           2.3e-01  6.4e-02  7.8e-02   1.3e-01   2.3e-01   3.2e-01    1.5  2.4e-03  2.2e+03
L[4,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,1]          -2.4e-01  3.2e-01  3.9e-01  -7.8e-01  -6.4e-02   1.3e-01    1.5  2.4e-03  5.2e+04
L[5,2]          -2.4e-01  4.9e-01  6.0e-01  -9.1e-01  -3.6e-01   5.4e-01    1.5  2.4e-03  1.5e+04
L[5,3]          -2.4e-01  4.0e-01  4.9e-01  -8.4e-01  -2.4e-01   3.6e-01    1.5  2.4e-03  5.2e+04
L[5,4]          -1.4e-01  3.3e-02  4.1e-02  -1.9e-01  -1.4e-01  -9.1e-02    1.5  2.4e-03  2.7e+03
L[5,5]           2.1e-01  1.0e-01  1.2e-01   5.9e-02   2.0e-01   3.6e-01    1.5  2.4e-03  3.3e+03
L[5,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,1]           3.7e-01  2.1e-01  2.6e-01   8.5e-03   5.5e-01   5.6e-01    1.5  2.4e-03  5.0e+03
L[6,2]          -2.9e-01  3.0e-01  3.6e-01  -6.4e-01  -4.4e-01   2.1e-01    1.5  2.4e-03  5.9e+03
L[6,3]           2.5e-01  3.3e-01  4.0e-01  -3.1e-01   5.1e-01   5.5e-01    1.5  2.4e-03  1.2e+04
L[6,4]           4.1e-02  3.1e-01  3.9e-01  -3.9e-01  -3.0e-02   5.4e-01    1.5  2.4e-03  2.6e+04
L[6,5]           1.1e-01  2.7e-01  3.3e-01  -3.4e-01   2.2e-01   4.4e-01    1.5  2.4e-03  1.0e+04
L[6,6]           2.9e-01  4.6e-02  5.6e-02   2.2e-01   3.1e-01   3.5e-01    1.5  2.4e-03  6.2e+02
L[6,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,1]          -1.6e-02  5.5e-01  6.7e-01  -8.0e-01  -9.3e-02   8.4e-01    1.5  2.4e-03  1.7e+04
L[7,2]           3.1e-01  4.5e-01  5.5e-01  -4.0e-01   4.0e-01   9.4e-01    1.5  2.4e-03  2.2e+04
L[7,3]           2.0e-02  2.1e-01  2.5e-01  -2.8e-01   4.1e-03   3.4e-01    1.5  2.4e-03  1.1e+04
L[7,4]          -1.3e-01  8.5e-02  1.0e-01  -2.7e-01  -5.3e-02  -5.1e-02    1.5  2.4e-03  5.8e+03
L[7,5]          -3.5e-02  1.1e-01  1.3e-01  -2.2e-01   3.4e-02   7.7e-02    1.5  2.4e-03  5.8e+03
L[7,6]          -3.3e-03  3.6e-02  4.4e-02  -4.8e-02  -1.8e-02   5.6e-02    1.5  2.4e-03  2.5e+03
L[7,7]           1.9e-01  1.3e-02  1.6e-02   1.7e-01   1.9e-01   2.1e-01    1.5  2.4e-03  2.8e+02
L[7,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[8,1]          -2.0e-01  2.8e-01  3.4e-01  -5.9e-01  -2.4e-01   2.4e-01    1.5  2.4e-03  6.4e+03
L[8,2]           7.1e-01  1.6e-01  2.0e-01   4.2e-01   8.4e-01   8.6e-01    1.5  2.4e-03  4.6e+03
L[8,3]           3.9e-01  1.2e-01  1.5e-01   1.9e-01   4.5e-01   5.4e-01    1.5  2.4e-03  2.4e+03
L[8,4]           9.7e-02  1.4e-01  1.7e-01  -7.9e-02   4.2e-02   3.3e-01    1.5  2.4e-03  6.6e+03
L[8,5]           1.1e-01  4.9e-02  6.0e-02   3.4e-02   1.2e-01   1.8e-01    1.5  2.4e-03  1.5e+03
L[8,6]           9.1e-02  8.9e-02  1.1e-01  -3.4e-02   7.6e-02   2.3e-01    1.5  2.4e-03  7.5e+03
L[8,7]          -6.1e-02  7.8e-02  9.6e-02  -2.0e-01   8.6e-05   1.3e-02    1.5  2.4e-03  9.0e+03
L[8,8]           1.9e-01  5.0e-02  6.2e-02   9.9e-02   2.2e-01   2.4e-01    1.5  2.4e-03  8.8e+02
L[8,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[9,1]          -1.7e-01  6.2e-01  7.6e-01  -8.8e-01  -5.1e-01   8.9e-01    1.5  2.4e-03  1.1e+05
L[9,2]          -3.5e-01  1.2e-01  1.4e-01  -5.2e-01  -3.6e-01  -1.7e-01    1.5  2.4e-03  7.3e+03
L[9,3]           8.7e-02  3.3e-01  4.0e-01  -3.7e-01   3.0e-02   6.1e-01    1.5  2.4e-03  2.3e+04
L[9,4]           1.5e-01  6.3e-02  7.7e-02   4.1e-02   1.7e-01   2.2e-01    1.5  2.4e-03  2.2e+03
L[9,5]           3.6e-02  8.4e-02  1.0e-01  -8.2e-02   2.0e-02   1.7e-01    1.5  2.4e-03  2.0e+04
L[9,6]          -1.4e-02  1.2e-01  1.5e-01  -1.7e-01  -5.0e-02   1.8e-01    1.5  2.4e-03  1.5e+04
L[9,7]          -6.1e-02  2.8e-02  3.4e-02  -1.1e-01  -5.4e-02  -2.3e-02    1.5  2.4e-03  4.0e+03
L[9,8]           5.6e-02  2.2e-02  2.6e-02   2.5e-02   5.3e-02   8.9e-02    1.5  2.4e-03  1.9e+03
L[9,9]           6.2e-02  2.3e-02  2.9e-02   3.7e-02   4.8e-02   1.0e-01    1.5  2.4e-03  9.4e+02
muraw[1,1]      -7.0e-01  1.5e+00  1.8e+00  -2.1e+00  -1.9e+00   1.9e+00    1.5  2.4e-03  9.5e+04
muraw[1,2]       9.5e-01  2.8e-01  3.4e-01   5.0e-01   1.0e+00   1.3e+00    1.5  2.4e-03  4.0e+04
muraw[1,3]      -3.9e-01  1.3e+00  1.5e+00  -1.9e+00  -9.4e-01   1.7e+00    1.5  2.4e-03  2.0e+05
muraw[1,4]       1.2e-01  1.1e+00  1.3e+00  -8.2e-01  -7.8e-01   2.0e+00    1.5  2.4e-03  3.8e+05
muraw[2,1]      -3.5e-01  4.4e-01  5.4e-01  -1.0e+00  -3.3e-01   3.0e-01    1.5  2.4e-03  3.1e+04
muraw[2,2]       1.0e+00  1.2e-01  1.4e-01   8.6e-01   1.0e+00   1.2e+00    1.5  2.4e-03  1.0e+04
muraw[2,3]      -5.8e-01  6.5e-01  7.9e-01  -1.5e+00  -7.6e-01   4.6e-01    1.5  2.4e-03  1.0e+05
muraw[2,4]      -1.1e+00  6.2e-01  7.6e-01  -1.9e+00  -1.2e+00  -7.2e-02    1.5  2.4e-03  3.3e+04
betaraw[1,1]     4.9e-01  6.2e-01  7.7e-01  -5.9e-01   9.6e-01   1.1e+00    1.5  2.4e-03  2.0e+05
betaraw[1,2]     9.5e-01  5.6e-01  6.9e-01   1.0e-02   1.2e+00   1.6e+00    1.5  2.4e-03  3.5e+04
betaraw[1,3]    -2.8e-01  7.9e-01  9.6e-01  -1.2e+00  -7.0e-01   1.0e+00    1.5  2.4e-03  7.4e+04
betaraw[1,4]     4.5e-01  1.1e+00  1.3e+00  -1.3e+00   1.0e+00   1.7e+00    1.5  2.4e-03  3.9e+05
betaraw[1,5]     3.3e-01  5.4e-01  6.6e-01  -5.9e-01   6.5e-01   9.2e-01    1.5  2.4e-03  2.2e+05
betaraw[1,6]    -5.8e-01  1.0e+00  1.3e+00  -1.8e+00  -1.2e+00   1.2e+00    1.5  2.4e-03  1.8e+05
betaraw[1,7]     1.3e+00  5.4e-01  6.7e-01   3.6e-01   1.6e+00   1.9e+00    1.5  2.4e-03  1.9e+05
betaraw[1,8]    -3.2e-01  8.4e-01  1.0e+00  -1.8e+00   2.2e-01   5.7e-01    1.5  2.4e-03  2.8e+05
betaraw[1,9]    -2.2e-01  1.1e+00  1.4e+00  -1.9e+00  -3.2e-01   1.5e+00    1.5  2.4e-03  2.8e+05
betaraw[2,1]    -8.0e-01  3.6e-01  4.5e-01  -1.4e+00  -7.3e-01  -3.0e-01    1.5  2.4e-03  1.3e+05
betaraw[2,2]     5.8e-01  8.7e-01  1.1e+00  -6.4e-01   4.1e-01   2.0e+00    1.5  2.4e-03  6.3e+04
betaraw[2,3]     8.5e-02  7.4e-01  9.0e-01  -8.9e-01  -1.4e-01   1.3e+00    1.5  2.4e-03  1.1e+05
betaraw[2,4]    -4.7e-02  1.3e+00  1.6e+00  -1.9e+00  -1.6e-01   1.9e+00    1.5  2.4e-03  3.2e+05
betaraw[2,5]     3.0e-01  9.6e-01  1.2e+00  -1.3e+00   6.5e-01   1.5e+00    1.5  2.4e-03  2.4e+05
betaraw[2,6]    -3.5e-01  8.3e-01  1.0e+00  -1.8e+00   7.2e-02   6.4e-01    1.5  2.4e-03  1.5e+05
betaraw[2,7]    -6.0e-01  1.3e+00  1.6e+00  -2.0e+00  -1.5e+00   1.7e+00    1.5  2.4e-03  2.4e+05
betaraw[2,8]    -1.8e-01  1.1e+00  1.4e+00  -1.5e+00  -7.8e-01   1.7e+00    1.5  2.4e-03  1.8e+05
betaraw[2,9]    -5.9e-01  6.5e-01  7.9e-01  -1.3e+00  -9.7e-01   5.1e-01    1.5  2.4e-03  3.4e+05
sigma_beta       4.2e-01  8.5e-03  1.0e-02   4.0e-01   4.2e-01   4.3e-01    1.5  2.4e-03  9.0e+02
sigma_h          1.1e+00  5.7e-01  7.0e-01   1.9e-01   1.2e+00   1.9e+00    1.5  2.4e-03  1.7e+04
sigma[1]         3.4e+00  1.1e+00  1.3e+00   1.5e+00   4.3e+00   4.4e+00    1.5  2.4e-03  7.6e+03
sigma[2]         1.4e+00  1.1e+00  1.4e+00   4.0e-01   5.2e-01   3.4e+00    1.5  2.4e-03  1.8e+04
sigma[3]         1.3e+00  5.5e-01  6.7e-01   3.4e-01   1.7e+00   1.9e+00    1.5  2.4e-03  1.1e+04
sigma[4]         3.9e+00  3.7e-01  4.6e-01   3.3e+00   3.8e+00   4.4e+00    1.5  2.4e-03  5.4e+03
sigma[5]         3.1e+00  5.4e-01  6.7e-01   2.2e+00   3.6e+00   3.6e+00    1.5  2.4e-03  7.4e+03
sigma[6]         2.5e+00  2.5e+00  3.1e+00   2.7e-01   3.4e-01   6.9e+00    1.5  2.4e-03  4.4e+04
sigma[7]         8.7e-01  2.7e-01  3.3e-01   4.3e-01   9.7e-01   1.2e+00    1.5  2.4e-03  2.4e+03
sigma[8]         1.3e+00  8.5e-01  1.0e+00   2.0e-01   1.0e+00   2.7e+00    1.5  2.4e-03  1.5e+04
sigma[9]         2.3e+00  3.3e-01  4.1e-01   1.8e+00   2.3e+00   2.8e+00    1.5  2.4e-03  6.1e+03

Samples were drawn using hmc with nuts.
For each parameter, N_Eff is a crude measure of effective sample size,
and R_hat is the potential scale reduction factor on split chains (at 
convergence, R_hat=1).

Out[11]:
Process(`/home/chriselrod/Documents/languages/cmdstan/bin/stansummary /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_2.csv /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_4.csv /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_8.csv`, ProcessExited(0))

We see these chains had an average treedepth of only 2.9, taking only about 12 leapfrog steps per iteration. That is why they were fast.
They also failed to converge, with abysmal effective sample sizes.
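The treedepth and leapfrog counts are directly linked: NUTS doubles the trajectory at each depth, so a fully built tree of depth d contains 2^d − 1 leapfrog steps (fewer when a U-turn or divergence terminates the doubling early). A minimal sketch of that relationship:

```julia
# Leapfrog steps in a fully built NUTS tree of depth d: the trajectory
# doubles at each depth increment, for 2^d - 1 integration steps in total.
full_tree_leapfrog(d) = 2^d - 1

full_tree_leapfrog(2)  # 3, matching the low n_leapfrog__ counts of the fast chains
full_tree_leapfrog(8)  # 255, the cost of a depth-8 tree
```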

The slow chains:

In [12]:
run(`$stansummary $(resdir)/ITP_samples_$[i for i ∈ 1:14 if i ∉ fast1].csv`)
Inference for Stan model: ITP_model
11 chains: each with iter=(2000,2000,2000,2000,2000,2000,2000,2000,2000,2000,2000); warmup=(0,0,0,0,0,0,0,0,0,0,0); thin=(1,1,1,1,1,1,1,1,1,1,1); 22000 iterations saved.

Warmup took (2193, 2339, 1939, 2422, 2350, 2015, 2320, 1698, 2194, 2257, 2375) seconds, 6.7 hours total
Sampling took (5106, 3256, 4472, 3711, 3244, 4400, 3253, 3300, 3301, 3273, 4274) seconds, 12 hours total

                    Mean     MCSE   StdDev        5%       50%       95%  N_Eff  N_Eff/s    R_hat
lp__            -1.3e+04  9.4e-02  7.6e+00  -1.3e+04  -1.3e+04  -1.3e+04   6485  1.6e-01  1.0e+00
accept_stat__    9.9e-01  1.5e-03  1.9e-02   9.6e-01   9.9e-01   1.0e+00    155  3.7e-03  1.0e+00
stepsize__       1.5e-02  1.4e-03  3.3e-03   7.6e-03   1.5e-02   2.0e-02    5.5  1.3e-04  1.5e+14
treedepth__      8.2e+00     -nan  3.9e-01   8.0e+00   8.0e+00   9.0e+00   -nan     -nan  1.5e+00
n_leapfrog__     3.2e+02  3.7e+01  1.1e+02   2.6e+02   2.6e+02   5.1e+02    9.8  2.4e-04  1.5e+00
divergent__      0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
energy__         1.3e+04  1.3e-01  1.0e+01   1.3e+04   1.3e+04   1.3e+04   6319  1.5e-01  1.0e+00
muh[1]          -2.8e+00  7.3e-03  6.3e-01  -3.9e+00  -2.9e+00  -1.8e+00   7466  1.8e-01  1.0e+00
muh[2]           8.5e+00  7.8e-03  6.4e-01   7.5e+00   8.5e+00   9.5e+00   6774  1.6e-01  1.0e+00
rho              7.0e-01  4.1e-05  3.8e-03   6.9e-01   7.0e-01   7.1e-01   8682  2.1e-01  1.0e+00
kappa[1]         2.0e-01  6.9e-05  9.7e-03   1.8e-01   2.0e-01   2.1e-01  19932  4.8e-01  1.0e+00
kappa[2]         2.5e-01  6.4e-05  1.0e-02   2.3e-01   2.5e-01   2.7e-01  25498  6.1e-01  1.0e+00
kappa[3]         2.3e-01  9.8e-05  1.5e-02   2.1e-01   2.3e-01   2.6e-01  24207  5.8e-01  1.0e+00
kappa[4]         2.9e-01  1.1e-04  1.6e-02   2.6e-01   2.9e-01   3.2e-01  20407  4.9e-01  1.0e+00
kappa[5]         1.9e-01  9.7e-05  1.3e-02   1.7e-01   1.9e-01   2.1e-01  16951  4.1e-01  1.0e+00
kappa[6]         2.0e-01  7.3e-05  1.0e-02   1.8e-01   2.0e-01   2.2e-01  20186  4.9e-01  1.0e+00
kappa[7]         3.0e-01  8.5e-05  1.3e-02   2.8e-01   3.0e-01   3.3e-01  25047  6.0e-01  1.0e+00
kappa[8]         2.5e-01  8.5e-05  1.2e-02   2.3e-01   2.5e-01   2.7e-01  18646  4.5e-01  1.0e+00
kappa[9]         1.8e-01  7.5e-05  1.0e-02   1.6e-01   1.8e-01   2.0e-01  18349  4.4e-01  1.0e+00
theta[1]        -1.3e+00  5.2e-04  8.4e-02  -1.4e+00  -1.3e+00  -1.1e+00  25908  6.2e-01  1.0e+00
theta[2]        -2.5e+00  4.8e-04  7.7e-02  -2.6e+00  -2.5e+00  -2.4e+00  25617  6.2e-01  1.0e+00
theta[3]        -5.0e-01  6.6e-04  1.0e-01  -6.7e-01  -5.0e-01  -3.3e-01  23884  5.7e-01  1.0e+00
theta[4]         4.2e-01  6.8e-04  1.0e-01   2.5e-01   4.2e-01   5.8e-01  22168  5.3e-01  1.0e+00
theta[5]         5.7e-01  5.9e-04  9.4e-02   4.1e-01   5.7e-01   7.2e-01  25533  6.1e-01  1.0e+00
theta[6]        -8.0e-03  6.0e-04  9.0e-02  -1.6e-01  -7.9e-03   1.4e-01  22587  5.4e-01  1.0e+00
theta[7]         1.3e-01  6.6e-04  9.9e-02  -3.1e-02   1.3e-01   3.0e-01  22760  5.5e-01  1.0e+00
theta[8]        -9.6e-02  5.3e-04  8.4e-02  -2.3e-01  -9.6e-02   4.2e-02  24643  5.9e-01  1.0e+00
theta[9]        -1.1e+00  6.1e-04  8.7e-02  -1.3e+00  -1.1e+00  -9.7e-01  20612  5.0e-01  1.0e+00
L[1,1]           1.0e+00     -nan  3.9e-14   1.0e+00   1.0e+00   1.0e+00   -nan     -nan  1.0e+00
L[1,2]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,1]           9.1e-02  7.4e-05  1.3e-02   7.0e-02   9.2e-02   1.1e-01  31905  7.7e-01  1.0e+00
L[2,2]           1.0e+00  6.9e-06  1.2e-03   9.9e-01   1.0e+00   1.0e+00  31081  7.5e-01  1.0e+00
L[2,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,1]          -5.9e-02  7.4e-05  1.3e-02  -8.1e-02  -5.9e-02  -3.7e-02  32200  7.7e-01  1.0e+00
L[3,2]           2.5e-01  7.2e-05  1.2e-02   2.3e-01   2.5e-01   2.7e-01  29178  7.0e-01  1.0e+00
L[3,3]           9.7e-01  1.9e-05  3.2e-03   9.6e-01   9.7e-01   9.7e-01  29076  7.0e-01  1.0e+00
L[3,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,1]           1.9e-01  7.9e-05  1.3e-02   1.7e-01   1.9e-01   2.1e-01  26005  6.3e-01  1.0e+00
L[4,2]           1.9e-01  7.8e-05  1.2e-02   1.7e-01   1.9e-01   2.1e-01  25208  6.1e-01  1.0e+00
L[4,3]           2.8e-01  7.1e-05  1.2e-02   2.6e-01   2.8e-01   3.0e-01  26621  6.4e-01  1.0e+00
L[4,4]           9.2e-01  2.9e-05  4.7e-03   9.1e-01   9.2e-01   9.3e-01  26552  6.4e-01  1.0e+00
L[4,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,1]           7.0e-02  7.4e-05  1.3e-02   4.8e-02   7.0e-02   9.2e-02  31736  7.6e-01  1.0e+00
L[5,2]          -1.6e-01  7.3e-05  1.3e-02  -1.8e-01  -1.6e-01  -1.4e-01  30749  7.4e-01  1.0e+00
L[5,3]          -1.2e-01  7.2e-05  1.3e-02  -1.4e-01  -1.2e-01  -9.8e-02  30975  7.4e-01  1.0e+00
L[5,4]          -2.0e-01  6.7e-05  1.2e-02  -2.2e-01  -2.0e-01  -1.8e-01  33434  8.0e-01  1.0e+00
L[5,5]           9.6e-01  2.1e-05  3.7e-03   9.5e-01   9.6e-01   9.6e-01  31556  7.6e-01  1.0e+00
L[5,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,1]          -3.4e-01  7.2e-05  1.2e-02  -3.6e-01  -3.4e-01  -3.2e-01  26608  6.4e-01  1.0e+00
L[6,2]           1.1e-01  7.0e-05  1.2e-02   8.7e-02   1.1e-01   1.3e-01  30352  7.3e-01  1.0e+00
L[6,3]           1.1e-01  7.2e-05  1.2e-02   8.7e-02   1.1e-01   1.3e-01  28104  6.8e-01  1.0e+00
L[6,4]          -3.8e-01  5.9e-05  1.0e-02  -3.9e-01  -3.8e-01  -3.6e-01  31384  7.5e-01  1.0e+00
L[6,5]          -5.1e-02  6.2e-05  1.1e-02  -7.0e-02  -5.1e-02  -3.3e-02  32305  7.8e-01  1.0e+00
L[6,6]           8.5e-01  3.4e-05  5.9e-03   8.4e-01   8.5e-01   8.6e-01  30814  7.4e-01  1.0e+00
L[6,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,1]           6.1e-02  7.1e-05  1.3e-02   4.0e-02   6.1e-02   8.3e-02  33662  8.1e-01  1.0e+00
L[7,2]           1.5e-01  7.5e-05  1.3e-02   1.3e-01   1.5e-01   1.7e-01  29215  7.0e-01  1.0e+00
L[7,3]          -1.2e-01  7.5e-05  1.3e-02  -1.4e-01  -1.2e-01  -9.7e-02  29388  7.1e-01  1.0e+00
L[7,4]          -1.9e-01  6.9e-05  1.2e-02  -2.1e-01  -1.9e-01  -1.7e-01  31910  7.7e-01  1.0e+00
L[7,5]           1.4e-01  7.0e-05  1.2e-02   1.2e-01   1.4e-01   1.6e-01  31618  7.6e-01  1.0e+00
L[7,6]          -2.0e-01  6.7e-05  1.2e-02  -2.2e-01  -2.0e-01  -1.8e-01  32504  7.8e-01  1.0e+00
L[7,7]           9.3e-01  2.5e-05  4.5e-03   9.2e-01   9.3e-01   9.4e-01  32612  7.8e-01  1.0e+00
L[7,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[8,1]          -7.3e-02  7.6e-05  1.3e-02  -9.4e-02  -7.3e-02  -5.2e-02  29335  7.1e-01  1.0e+00
L[8,2]           1.1e-01  7.5e-05  1.3e-02   9.1e-02   1.1e-01   1.3e-01  30127  7.2e-01  1.0e+00
L[8,3]          -1.8e-01  6.9e-05  1.2e-02  -2.0e-01  -1.8e-01  -1.6e-01  32315  7.8e-01  1.0e+00
L[8,4]           2.0e-01  7.1e-05  1.2e-02   1.8e-01   2.0e-01   2.2e-01  30575  7.4e-01  1.0e+00
L[8,5]           5.6e-03  6.9e-05  1.3e-02  -1.5e-02   5.5e-03   2.6e-02  32520  7.8e-01  1.0e+00
L[8,6]           2.5e-02  7.0e-05  1.2e-02   4.5e-03   2.5e-02   4.5e-02  31607  7.6e-01  1.0e+00
L[8,7]           5.1e-02  6.8e-05  1.2e-02   3.0e-02   5.1e-02   7.1e-02  33702  8.1e-01  1.0e+00
L[8,8]           9.5e-01  2.3e-05  3.9e-03   9.4e-01   9.5e-01   9.6e-01  29594  7.1e-01  1.0e+00
L[8,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[9,1]          -1.6e-01  7.5e-05  1.3e-02  -1.8e-01  -1.6e-01  -1.4e-01  28778  6.9e-01  1.0e+00
L[9,2]           5.2e-02  7.6e-05  1.3e-02   3.1e-02   5.2e-02   7.4e-02  29505  7.1e-01  1.0e+00
L[9,3]           7.4e-02  7.6e-05  1.3e-02   5.2e-02   7.4e-02   9.5e-02  28883  6.9e-01  1.0e+00
L[9,4]          -4.3e-02  8.1e-05  1.3e-02  -6.4e-02  -4.3e-02  -2.1e-02  25821  6.2e-01  1.0e+00
L[9,5]          -9.1e-02  7.6e-05  1.3e-02  -1.1e-01  -9.1e-02  -7.0e-02  28966  7.0e-01  1.0e+00
L[9,6]           2.0e-01  7.3e-05  1.2e-02   1.8e-01   2.0e-01   2.2e-01  28186  6.8e-01  1.0e+00
L[9,7]           3.4e-01  6.5e-05  1.1e-02   3.2e-01   3.4e-01   3.6e-01  29774  7.2e-01  1.0e+00
L[9,8]           3.3e-01  5.7e-05  1.1e-02   3.1e-01   3.3e-01   3.4e-01  34514  8.3e-01  1.0e+00
L[9,9]           8.3e-01  3.4e-05  6.1e-03   8.2e-01   8.3e-01   8.4e-01  31461  7.6e-01  1.0e+00
muraw[1,1]      -9.6e-01  6.6e-03  6.1e-01  -2.0e+00  -9.5e-01   3.1e-02   8569  2.1e-01  1.0e+00
muraw[1,2]       1.2e+00  7.1e-03  6.6e-01   1.6e-01   1.2e+00   2.3e+00   8710  2.1e-01  1.0e+00
muraw[1,3]       4.3e-01  5.9e-03  5.6e-01  -4.7e-01   4.2e-01   1.4e+00   9164  2.2e-01  1.0e+00
muraw[1,4]      -7.2e-01  6.2e-03  5.8e-01  -1.7e+00  -7.2e-01   2.2e-01   8586  2.1e-01  1.0e+00
muraw[2,1]       1.7e-01  6.1e-03  5.5e-01  -7.2e-01   1.7e-01   1.1e+00   8089  1.9e-01  1.0e+00
muraw[2,2]      -8.6e-01  6.8e-03  6.1e-01  -1.9e+00  -8.5e-01   1.3e-01   7910  1.9e-01  1.0e+00
muraw[2,3]       5.0e-01  6.2e-03  5.7e-01  -4.2e-01   4.9e-01   1.5e+00   8324  2.0e-01  1.0e+00
muraw[2,4]       2.8e-01  6.1e-03  5.4e-01  -6.1e-01   2.7e-01   1.2e+00   7854  1.9e-01  1.0e+00
betaraw[1,1]    -3.4e-01  5.6e-03  8.0e-01  -1.7e+00  -3.4e-01   9.8e-01  20298  4.9e-01  1.0e+00
betaraw[1,2]     6.4e-02  5.6e-03  8.1e-01  -1.3e+00   5.7e-02   1.4e+00  20928  5.0e-01  1.0e+00
betaraw[1,3]     5.0e-01  5.5e-03  8.0e-01  -8.1e-01   4.9e-01   1.8e+00  21332  5.1e-01  1.0e+00
betaraw[1,4]    -1.3e-01  5.9e-03  8.1e-01  -1.5e+00  -1.2e-01   1.2e+00  18351  4.4e-01  1.0e+00
betaraw[1,5]     3.4e-01  5.5e-03  8.3e-01  -1.0e+00   3.5e-01   1.7e+00  22453  5.4e-01  1.0e+00
betaraw[1,6]    -2.2e-01  5.4e-03  8.3e-01  -1.6e+00  -2.2e-01   1.2e+00  24146  5.8e-01  1.0e+00
betaraw[1,7]    -3.6e-01  5.3e-03  7.3e-01  -1.6e+00  -3.6e-01   8.5e-01  19496  4.7e-01  1.0e+00
betaraw[1,8]     6.5e-01  5.6e-03  7.4e-01  -5.3e-01   6.3e-01   1.9e+00  17342  4.2e-01  1.0e+00
betaraw[1,9]    -5.1e-01  5.3e-03  7.3e-01  -1.7e+00  -5.0e-01   6.8e-01  19050  4.6e-01  1.0e+00
betaraw[2,1]     7.0e-01  5.4e-03  8.3e-01  -6.5e-01   6.9e-01   2.1e+00  23376  5.6e-01  1.0e+00
betaraw[2,2]    -6.3e-01  5.6e-03  8.3e-01  -2.0e+00  -6.3e-01   7.2e-01  22219  5.3e-01  1.0e+00
betaraw[2,3]    -8.7e-02  5.5e-03  8.3e-01  -1.4e+00  -8.8e-02   1.3e+00  22419  5.4e-01  1.0e+00
betaraw[2,4]    -1.4e-01  5.5e-03  8.3e-01  -1.5e+00  -1.4e-01   1.2e+00  22502  5.4e-01  1.0e+00
betaraw[2,5]    -8.5e-01  5.4e-03  8.8e-01  -2.3e+00  -8.4e-01   6.0e-01  26715  6.4e-01  1.0e+00
betaraw[2,6]     9.9e-01  5.4e-03  8.9e-01  -4.8e-01   9.9e-01   2.5e+00  26730  6.4e-01  1.0e+00
betaraw[2,7]     1.0e+00  5.3e-03  7.9e-01  -2.7e-01   1.0e+00   2.3e+00  22182  5.3e-01  1.0e+00
betaraw[2,8]    -4.7e-01  5.2e-03  7.7e-01  -1.7e+00  -4.6e-01   7.9e-01  21909  5.3e-01  1.0e+00
betaraw[2,9]    -4.7e-01  5.3e-03  8.0e-01  -1.8e+00  -4.6e-01   8.2e-01  22499  5.4e-01  1.0e+00
sigma_beta       2.7e-01  1.3e-03  1.1e-01   1.2e-01   2.5e-01   4.6e-01   7266  1.7e-01  1.0e+00
sigma_h          1.2e+00  6.3e-03  5.0e-01   6.4e-01   1.1e+00   2.1e+00   6255  1.5e-01  1.0e+00
sigma[1]         1.4e+00  1.3e-04  1.6e-02   1.4e+00   1.4e+00   1.5e+00  16038  3.9e-01  1.0e+00
sigma[2]         1.3e+00  1.1e-04  1.4e-02   1.2e+00   1.3e+00   1.3e+00  15378  3.7e-01  1.0e+00
sigma[3]         1.7e+00  1.6e-04  1.9e-02   1.7e+00   1.7e+00   1.7e+00  14548  3.5e-01  1.0e+00
sigma[4]         1.6e+00  1.6e-04  1.8e-02   1.6e+00   1.6e+00   1.7e+00  13011  3.1e-01  1.0e+00
sigma[5]         1.6e+00  1.4e-04  1.8e-02   1.6e+00   1.6e+00   1.6e+00  15007  3.6e-01  1.0e+00
sigma[6]         1.6e+00  1.5e-04  1.8e-02   1.5e+00   1.6e+00   1.6e+00  14243  3.4e-01  1.0e+00
sigma[7]         1.7e+00  1.5e-04  1.9e-02   1.6e+00   1.7e+00   1.7e+00  16252  3.9e-01  1.0e+00
sigma[8]         1.4e+00  1.2e-04  1.5e-02   1.4e+00   1.4e+00   1.4e+00  15554  3.7e-01  1.0e+00
sigma[9]         1.4e+00  1.3e-04  1.6e-02   1.4e+00   1.4e+00   1.5e+00  14343  3.4e-01  1.0e+00

Samples were drawn using hmc with nuts.
For each parameter, N_Eff is a crude measure of effective sample size,
and R_hat is the potential scale reduction factor on split chains (at 
convergence, R_hat=1).

Out[12]:
Process(`/home/chriselrod/Documents/languages/cmdstan/bin/stansummary /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_1.csv /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_3.csv /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_5.csv /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_6.csv /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_7.csv /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_9.csv /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_10.csv /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_11.csv /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_12.csv /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_13.csv /home/chriselrod/Documents/progwork/julia/tmp/ITP_samples_14.csv`, ProcessExited(0))

That’s better! These chains converged. Their treedepth averaged 8.2, with a mean of about 320 leapfrog steps per iteration.
Runtime is roughly proportional to the number of leapfrog steps.

The other noteworthy thing about adaptation is the difference in stepsize: the slow chains that converged took steps nearly 1000 times larger!
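As a rough sanity check that runtime tracks the leapfrog count, we can compare the two groups of chains. This is a back-of-the-envelope sketch, not part of the original analysis: the converged chains' sampling times are read off the summary above, while the ~210 s fast-chain time is taken from the second run's summary below as representative of the non-converged chains.

```julia
# Sampling times (seconds) for the 11 converged chains, from the summary above:
slow_times = [5106, 3256, 4472, 3711, 3244, 4400, 3253, 3300, 3301, 3273, 4274]
mean_slow = sum(slow_times) / length(slow_times)  # ≈ 3781 s

# Converged chains took ~320 leapfrog steps/iteration; non-converged took ~12,
# and sampled in roughly 210 s:
leapfrog_ratio = 320 / 12        # ≈ 27x the gradient evaluations
time_ratio = mean_slow / 210     # ≈ 18x the wall time; same order of magnitude,
                                 # as expected when cost per leapfrog step is fixed
```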

Similarly, let’s look at the second run. First, the fast chains:

In [13]:
run(`$stansummary $(resdir)/StanITP_samples_$[i for i ∈ fast2].csv`)
Inference for Stan model: StanITP_model
2 chains: each with iter=(2000,2000); warmup=(0,0); thin=(1,1); 4000 iterations saved.

Warmup took (90, 81) seconds, 2.9 minutes total
Sampling took (215, 208) seconds, 7.1 minutes total

                    Mean     MCSE   StdDev        5%       50%       95%  N_Eff  N_Eff/s    R_hat
lp__            -3.1e+08  2.0e+07  2.2e+07  -3.4e+08  -3.0e+08  -2.7e+08    1.3  3.0e-03  4.7e+00
accept_stat__    9.7e-01  1.9e-03  1.1e-01   8.6e-01   1.0e+00   1.0e+00   3189  7.5e+00  1.0e+00
stepsize__       6.5e-06  1.0e-07  1.0e-07   6.4e-06   6.6e-06   6.6e-06    1.0  2.4e-03  7.3e+12
treedepth__      2.8e+00  1.8e-02  1.1e+00   2.0e+00   2.0e+00   5.0e+00   3928  9.3e+00  1.0e+00
n_leapfrog__     1.1e+01  5.3e-01  3.1e+01   3.0e+00   3.0e+00   3.1e+01   3453  8.2e+00  1.0e+00
divergent__      3.8e-03  9.6e-04  6.1e-02   0.0e+00   0.0e+00   0.0e+00   4034  9.5e+00  1.0e+00
energy__         3.1e+08  2.0e+07  2.2e+07   2.7e+08   3.1e+08   3.4e+08    1.3  3.0e-03  4.7e+00
muh[1]          -5.6e-01  1.1e+00  1.1e+00  -1.6e+00   5.2e-01   5.2e-01    1.0  2.4e-03  1.8e+04
muh[2]          -7.5e-02  1.4e+00  1.4e+00  -1.4e+00   1.3e+00   1.3e+00    1.0  2.4e-03  2.1e+04
rho             -1.9e-02  3.9e-03  5.7e-03  -3.0e-02  -1.8e-02  -1.1e-02    2.1  4.9e-03  3.2e+00
kappa[1]         1.5e+00  1.2e+00  1.2e+00   3.6e-01   2.7e+00   2.7e+00    1.0  2.4e-03  8.3e+04
kappa[2]         3.8e+00  3.5e+00  3.5e+00   3.3e-01   7.3e+00   7.3e+00    1.0  2.4e-03  5.8e+05
kappa[3]         1.4e+00  5.0e-02  5.0e-02   1.4e+00   1.5e+00   1.5e+00    1.0  2.4e-03  4.2e+03
kappa[4]         3.3e-01  9.2e-02  9.2e-02   2.4e-01   4.2e-01   4.2e-01    1.0  2.4e-03  3.5e+04
kappa[5]         1.0e+00  8.7e-01  8.7e-01   1.8e-01   1.9e+00   1.9e+00    1.0  2.4e-03  9.5e+04
kappa[6]         6.5e-01  4.3e-01  4.3e-01   2.3e-01   1.1e+00   1.1e+00    1.0  2.4e-03  9.4e+04
kappa[7]         2.7e-01  2.6e-02  2.6e-02   2.4e-01   2.9e-01   2.9e-01    1.0  2.4e-03  2.2e+03
kappa[8]         3.3e+00  8.0e-01  8.0e-01   2.5e+00   4.1e+00   4.1e+00    1.0  2.4e-03  1.7e+05
kappa[9]         4.0e-01  1.9e-01  1.9e-01   2.1e-01   6.0e-01   6.0e-01    1.0  2.4e-03  4.3e+04
theta[1]         6.6e-01  2.9e-01  2.9e-01   3.7e-01   9.5e-01   9.5e-01    1.0  2.4e-03  6.2e+04
theta[2]         7.1e-03  1.5e+00  1.5e+00  -1.5e+00   1.5e+00   1.5e+00    1.0  2.4e-03  4.3e+04
theta[3]        -8.5e-01  7.7e-01  7.7e-01  -1.6e+00  -8.4e-02  -8.4e-02    1.0  2.4e-03  1.1e+05
theta[4]         2.1e-01  8.4e-01  8.4e-01  -6.3e-01   1.0e+00   1.0e+00    1.0  2.4e-03  5.4e+04
theta[5]        -6.5e-01  5.0e-01  5.0e-01  -1.1e+00  -1.5e-01  -1.5e-01    1.0  2.4e-03  1.2e+04
theta[6]         1.0e+00  8.8e-01  8.8e-01   1.6e-01   1.9e+00   1.9e+00    1.0  2.4e-03  2.9e+04
theta[7]        -5.3e-01  7.4e-01  7.4e-01  -1.3e+00   2.0e-01   2.0e-01    1.0  2.4e-03  2.4e+04
theta[8]         2.4e-01  1.1e+00  1.1e+00  -9.0e-01   1.4e+00   1.4e+00    1.0  2.4e-03  4.1e+05
theta[9]        -2.3e-01  1.3e+00  1.3e+00  -1.5e+00   1.0e+00   1.0e+00    1.0  2.4e-03  2.0e+04
L[1,1]           1.0e+00     -nan  6.7e-16   1.0e+00   1.0e+00   1.0e+00   -nan     -nan  1.0e+00
L[1,2]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,1]          -1.2e-02  8.1e-01  8.1e-01  -8.2e-01   8.0e-01   8.0e-01    1.0  2.4e-03  1.2e+04
L[2,2]           5.9e-01  1.7e-02  1.7e-02   5.7e-01   6.0e-01   6.0e-01    1.0  2.4e-03  1.8e+02
L[2,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,1]           1.0e-01  2.3e-01  2.3e-01  -1.3e-01   3.4e-01   3.4e-01    1.0  2.4e-03  1.0e+04
L[3,2]          -6.5e-02  5.8e-01  5.8e-01  -6.4e-01   5.1e-01   5.1e-01    1.0  2.4e-03  2.4e+04
L[3,3]           7.7e-01  1.8e-02  1.8e-02   7.6e-01   7.9e-01   7.9e-01    1.0  2.4e-03  1.9e+03
L[3,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,1]           4.6e-01  1.6e-01  1.6e-01   2.9e-01   6.2e-01   6.2e-01    1.0  2.4e-03  2.4e+03
L[4,2]           2.0e-01  5.5e-01  5.5e-01  -3.5e-01   7.5e-01   7.5e-01    1.0  2.4e-03  5.5e+03
L[4,3]           2.4e-01  2.7e-01  2.7e-01  -2.3e-02   5.1e-01   5.1e-01    1.0  2.4e-03  6.9e+03
L[4,4]           5.3e-01  5.3e-02  5.3e-02   4.8e-01   5.9e-01   5.9e-01    1.0  2.4e-03  4.3e+02
L[4,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,1]           8.5e-01  7.4e-02  7.4e-02   7.7e-01   9.2e-01   9.2e-01    1.0  2.4e-03  9.5e+02
L[5,2]           1.2e-01  1.8e-01  1.8e-01  -6.0e-02   3.1e-01   3.1e-01    1.0  2.4e-03  5.9e+03
L[5,3]          -6.8e-02  4.1e-01  4.1e-01  -4.8e-01   3.4e-01   3.4e-01    1.0  2.4e-03  5.2e+03
L[5,4]          -1.3e-02  1.1e-01  1.1e-01  -1.2e-01   9.2e-02   9.2e-02    1.0  2.4e-03  3.2e+03
L[5,5]           2.0e-01  6.8e-02  6.8e-02   1.3e-01   2.7e-01   2.7e-01    1.0  2.4e-03  6.2e+02
L[5,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,1]          -9.3e-01  1.5e-02  1.5e-02  -9.5e-01  -9.2e-01  -9.2e-01    1.0  2.4e-03  7.3e+02
L[6,2]          -9.4e-02  2.6e-01  2.6e-01  -3.5e-01   1.6e-01   1.6e-01    1.0  2.4e-03  1.4e+04
L[6,3]          -1.0e-01  1.4e-01  1.4e-01  -2.4e-01   3.9e-02   3.9e-02    1.0  2.4e-03  4.7e+03
L[6,4]          -1.6e-02  1.1e-01  1.1e-01  -1.2e-01   8.9e-02   8.9e-02    1.0  2.4e-03  4.4e+03
L[6,5]           2.2e-02  7.8e-02  7.8e-02  -5.6e-02   1.0e-01   1.0e-01    1.0  2.4e-03  2.2e+03
L[6,6]           7.9e-02  1.0e-02  1.0e-02   6.9e-02   8.9e-02   9.0e-02    1.0  2.4e-03  1.8e+02
L[6,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,1]          -4.0e-01  4.5e-01  4.5e-01  -8.5e-01   4.8e-02   4.8e-02    1.0  2.4e-03  1.3e+04
L[7,2]          -6.3e-01  2.8e-01  2.8e-01  -9.1e-01  -3.5e-01  -3.5e-01    1.0  2.4e-03  6.0e+03
L[7,3]          -3.8e-02  2.5e-01  2.5e-01  -2.9e-01   2.2e-01   2.2e-01    1.0  2.4e-03  7.6e+03
L[7,4]          -1.9e-01  8.7e-02  8.7e-02  -2.8e-01  -1.0e-01  -1.0e-01    1.0  2.4e-03  1.4e+03
L[7,5]           4.8e-02  1.3e-01  1.3e-01  -8.3e-02   1.8e-01   1.8e-01    1.0  2.4e-03  1.7e+03
L[7,6]           1.2e-01  1.7e-02  1.7e-02   1.1e-01   1.4e-01   1.4e-01    1.0  2.4e-03  2.5e+02
L[7,7]           1.3e-01  6.3e-02  6.4e-02   6.9e-02   2.0e-01   2.0e-01    1.0  2.4e-03  7.0e+02
L[7,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[8,1]          -6.9e-01  1.7e-01  1.7e-01  -8.6e-01  -5.2e-01  -5.2e-01    1.0  2.4e-03  3.6e+04
L[8,2]          -3.8e-02  4.9e-01  4.9e-01  -5.3e-01   4.5e-01   4.5e-01    1.0  2.4e-03  2.4e+04
L[8,3]          -3.5e-01  2.9e-01  2.9e-01  -6.4e-01  -5.6e-02  -5.6e-02    1.0  2.4e-03  2.9e+04
L[8,4]           8.3e-02  8.6e-02  8.6e-02  -2.7e-03   1.7e-01   1.7e-01    1.0  2.4e-03  2.0e+04
L[8,5]          -1.3e-01  4.7e-03  4.7e-03  -1.3e-01  -1.2e-01  -1.2e-01    1.0  2.4e-03  1.5e+03
L[8,6]           2.9e-02  6.6e-02  6.6e-02  -3.6e-02   9.5e-02   9.5e-02    1.0  2.4e-03  5.4e+03
L[8,7]          -7.2e-02  1.8e-02  1.8e-02  -9.1e-02  -5.4e-02  -5.4e-02    1.0  2.4e-03  1.7e+03
L[8,8]           4.8e-02  6.0e-03  6.0e-03   4.2e-02   5.4e-02   5.4e-02    1.0  2.4e-03  2.9e+02
L[8,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[9,1]           6.0e-03  8.6e-01  8.6e-01  -8.6e-01   8.7e-01   8.7e-01    1.0  2.4e-03  1.9e+04
L[9,2]          -1.5e-01  2.5e-01  2.5e-01  -4.0e-01   1.0e-01   1.0e-01    1.0  2.4e-03  9.6e+03
L[9,3]          -2.0e-01  2.0e-01  2.0e-01  -4.1e-01   8.7e-04   9.8e-04    1.0  2.4e-03  9.7e+03
L[9,4]           4.7e-03  2.5e-01  2.5e-01  -2.5e-01   2.6e-01   2.6e-01    1.0  2.4e-03  4.3e+03
L[9,5]           5.8e-02  5.6e-03  5.6e-03   5.2e-02   6.3e-02   6.4e-02    1.0  2.4e-03  1.6e+02
L[9,6]           4.8e-03  8.7e-02  8.7e-02  -8.2e-02   9.1e-02   9.1e-02    1.0  2.4e-03  2.1e+03
L[9,7]          -2.1e-02  5.3e-02  5.3e-02  -7.4e-02   3.2e-02   3.2e-02    1.0  2.4e-03  1.3e+03
L[9,8]           3.5e-02  3.1e-02  3.1e-02   3.4e-03   6.6e-02   6.6e-02    1.0  2.4e-03  8.8e+02
L[9,9]           7.6e-02  2.2e-02  2.2e-02   5.4e-02   9.7e-02   9.8e-02    1.0  2.4e-03  2.8e+02
muraw[1,1]       1.8e-01  1.0e+00  1.0e+00  -8.3e-01   1.2e+00   1.2e+00    1.0  2.4e-03  2.9e+04
muraw[1,2]       9.6e-01  9.5e-01  9.5e-01   1.7e-02   1.9e+00   1.9e+00    1.0  2.4e-03  9.4e+04
muraw[1,3]      -8.0e-01  7.8e-01  7.8e-01  -1.6e+00  -1.4e-02  -1.4e-02    1.0  2.4e-03  6.9e+04
muraw[1,4]       5.2e-01  1.1e+00  1.1e+00  -5.4e-01   1.6e+00   1.6e+00    1.0  2.4e-03  9.8e+03
muraw[2,1]       1.9e-01  1.8e+00  1.8e+00  -1.6e+00   1.9e+00   1.9e+00    1.0  2.4e-03  3.4e+05
muraw[2,2]       4.4e-01  1.5e+00  1.5e+00  -1.1e+00   2.0e+00   2.0e+00    1.0  2.4e-03  1.6e+05
muraw[2,3]      -5.5e-01  1.3e+00  1.3e+00  -1.9e+00   7.9e-01   7.9e-01    1.0  2.4e-03  2.9e+04
muraw[2,4]       2.1e-02  1.5e+00  1.5e+00  -1.5e+00   1.5e+00   1.5e+00    1.0  2.4e-03  8.7e+04
betaraw[1,1]    -7.5e-01  2.2e-01  2.2e-01  -9.7e-01  -5.2e-01  -5.2e-01    1.0  2.4e-03  2.0e+04
betaraw[1,2]     1.4e-01  6.0e-01  6.0e-01  -4.6e-01   7.4e-01   7.4e-01    1.0  2.4e-03  1.1e+04
betaraw[1,3]    -9.4e-01  1.2e-01  1.2e-01  -1.1e+00  -8.3e-01  -8.3e-01    1.0  2.4e-03  6.6e+03
betaraw[1,4]     1.0e+00  7.7e-01  7.7e-01   2.6e-01   1.8e+00   1.8e+00    1.0  2.4e-03  1.7e+05
betaraw[1,5]     1.9e-01  1.8e+00  1.8e+00  -1.6e+00   2.0e+00   2.0e+00    1.0  2.4e-03  5.7e+04
betaraw[1,6]     1.1e+00  5.2e-01  5.2e-01   6.0e-01   1.6e+00   1.6e+00    1.0  2.4e-03  1.7e+04
betaraw[1,7]    -3.5e-01  9.3e-01  9.3e-01  -1.3e+00   5.8e-01   5.8e-01    1.0  2.4e-03  1.1e+04
betaraw[1,8]    -7.1e-01  5.5e-01  5.5e-01  -1.3e+00  -1.5e-01  -1.5e-01    1.0  2.4e-03  1.3e+05
betaraw[1,9]    -1.4e+00  5.1e-02  5.1e-02  -1.4e+00  -1.3e+00  -1.3e+00    1.0  2.4e-03  1.3e+03
betaraw[2,1]     1.2e+00  3.1e-01  3.1e-01   9.2e-01   1.5e+00   1.5e+00    1.0  2.4e-03  1.4e+05
betaraw[2,2]     1.9e-01  9.3e-01  9.3e-01  -7.5e-01   1.1e+00   1.1e+00    1.0  2.4e-03  1.5e+05
betaraw[2,3]     1.8e+00  1.6e-01  1.6e-01   1.7e+00   2.0e+00   2.0e+00    1.0  2.4e-03  2.5e+04
betaraw[2,4]    -1.6e+00  1.5e-01  1.5e-01  -1.7e+00  -1.5e+00  -1.5e+00    1.0  2.4e-03  9.0e+03
betaraw[2,5]     1.7e+00  3.4e-02  3.4e-02   1.7e+00   1.8e+00   1.8e+00    1.0  2.4e-03  7.6e+02
betaraw[2,6]    -8.6e-01  1.0e+00  1.0e+00  -1.9e+00   1.6e-01   1.6e-01    1.0  2.4e-03  1.1e+05
betaraw[2,7]     1.6e+00  1.7e-02  1.7e-02   1.6e+00   1.6e+00   1.6e+00    1.0  2.4e-03  1.3e+03
betaraw[2,8]     1.6e+00  3.7e-01  3.7e-01   1.2e+00   2.0e+00   2.0e+00    1.0  2.4e-03  1.2e+05
betaraw[2,9]     1.1e+00  4.6e-01  4.6e-01   6.7e-01   1.6e+00   1.6e+00    1.0  2.4e-03  1.6e+04
sigma_beta       1.8e+00  6.8e-01  6.8e-01   1.1e+00   2.5e+00   2.5e+00    1.0  2.4e-03  1.5e+03
sigma_h          1.3e+00  8.7e-01  8.7e-01   4.4e-01   2.2e+00   2.2e+00    1.0  2.4e-03  1.7e+03
sigma[1]         4.6e+00  1.5e+00  1.5e+00   3.1e+00   6.0e+00   6.0e+00    1.0  2.4e-03  1.3e+04
sigma[2]         5.9e-01  2.5e-01  2.5e-01   3.3e-01   8.4e-01   8.4e-01    1.0  2.4e-03  1.9e+03
sigma[3]         9.6e-01  7.5e-01  7.5e-01   2.1e-01   1.7e+00   1.7e+00    1.0  2.4e-03  1.3e+04
sigma[4]         2.3e+00  2.0e+00  2.0e+00   2.8e-01   4.2e+00   4.2e+00    1.0  2.4e-03  3.9e+04
sigma[5]         2.8e-01  1.2e-01  1.2e-01   1.7e-01   4.0e-01   4.0e-01    1.0  2.4e-03  1.5e+03
sigma[6]         1.9e+00  1.2e+00  1.2e+00   7.1e-01   3.1e+00   3.1e+00    1.0  2.4e-03  5.2e+03
sigma[7]         3.7e-01  1.4e-01  1.4e-01   2.3e-01   5.1e-01   5.1e-01    1.0  2.4e-03  1.4e+03
sigma[8]         5.9e+00  6.3e-01  6.3e-01   5.3e+00   6.5e+00   6.5e+00    1.0  2.4e-03  6.8e+04
sigma[9]         5.3e-01  3.3e-01  3.3e-01   2.1e-01   8.6e-01   8.6e-01    1.0  2.4e-03  4.4e+03

Samples were drawn using hmc with nuts.
For each parameter, N_Eff is a crude measure of effective sample size,
and R_hat is the potential scale reduction factor on split chains (at 
convergence, R_hat=1).

Out[13]:
Process(`/home/chriselrod/Documents/languages/cmdstan/bin/stansummary /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_6.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_11.csv`, ProcessExited(0))

We have an average of just 12 leapfrog steps per iteration and a mean treedepth of 2.8, alongside total convergence failure.
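Those per-iteration figures come from the `treedepth__` and `n_leapfrog__` columns that CmdStan writes into each sample CSV. A minimal sketch of pulling them out, assuming the standard CmdStan CSV layout with `#`-prefixed comment lines (the function and file name are hypothetical):

```julia
using DelimitedFiles, Statistics

# Read a CmdStan sample CSV and return (mean treedepth, mean leapfrog steps).
# CmdStan interleaves '#'-comment lines with the data, so we filter them first.
function sampler_diagnostics(path)
    lines = filter(l -> !startswith(l, '#') && !isempty(l), readlines(path))
    header = split(lines[1], ',')
    vals = [parse(Float64, v) for l in lines[2:end] for v in split(l, ',')]
    data = permutedims(reshape(vals, length(header), :))  # iterations × columns
    td = findfirst(==("treedepth__"), header)
    lf = findfirst(==("n_leapfrog__"), header)
    mean(data[:, td]), mean(data[:, lf])
end

# e.g. sampler_diagnostics("StanITP_samples_6.csv")
```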

The slow chains:

In [14]:
run(`$stansummary $(resdir)/StanITP_samples_$[i for i ∈ 1:14 if i ∉ fast2].csv`)
Inference for Stan model: StanITP_model
12 chains: each with iter=(2000,2000,2000,2000,2000,2000,2000,2000,2000,2000,2000,2000); warmup=(0,0,0,0,0,0,0,0,0,0,0,0); thin=(1,1,1,1,1,1,1,1,1,1,1,1); 24000 iterations saved.

Warmup took (2330, 2379, 1991, 2125, 2418, 2099, 2657, 2132, 2493, 2164, 2388, 6087) seconds, 8.7 hours total
Sampling took (3621, 3576, 3647, 3660, 3546, 4918, 4254, 3662, 3592, 3637, 4254, 6592) seconds, 14 hours total

                    Mean     MCSE   StdDev        5%       50%       95%  N_Eff  N_Eff/s    R_hat
lp__            -1.7e+11     -nan  5.7e+11  -2.1e+12  -1.3e+04  -1.3e+04   -nan     -nan  8.1e+10
accept_stat__    9.8e-01  2.8e-03  2.7e-02   9.4e-01   9.9e-01   1.0e+00     95  1.9e-03  1.0e+00
stepsize__       1.4e-02  2.0e-03  4.9e-03   6.5e-15   1.6e-02   2.0e-02    6.0  1.2e-04  2.2e+14
treedepth__      8.2e+00     -nan  5.9e-01   8.0e+00   8.0e+00   1.0e+01   -nan     -nan  2.8e+00
n_leapfrog__     3.5e+02     -nan  2.2e+02   2.6e+02   2.6e+02   1.0e+03   -nan     -nan  3.4e+00
divergent__      0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
energy__         1.7e+11     -nan  5.7e+11   1.3e+04   1.3e+04   2.1e+12   -nan     -nan  6.0e+10
muh[1]          -2.5e+00  5.1e-01  1.4e+00  -3.8e+00  -2.8e+00   1.7e+00    7.4  1.5e-04  2.3e+00
muh[2]           7.9e+00  8.8e-01  2.2e+00   7.8e-01   8.5e+00   9.5e+00    6.5  1.3e-04  3.6e+00
rho              6.4e-01  8.0e-02  2.0e-01  -1.2e-02   7.0e-01   7.1e-01    6.0  1.2e-04  5.6e+01
kappa[1]         6.3e-01  5.8e-01  1.4e+00   1.8e-01   2.0e-01   5.4e+00    6.0  1.2e-04  1.6e+02
kappa[2]         4.7e-01  2.9e-01  7.2e-01   2.3e-01   2.5e-01   2.8e+00    6.0  1.2e-04  7.4e+01
kappa[3]         2.4e-01  9.6e-03  2.7e-02   2.1e-01   2.3e-01   3.1e-01    8.3  1.7e-04  1.9e+00
kappa[4]         7.5e-01  6.2e-01  1.5e+00   2.6e-01   2.9e-01   5.8e+00    6.0  1.2e-04  1.0e+02
kappa[5]         2.1e-01  3.7e-02  9.1e-02   1.7e-01   1.9e-01   5.1e-01    6.1  1.2e-04  7.7e+00
kappa[6]         2.1e-01     -nan  3.1e-02   1.8e-01   2.0e-01   3.0e-01   -nan     -nan  3.2e+00
kappa[7]         3.2e-01  1.7e-02  4.3e-02   2.8e-01   3.1e-01   4.5e-01    6.5  1.3e-04  3.4e+00
kappa[8]         3.0e-01  7.3e-02  1.8e-01   2.3e-01   2.5e-01   9.0e-01    6.0  1.2e-04  1.6e+01
kappa[9]         3.4e-01  2.1e-01  5.2e-01   1.7e-01   1.8e-01   2.1e+00    6.0  1.2e-04  5.6e+01
theta[1]        -1.3e+00  4.1e-02  1.3e-01  -1.6e+00  -1.3e+00  -1.1e+00    9.9  2.0e-04  1.6e+00
theta[2]        -2.2e+00  4.2e-01  1.0e+00  -2.6e+00  -2.5e+00   1.2e+00    6.0  1.2e-04  1.4e+01
theta[3]        -4.9e-01  1.6e-02  1.0e-01  -6.6e-01  -4.9e-01  -3.4e-01     42  8.6e-04  1.1e+00
theta[4]         4.4e-01  2.7e-02  1.2e-01   2.6e-01   4.3e-01   6.6e-01     19  3.8e-04  1.2e+00
theta[5]         5.8e-01  1.7e-02  9.8e-02   4.2e-01   5.8e-01   7.2e-01     35  7.1e-04  1.1e+00
theta[6]        -1.9e-02  1.6e-02  9.6e-02  -1.5e-01  -1.7e-02   1.4e-01     36  7.4e-04  1.1e+00
theta[7]        -3.4e-02  2.3e-01  5.6e-01  -1.9e+00   1.2e-01   2.9e-01    6.2  1.3e-04  6.0e+00
theta[8]        -9.3e-02  2.7e-03  8.2e-02  -2.3e-01  -8.6e-02   3.9e-02    935  1.9e-02  1.0e+00
theta[9]        -1.1e+00  9.5e-03  8.7e-02  -1.3e+00  -1.1e+00  -9.7e-01     84  1.7e-03  1.0e+00
L[1,1]           1.0e+00     -nan  4.4e-14   1.0e+00   1.0e+00   1.0e+00   -nan     -nan  1.0e+00
L[1,2]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,1]           1.5e-01  8.1e-02  2.0e-01   7.0e-02   9.3e-02   8.1e-01    6.0  1.2e-04  1.6e+01
L[2,2]           9.6e-01  4.6e-02  1.1e-01   5.9e-01   1.0e+00   1.0e+00    6.0  1.2e-04  9.9e+01
L[2,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,1]          -1.2e-01  8.7e-02  2.1e-01  -8.3e-01  -6.0e-02  -3.8e-02    6.0  1.2e-04  1.7e+01
L[3,2]           1.8e-01  8.5e-02  2.1e-01  -5.1e-01   2.5e-01   2.7e-01    6.0  1.2e-04  1.8e+01
L[3,3]           9.0e-01  8.4e-02  2.1e-01   2.2e-01   9.7e-01   9.7e-01    6.0  1.2e-04  6.8e+01
L[3,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,1]           2.5e-01  7.4e-02  1.8e-01   1.7e-01   1.9e-01   8.5e-01    6.0  1.2e-04  1.5e+01
L[4,2]           2.1e-01  3.3e-02  8.2e-02   1.7e-01   1.9e-01   4.8e-01    6.1  1.3e-04  7.1e+00
L[4,3]           2.7e-01  1.3e-02  3.3e-02   1.7e-01   2.8e-01   3.0e-01    6.7  1.4e-04  3.0e+00
L[4,4]           8.6e-01  8.7e-02  2.1e-01   1.5e-01   9.2e-01   9.3e-01    6.0  1.2e-04  4.8e+01
L[4,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,1]           1.4e-01  1.0e-01  2.5e-01   4.9e-02   7.1e-02   9.6e-01    6.0  1.2e-04  2.0e+01
L[5,2]          -1.3e-01  3.9e-02  9.7e-02  -1.8e-01  -1.6e-01   1.9e-01    6.1  1.2e-04  8.1e+00
L[5,3]          -1.1e-01  1.5e-02  3.9e-02  -1.4e-01  -1.2e-01   1.4e-02    6.6  1.4e-04  3.2e+00
L[5,4]          -1.7e-01  4.5e-02  1.1e-01  -2.2e-01  -2.0e-01   2.0e-01    6.1  1.2e-04  9.6e+00
L[5,5]           8.8e-01  9.8e-02  2.4e-01   8.9e-02   9.6e-01   9.6e-01    6.0  1.2e-04  7.0e+01
L[5,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,1]          -3.8e-01  5.6e-02  1.4e-01  -8.4e-01  -3.4e-01  -3.2e-01    6.0  1.2e-04  1.3e+01
L[6,2]           6.0e-02  6.4e-02  1.6e-01  -4.6e-01   1.1e-01   1.3e-01    6.0  1.2e-04  1.4e+01
L[6,3]           8.2e-02  3.4e-02  8.5e-02  -2.0e-01   1.1e-01   1.3e-01    6.1  1.2e-04  7.4e+00
L[6,4]          -3.5e-01  3.4e-02  8.4e-02  -3.9e-01  -3.7e-01  -7.2e-02    6.1  1.2e-04  8.5e+00
L[6,5]          -4.3e-02  1.2e-02  3.0e-02  -6.9e-02  -5.0e-02   5.1e-02    6.8  1.4e-04  2.9e+00
L[6,6]           7.9e-01  7.1e-02  1.7e-01   2.2e-01   8.5e-01   8.6e-01    6.0  1.2e-04  3.1e+01
L[6,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,1]          -1.3e-02  1.0e-01  2.5e-01  -8.3e-01   5.9e-02   8.2e-02    6.0  1.2e-04  2.0e+01
L[7,2]           1.2e-01  4.2e-02  1.0e-01  -2.2e-01   1.5e-01   1.7e-01    6.1  1.2e-04  8.7e+00
L[7,3]          -7.6e-02  5.7e-02  1.4e-01  -1.4e-01  -1.2e-01   3.9e-01    6.0  1.2e-04  1.2e+01
L[7,4]          -2.0e-01  1.4e-02  3.7e-02  -3.2e-01  -1.9e-01  -1.7e-01    6.7  1.4e-04  3.2e+00
L[7,5]           1.2e-01  2.1e-02  5.4e-02  -5.2e-02   1.4e-01   1.6e-01    6.3  1.3e-04  4.6e+00
L[7,6]          -1.8e-01  2.9e-02  7.2e-02  -2.2e-01  -2.0e-01   5.8e-02    6.1  1.3e-04  6.4e+00
L[7,7]           8.5e-01  1.0e-01  2.5e-01   3.9e-02   9.3e-01   9.4e-01    6.0  1.2e-04  5.7e+01
L[7,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[8,1]          -1.5e-01  9.8e-02  2.4e-01  -9.4e-01  -7.5e-02  -5.2e-02    6.0  1.2e-04  2.0e+01
L[8,2]           7.9e-02  4.6e-02  1.1e-01  -2.9e-01   1.1e-01   1.3e-01    6.1  1.2e-04  9.3e+00
L[8,3]          -1.6e-01  3.2e-02  7.9e-02  -2.0e-01  -1.8e-01   9.9e-02    6.1  1.3e-04  6.6e+00
L[8,4]           1.9e-01  2.0e-02  4.9e-02   3.1e-02   2.0e-01   2.2e-01    6.3  1.3e-04  4.3e+00
L[8,5]          -4.3e-03  1.3e-02  3.5e-02  -1.1e-01   4.0e-03   2.6e-02    6.8  1.4e-04  2.9e+00
L[8,6]           1.8e-02  9.3e-03  2.6e-02  -5.7e-02   2.3e-02   4.5e-02    7.6  1.6e-04  2.2e+00
L[8,7]           4.5e-02  7.7e-03  2.2e-02  -1.7e-02   4.9e-02   7.1e-02    8.3  1.7e-04  1.9e+00
L[8,8]           8.7e-01  1.1e-01  2.6e-01   1.1e-02   9.5e-01   9.6e-01    6.0  1.2e-04  7.1e+01
L[8,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[9,1]          -6.9e-02  1.2e-01  3.0e-01  -1.8e-01  -1.6e-01   9.3e-01    6.0  1.2e-04  2.5e+01
L[9,2]           2.0e-02  4.4e-02  1.1e-01  -3.4e-01   5.1e-02   7.3e-02    6.1  1.2e-04  8.9e+00
L[9,3]           5.8e-02  2.1e-02  5.3e-02  -1.1e-01   7.2e-02   9.5e-02    6.3  1.3e-04  4.4e+00
L[9,4]          -4.0e-02  3.8e-03  1.5e-02  -6.3e-02  -4.1e-02  -8.8e-03     16  3.3e-04  1.3e+00
L[9,5]          -8.6e-02  6.8e-03  2.1e-02  -1.1e-01  -9.0e-02  -3.2e-02    9.2  1.9e-04  1.7e+00
L[9,6]           1.8e-01  2.5e-02  6.2e-02  -2.2e-02   2.0e-01   2.2e-01    6.2  1.3e-04  5.3e+00
L[9,7]           3.1e-01  3.9e-02  9.6e-02  -7.3e-03   3.4e-01   3.6e-01    6.1  1.2e-04  9.2e+00
L[9,8]           3.0e-01  3.6e-02  8.9e-02   5.6e-03   3.2e-01   3.4e-01    6.1  1.2e-04  9.0e+00
L[9,9]           7.6e-01  9.4e-02  2.3e-01   1.9e-03   8.3e-01   8.4e-01    6.0  1.2e-04  4.1e+01
muraw[1,1]      -8.1e-01  2.2e-01  7.8e-01  -1.9e+00  -8.9e-01   9.3e-01     13  2.7e-04  1.4e+00
muraw[1,2]       1.2e+00  7.5e-02  6.6e-01   2.0e-01   1.1e+00   2.3e+00     78  1.6e-03  1.1e+00
muraw[1,3]       4.9e-01  7.3e-02  5.6e-01  -4.5e-01   4.8e-01   1.3e+00     60  1.2e-03  1.1e+00
muraw[1,4]      -7.5e-01  2.3e-02  5.5e-01  -1.6e+00  -7.8e-01   1.7e-01    572  1.2e-02  1.0e+00
muraw[2,1]       1.4e-01  1.1e-02  5.3e-01  -7.0e-01   1.1e-01   1.0e+00   2431  5.0e-02  1.0e+00
muraw[2,2]      -9.0e-01  1.6e-02  6.0e-01  -1.9e+00  -9.3e-01   9.9e-02   1407  2.9e-02  1.0e+00
muraw[2,3]       4.1e-01  1.2e-01  6.2e-01  -5.7e-01   4.3e-01   1.4e+00     28  5.7e-04  1.1e+00
muraw[2,4]       8.4e-02  2.6e-01  8.2e-01  -2.0e+00   2.1e-01   1.1e+00   10.0  2.0e-04  1.6e+00
betaraw[1,1]    -4.6e-01  1.4e-01  8.4e-01  -1.6e+00  -4.4e-01   9.1e-01     34  7.0e-04  1.1e+00
betaraw[1,2]     2.1e-01  2.1e-01  9.1e-01  -1.2e+00   1.4e-01   1.9e+00     19  4.0e-04  1.2e+00
betaraw[1,3]     3.4e-01  2.1e-01  9.1e-01  -1.3e+00   4.1e-01   1.8e+00     20  4.0e-04  1.2e+00
betaraw[1,4]    -1.7e-01  4.6e-02  7.8e-01  -1.4e+00  -2.2e-01   1.1e+00    290  5.9e-03  1.0e+00
betaraw[1,5]     2.0e-01  1.9e-01  9.1e-01  -1.3e+00   2.5e-01   1.6e+00     24  4.8e-04  1.2e+00
betaraw[1,6]    -1.1e-01  1.3e-01  8.5e-01  -1.5e+00  -1.2e-01   1.1e+00     43  8.9e-04  1.1e+00
betaraw[1,7]    -2.6e-01  1.3e-01  7.7e-01  -1.5e+00  -2.7e-01   8.1e-01     35  7.2e-04  1.1e+00
betaraw[1,8]     5.3e-01  1.6e-01  8.1e-01  -7.8e-01   5.5e-01   1.8e+00     25  5.1e-04  1.2e+00
betaraw[1,9]    -4.1e-01  1.3e-01  7.6e-01  -1.7e+00  -4.2e-01   6.5e-01     35  7.1e-04  1.1e+00
betaraw[2,1]     6.8e-01  4.9e-03  8.0e-01  -6.3e-01   6.0e-01   2.1e+00  26346  5.4e-01  1.0e+00
betaraw[2,2]    -4.2e-01  3.0e-01  1.1e+00  -2.0e+00  -5.3e-01   2.0e+00     13  2.6e-04  1.4e+00
betaraw[2,3]    -1.1e-01  4.8e-03  7.8e-01  -1.4e+00  -1.8e-01   1.2e+00  25849  5.3e-01  1.0e+00
betaraw[2,4]     2.1e-02  2.4e-01  9.7e-01  -1.4e+00  -6.6e-02   1.9e+00     17  3.4e-04  1.2e+00
betaraw[2,5]    -8.1e-01  3.1e-02  8.5e-01  -2.3e+00  -7.4e-01   5.4e-01    772  1.6e-02  1.0e+00
betaraw[2,6]     1.1e+00  6.6e-02  8.7e-01  -4.2e-01   1.1e+00   2.4e+00    172  3.5e-03  1.0e+00
betaraw[2,7]     9.4e-01  9.4e-02  7.8e-01  -2.0e-01   9.2e-01   2.3e+00     68  1.4e-03  1.1e+00
betaraw[2,8]    -3.3e-01  2.0e-01  8.7e-01  -1.7e+00  -3.9e-01   1.3e+00     19  3.9e-04  1.2e+00
betaraw[2,9]    -4.0e-01  1.1e-01  7.9e-01  -1.7e+00  -3.8e-01   7.6e-01     56  1.1e-03  1.1e+00
sigma_beta       2.7e-01  1.1e-03  1.0e-01   1.3e-01   2.6e-01   4.5e-01   8822  1.8e-01  1.0e+00
sigma_h          1.4e+00  2.9e-01  8.6e-01   6.3e-01   1.1e+00   3.7e+00    9.1  1.9e-04  1.7e+00
sigma[1]         1.3e+00  1.4e-01  3.5e-01   1.8e-01   1.4e+00   1.5e+00    6.0  1.2e-04  2.3e+01
sigma[2]         1.2e+00  4.5e-02  1.1e-01   8.7e-01   1.3e+00   1.3e+00    6.1  1.2e-04  8.4e+00
sigma[3]         1.6e+00  6.0e-02  1.5e-01   1.2e+00   1.7e+00   1.7e+00    6.1  1.2e-04  8.4e+00
sigma[4]         1.6e+00  2.2e-02  5.7e-02   1.4e+00   1.6e+00   1.7e+00    6.6  1.3e-04  3.3e+00
sigma[5]         1.8e+00  2.3e-01  5.7e-01   1.6e+00   1.6e+00   3.6e+00    6.0  1.2e-04  3.5e+01
sigma[6]         1.6e+00  1.4e-04  1.7e-02   1.5e+00   1.6e+00   1.6e+00  14530  3.0e-01  1.0e+00
sigma[7]         1.8e+00  1.3e-01  3.3e-01   1.6e+00   1.7e+00   2.9e+00    6.0  1.2e-04  1.9e+01
sigma[8]         1.6e+00  2.9e-01  7.2e-01   1.4e+00   1.4e+00   4.0e+00    6.0  1.2e-04  4.9e+01
sigma[9]         1.4e+00  3.5e-02  8.8e-02   1.1e+00   1.4e+00   1.5e+00    6.2  1.3e-04  5.9e+00

Samples were drawn using hmc with nuts.
For each parameter, N_Eff is a crude measure of effective sample size,
and R_hat is the potential scale reduction factor on split chains (at 
convergence, R_hat=1).

Out[14]:
Process(`/home/chriselrod/Documents/languages/cmdstan/bin/stansummary /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_1.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_2.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_3.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_4.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_5.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_7.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_8.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_9.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_10.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_12.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_13.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_14.csv`, ProcessExited(0))

These did not converge either. Yikes. The warmup and sampling times show another outlier, chain 14, which was much slower than the rest. Let’s exclude it too:
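As an aside, the `$[i for i ∈ 1:14 if i ∉ …]` trick in these cells works because Julia expands an array interpolated into a command literal into one word per element. A minimal sketch with hypothetical names (`exclude` stands in for `fast2`/`bad2`):

```julia
# Interpolating a comprehension into a command literal yields one word per
# element, so `StanITP_samples_$[...].csv` becomes one file path per index.
exclude = (6, 11, 14)
cmd = `stansummary StanITP_samples_$[i for i in 1:14 if i ∉ exclude].csv`
# cmd.exec now lists the program name plus the 11 selected CSV paths.
```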

In [15]:
bad2 = (fast2..., 14)
run(`$stansummary $(resdir)/StanITP_samples_$[i for i ∈ 1:14 if i ∉ bad2].csv`)
Inference for Stan model: StanITP_model
11 chains: each with iter=(2000,2000,2000,2000,2000,2000,2000,2000,2000,2000,2000); warmup=(0,0,0,0,0,0,0,0,0,0,0); thin=(1,1,1,1,1,1,1,1,1,1,1); 22000 iterations saved.

Warmup took (2330, 2379, 1991, 2125, 2418, 2099, 2657, 2132, 2493, 2164, 2388) seconds, 7.0 hours total
Sampling took (3621, 3576, 3647, 3660, 3546, 4918, 4254, 3662, 3592, 3637, 4254) seconds, 12 hours total

                    Mean     MCSE   StdDev        5%       50%       95%  N_Eff  N_Eff/s    R_hat
lp__            -1.3e+04  9.1e-02  7.5e+00  -1.3e+04  -1.3e+04  -1.3e+04   6818  1.6e-01  1.0e+00
accept_stat__    9.9e-01  1.5e-03  2.4e-02   9.5e-01   9.9e-01   1.0e+00    259  6.1e-03  1.0e+00
stepsize__       1.6e-02  1.1e-03  2.5e-03   1.2e-02   1.6e-02   2.0e-02    5.5  1.3e-04  1.1e+14
treedepth__      8.1e+00     -nan  2.7e-01   8.0e+00   8.0e+00   9.0e+00   -nan     -nan  1.2e+00
n_leapfrog__     2.9e+02  2.4e+01  8.8e+01   2.6e+02   2.6e+02   5.1e+02     13  3.1e-04  1.3e+00
divergent__      0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
energy__         1.3e+04  1.3e-01  1.0e+01   1.3e+04   1.3e+04   1.3e+04   6595  1.6e-01  1.0e+00
muh[1]          -2.8e+00  7.5e-03  6.3e-01  -3.8e+00  -2.9e+00  -1.9e+00   7175  1.7e-01  1.0e+00
muh[2]           8.5e+00  8.9e-03  6.7e-01   7.5e+00   8.6e+00   9.5e+00   5622  1.3e-01  1.0e+00
rho              7.0e-01  3.9e-05  3.8e-03   6.9e-01   7.0e-01   7.1e-01   9521  2.2e-01  1.0e+00
kappa[1]         2.0e-01  6.4e-05  9.6e-03   1.8e-01   2.0e-01   2.1e-01  22106  5.2e-01  1.0e+00
kappa[2]         2.5e-01  6.2e-05  1.0e-02   2.3e-01   2.5e-01   2.7e-01  27766  6.6e-01  1.0e+00
kappa[3]         2.3e-01  9.8e-05  1.5e-02   2.1e-01   2.3e-01   2.6e-01  24328  5.7e-01  1.0e+00
kappa[4]         2.9e-01  1.1e-04  1.6e-02   2.6e-01   2.9e-01   3.2e-01  23395  5.5e-01  1.0e+00
kappa[5]         1.9e-01  9.6e-05  1.3e-02   1.7e-01   1.9e-01   2.1e-01  17447  4.1e-01  1.0e+00
kappa[6]         2.0e-01  7.0e-05  1.0e-02   1.8e-01   2.0e-01   2.2e-01  21653  5.1e-01  1.0e+00
kappa[7]         3.0e-01  8.2e-05  1.3e-02   2.8e-01   3.0e-01   3.3e-01  26173  6.2e-01  1.0e+00
kappa[8]         2.5e-01  8.5e-05  1.2e-02   2.3e-01   2.5e-01   2.7e-01  19110  4.5e-01  1.0e+00
kappa[9]         1.8e-01  7.2e-05  9.9e-03   1.6e-01   1.8e-01   2.0e-01  18847  4.4e-01  1.0e+00
theta[1]        -1.3e+00  5.2e-04  8.5e-02  -1.4e+00  -1.3e+00  -1.1e+00  26837  6.3e-01  1.0e+00
theta[2]        -2.5e+00  4.9e-04  7.7e-02  -2.6e+00  -2.5e+00  -2.4e+00  24931  5.9e-01  1.0e+00
theta[3]        -5.0e-01  6.7e-04  1.0e-01  -6.7e-01  -5.0e-01  -3.3e-01  22953  5.4e-01  1.0e+00
theta[4]         4.2e-01  7.1e-04  1.0e-01   2.5e-01   4.2e-01   5.9e-01  20618  4.9e-01  1.0e+00
theta[5]         5.7e-01  6.0e-04  9.3e-02   4.2e-01   5.7e-01   7.2e-01  24357  5.7e-01  1.0e+00
theta[6]        -7.0e-03  6.0e-04  9.1e-02  -1.6e-01  -6.4e-03   1.4e-01  22953  5.4e-01  1.0e+00
theta[7]         1.3e-01  6.5e-04  9.9e-02  -3.1e-02   1.3e-01   3.0e-01  23489  5.5e-01  1.0e+00
theta[8]        -9.7e-02  5.3e-04  8.4e-02  -2.4e-01  -9.6e-02   4.1e-02  25242  6.0e-01  1.0e+00
theta[9]        -1.1e+00  5.9e-04  8.7e-02  -1.3e+00  -1.1e+00  -9.7e-01  21875  5.2e-01  1.0e+00
L[1,1]           1.0e+00     -nan  3.9e-14   1.0e+00   1.0e+00   1.0e+00   -nan     -nan  1.0e+00
L[1,2]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[1,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,1]           9.1e-02  7.4e-05  1.3e-02   7.0e-02   9.2e-02   1.1e-01  31356  7.4e-01  1.0e+00
L[2,2]           1.0e+00  6.9e-06  1.2e-03   9.9e-01   1.0e+00   1.0e+00  30884  7.3e-01  1.0e+00
L[2,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[2,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,1]          -5.9e-02  7.9e-05  1.3e-02  -8.0e-02  -5.9e-02  -3.7e-02  27822  6.6e-01  1.0e+00
L[3,2]           2.5e-01  7.1e-05  1.2e-02   2.3e-01   2.5e-01   2.7e-01  29686  7.0e-01  1.0e+00
L[3,3]           9.7e-01  1.9e-05  3.2e-03   9.6e-01   9.7e-01   9.7e-01  29079  6.9e-01  1.0e+00
L[3,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[3,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,1]           1.9e-01  7.8e-05  1.3e-02   1.7e-01   1.9e-01   2.1e-01  26196  6.2e-01  1.0e+00
L[4,2]           1.9e-01  7.6e-05  1.2e-02   1.7e-01   1.9e-01   2.1e-01  26557  6.3e-01  1.0e+00
L[4,3]           2.8e-01  6.9e-05  1.2e-02   2.6e-01   2.8e-01   3.0e-01  28568  6.7e-01  1.0e+00
L[4,4]           9.2e-01  2.9e-05  4.7e-03   9.1e-01   9.2e-01   9.3e-01  26860  6.3e-01  1.0e+00
L[4,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[4,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,1]           7.0e-02  7.3e-05  1.3e-02   4.8e-02   7.0e-02   9.2e-02  32922  7.8e-01  1.0e+00
L[5,2]          -1.6e-01  7.2e-05  1.3e-02  -1.8e-01  -1.6e-01  -1.4e-01  31885  7.5e-01  1.0e+00
L[5,3]          -1.2e-01  7.1e-05  1.3e-02  -1.4e-01  -1.2e-01  -9.8e-02  32613  7.7e-01  1.0e+00
L[5,4]          -2.0e-01  6.2e-05  1.2e-02  -2.2e-01  -2.0e-01  -1.8e-01  38642  9.1e-01  1.0e+00
L[5,5]           9.6e-01  2.0e-05  3.7e-03   9.5e-01   9.6e-01   9.6e-01  34502  8.1e-01  1.0e+00
L[5,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[5,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,1]          -3.4e-01  7.1e-05  1.2e-02  -3.6e-01  -3.4e-01  -3.2e-01  27336  6.5e-01  1.0e+00
L[6,2]           1.1e-01  6.9e-05  1.2e-02   8.7e-02   1.1e-01   1.3e-01  30901  7.3e-01  1.0e+00
L[6,3]           1.1e-01  7.0e-05  1.2e-02   8.7e-02   1.1e-01   1.3e-01  30144  7.1e-01  1.0e+00
L[6,4]          -3.8e-01  5.9e-05  1.1e-02  -3.9e-01  -3.8e-01  -3.6e-01  31407  7.4e-01  1.0e+00
L[6,5]          -5.1e-02  5.9e-05  1.1e-02  -7.0e-02  -5.1e-02  -3.3e-02  35668  8.4e-01  1.0e+00
L[6,6]           8.5e-01  3.4e-05  6.0e-03   8.4e-01   8.5e-01   8.6e-01  30882  7.3e-01  1.0e+00
L[6,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[6,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,1]           6.1e-02  7.5e-05  1.3e-02   3.9e-02   6.1e-02   8.3e-02  31177  7.4e-01  1.0e+00
L[7,2]           1.5e-01  7.1e-05  1.3e-02   1.3e-01   1.5e-01   1.7e-01  33068  7.8e-01  1.0e+00
L[7,3]          -1.2e-01  7.5e-05  1.3e-02  -1.4e-01  -1.2e-01  -9.7e-02  29616  7.0e-01  1.0e+00
L[7,4]          -1.9e-01  6.6e-05  1.2e-02  -2.1e-01  -1.9e-01  -1.7e-01  35779  8.4e-01  1.0e+00
L[7,5]           1.4e-01  7.0e-05  1.2e-02   1.2e-01   1.4e-01   1.6e-01  30876  7.3e-01  1.0e+00
L[7,6]          -2.0e-01  6.7e-05  1.2e-02  -2.2e-01  -2.0e-01  -1.8e-01  32578  7.7e-01  1.0e+00
L[7,7]           9.3e-01  2.5e-05  4.6e-03   9.2e-01   9.3e-01   9.4e-01  32771  7.7e-01  1.0e+00
L[7,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[7,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[8,1]          -7.3e-02  7.4e-05  1.3e-02  -9.5e-02  -7.3e-02  -5.1e-02  31736  7.5e-01  1.0e+00
L[8,2]           1.1e-01  7.3e-05  1.3e-02   9.1e-02   1.1e-01   1.3e-01  31316  7.4e-01  1.0e+00
L[8,3]          -1.8e-01  6.9e-05  1.3e-02  -2.0e-01  -1.8e-01  -1.6e-01  33863  8.0e-01  1.0e+00
L[8,4]           2.0e-01  6.7e-05  1.2e-02   1.8e-01   2.0e-01   2.2e-01  33273  7.9e-01  1.0e+00
L[8,5]           5.5e-03  6.7e-05  1.3e-02  -1.5e-02   5.5e-03   2.6e-02  36512  8.6e-01  1.0e+00
L[8,6]           2.5e-02  6.6e-05  1.3e-02   4.1e-03   2.5e-02   4.6e-02  36839  8.7e-01  1.0e+00
L[8,7]           5.1e-02  6.8e-05  1.2e-02   3.0e-02   5.1e-02   7.1e-02  33243  7.8e-01  1.0e+00
L[8,8]           9.5e-01  2.1e-05  3.9e-03   9.4e-01   9.5e-01   9.6e-01  34050  8.0e-01  1.0e+00
L[8,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan     -nan
L[9,1]          -1.6e-01  7.8e-05  1.3e-02  -1.8e-01  -1.6e-01  -1.4e-01  27308  6.4e-01  1.0e+00
L[9,2]           5.2e-02  7.6e-05  1.3e-02   3.1e-02   5.2e-02   7.4e-02  29921  7.1e-01  1.0e+00
L[9,3]           7.4e-02  7.5e-05  1.3e-02   5.2e-02   7.4e-02   9.5e-02  29802  7.0e-01  1.0e+00
L[9,4]          -4.3e-02  7.4e-05  1.3e-02  -6.4e-02  -4.3e-02  -2.1e-02  30089  7.1e-01  1.0e+00
L[9,5]          -9.1e-02  7.3e-05  1.3e-02  -1.1e-01  -9.1e-02  -7.0e-02  31833  7.5e-01  1.0e+00
L[9,6]           2.0e-01  7.2e-05  1.2e-02   1.8e-01   2.0e-01   2.2e-01  29189  6.9e-01  1.0e+00
L[9,7]           3.4e-01  6.5e-05  1.1e-02   3.2e-01   3.4e-01   3.6e-01  29575  7.0e-01  1.0e+00
L[9,8]           3.3e-01  5.8e-05  1.0e-02   3.1e-01   3.3e-01   3.4e-01  33223  7.8e-01  1.0e+00
L[9,9]           8.3e-01  3.4e-05  6.0e-03   8.2e-01   8.3e-01   8.4e-01  32020  7.6e-01  1.0e+00
muraw[1,1]      -9.7e-01  6.5e-03  6.0e-01  -2.0e+00  -9.6e-01   6.1e-03   8735  2.1e-01  1.0e+00
muraw[1,2]       1.2e+00  7.0e-03  6.6e-01   1.8e-01   1.2e+00   2.3e+00   8837  2.1e-01  1.0e+00
muraw[1,3]       4.3e-01  5.7e-03  5.5e-01  -4.7e-01   4.2e-01   1.4e+00   9355  2.2e-01  1.0e+00
muraw[1,4]      -7.2e-01  6.1e-03  5.7e-01  -1.7e+00  -7.1e-01   2.0e-01   8806  2.1e-01  1.0e+00
muraw[2,1]       1.7e-01  5.8e-03  5.4e-01  -7.2e-01   1.7e-01   1.1e+00   8808  2.1e-01  1.0e+00
muraw[2,2]      -8.7e-01  6.6e-03  6.1e-01  -1.9e+00  -8.6e-01   1.2e-01   8561  2.0e-01  1.0e+00
muraw[2,3]       4.9e-01  6.0e-03  5.7e-01  -4.3e-01   4.9e-01   1.4e+00   8929  2.1e-01  1.0e+00
muraw[2,4]       2.7e-01  5.9e-03  5.4e-01  -6.2e-01   2.7e-01   1.2e+00   8591  2.0e-01  1.0e+00
betaraw[1,1]    -3.6e-01  5.2e-03  7.9e-01  -1.6e+00  -3.6e-01   9.4e-01  23342  5.5e-01  1.0e+00
betaraw[1,2]     6.1e-02  5.5e-03  7.9e-01  -1.2e+00   5.4e-02   1.4e+00  21107  5.0e-01  1.0e+00
betaraw[1,3]     4.9e-01  5.3e-03  7.9e-01  -8.2e-01   4.9e-01   1.8e+00  22389  5.3e-01  1.0e+00
betaraw[1,4]    -1.3e-01  5.7e-03  8.0e-01  -1.4e+00  -1.2e-01   1.2e+00  20034  4.7e-01  1.0e+00
betaraw[1,5]     3.4e-01  5.3e-03  8.2e-01  -1.0e+00   3.4e-01   1.7e+00  24130  5.7e-01  1.0e+00
betaraw[1,6]    -2.1e-01  5.1e-03  8.2e-01  -1.6e+00  -2.2e-01   1.1e+00  25528  6.0e-01  1.0e+00
betaraw[1,7]    -3.5e-01  5.1e-03  7.3e-01  -1.6e+00  -3.5e-01   8.4e-01  19927  4.7e-01  1.0e+00
betaraw[1,8]     6.5e-01  5.5e-03  7.4e-01  -5.4e-01   6.3e-01   1.9e+00  18093  4.3e-01  1.0e+00
betaraw[1,9]    -5.0e-01  5.1e-03  7.2e-01  -1.7e+00  -5.0e-01   6.7e-01  19922  4.7e-01  1.0e+00
betaraw[2,1]     7.0e-01  5.3e-03  8.3e-01  -6.6e-01   6.9e-01   2.1e+00  25148  5.9e-01  1.0e+00
betaraw[2,2]    -6.4e-01  5.2e-03  8.2e-01  -2.0e+00  -6.3e-01   7.0e-01  25386  6.0e-01  1.0e+00
betaraw[2,3]    -1.1e-01  5.3e-03  8.1e-01  -1.4e+00  -1.1e-01   1.2e+00  23861  5.6e-01  1.0e+00
betaraw[2,4]    -1.5e-01  5.3e-03  8.2e-01  -1.5e+00  -1.6e-01   1.2e+00  23404  5.5e-01  1.0e+00
betaraw[2,5]    -8.5e-01  5.4e-03  8.8e-01  -2.3e+00  -8.4e-01   5.8e-01  26222  6.2e-01  1.0e+00
betaraw[2,6]     9.9e-01  5.6e-03  8.9e-01  -4.5e-01   9.9e-01   2.5e+00  25076  5.9e-01  1.0e+00
betaraw[2,7]     1.0e+00  5.4e-03  7.7e-01  -2.3e-01   1.0e+00   2.3e+00  20472  4.8e-01  1.0e+00
betaraw[2,8]    -4.8e-01  5.2e-03  7.6e-01  -1.7e+00  -4.7e-01   7.5e-01  21320  5.0e-01  1.0e+00
betaraw[2,9]    -4.8e-01  5.4e-03  7.8e-01  -1.8e+00  -4.7e-01   7.9e-01  21001  5.0e-01  1.0e+00
sigma_beta       2.7e-01  1.2e-03  1.1e-01   1.2e-01   2.5e-01   4.6e-01   8367  2.0e-01  1.0e+00
sigma_h          1.2e+00  8.0e-03  5.3e-01   6.3e-01   1.0e+00   2.1e+00   4397  1.0e-01  1.0e+00
sigma[1]         1.4e+00  1.2e-04  1.6e-02   1.4e+00   1.4e+00   1.5e+00  16781  4.0e-01  1.0e+00
sigma[2]         1.3e+00  1.1e-04  1.4e-02   1.2e+00   1.3e+00   1.3e+00  16246  3.8e-01  1.0e+00
sigma[3]         1.7e+00  1.5e-04  1.9e-02   1.7e+00   1.7e+00   1.7e+00  16310  3.8e-01  1.0e+00
sigma[4]         1.6e+00  1.5e-04  1.8e-02   1.6e+00   1.6e+00   1.7e+00  14816  3.5e-01  1.0e+00
sigma[5]         1.6e+00  1.3e-04  1.7e-02   1.6e+00   1.6e+00   1.6e+00  17538  4.1e-01  1.0e+00
sigma[6]         1.6e+00  1.4e-04  1.7e-02   1.5e+00   1.6e+00   1.6e+00  15186  3.6e-01  1.0e+00
sigma[7]         1.7e+00  1.5e-04  1.8e-02   1.6e+00   1.7e+00   1.7e+00  16191  3.8e-01  1.0e+00
sigma[8]         1.4e+00  1.2e-04  1.6e-02   1.4e+00   1.4e+00   1.4e+00  16781  4.0e-01  1.0e+00
sigma[9]         1.4e+00  1.3e-04  1.6e-02   1.4e+00   1.4e+00   1.5e+00  15669  3.7e-01  1.0e+00

Samples were drawn using hmc with nuts.
For each parameter, N_Eff is a crude measure of effective sample size,
and R_hat is the potential scale reduction factor on split chains (at 
convergence, R_hat=1).

Out[15]:
Process(`/home/chriselrod/Documents/languages/cmdstan/bin/stansummary /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_1.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_2.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_3.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_4.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_5.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_7.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_8.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_9.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_10.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_12.csv /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_13.csv`, ProcessExited(0))

There we go. These remaining chains did converge, and they give us reasonable numbers of effective samples. Before moving on to Julia, let’s take a look at the slow chain that failed to converge.

In [16]:
run(`$stansummary $(resdir)/StanITP_samples_14.csv`)
Inference for Stan model: StanITP_model
1 chains: each with iter=(2000); warmup=(0); thin=(1); 2000 iterations saved.

Warmup took (6087) seconds, 1.7 hours total
Sampling took (6592) seconds, 1.8 hours total

                    Mean     MCSE   StdDev        5%       50%       95%  N_Eff  N_Eff/s  R_hat
lp__            -2.1e+12     -nan  0.0e+00  -2.1e+12  -2.1e+12  -2.1e+12   -nan     -nan   -nan
accept_stat__    9.6e-01  1.0e-03  4.3e-02   8.8e-01   9.8e-01   1.0e+00   1757  2.7e-01    1.0
stepsize__       6.5e-15  5.4e-29  3.8e-29   6.5e-15   6.5e-15   6.5e-15   0.50  7.6e-05   1.00
treedepth__      1.0e+01     -nan  3.7e-14   1.0e+01   1.0e+01   1.0e+01   -nan     -nan   1.00
n_leapfrog__     1.0e+03     -nan  2.5e-12   1.0e+03   1.0e+03   1.0e+03   -nan     -nan   1.00
divergent__      0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
energy__         2.1e+12     -nan  0.0e+00   2.1e+12   2.1e+12   2.1e+12   -nan     -nan   -nan
muh[1]           1.7e+00  5.7e-15  4.0e-15   1.7e+00   1.7e+00   1.7e+00   0.50  7.6e-05   1.00
muh[2]           7.8e-01  3.3e-15  2.3e-15   7.8e-01   7.8e-01   7.8e-01   0.50  7.6e-05   1.00
rho             -1.2e-02  7.4e-17  5.2e-17  -1.2e-02  -1.2e-02  -1.2e-02   0.50  7.6e-05   1.00
kappa[1]         5.4e+00  1.9e-14  1.3e-14   5.4e+00   5.4e+00   5.4e+00   0.50  7.6e-05   1.00
kappa[2]         2.8e+00  6.3e-16  4.4e-16   2.8e+00   2.8e+00   2.8e+00   0.50  7.6e-05   1.00
kappa[3]         3.1e-01  5.5e-16  3.9e-16   3.1e-01   3.1e-01   3.1e-01   0.50  7.6e-05   -nan
kappa[4]         5.8e+00  4.6e-14  3.3e-14   5.8e+00   5.8e+00   5.8e+00   0.50  7.6e-05   1.00
kappa[5]         5.1e-01  1.3e-15  8.9e-16   5.1e-01   5.1e-01   5.1e-01   0.50  7.6e-05   1.00
kappa[6]         3.0e-01     -nan  2.8e-16   3.0e-01   3.0e-01   3.0e-01   -nan     -nan   1.00
kappa[7]         4.5e-01  1.4e-15  1.0e-15   4.5e-01   4.5e-01   4.5e-01   0.50  7.6e-05   1.00
kappa[8]         9.0e-01  9.4e-16  6.7e-16   9.0e-01   9.0e-01   9.0e-01   0.50  7.6e-05   1.00
kappa[9]         2.1e+00  1.5e-14  1.1e-14   2.1e+00   2.1e+00   2.1e+00   0.50  7.6e-05   1.00
theta[1]        -1.6e+00  6.9e-15  4.9e-15  -1.6e+00  -1.6e+00  -1.6e+00   0.50  7.6e-05   1.00
theta[2]         1.2e+00  6.9e-15  4.9e-15   1.2e+00   1.2e+00   1.2e+00   0.50  7.6e-05   -nan
theta[3]        -3.5e-01  2.7e-15  1.9e-15  -3.5e-01  -3.5e-01  -3.5e-01   0.50  7.6e-05   1.00
theta[4]         6.6e-01  3.1e-16  2.2e-16   6.6e-01   6.6e-01   6.6e-01   0.50  7.6e-05   1.00
theta[5]         7.2e-01  4.1e-15  2.9e-15   7.2e-01   7.2e-01   7.2e-01   0.50  7.6e-05   1.00
theta[6]        -1.5e-01  9.4e-16  6.7e-16  -1.5e-01  -1.5e-01  -1.5e-01   0.50  7.6e-05   -nan
theta[7]        -1.9e+00  3.1e-16  2.2e-16  -1.9e+00  -1.9e+00  -1.9e+00   0.50  7.6e-05   1.00
theta[8]        -5.5e-02  3.1e-16  2.2e-16  -5.5e-02  -5.5e-02  -5.5e-02   0.50  7.6e-05   1.00
theta[9]        -1.2e+00  2.8e-15  2.0e-15  -1.2e+00  -1.2e+00  -1.2e+00   0.50  7.6e-05   1.00
L[1,1]           1.0e+00     -nan  6.7e-16   1.0e+00   1.0e+00   1.0e+00   -nan     -nan   1.00
L[1,2]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[1,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[1,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[1,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[1,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[1,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[1,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[1,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[2,1]           8.1e-01  2.7e-15  1.9e-15   8.1e-01   8.1e-01   8.1e-01   0.50  7.6e-05   1.00
L[2,2]           5.9e-01  2.0e-15  1.4e-15   5.9e-01   5.9e-01   5.9e-01   0.50  7.6e-05   1.00
L[2,3]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[2,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[2,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[2,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[2,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[2,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[2,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[3,1]          -8.3e-01  2.4e-15  1.7e-15  -8.3e-01  -8.3e-01  -8.3e-01   0.50  7.6e-05   1.00
L[3,2]          -5.1e-01  4.7e-16  3.3e-16  -5.1e-01  -5.1e-01  -5.1e-01   0.50  7.6e-05   1.00
L[3,3]           2.2e-01  1.4e-15  1.0e-15   2.2e-01   2.2e-01   2.2e-01   0.50  7.6e-05   1.00
L[3,4]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[3,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[3,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[3,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[3,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[3,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[4,1]           8.5e-01  6.0e-15  4.2e-15   8.5e-01   8.5e-01   8.5e-01   0.50  7.6e-05   1.00
L[4,2]           4.8e-01  3.2e-15  2.3e-15   4.8e-01   4.8e-01   4.8e-01   0.50  7.6e-05   1.00
L[4,3]           1.7e-01  4.3e-16  3.1e-16   1.7e-01   1.7e-01   1.7e-01   0.50  7.6e-05   1.00
L[4,4]           1.5e-01  5.1e-16  3.6e-16   1.5e-01   1.5e-01   1.5e-01   0.50  7.6e-05   1.00
L[4,5]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[4,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[4,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[4,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[4,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[5,1]           9.6e-01  3.5e-15  2.4e-15   9.6e-01   9.6e-01   9.6e-01   0.50  7.6e-05   1.00
L[5,2]           1.9e-01  7.5e-16  5.3e-16   1.9e-01   1.9e-01   1.9e-01   0.50  7.6e-05   1.00
L[5,3]           1.4e-02  8.1e-17  5.7e-17   1.4e-02   1.4e-02   1.4e-02   0.50  7.6e-05   1.00
L[5,4]           2.0e-01  1.6e-15  1.2e-15   2.0e-01   2.0e-01   2.0e-01   0.50  7.6e-05   1.00
L[5,5]           8.9e-02  1.2e-16  8.3e-17   8.9e-02   8.9e-02   8.9e-02   0.50  7.6e-05   1.00
L[5,6]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[5,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[5,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[5,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[6,1]          -8.4e-01  3.0e-15  2.1e-15  -8.4e-01  -8.4e-01  -8.4e-01   0.50  7.6e-05   1.00
L[6,2]          -4.6e-01  0.0e+00  0.0e+00  -4.6e-01  -4.6e-01  -4.6e-01   0.50  7.6e-05   -nan
L[6,3]          -2.0e-01  1.5e-15  1.1e-15  -2.0e-01  -2.0e-01  -2.0e-01   0.50  7.6e-05   1.00
L[6,4]          -7.2e-02  2.4e-16  1.7e-16  -7.2e-02  -7.2e-02  -7.2e-02   0.50  7.6e-05   1.00
L[6,5]           5.1e-02  1.5e-16  1.0e-16   5.1e-02   5.1e-02   5.1e-02   0.50  7.6e-05   1.00
L[6,6]           2.2e-01  1.4e-15  1.0e-15   2.2e-01   2.2e-01   2.2e-01   0.50  7.6e-05   1.00
L[6,7]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[6,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[6,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[7,1]          -8.3e-01  3.1e-15  2.2e-15  -8.3e-01  -8.3e-01  -8.3e-01   0.50  7.6e-05   1.00
L[7,2]          -2.2e-01  1.7e-15  1.2e-15  -2.2e-01  -2.2e-01  -2.2e-01   0.50  7.6e-05   1.00
L[7,3]           3.9e-01  1.4e-15  1.0e-15   3.9e-01   3.9e-01   3.9e-01   0.50  7.6e-05   1.00
L[7,4]          -3.2e-01  2.7e-15  1.9e-15  -3.2e-01  -3.2e-01  -3.2e-01   0.50  7.6e-05   1.00
L[7,5]          -5.2e-02  3.5e-16  2.5e-16  -5.2e-02  -5.2e-02  -5.2e-02   0.50  7.6e-05   1.00
L[7,6]           5.8e-02  1.2e-16  8.3e-17   5.8e-02   5.8e-02   5.8e-02   0.50  7.6e-05   1.00
L[7,7]           3.9e-02  1.2e-16  8.3e-17   3.9e-02   3.9e-02   3.9e-02   0.50  7.6e-05   1.00
L[7,8]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[7,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[8,1]          -9.4e-01  4.7e-15  3.3e-15  -9.4e-01  -9.4e-01  -9.4e-01   0.50  7.6e-05   1.00
L[8,2]          -2.9e-01  2.2e-15  1.6e-15  -2.9e-01  -2.9e-01  -2.9e-01   0.50  7.6e-05   1.00
L[8,3]           9.9e-02  3.9e-17  2.8e-17   9.9e-02   9.9e-02   9.9e-02   0.50  7.6e-05   1.00
L[8,4]           3.1e-02  1.6e-16  1.1e-16   3.1e-02   3.1e-02   3.1e-02   0.50  7.6e-05   1.00
L[8,5]          -1.1e-01  9.6e-16  6.8e-16  -1.1e-01  -1.1e-01  -1.1e-01   0.50  7.6e-05   1.00
L[8,6]          -5.7e-02  2.8e-16  2.0e-16  -5.7e-02  -5.7e-02  -5.7e-02   0.50  7.6e-05   1.00
L[8,7]          -1.7e-02  4.4e-17  3.1e-17  -1.7e-02  -1.7e-02  -1.7e-02   0.50  7.6e-05   1.00
L[8,8]           1.1e-02  3.2e-17  2.3e-17   1.1e-02   1.1e-02   1.1e-02   0.50  7.6e-05   1.00
L[8,9]           0.0e+00     -nan  0.0e+00   0.0e+00   0.0e+00   0.0e+00   -nan     -nan   -nan
L[9,1]           9.3e-01  3.3e-15  2.3e-15   9.3e-01   9.3e-01   9.3e-01   0.50  7.6e-05   1.00
L[9,2]          -3.4e-01  2.7e-15  1.9e-15  -3.4e-01  -3.4e-01  -3.4e-01   0.50  7.6e-05   1.00
L[9,3]          -1.1e-01  6.7e-16  4.7e-16  -1.1e-01  -1.1e-01  -1.1e-01   0.50  7.6e-05   1.00
L[9,4]          -8.8e-03  7.4e-17  5.2e-17  -8.8e-03  -8.8e-03  -8.8e-03   0.50  7.6e-05   1.00
L[9,5]          -3.2e-02  2.0e-16  1.4e-16  -3.2e-02  -3.2e-02  -3.2e-02   0.50  7.6e-05   1.00
L[9,6]          -2.2e-02  1.3e-16  9.4e-17  -2.2e-02  -2.2e-02  -2.2e-02   0.50  7.6e-05   1.00
L[9,7]          -7.3e-03  4.9e-18  3.5e-18  -7.3e-03  -7.3e-03  -7.3e-03   0.50  7.6e-05   1.00
L[9,8]           5.6e-03  2.5e-18  1.7e-18   5.6e-03   5.6e-03   5.6e-03   0.50  7.6e-05   1.00
L[9,9]           1.9e-03  1.4e-17  1.0e-17   1.9e-03   1.9e-03   1.9e-03   0.50  7.6e-05   1.00
muraw[1,1]       9.3e-01  4.7e-15  3.3e-15   9.3e-01   9.3e-01   9.3e-01   0.50  7.6e-05   1.00
muraw[1,2]       4.7e-01  1.9e-15  1.3e-15   4.7e-01   4.7e-01   4.7e-01   0.50  7.6e-05   1.00
muraw[1,3]       1.1e+00  3.1e-15  2.2e-15   1.1e+00   1.1e+00   1.1e+00   0.50  7.6e-05   1.00
muraw[1,4]      -1.1e+00  9.7e-15  6.9e-15  -1.1e+00  -1.1e+00  -1.1e+00   0.50  7.6e-05   1.00
muraw[2,1]      -1.4e-01  5.9e-16  4.2e-16  -1.4e-01  -1.4e-01  -1.4e-01   0.50  7.6e-05   1.00
muraw[2,2]      -1.2e+00  4.7e-15  3.3e-15  -1.2e+00  -1.2e+00  -1.2e+00   0.50  7.6e-05   1.00
muraw[2,3]      -5.7e-01  3.6e-15  2.6e-15  -5.7e-01  -5.7e-01  -5.7e-01   0.50  7.6e-05   1.00
muraw[2,4]      -2.0e+00  1.8e-14  1.3e-14  -2.0e+00  -2.0e+00  -2.0e+00   0.50  7.6e-05   1.00
betaraw[1,1]    -1.6e+00  8.8e-15  6.2e-15  -1.6e+00  -1.6e+00  -1.6e+00   0.50  7.6e-05   1.00
betaraw[1,2]     1.9e+00  1.2e-14  8.4e-15   1.9e+00   1.9e+00   1.9e+00   0.50  7.6e-05   1.00
betaraw[1,3]    -1.3e+00  7.5e-15  5.3e-15  -1.3e+00  -1.3e+00  -1.3e+00   0.50  7.6e-05   1.00
betaraw[1,4]    -6.6e-01  4.2e-15  3.0e-15  -6.6e-01  -6.6e-01  -6.6e-01   0.50  7.6e-05   1.00
betaraw[1,5]    -1.3e+00  9.1e-15  6.4e-15  -1.3e+00  -1.3e+00  -1.3e+00   0.50  7.6e-05   1.00
betaraw[1,6]     9.6e-01  4.1e-15  2.9e-15   9.6e-01   9.6e-01   9.6e-01   0.50  7.6e-05   1.00
betaraw[1,7]     8.1e-01  6.6e-15  4.7e-15   8.1e-01   8.1e-01   8.1e-01   0.50  7.6e-05   1.00
betaraw[1,8]    -7.8e-01  2.2e-15  1.6e-15  -7.8e-01  -7.8e-01  -7.8e-01   0.50  7.6e-05   1.00
betaraw[1,9]     6.5e-01  3.1e-15  2.2e-15   6.5e-01   6.5e-01   6.5e-01   0.50  7.6e-05   1.00
betaraw[2,1]     5.0e-01  1.3e-15  8.9e-16   5.0e-01   5.0e-01   5.0e-01   0.50  7.6e-05   1.00
betaraw[2,2]     2.0e+00  1.0e-14  7.1e-15   2.0e+00   2.0e+00   2.0e+00   0.50  7.6e-05   1.00
betaraw[2,3]    -1.8e-01  4.3e-16  3.1e-16  -1.8e-01  -1.8e-01  -1.8e-01   0.50  7.6e-05   1.00
betaraw[2,4]     1.9e+00  3.8e-15  2.7e-15   1.9e+00   1.9e+00   1.9e+00   0.50  7.6e-05   1.00
betaraw[2,5]    -3.9e-01  9.4e-16  6.7e-16  -3.9e-01  -3.9e-01  -3.9e-01   0.50  7.6e-05   1.00
betaraw[2,6]     1.7e+00  3.1e-15  2.2e-15   1.7e+00   1.7e+00   1.7e+00   0.50  7.6e-05   1.00
betaraw[2,7]     1.3e-01  5.9e-16  4.2e-16   1.3e-01   1.3e-01   1.3e-01   0.50  7.6e-05   1.00
betaraw[2,8]     1.3e+00  7.8e-15  5.6e-15   1.3e+00   1.3e+00   1.3e+00   0.50  7.6e-05   1.00
betaraw[2,9]     5.0e-01  3.3e-15  2.3e-15   5.0e-01   5.0e-01   5.0e-01   0.50  7.6e-05   1.00
sigma_beta       2.9e-01  1.6e-16  1.1e-16   2.9e-01   2.9e-01   2.9e-01   0.50  7.6e-05   1.00
sigma_h          3.7e+00  1.4e-14  9.8e-15   3.7e+00   3.7e+00   3.7e+00   0.50  7.6e-05   1.00
sigma[1]         1.8e-01  6.3e-16  4.4e-16   1.8e-01   1.8e-01   1.8e-01   0.50  7.6e-05   1.00
sigma[2]         8.7e-01  5.7e-15  4.0e-15   8.7e-01   8.7e-01   8.7e-01   0.50  7.6e-05   1.00
sigma[3]         1.2e+00  8.8e-15  6.2e-15   1.2e+00   1.2e+00   1.2e+00   0.50  7.6e-05   1.00
sigma[4]         1.4e+00  1.1e-14  8.0e-15   1.4e+00   1.4e+00   1.4e+00   0.50  7.6e-05   1.00
sigma[5]         3.6e+00  3.3e-14  2.3e-14   3.6e+00   3.6e+00   3.6e+00   0.50  7.6e-05   1.00
sigma[6]         1.6e+00  9.1e-15  6.4e-15   1.6e+00   1.6e+00   1.6e+00   0.50  7.6e-05   1.00
sigma[7]         2.9e+00  7.5e-15  5.3e-15   2.9e+00   2.9e+00   2.9e+00   0.50  7.6e-05   1.00
sigma[8]         4.0e+00  7.5e-15  5.3e-15   4.0e+00   4.0e+00   4.0e+00   0.50  7.6e-05   1.00
sigma[9]         1.1e+00  6.9e-15  4.9e-15   1.1e+00   1.1e+00   1.1e+00   0.50  7.6e-05   1.00

Samples were drawn using hmc with nuts.
For each parameter, N_Eff is a crude measure of effective sample size,
and R_hat is the potential scale reduction factor on split chains (at 
convergence, R_hat=1).

Out[16]:
Process(`/home/chriselrod/Documents/languages/cmdstan/bin/stansummary /home/chriselrod/Documents/progwork/julia/tmp/StanITP_samples_14.csv`, ProcessExited(0))

We see that its treedepth was pinned at 10, corresponding to a steady 1023 leapfrog steps per iteration.

Also, note the step size: $6.5\times 10^{-15}$, versus the $9.8\times 10^{-3}$ of the chains that did converge. That is a difference of more than eleven orders of magnitude.
Despite maxing out the treedepth (and thus the number of leapfrog steps) and taking such tiny steps, the acceptance rate (0.96) was still worse than in the chains that did converge (0.99).
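As a quick check on the arithmetic (generic NUTS behavior, not specific to this model): NUTS doubles its trajectory at each step, so a tree saturated at the maximum depth performs $2^d - 1$ leapfrog steps.

```julia
# A saturated binary tree of depth d contains 2^d - 1 leapfrog steps,
# which is why a max treedepth of 10 shows up as n_leapfrog = 1023.
max_depth = 10
leapfrog_steps = 2^max_depth - 1   # = 1023
```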

Given that the chains which fail to converge all share this feature, a step size that is far too small, it might help to be able to enforce a minimum acceptable step size. I could look into modifying the Julia code to allow that.
I am not sure that would be enough, though. The energy matrix probably gets badly misspecified in the same process that leads to the tiny step sizes, so enforcing a reasonable step size would only help if the adaptation can then recover.
The step size decreases during adaptation when the acceptance rate is lower than the targeted value. So the question is: why was the acceptance rate so low? Why did it keep rejecting?
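As a rough sketch of what a step-size floor could look like, here is a toy multiplicative update in the spirit of dual averaging. The update rule, constants, and function name are all illustrative, not DynamicHMC’s actual adaptation code:

```julia
# Hypothetical sketch of step-size adaptation with a floor; the update rule
# and constants here are illustrative, not DynamicHMC's implementation.
function adapt_stepsize(ϵ, accept_rate; target = 0.8, η = 0.05, ϵ_min = 1e-8)
    ϵ′ = ϵ * exp(η * (accept_rate - target))  # shrink on rejection, grow on acceptance
    max(ϵ′, ϵ_min)                            # never collapse below the floor
end

# A pathological chain that rejects everything gets pinned at ϵ_min
# instead of collapsing toward ~1e-15:
ϵ = foldl((ϵ, _) -> adapt_stepsize(ϵ, 0.0), 1:10_000; init = 0.1)
```

Whether the sampler can then recover once pinned at the floor is exactly the open question above.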

I have tried decreasing adapt_delta (the targeted acceptance rate) in Stan, but this seemed to result in more chains failing to converge. Unfortunately, it takes a long time to run a large enough sample of chains to draw accurate conclusions through the noise.

Choosing low values of adapt_delta does, however, appear to work well with Julia’s DynamicHMC.jl. DynamicHMC’s default adaptation works differently than Stan’s: Stan uses a diagonal energy matrix, while DynamicHMC’s is dense, and Stan adapts for 200 iterations, while DynamicHMC adapts for 900. A dense matrix has $\frac{P(P-1)}{2}$ more free parameters than a diagonal one, where $P$ is the total number of parameters in the original model. For reference, $P=94$ for this model when $D=4, K=9$, as in the example here.
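To put the size of that adaptation problem in numbers, here is the straightforward count, assuming $P = 94$ as above:

```julia
# Free parameters in a dense (symmetric) energy matrix vs a diagonal one.
P = 94
dense_params    = P * (P + 1) ÷ 2        # upper triangle including the diagonal
diagonal_params = P
extra = dense_params - diagonal_params   # = P*(P-1)/2 = 4371 extra parameters
```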

Compiling then running 14 chains in Julia:

In [17]:
@time chains1, tuned_samplers1 = NUTS_init_tune_distributed(ℓ_itp, 2000, δ = 0.75, report = DynamicHMC.ReportSilent());
 74.398652 seconds (42.98 M allocations: 2.229 GiB, 1.56% gc time)
In [18]:
@time chains2, tuned_samplers2 = NUTS_init_tune_distributed(ℓ_itp, 2000, δ = 0.75, report = DynamicHMC.ReportSilent());
 39.811955 seconds (409.17 k allocations: 53.151 MiB, 0.11% gc time)

I am fairly certain Julia’s allocation tracker doesn’t see what the child processes are doing, and that the run in fact allocated oodles (terabytes) of memory in total. Obviously not all at the same time, but it means the garbage collector was rather busy, and actually ate a significant chunk of this runtime.
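One hedged way to peek at a worker’s own allocation totals is to query its GC counters directly. Note that `Base.gc_num()` is an internal counter whose fields may change between Julia versions, so treat this as a diagnostic sketch rather than a supported API:

```julia
using Distributed
w = addprocs(1)[1]   # spawn one worker; addprocs returns the new worker ids

# @time's allocation numbers only cover the master process; memory
# allocated on workers is invisible to it. Ask the worker for its own
# cumulative allocation count (an internal, version-dependent counter):
worker_allocd = remotecall_fetch(() -> Base.gc_num().allocd, w)
```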

Either way, 75 seconds to compile and run, and then 40 seconds to run again! Pretty good, especially compared to Stan, where many chains took 6000+ seconds to run.

The chains we get back are raw, unconstrained parameters. That is, they’re simply chains of vectors of length 94.
In the logistic regression example, they were vectors of length 5, and it was easy to associate the first with $\beta_0$ and the remaining 4 with $\beta_1$.
Here, we have many more parameters, and it would be cumbersome to connect individual indices of our vectors with the actual parameters they correspond to.
Many of them also underwent constraining transformations, for example $\textbf{L}$, the Cholesky factor of a correlation matrix.
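To illustrate what a constraining transformation does, here are two generic textbook examples of maps from unconstrained $\mathbb{R}$ to a constrained space. These are illustrative only, not necessarily the exact transforms this library uses:

```julia
# Generic examples of constraining transforms from unconstrained ℝ:
to_positive(x) = exp(x)            # ℝ → (0, ∞), e.g. a σ or κ
to_bounded(x)  = tanh(x)           # ℝ → (-1, 1), e.g. the correlation ρ
to_positive(0.0), to_bounded(0.0)  # (1.0, 0.0)
```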

Thankfully, the function constrain will constrain our parameter vectors, producing the actual (constrained) parameters of our model:

In [19]:
chains = vcat(chains1, chains2)
tuned_samplers = vcat(tuned_samplers1, tuned_samplers2)
itp_samples = [constrain.(Ref(ℓ_itp), get_position.(chain)) for chain  chains];

These constrained samples are named tuples. For example, the last sample from the 7th chain is:

In [20]:
itp_samples[7][end]
Out[20]:
(μₕ₂ = 8.944687419134594, μᵣ₁ = [-0.2209489263015413, 1.400339039462079, 0.8563758356130792, -0.09319533252816282], σ = [1.4353275728838908, 1.2711232029274753, 1.7017282672246694, 1.6164878578320487, 1.5993114454670705, 1.5736236729475752, 1.6717141364732622, 1.3831556371045113, 1.46532307501429], σᵦ = 0.23323383941740264, θ = [-1.2820555869284855, -2.5818746082507746, -0.6942945650519528, 0.28928980157302076, 0.6130718184121277, 0.09091476958584863, 0.26788146406385366, -0.017753757780973958, -1.0583491520446193], μᵣ₂ = [-0.037297992654770955, -0.8595831101221987, -0.07713952788683394, 0.014032205445088658], ρ = 0.7027516539891208, σₕ = 1.4080832797483769, μₕ₁ = -3.5972036712220907, κ = [0.20839717647057018, 0.24550752857532468, 0.22847063947825214, 0.2951099139409577, 0.1725603383049393, 0.1999419001238751, 0.29772060832035296, 0.2421514703753704, 0.18087052545560814], L = [1.0 0.0 … 0.0 0.0; 0.07815790204093354 0.9969409924105638 … 0.0 0.0; … ; -0.06415561782110357 0.1110238358089644 … 0.9546919424181347 0.0; -0.15794825721051997 0.06742557882118196 … 0.3429554889740356 0.8235239505841849], βᵣ₂ = [-0.44180262607525944, -0.7840128804899973, -0.16720805139273834, -0.1421303214584531, 0.8148927735342953, 1.21741570380586, -0.2649796554115259, -1.2042663848822501, -1.429458109194492], βᵣ₁ = [-0.8328851555469112, 0.3433555423398755, 0.7670286093237305, 0.421760393148893, -0.17760224052922138, -0.5600755117125109, -0.7673898331927972, 1.7146868640619084, 0.8770182512715042])

Glancing at this named tuple, we see for example that $\mu_{h2}\approx 8.94$ for this sample.
If we wanted to extract a specific parameter, say $\textbf{L}$, the Cholesky factor of a correlation matrix, all we must do is:

In [21]:
L_7_end = itp_samples[7][end].L # L from the last sample of the 7th chain
Out[21]:
9×9 LKJ_Correlation_Cholesky{9,Float64,45}:
  1.0         0.0         0.0        …  0.0        0.0       0.0     
  0.0781579   0.996941    0.0           0.0        0.0       0.0     
 -0.0517606   0.239933    0.969409      0.0        0.0       0.0     
  0.192737    0.19944     0.259472      0.0        0.0       0.0     
  0.105598   -0.158423   -0.123241      0.0        0.0       0.0     
 -0.328368    0.0804611   0.110591   …  0.0        0.0       0.0     
  0.0833114   0.16519    -0.104868      0.928221   0.0       0.0     
 -0.0641556   0.111024   -0.168961      0.0443136  0.954692  0.0     
 -0.157948    0.0674256   0.0878966     0.337035   0.342955  0.823524

We can confirm that this is in fact the Cholesky factor of a correlation matrix:

In [22]:
L_7_end * L_7_end'
Out[22]:
9×9 Array{Float64,2}:
  1.0         0.0781579  -0.0517606  …   0.0833114  -0.0641556  -0.157948 
  0.0781579   1.0         0.235154       0.171196    0.10567     0.0548744
 -0.0517606   0.235154    1.0           -0.0663371  -0.133834    0.109561 
  0.192737    0.213893    0.28941       -0.155993    0.154364   -0.0608108
  0.105598   -0.149685   -0.162947       0.14434    -0.0491739  -0.131416 
 -0.328368    0.0545505   0.14351    …  -0.133764   -0.0588526   0.26108  
  0.0833114   0.171196   -0.0663371      1.0         0.0301068   0.263405 
 -0.0641556   0.10567    -0.133834       0.0301068   1.0         0.33285  
 -0.157948    0.0548744   0.109561       0.263405    0.33285     1.0      

Note that the output type is Array{Float64,2}. That is the standard, unsized, heap-allocated Julia array type.
These are efficient at large sizes, but not at small ones. The product was computed by a generic fallback method for matrix multiplication: a sign that I haven’t yet implemented a method that autospecializes for these LKJ_Correlation_Cholesky types.
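The defining property checked above is easy to demonstrate with a generic construction, unrelated to LKJ_Correlation_Cholesky itself: any lower-triangular matrix with unit-norm rows yields a product with a unit diagonal.

```julia
using LinearAlgebra, Random

# Build a valid correlation-matrix Cholesky factor by normalizing the rows
# of a random lower-triangular matrix, then verify diag(L*L') ≈ 1.
Random.seed!(0)
A = Matrix(LowerTriangular(randn(9, 9)))
L = A ./ norm.(eachrow(A))              # divide row i by its Euclidean norm
R = L * L'
all(x -> isapprox(x, 1.0), diag(R))     # true: unit diagonal, as required
```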

Let’s focus our analysis on the parameters $\mu_{h1}, \mu_{h2},$ and $\rho$. We’ll look at the effective sample sizes of these parameters on each of the chains.

In [23]:
using MCMCDiagnostics

μₕ₁_chains = [[s.μₕ₁ for s  sample] for sample  itp_samples]
μₕ₂_chains = [[s.μₕ₂ for s  sample] for sample  itp_samples]
ρ_chains = [[s.ρ for s  sample] for sample  itp_samples]

poi_chains = (μₕ₁_chains, μₕ₂_chains, ρ_chains)

ess = [effective_sample_size(s[i]) for i  eachindex(itp_samples), s  poi_chains]
ess[1:14,:]
Out[23]:
14×3 Array{Float64,2}:
   24.2331   129.756   608.285
 1122.39    1161.46   2000.0  
  557.425    713.089  1960.48 
  741.728    573.58   1645.54 
  903.667    635.633  1686.59 
  635.705    970.488  2000.0  
  166.988    546.471  1160.26 
  935.116   1002.71   1080.87 
  391.48     953.45   2000.0  
 1288.98    1243.24   2000.0  
  366.981    385.36   2000.0  
  773.445    631.29   2000.0  
  705.258    419.628  2000.0  
 1159.84     773.693  1952.02 
In [24]:
ess[15:end,:]
Out[24]:
14×3 Array{Float64,2}:
  536.894    373.769    2000.0    
 1132.03     983.7      2000.0    
  851.267    517.848    1731.37   
  913.495    436.651    1641.12   
   10.134     67.3014     25.3701 
  367.588    455.135    2000.0    
  345.227    378.014    2000.0    
  723.417    426.147    2000.0    
    2.99284    2.99284     2.99288
  723.944    679.49     1717.72   
  698.047    709.301    1783.71   
  387.252    281.523    1652.33   
  900.822    830.342    2000.0    
  474.353    356.067    2000.0    

We see that one chain completely failed to converge, while another two had rather low effective sample sizes. I’ll filter out all three of these chains.

Let’s look at the NUTS statistics of those chains that failed to converge.

In [25]:
converged = vec(sum(ess, dims = 2)) .> 1000
not_converged = .! converged
NUTS_statistics.(chains[not_converged])
Out[25]:
3-element Array{DynamicHMC.NUTS_Statistics{Float64,DataStructures.Accumulator{DynamicHMC.Termination,Int64},DataStructures.Accumulator{Int64,Int64}},1}:
 Hamiltonian Monte Carlo sample of length 2000
  acceptance rate mean: 0.72, min/25%/median/75%/max: 0.0 0.62 0.79 0.91 1.0
  termination: AdjacentDivergent => 8% AdjacentTurn => 2% DoubledTurn => 90%
  depth: 2 => 1% 3 => 4% 4 => 2% 5 => 2% 6 => 91% 7 => 0% 8 => 0%
          
 Hamiltonian Monte Carlo sample of length 2000
  acceptance rate mean: 0.68, min/25%/median/75%/max: 0.0 0.58 0.79 0.92 1.0
  termination: AdjacentDivergent => 17% AdjacentTurn => 10% DoubledTurn => 73%
  depth: 1 => 0% 2 => 7% 3 => 8% 4 => 1% 5 => 1% 6 => 79% 7 => 4% 8 => 0%

 Hamiltonian Monte Carlo sample of length 2000
  acceptance rate mean: 0.06, min/25%/median/75%/max: 0.0 0.0 0.0 0.0 0.96
  termination: AdjacentDivergent => 21% DoubledTurn => 79%
  depth: 2 => 72% 3 => 26% 4 => 2% 5 => 0%
                                                     

For the chain that completely failed, we see very shallow trees and poor acceptance rates, like with Stan.
Let’s look at the corresponding statistics of the first three chains that did converge:

In [26]:
NUTS_statistics.((chains[converged])[1:3])
Out[26]:
3-element Array{DynamicHMC.NUTS_Statistics{Float64,DataStructures.Accumulator{DynamicHMC.Termination,Int64},DataStructures.Accumulator{Int64,Int64}},1}:
 Hamiltonian Monte Carlo sample of length 2000
  acceptance rate mean: 0.76, min/25%/median/75%/max: 0.0 0.66 0.8 0.91 1.0
  termination: AdjacentDivergent => 2% AdjacentTurn => 1% DoubledTurn => 97%
  depth: 3 => 0% 4 => 0% 5 => 1% 6 => 98% 7 => 0%

 Hamiltonian Monte Carlo sample of length 2000
  acceptance rate mean: 0.78, min/25%/median/75%/max: 0.0 0.68 0.81 0.93 1.0
  termination: AdjacentDivergent => 1% AdjacentTurn => 2% DoubledTurn => 97%
  depth: 4 => 0% 5 => 1% 6 => 99% 7 => 0%
       
 Hamiltonian Monte Carlo sample of length 2000
  acceptance rate mean: 0.8, min/25%/median/75%/max: 0.03 0.7 0.83 0.94 1.0
  termination: AdjacentDivergent => 0% AdjacentTurn => 3% DoubledTurn => 96%
  depth: 4 => 0% 5 => 0% 6 => 98% 7 => 1% 8 => 0%

Remember that the third of these produced much lower effective sample sizes for $\mu_{h_1}$ and $\mu_{h_2}$ than the first two did. Yet its acceptance rate looks good, and we don’t see an obvious difference in mean treedepth versus chain 2, although its depth was more variable than that of chains 1 or 2.
Chain 1, which had the best effective sample sizes, had a tree depth of 6 about 98% of the time.

Let’s also look at the tuned samplers:

In [27]:
tuned_samplers[not_converged]
Out[27]:
3-element Array{NUTS{Array{Float64,1},Float64,ProbabilityModels.ScalarVectorPCG{4},DynamicHMC.Hamiltonian{ITPModel{94,ChunkedArray{Float64,2,ConstantFixedSizePaddedArray{Tuple{24,9},Float64,2,24,216},1,3,UInt8},Domains{(2, 2, 2, 3)},Val{RealFloat},Val{RealVector{4,T,P,L} where L where P where T},Val{PositiveVector{9,T,P,L} where L where P where T},Val{PositiveFloat},Val{RealVector{9,T,P,L} where L where P where T},Val{RealVector{4,T,P,L} where L where P where T},Val{BoundedFloat{-1,1,T} where T},Val{PositiveFloat},Val{RealFloat},Val{PositiveVector{9,T,P,L} where L where P where T},Val{LKJ_Correlation_Cholesky{9,T,L} where L where T},ConstantFixedSizePaddedArray{Tuple{23},Float64,1,24,24},ChunkedArray{Float64,2,ConstantFixedSizePaddedArray{Tuple{24,9},Float64,2,24,216},1,3,UInt8},Val{RealVector{9,T,P,L} where L where P where T},ConstantFixedSizePaddedArray{Tuple{24},Float64,1,24,24},Val{RealVector{9,T,P,L} where L where P where T}},GaussianKE{Array{Float64,2},UpperTriangular{Float64,Array{Float64,2}}}},ReportSilent},1}:
 NUTS sampler in 94 dimensions
  stepsize (ϵ) ≈ 0.078
  maximum depth = 10
  Gaussian kinetic energy, √diag(M⁻¹): [0.6277683611416773, 0.596216010427481, 0.623033697844963, 0.5566804868923119, 0.5949096313330703, 0.012062028354466852, 0.012758629538525756, 0.012143486576275028, 0.01139457586860317, 0.011567038023288309, 0.01135710961910725, 0.01185059652933793, 0.011842757935658161, 0.01126534998456878, 0.42142854889500697, 0.08183028691079794, 0.08012828505736075, 0.09881133816745322, 0.09672259539452756, 0.09200352231625367, 0.08346840346919217, 0.0984994299146975, 0.08320450840184299, 0.07964192095200184, 0.5375519653540265, 0.6588267910765878, 0.5927966264257182, 0.5505926139039595, 0.014391773615390511, 0.33149364972340495, 0.5879631165151611, 0.049149229453739396, 0.04171473861868931, 0.0638763531966401, 0.056916950058465775, 0.07384131962482554, 0.05561327966197441, 0.04335143974667679, 0.04777534578729736, 0.05393646556235126, 0.026284085101291346, 0.028116430185156603, 0.029475131230053908, 0.025967916327435076, 0.026068084099733744, 0.02494703509631611, 0.026353471370727265, 0.025247503635006715, 0.026959826907642037, 0.024613031888794715, 0.02469031672466362, 0.02523829587032551, 0.025919329265635033, 0.025848343923092883, 0.026982020108658338, 0.028091287597739774, 0.027111066888743943, 0.02670217043000058, 0.025890971035087373, 0.025373222628072904, 0.025337018429715086, 0.028243492175451926, 0.026010574686040056, 0.025179110567401804, 0.026778838641127605, 0.029946616589518877, 0.026769404703776034, 0.028083246259703838, 0.02588721646667019, 0.028333014806928198, 0.02386942700749046, 0.0255514143957361, 0.026562347258930783, 0.026167510260012943, 0.025515188984689675, 0.02495991502676027, 0.7339256093039761, 0.7493941141589919, 0.7532097222893933, 0.7158560620933015, 0.9061606702833714, 0.9165693534562586, 0.7790827517848723, 0.6942741397994932, 0.8088920003960505, 0.8432664709043102, 0.7989042774471319, 0.7556396240153284, 0.7844052523799955, 0.7988065389498867, 0.7829756265595562, 0.7416558072684072, 0.7062323019955251, 
0.6862666418358305]
                                                                                                                                                                          
 NUTS sampler in 94 dimensions
  stepsize (ϵ) ≈ 0.061
  maximum depth = 10
  Gaussian kinetic energy, √diag(M⁻¹): [0.6128770123899545, 0.6048273028967667, 0.6308295200437481, 0.5432478959500822, 0.5622649428757419, 0.012014855125275793, 0.011294199081252744, 0.012836529687100766, 0.012381470379076104, 0.011538631293602965, 0.011408496534807706, 0.011847501708107812, 0.011691660190294235, 0.011620701872205083, 0.37841377234429857, 0.0841846022667437, 0.07667901685837525, 0.09952871172801352, 0.1003877940381503, 0.091784394725657, 0.09198441598642987, 0.09663184488704728, 0.08008684561151251, 0.08211554146895071, 0.5306603869140116, 0.6247132570899098, 0.5583298145929688, 0.5261479945071026, 0.014758226640451826, 0.3722508295087927, 0.6107585824689881, 0.04647928394522552, 0.04375789189247665, 0.0687317622403065, 0.055858671870905034, 0.0690152149843714, 0.04772446505047933, 0.04239520758303713, 0.04759069936122897, 0.05610453119149407, 0.027556037203683964, 0.025156279985600636, 0.026977704990583818, 0.0272404743934342, 0.026576051947727018, 0.02833629436592387, 0.02483907100113558, 0.025669142004492388, 0.02495452551557473, 0.024862636338604367, 0.02598502744005569, 0.02741809824778307, 0.02685553559881865, 0.025671075116392748, 0.030095081972252663, 0.02413699103677456, 0.026689453924549817, 0.02622364665187369, 0.026114680316961476, 0.025628027281431844, 0.026596411608935146, 0.025971427608624708, 0.02784239713244614, 0.02799164262589996, 0.02668408210687823, 0.026638984328567178, 0.026962832416633168, 0.027561279305326654, 0.025848495077472063, 0.027201099363894178, 0.027421057220667937, 0.027239168216294556, 0.02721277290209621, 0.02679475826266052, 0.024006730649577087, 0.02897433559573642, 0.7096056022485335, 0.7328788274540327, 0.6830165407003129, 0.7165267976779206, 0.8500804194195068, 0.8463962495213547, 0.8171626644275035, 0.7038893285727941, 0.77660019619465, 0.7911430653718692, 0.7868584735460855, 0.8317800802256453, 0.8136003332168146, 0.8104984634595082, 0.8391643609866997, 0.7312457087150874, 0.7070990485714901, 
0.6622642119011172]
                                                                                                                                                                                   
 NUTS sampler in 94 dimensions
  stepsize (ϵ) ≈ 3.82e-5
  maximum depth = 10
  Gaussian kinetic energy, √diag(M⁻¹): [0.00014738319012975768, 8.077192336917283e-5, 6.943094104068127e-5, 7.19717382930257e-5, 3.038512234720038e-5, 3.8818802252638995e-5, 0.0011585203976719528, 0.0008404226127264168, 0.0006947166306841706, 0.00048520595395872826, 9.527455351854144e-6, 2.210415995091748e-5, 7.923341731007402e-5, 5.612347545531166e-5, 0.0006782826654100789, 6.27679355401914e-5, 0.0001525179746971026, 0.0001459109200927568, 0.0002169819202794901, 0.00015198172753297563, 2.1717886942716705e-5, 9.756691333363823e-5, 1.892524681095015e-5, 1.9612618658229532e-5, 4.3469936967442666e-5, 0.0001318004428489255, 3.758079772731855e-5, 1.3668517172675462e-5, 0.004463511976120239, 0.0001576108927035259, 0.00020976636322677728, 2.676481552778341e-5, 1.4560103841265782e-5, 4.852131080271657e-5, 6.549681622929582e-5, 3.8834473716502366e-5, 8.791542634705384e-6, 3.073766394945401e-5, 1.9017252196211747e-5, 2.6122062464506286e-5, 1.7941551593322802e-5, 5.382378099033665e-5, 1.1277959313928232e-5, 0.00032319567740787526, 2.03172741188628e-5, 2.6234978362081674e-5, 0.00013617624673428205, 7.311170773131256e-5, 0.00022763034727105795, 0.00023017799088064834, 0.00090808236791735, 5.053115532400914e-5, 0.00012248154899042224, 0.00027403788607313856, 0.0003058799284963771, 0.00010447957933827817, 0.0005902549722019506, 0.00013065556202533984, 9.390773831713708e-5, 0.0002495736859571633, 5.286361166033092e-5, 0.0002737673312205595, 3.7874516537467045e-5, 0.00010709083146655536, 0.0003734267479080438, 6.781917776688348e-5, 0.00013571307825016864, 0.00025222422514293446, 0.00028080904285854595, 0.0005347552686258406, 1.4259056641988715e-5, 6.017274512261336e-5, 1.1620674882839013e-5, 1.2447623021854116e-5, 0.00019038428217100915, 0.0002285231277733912, 1.818402980381017e-5, 1.9942050653528433e-5, 0.00013573622234794766, 0.00016248077574505435, 4.600471366721049e-5, 2.6897935732679763e-5, 4.648909720666999e-5, 2.048818499200105e-5, 2.3346912373576046e-5, 
1.3482984815760197e-5, 0.00021665564153776286, 0.0001621551557840225, 6.997158300489517e-5, 0.00022094469290528517, 1.8257319456925155e-5, 1.8814862053394525e-5, 3.736035739387117e-5, 2.605880129969929e-5]
In [28]:
tuned_samplers[converged][1:3]
Out[28]:
3-element Array{NUTS{Array{Float64,1},Float64,ProbabilityModels.ScalarVectorPCG{4},DynamicHMC.Hamiltonian{ITPModel{94,ChunkedArray{Float64,2,ConstantFixedSizePaddedArray{Tuple{24,9},Float64,2,24,216},1,3,UInt8},Domains{(2, 2, 2, 3)},Val{RealFloat},Val{RealVector{4,T,P,L} where L where P where T},Val{PositiveVector{9,T,P,L} where L where P where T},Val{PositiveFloat},Val{RealVector{9,T,P,L} where L where P where T},Val{RealVector{4,T,P,L} where L where P where T},Val{BoundedFloat{-1,1,T} where T},Val{PositiveFloat},Val{RealFloat},Val{PositiveVector{9,T,P,L} where L where P where T},Val{LKJ_Correlation_Cholesky{9,T,L} where L where T},ConstantFixedSizePaddedArray{Tuple{23},Float64,1,24,24},ChunkedArray{Float64,2,ConstantFixedSizePaddedArray{Tuple{24,9},Float64,2,24,216},1,3,UInt8},Val{RealVector{9,T,P,L} where L where P where T},ConstantFixedSizePaddedArray{Tuple{24},Float64,1,24,24},Val{RealVector{9,T,P,L} where L where P where T}},GaussianKE{Array{Float64,2},UpperTriangular{Float64,Array{Float64,2}}}},ReportSilent},1}:
 NUTS sampler in 94 dimensions
  stepsize (ϵ) ≈ 0.0843
  maximum depth = 10
  Gaussian kinetic energy, √diag(M⁻¹): [0.6309463842882493, 0.5997960138939399, 0.6546435173716084, 0.5304222731419923, 0.548431246569161, 0.011734040722030202, 0.012813206343517148, 0.011544666984463492, 0.012001901446875319, 0.013176681148078495, 0.012054788115481205, 0.012193656401794485, 0.01159458891956949, 0.012184703359373888, 0.4040198674669616, 0.08294017141778165, 0.07626857448618159, 0.09441817295930148, 0.10316203957464752, 0.0919303712170745, 0.08211377458744487, 0.08747979923775982, 0.0820512225266604, 0.08264700453015734, 0.5217238403547956, 0.5524219790901299, 0.5361868414033862, 0.5174849942181842, 0.015728362278287833, 0.37724379978138217, 0.6374059745168612, 0.047697167917689076, 0.04111978114534379, 0.06339021971124152, 0.0582294064063019, 0.06639230630472656, 0.05722552540281084, 0.041311368214560995, 0.04344149320418269, 0.05343211870789878, 0.027072946242422933, 0.02588492547565767, 0.026557435246397736, 0.026461593809781705, 0.02613607342795048, 0.02608526585280564, 0.026835260385342376, 0.02744573750070027, 0.027594445370802006, 0.02660976117668538, 0.025857111969615158, 0.025743387572105786, 0.026144537443536616, 0.02814154056726786, 0.02669501481958122, 0.025832623766899397, 0.026328186616212862, 0.027674169533030635, 0.025137093124432613, 0.026219828936941282, 0.02583938124496037, 0.02750001266098258, 0.02603713679396949, 0.02436557375618744, 0.027453729851568805, 0.025555680787875405, 0.027235803284882794, 0.027752018200549546, 0.027401699131780536, 0.0263189883830063, 0.029860363173389543, 0.026665863476389304, 0.02803888508602584, 0.022728229125608998, 0.025401618040581348, 0.030175069242168787, 0.7052237805440772, 0.728500973440587, 0.6634547422711475, 0.655728972162171, 0.813333138932818, 0.8287969369864129, 0.7820392021272744, 0.717058125357493, 0.7863121735828574, 0.7611227779111842, 0.8105354886225663, 0.7220431992266195, 0.736448401361643, 0.7865391211839371, 0.7908719379432513, 0.7300809567940342, 0.7684991361795847, 
0.687771155648962]
         
 NUTS sampler in 94 dimensions
  stepsize (ϵ) ≈ 0.0843
  maximum depth = 10
  Gaussian kinetic energy, √diag(M⁻¹): [0.5285385462745646, 0.5855277952359289, 0.6315391463342584, 0.5300750542823836, 0.5763691125009497, 0.011452245477822318, 0.012308585639806809, 0.011360138133607273, 0.01157159146473685, 0.01230623450921443, 0.012569751152844422, 0.011530960477623234, 0.01144212528570984, 0.011993088583715922, 0.3381463723707466, 0.09135680172447003, 0.07512992192063869, 0.1003845972987559, 0.10175076701522294, 0.09754215457586232, 0.09443298186773968, 0.10433282421662102, 0.08740170495300852, 0.0833204810419767, 0.5312133018126064, 0.5910997896809018, 0.5562311659826006, 0.5564130973569661, 0.014618069307655204, 0.27745888257370394, 0.5130389703027243, 0.044712878851918526, 0.03994352175738663, 0.06467794835439813, 0.05704037426468085, 0.06601108183828316, 0.052018188958925966, 0.04430113557323915, 0.041466257923335734, 0.05686643969326523, 0.025011544816970394, 0.0270716976847648, 0.026844341367981354, 0.026600400432591834, 0.02606213562924881, 0.02806556255240926, 0.026515674269662105, 0.026615668399325342, 0.02931415015763269, 0.026875139775205597, 0.02544317054047768, 0.02534909114633422, 0.027134967887431163, 0.029253617536014156, 0.02765624035747886, 0.02630650677343718, 0.025755596163920494, 0.026569756381369124, 0.02650808438497664, 0.026293821953365066, 0.02655476538739036, 0.026334096850277697, 0.02690308429871072, 0.02654185268681088, 0.026122294131154508, 0.026838383443668137, 0.02490245590503594, 0.027028752826823484, 0.027866640166356992, 0.028384499736303737, 0.027017869046199183, 0.027013156562764186, 0.023871163597273357, 0.025881711690464874, 0.024469176324156395, 0.02556946792990661, 0.6553890531366146, 0.6629977883690915, 0.7131622288586198, 0.6930530117941557, 0.8593951900615213, 0.8019700121900678, 0.7219022570521191, 0.6977584702917989, 0.7271402248679014, 0.8090099949776399, 0.829751616326894, 0.8335327113019703, 0.8111420899658006, 0.7297997058063012, 0.7583326201010807, 0.7007657170673636, 0.6675941443275774, 
0.7283720125998185]
    
 NUTS sampler in 94 dimensions
  stepsize (ϵ) ≈ 0.0711
  maximum depth = 10
  Gaussian kinetic energy, √diag(M⁻¹): [0.5491814188373191, 0.5584508372592785, 0.6429194108095596, 0.5380599308093037, 0.55546362991702, 0.01142873662654436, 0.012116588583464928, 0.01167160393974744, 0.011991671706797734, 0.01143426512699172, 0.01172842683569287, 0.010664866079548094, 0.010523284046821275, 0.011298117248228691, 0.41051091604781864, 0.08339076184786429, 0.07793053574412262, 0.10331315642170406, 0.10335985425507038, 0.08981512028700873, 0.09403849193351645, 0.10315078120763908, 0.08487079400229718, 0.08791842936773203, 0.5291559316436041, 0.5833803135503485, 0.5508740011975601, 0.5311470478589082, 0.014356644762153209, 0.3385960888875418, 0.5943393303739113, 0.04374275266689454, 0.04002238201735633, 0.07014213702525497, 0.05514692250402814, 0.06420618145241389, 0.050650374304357516, 0.03881269573216551, 0.044484146085053376, 0.05571681311058528, 0.028294096355031554, 0.025979163296706607, 0.026538805564789752, 0.026940880418840858, 0.027961355078883362, 0.026903866545600463, 0.026400206002439412, 0.026899502061414293, 0.02454539162703539, 0.02714372112748547, 0.024859659780630934, 0.02756080989595376, 0.029489088537015255, 0.02845800880197371, 0.02542439886624742, 0.02580449319311481, 0.027833387906577457, 0.02767927138206551, 0.027483978883768783, 0.026114972067914098, 0.026855307527140604, 0.02789515991393559, 0.025343655395355283, 0.025902595584071972, 0.027142799539297147, 0.027318752513752126, 0.026977276182240845, 0.025328466762441645, 0.026672922472174537, 0.026165467676688692, 0.028373954269781492, 0.026136847260210053, 0.02779512573978342, 0.026241008591272236, 0.025461478431456703, 0.024807785383922164, 0.7206533153919474, 0.6745437825708636, 0.7215274072651837, 0.6960088055431405, 0.9513512548090182, 0.8593283780213314, 0.8026965689680214, 0.7016535143057462, 0.737230307963851, 0.7382329603118651, 0.7498970472104073, 0.8026487808115708, 0.7980381653666924, 0.7898650893117133, 0.8013790327449021, 0.7405418337603221, 0.7228742222603829, 
0.7178004293698728]

The chain that completely failed had a very small stepsize, and the two that did relatively poorly also had smaller step sizes than the chains that converged.
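For illustration, here is a hypothetical helper (the name and the fixed-factor cutoff are my own, not part of the DynamicHMC API) that flags chains whose adapted stepsize collapsed relative to the others, using the stepsizes reported above:

```julia
using Statistics

# Hypothetical convenience: flag chains whose adapted stepsize ϵ fell far
# below the median, a quick heuristic for spotting chains that failed
# during adaptation.
function suspicious_stepsizes(ϵs::AbstractVector; factor = 10)
    cutoff = median(ϵs) / factor
    [i for (i, ϵ) in enumerate(ϵs) if ϵ < cutoff]
end

suspicious_stepsizes([0.061, 0.0843, 3.82e-5, 0.0711]) # flags chain 3
```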

Let's look at the total effective sample sizes of the chains that did converge:

In [29]:
poi_chain = [vcat((chains[converged])...) for chains ∈ poi_chains]
μₕ₁_chain, μₕ₂_chain, ρ_chain = poi_chain

(
    μₕ₁_ess = effective_sample_size(μₕ₁_chain),
    μₕ₂_ess = effective_sample_size(μₕ₂_chain),
    ρ_ess = effective_sample_size(ρ_chain)
)
Out[29]:
(μₕ₁_ess = 14396.510181546806, μₕ₂_ess = 15458.051655799783, ρ_ess = 50000.0)
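As a rough sketch of the idea behind `effective_sample_size` (the actual MCMCDiagnostics implementation differs in how it truncates the autocorrelation sum), the estimate is $N / (1 + 2\sum_k \rho_k)$, summing lag-$k$ autocorrelations until the first non-positive one:

```julia
using Statistics

# Minimal ESS sketch: ESS = N / (1 + 2 Σ ρₖ), summing sample lag-k
# autocorrelations ρₖ until the first non-positive one. Not the exact
# MCMCDiagnostics algorithm.
function ess_sketch(x::AbstractVector)
    N = length(x)
    c = x .- mean(x)
    v = sum(abs2, c) / N       # (biased) sample variance
    τ = 1.0                    # integrated autocorrelation time
    for k in 1:N-1
        ρ = sum(@view(c[1:N-k]) .* @view(c[k+1:N])) / (N * v)
        ρ <= 0 && break
        τ += 2ρ
    end
    N / τ
end
```

A strongly autocorrelated chain yields an ESS far below its raw length, which is why the μ chains above sit well under the 50,000 total draws while the nearly uncorrelated ρ chain reaches it.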
In [30]:
μₕ₁_chains, μₕ₂_chains, ρ_chains = (chains[converged] for chains ∈ poi_chains)
(
    μₕ₁_r̂ = potential_scale_reduction(μₕ₁_chains...),
    μₕ₂_r̂ = potential_scale_reduction(μₕ₂_chains...),
    ρ_r̂  = potential_scale_reduction(ρ_chains...)
)
Out[30]:
(μₕ₁_r̂ = 1.0009681864564473, μₕ₂_r̂ = 1.0008285876590401, ρ_r̂ = 1.000144959408677)

Our effective sample sizes and $\hat{r}$s look good.
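For reference, a minimal sketch of the Gelman-Rubin statistic that `potential_scale_reduction` estimates (MCMCDiagnostics may differ in details such as chain splitting): compare the between-chain variance $B$ to the mean within-chain variance $W$.

```julia
using Statistics

# Minimal R̂ sketch: m chains of length n each. R̂ near 1 indicates the
# chains agree; R̂ ≫ 1 indicates they are exploring different regions.
function psr_sketch(chains::AbstractVector{<:AbstractVector})
    n = length(first(chains))
    W = mean(var.(chains))               # mean within-chain variance
    B = n * var(mean.(chains))           # between-chain variance
    sqrt(((n - 1) / n * W + B / n) / W)  # R̂
end
```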

Finally, let’s look at the major quantiles for these parameters, and compare to the true values:

In [31]:
using Statistics

major_quantiles = [0.05 0.25 0.5 0.75 0.95]

true_values = (μₕ₁, μₕ₂, ρ)

for i ∈ eachindex(poi_chain)
    display("Major Quantiles for parameter with true values: $(true_values[i]):")
    display(vcat(major_quantiles, quantile(poi_chain[i], major_quantiles)))
end
"Major Quantiles for parameter with true values: -3.0:"
2×5 Array{Float64,2}:
  0.05     0.25      0.5       0.75      0.95   
 -3.8202  -3.21029  -2.85386  -2.49156  -1.87508
"Major Quantiles for parameter with true values: 9.0:"
2×5 Array{Float64,2}:
 0.05     0.25     0.5    0.75     0.95   
 7.54145  8.18481  8.551  8.90634  9.50759
"Major Quantiles for parameter with true values: 0.7:"
2×5 Array{Float64,2}:
 0.05      0.25      0.5       0.75      0.95    
 0.694748  0.698419  0.700964  0.703497  0.707206

I chose $0.05$ and $0.95$ as the extreme quantiles, in place of the usual $0.025$ and $0.975$, to match stansummary.