I'm attempting to implement mixed-effects logistic regression in Python. As a point of comparison, I'm using the glmer function from the lme4 package in R. I've found that the statsmodels module has a BinomialBayesMixedGLM that should be able to fit such a model. However, I've encountered a number of issues:
- I find the documentation for the statsmodels function to be not entirely helpful or clear, so I'm not completely sure how to use it appropriately.
- So far, my attempts have not produced results that replicate what I get when fitting the model with glmer in R.
- I expect the BinomialBayesMixedGLM function does not calculate p-values since it is Bayesian, but I can't seem to figure out how to access the full posterior distributions for the parameters.
As a test case, I'm using the titanic dataset available here.
import os
import pandas as pd
import statsmodels.genmod.bayes_mixed_glm as smgb

titanic = pd.read_csv(os.path.join(os.getcwd(), 'titanic.csv'))

# Intended to be a random intercept for each passenger class
r = {"Pclass": '0 + Pclass'}
mod = smgb.BinomialBayesMixedGLM.from_formula('Survived ~ Age', r, titanic)
fit = mod.fit_map()
fit.summary()
#            Type  Post. Mean  Post. SD     SD  SD (LB)  SD (UB)
# Intercept     M      3.1623    0.3616
# Age           M     -0.0380    0.0061
# Pclass        V      0.0754    0.5669  1.078    0.347    3.351
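For the posterior question, my best guess so far is to sample from the Laplace approximation implied by fit_map, on the assumption that fit.cov_params() returns the joint posterior covariance at the MAP estimate (I'm not certain that's what it means here). A self-contained sketch on synthetic data, since the titanic CSV isn't bundled:

```python
import numpy as np
import pandas as pd
import statsmodels.genmod.bayes_mixed_glm as smgb

rng = np.random.default_rng(0)
n = 300

# Synthetic data: binary outcome, one covariate, three groups with
# group-specific intercepts (standing in for Pclass)
df = pd.DataFrame({
    "g": rng.integers(0, 3, n),
    "x": rng.normal(size=n),
})
group_intercepts = np.array([-1.0, 0.0, 1.0])[df["g"]]
p = 1.0 / (1.0 + np.exp(-(0.5 * df["x"] + group_intercepts)))
df["y"] = rng.binomial(1, p)

# Random intercept per group; C() forces g to be treated as categorical
mod = smgb.BinomialBayesMixedGLM.from_formula(
    "y ~ x", {"g": "0 + C(g)"}, df)
fit = mod.fit_map()

# Assumption: for a MAP fit, cov_params() is the joint covariance of the
# Gaussian (Laplace) approximation to the posterior, so approximate
# posterior draws can be taken from N(MAP estimate, cov)
cov = fit.cov_params()
samples = rng.multivariate_normal(fit.params, cov, size=2000)
print(samples.shape)  # one column per parameter (fixed effects,
                      # variance components, random effects)
```

If that reading of cov_params() is wrong, the other route I've seen mentioned is mod.fit_vb(), whose summary reports posterior means and SDs from the variational approximation directly, though that still wouldn't give the full posterior.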
However, aside from the slope for Age, this doesn't appear to match what I get in R with glmer(Survived ~ Age + (1 | Pclass), data = titanic, family = "binomial"):
Random effects:
Groups Name Variance Std.Dev.
Pclass (Intercept) 0.8563 0.9254
Number of obs: 887, groups: Pclass, 3
Fixed effects:
Estimate Std. Error z value Pr(>|z|)
(Intercept) 0.961780 0.573402 1.677 0.0935 .
Age -0.038708 0.006243 -6.200 5.65e-10 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
So what error am I making when creating my model in Python? And, once that's addressed, how can I extract either the posteriors or p-values? Lastly, are there any Python implementations of mixed-effects logistic regression that are more akin to the implementation in R?