vglmer - Variational Inference for Hierarchical Generalized Linear Models
Estimates hierarchical models using variational inference. At present, it can estimate logistic, linear, and negative binomial models. It accommodates an arbitrary number of random effects and requires no numerical integration for estimation. It also provides the ability to improve the quality of the approximation using marginal augmentation. Goplerud (2022) <doi:10.1214/21-BA1266> and Goplerud (2024) <doi:10.1017/S0003055423000035> provide details on the variational algorithms.
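A minimal sketch of fitting a logistic model with one random intercept, assuming the package's main fitting function is vglmer() with a formula/data/family interface; the simulated data and variable names are purely illustrative.

library(vglmer)

# Simulate grouped binary data: 20 groups, one covariate
set.seed(1)
df <- data.frame(
  g = factor(sample(1:20, 500, replace = TRUE)),
  x = rnorm(500)
)
group_intercepts <- rnorm(20)
df$y <- rbinom(500, 1, plogis(0.5 * df$x + group_intercepts[df$g]))

# Logistic model with a random intercept for g; estimation uses variational
# inference, so no numerical integration is required
fit <- vglmer(y ~ x + (1 | g), data = df, family = "binomial")
summary(fit)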
Last updated 3 months ago
cpp
5.03 score 18 stars 1 dependent 6 scripts 643 downloads
gKRLS - Generalized Kernel Regularized Least Squares
Kernel regularized least squares, also known as kernel ridge regression, is a flexible machine learning method. This package implements the method by providing a smooth term for use with 'mgcv' and uses random sketching to facilitate scalable estimation on large datasets. It provides additional functions for calculating marginal effects after estimation and for use with ensembles ('SuperLearner'), double/debiased machine learning ('DoubleML'), and robust/clustered standard errors ('sandwich'). Chang and Goplerud (2024) <doi:10.1017/pan.2023.27> provide further details.
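A minimal sketch of using the smooth term inside 'mgcv', assuming the package registers a "gKRLS" basis for s() and exports a calculate_effects() helper for marginal effects; the simulated data and variable names are illustrative only.

library(mgcv)
library(gKRLS)

# Simulate a nonlinear outcome in two covariates
set.seed(1)
dat <- data.frame(x1 = rnorm(300), x2 = rnorm(300))
dat$y <- sin(dat$x1) + 0.5 * dat$x2 + rnorm(300, sd = 0.5)

# Kernel regularized least squares as a smooth term in a standard gam() call;
# random sketching keeps the kernel computations scalable
fit <- gam(y ~ s(x1, x2, bs = "gKRLS"), data = dat)

# Average marginal effect of x1 (assumed helper from the package)
calculate_effects(fit, variables = "x1")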
Last updated 3 months ago
cpp
3.90 score 8 stars 4 scripts 594 downloads
FactorHet - Estimate Heterogeneous Effects in Factorial Experiments Using Grouping and Sparsity
Estimates heterogeneous effects in factorial (and conjoint) models. The methodology employs a Bayesian finite mixture of regularized logistic regressions, where moderators can affect each observation's probability of group membership and a sparsity-inducing prior fuses together levels of each factor while respecting ANOVA-style sum-to-zero constraints. Goplerud, Imai, and Pashley (2024) <doi:10.48550/ARXIV.2201.01357> provide further details.
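A minimal sketch of a two-group fit, assuming a main FactorHet() function with formula, design, K, and moderator arguments as the description suggests; the argument names, the lambda regularization value, and the data are assumptions rather than the package's documented interface.

library(FactorHet)

# Illustrative conjoint-style data: two factors and one moderator
set.seed(1)
dat <- data.frame(
  choice = rbinom(400, 1, 0.5),
  factor_A = factor(sample(c("a1", "a2", "a3"), 400, replace = TRUE)),
  factor_B = factor(sample(c("b1", "b2"), 400, replace = TRUE)),
  mod = rnorm(400)
)

# Two latent groups (K = 2); the moderator shifts group-membership probabilities,
# and the sparsity-inducing prior fuses factor levels within each group
fit <- FactorHet(
  formula = choice ~ factor_A + factor_B,
  design = dat,
  K = 2,
  lambda = 0.1,
  moderator = ~ mod
)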
Last updated 1 month ago
cpp
3.18 score 3 stars 2 scripts 534 downloads