deepgp: Deep Gaussian Processes using MCMC
Performs posterior inference for deep Gaussian processes following
Sauer, Gramacy, and Higdon (2020) <arXiv:2012.08015>. Models are trained via
MCMC, including elliptical slice sampling of latent Gaussian layers and Metropolis-Hastings
sampling of kernel hyperparameters. The Vecchia approximation for faster computation is implemented
following Sauer, Cooper, and Gramacy (2022) <arXiv:2204.02904>. Downstream tasks include
sequential design through active learning Cohn/integrated mean squared error (ALC/IMSE; Sauer,
Gramacy, and Higdon, 2020) and optimization through expected improvement (EI;
Gramacy, Sauer, and Wycoff, 2021 <arXiv:2112.07457>). Models
extend up to three layers deep; a one-layer model is equivalent to typical Gaussian
process regression. Covariance kernel options are Matérn (default) and squared
exponential. Incorporates OpenMP and SNOW parallelization and utilizes C/C++ under the hood.
Version: 1.1.0
Depends: R (≥ 3.6)
Imports: grDevices, graphics, stats, doParallel, foreach, parallel, GpGp, Matrix, Rcpp, mvtnorm, FNN
LinkingTo: Rcpp, RcppArmadillo
Suggests: interp, knitr, rmarkdown
Published: 2022-12-15
Author: Annie Sauer
Maintainer: Annie Sauer <anniees at vt.edu>
License: LGPL-2 | LGPL-2.1 | LGPL-3 [expanded from: LGPL]
NeedsCompilation: yes
Materials: README
CRAN checks: deepgp results
Please use the canonical form https://CRAN.R-project.org/package=deepgp to link to this page.