deepgp
Bayesian Deep Gaussian Processes using MCMC
Performs Bayesian posterior inference for deep Gaussian processes following Sauer, Gramacy, and Higdon (2023, doi:10.48550/arXiv.2012.08015). See Sauer (2023, http://hdl.handle.net/10919/114845) for comprehensive methodological details and https://bitbucket.org/gramacylab/deepgp-ex/ for a variety of coding examples. Models are trained through MCMC, including elliptical slice sampling of latent Gaussian layers and Metropolis-Hastings sampling of kernel hyperparameters. A Vecchia approximation for faster computation is implemented following Sauer, Cooper, and Gramacy (2023, doi:10.48550/arXiv.2204.02904). Optional monotonic warpings are implemented following Barnett et al. (2024, doi:10.48550/arXiv.2408.01540).

Downstream tasks include sequential design through active learning Cohn/integrated mean squared error (ALC/IMSE; Sauer, Gramacy, and Higdon, 2023), optimization through expected improvement (EI; Gramacy, Sauer, and Wycoff, 2022, doi:10.48550/arXiv.2112.07457), and contour location through entropy (Booth, Renganathan, and Gramacy, 2024, doi:10.48550/arXiv.2308.04420).

Models extend up to three layers deep; a one-layer model is equivalent to typical Gaussian process regression. Incorporates OpenMP and SNOW parallelization and utilizes C/C++ under the hood.
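The MCMC workflow described above can be sketched in a few lines. This is a hedged example, not authoritative usage: the function names (`fit_two_layer`, `trim`, `predict`) follow the package's documented interface, but the toy data, MCMC iteration counts, and burn-in/thinning values are illustrative assumptions.

```r
library(deepgp)

# Toy 1-d data: a step-like function that benefits from a warped
# (non-stationary) fit -- illustrative only
f <- function(x) as.numeric(x > 0.5)
x <- matrix(seq(0, 1, length.out = 30), ncol = 1)
y <- f(x) + rnorm(30, sd = 0.01)

# MCMC training of a two-layer deep GP: elliptical slice sampling of
# the latent Gaussian layer, Metropolis-Hastings for kernel
# hyperparameters (iteration count is an illustrative choice)
fit <- fit_two_layer(x, y, nmcmc = 5000)

# Discard burn-in and thin the chains before prediction
fit <- trim(fit, burn = 2500, thin = 2)

# Posterior predictive mean and variance at new locations
xp <- matrix(seq(0, 1, length.out = 100), ncol = 1)
fit <- predict(fit, xp)
plot(fit)
```

For larger training sets, the Vecchia approximation noted above is requested at fit time (e.g. a `vecchia = TRUE` argument to the fitting functions); `fit_one_layer` and `fit_three_layer` follow the same fit/trim/predict pattern with one or three Gaussian layers.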
- Version: 1.1.3
- R version: unknown
- License: LGPL-2 | LGPL-2.1 | LGPL-3
- Needs compilation? Yes
- Last release: 08/19/2024
Team
Annie S. Booth
Dependencies
- Imports: 7 packages
- Suggests: 3 packages
- Linking To: 2 packages