1) Adventures in Statistics I: From multi-armed bandit strategies to designs for phase 3 adaptive Bayesian platform clinical trials
Description
This lecture was presented on September 28, 2021.
The recorded lecture and slides are available to view and download.
The Adventures in Statistics I series was organized by Dr. J. Jack Lee and Dr. Yu Shen from the Department of Biostatistics.
References
- Berry, DA. (1972). A Bernoulli two-armed bandit. Ann Math Statist, 43(3), 871-897.
- Thompson, WR. (1933). On the likelihood that one unknown probability exceeds another in view of the evidence of two samples. Biometrika, 25(3/4), 285-294.
- Scott, SL. (2015). Multi-armed bandit experiments in the online service economy. Appl Stoch Models Bus Ind, 31, 37-49.
- Berry, DA. (1978). Modified two-armed bandit strategies for certain clinical trials. J Am Stat Assoc, 73(362), 339-345.
- Woodcock, J, & LaVange, LM. (2017). Master protocols to study multiple therapies, multiple diseases, or both. N Engl J Med, 377, 62-70.
- Berry, DA, Chen, RW, Zame, A, Heath, DC, & Shepp, LA. (1997). Bandit problems with infinitely many arms. Ann Stat, 25(5), 2103-2116.
Publication Date
9-28-2021
City
Houston
Keywords
Bayesian; Clinical Trials; Research Design
Recommended Citation
Berry, Donald A. PhD, "1) Adventures in Statistics I: From multi-armed bandit strategies to designs for phase 3 adaptive Bayesian platform clinical trials" (2021). Adventures in Statistics Lecture Series II. 1.
https://openworks.mdanderson.org/biostatistics_adventures/1