Part 1: From multi-armed bandit strategies to designs for phase 3 adaptive Bayesian platform clinical trials
This lecture was presented on September 28, 2021.
The Adventures in Statistics series was organized by Dr. J. Jack Lee and Dr. Yu Shen from the Department of Biostatistics.
Berry, DA. (1972). A Bernoulli two-armed bandit. The Annals of Mathematical Statistics, 43(3), 871-897. https://doi.org/10.1214/aoms/1177692553
Thompson, WR. (1933). On the likelihood that one unknown probability exceeds another in view of the evidence of two samples. Biometrika, 25(3/4), 285-294. https://doi.org/10.2307/2332286
Scott, SL. (2015). Multi-armed bandit experiments in the online service economy. Applied Stochastic Models in Business and Industry, 31, 37-49. https://doi.org/10.1002/asmb.2104
Berry, DA. (1978). Modified two-armed bandit strategies for certain clinical trials. Journal of the American Statistical Association, 73(362), 339-345. https://doi.org/10.1080/01621459.1978.10481579
Woodcock, J, LaVange, LM. (2017). Master protocols to study multiple therapies, multiple diseases, or both. New England Journal of Medicine, 377, 62-70. https://doi.org/10.1056/NEJMra1510062
Berry, DA, Chen, RW, Zame, A, Heath, DC, Shepp, LA. (1997). Bandit problems with infinitely many arms. The Annals of Statistics, 25(5), 2103-2116. https://doi.org/10.1214/aos/1069362389
Keywords: Bayesian; Clinical Trials; Research Design
Berry, Donald A., PhD. "Part 1: From multi-armed bandit strategies to designs for phase 3 adaptive Bayesian platform clinical trials" (2021). Adventures in Statistics Lecture Series. 1.