Adventures in Statistics Lecture Series
 
Part 1: From multi-armed bandit strategies to designs for phase 3 adaptive Bayesian platform clinical trials


Files

Download Presentation slides (6.8 MB)

Download Closed captions (127 KB)


Description

This lecture was presented on September 28, 2021. The recorded lecture and slides are available to view and download.

The Adventures in Statistics series was organized by Dr. J. Jack Lee and Dr. Yu Shen from the Department of Biostatistics.

References

Berry, DA. (1972). A Bernoulli two-armed bandit. The Annals of Mathematical Statistics, 43(3), 871-897. https://doi.org/10.1214/aoms/1177692553

Thompson, WR. (1933). On the likelihood that one unknown probability exceeds another in view of the evidence of two samples. Biometrika, 25(3/4), 285-294. https://doi.org/10.2307/2332286

Scott, SL. (2015). Multi-armed bandit experiments in the online service economy. Applied Stochastic Models in Business and Industry, 31, 37-49. https://doi.org/10.1002/asmb.2104

Berry, DA. (1978). Modified two-armed bandit strategies for certain clinical trials. Journal of the American Statistical Association, 73(362), 339-345. https://doi.org/10.1080/01621459.1978.10481579

Woodcock, J, & LaVange, LM. (2017). Master protocols to study multiple therapies, multiple diseases, or both. New England Journal of Medicine, 377, 62-70. https://doi.org/10.1056/NEJMra1510062

Berry, DA, Chen, RW, Zame, A, Heath, DC, & Shepp, LA. (1997). Bandit problems with infinitely many arms. The Annals of Statistics, 25(5), 2103-2116. https://doi.org/10.1214/aos/1069362389
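
For context on the lecture's starting point, below is a minimal sketch of Thompson sampling for the Bernoulli two-armed bandit studied in Thompson (1933) and Berry (1972): each arm carries a Beta posterior on its response rate, and each patient is allocated to the arm whose posterior draw is largest. The response rates, sample size, and function name are illustrative assumptions, not taken from the lecture.

    import random

    def thompson_two_armed(true_rates, n_patients, seed=0):
        # Independent Beta(1, 1) priors on each arm's response rate;
        # successes/failures track the posterior Beta parameters.
        rng = random.Random(seed)
        successes = [0, 0]
        failures = [0, 0]
        for _ in range(n_patients):
            # Sample each arm's posterior response rate and treat the
            # next patient on the arm with the larger sampled value.
            draws = [rng.betavariate(1 + successes[k], 1 + failures[k])
                     for k in (0, 1)]
            arm = 0 if draws[0] >= draws[1] else 1
            if rng.random() < true_rates[arm]:  # simulate the response
                successes[arm] += 1
            else:
                failures[arm] += 1
        return successes, failures

    # Hypothetical response rates and trial size, for illustration only.
    succ, fail = thompson_two_armed([0.3, 0.5], n_patients=200)
    print("patients per arm:", [s + f for s, f in zip(succ, fail)])
    print("responses per arm:", succ)

As the trial accrues data, allocation drifts toward the better-performing arm, which is the adaptive-randomization idea that the platform-trial designs discussed in the lecture build on.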

Publication Date

9-28-2021

City

Houston

Keywords

Bayesian; Clinical Trials; Research Design
