Harvard Citation

    Burnetas, A. et al. (2017). Asymptotically optimal multi-armed bandit policies under a cost constraint. Probability in the Engineering and Informational Sciences, 31(3), pp. 284–310. [Online].