Gadat, Sébastien, Panloup, Fabien and Saadane, Sofiane (2015) Regret bound for Narendra-Shapiro bandit algorithms. TSE Working Paper, n. 15-556, Toulouse.

Abstract
Narendra-Shapiro (NS) algorithms are bandit-type algorithms introduced in the 1960s with a view to applications in psychology and clinical trials. The long-time behavior of such algorithms has been studied in depth, but few results exist in the non-asymptotic setting, which is often of primary interest for applications. In this paper, we study the regret of NS algorithms and address the following question: are NS bandit algorithms competitive from this non-asymptotic point of view? In our main result, we show that competitive bounds can be obtained for their penalized version (introduced in [14]). More precisely, up to a slight modification, the regret of the penalized two-armed bandit algorithm is uniformly bounded by $C\sqrt{n}$ (where $C$ is a positive constant made explicit in the paper). We also generalize existing convergence and rate-of-convergence results to the multi-armed case of the over-penalized bandit algorithm, including convergence, after a suitable renormalization, toward the invariant measure of a Piecewise Deterministic Markov Process (PDMP). Finally, ergodic properties of this PDMP are given in the multi-armed case.
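The abstract does not reproduce the update rule itself, so a rough simulation sketch may help fix ideas. The sketch below implements one classical penalized two-armed NS scheme in the spirit of [14]: the probability x of playing arm 1 is reinforced after a success and penalized, with weight rho, after a failure, using step sizes gamma_n. The schedule gamma_n = c/n^alpha, the penalty weight rho, and the pseudo-regret accounting are illustrative assumptions and need not match the paper's exact specification (in particular its "slight modification" and the over-penalized variant).

```python
import numpy as np

def penalized_ns_two_armed(p1, p2, n_steps, c=1.0, alpha=1.0, rho=0.1, seed=0):
    """Simulate a penalized two-armed Narendra-Shapiro bandit (illustrative sketch).

    x is the probability of playing arm 1; p1 and p2 are the arms'
    success probabilities. Step sizes and penalty weight are assumptions.
    """
    rng = np.random.default_rng(seed)
    x = 0.5                                   # start with no preference
    regret = 0.0
    p_best = max(p1, p2)
    for n in range(1, n_steps + 1):
        gamma = c / n**alpha                  # assumed step-size schedule gamma_n
        play_arm1 = rng.random() < x          # draw the arm from the current probability
        p = p1 if play_arm1 else p2
        success = rng.random() < p
        if play_arm1:
            if success:
                x += gamma * (1.0 - x)        # reward: reinforce arm 1
            else:
                x -= rho * gamma * x          # penalty: move mass toward arm 2
        else:
            if success:
                x -= gamma * x                # reward: reinforce arm 2
            else:
                x += rho * gamma * (1.0 - x)  # penalty: move mass toward arm 1
        regret += p_best - p                  # cumulative pseudo-regret
    return x, regret

# Example: arms with success probabilities 0.7 and 0.5 over 10,000 rounds.
x_final, total_regret = penalized_ns_two_armed(0.7, 0.5, 10_000)
```

With c = alpha = 1 and rho in (0, 1], every update keeps x in [0, 1], the boundedness the scheme relies on; the paper's uniform $C\sqrt{n}$ bound concerns its slightly modified penalized version rather than this toy sketch.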
Item Type: Monograph (Working Paper)
Language: English
Date: February 2015
Place of Publication: Toulouse
Uncontrolled Keywords: Regret, Stochastic Bandit Algorithms, Piecewise Deterministic Markov Processes
Subjects: B - Economics and Finance
Divisions: TSE-R (Toulouse)
Institution: Université Toulouse 1 Capitole
Site: UT1
Date Deposited: 16 Mar 2015 14:56
Last Modified: 02 Apr 2021 15:49
OAI Identifier: oai:tse-fr.eu:29077
URI: https://publications.ut-capitole.fr/id/eprint/16707
Available Versions of this Item
- Regret bound for Narendra-Shapiro bandit algorithms. (deposited 16 Mar 2015 14:56)