We explore, in increasing complexity, various strategies for addressing the Multi-Armed Bandit Problem in order to find the algorithm that maximizes the click-through rate in the Exploration and Exploitation Challenge 3 competition. The data used result from the actions…
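As a minimal illustration of one such strategy (not the paper's specific method), the following sketch simulates an epsilon-greedy policy on a Bernoulli bandit, where each arm's click probability is a hypothetical value and the returned figure is the empirical click-through rate:

```python
import random

def epsilon_greedy(click_probs, epsilon=0.1, steps=10000, seed=0):
    """Simulate epsilon-greedy on a Bernoulli multi-armed bandit.

    click_probs: hypothetical per-arm click probabilities.
    Returns the empirical click-through rate over `steps` pulls.
    """
    rng = random.Random(seed)
    n_arms = len(click_probs)
    counts = [0] * n_arms    # number of pulls per arm
    values = [0.0] * n_arms  # running mean reward per arm
    clicks = 0
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)  # explore: random arm
        else:
            arm = max(range(n_arms), key=values.__getitem__)  # exploit
        reward = 1 if rng.random() < click_probs[arm] else 0
        counts[arm] += 1
        # incremental update of the mean reward estimate
        values[arm] += (reward - values[arm]) / counts[arm]
        clicks += reward
    return clicks / steps
```

With a fixed seed the run is reproducible; raising `epsilon` trades exploitation of the best-known arm for more exploration of the others.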