Day & Time
30th October 2015, 15:00-
Room C816, Graduate School of Science Building, Faculty of Science
Objective priors based on the α-divergence for non-regular models
In Bayesian inference, the selection of priors has long been an important and much-discussed problem. When little prior information is available, we need to consider an 'objective' prior. Under some regularity conditions, the Jeffreys prior is well known (Jeffreys (1961)); under those same conditions, it can also be obtained by asymptotically maximizing the Kullback-Leibler divergence between the prior and the corresponding posterior. In this talk, we derive the prior that maximizes a more general divergence, the α-divergence, between the prior and the posterior for non-regular models. As an extension, we also present priors for multi-parameter non-regular models in the presence of nuisance parameters.
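As a concrete illustration of the regular case mentioned above (an example added here, not taken from the talk): for a Bernoulli(θ) model, the Jeffreys prior is proportional to the square root of the Fisher information, which works out to the Beta(1/2, 1/2) distribution with normalizing constant π. A minimal Python sketch:

```python
import math

def fisher_information(theta):
    # Fisher information of a single Bernoulli(theta) observation:
    # I(theta) = 1 / (theta * (1 - theta))
    return 1.0 / (theta * (1.0 - theta))

def jeffreys_prior_unnormalized(theta):
    # Jeffreys prior: pi(theta) proportional to sqrt(I(theta))
    return math.sqrt(fisher_information(theta))

# Numerically check that the normalizing constant is pi, i.e. that the
# Jeffreys prior here is the Beta(1/2, 1/2) density 1/(pi*sqrt(theta*(1-theta))).
n = 1_000_000
h = 1.0 / n
z = sum(jeffreys_prior_unnormalized((i + 0.5) * h) * h for i in range(n))
print(z)  # close to math.pi
```

The midpoint rule is used because the integrand has integrable singularities at 0 and 1, so the grid must avoid the endpoints. For non-regular models (the subject of the talk), the Fisher information may not exist, which is precisely why a different construction of the objective prior is needed.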