Objective Bayesian inference with proper scoring rules

F. Giummolè; V. Mameli
2019-01-01

Abstract

Standard Bayesian analyses can be difficult to perform when the full likelihood, and consequently the full posterior distribution, is too complex or even impossible to specify, or when robustness with respect to the data or to model misspecification is required. In these situations, we suggest resorting to a posterior distribution for the parameter of interest based on proper scoring rules. Scoring rules are loss functions designed to measure the quality of a probability distribution for a random variable, given its observed value. Important examples are the Tsallis score and the Hyvärinen score, which allow us to deal with model misspecification or with complex models. The full and the composite likelihoods are themselves special instances of scoring rules. The aim of this paper is twofold. First, we discuss the use of scoring rules in the Bayes formula to compute a posterior distribution, called the SR-posterior distribution, and we derive its asymptotic normality. Second, we propose a procedure for building default priors for the unknown parameter of interest that can be used to update the information provided by the scoring rule in the SR-posterior distribution. In particular, a reference prior is obtained by maximizing the average α-divergence from the SR-posterior distribution. For 0 ≤ |α| < 1, the result is a Jeffreys-type prior proportional to the square root of the determinant of the Godambe information matrix associated with the scoring rule. Some examples are discussed.
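The sketch below is a minimal numerical illustration, under stated assumptions, of the idea described in the abstract: a proper scoring rule replaces the negative log-likelihood in the Bayes formula to yield an SR-posterior. It uses the Tsallis score for the mean of a normal model with known scale; the simulated data, the value γ = 1.3, the grid, and the tsallis_score helper are illustrative assumptions, not the paper's settings. A flat prior is used, which for a location parameter coincides (up to a constant) with the Jeffreys-type prior |G(θ)|^(1/2), since the Godambe information is then constant in the location parameter. This is not the paper's exact construction, only a sketch of the basic mechanism.

```python
# Minimal sketch (illustrative assumptions, not the paper's exact construction):
# an SR-posterior for the mean of a normal model with known scale, obtained by
# plugging the Tsallis score into the Bayes formula in place of the negative
# log-likelihood.
import numpy as np
from scipy.stats import norm

def tsallis_score(y, mu, sigma, gamma):
    """Tsallis score S(y, P) = (gamma - 1) * int p(z)^gamma dz - gamma * p(y)^(gamma - 1)."""
    # For a normal density, the integral of p(z)^gamma has a closed form.
    int_p_gamma = (2 * np.pi * sigma**2) ** ((1 - gamma) / 2) / np.sqrt(gamma)
    return (gamma - 1) * int_p_gamma - gamma * norm.pdf(y, mu, sigma) ** (gamma - 1)

rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=1.0, size=50)  # illustrative simulated data
sigma, gamma = 1.0, 1.3                      # known scale, assumed robustness parameter

mu_grid = np.linspace(0.0, 2.0, 400)
dmu = mu_grid[1] - mu_grid[0]
total_score = np.array([tsallis_score(y, mu, sigma, gamma).sum() for mu in mu_grid])

# SR-posterior: prior(mu) * exp(-total score). A flat prior is used here, which
# for a location parameter matches the Jeffreys-type prior |G(mu)|^(1/2) up to a
# constant, since the Godambe information does not depend on mu.
log_post = -total_score
log_post -= log_post.max()   # numerical stability before exponentiating
post = np.exp(log_post)
post /= post.sum() * dmu     # normalize on the grid

print("SR-posterior mean:", (mu_grid * post).sum() * dmu)
```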
Files in this item:

Giummolè2019_Article_ObjectiveBayesianInferenceWith.pdf
  Description: Postprint article
  Type: Publisher's version
  License: Open access (no restrictions)
  Size: 2.24 MB
  Format: Adobe PDF
  Availability: not available

test.pdf
  Description: Main article, post-print
  Type: Post-print document
  License: Open access (no restrictions)
  Size: 5.4 MB
  Format: Adobe PDF
  Availability: open access

Documents in ARCA are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10278/3703205
Citations
  • PMC: not available
  • Scopus: 13
  • Web of Science: 7