Does LambdaMART Do What You Expect?

Lucchese C.; Marcuzzi F.; Orlando S.
2023-01-01

Abstract

We analyse the idiosyncrasies of LambdaMART gradients and introduce strategies to remove or reduce gradient incoherencies. Specifically, we design three selection strategies that compute the full gradient only for those documents that should be ranked in the top-k positions of the ranking. We empirically demonstrate on publicly available datasets that the proposed approach leads to models achieving statistically significant improvements in terms of NDCG, while maintaining the same training efficiency as optimising truncated metrics.
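For readers unfamiliar with how LambdaMART couples pairwise gradients with a rank metric, the sketch below shows a standard, textbook-style lambda computation weighted by |ΔNDCG@k|. It is not the paper's code and does not implement the three selection strategies; it only illustrates the truncation effect the abstract refers to: a pair of documents that both sit below the cutoff k produces a zero metric change and therefore contributes no gradient signal. Function names and the toy data are illustrative assumptions.

# Minimal sketch (not the paper's implementation) of LambdaMART-style
# lambda gradients with a truncated NDCG@k swap weight.
import numpy as np

def dcg_at_k(labels_in_rank_order, k):
    """DCG@k for relevance labels listed in their current ranked order."""
    labels = np.asarray(labels_in_rank_order, dtype=float)[:k]
    gains = 2.0 ** labels - 1.0
    discounts = 1.0 / np.log2(np.arange(2, labels.size + 2))
    return float(np.sum(gains * discounts))

def lambda_gradients(scores, labels, k, sigma=1.0):
    """Pairwise lambdas weighted by |Delta NDCG@k| of swapping documents i and j."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=float)
    order = np.argsort(-scores)              # current ranking by model score
    rank_of = np.empty_like(order)
    rank_of[order] = np.arange(order.size)   # position of each document
    ideal = dcg_at_k(np.sort(labels)[::-1], k) or 1.0
    lambdas = np.zeros_like(scores)
    for i in range(len(scores)):
        for j in range(len(scores)):
            if labels[i] <= labels[j]:
                continue                      # keep only pairs where i is more relevant
            # |Delta NDCG@k| obtained by swapping the two documents' positions
            ranked = labels[order].copy()
            before = dcg_at_k(ranked, k)
            ri, rj = rank_of[i], rank_of[j]
            ranked[ri], ranked[rj] = ranked[rj], ranked[ri]
            after = dcg_at_k(ranked, k)
            delta = abs(after - before) / ideal
            # RankNet-style pairwise gradient scaled by the metric change:
            # if both documents lie below the cutoff k, delta is 0 and the
            # pair adds nothing to either lambda.
            rho = 1.0 / (1.0 + np.exp(sigma * (scores[i] - scores[j])))
            lambdas[i] += sigma * delta * rho
            lambdas[j] -= sigma * delta * rho
    return lambdas

# Toy example: with k=2, the pair formed by the two lowest-ranked documents
# never changes NDCG@2, so it yields no gradient contribution.
scores = np.array([2.0, 1.5, 0.3, 0.1])
labels = np.array([0, 1, 2, 1])
print(lambda_gradients(scores, labels, k=2))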
2023
CEUR Workshop Proceedings
Files in this item:
File: paper-16.pdf (open access)
Type: Publisher's version
License: Creative Commons
Size: 443.51 kB
Format: Adobe PDF

Documents in ARCA are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10278/5034902
Citations
  • Scopus: 0