
From Variability to Stability: Advancing RecSys Benchmarking Practices

Author(s)
Shevchenko, Valeriy; Belousov, Nikita; Vasilev, Alexey; Zholobov, Vladimir; Sosedka, Artyom; Semenova, Natalia; Volodkevich, Anna; Savchenko, Andrey; Zaytsev, Alexey; ...
Download: 3637528.3671655.pdf (1.691 MB)
Terms of use
Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
In the rapidly evolving domain of Recommender Systems (RecSys), new algorithms frequently claim state-of-the-art performance based on evaluations over a limited set of arbitrarily selected datasets. However, this approach may fail to holistically reflect their effectiveness due to the significant impact of dataset characteristics on algorithm performance. Addressing this deficiency, this paper introduces a novel benchmarking methodology to facilitate a fair and robust comparison of RecSys algorithms, thereby advancing evaluation practices. By utilizing a diverse set of 30 open datasets, including two introduced in this work, and evaluating 11 collaborative filtering algorithms across 9 metrics, we critically examine the influence of dataset characteristics on algorithm performance. We further investigate the feasibility of aggregating outcomes from multiple datasets into a unified ranking. Through rigorous experimental analysis, we validate the reliability of our methodology under the variability of datasets, offering a benchmarking strategy that balances quality and computational demands. This methodology enables a fair yet effective means of evaluating RecSys algorithms, providing valuable guidance for future research endeavors.
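The abstract does not spell out how per-dataset results are combined into a unified ranking. As an illustration only, the sketch below shows one common aggregation approach (mean-rank, Borda-style) over per-dataset metric scores; the function name `aggregate_ranking`, the toy datasets, and the algorithm names are hypothetical and are not taken from the paper.

```python
from collections import defaultdict


def aggregate_ranking(scores: dict[str, dict[str, float]]) -> list[tuple[str, float]]:
    """Aggregate per-dataset scores into a unified ranking by mean rank.

    scores: {dataset_name: {algorithm_name: metric_value}}, higher is better.
    Returns (algorithm, mean_rank) pairs sorted so the best algorithm comes first.
    """
    rank_sums = defaultdict(float)
    counts = defaultdict(int)
    for dataset, algo_scores in scores.items():
        # Rank algorithms within this dataset: best score gets rank 1.
        ordered = sorted(algo_scores, key=algo_scores.get, reverse=True)
        for rank, algo in enumerate(ordered, start=1):
            rank_sums[algo] += rank
            counts[algo] += 1
    mean_ranks = {a: rank_sums[a] / counts[a] for a in rank_sums}
    return sorted(mean_ranks.items(), key=lambda kv: kv[1])


# Toy example: two hypothetical datasets, three hypothetical algorithms.
scores = {
    "ds_movies": {"ALS": 0.31, "ItemKNN": 0.28, "BPR": 0.25},
    "ds_books": {"ALS": 0.12, "ItemKNN": 0.15, "BPR": 0.10},
}
print(aggregate_ranking(scores))
# [('ALS', 1.5), ('ItemKNN', 1.5), ('BPR', 3.0)]
```

In practice such an aggregation would be repeated per metric (the paper evaluates 9) and checked for stability under dataset resampling, which is the kind of variability analysis the abstract describes.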
Description
KDD ’24, August 25–29, 2024, Barcelona, Spain
Date issued
2024-08-25
URI
https://hdl.handle.net/1721.1/159400
Publisher
ACM, Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Citation
Shevchenko, Valeriy, Belousov, Nikita, Vasilev, Alexey, Zholobov, Vladimir, Sosedka, Artyom et al. 2024. "From Variability to Stability: Advancing RecSys Benchmarking Practices." In Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD '24), Barcelona, Spain, August 25-29, 2024.
Version: Final published version
ISBN
979-8-4007-0490-1

Collections
  • MIT Open Access Articles
