Scalable Cross-Entropy Loss for Sequential Recommendations with Large Item Catalogs
Author(s)
Mezentsev, Gleb; Gusak, Danil; Oseledets, Ivan; Frolov, Evgeny
Publisher Policy
Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
Scalability plays a crucial role in productionizing modern recommender systems. Even lightweight architectures may suffer from high computational overhead due to intermediate calculations, limiting their practicality in real-world applications. Specifically, applying the full Cross-Entropy (CE) loss often yields state-of-the-art recommendation quality, but it incurs excessive GPU memory utilization when dealing with large item catalogs. This paper introduces a novel Scalable Cross-Entropy (SCE) loss function for the sequential learning setup. It approximates the CE loss on datasets with large catalogs, improving both time efficiency and memory usage without compromising recommendation quality. Unlike traditional negative sampling methods, the approach uses a selective, GPU-efficient computation strategy that focuses on the most informative elements of the catalog, particularly those most likely to be false positives. This is achieved by approximating the softmax distribution over a subset of the model outputs through maximum inner product search. Experimental results on multiple datasets demonstrate the effectiveness of SCE in reducing peak memory usage by a factor of up to 100 compared to the alternatives, while retaining or even exceeding their metric values. The proposed approach also opens new perspectives for large-scale development in other domains, such as large language models.
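The core idea in the abstract — replacing the full-catalog softmax with a cross-entropy computed over the positive item plus a small set of likely false positives found via inner product search — can be illustrated with a minimal NumPy sketch. This is a hypothetical simplification for intuition only, not the authors' implementation: the function name `sce_loss`, the random-bucket candidate sampling, and all parameter choices are assumptions, and the paper's actual GPU-efficient search strategy is considerably more involved.

```python
import numpy as np

def sce_loss(hidden, item_emb, pos_id, n_buckets=4, bucket_size=32, k=8, seed=None):
    """Toy sampled cross-entropy for one sequence position (illustrative only).

    hidden   : (d,) model output for the current position
    item_emb : (n_items, d) catalog item embeddings
    pos_id   : index of the ground-truth next item
    """
    rng = np.random.default_rng(seed)
    n_items = item_emb.shape[0]
    # 1) Draw a few random buckets of catalog items as the search pool
    #    (a crude stand-in for the paper's GPU-efficient candidate search).
    cand = rng.choice(n_items, size=n_buckets * bucket_size, replace=False)
    # 2) Keep the k candidates with the largest inner products: these
    #    likely false positives dominate the softmax denominator.
    scores = item_emb[cand] @ hidden
    hard_neg = cand[np.argsort(scores)[-k:]]
    # 3) Cross-entropy over {positive} ∪ {hard negatives} only, instead of
    #    the full catalog; computed with a numerically stable log-sum-exp.
    subset = np.unique(np.concatenate(([pos_id], hard_neg)))
    logits = item_emb[subset] @ hidden
    log_z = np.log(np.exp(logits - logits.max()).sum()) + logits.max()
    return log_z - item_emb[pos_id] @ hidden  # -log softmax prob. of positive
```

Because only `k + 1` logits enter the denominator rather than all `n_items`, peak memory for the loss scales with the candidate-set size, not the catalog size — which is the scalability property the paper targets.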
Description
RecSys ’24, October 14–18, 2024, Bari, Italy
Date issued
2024-10-08
Publisher
ACM | 18th ACM Conference on Recommender Systems
Citation
Mezentsev, Gleb, Gusak, Danil, Oseledets, Ivan and Frolov, Evgeny. 2024. "Scalable Cross-Entropy Loss for Sequential Recommendations with Large Item Catalogs." In Proceedings of the 18th ACM Conference on Recommender Systems (RecSys ’24).
Version: Final published version
ISBN
979-8-4007-0505-2