Understanding Embedding Scaling in Collaborative Filtering
Abstract
Large-scale experiments reveal double-peak and logarithmic performance patterns in collaborative filtering models as embedding dimensions scale, and provide theoretical insights into their causes.
Scaling recommendation models into large recommendation models has become one of the most widely discussed topics. Recent efforts focus on components other than the embedding dimension, since scaling embeddings is believed to cause performance degradation. Although there have been initial observations on embeddings, the root cause of their non-scalability remains unclear. Moreover, whether this degradation occurs across different types of models and datasets is still unexplored. To study the effect of embedding dimension on performance, we conduct large-scale experiments across 10 datasets of varying sparsity and scale, using 4 representative classical architectures. Surprisingly, we observe two novel phenomena: double-peak and logarithmic. In the former, as the embedding dimension increases, performance first improves, then declines, rises again, and finally drops; in the latter, performance follows a near-perfect logarithmic curve. Our contributions are threefold. First, we discover two novel phenomena when scaling collaborative filtering models. Second, we identify the underlying causes of the double-peak phenomenon. Third, we theoretically analyze the noise robustness of collaborative filtering models, with results that match our empirical observations.
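The paper's experiments sweep embedding dimensions over real datasets and architectures. As a toy intuition for why performance need not improve monotonically with embedding dimension, here is a minimal synthetic sketch (not the authors' setup; all names, sizes, and the noise level are hypothetical): a rank-4 "preference" matrix plus observation noise is fit with truncated-SVD factorizations of increasing rank `d`, the simplest collaborative filtering model with embedding dimension `d`. Error against the clean signal drops until `d` matches the true rank, then rises again as larger embeddings start fitting noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a user-item preference matrix:
# a clean rank-4 signal plus observation noise.
n_users, n_items, true_rank, noise = 60, 50, 4, 0.3
A = rng.normal(size=(n_users, true_rank))
B = rng.normal(size=(true_rank, n_items))
R_clean = A @ B / np.sqrt(true_rank)
R_noisy = R_clean + noise * rng.normal(size=R_clean.shape)

def rank_d_fit(M, d):
    """Best rank-d approximation of M via truncated SVD --
    a factorization with user/item embeddings of dimension d."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :d] * s[:d]) @ Vt[:d]

# Sweep the embedding dimension and measure error against the clean signal.
errors = {}
for d in [1, 2, 4, 8, 16, 32]:
    R_hat = rank_d_fit(R_noisy, d)
    errors[d] = np.sqrt(np.mean((R_hat - R_clean) ** 2))

for d, e in errors.items():
    print(f"d={d:2d}  RMSE vs clean signal = {e:.4f}")
```

This is the classical bias-variance picture, not the double-peak or logarithmic curves the paper reports on real data; it only illustrates that the "bigger embedding is better" intuition can fail once capacity exceeds the signal's effective rank in the presence of noise.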
Community
This paper is motivated by one question: can we scale the native architecture of CF to achieve performance comparable to Transformers?
This is an automated message from the Librarian Bot. The following similar papers were recommended by the Semantic Scholar API:
- Semantic Item Graph Enhancement for Multimodal Recommendation (2025)
- Hybrid Matrix Factorization Based Graph Contrastive Learning for Recommendation System (2025)
- Enhancing Graph-based Recommendations with Majority-Voting LLM-Rerank Augmentation (2025)
- CrossDenoise: Denoising Implicit Feedback via a Lightweight Entity-Aware Synergistic Framework (2025)
- SPARK: Adaptive Low-Rank Knowledge Graph Modeling in Hybrid Geometric Spaces for Recommendation (2025)
- A Node-Aware Dynamic Quantization Approach for Graph Collaborative Filtering (2025)
- Representation Quantization for Collaborative Filtering Augmentation (2025)