Foundation Model for Personalized Recommendation

By Ko-Jen Hsiao, Yesu Feng and Sudarshan Lamkhede


Motivation


Netflix’s personalized recommender system is a complex system, boasting a variety of specialized machine-learned models, each catering to distinct needs such as “Continue Watching” and “Today’s Top Picks for You.” (Refer to our recent overview for more details.) However, as we expanded our set of personalization algorithms to meet increasing business needs, maintenance of the recommender system became quite costly. Furthermore, it was difficult to transfer innovations from one model to another, given that most are independently trained despite using common data sources. This scenario underscored the need for a new recommender system architecture in which member preference learning is centralized, enhancing accessibility and utility across different models.


In particular, these models predominantly extract features from members’ recent interaction histories on the platform. Yet many are confined to a brief temporal window due to constraints in serving latency or training costs. This limitation inspired us to develop a foundation model for recommendation. This model aims to assimilate information from both members’ comprehensive interaction histories and our content at a very large scale. It facilitates the distribution of these learnings to other models, either through shared model weights for fine-tuning or directly through embeddings.

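To make the two reuse paths concrete, here is a minimal, purely illustrative sketch of how a downstream model might consume a foundation model's learnings, either by using its member embeddings with frozen weights or by unfreezing and fine-tuning them. The names (`FoundationEncoder`, `DownstreamRanker`) and the 128-dimensional embedding size are assumptions for illustration, not Netflix's actual components.

```python
# Hypothetical sketch: two ways a downstream model could reuse a pretrained
# foundation model -- (1) consume its member embeddings as frozen features,
# (2) load its weights and fine-tune them alongside a task-specific head.
import torch
import torch.nn as nn

EMBED_DIM = 128  # assumed embedding size, purely illustrative


class FoundationEncoder(nn.Module):
    """Stand-in for a large pretrained encoder of member interaction history."""

    def __init__(self, vocab_size: int = 10_000, dim: int = EMBED_DIM):
        super().__init__()
        self.item_embed = nn.Embedding(vocab_size, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)

    def forward(self, interaction_ids: torch.Tensor) -> torch.Tensor:
        # interaction_ids: (batch, seq_len) ids of titles the member interacted with
        x = self.item_embed(interaction_ids)
        _, h = self.encoder(x)
        return h.squeeze(0)  # (batch, dim) member representation


class DownstreamRanker(nn.Module):
    """Task-specific model (e.g. a row ranker) built on top of the foundation model."""

    def __init__(self, foundation: FoundationEncoder, freeze: bool = True):
        super().__init__()
        self.foundation = foundation
        if freeze:  # frozen: use embeddings as-is; unfrozen: fine-tune shared weights
            for p in self.foundation.parameters():
                p.requires_grad = False
        self.head = nn.Linear(EMBED_DIM, 1)

    def forward(self, interaction_ids: torch.Tensor) -> torch.Tensor:
        member_emb = self.foundation(interaction_ids)
        return self.head(member_emb)  # relevance score for this member/context


if __name__ == "__main__":
    foundation = FoundationEncoder()        # pretend this was pretrained at scale
    ranker = DownstreamRanker(foundation)   # downstream model reuses its weights
    fake_history = torch.randint(0, 10_000, (4, 20))
    print(ranker(fake_history).shape)       # torch.Size([4, 1])
```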

The impetus for constructing a foundational recommendation model stems from the paradigm shift in natural language processing (NLP) toward large language models (LLMs). In ...
