Unlocking Real-Time Predictions with Shopify's Machine Learning Platform

In 2022, we shipped Merlin, Shopify’s new and improved machine learning platform built on Ray. We built a flexible platform to support the varying requirements, inputs, data types, dependencies and integrations we deal with at Shopify (you can read all about Merlin’s architecture in our previous blog). Since then, we’ve enhanced the platform by onboarding more use cases and adding features to complete the end-to-end machine learning workflow, including:

  1. Merlin Online Inference, which provides the ability to deploy and serve machine learning models for real-time predictions (a minimal serving sketch follows this list).
  2. Model Registry and experiment tracking with Comet ML.
  3. Merlin Pipelines, a framework for reproducible machine learning pipelines on top of Merlin.
  4. Pano Feature Store, an offline/online feature store built on top of an open source feature store, Feast (see the feature-retrieval sketch after this list).
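
To give a flavor of what serving a model for real-time predictions can look like on a Ray-based platform like Merlin, here is a minimal sketch using Ray Serve, the serving library that ships with Ray. The deployment name, the toy scikit-learn model, and the request shape are illustrative assumptions, not Shopify's actual code.

```python
from ray import serve
from starlette.requests import Request
import numpy as np
from sklearn.linear_model import LogisticRegression


@serve.deployment(num_replicas=2)
class ProductClassifier:
    """Toy real-time deployment; a stand-in for a Merlin-managed model."""

    def __init__(self):
        # In Merlin the model would be loaded from the model registry (Comet ML);
        # here a small model is fit inline so the sketch is self-contained.
        rng = np.random.default_rng(42)
        X = rng.random((200, 4))
        y = (X[:, 0] > 0.5).astype(int)
        self.model = LogisticRegression().fit(X, y)

    async def __call__(self, request: Request) -> dict:
        payload = await request.json()
        features = np.asarray(payload["features"], dtype=float).reshape(1, -1)
        return {"prediction": int(self.model.predict(features)[0])}


# Bound application; deploy locally with `serve run <module>:app`, then POST
# JSON like {"features": [0.1, 0.9, 0.3, 0.4]} to http://127.0.0.1:8000/
app = ProductClassifier.bind()
```

Ray Serve handles replication and request routing, so scaling a deployment up is largely a matter of changing `num_replicas` (or enabling autoscaling) rather than rewriting the serving code.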
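
Similarly, because Pano is built on Feast, fetching features at prediction time looks roughly like the standard Feast online-retrieval API below. The repository path, entity, and feature names (`user_stats:purchase_count_30d`, etc.) are hypothetical placeholders, not Pano's actual schema.

```python
from feast import FeatureStore

# Point at a Feast feature repository; the path and the feature view below
# are hypothetical, standing in for features registered in Pano.
store = FeatureStore(repo_path="feature_repo/")

online_features = store.get_online_features(
    features=[
        "user_stats:purchase_count_30d",
        "user_stats:avg_order_value",
    ],
    entity_rows=[{"user_id": 1001}],
).to_dict()

print(online_features)  # e.g. {"user_id": [1001], "purchase_count_30d": [...], ...}
```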

The ability to provide real-time predictions for user-facing applications and services is an important requirement at Shopify, and will become increasingly critical as more machine learning models are integrated closer to user-facing products. But it’s a challenging requirement. As machine learning is used by many different teams across Shopify, each with its own use cases and requirements, we had to ensure that Merlin’s online inference could be an effective, generalized solution. We needed to build something robust that could support all of our use cases and deliver low-latency predictions, while serving machine learning models at Shopify scale.

In thi...
