Multi-gate-Mixture-of-Experts (MMoE) model architecture and knowledge distillation in Ads Engagement modeling development | by Pinterest Engineering | Pinterest Engineering Blog | Apr, 2025

Pinterest Tech

Building Dash: How RAG and AI agents help us meet the needs of businesses

Dropbox Tech

Migrating Uber’s Compute Platform to Kubernetes: A Technical Journey

Uber Tech

How Hyperforce Edge Networking Handles 20M Domains on 30GB RAM

Salesforce Tech

Introducing the SOP-driven LLM agent frameworks

Grab Tech

Building Deep Research Agent from scratch

How to Fine-tune Florence-2 for Object Detection Tasks

Fine-Tuning Gemma 3 1B-IT for Financial Sentiment Analysis: A Step-by-Step Guide

Migrating 3.7 Million Lines of Flow Code to TypeScript

Pinterest Tech

Handling Network Throttling with AWS EC2 at Pinterest

How Netflix Accurately Attributes eBPF Flow Logs

Netflix Tech

Uber’s Journey to Ray on Kubernetes: Resource Management

Uber Tech

Training and Finetuning Reranker Models with Sentence Transformers v4

Training and Finetuning Embedding Models with Sentence Transformers v3

Improving Pinterest Search Relevance Using Large Language Models

Pinterest Tech
