How to Fix Your Context

Mitigating & Avoiding Context Failures

Following up on our earlier post, “How Long Contexts Fail”, let’s run through the ways we can mitigate or avoid these failures entirely.

But before we do, let’s briefly recap some of the ways long contexts can fail:

  • Context Poisoning: When a hallucination or other error makes it into the context, where it is repeatedly referenced.
  • Context Distraction: When a context grows so long that the model over-focuses on the context, neglecting what it learned during training.
  • Context Confusion: When superfluous information in the context is used by the model to generate a low-quality response.
  • Context Clash: When you accrue new information and tools in your context that conflict with other information in the prompt.

Everything here is about information management. Everything in the context influences the response. We’re back to the old programming adage: “Garbage in, garbage out.” Thankfully, there are plenty of options for dealing with the issues above.

RAG

Retrieval-Augmented Generation (RAG) is the act of selectively adding relevant information to help the LLM generate a better response.

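To make that concrete, here is a minimal sketch of the RAG idea: score your stored documents against the query, keep only the top matches, and splice just those into the prompt. The corpus, the crude keyword-overlap scorer (standing in for a real embedding index), and the helper names (score, retrieve, build_prompt) are illustrative assumptions, not anything from the original post.

```python
# Illustrative RAG sketch: retrieve only the most relevant documents
# and add them to the prompt, rather than stuffing the whole corpus in.

def score(query: str, doc: str) -> float:
    """Crude relevance score: fraction of query words that appear in the doc."""
    q_words = set(query.lower().split())
    d_words = set(doc.lower().split())
    return len(q_words & d_words) / max(len(q_words), 1)

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Keep only the k documents most relevant to the query."""
    return sorted(corpus, key=lambda doc: score(query, doc), reverse=True)[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Selectively add retrieved context, then the user's question."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query, corpus))
    return (
        "Use the context below to answer.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}"
    )

if __name__ == "__main__":
    corpus = [
        "Llama 4 Scout supports a context window of up to 10 million tokens.",
        "Context poisoning occurs when a hallucination is repeatedly referenced.",
        "The office coffee machine is cleaned every Friday.",
    ]
    print(build_prompt("How large is the Llama 4 Scout context window?", corpus))
```

In practice the scorer would be an embedding similarity search over a vector index, but the shape is the same: pull a small, relevant slice of your knowledge into the window instead of the whole pile, which is exactly what keeps the failure modes above at bay.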

So much has been written about RAG that we’re not going to cover it today beyond saying: it’s very much alive.

Every time a model ups the context window ante, a new “RAG is Dead” debate is born. The last significant event was when Llama 4 Scout landed with a 10 million token window. At that size it’s really tempting to think, “Screw it, throw it all in,” and call it a day.
