Building News Agents for Daily News Recaps with MCP, Q, and tmux
To better understand MCPs and agentic workflows, I built news-agents to help me generate a daily news recap. It’s built on Amazon Q CLI and MCP. The former provides the agentic framework and the latter provides news feeds via tools. It also uses tmux to spawn and display each sub-agent’s work. At a high level, here’s how it works:
Main Agent (in the main tmux pane)
├── Read feeds.txt
├── Split feeds into 3 chunks
├── Spawn 3 Sub-Agents (in separate tmux panes)
│   ├── Sub-Agent #1
│   │   ├── Process feeds in chunk 1
│   │   └── Report back when done
│   ├── Sub-Agent #2
│   │   ├── Process feeds in chunk 2
│   │   └── Report back when done
│   └── Sub-Agent #3
│       ├── Process feeds in chunk 3
│       └── Report back when done
└── Combine everything into main-summary.md
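To make the spawn step concrete, here's a minimal sketch of how a main agent could split feeds.txt and launch one sub-agent per chunk in its own tmux pane. This is an illustration under assumptions, not the repo's actual code: spawn_sub_agents and the chunk-*.txt file layout are hypothetical, and it assumes the Amazon Q CLI accepts an initial prompt via q chat.

import subprocess
from pathlib import Path

def spawn_sub_agents(feeds_path: str = "feeds.txt", n_agents: int = 3) -> None:
    """Split the feed list into n_agents chunks and spawn one tmux pane per chunk."""
    feeds = [line for line in Path(feeds_path).read_text().splitlines() if line.strip()]
    chunks = [feeds[i::n_agents] for i in range(n_agents)]  # round-robin split

    for i, chunk in enumerate(chunks, start=1):
        chunk_file = Path(f"chunk-{i}.txt")  # hypothetical per-chunk file
        chunk_file.write_text("\n".join(chunk))
        prompt = f"Summarize each news feed listed in {chunk_file}, then report back when done."
        # Open a new pane (without switching focus) running an Amazon Q chat session.
        subprocess.run(
            ["tmux", "split-window", "-d", f'q chat "{prompt}"'],
            check=True,
        )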
Here, we’ll walk through how the MCP tools are built and how the main agent spawns and monitors sub-agents. Each sub-agent processes its allocated news feeds and generates summaries for each feed. The main agent then combines these summaries into a final summary. Here’s the three-minute 1080p demo (watch till at least 0:30):
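The last step in the tree, combining everything into main-summary.md, is mostly concatenation. Here's a minimal sketch, assuming each sub-agent writes its per-feed summaries as markdown files into a summaries/ directory (an assumed layout, not one documented by the repo):

from pathlib import Path

def combine_summaries(summary_dir: str = "summaries", out_path: str = "main-summary.md") -> None:
    """Concatenate every per-feed summary into a single main-summary.md."""
    sections = []
    for path in sorted(Path(summary_dir).glob("*.md")):
        sections.append(f"## {path.stem}\n\n{path.read_text().strip()}\n")
    Path(out_path).write_text("\n".join(sections))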
Setting up MCPs for news feeds
Each news feed has its own RSS reader, parser, and formatter. These handle the unique structure and format of each feed. (Perhaps in the future we can just use an LLM to parse these large text blobs reliably and cheaply.) For example, here's the code for fetching and parsing the Hacker News RSS feed:
async def fetch_hn_rss(feed_url: str) -> str:
    """
    Fetch Hacker News RSS feed.

    Args:
        feed_url: URL of the RSS feed to fetch (defaults to Hacker News)
    """
    headers = {"User-Agent": USER_AGENT}
    async with httpx.AsyncClient() as client:
        try:
            # The original snippet is truncated here; a plausible completion:
            response = await client.get(feed_url, headers=headers, timeout=30.0)
            response.raise_for_status()
            return response.text
        except httpx.HTTPError as e:
            return f"Error fetching RSS feed: {e}"
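Fetching is only the reader half; each feed also needs its parser and formatter, and the result has to be exposed as an MCP tool so the agent can call it. The repo's exact code isn't reproduced here, so the sketch below is an assumption-labeled illustration: parse_hn_rss, format_hn_items, and the get_hn_news tool are hypothetical names, built on the standard-library XML parser and the Python MCP SDK's FastMCP interface.

import xml.etree.ElementTree as ET

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("news")  # hypothetical server name

def parse_hn_rss(xml_text: str, max_items: int = 10) -> list[dict]:
    """Parse HN RSS XML into a list of {title, link, comments} dicts."""
    items = []
    for item in ET.fromstring(xml_text).iter("item"):
        items.append({
            "title": item.findtext("title", default=""),
            "link": item.findtext("link", default=""),
            "comments": item.findtext("comments", default=""),
        })
        if len(items) >= max_items:
            break
    return items

def format_hn_items(items: list[dict]) -> str:
    """Format parsed items as plain text for the agent to summarize."""
    return "\n".join(
        f"{i}. {item['title']}\n   Link: {item['link']}\n   Comments: {item['comments']}"
        for i, item in enumerate(items, start=1)
    )

@mcp.tool()
async def get_hn_news(feed_url: str = "https://news.ycombinator.com/rss") -> str:
    """Fetch, parse, and format the Hacker News front page."""
    return format_hn_items(parse_hn_rss(await fetch_hn_rss(feed_url)))

Registering the function with @mcp.tool() is what makes it callable by the agent; each feed would get a similar fetch-parse-format pipeline behind its own tool.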