Creating an LLM agent newsroom with the A2A protocol and MCP in Elasticsearch: Part 2


By: Justin Castilla, from Elastic

Agent Builder is now available in technical preview. Get started with an Elastic Cloud trial, and check out the Agent Builder documentation here.


A2A and MCP: the code in action

This is the companion piece to the article "Creating an LLM agent newsroom with the A2A protocol and MCP in Elasticsearch," which explains the benefits of implementing both the A2A and MCP architectures in the same agent to get the distinct advantages of each framework. We also provide a code repository so you can run the demo yourself.

Let's walk through how the newsroom agents use both A2A and MCP to collaborate on a news article. You can find the companion code repository here to see these agents in action.

Step 1: Story assignment

The News Chief (acting as the client) assigns a story:

```json
{
  "message_type": "task_request",
  "sender": "news_chief",
  "receiver": "reporter_agent",
  "payload": {
    "task_id": "story_renewable_energy_2024",
    "assignment": {
      "topic": "Renewable Energy Adoption in Europe",
      "angle": "Policy changes driving solar and wind expansion",
      "target_length": 1200,
      "deadline": "2025-09-30T18:00:00Z"
    }
  }
}
```
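How this envelope actually reaches the Reporter depends on the A2A client implementation. Below is a minimal, hypothetical sketch of the News Chief dispatching the assignment; the `send_message` helper is illustrative and not part of the A2A specification, while the message body mirrors the JSON above.

```python
# Hypothetical sketch: the News Chief builds the A2A envelope shown above
# and hands it to its A2A client. The send_message helper name is assumed.
async def assign_story(self) -> dict:
    message = {
        "message_type": "task_request",
        "sender": "news_chief",
        "receiver": "reporter_agent",
        "payload": {
            "task_id": "story_renewable_energy_2024",
            "assignment": {
                "topic": "Renewable Energy Adoption in Europe",
                "angle": "Policy changes driving solar and wind expansion",
                "target_length": 1200,
                "deadline": "2025-09-30T18:00:00Z",
            },
        },
    }
    # The transport (HTTP POST / JSON-RPC) is handled inside the A2A client.
    return await self.a2a_client.send_message(message)
```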

Step 2: The Reporter requests research

The Reporter agent determines that it needs background information and delegates a task to the Researcher agent via A2A:

```json
{
  "message_type": "task_request",
  "sender": "reporter_agent",
  "receiver": "researcher_agent",
  "payload": {
    "task_id": "research_eu_renewable_2024",
    "parent_task_id": "story_renewable_energy_2024",
    "capability": "fact_gathering",
    "parameters": {
      "queries": [
        "EU renewable energy capacity 2024",
        "Solar installations growth Europe",
        "Wind energy policy changes 2024"
      ],
      "depth": "comprehensive"
    }
  }
}
```

Step 3: The Reporter requests historical context from the Archive agent

The Reporter agent recognizes that historical context would strengthen the story. It delegates a task via A2A to the Archive agent, which is powered by Elastic's A2A agent, to search the newsroom's Elasticsearch-backed article archive:

```json
{
  "message_type": "task_request",
  "sender": "reporter_agent",
  "receiver": "archive_agent",
  "payload": {
    "task_id": "archive_search_renewable_2024",
    "parent_task_id": "story_renewable_energy_2024",
    "capability": "search_archive",
    "parameters": {
      "query": "European renewable energy policy changes and adoption trends over past 5 years",
      "focus_areas": ["solar", "wind", "policy", "Germany", "France"],
      "time_range": "2019-2024",
      "result_count": 10
    }
  }
}
```

Step 4: The Archive agent uses Elastic's A2A agent with MCP

The Archive agent calls Elastic's A2A agent, which in turn uses MCP to access Elasticsearch tools. This exchange demonstrates the hybrid architecture: A2A handles agent collaboration, while MCP provides tool access:

```python
# Archive Agent using Elastic A2A Agent
async def search_historical_articles(self, query_params):
    # The Archive Agent sends a request to Elastic's A2A Agent
    elastic_response = await self.a2a_client.send_request(
        agent="elastic_agent",
        capability="search_and_analyze",
        parameters={
            "natural_language_query": query_params["query"],
            "index_pattern": "newsroom-articles-*",
            "filters": {
                "topics": query_params["focus_areas"],
                "date_range": query_params["time_range"]
            },
            "analysis_type": "trend_analysis"
        }
    )

    # Elastic's A2A Agent internally uses MCP tools:
    # - platform.core.search (to find relevant articles)
    # - platform.core.generate_esql (to analyze trends)
    # - platform.core.index_explorer (to identify relevant indices)

    return elastic_response
```

The Archive agent receives comprehensive historical data from Elastic's A2A agent and returns it to the Reporter:

```json
{
  "message_type": "task_response",
  "sender": "archive_agent",
  "receiver": "reporter_agent",
  "payload": {
    "task_id": "archive_search_renewable_2024",
    "status": "completed",
    "archive_data": {
      "historical_articles": [
        {
          "title": "Germany's Energiewende: Five Years of Solar Growth",
          "published": "2022-06-15",
          "key_points": [
            "Germany added 7 GW annually 2020-2022",
            "Policy subsidies drove 60% of growth"
          ],
          "relevance_score": 0.94
        },
        {
          "title": "France Balances Nuclear and Renewables",
          "published": "2023-03-20",
          "key_points": [
            "France increased renewable target to 40% by 2030",
            "Solar capacity doubled 2021-2023"
          ],
          "relevance_score": 0.89
        }
      ],
      "trend_analysis": {
        "coverage_frequency": "EU renewable stories increased 150% since 2019",
        "emerging_themes": ["policy incentives", "grid modernization", "battery storage"],
        "coverage_gaps": ["Small member states", "offshore wind permitting"]
      },
      "total_articles_found": 47,
      "search_confidence": 0.91
    }
  }
}
```

This step shows how Elastic's A2A agent fits into the newsroom workflow. The Archive agent (a newsroom-specific agent) collaborates with Elastic's A2A agent (a third-party specialist) to leverage Elasticsearch's powerful search and analytics capabilities. Elastic's agent internally uses MCP to access Elasticsearch tools, demonstrating the clean separation between agent collaboration (A2A) and tool access (MCP).
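To make that separation concrete, here is a minimal, hypothetical sketch of what the Elastic agent's handler for the `search_and_analyze` capability could look like internally: the request arrives over A2A, and every Elasticsearch interaction goes through MCP tools. Only the tool names come from the step above; the handler name, the `elastic_mcp` server label, and the parameter shapes are assumptions.

```python
# Hypothetical internals of Elastic's A2A agent: request in over A2A, tools out over MCP.
async def handle_search_and_analyze(self, params: dict) -> dict:
    # Identify which indices are relevant (MCP tool access).
    indices = await self.mcp_client.invoke_tool(
        server="elastic_mcp",
        tool="platform.core.index_explorer",
        parameters={"index_pattern": params["index_pattern"]},
    )

    # Run the natural-language search against those indices.
    hits = await self.mcp_client.invoke_tool(
        server="elastic_mcp",
        tool="platform.core.search",
        parameters={
            "query": params["natural_language_query"],
            "indices": indices,
            "filters": params.get("filters", {}),
        },
    )

    # Generate and run an ES|QL query to produce the trend analysis.
    trends = await self.mcp_client.invoke_tool(
        server="elastic_mcp",
        tool="platform.core.generate_esql",
        parameters={"analysis_type": params.get("analysis_type", "trend_analysis")},
    )

    # The combined result flows back to the Archive agent over A2A.
    return {"articles": hits, "trend_analysis": trends}
```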

Step 5: The Researcher uses MCP servers

The Researcher agent accesses multiple MCP servers to gather information:

```python
# Researcher Agent using MCP to access tools
async def gather_facts(self, queries):
    results = []

    # Use News API MCP Server
    news_data = await self.mcp_client.invoke_tool(
        server="news_api_mcp",
        tool="search_articles",
        parameters={
            "query": queries[0],
            "date_range": "2024-01-01 to 2024-09-30",
            "sources": ["reuters", "bloomberg", "euractiv"]
        }
    )

    # Use Statistics MCP Server
    stats_data = await self.mcp_client.invoke_tool(
        server="eurostat_mcp",
        tool="get_energy_statistics",
        parameters={
            "metrics": ["solar_capacity_mw", "wind_capacity_mw"],
            "countries": ["EU27"],
            "year": 2024
        }
    )

    # Use Fact-Checking MCP Server
    verified_facts = await self.mcp_client.invoke_tool(
        server="factcheck_mcp",
        tool="verify_claims",
        parameters={
            "claims": [
                "EU solar capacity increased 40% in 2024",
                "Germany leads in wind installations"
            ]
        }
    )

    return self._synthesize_research(news_data, stats_data, verified_facts)
```

Step 6: The Researcher returns data to the Reporter

The Researcher agent sends the complete research findings back via A2A:

```json
{
  "message_type": "task_response",
  "sender": "researcher_agent",
  "receiver": "reporter_agent",
  "payload": {
    "task_id": "research_eu_renewable_2024",
    "status": "completed",
    "research_data": {
      "key_facts": [
        {
          "claim": "EU solar capacity reached 260 GW in 2024",
          "source": "Eurostat Q3 2024 Report",
          "confidence": 0.95,
          "verified": true
        },
        {
          "claim": "Germany installed 12 GW of wind capacity in 2024",
          "source": "German Federal Network Agency",
          "confidence": 0.92,
          "verified": true
        }
      ],
      "statistics": {
        "solar_growth_rate": "35%",
        "wind_growth_rate": "28%"
      },
      "sources_count": 15
    }
  }
}
```

Step 7: The Reporter writes the article

The Reporter agent writes the article using the research data and its own LLM capabilities. During writing, the Reporter uses an MCP server to retrieve style guidelines and templates:

```python
# Reporter Agent writing with MCP assistance
async def write_article(self, research_data, assignment):
    # Get style guidelines via MCP
    style_guide = await self.mcp_client.get_resource(
        server="newsroom_mcp",
        resource="style://editorial/ap_style_guide"
    )

    # Get article template via MCP
    template = await self.mcp_client.get_resource(
        server="newsroom_mcp",
        resource="template://articles/news_story"
    )

    # Generate article using LLM + research + style
    draft = await self.llm.generate(
        prompt=f"""
        Write a news article following these guidelines:
        {style_guide}

        Using this template:
        {template}

        Based on this research:
        {research_data}

        Assignment: {assignment}
        """
    )

    # Self-evaluate confidence in claims
    confidence_check = await self._evaluate_confidence(draft)

    return draft, confidence_check
```
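That self-evaluation call is what drives the re-research loop in the next step. Here is a minimal sketch of what it might do, assuming a 0.80 confidence threshold (the value the Reporter requests in Step 8); the `extract_claims` and `score_claim` helpers are illustrative placeholders, not part of the demo repository.

```python
# Hypothetical sketch of the Reporter's self-evaluation step.
CONFIDENCE_THRESHOLD = 0.80  # matches the required_confidence requested in Step 8

async def _evaluate_confidence(self, draft: str) -> list[dict]:
    low_confidence_claims = []
    # Extract factual claims from the draft and score each one against the
    # research data; both helpers here are assumed for illustration.
    for claim in await self.extract_claims(draft):
        score = await self.score_claim(claim, self.research_data)
        if score < CONFIDENCE_THRESHOLD:
            low_confidence_claims.append({
                "text": claim,
                "current_confidence": score,
                "required_confidence": CONFIDENCE_THRESHOLD,
            })
    # Any claim below the threshold is sent back to the Researcher (Step 8).
    return low_confidence_claims
```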

Step 8: Low confidence triggers re-research

After evaluating its draft, the Reporter agent finds one claim with low confidence. It sends another request to the Researcher agent:

```json
{
  "message_type": "collaboration_request",
  "sender": "reporter_agent",
  "receiver": "researcher_agent",
  "payload": {
    "request_type": "fact_verification",
    "claims": [
      {
        "text": "France's nuclear phase-down contributed to 15% increase in renewable capacity",
        "context": "Discussing policy drivers for renewable growth",
        "current_confidence": 0.45,
        "required_confidence": 0.80
      }
    ],
    "urgency": "high"
  }
}
```

The Researcher verifies the claim using the fact-checking MCP server and returns updated information:

```json
{
  "message_type": "collaboration_response",
  "sender": "researcher_agent",
  "receiver": "reporter_agent",
  "payload": {
    "verified_claims": [
      {
        "original_claim": "France's nuclear phase-down contributed to 15% increase...",
        "verified_claim": "France's renewable capacity increased 18% in 2024, partially offsetting reduced nuclear output",
        "confidence": 0.88,
        "corrections": "Percentage was 18%, not 15%; nuclear phase-down is gradual, not primary driver",
        "sources": ["RTE France", "French Energy Ministry Report 2024"]
      }
    ]
  }
}
```
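Between those two messages, the Researcher re-checks the flagged claim against its sources. A minimal sketch of how that might look on the Researcher side, reusing the `factcheck_mcp` server and `verify_claims` tool from Step 5; the handler name and the way the response is assembled are assumptions.

```python
# Hypothetical sketch: Researcher handling a fact_verification request.
async def handle_fact_verification(self, payload: dict) -> dict:
    claims = [claim["text"] for claim in payload["claims"]]

    # Re-verify the flagged claims with the fact-checking MCP server (see Step 5).
    verification = await self.mcp_client.invoke_tool(
        server="factcheck_mcp",
        tool="verify_claims",
        parameters={"claims": claims},
    )

    # Package the corrected claims into an A2A collaboration_response.
    return {
        "message_type": "collaboration_response",
        "sender": "researcher_agent",
        "receiver": "reporter_agent",
        "payload": {"verified_claims": verification},
    }
```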

Step 9: The Reporter revises and submits to the Editor

The Reporter incorporates the verified facts and sends the completed draft to the Editor agent via A2A:

```json
{
  "message_type": "task_request",
  "sender": "reporter_agent",
  "receiver": "editor_agent",
  "payload": {
    "task_id": "edit_renewable_story",
    "parent_task_id": "story_renewable_energy_2024",
    "content": {
      "headline": "Europe's Renewable Revolution: Solar and Wind Surge 30% in 2024",
      "body": "[Full article text...]",
      "word_count": 1185,
      "sources": [/* array of sources */]
    },
    "editing_requirements": {
      "check_style": true,
      "check_facts": true,
      "check_seo": true
    }
  }
}
```

Step 10: The Editor reviews with MCP tools

The Editor agent uses multiple MCP servers to review the article:

```python
# Editor Agent using MCP for quality checks
async def review_article(self, content):
    # Grammar and style check
    grammar_issues = await self.mcp_client.invoke_tool(
        server="grammarly_mcp",
        tool="check_document",
        parameters={"text": content["body"]}
    )

    # SEO optimization check
    seo_analysis = await self.mcp_client.invoke_tool(
        server="seo_mcp",
        tool="analyze_content",
        parameters={
            "headline": content["headline"],
            "body": content["body"],
            "target_keywords": ["renewable energy", "Europe", "solar", "wind"]
        }
    )

    # Plagiarism check
    originality = await self.mcp_client.invoke_tool(
        server="plagiarism_mcp",
        tool="check_originality",
        parameters={"text": content["body"]}
    )

    # Generate editorial feedback
    feedback = await self._generate_feedback(
        grammar_issues,
        seo_analysis,
        originality
    )

    return feedback
```

The Editor approves the article and sends it along:

```json
{
  "message_type": "task_response",
  "sender": "editor_agent",
  "receiver": "reporter_agent",
  "payload": {
    "status": "approved",
    "quality_score": 9.2,
    "minor_edits": [
      "Changed 'surge' to 'increased' in paragraph 3 for AP style consistency",
      "Added Oxford comma in list of countries"
    ],
    "approved_content": "[Final edited article]"
  }
}
```

Step 11: The Publisher publishes via CI/CD

Finally, the Publisher agent (the printer_agent in the repo) uses MCP servers to operate the CMS and the CI/CD pipeline and publish the approved article:

```python
from datetime import datetime

# Publisher Agent publishing via MCP
async def publish_article(self, content, metadata):
    # Upload to CMS via MCP
    cms_result = await self.mcp_client.invoke_tool(
        server="wordpress_mcp",
        tool="create_post",
        parameters={
            "title": content["headline"],
            "body": content["body"],
            "status": "draft",
            "categories": metadata["categories"],
            "tags": metadata["tags"],
            "featured_image_url": metadata["image_url"]
        }
    )

    post_id = cms_result["post_id"]

    # Trigger CI/CD deployment via MCP
    deploy_result = await self.mcp_client.invoke_tool(
        server="cicd_mcp",
        tool="trigger_deployment",
        parameters={
            "pipeline": "publish_article",
            "environment": "production",
            "post_id": post_id,
            "schedule": "immediate"
        }
    )

    # Track analytics
    await self.mcp_client.invoke_tool(
        server="analytics_mcp",
        tool="register_publication",
        parameters={
            "post_id": post_id,
            "publish_time": datetime.now().isoformat(),
            "story_id": metadata["story_id"]
        }
    )

    return {
        "status": "published",
        "post_id": post_id,
        "url": f"https://newsroom.example.com/articles/{post_id}",
        "deployment_id": deploy_result["deployment_id"]
    }
```

The Publisher confirms publication via A2A:

```json
{
  "message_type": "task_complete",
  "sender": "printer_agent",
  "receiver": "news_chief",
  "payload": {
    "task_id": "story_renewable_energy_2024",
    "status": "published",
    "publication": {
      "url": "https://newsroom.example.com/articles/renewable-europe-2024",
      "published_at": "2025-09-30T17:45:00Z",
      "post_id": "12345"
    },
    "workflow_metrics": {
      "total_time_minutes": 45,
      "agents_involved": ["reporter", "researcher", "archive", "editor", "printer"],
      "iterations": 2,
      "mcp_calls": 12
    }
  }
}
```

Below is the complete A2A workflow sequence from the companion code repository, using the same agents described above.

| # | From | To | Action | Protocol | Description |
| --- | --- | --- | --- | --- | --- |
| 1 | User | News Chief | Assign Story | HTTP POST | User submits story topic and angle |
| 2 | News Chief | Internal | Create Story | - | Creates story record with unique ID |
| 3 | News Chief | Reporter | Delegate Assignment | A2A | Sends story assignment via A2A protocol |
| 4 | Reporter | Internal | Accept Assignment | - | Stores assignment internally |
| 5 | Reporter | MCP Server | Generate Outline | MCP/HTTP | Creates article outline and research questions |
| 6a | Reporter | Researcher | Request Research | A2A | Sends questions (parallel with 6b) |
| 6b | Reporter | Archivist | Search Archive | A2A JSONRPC | Searches historical articles (parallel with 6a) |
| 7 | Researcher | MCP Server | Research Questions | MCP/HTTP | Uses Anthropic via MCP to answer questions |
| 8 | Researcher | Reporter | Return Research | A2A | Returns research answers |
| 9 | Archivist | Elasticsearch | Search Index | ES REST API | Queries news_archive index |
| 10 | Archivist | Reporter | Return Archive | A2A JSONRPC | Returns historical search results |
| 11 | Reporter | MCP Server | Generate Article | MCP/HTTP | Creates article with research/archive context |
| 12 | Reporter | Internal | Store Draft | - | Saves draft internally |
| 13 | Reporter | News Chief | Submit Draft | A2A | Submits completed draft |
| 14 | News Chief | Internal | Update Story | - | Stores draft, updates status to "draft_submitted" |
| 15 | News Chief | Editor | Review Draft | A2A | Auto-routes to Editor for review |
| 16 | Editor | MCP Server | Review Article | MCP/HTTP | Analyzes content using Anthropic via MCP |
| 17 | Editor | News Chief | Return Review | A2A | Sends editorial feedback and suggestions |
| 18 | News Chief | Internal | Store Review | - | Stores editor feedback |
| 19 | News Chief | Reporter | Apply Edits | A2A | Routes review feedback to Reporter |
| 20 | Reporter | MCP Server | Apply Edits | MCP/HTTP | Revises article based on feedback |
| 21 | Reporter | Internal | Update Draft | - | Updates draft with revisions |
| 22 | Reporter | News Chief | Return Revised | A2A | Returns revised article |
| 23 | News Chief | Internal | Update Story | - | Stores revised draft, status to "revised" |
| 24 | News Chief | Publisher | Publish Article | A2A | Auto-routes to Publisher |
| 25 | Publisher | MCP Server | Generate Tags | MCP/HTTP | Creates tags and categories |
| 26 | Publisher | Elasticsearch | Index Article | ES REST API | Indexes article to news_archive index |
| 27 | Publisher | Filesystem | Save Markdown | File I/O | Saves article as .md file in /articles |
| 28 | Publisher | News Chief | Confirm Publication | A2A | Returns success status |
| 29 | News Chief | Internal | Update Story | - | Updates story status to "published" |
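Rows 6b and 10 send A2A over JSON-RPC rather than the bare message envelopes shown earlier. As a rough illustration only (the JSON-RPC method name and field layout here are assumptions, not taken from the repo), the archive search from Step 3 might travel as something like:

```json
{
  "jsonrpc": "2.0",
  "id": "archive_search_renewable_2024",
  "method": "message/send",
  "params": {
    "message_type": "task_request",
    "sender": "reporter_agent",
    "receiver": "archive_agent",
    "payload": {
      "task_id": "archive_search_renewable_2024",
      "capability": "search_archive"
    }
  }
}
```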

Conclusion

Both A2A and MCP play important roles in the modern paradigm of augmented LLM infrastructure. A2A provides the flexibility needed for complex multi-agent systems, but can be less portable and carries more operational complexity. MCP offers a standardized approach to tool integration that is easier to implement and maintain, but it was not designed for multi-agent coordination.

The choice is not either/or. As our newsroom example shows, the most sophisticated and effective LLM-backed systems typically combine both approaches: agents coordinate and specialize through the A2A protocol while accessing their tools and resources through MCP servers. This hybrid architecture delivers the organizational benefits of multi-agent systems along with the standardization and ecosystem advantages of MCP. It suggests you may not need to choose at all: simply adopt both as the standard approach.

As a developer or architect, you will need to test and determine the best combination of the two solutions for your specific use case. Understanding the strengths, limitations, and appropriate applications of each approach will help you build more effective, maintainable, and scalable AI systems.

Whether you are building a digital newsroom, a customer service platform, a research assistant, or any other LLM-powered application, carefully considering both your coordination needs (A2A) and your tool access needs (MCP) will set you up for success.

Additional resources

Original article: www.elastic.co/search-labs…