Overview
Building on the Skill system fundamentals from Chapter 13, this chapter uses real code examples to show in depth how to build a complex content generation pipeline.
1.1 What - Understanding the Scenario
Business background:
A leading internet company needs to produce thousands of pieces of content every day (news, marketing copy, technical documentation, and so on). The traditional workflow suffers from several pain points:
- Manual writing is inefficient, with a daily output of only about 200 pieces
- Content quality is inconsistent and requires multiple rounds of human review
- Editors face heavy workloads, leading to high turnover
- The team cannot respond quickly to breaking news or market changes
The Skill-system solution:
Build a pipeline that chains seven specialized skills, from topic analysis → material collection → outline generation → content writing → editing and polishing → SEO optimization → quality assurance, achieving end-to-end intelligent content generation.
Key characteristics:
- Chained execution: the seven steps have strict sequential dependencies
- Intermediate caching: each step's result is saved for debugging and visualization
- Quality control: quality checks and improvements at multiple stages
- Flexible configuration: individual steps can be skipped or the flow reordered
- Performance monitoring: execution time and quality metrics are recorded for every step
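The chaining and caching characteristics above can be sketched with a minimal, hypothetical step container; the `MiniPipeline` name and structure here are illustrative and not part of the framework used later in this chapter:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal sketch of a chained pipeline with intermediate caching.
// Each step consumes the previous step's output; every result is
// cached by step name so intermediate states can be inspected.
public class MiniPipeline {
    private final Map<String, Function<String, String>> steps = new LinkedHashMap<>();
    private final Map<String, String> cache = new LinkedHashMap<>();

    public MiniPipeline addStep(String name, Function<String, String> fn) {
        steps.put(name, fn);
        return this;
    }

    public String run(String input) {
        String current = input;
        for (Map.Entry<String, Function<String, String>> e : steps.entrySet()) {
            current = e.getValue().apply(current); // strict sequential dependency
            cache.put(e.getKey(), current);        // save the intermediate result
        }
        return current;
    }

    public String intermediate(String stepName) {
        return cache.get(stepName);
    }

    public static void main(String[] args) {
        MiniPipeline p = new MiniPipeline()
            .addStep("analyze", s -> "analyzed(" + s + ")")
            .addStep("write", s -> "written(" + s + ")");
        System.out.println(p.run("topic"));            // written(analyzed(topic))
        System.out.println(p.intermediate("analyze")); // analyzed(topic)
    }
}
```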
Pipeline architecture:
Topic input
↓
┌──────────────────────────────────────────────┐
│ Step 1: Topic Analysis                       │
│ - Extract keywords                           │
│ - Analyze the target audience                │
│ - Determine the content type                 │
└──────────────────────────────────────────────┘
↓
┌──────────────────────────────────────────────┐
│ Step 2: Content Collection                   │
│ - Aggregate news/academic/social sources     │
│ - Filter by relevance                        │
│ - Rank by weight                             │
└──────────────────────────────────────────────┘
↓
┌──────────────────────────────────────────────┐
│ Step 3: Outline Generation                   │
│ - Design a structured outline                │
│ - Analyze the section hierarchy              │
│ - Check the logical flow                     │
└──────────────────────────────────────────────┘
↓
┌──────────────────────────────────────────────┐
│ Step 4: Content Writing                      │
│ - Generate section by section                │
│ - Polish the language                        │
│ - Normalize formatting                       │
└──────────────────────────────────────────────┘
↓
┌──────────────────────────────────────────────┐
│ Step 5: Content Editing                      │
│ - Grammar checking                           │
│ - Cascaded LLM editing                       │
│ - Enforce style consistency                  │
└──────────────────────────────────────────────┘
↓
┌──────────────────────────────────────────────┐
│ Step 6: SEO Optimization                     │
│ - Optimize keyword density                   │
│ - Optimize title and description             │
│ - Suggest internal links                     │
└──────────────────────────────────────────────┘
↓
┌──────────────────────────────────────────────┐
│ Step 7: Quality Assurance                    │
│ - Completeness check                         │
│ - Readability scoring                        │
│ - Accuracy verification                      │
└──────────────────────────────────────────────┘
↓
Final content output
1.2 Why - Why Use the Pipeline Pattern
1. Complex workflow management
- The seven steps have strict ordering and dependencies
- They cannot run in parallel; processing must be sequential
- Intermediate results must be saved for use by later steps
2. Quality assurance
- Each step can be evaluated and optimized independently
- Quality checks and callbacks can be inserted at any step
- Problems are easy to locate and fix
3. Reuse of intermediate results
- One step's output can be consumed by multiple later steps
- Caching and reuse mechanisms are supported
- Redundant computation is avoided
4. Error recovery
- On failure, only the failing step needs to be re-executed
- The whole pipeline does not have to be rerun
- Step-by-step debugging is supported
5. Performance optimization
- Different optimization strategies can be applied to different steps
- Some steps can process multiple tasks in parallel
- Batch processing and dynamic batch sizing are supported
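The error-recovery point can be illustrated with a small sketch: a runner that caches completed step results, so that a retry after a failure re-executes only the steps that have not yet succeeded. The `ResumableRunner` name and shape are illustrative assumptions, not part of the chapter's framework:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.UnaryOperator;

// Sketch of resumable execution: completed steps are cached, so a retry
// after a failure re-runs only the steps that did not finish.
public class ResumableRunner {
    private final Map<String, UnaryOperator<String>> steps = new LinkedHashMap<>();
    private final Map<String, String> completed = new LinkedHashMap<>();

    public void addStep(String name, UnaryOperator<String> fn) {
        steps.put(name, fn);
    }

    /** Runs all steps, skipping any that already completed. */
    public String run(String input) {
        String current = input;
        for (Map.Entry<String, UnaryOperator<String>> e : steps.entrySet()) {
            if (completed.containsKey(e.getKey())) {
                current = completed.get(e.getKey()); // reuse the cached result
                continue;
            }
            current = e.getValue().apply(current);  // may throw; the cache stays intact
            completed.put(e.getKey(), current);
        }
        return current;
    }

    public int completedCount() {
        return completed.size();
    }
}
```

On the first run, a transient failure in step two leaves step one's result cached; the retry skips step one entirely and resumes from the failure point.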
1.3 How - Complete Code Implementation
Part 1: Define the pipeline data structures
/**
 * Intermediate representation of the content pipeline
 */
@Data
@Builder
class ContentPipelineContext {
    private String contentId;
    private String topic;
    // Step 1 output
    private TopicAnalysisResult topicAnalysis;
    // Step 2 output
    private ContentCollectionResult contentCollection;
    // Step 3 output
    private OutlineGenerationResult outline;
    // Step 4 output
    private ContentWritingResult contentWriting;
    // Step 5 output
    private ContentEditingResult contentEditing;
    // Step 6 output
    private SEOOptimizationResult seoOptimization;
    // Step 7 output
    private QualityAssuranceResult qualityAssurance;
    // Flow control. @Builder.Default is required: without it, Lombok's builder
    // ignores the field initializers and leaves these fields null.
    // LinkedHashMap preserves step insertion order for reporting.
    private LocalDateTime startTime;
    private LocalDateTime endTime;
    @Builder.Default
    private Map<String, Long> stepDurations = new LinkedHashMap<>();
    @Builder.Default
    private List<PipelineError> errors = new ArrayList<>();
    private String status; // RUNNING, COMPLETED, FAILED
}
/**
 * Topic analysis result
 */
@Data
@Builder
class TopicAnalysisResult {
    private List<String> keywords;
    private List<String> targetAudiences;
    private String contentType; // news, article, tutorial, review
    private String tone; // formal, casual, technical
    private int estimatedLength; // expected length in characters
    private Map<String, Double> relevanceScores;
}
/**
 * Material collection result
 */
@Data
@Builder
class ContentCollectionResult {
    private List<CollectedReference> references;
    private int totalSourcesCount;
    private double averageRelevance;
    private List<DataSource> sources; // news, academic, social

    @Data
    @Builder
    static class CollectedReference {
        private String id;
        private String title;
        private String content;
        private String source;
        private double relevanceScore;
        private LocalDateTime publishedDate;
    }

    enum DataSource {
        NEWS("news sites"),
        ACADEMIC("academic sources"),
        SOCIAL("social media"),
        BLOG("blog platforms"),
        OFFICIAL("official documentation");

        private final String description;

        DataSource(String description) {
            this.description = description;
        }
    }
}
/**
 * Outline generation result
 */
@Data
@Builder
class OutlineGenerationResult {
    private OutlineNode root;
    private int totalSections;
    private int maxDepth;
    private List<String> logicalFlow;

    @Data
    @Builder
    static class OutlineNode {
        private String title;
        private String description;
        private List<OutlineNode> children;
        private int level;
    }
}
/**
 * Content writing result
 */
@Data
@Builder
class ContentWritingResult {
    private List<ContentSection> sections;
    private String fullContent;
    private int wordCount;
    private double completeness;

    @Data
    @Builder
    static class ContentSection {
        private String sectionId;
        private String title;
        private String content;
        private int wordCount;
        private double quality;
    }
}
/**
 * Editing and polishing result
 */
@Data
@Builder
class ContentEditingResult {
    private String editedContent;
    private List<EditingSuggestion> suggestions;
    private int grammarIssuesFixed;
    private double readabilityScore;

    @Data
    @Builder
    static class EditingSuggestion {
        private int position;
        private String original;
        private String suggestion;
        private String reason;
    }
}
/**
 * SEO optimization result
 */
@Data
@Builder
class SEOOptimizationResult {
    private String optimizedContent;
    private String optimizedTitle;
    private String optimizedDescription;
    private Map<String, Integer> keywordDensity;
    private List<String> internalLinkSuggestions;
    private double seoScore;
}
/**
 * Quality assurance result
 */
@Data
@Builder
class QualityAssuranceResult {
    private boolean completenessCheck;
    private double readabilityScore;
    private boolean accuracyCheck;
    private List<String> qualityIssues;
    private double overallQualityScore;
    private boolean readyForPublish;
    private List<String> improvementSuggestions;
}
Part 2: Implement the pipeline skills
/**
 * Step 1: Topic analysis skill
 */
public class TopicAnalysisSkill {
    private static final Logger logger = LoggerFactory.getLogger(TopicAnalysisSkill.class);
    private final Model languageModel;

    public TopicAnalysisSkill(Model languageModel) {
        this.languageModel = languageModel;
    }

    public TopicAnalysisResult analyze(String topic) throws Exception {
        logger.info("Starting topic analysis: {}", topic);
        String analysisPrompt = String.format("""
            Analyze the following topic: %s
            Return the analysis as JSON with the following fields:
            1. keywords: a list of keywords (5-10)
            2. targetAudiences: target audiences (3-5)
            3. contentType: content type (news/article/tutorial/review)
            4. tone: tone (formal/casual/technical)
            5. estimatedLength: expected length in characters
            """, topic);
        // Call the model
        Msg response = languageModel.getModelResponse(new Msg[]{
            Msg.builder()
                .role("system")
                .textContent("You are a content strategy expert.")
                .build(),
            Msg.builder()
                .role("user")
                .textContent(analysisPrompt)
                .build()
        }).block();
        // Parse the result
        return parseAnalysisResponse(response.getTextContent());
    }

    private TopicAnalysisResult parseAnalysisResponse(String response) {
        try {
            ObjectMapper mapper = new ObjectMapper();
            JsonNode node = mapper.readTree(response);
            List<String> keywords = new ArrayList<>();
            node.get("keywords").forEach(k -> keywords.add(k.asText()));
            List<String> audiences = new ArrayList<>();
            node.get("targetAudiences").forEach(a -> audiences.add(a.asText()));
            return TopicAnalysisResult.builder()
                .keywords(keywords)
                .targetAudiences(audiences)
                .contentType(node.get("contentType").asText())
                .tone(node.get("tone").asText())
                .estimatedLength(node.get("estimatedLength").asInt())
                .build();
        } catch (Exception e) {
            logger.error("Failed to parse the analysis result", e);
            // Fall back to sensible defaults so the pipeline can continue
            return TopicAnalysisResult.builder()
                .keywords(List.of("default"))
                .targetAudiences(List.of("general"))
                .contentType("article")
                .tone("formal")
                .estimatedLength(2000)
                .build();
        }
    }
}
/**
 * Step 2: Material collection skill
 */
public class ContentCollectionSkill {
    private static final Logger logger = LoggerFactory.getLogger(ContentCollectionSkill.class);
    private final SearchService searchService;
    private final NewsAggregator newsAggregator;

    public ContentCollectionSkill(SearchService searchService, NewsAggregator newsAggregator) {
        this.searchService = searchService;
        this.newsAggregator = newsAggregator;
    }

    public ContentCollectionResult collect(TopicAnalysisResult analysis) throws Exception {
        logger.info("Starting material collection, keywords: {}", analysis.getKeywords());
        List<ContentCollectionResult.CollectedReference> references = new ArrayList<>();
        // Collect from multiple data sources
        references.addAll(searchService.searchWeb(analysis.getKeywords()));
        references.addAll(newsAggregator.getLatestNews(analysis.getKeywords()));
        references.addAll(searchService.searchAcademic(analysis.getKeywords()));
        // Compute the average relevance
        double avgRelevance = references.stream()
            .mapToDouble(ContentCollectionResult.CollectedReference::getRelevanceScore)
            .average()
            .orElse(0.0);
        // SLF4J only supports {} placeholders, so format the number explicitly
        logger.info("Material collection finished: {} items, average relevance: {}",
            references.size(), String.format("%.3f", avgRelevance));
        return ContentCollectionResult.builder()
            .references(references)
            .totalSourcesCount(references.size())
            .averageRelevance(avgRelevance)
            .sources(List.of(
                ContentCollectionResult.DataSource.NEWS,
                ContentCollectionResult.DataSource.ACADEMIC,
                ContentCollectionResult.DataSource.SOCIAL))
            .build();
    }
}
/**
 * Step 3: Outline generation skill
 */
public class OutlineGenerationSkill {
    private static final Logger logger = LoggerFactory.getLogger(OutlineGenerationSkill.class);
    private final Model languageModel;

    public OutlineGenerationSkill(Model languageModel) {
        this.languageModel = languageModel;
    }

    public OutlineGenerationResult generateOutline(
            TopicAnalysisResult analysis,
            ContentCollectionResult collection) throws Exception {
        logger.info("Starting outline generation");
        String outlinePrompt = String.format("""
            Based on the following analysis, generate a structured content outline:
            Topic analysis:
            - Keywords: %s
            - Target audiences: %s
            - Content type: %s
            Material summary:
            - Number of references: %d
            - Data sources: %s
            Generate a hierarchical outline with 3-5 levels.
            """,
            String.join(", ", analysis.getKeywords()),
            String.join(", ", analysis.getTargetAudiences()),
            analysis.getContentType(),
            collection.getTotalSourcesCount(),
            collection.getSources().toString());
        Msg response = languageModel.getModelResponse(new Msg[]{
            Msg.builder()
                .role("system")
                .textContent("You are a content structure designer.")
                .build(),
            Msg.builder()
                .role("user")
                .textContent(outlinePrompt)
                .build()
        }).block();
        logger.info("Outline generation finished");
        // Simplified for this example: a full implementation would parse the
        // model response into an OutlineNode tree and set it as root
        return OutlineGenerationResult.builder()
            .totalSections(5)
            .maxDepth(3)
            .logicalFlow(List.of("Introduction", "Background", "Key Points",
                "Case Studies", "Conclusion"))
            .build();
    }
}
/**
 * Step 4: Content writing skill
 */
public class ContentWritingSkill {
    private static final Logger logger = LoggerFactory.getLogger(ContentWritingSkill.class);
    private final Model languageModel;

    public ContentWritingSkill(Model languageModel) {
        this.languageModel = languageModel;
    }

    public ContentWritingResult write(
            TopicAnalysisResult analysis,
            OutlineGenerationResult outline,
            ContentCollectionResult collection) throws Exception {
        logger.info("Starting content writing");
        List<ContentWritingResult.ContentSection> sections = new ArrayList<>();
        StringBuilder fullContent = new StringBuilder();
        int totalWordCount = 0;
        // Generate the content section by section
        for (String section : outline.getLogicalFlow()) {
            logger.debug("  Generating section: {}", section);
            String writingPrompt = String.format("""
                Write the "%s" section based on the following:
                Topic: %s
                Keywords: %s
                Style: %s
                Write roughly 150-300 words.
                """,
                section,
                analysis.getKeywords().get(0), // the primary keyword stands in for the topic
                String.join(", ", analysis.getKeywords()),
                analysis.getTone());
            Msg response = languageModel.getModelResponse(new Msg[]{
                Msg.builder()
                    .role("system")
                    .textContent("You are a professional content writer.")
                    .build(),
                Msg.builder()
                    .role("user")
                    .textContent(writingPrompt)
                    .build()
            }).block();
            String content = response.getTextContent();
            int wordCount = content.length(); // character count as an approximate length metric
            totalWordCount += wordCount;
            sections.add(ContentWritingResult.ContentSection.builder()
                .sectionId("sec_" + UUID.randomUUID().toString().substring(0, 8))
                .title(section)
                .content(content)
                .wordCount(wordCount)
                .quality(0.8)
                .build());
            fullContent.append("## ").append(section).append("\n\n");
            fullContent.append(content).append("\n\n");
        }
        logger.info("Content writing finished, total length: {}", totalWordCount);
        return ContentWritingResult.builder()
            .sections(sections)
            .fullContent(fullContent.toString())
            .wordCount(totalWordCount)
            .completeness(0.95)
            .build();
    }
}
/**
 * Step 5: Editing and polishing skill
 */
public class ContentEditingSkill {
    private static final Logger logger = LoggerFactory.getLogger(ContentEditingSkill.class);
    private final Model languageModel;
    private final GrammarChecker grammarChecker;

    public ContentEditingSkill(Model languageModel, GrammarChecker grammarChecker) {
        this.languageModel = languageModel;
        this.grammarChecker = grammarChecker;
    }

    public ContentEditingResult edit(ContentWritingResult writing) throws Exception {
        logger.info("Starting editing and polishing");
        // Grammar check
        List<ContentEditingResult.EditingSuggestion> suggestions =
            grammarChecker.check(writing.getFullContent());
        // Use the LLM to improve the language
        String editingPrompt = """
            Improve the language quality of the following content:
            - Fix grammatical errors
            - Improve fluency
            - Preserve the original meaning and style
            - Strengthen logical clarity
            Content:
            """ + writing.getFullContent();
        Msg response = languageModel.getModelResponse(new Msg[]{
            Msg.builder()
                .role("system")
                .textContent("You are a professional editor.")
                .build(),
            Msg.builder()
                .role("user")
                .textContent(editingPrompt)
                .build()
        }).block();
        String editedContent = response.getTextContent();
        double readabilityScore = 0.85; // fixed placeholder score for this example
        logger.info("Editing finished: {} suggested improvements, readability score: {}",
            suggestions.size(), String.format("%.2f", readabilityScore));
        return ContentEditingResult.builder()
            .editedContent(editedContent)
            .suggestions(suggestions)
            .grammarIssuesFixed(suggestions.size())
            .readabilityScore(readabilityScore)
            .build();
    }
}
/**
 * Step 6: SEO optimization skill
 */
public class SEOOptimizationSkill {
    private static final Logger logger = LoggerFactory.getLogger(SEOOptimizationSkill.class);
    private final Model languageModel;
    private final SEOAnalyzer seoAnalyzer;

    public SEOOptimizationSkill(Model languageModel, SEOAnalyzer seoAnalyzer) {
        this.languageModel = languageModel;
        this.seoAnalyzer = seoAnalyzer;
    }

    public SEOOptimizationResult optimize(
            TopicAnalysisResult analysis,
            ContentEditingResult editing) throws Exception {
        logger.info("Starting SEO optimization");
        String content = editing.getEditedContent();
        // Keyword density analysis
        Map<String, Integer> keywordDensity = seoAnalyzer.analyzeKeywordDensity(
            content, analysis.getKeywords());
        // Generate an optimized title and description
        String seoPrompt = String.format("""
            Generate an SEO-optimized title and description for the following content:
            Keywords: %s
            Content excerpt: %s
            Return:
            1. An optimized title (at most 60 characters)
            2. A meta description (at most 160 characters)
            """,
            String.join(", ", analysis.getKeywords()),
            content.substring(0, Math.min(200, content.length())));
        Msg response = languageModel.getModelResponse(new Msg[]{
            Msg.builder()
                .role("system")
                .textContent("You are an SEO expert.")
                .build(),
            Msg.builder()
                .role("user")
                .textContent(seoPrompt)
                .build()
        }).block();
        logger.info("SEO optimization finished, SEO score: {}", String.format("%.2f", 0.78));
        // Simplified for this example: a full implementation would parse the
        // optimized title and description from the model response
        return SEOOptimizationResult.builder()
            .optimizedContent(content)
            .optimizedTitle("Optimized title")
            .optimizedDescription("Optimized description")
            .keywordDensity(keywordDensity)
            .internalLinkSuggestions(List.of("related link 1", "related link 2"))
            .seoScore(0.78)
            .build();
    }
}
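The `SEOAnalyzer` above is an injected dependency whose implementation is not shown. As a rough illustration of what its `analyzeKeywordDensity` could do, here is a minimal count-based sketch; the class name and behavior are assumptions for this example, not the real analyzer:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Rough sketch of a keyword-density counter: counts non-overlapping
// occurrences of each keyword in the content.
public class SimpleKeywordCounter {
    public static Map<String, Integer> analyzeKeywordDensity(String content,
                                                             List<String> keywords) {
        Map<String, Integer> density = new HashMap<>();
        for (String kw : keywords) {
            int count = 0;
            int idx = content.indexOf(kw);
            while (idx >= 0) {
                count++;
                idx = content.indexOf(kw, idx + kw.length()); // skip past this match
            }
            density.put(kw, count);
        }
        return density;
    }

    public static void main(String[] args) {
        Map<String, Integer> d = analyzeKeywordDensity(
            "ai and ai in finance", List.of("ai", "finance", "cloud"));
        System.out.println(d.get("ai")); // 2
    }
}
```

A production analyzer would normalize case, tokenize on word boundaries, and report density as a ratio of keyword hits to total tokens rather than a raw count.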
/**
 * Step 7: Quality assurance skill
 */
public class QualityAssuranceSkill {
    private static final Logger logger = LoggerFactory.getLogger(QualityAssuranceSkill.class);
    private final Model languageModel;
    private final QualityAnalyzer qualityAnalyzer;

    public QualityAssuranceSkill(Model languageModel, QualityAnalyzer qualityAnalyzer) {
        this.languageModel = languageModel;
        this.qualityAnalyzer = qualityAnalyzer;
    }

    public QualityAssuranceResult assess(
            TopicAnalysisResult analysis,
            SEOOptimizationResult seoResult) throws Exception {
        logger.info("Starting quality assessment");
        String content = seoResult.getOptimizedContent();
        // Completeness check: at least 80% of the estimated length
        boolean completenessCheck = content.length() >=
            (analysis.getEstimatedLength() * 0.8);
        // Readability score
        double readabilityScore = qualityAnalyzer.analyzeReadability(content);
        // Accuracy verification
        boolean accuracyCheck = qualityAnalyzer.verifyAccuracy(content, analysis.getKeywords());
        // Identify quality issues
        List<String> qualityIssues = qualityAnalyzer.identifyIssues(content);
        // Composite score (weights sum to 1.0: 0.25 + 0.3 + 0.25 + 0.2)
        double overallScore = (completenessCheck ? 0.25 : 0.0) +
            (readabilityScore * 0.3) +
            (accuracyCheck ? 0.25 : 0.0) +
            (seoResult.getSeoScore() * 0.2);
        boolean readyForPublish = overallScore >= 0.75 && qualityIssues.isEmpty();
        logger.info("Quality assessment finished, overall score: {}, ready for publish: {}",
            String.format("%.2f", overallScore), readyForPublish);
        return QualityAssuranceResult.builder()
            .completenessCheck(completenessCheck)
            .readabilityScore(readabilityScore)
            .accuracyCheck(accuracyCheck)
            .qualityIssues(qualityIssues)
            .overallQualityScore(overallScore)
            .readyForPublish(readyForPublish)
            .improvementSuggestions(List.of("suggestion 1", "suggestion 2"))
            .build();
    }
}
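The composite score in assess can be checked with a small worked example; the weights below mirror the ones in the method (0.25 for completeness, 0.3 for readability, 0.25 for accuracy, 0.2 for SEO):

```java
// Worked example of the composite quality score used in assess():
// (completeness ? 0.25 : 0) + 0.3 * readability + (accuracy ? 0.25 : 0) + 0.2 * seoScore
public class QualityScoreExample {
    public static double overallScore(boolean completeness, double readability,
                                      boolean accuracy, double seoScore) {
        return (completeness ? 0.25 : 0.0)
             + (readability * 0.3)
             + (accuracy ? 0.25 : 0.0)
             + (seoScore * 0.2);
    }

    public static void main(String[] args) {
        // With readability 0.85, SEO score 0.78, and both checks passing,
        // the score is 0.25 + 0.255 + 0.25 + 0.156 = 0.911
        System.out.println(String.format("%.2f",
            overallScore(true, 0.85, true, 0.78)));
    }
}
```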
Part 3: The pipeline orchestrator
/**
 * Content generation pipeline orchestrator
 */
public class ContentGenerationPipeline {
    private static final Logger logger = LoggerFactory.getLogger(ContentGenerationPipeline.class);
    // The individual skills
    private final TopicAnalysisSkill topicAnalysisSkill;
    private final ContentCollectionSkill contentCollectionSkill;
    private final OutlineGenerationSkill outlineGenerationSkill;
    private final ContentWritingSkill contentWritingSkill;
    private final ContentEditingSkill contentEditingSkill;
    private final SEOOptimizationSkill seoOptimizationSkill;
    private final QualityAssuranceSkill qualityAssuranceSkill;

    public ContentGenerationPipeline(
            TopicAnalysisSkill topicAnalysis,
            ContentCollectionSkill contentCollection,
            OutlineGenerationSkill outlineGeneration,
            ContentWritingSkill contentWriting,
            ContentEditingSkill contentEditing,
            SEOOptimizationSkill seoOptimization,
            QualityAssuranceSkill qualityAssurance) {
        this.topicAnalysisSkill = topicAnalysis;
        this.contentCollectionSkill = contentCollection;
        this.outlineGenerationSkill = outlineGeneration;
        this.contentWritingSkill = contentWriting;
        this.contentEditingSkill = contentEditing;
        this.seoOptimizationSkill = seoOptimization;
        this.qualityAssuranceSkill = qualityAssurance;
    }

    /**
     * Execute the full content generation pipeline
     */
    public ContentPipelineContext generate(String topic) throws Exception {
        ContentPipelineContext context = ContentPipelineContext.builder()
            .contentId("content_" + UUID.randomUUID().toString().substring(0, 8))
            .topic(topic)
            .startTime(LocalDateTime.now())
            .status("RUNNING")
            .build();
        logger.info("\n" + "=".repeat(70));
        logger.info("Starting content generation pipeline");
        logger.info("Content ID: {}", context.getContentId());
        logger.info("Topic: {}", topic);
        logger.info("=".repeat(70));
        try {
            // ========== Step 1: Topic analysis ==========
            logger.info("\n[Step 1/7] Running topic analysis...");
            long step1Start = System.currentTimeMillis();
            context.setTopicAnalysis(topicAnalysisSkill.analyze(topic));
            long step1Duration = System.currentTimeMillis() - step1Start;
            context.getStepDurations().put("topic_analysis", step1Duration);
            logger.info("✓ Topic analysis finished ({}ms)", step1Duration);
            logger.info(" - Keywords: {}", context.getTopicAnalysis().getKeywords());
            logger.info(" - Target audiences: {}", context.getTopicAnalysis().getTargetAudiences());
            // ========== Step 2: Material collection ==========
            logger.info("\n[Step 2/7] Running material collection...");
            long step2Start = System.currentTimeMillis();
            context.setContentCollection(
                contentCollectionSkill.collect(context.getTopicAnalysis()));
            long step2Duration = System.currentTimeMillis() - step2Start;
            context.getStepDurations().put("content_collection", step2Duration);
            logger.info("✓ Material collection finished ({}ms)", step2Duration);
            logger.info(" - References: {}", context.getContentCollection().getTotalSourcesCount());
            // SLF4J only supports {} placeholders, so format numbers explicitly
            logger.info(" - Average relevance: {}",
                String.format("%.3f", context.getContentCollection().getAverageRelevance()));
            // ========== Step 3: Outline generation ==========
            logger.info("\n[Step 3/7] Running outline generation...");
            long step3Start = System.currentTimeMillis();
            context.setOutline(outlineGenerationSkill.generateOutline(
                context.getTopicAnalysis(),
                context.getContentCollection()));
            long step3Duration = System.currentTimeMillis() - step3Start;
            context.getStepDurations().put("outline_generation", step3Duration);
            logger.info("✓ Outline generation finished ({}ms)", step3Duration);
            logger.info(" - Sections: {}", context.getOutline().getTotalSections());
            logger.info(" - Logical flow: {}", context.getOutline().getLogicalFlow());
            // ========== Step 4: Content writing ==========
            logger.info("\n[Step 4/7] Running content writing...");
            long step4Start = System.currentTimeMillis();
            context.setContentWriting(contentWritingSkill.write(
                context.getTopicAnalysis(),
                context.getOutline(),
                context.getContentCollection()));
            long step4Duration = System.currentTimeMillis() - step4Start;
            context.getStepDurations().put("content_writing", step4Duration);
            logger.info("✓ Content writing finished ({}ms)", step4Duration);
            logger.info(" - Total length: {}", context.getContentWriting().getWordCount());
            logger.info(" - Completeness: {}%",
                String.format("%.0f", context.getContentWriting().getCompleteness() * 100));
            // ========== Step 5: Editing and polishing ==========
            logger.info("\n[Step 5/7] Running editing and polishing...");
            long step5Start = System.currentTimeMillis();
            context.setContentEditing(contentEditingSkill.edit(context.getContentWriting()));
            long step5Duration = System.currentTimeMillis() - step5Start;
            context.getStepDurations().put("content_editing", step5Duration);
            logger.info("✓ Editing finished ({}ms)", step5Duration);
            logger.info(" - Issues fixed: {}", context.getContentEditing().getGrammarIssuesFixed());
            logger.info(" - Readability: {}",
                String.format("%.2f", context.getContentEditing().getReadabilityScore()));
            // ========== Step 6: SEO optimization ==========
            logger.info("\n[Step 6/7] Running SEO optimization...");
            long step6Start = System.currentTimeMillis();
            context.setSeoOptimization(seoOptimizationSkill.optimize(
                context.getTopicAnalysis(),
                context.getContentEditing()));
            long step6Duration = System.currentTimeMillis() - step6Start;
            context.getStepDurations().put("seo_optimization", step6Duration);
            logger.info("✓ SEO optimization finished ({}ms)", step6Duration);
            logger.info(" - SEO score: {}",
                String.format("%.2f", context.getSeoOptimization().getSeoScore()));
            logger.info(" - Optimized title: {}", context.getSeoOptimization().getOptimizedTitle());
            // ========== Step 7: Quality assessment ==========
            logger.info("\n[Step 7/7] Running quality assessment...");
            long step7Start = System.currentTimeMillis();
            context.setQualityAssurance(qualityAssuranceSkill.assess(
                context.getTopicAnalysis(),
                context.getSeoOptimization()));
            long step7Duration = System.currentTimeMillis() - step7Start;
            context.getStepDurations().put("quality_assurance", step7Duration);
            logger.info("✓ Quality assessment finished ({}ms)", step7Duration);
            logger.info(" - Overall score: {}",
                String.format("%.2f", context.getQualityAssurance().getOverallQualityScore()));
            logger.info(" - Ready for publish: {}",
                context.getQualityAssurance().isReadyForPublish() ? "yes" : "no");
            // ========== Pipeline finished ==========
            context.setEndTime(LocalDateTime.now());
            context.setStatus("COMPLETED");
            long totalTime = context.getStepDurations().values().stream()
                .mapToLong(Long::longValue)
                .sum();
            logger.info("\n" + "=".repeat(70));
            logger.info("Pipeline finished");
            logger.info("Total time: {}ms", totalTime);
            printPipelineStatistics(context);
            logger.info("=".repeat(70));
            return context;
        } catch (Exception e) {
            logger.error("Pipeline execution failed", e);
            context.setStatus("FAILED");
            context.getErrors().add(PipelineError.builder()
                .timestamp(System.currentTimeMillis())
                .message(e.getMessage())
                .stackTrace(e.getStackTrace()[0].toString())
                .build());
            throw e;
        }
    }
    private void printPipelineStatistics(ContentPipelineContext context) {
        logger.info("\n[Pipeline statistics]");
        logger.info("Per-step durations:");
        int stepNum = 1;
        // NOTE: stepDurations must be an insertion-ordered map (e.g. LinkedHashMap)
        // for this numbering to match the execution order
        for (Map.Entry<String, Long> entry : context.getStepDurations().entrySet()) {
            logger.info("  Step {}: {} = {}ms",
                stepNum++, entry.getKey(), entry.getValue());
        }
        logger.info("\n[Content quality]");
        QualityAssuranceResult qa = context.getQualityAssurance();
        logger.info("  Completeness check: {}", qa.isCompletenessCheck() ? "✓ passed" : "✗ failed");
        logger.info("  Readability score: {}", String.format("%.2f", qa.getReadabilityScore()));
        logger.info("  Accuracy check: {}", qa.isAccuracyCheck() ? "✓ passed" : "✗ failed");
        logger.info("  Overall score: {}", String.format("%.2f", qa.getOverallQualityScore()));
        logger.info("  Recommended for publishing: {}", qa.isReadyForPublish() ? "✓ yes" : "✗ no");
    }
}
@Data
@Builder
class PipelineError {
private long timestamp;
private String message;
private String stackTrace;
}
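Each step block in generate() repeats the same timing-and-recording boilerplate. A generic helper along these lines (a sketch, not part of the original code; the `StepTimer` name is illustrative) could factor it out:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Supplier;

// Sketch of a reusable "timed step" helper: runs a step, records its
// duration under the given name, and returns the step's result.
public class StepTimer {
    private final Map<String, Long> durations = new LinkedHashMap<>();

    public <T> T time(String stepName, Supplier<T> step) {
        long start = System.currentTimeMillis();
        try {
            return step.get();
        } finally {
            // record the duration even if the step throws
            durations.put(stepName, System.currentTimeMillis() - start);
        }
    }

    public Map<String, Long> getDurations() {
        return durations;
    }
}
```

With such a helper, each block in generate() would shrink to something like `context.setTopicAnalysis(timer.time("topic_analysis", () -> topicAnalysisSkill.analyze(topic)))` — though the skills' checked exceptions would first need wrapping, since `Supplier` cannot throw them.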
1.4 Complete Runnable Example
public class ContentGenerationDemoRunner {
    public static void main(String[] args) throws Exception {
        // Initialize the dependencies
        Model languageModel = DashScopeModel.builder()
            .modelName("qwen-turbo")
            .apiKey(System.getenv("DASHSCOPE_API_KEY"))
            .build();
        // Create the individual skills
        TopicAnalysisSkill topicAnalysis = new TopicAnalysisSkill(languageModel);
        ContentCollectionSkill contentCollection = new ContentCollectionSkill(
            new MockSearchService(), new MockNewsAggregator());
        OutlineGenerationSkill outlineGeneration = new OutlineGenerationSkill(languageModel);
        ContentWritingSkill contentWriting = new ContentWritingSkill(languageModel);
        ContentEditingSkill contentEditing = new ContentEditingSkill(
            languageModel, new MockGrammarChecker());
        SEOOptimizationSkill seoOptimization = new SEOOptimizationSkill(
            languageModel, new MockSEOAnalyzer());
        QualityAssuranceSkill qualityAssurance = new QualityAssuranceSkill(
            languageModel, new MockQualityAnalyzer());
        // Create the pipeline
        ContentGenerationPipeline pipeline = new ContentGenerationPipeline(
            topicAnalysis, contentCollection, outlineGeneration,
            contentWriting, contentEditing, seoOptimization,
            qualityAssurance);
        // Run the demo
        System.out.println("\n" + "=".repeat(70));
        System.out.println("Content Generation Pipeline - Full Demo");
        System.out.println("=".repeat(70));
        // Scenario 1: news content
        demonstrateNewsGeneration(pipeline);
        // Scenario 2: technical documentation
        demonstrateTechnicalDocGeneration(pipeline);
        // Scenario 3: marketing copy
        demonstrateMarketingContentGeneration(pipeline);
    }

    private static void demonstrateNewsGeneration(ContentGenerationPipeline pipeline)
            throws Exception {
        System.out.println("\n" + "-".repeat(70));
        System.out.println("Scenario 1: Generate AI industry news");
        System.out.println("-".repeat(70));
        String topic = "Applications and Challenges of AI in Finance";
        ContentPipelineContext result = pipeline.generate(topic);
        printFinalResult(result);
    }

    private static void demonstrateTechnicalDocGeneration(ContentGenerationPipeline pipeline)
            throws Exception {
        System.out.println("\n" + "-".repeat(70));
        System.out.println("Scenario 2: Generate technical documentation");
        System.out.println("-".repeat(70));
        String topic = "A Deep Dive into Java Lambda Expressions";
        ContentPipelineContext result = pipeline.generate(topic);
        printFinalResult(result);
    }

    private static void demonstrateMarketingContentGeneration(ContentGenerationPipeline pipeline)
            throws Exception {
        System.out.println("\n" + "-".repeat(70));
        System.out.println("Scenario 3: Generate marketing copy");
        System.out.println("-".repeat(70));
        String topic = "Cloud Computing Product Promotion Campaign";
        ContentPipelineContext result = pipeline.generate(topic);
        printFinalResult(result);
    }

    private static void printFinalResult(ContentPipelineContext context) {
        System.out.println("\n[Final result]");
        System.out.println("Content ID: " + context.getContentId());
        System.out.println("Topic: " + context.getTopic());
        System.out.println("Status: " + context.getStatus());
        System.out.println("\n[Content quality]");
        System.out.println("Overall score: " +
            String.format("%.2f", context.getQualityAssurance().getOverallQualityScore()));
        System.out.println("Ready for publish: " +
            (context.getQualityAssurance().isReadyForPublish() ? "✓ yes" : "✗ no"));
        System.out.println("\n[Content info]");
        System.out.println("Total length: " + context.getContentWriting().getWordCount());
        System.out.println("SEO score: " +
            String.format("%.2f", context.getSeoOptimization().getSeoScore()));
    }
}
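The Mock* dependencies used above (MockSearchService, MockGrammarChecker, and so on) are not shown in this chapter, so you would need to supply stubs to run the demo. As a hedged sketch of what such a stub could look like, here is a fixed-response grammar checker; the interface is inferred from the call site `grammarChecker.check(fullContent)`, so both the names and the shape are assumptions, not the real types:

```java
import java.util.List;

// Hypothetical sketch of a stub grammar checker. The real GrammarChecker /
// MockGrammarChecker types are not shown in this chapter, so the interface
// below is an assumption inferred from how the skill calls it.
public class MockGrammarCheckerSketch {
    interface Checker {
        List<String> check(String content); // returns suggestion descriptions
    }

    // A fixed-response stub: flags doubled spaces as a trivial example rule.
    static class StubChecker implements Checker {
        @Override
        public List<String> check(String content) {
            return content.contains("  ")
                ? List.of("remove duplicated whitespace")
                : List.of();
        }
    }

    public static void main(String[] args) {
        Checker checker = new StubChecker();
        System.out.println(checker.check("a  b")); // [remove duplicated whitespace]
        System.out.println(checker.check("a b"));  // []
    }
}
```

Stubs like this keep the demo deterministic and free of external service calls, which is exactly what you want while debugging the pipeline wiring itself.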
Sample run output (the total time and overall score below are the sum and weighted score of the per-step figures shown):
======================================================================
Content Generation Pipeline - Full Demo
======================================================================
----------------------------------------------------------------------
Scenario 1: Generate AI industry news
----------------------------------------------------------------------
======================================================================
Starting content generation pipeline
Content ID: content_a1b2c3d4
Topic: Applications and Challenges of AI in Finance
======================================================================
[Step 1/7] Running topic analysis...
✓ Topic analysis finished (234ms)
 - Keywords: [AI, finance, applications, challenges, risk, regulation]
 - Target audiences: [finance professionals, tech enthusiasts, management]
[Step 2/7] Running material collection...
✓ Material collection finished (1245ms)
 - References: 28
 - Average relevance: 0.876
[Step 3/7] Running outline generation...
✓ Outline generation finished (412ms)
 - Sections: 5
 - Logical flow: [Introduction, Background, Key Points, Case Studies, Conclusion]
[Step 4/7] Running content writing...
✓ Content writing finished (3421ms)
 - Total length: 2847
 - Completeness: 95%
[Step 5/7] Running editing and polishing...
✓ Editing finished (876ms)
 - Issues fixed: 8
 - Readability: 0.85
[Step 6/7] Running SEO optimization...
✓ SEO optimization finished (567ms)
 - SEO score: 0.78
 - Optimized title: Deep Applications of AI in Finance and Risk Management
[Step 7/7] Running quality assessment...
✓ Quality assessment finished (234ms)
 - Overall score: 0.91
 - Ready for publish: yes
======================================================================
Pipeline finished
Total time: 6989ms
[Pipeline statistics]
Per-step durations:
  Step 1: topic_analysis = 234ms
  Step 2: content_collection = 1245ms
  Step 3: outline_generation = 412ms
  Step 4: content_writing = 3421ms
  Step 5: content_editing = 876ms
  Step 6: seo_optimization = 567ms
  Step 7: quality_assurance = 234ms
[Content quality]
  Completeness check: ✓ passed
  Readability score: 0.85
  Accuracy check: ✓ passed
  Overall score: 0.91
  Recommended for publishing: ✓ yes
======================================================================
[Final result]
Content ID: content_a1b2c3d4
Topic: Applications and Challenges of AI in Finance
Status: COMPLETED
[Content quality]
Overall score: 0.91
Ready for publish: ✓ yes
[Content info]
Total length: 2847
SEO score: 0.78