feat(okr): integrate Doubao AI to auto-analyze Git commits into OKRs

Use the Doubao LLM to analyze git commit messages and, per repository,
automatically create, update, and mark-complete OKRs for each committer:

- Add ai_analyzed_commits table for incremental marking; each commit is analyzed only once
- Add source and sourceKey fields to objectives/keyResults to distinguish AI-generated from manually created records
- Extend keyResults.status to support a completed state
- Add llm-client.ts wrapping the Doubao Ark API (native fetch, zero dependencies)
- Add okr-ai-sync.ts core service: group by repo → build prompt → call AI → execute actions
- Scheduler triggers AI analysis automatically after Git sync (gated by the AI_ENABLED flag)
- Add POST /api/okr/ai-analyze manual-trigger and preview endpoints
- Three layers of duplicate protection: commit SHA marking + sourceKey dedup + project OKR context

Verified: full analysis of 501 commits produced 37 Objectives and 164 Key Results;
the incremental dedup mechanism works (a repeat invocation returns 0 actions).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
zyc 2026-04-27 13:29:36 +08:00
parent 7cd8bc1b9b
commit e1396b1479
11 changed files with 1173 additions and 8 deletions

AI-OKR-SYNC-PLAN.md (new file)

@@ -0,0 +1,421 @@
# AI-Driven OKR Auto-Generation and Sync
## Context
DevPerf already has a complete Git sync pipeline (Gitea → gitCommits table) and an OKR CRUD system.
OKRs currently have to be created by hand and have no link to Git commits.
**Goal**: after every Git sync, the AI analyzes the new commit messages per repository and can:
- Create a KR — a commit mentions a feature that has no corresponding KR yet
- Update progress — a commit touches a feature covered by an existing KR
- Mark complete — a commit explicitly states that a feature is finished
**Analysis granularity**: per repository (one AI analysis over each repo's new commits; the prompt includes committer info)
**OKR ownership**: the commit author = the KR/Objective owner (ownerId), linked via gitCommits.userId
**Incremental mechanism**: each commit is analyzed exactly once; after analysis it is marked, so later syncs skip it
---
## Phase 1: Schema changes
**File**: `backend/src/db/schema.ts`
### 1a. New `ai_analyzed_commits` table (incremental marker, prevents re-analysis)
```
aiAnalyzedCommits:
  id         varchar(50)  PK
  commitSha  varchar(200) NOT NULL UNIQUE  -- matches gitCommits.sha
  batchId    varchar(50)  NOT NULL         -- batch ID for one AI invocation
  createdAt  datetime     NOT NULL
```
**Purpose**: after each sync, only commits in `gitCommits` that do not yet appear in this table are sent to the AI. The marker row is written as soon as analysis finishes, so a commit is never processed twice.
### 1b. New field on objectives
- `source` varchar(50) default `'manual'` — values: `'manual'` | `'ai_generated'`
### 1c. New fields on keyResults
- `source` varchar(50) default `'manual'` — values: `'manual'` | `'ai_generated'`
- `sourceKey` varchar(300) nullable — AI-assigned feature identifier (e.g. `"user-login"`), used for semantic dedup
### 1d. keyResults.status extension
Existing values are `'active' | 'paused' | 'cancelled'`; add `'completed'`.
The column is a varchar, so no DDL change is needed — only `recalcObjectiveProgress` must handle `completed` correctly.
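The roll-up that 1d requires can be sketched as follows. This is an illustrative re-sketch under assumptions — the function name `computeObjectiveProgress`, the weight-averaged formula, and the exclusion of cancelled KRs are ours; the real logic lives in `recalcObjectiveProgress` in okr.ts and may weigh things differently:

```typescript
// Hedged sketch: a 'completed' KR counts as 100% regardless of currentValue,
// 'cancelled' KRs are excluded, and the rest contribute currentValue/targetValue,
// weight-averaged across the Objective's KRs.
type KRStatus = 'active' | 'paused' | 'cancelled' | 'completed';

interface KRSnapshot {
  currentValue: number;
  targetValue: number;
  weight: number;
  status: KRStatus;
}

export function computeObjectiveProgress(krs: KRSnapshot[]): number {
  const counted = krs.filter(kr => kr.status !== 'cancelled');
  const totalWeight = counted.reduce((sum, kr) => sum + kr.weight, 0);
  if (totalWeight === 0) return 0;
  const weighted = counted.reduce((sum, kr) => {
    const pct = kr.status === 'completed'
      ? 100
      : Math.min(100, (kr.currentValue / kr.targetValue) * 100);
    return sum + pct * kr.weight;
  }, 0);
  return Math.round(weighted / totalWeight);
}
```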
### 1e. syncLogs.source enum extension
Change `mysqlEnum('source', ['plane', 'gitea'])` to `['plane', 'gitea', 'ai_okr']`
---
## Phase 2: AI configuration and LLM client (Doubao)
### 2a. Environment variables (`backend/src/config.ts`)
Add:
```typescript
AI_ENABLED: z.coerce.boolean().default(false),
AI_API_KEY: z.string().default(''),
AI_MODEL: z.string().default('doubao-seed-2-0-pro-260215'),
AI_BASE_URL: z.string().default('https://ark.cn-beijing.volces.com/api/v3'),
```
**.env example**
```env
AI_ENABLED=true
AI_API_KEY=<your-ark-api-key>
AI_MODEL=doubao-seed-2-0-pro-260215
AI_BASE_URL=https://ark.cn-beijing.volces.com/api/v3
```
### 2b. New file `backend/src/services/llm-client.ts`
Uses the Doubao (Volcengine Ark) API, implemented with native fetch — no extra dependencies.
**Doubao API call format**
```typescript
// POST https://ark.cn-beijing.volces.com/api/v3/chat/completions
// Header: Authorization: Bearer {AI_API_KEY}
// Header: Content-Type: application/json
export async function callLLM(systemPrompt: string, userPrompt: string): Promise<string> {
  const url = `${config.AI_BASE_URL}/chat/completions`;
  const body = {
    model: config.AI_MODEL, // "doubao-seed-2-0-pro-260215"
    messages: [
      { role: "system", content: systemPrompt },
      { role: "user", content: userPrompt }
    ],
    temperature: 0.3, // low temperature for stable output
    response_format: { type: "json_object" } // force JSON output
  };
  const response = await fetch(url, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${config.AI_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(body),
    signal: AbortSignal.timeout(60000), // 60 s timeout
  });
  if (!response.ok) {
    throw new Error(`Doubao API error: ${response.status} ${await response.text()}`);
  }
  const data = await response.json();
  return data.choices[0].message.content;
}
```
**Key points**
- The Doubao Ark API is compatible with the OpenAI `/chat/completions` format
- `response_format: { type: "json_object" }` forces JSON output
- 60 s timeout; retry once on failure (3 s interval)
- `temperature: 0.3` keeps the analysis results stable and consistent
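The timeout-and-retry bullet generalizes to a small wrapper. This is a sketch only — the shipped llm-client.ts inlines the loop rather than using a helper, and the name `withRetry` is ours:

```typescript
// Generic "retry once after a delay" wrapper matching the policy above:
// run fn, and on failure wait delayMs and try again, up to `retries` extra times.
export async function withRetry<T>(
  fn: () => Promise<T>,
  retries = 1,
  delayMs = 3000,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < retries) {
        await new Promise(resolve => setTimeout(resolve, delayMs));
      }
    }
  }
  throw lastError;
}
```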
---
## Phase 3: Core service — `backend/src/services/okr-ai-sync.ts`
### Overall flow
```
analyzeCommitsForOKR()
├─ 1. gatherUnanalyzedCommits()
│      SELECT gc.* FROM git_commits gc
│      LEFT JOIN ai_analyzed_commits aac ON gc.sha = aac.commit_sha
│      WHERE aac.id IS NULL
│      → group by repoName (one batch per repository)
├─ 2. for each repository's unanalyzed commits:
│      a. resolve the repo's projectId via projectRepos
│      b. load the project's current Objectives + KRs (incl. sourceKey)
│      c. build the AI prompt: the repo's new commits (with each commit's author info) + the project's existing OKRs
│      d. call Doubao AI → parse the JSON response
│      e. execute the returned actions
│         - create_objective → ownerId = the committer's userId (returned by the AI)
│         - create_kr → check sourceKey first → create only if absent, owned by the committer
│         - update_progress → update currentValue + write a krLog
│         - complete_kr → status='completed' + write a krLog
│      f. recalculate Objective progress
│      g. mark this batch of commits as analyzed (insert into aiAnalyzedCommits)
└─ 3. write syncLogs (source='ai_okr')
```
### Incremental update mechanism (core)
```
First sync:
  gitCommits: [A, B, C, D, E]
  aiAnalyzedCommits: []
  → unanalyzed: [A, B, C, D, E] → send all to the AI → mark [A, B, C, D, E]
Second sync (F and G added):
  gitCommits: [A, B, C, D, E, F, G]
  aiAnalyzedCommits: [A, B, C, D, E]
  → unanalyzed: [F, G] → send only F and G → mark [F, G]
Third sync (nothing new):
  → unanalyzed: [] → skip; no AI call
```
### AI prompt design
**System prompt**:
```
You are the OKR management assistant for a development team. Your job is to analyze git commit records and manage the project's OKRs (Objectives and Key Results).
Analyze the commits and output operation instructions as JSON.
Decision logic:
- A commit with completion semantics such as "完成", "done", "finished", "实现完毕" → mark the corresponding KR complete
- A feat/feature commit that is not yet finished → update the matching KR's progress (estimate a percentage from the description)
- A commit touching a feature with no existing KR → create a new KR (also provide its dates)
- fix/refactor/chore/docs commits → act only when a specific feature is clearly involved
- Do not create duplicate KRs for the same feature; if a KR with the same sourceKey exists, update it instead
Rules for new Objectives:
- If a feature does not belong to any existing Objective → you may create a new Objective
- You decide the Objective's startDate and endDate from the commit contents and feature complexity
- period has the format "YYYY-Qn" (e.g. "2026-Q2"), derived from endDate
Rules for new KRs:
- targetValue = 100, unit = "%"
- Estimate currentValue from commit semantics (initial implementation = 30, mostly done = 70, complete = 100)
- sourceKey uses lowercase hyphenated English (e.g. "user-login", "payment-module")
- title is written in Chinese
- You decide startDate and endDate from feature complexity and commit timestamps
```
**User prompt template**:
```
Repository: {{repoName}}
Project: {{projectName}}
Current date: {{today}}
Existing OKRs for this project:
{{#each existingObjectives}}
Objective: id={{id}}, title="{{title}}", period={{period}}, dates={{startDate}}~{{endDate}}, progress={{progress}}%
  Key Results:
{{#each keyResults}}
  - id: {{id}}, title: "{{title}}", sourceKey: "{{sourceKey}}", status: {{status}}, progress: {{currentValue}}/{{targetValue}}, dates: {{startDate}}~{{endDate}}
{{/each}}
{{/each}}
{{if no existing OKRs: "This project has no OKR records yet."}}
Developers in this repository (commit author → system user mapping):
{{#each authors}}
- userId: "{{userId}}", name: {{displayName}}
{{/each}}
New commits in this repository (sorted by time; all incremental — earlier ones were already processed):
{{#each commits}}
- [{{committedAt}}] author: {{authorName}}(userId={{userId}}) {{first 7 chars of sha}}: {{message}} (+{{additions}}/-{{deletions}})
{{/each}}
Analyze the commits above and return JSON. Every action must carry an ownerId field — the committer's userId — indicating who owns the OKR.
{
  "actions": [
    {
      "type": "create_objective",
      "ownerId": "committer's userId",
      "title": "objective title",
      "startDate": "YYYY-MM-DD",
      "endDate": "YYYY-MM-DD",
      "reasoning": "why this objective should be created",
      "keyResults": [
        { "title": "...", "sourceKey": "...", "currentValue": number, "startDate": "YYYY-MM-DD", "endDate": "YYYY-MM-DD" }
      ]
    },
    {
      "type": "create_kr",
      "ownerId": "committer's userId",
      "objectiveId": "existing objective id",
      "title": "...",
      "sourceKey": "...",
      "currentValue": number,
      "startDate": "YYYY-MM-DD",
      "endDate": "YYYY-MM-DD",
      "reasoning": "..."
    },
    {
      "type": "update_progress",
      "krId": "existing KR id",
      "newCurrentValue": number,
      "reasoning": "..."
    },
    {
      "type": "complete_kr",
      "krId": "existing KR id",
      "reasoning": "..."
    }
  ],
  "summary": "one-sentence summary of this repository's recent development activity"
}
If there is nothing to do, return {"actions": [], "summary": "..."}
```
### Duplicate prevention (three layers)
| Layer | Mechanism | Notes |
|------|------|------|
| **Commit level** | `aiAnalyzedCommits` table | analyzed SHAs are marked and filtered out on the next sync, never sent to the AI again |
| **KR level** | `sourceKey` uniqueness check | checked against the DB before creation; only one sourceKey per Objective — duplicates become updates |
| **Objective level** | existing project OKRs passed in the prompt | the AI sees the full context and avoids proposing duplicate objectives |
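The KR-level rule can be expressed as a small decision helper. This is a hypothetical shape — the shipped code inlines the same check inside executeActions rather than using a named function:

```typescript
// Decide what a create_kr action should do given the Objective's existing KRs:
// if a KR with the same sourceKey already exists, convert the action to an update.
interface ExistingKR {
  id: string;
  sourceKey: string | null;
}

export function resolveCreateKR(
  existing: ExistingKR[],
  sourceKey: string,
): { kind: 'create' } | { kind: 'update'; krId: string } {
  const dup = existing.find(kr => kr.sourceKey === sourceKey);
  return dup ? { kind: 'update', krId: dup.id } : { kind: 'create' };
}
```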
### OKR ownership
- A commit has a `userId` (mapped to a system user via author-matching)
- The AI prompt includes each commit's committer info (userId + displayName)
- Every action the AI returns must carry `ownerId` — the committer who owns that OKR
- Execution: `create_objective` sets `ownerId` to the AI-specified userId; `create_kr` likewise
- Commits whose `userId` is null (author not mapped to a system user) are skipped
**Example**: repo rtc_backend has two committers, Zhang San and Li Si
- Zhang San's commits touch "user login" → the AI creates a KR owned by Zhang San
- Li Si's commits touch "payment module" → the AI creates a KR owned by Li Si
- A single Objective can contain KRs with different owners
### Dates (decided by the AI)
- **No longer pinned to the current quarter**; the AI decides from:
  - the commit timestamps
  - the feature's complexity (inferred from commit contents)
  - the date ranges of existing OKRs (to avoid conflicts)
- The returned `startDate`/`endDate` are written directly to the Objective and KR
- `period` is derived automatically from endDate (reusing the existing `dateToPeriod()`)
- After creation, an Objective's date range is AI-decided and may be adjusted by the AI in later analyses
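For illustration, deriving `period` from an end date might look like the sketch below. This is an assumption-level re-sketch — the real `dateToPeriod()` lives in `backend/src/services/okr.ts` and its exact implementation may differ:

```typescript
// Sketch of dateToPeriod(): "YYYY-MM-DD" → "YYYY-Qn".
// A bare date string parses as UTC midnight, so UTC accessors are used
// to avoid local-timezone off-by-one errors near quarter boundaries.
export function dateToPeriod(dateStr: string): string {
  const d = new Date(dateStr);
  const quarter = Math.floor(d.getUTCMonth() / 3) + 1;
  return `${d.getUTCFullYear()}-Q${quarter}`;
}
```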
### Completion determination
When the AI returns a `complete_kr` action:
1. `currentValue = targetValue` (i.e. 100)
2. `status = 'completed'`
3. Write a krLog: `action='completed'`, `detail='AI determined complete from commit {sha}: {reasoning}'`
4. `recalcObjectiveProgress()` counts completed KRs as 100%
---
## Phase 4: Integration with the sync flow
### 4a. Modify `backend/src/sync/scheduler.ts`
```typescript
import { analyzeCommitsForOKR } from '../services/okr-ai-sync';
// run after syncGitea()
giteaJob = new Cron('0 2,19 * * *', async () => {
  await syncGitea();
  if (config.AI_ENABLED) {
    console.info('[SCHEDULER] Starting AI OKR analysis...');
    await analyzeCommitsForOKR().catch(e =>
      console.error('[SCHEDULER] AI OKR analysis failed:', e)
    );
  }
});
```
### 4b. New API endpoints (`backend/src/routes/okr.ts`)
| Method | Path | Notes |
|------|------|------|
| POST | `/api/okr/ai-analyze` | manually trigger AI analysis (admin/manager) |
| POST | `/api/okr/ai-analyze/preview` | preview mode: return the AI's suggestions without executing them |
### 4c. Modify `backend/src/services/okr.ts`
- Export `dateToPeriod()` for okr-ai-sync to use
- Handle `status === 'completed'` in `recalcObjectiveProgress()` (counted as 100% progress)
---
## Phase 5: DB migration
New migration file `backend/drizzle/0002_add_ai_okr_fields.sql`:
```sql
-- New table: incremental marker (already-analyzed commits are never reprocessed)
CREATE TABLE ai_analyzed_commits (
  id VARCHAR(50) PRIMARY KEY,
  commit_sha VARCHAR(200) NOT NULL,
  batch_id VARCHAR(50) NOT NULL,
  created_at DATETIME NOT NULL,
  UNIQUE INDEX uniq_analyzed_sha (commit_sha)
);
-- objectives: add column
ALTER TABLE objectives ADD COLUMN source VARCHAR(50) DEFAULT 'manual';
-- key_results: add columns
ALTER TABLE key_results ADD COLUMN source VARCHAR(50) DEFAULT 'manual';
ALTER TABLE key_results ADD COLUMN source_key VARCHAR(300) NULL;
-- sync_logs: extend the source enum
ALTER TABLE sync_logs MODIFY COLUMN source ENUM('plane', 'gitea', 'ai_okr') NOT NULL;
```
---
## Key file list
| File | Action | Notes |
|------|------|------|
| `backend/src/db/schema.ts` | modify | new table + new columns |
| `backend/src/config.ts` | modify | Doubao AI environment variables |
| `backend/src/services/llm-client.ts` | **new** | Doubao API client wrapper |
| `backend/src/services/okr-ai-sync.ts` | **new** | core AI analysis + incremental update logic |
| `backend/src/services/okr.ts` | modify | export dateToPeriod; handle completed status |
| `backend/src/sync/scheduler.ts` | modify | trigger AI analysis after sync |
| `backend/src/routes/okr.ts` | modify | new manual-trigger endpoints |
| `backend/drizzle/0002_*.sql` | **new** | DB migration |
---
## Implementation order
1. Schema changes + migration file (Phases 1 and 5)
2. Doubao AI environment variables in config.ts (Phase 2a)
3. llm-client.ts — Doubao API wrapper (Phase 2b)
4. Small okr.ts changes (export dateToPeriod + completed handling)
5. okr-ai-sync.ts core service (Phase 3)
6. scheduler.ts integration + new API endpoints (Phase 4)
7. End-to-end testing
---
## Verification plan
1. Configure `.env`: `AI_ENABLED=true`, `AI_API_KEY=...`, `AI_MODEL=doubao-seed-2-0-pro-260215`
2. Make sure the database already has projects, users, project-repo bindings, and git commit data
3. Call `POST /api/okr/ai-analyze/preview` to inspect the AI's proposed actions (no DB writes)
4. Once the actions look reasonable, call `POST /api/okr/ai-analyze` to execute them
5. Verify via `GET /api/okr` that OKR data was generated and the AI-chosen dates are sensible
6. Call `POST /api/okr/ai-analyze` again and confirm nothing is generated twice (incremental marking works)
7. Trigger a Git sync manually to pull new commits, analyze again, and confirm only the new commits are processed
8. Check that krLogs contains the AI operation records
9. Check that the frontend OKR page renders correctly
---
## Error handling
| Scenario | Handling |
|------|------|
| AI_ENABLED=false or empty API key | skip silently; Git sync is unaffected |
| Doubao API returns non-JSON | try to extract the ```json block with a regex; on failure record a syncLogs error |
| Doubao API timeout (60 s) | log the error; the commits stay "unanalyzed" and are retried on the next sync |
| AI suggests updating a nonexistent KR | skip the action and log a warning |
| AI suggests a KR with a duplicate sourceKey | convert to update_progress |
| The commit's repo is not bound to any project | skip analysis for that repo |
| A repo has no new commits | skip; no AI call |

@@ -0,0 +1,20 @@
-- AI Analyzed Commits table (incremental marker, prevents re-analysis)
CREATE TABLE IF NOT EXISTS `ai_analyzed_commits` (
`id` varchar(50) NOT NULL PRIMARY KEY,
`commit_sha` varchar(200) NOT NULL,
`batch_id` varchar(50) NOT NULL,
`created_at` datetime NOT NULL
);
--> statement-breakpoint
CREATE UNIQUE INDEX `uniq_analyzed_sha` ON `ai_analyzed_commits` (`commit_sha`);
--> statement-breakpoint
-- objectives: add source column
ALTER TABLE `objectives` ADD COLUMN `source` varchar(50) DEFAULT 'manual';
--> statement-breakpoint
-- key_results: add source and source_key columns
ALTER TABLE `key_results` ADD COLUMN `source` varchar(50) DEFAULT 'manual';
--> statement-breakpoint
ALTER TABLE `key_results` ADD COLUMN `source_key` varchar(300) NULL;
--> statement-breakpoint
-- sync_logs: extend the source enum
ALTER TABLE `sync_logs` MODIFY COLUMN `source` enum('plane','gitea','ai_okr') NOT NULL;

@@ -8,6 +8,20 @@
"when": 1775707049155,
"tag": "0000_grey_anita_blake",
"breakpoints": true
},
{
"idx": 1,
"version": "6",
"when": 1775707049200,
"tag": "0001_add_user_project_permissions",
"breakpoints": true
},
{
"idx": 2,
"version": "6",
"when": 1777430400000,
"tag": "0002_add_ai_okr_fields",
"breakpoints": true
}
]
}

@@ -23,6 +23,12 @@ const envSchema = z.object({
ADMIN_EMAIL: z.string().email().default('admin@jasonqiyuan.com'),
ADMIN_PASSWORD: z.string().min(6).default('Admin123!'),
// AI (Doubao / Volcengine Ark)
AI_ENABLED: z.coerce.boolean().default(false),
AI_API_KEY: z.string().default(''),
AI_MODEL: z.string().default('doubao-seed-2-0-pro-260215'),
AI_BASE_URL: z.string().default('https://ark.cn-beijing.volces.com/api/v3'),
});
function loadConfig() {

@@ -154,6 +154,7 @@ export const objectives = mysqlTable('objectives', {
startDate: varchar('start_date', { length: 50 }),
endDate: varchar('end_date', { length: 50 }),
progress: double('progress').default(0),
source: varchar('source', { length: 50 }).default('manual'), // manual / ai_generated
createdAt: datetime('created_at').notNull(),
updatedAt: datetime('updated_at').notNull(),
}, (table) => ({
@@ -170,11 +171,13 @@ export const keyResults = mysqlTable('key_results', {
currentValue: double('current_value').default(0),
unit: varchar('unit', { length: 100 }),
weight: double('weight').default(1),
status: varchar('status', { length: 100 }).default('active'), // active / paused / cancelled
status: varchar('status', { length: 100 }).default('active'), // active / paused / cancelled / completed
startDate: varchar('start_date', { length: 50 }),
endDate: varchar('end_date', { length: 50 }),
linkedPlaneCycleId: varchar('linked_plane_cycle_id', { length: 200 }),
linkedPlaneModuleId: varchar('linked_plane_module_id', { length: 200 }),
source: varchar('source', { length: 50 }).default('manual'), // manual / ai_generated
sourceKey: varchar('source_key', { length: 300 }), // AI-assigned feature identifier, used for semantic dedup
createdAt: datetime('created_at').notNull(),
updatedAt: datetime('updated_at').notNull(),
}, (table) => ({
@@ -233,9 +236,19 @@ export const userProjectPermissions = mysqlTable('user_project_permissions', {
// ── Sync Logs ──
export const syncLogs = mysqlTable('sync_logs', {
id: varchar('id', { length: 50 }).primaryKey(),
source: mysqlEnum('source', ['plane', 'gitea']).notNull(),
source: mysqlEnum('source', ['plane', 'gitea', 'ai_okr']).notNull(),
status: mysqlEnum('status', ['success', 'error']).notNull(),
message: text('message'),
recordsProcessed: int('records_processed').default(0),
syncedAt: datetime('synced_at').notNull(),
});
// ── AI Analyzed Commits (incremental marker, prevents re-analysis) ──
export const aiAnalyzedCommits = mysqlTable('ai_analyzed_commits', {
id: varchar('id', { length: 50 }).primaryKey(),
commitSha: varchar('commit_sha', { length: 200 }).notNull(),
batchId: varchar('batch_id', { length: 50 }).notNull(),
createdAt: datetime('created_at').notNull(),
}, (table) => ({
shaIdx: uniqueIndex('uniq_analyzed_sha').on(table.commitSha),
}));

@@ -71,4 +71,5 @@ console.info(`DevPerf Dashboard API starting on port ${port}`);
export default {
port,
fetch: app.fetch,
idleTimeout: 120, // AI analysis can take a while
};

@@ -4,6 +4,7 @@ import { z } from 'zod';
import { requireRole } from '../middleware/role';
import { getAllowedProjectIds } from '../services/permissions';
import * as okrService from '../services/okr';
import { analyzeCommitsForOKR } from '../services/okr-ai-sync';
export const okrRoutes = new Hono();
@@ -149,6 +150,24 @@ okrRoutes.get('/okr/key-results/:id/logs', async (c) => {
return c.json({ code: 0, data: logs, message: 'success' });
});
// POST /api/okr/ai-analyze — manually trigger AI analysis
okrRoutes.post('/okr/ai-analyze',
requireRole('admin', 'manager'),
async (c) => {
const result = await analyzeCommitsForOKR(false);
return c.json({ code: 0, data: result, message: 'success' });
}
);
// POST /api/okr/ai-analyze/preview — preview mode (no DB writes)
okrRoutes.post('/okr/ai-analyze/preview',
requireRole('admin', 'manager'),
async (c) => {
const result = await analyzeCommitsForOKR(true);
return c.json({ code: 0, data: result, message: 'success' });
}
);
// DELETE /api/okr/objectives/:id
okrRoutes.delete('/okr/objectives/:id',
requireRole('admin'),

@@ -0,0 +1,78 @@
import { config } from '../config';
/**
 * Doubao (Volcengine Ark) LLM API client.
 * Compatible with the OpenAI /chat/completions format.
 */
export async function callLLM(systemPrompt: string, userPrompt: string): Promise<string> {
if (!config.AI_API_KEY) {
throw new Error('AI_API_KEY is not configured');
}
const url = `${config.AI_BASE_URL}/chat/completions`;
const body = {
model: config.AI_MODEL,
messages: [
{ role: 'system', content: systemPrompt },
{ role: 'user', content: userPrompt },
],
temperature: 0.3,
};
let lastError: Error | null = null;
// retry at most once
for (let attempt = 0; attempt < 2; attempt++) {
try {
const response = await fetch(url, {
method: 'POST',
headers: {
'Authorization': `Bearer ${config.AI_API_KEY}`,
'Content-Type': 'application/json',
},
body: JSON.stringify(body),
signal: AbortSignal.timeout(120000), // 120 s — the Doubao model can respond slowly
});
if (!response.ok) {
const errorText = await response.text();
throw new Error(`Doubao API error ${response.status}: ${errorText}`);
}
const data = await response.json();
return data.choices[0].message.content;
} catch (err) {
lastError = err instanceof Error ? err : new Error(String(err));
console.warn(`[LLM] Attempt ${attempt + 1} failed: ${lastError.message}`);
if (attempt === 0) {
await new Promise(resolve => setTimeout(resolve, 3000));
}
}
}
throw lastError!;
}
/**
 * Parse an LLM response as JSON.
 * Falls back to extracting a ```json code block.
 */
export function parseLLMJson<T = any>(text: string): T {
// try a direct parse first
try {
return JSON.parse(text);
} catch {
// try to extract a ```json ... ``` code block
const match = text.match(/```json\s*([\s\S]*?)```/);
if (match) {
return JSON.parse(match[1].trim());
}
// try the first { ... } block
const braceMatch = text.match(/\{[\s\S]*\}/);
if (braceMatch) {
return JSON.parse(braceMatch[0]);
}
throw new Error(`Failed to parse LLM response as JSON: ${text.substring(0, 200)}`);
}
}

@@ -0,0 +1,578 @@
import { v4 as uuid } from 'uuid';
import { eq, isNull, sql, inArray } from 'drizzle-orm';
import { db } from '../db/index';
import {
gitCommits, aiAnalyzedCommits, projectRepos, projects,
objectives, keyResults, krLogs, syncLogs, users,
} from '../db/schema';
import { callLLM, parseLLMJson } from './llm-client';
import { dateToPeriod, recalcObjectiveProgress } from './okr';
import dayjs from 'dayjs';
// ── Types ──
interface AIAction {
type: 'create_objective' | 'create_kr' | 'update_progress' | 'complete_kr';
ownerId?: string;
objectiveId?: string;
krId?: string;
title?: string;
sourceKey?: string;
currentValue?: number;
newCurrentValue?: number;
startDate?: string;
endDate?: string;
reasoning?: string;
keyResults?: {
title: string;
sourceKey: string;
currentValue: number;
startDate?: string;
endDate?: string;
}[];
}
interface AIResponse {
actions: AIAction[];
summary: string;
}
interface CommitGroup {
repoName: string;
projectId: string;
projectName: string;
commits: {
sha: string;
authorName: string | null;
userId: string | null;
message: string | null;
additions: number;
deletions: number;
committedAt: Date;
}[];
}
// ── System Prompt ──
const SYSTEM_PROMPT = `You are the OKR management assistant for a development team. Your job is to analyze git commit records and manage the project's OKRs (Objectives and Key Results).
Analyze the commits and output operation instructions as JSON.
Decision logic:
- A commit with completion semantics such as "完成", "done", "finished", "实现完毕" → mark the corresponding KR complete
- A feat/feature commit that is not yet finished → update the matching KR's progress (estimate a percentage from the description)
- A commit touching a feature with no existing KR → create a new KR (also provide its dates)
- fix/refactor/chore/docs commits → act only when a specific feature is clearly involved
- Do not create duplicate KRs for the same feature; if a KR with the same sourceKey exists, update it instead
Rules for new Objectives:
- If a feature does not belong to any existing Objective → you may create a new Objective
- You decide the Objective's startDate and endDate from the commit contents and feature complexity
- period has the format "YYYY-Qn" (e.g. "2026-Q2"), derived from endDate
Rules for new KRs:
- targetValue = 100, unit = "%"
- Estimate currentValue from commit semantics (initial implementation = 30, mostly done = 70, complete = 100)
- sourceKey uses lowercase hyphenated English (e.g. "user-login", "payment-module")
- title is written in Chinese
- You decide startDate and endDate from feature complexity and commit timestamps
Every action must carry an ownerId field (the committer's userId) indicating who owns the OKR.`;
// ── Core Functions ──
/**
 * Gather unanalyzed commits, grouped by repoName.
 */
async function gatherUnanalyzedCommits(): Promise<CommitGroup[]> {
// all SHAs that were already analyzed
const analyzed = await db.select({ sha: aiAnalyzedCommits.commitSha }).from(aiAnalyzedCommits);
const analyzedSet = new Set(analyzed.map(a => a.sha));
// load all commits (only those with a userId, committed on/after the cutoff date)
const cutoffDate = new Date('2026-01-01T00:00:00');
const allCommits = await db.select().from(gitCommits);
const unanalyzed = allCommits.filter(c =>
c.userId && !analyzedSet.has(c.sha) && new Date(c.committedAt) >= cutoffDate
);
if (unanalyzed.length === 0) return [];
// projectRepos mapping: repoName → projectId
const bindings = await db.select().from(projectRepos);
const repoToProject = new Map<string, string>();
for (const b of bindings) {
// support multiple repoName formats
const name = extractRepoName(b.repoName);
repoToProject.set(name, b.projectId);
}
// project names
const allProjects = await db.select().from(projects);
const projectMap = new Map(allProjects.map(p => [p.id, p.name]));
// group by repoName
const groups = new Map<string, CommitGroup>();
for (const commit of unanalyzed) {
const projectId = repoToProject.get(commit.repoName);
if (!projectId) continue; // repo not bound to a project; skip
if (!groups.has(commit.repoName)) {
groups.set(commit.repoName, {
repoName: commit.repoName,
projectId,
projectName: projectMap.get(projectId) || commit.repoName,
commits: [],
});
}
groups.get(commit.repoName)!.commits.push({
sha: commit.sha,
authorName: commit.authorName,
userId: commit.userId,
message: commit.message,
additions: commit.additions || 0,
deletions: commit.deletions || 0,
committedAt: commit.committedAt,
});
}
// sort each group's commits by time
for (const group of groups.values()) {
group.commits.sort((a, b) => new Date(a.committedAt).getTime() - new Date(b.committedAt).getTime());
}
return Array.from(groups.values());
}
/**
 * Extract a bare repo name from a URL, an "owner/repo" path, or a plain name.
 */
function extractRepoName(input: string): string {
let cleaned = input.trim().replace(/\.git$/, '');
if (cleaned.includes('://')) {
try {
const url = new URL(cleaned);
const parts = url.pathname.split('/').filter(Boolean);
return parts.length >= 2 ? parts[1] : parts[0];
} catch { /* fall through */ }
}
if (cleaned.includes('/')) {
return cleaned.split('/').pop()!;
}
return cleaned;
}
/**
 * Build the per-repository user prompt for the AI.
 */
async function buildUserPrompt(group: CommitGroup): Promise<string> {
// load the project's existing OKRs
const existingObjs = await db.select().from(objectives)
.where(eq(objectives.projectId, group.projectId));
// developer list (unique userIds extracted from the commits)
const authorIds = [...new Set(group.commits.map(c => c.userId).filter(Boolean))] as string[];
const authorUsers = authorIds.length > 0
? await db.select().from(users).where(inArray(users.id, authorIds))
: [];
const userMap = new Map(authorUsers.map(u => [u.id, u.displayName]));
let prompt = `Repository: ${group.repoName}\nProject: ${group.projectName}\nCurrent date: ${dayjs().format('YYYY-MM-DD')}\n\n`;
// developer list
prompt += `Developers in this repository (commit author → system user mapping):\n`;
for (const userId of authorIds) {
prompt += `- userId: "${userId}", name: ${userMap.get(userId) || 'unknown'}\n`;
}
prompt += '\n';
// existing OKRs
if (existingObjs.length > 0) {
prompt += 'Existing OKRs for this project:\n';
for (const obj of existingObjs) {
const owner = obj.ownerId ? userMap.get(obj.ownerId) : null;
prompt += `Objective: id="${obj.id}", title="${obj.title}", owner=${owner || 'unassigned'}, period=${obj.period}, dates=${obj.startDate || '?'}~${obj.endDate || '?'}, progress=${obj.progress || 0}%\n`;
const krs = await db.select().from(keyResults)
.where(eq(keyResults.objectiveId, obj.id));
if (krs.length > 0) {
prompt += ' Key Results:\n';
for (const kr of krs) {
prompt += ` - id: "${kr.id}", title: "${kr.title}", sourceKey: "${kr.sourceKey || ''}", status: ${kr.status}, progress: ${kr.currentValue || 0}/${kr.targetValue}, dates: ${kr.startDate || '?'}~${kr.endDate || '?'}\n`;
}
}
}
} else {
prompt += 'This project has no OKR records yet.\n';
}
// new commits
prompt += `\nNew commits in this repository (sorted by time; all incremental — earlier ones were already processed):\n`;
for (const commit of group.commits) {
const date = dayjs(commit.committedAt).format('YYYY-MM-DD HH:mm');
const sha7 = commit.sha.substring(0, 7);
const name = commit.authorName || 'unknown';
const userId = commit.userId || '?';
prompt += `- [${date}] author: ${name}(userId=${userId}) ${sha7}: ${commit.message || '(no message)'} (+${commit.additions}/-${commit.deletions})\n`;
}
prompt += `\nAnalyze the commits above and return JSON. Every action must carry an ownerId field — the committer's userId — indicating who owns the OKR.
{
"actions": [
{
"type": "create_objective",
"ownerId": "committer's userId",
"title": "objective title",
"startDate": "YYYY-MM-DD",
"endDate": "YYYY-MM-DD",
"reasoning": "why this objective should be created",
"keyResults": [
{ "title": "...", "sourceKey": "...", "currentValue": 30, "startDate": "YYYY-MM-DD", "endDate": "YYYY-MM-DD" }
]
},
{
"type": "create_kr",
"ownerId": "committer's userId",
"objectiveId": "existing objective id",
"title": "...",
"sourceKey": "...",
"currentValue": 30,
"startDate": "YYYY-MM-DD",
"endDate": "YYYY-MM-DD",
"reasoning": "..."
},
{
"type": "update_progress",
"krId": "existing KR id",
"newCurrentValue": 70,
"reasoning": "..."
},
{
"type": "complete_kr",
"krId": "existing KR id",
"reasoning": "..."
}
],
"summary": "one-sentence summary of this repository's recent development activity"
}
If there is nothing to do, return {"actions": [], "summary": "..."}`;
return prompt;
}
/**
 * Execute the actions returned by the AI.
 */
async function executeActions(actions: AIAction[], projectId: string): Promise<number> {
let executedCount = 0;
const now = new Date();
for (const action of actions) {
try {
switch (action.type) {
case 'create_objective': {
if (!action.title || !action.ownerId) break;
const objId = uuid();
// compute the end of the current quarter
const now2 = dayjs();
const currentQuarter = Math.ceil((now2.month() + 1) / 3);
const quarterEnd = dayjs(`${now2.year()}-${currentQuarter * 3}-01`).endOf('month');
const endDate = action.endDate || quarterEnd.format('YYYY-MM-DD');
const period = dateToPeriod(endDate);
await db.insert(objectives).values({
id: objId,
title: action.title,
ownerId: action.ownerId,
projectId,
period,
startDate: action.startDate || dayjs().format('YYYY-MM-DD'),
endDate,
progress: 0,
source: 'ai_generated',
createdAt: now,
updatedAt: now,
});
// create the attached KRs
if (action.keyResults?.length) {
for (const krData of action.keyResults) {
// sourceKey dedup check
if (krData.sourceKey) {
const existing = await db.select().from(keyResults)
.where(eq(keyResults.sourceKey, krData.sourceKey));
if (existing.some(kr => kr.objectiveId === objId)) continue;
}
const krId = uuid();
await db.insert(keyResults).values({
id: krId,
objectiveId: objId,
title: krData.title,
targetValue: 100,
currentValue: krData.currentValue || 0,
unit: '%',
weight: 1,
status: krData.currentValue >= 100 ? 'completed' : 'active',
startDate: krData.startDate || action.startDate,
endDate: krData.endDate || action.endDate,
source: 'ai_generated',
sourceKey: krData.sourceKey || null,
createdAt: now,
updatedAt: now,
});
await addAILog(krId, 'created', `AI auto-created: ${action.reasoning || ''}`);
executedCount++;
}
}
await recalcObjectiveProgress(objId);
executedCount++;
console.info(`[AI-OKR] Created objective: ${action.title}`);
break;
}
case 'create_kr': {
if (!action.objectiveId || !action.title) break;
// verify the objective exists
const obj = await db.query.objectives.findFirst({
where: eq(objectives.id, action.objectiveId),
});
if (!obj) {
console.warn(`[AI-OKR] Objective ${action.objectiveId} not found, skipping create_kr`);
break;
}
// sourceKey dedup
if (action.sourceKey) {
const existing = await db.select().from(keyResults)
.where(eq(keyResults.sourceKey, action.sourceKey));
if (existing.some(kr => kr.objectiveId === action.objectiveId)) {
// a KR with this sourceKey already exists; convert to a progress update
const existingKR = existing.find(kr => kr.objectiveId === action.objectiveId)!;
if (action.currentValue && action.currentValue > (existingKR.currentValue || 0)) {
await db.update(keyResults)
.set({ currentValue: action.currentValue, updatedAt: now })
.where(eq(keyResults.id, existingKR.id));
await addAILog(existingKR.id, 'progress_update',
`AI update (duplicate sourceKey converted to update): ${existingKR.currentValue} → ${action.currentValue}`);
await recalcObjectiveProgress(action.objectiveId);
}
executedCount++;
break;
}
}
const krId = uuid();
await db.insert(keyResults).values({
id: krId,
objectiveId: action.objectiveId,
title: action.title,
targetValue: 100,
currentValue: action.currentValue || 0,
unit: '%',
weight: 1,
status: (action.currentValue || 0) >= 100 ? 'completed' : 'active',
startDate: action.startDate || null,
endDate: action.endDate || null,
source: 'ai_generated',
sourceKey: action.sourceKey || null,
createdAt: now,
updatedAt: now,
});
await addAILog(krId, 'created', `AI auto-created: ${action.reasoning || ''}`);
await recalcObjectiveProgress(action.objectiveId);
executedCount++;
console.info(`[AI-OKR] Created KR: ${action.title}`);
break;
}
case 'update_progress': {
if (!action.krId || action.newCurrentValue === undefined) break;
const kr = await db.query.keyResults.findFirst({
where: eq(keyResults.id, action.krId),
});
if (!kr) {
console.warn(`[AI-OKR] KR ${action.krId} not found, skipping update_progress`);
break;
}
// progress may only move forward, never backward
if (action.newCurrentValue <= (kr.currentValue || 0)) break;
const clampedValue = Math.min(action.newCurrentValue, kr.targetValue);
await db.update(keyResults)
.set({ currentValue: clampedValue, updatedAt: now })
.where(eq(keyResults.id, action.krId));
await addAILog(action.krId, 'progress_update',
`AI progress update: ${kr.currentValue} → ${clampedValue}. ${action.reasoning || ''}`);
await recalcObjectiveProgress(kr.objectiveId);
executedCount++;
console.info(`[AI-OKR] Updated KR progress: ${kr.title} → ${clampedValue}`);
break;
}
case 'complete_kr': {
if (!action.krId) break;
const kr = await db.query.keyResults.findFirst({
where: eq(keyResults.id, action.krId),
});
if (!kr) {
console.warn(`[AI-OKR] KR ${action.krId} not found, skipping complete_kr`);
break;
}
if (kr.status === 'completed') break; // already completed; skip
await db.update(keyResults)
.set({ status: 'completed', currentValue: kr.targetValue, updatedAt: now })
.where(eq(keyResults.id, action.krId));
await addAILog(action.krId, 'completed',
`AI determined complete: ${action.reasoning || ''}`);
await recalcObjectiveProgress(kr.objectiveId);
executedCount++;
console.info(`[AI-OKR] Completed KR: ${kr.title}`);
break;
}
}
} catch (err) {
const msg = err instanceof Error ? err.message : String(err);
console.error(`[AI-OKR] Error executing action ${action.type}: ${msg}`);
}
}
return executedCount;
}
/**
 * Write an AI operation log entry for a KR.
 */
async function addAILog(krId: string, action: string, detail: string) {
await db.insert(krLogs).values({
id: uuid(),
krId,
action,
detail,
operatorId: null,
operatorName: 'AI System',
createdAt: new Date(),
});
}
/**
 * Mark commits as analyzed (insert into ai_analyzed_commits).
 */
async function markCommitsAnalyzed(shas: string[], batchId: string) {
const now = new Date();
for (const sha of shas) {
try {
await db.insert(aiAnalyzedCommits).values({
id: uuid(),
commitSha: sha,
batchId,
createdAt: now,
});
} catch {
// row may already exist (unique constraint); ignore
}
}
}
// ── Main Entry ──
export interface AnalyzeResult {
totalCommits: number;
reposProcessed: number;
actionsExecuted: number;
summaries: { repo: string; summary: string }[];
}
/**
 * Analyze unanalyzed commits and sync OKRs.
 * @param dryRun when true, call the AI but write nothing to the DB
 */
export async function analyzeCommitsForOKR(dryRun = false): Promise<AnalyzeResult> {
const startTime = Date.now();
const batchId = uuid();
const groups = await gatherUnanalyzedCommits();
if (groups.length === 0) {
console.info('[AI-OKR] No unanalyzed commits found, skipping');
return { totalCommits: 0, reposProcessed: 0, actionsExecuted: 0, summaries: [] };
}
let totalCommits = 0;
let actionsExecuted = 0;
const summaries: { repo: string; summary: string }[] = [];
const MAX_COMMITS_PER_BATCH = 30;
for (const group of groups) {
try {
totalCommits += group.commits.length;
// process in batches of at most 30 commits
const batches: typeof group.commits[] = [];
for (let i = 0; i < group.commits.length; i += MAX_COMMITS_PER_BATCH) {
batches.push(group.commits.slice(i, i + MAX_COMMITS_PER_BATCH));
}
for (let batchIdx = 0; batchIdx < batches.length; batchIdx++) {
const batchCommits = batches[batchIdx];
const batchGroup = { ...group, commits: batchCommits };
console.info(`[AI-OKR] Analyzing ${batchCommits.length} commits for repo: ${group.repoName} (batch ${batchIdx + 1}/${batches.length})`);
// 构建 prompt 并调用 AI
const userPrompt = await buildUserPrompt(batchGroup);
const rawResponse = await callLLM(SYSTEM_PROMPT, userPrompt);
const aiResponse = parseLLMJson<AIResponse>(rawResponse);
if (!aiResponse.actions || !Array.isArray(aiResponse.actions)) {
console.warn(`[AI-OKR] Invalid AI response for ${group.repoName}, skipping batch`);
await markCommitsAnalyzed(batchCommits.map(c => c.sha), batchId);
continue;
}
if (batchIdx === batches.length - 1) {
summaries.push({ repo: group.repoName, summary: aiResponse.summary || '' });
}
if (!dryRun) {
const count = await executeActions(aiResponse.actions, group.projectId);
actionsExecuted += count;
await markCommitsAnalyzed(batchCommits.map(c => c.sha), batchId);
} else {
console.info(`[AI-OKR] [DRY RUN] Would execute ${aiResponse.actions.length} actions for ${group.repoName}`);
actionsExecuted += aiResponse.actions.length;
}
}
} catch (err) {
const msg = err instanceof Error ? err.message : String(err);
console.error(`[AI-OKR] Failed to analyze repo ${group.repoName}: ${msg}`);
}
}
// 记录 sync log
const elapsed = Date.now() - startTime;
await db.insert(syncLogs).values({
id: uuid(),
source: 'ai_okr',
status: 'success',
message: `${dryRun ? '[DRY RUN] ' : ''}Analyzed ${totalCommits} commits from ${groups.length} repos, executed ${actionsExecuted} actions in ${elapsed}ms`,
recordsProcessed: actionsExecuted,
syncedAt: new Date(),
});
console.info(`[AI-OKR] Completed: ${totalCommits} commits, ${groups.length} repos, ${actionsExecuted} actions in ${elapsed}ms`);
return { totalCommits, reposProcessed: groups.length, actionsExecuted, summaries };
}
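The loop above batches each repository's commits before calling the LLM. As a standalone sketch of just that grouping-and-slicing step (the `Commit` shape here is hypothetical; the real code works on `gitCommits` rows), it looks like:

```typescript
interface Commit { sha: string; repoName: string; message: string }

const MAX_COMMITS_PER_BATCH = 30;

// Group commits by repository, then slice each group into batches of at most 30,
// mirroring the per-repo batching in analyzeCommitsForOKR
function batchByRepo(commits: Commit[]): Map<string, Commit[][]> {
  const byRepo = new Map<string, Commit[]>();
  for (const c of commits) {
    const list = byRepo.get(c.repoName) ?? [];
    list.push(c);
    byRepo.set(c.repoName, list);
  }
  const batched = new Map<string, Commit[][]>();
  for (const [repo, list] of byRepo) {
    const batches: Commit[][] = [];
    for (let i = 0; i < list.length; i += MAX_COMMITS_PER_BATCH) {
      batches.push(list.slice(i, i + MAX_COMMITS_PER_BATCH));
    }
    batched.set(repo, batches);
  }
  return batched;
}
```

Capping each AI call at 30 commits keeps the prompt within a predictable token budget; the verified run of 501 commits therefore spans multiple calls per repo.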


@@ -68,7 +68,7 @@ export async function getOKRByPeriod(period?: string) {
 /**
  * Convert a date to a quarter period, e.g. 2026-04-15 → "2026-Q2"
  */
-function dateToPeriod(dateStr: string): string {
+export function dateToPeriod(dateStr: string): string {
   const d = new Date(dateStr);
   const year = d.getFullYear();
   const q = Math.ceil((d.getMonth() + 1) / 3);
@@ -256,14 +256,17 @@ async function addKRLog(krId: string, action: string, detail: string | null, ope
   });
 }
 
-async function recalcObjectiveProgress(objectiveId: string) {
+export async function recalcObjectiveProgress(objectiveId: string) {
   const allKRs = await db.select().from(keyResults)
     .where(eq(keyResults.objectiveId, objectiveId));
-  // Only count active and finished KRs; paused and cancelled are excluded
+  // Only count active and completed KRs; paused and cancelled are excluded
   const activeKRs = allKRs.filter(kr => kr.status !== 'paused' && kr.status !== 'cancelled');
   const totalWeight = activeKRs.reduce((sum, k) => sum + (k.weight || 1), 0);
   const weightedProgress = activeKRs.reduce((sum, k) => {
-    const progress = k.targetValue > 0 ? ((k.currentValue || 0) / k.targetValue) * 100 : 0;
+    // completed status counts as 100%
+    const progress = k.status === 'completed'
+      ? 100
+      : (k.targetValue > 0 ? ((k.currentValue || 0) / k.targetValue) * 100 : 0);
     return sum + progress * (k.weight || 1);
   }, 0);
   const objProgress = totalWeight > 0 ? Math.round(weightedProgress / totalWeight) : 0;
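The weighted-progress change in this hunk can be isolated as a pure function for a worked check (the `KR` shape below is a simplified stand-in for the `keyResults` row):

```typescript
type KRStatus = 'active' | 'completed' | 'paused' | 'cancelled';
interface KR { status: KRStatus; currentValue: number; targetValue: number; weight: number }

// Weighted objective progress as in the diff:
// paused/cancelled KRs are excluded, completed KRs count as 100%
function objectiveProgress(krs: KR[]): number {
  const active = krs.filter(k => k.status !== 'paused' && k.status !== 'cancelled');
  const totalWeight = active.reduce((s, k) => s + (k.weight || 1), 0);
  const weighted = active.reduce((s, k) => {
    const p = k.status === 'completed'
      ? 100
      : (k.targetValue > 0 ? ((k.currentValue || 0) / k.targetValue) * 100 : 0);
    return s + p * (k.weight || 1);
  }, 0);
  return totalWeight > 0 ? Math.round(weighted / totalWeight) : 0;
}
```

For example, a completed KR at 3/10 plus an active KR at 50/100 (equal weights) now yields (100 + 50) / 2 = 75%, whereas before the fix the completed KR would have contributed only 30%.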


@@ -1,13 +1,25 @@
 import { Cron } from 'croner';
 import { syncGitea } from './sync-gitea';
+import { analyzeCommitsForOKR } from '../services/okr-ai-sync';
+import { config } from '../config';
 
 let giteaJob: Cron | null = null;
 
+async function runSyncAndAnalyze() {
+  await syncGitea();
+  if (config.AI_ENABLED && config.AI_API_KEY) {
+    console.info('[SCHEDULER] Starting AI OKR analysis...');
+    await analyzeCommitsForOKR().catch(e =>
+      console.error('[SCHEDULER] AI OKR analysis failed:', e)
+    );
+  }
+}
+
 export function startScheduler(): void {
   // Sync once at 02:00 and once at 19:00 every day
   giteaJob = new Cron('0 2,19 * * *', async () => {
     console.info('[SCHEDULER] Scheduled Gitea sync starting...');
-    await syncGitea();
+    await runSyncAndAnalyze();
   });
 
   console.info('[SCHEDULER] Gitea auto-sync started (daily at 02:00 + 19:00)');
@@ -15,7 +27,7 @@ export function startScheduler(): void {
   // Run the initial sync 10 seconds after startup
   setTimeout(async () => {
     console.info('[SCHEDULER] Running initial sync...');
-    await syncGitea().catch(e => console.error('[SCHEDULER] Initial sync failed:', e));
+    await runSyncAndAnalyze().catch(e => console.error('[SCHEDULER] Initial sync failed:', e));
   }, 10000);
 }