# P0.6.2 — Step 3 Execution Packet
Grounded in ../audits/p0-6-2-skill-crud-audit.md and ../contracts/p0-6-2-skill-crud-contract.md. Executable plan for Step 4.
Sigma pre-approved (per dispatch prompt, Wave D). T0 packet-gate waiver applies.
## P1. Change list

### P1.1 New files
| Path | Purpose | Expected LOC |
|---|---|---|
| `src/db/migrations/003_skills.sql` | ε skills table + `greek_letter` index | 15-20 |
| `src/domains/skills/repository.ts` | loader + CRUD + MCP tool registration | 280-350 |
| `src/__tests__/domains/skills/repository.test.ts` | Jest suite | 600-750 |
### P1.2 Modified files
| Path | Change | Expected delta |
|---|---|---|
| `src/db/schema.sql` | Append ε skills ownership comment block | +10-15 lines (comments only, still no executable SQL) |
| `src/startup.ts` | Import `loadSkillsFromDisk`, `registerSkillTools`; add `path` import; add `StartupOptions.skillsRoot`; Phase 2 insertion | +10-15 lines |
### P1.3 Unmodified

`src/server.ts` — unchanged. (Per audit §4c, tool registration is deferred to Phase 2 in `startup.ts`, avoiding the `bootstrap()` 3-way merge risk flagged in the dispatch prompt.)
### P1.4 No deletions, no moves
## P2. Migration SQL — `src/db/migrations/003_skills.sql`
```sql
-- 003_skills — ε Skill Registry table (P0.6.2)
--
-- Owned by ε. Populated at startup by loadSkillsFromDisk() in
-- src/domains/skills/repository.ts. Source of truth is
-- .agents/skills/*/SKILL.md; this table is a searchable index.
--
-- PK is `name` (the skill's kebab-case identifier) per contract §C2 + audit §4e.
-- Synthetic IDs deferred to Phase 1 when rename-tracking becomes a requirement.
--
-- Columns map 1:1 to the P0.6.1 parser output (SkillFrontmatter + ParsedSkill):
--   name/description/version/entrypoint/greek_letter — frontmatter scalars
--   capabilities     — JSON-encoded array of strings
--   body             — raw markdown body
--   source_path      — repo-relative SKILL.md path (POSIX)
--   frontmatter_json — full passthrough-preserving frontmatter JSON
--   loaded_at        — ISO-8601 UTC at insert time
CREATE TABLE skills (
  name             TEXT PRIMARY KEY,
  description      TEXT NOT NULL,
  version          TEXT,
  entrypoint       TEXT,
  capabilities     TEXT NOT NULL DEFAULT '[]',
  greek_letter     TEXT,
  body             TEXT NOT NULL,
  source_path      TEXT NOT NULL,
  frontmatter_json TEXT NOT NULL,
  loaded_at        TEXT NOT NULL
);

CREATE INDEX idx_skills_greek ON skills(greek_letter);
```
## P3. Repository module — `src/domains/skills/repository.ts` skeleton
```ts
/**
 * Colibri — Phase 0 ε Skill Registry repository + `skill_list` MCP tool.
 *
 * Runtime half of the ε Skill Registry surface. Scans .agents/skills/*\/SKILL.md
 * at startup (via parseSkillFile from P0.6.1), upserts rows into the `skills`
 * table, prunes rows whose directory no longer exists on disk, and exposes
 * the single Phase 0 ε MCP tool `skill_list`.
 *
 * This module has NO side effects at import — the loader and tool registration
 * are called imperatively by startup.ts Phase 2.
 *
 * Canonical references:
 * - docs/guides/implementation/task-breakdown.md § P0.6.2
 * - docs/spec/s17-mcp-surface.md §1 Category 3
 * - docs/reference/mcp-tools-phase-0.md §Cat 3
 * - docs/audits/p0-6-2-skill-crud-audit.md
 * - docs/contracts/p0-6-2-skill-crud-contract.md
 * - docs/packets/p0-6-2-skill-crud-packet.md
 */
import * as fs from 'node:fs';
import * as path from 'node:path';
import { z } from 'zod';
import type Database from 'better-sqlite3';
import { registerColibriTool } from '../../server.js';
import type { ColibriServerContext } from '../../server.js';
import {
  SkillSchemaError,
  parseSkillFile,
  type ParsedSkill,
} from './schema.js';

/* -------------------------------------------------------------------------- */
/* Public types                                                               */
/* -------------------------------------------------------------------------- */

export interface SkillRow {
  readonly name: string;
  readonly description: string;
  readonly version: string | null;
  readonly entrypoint: string | null;
  readonly capabilities: readonly string[];
  readonly greek_letter: string | null;
  readonly body: string;
  readonly source_path: string;
  readonly frontmatter_json: string;
  readonly loaded_at: string;
}

export interface ListSkillsFilters {
  readonly search?: string;
  readonly capability?: string;
}

export interface LoadSkillsResult {
  readonly loaded: number;
  readonly skipped: number;
  readonly pruned: number;
  readonly total_on_disk: number;
}

export type Logger = (...args: unknown[]) => void;

/* -------------------------------------------------------------------------- */
/* Internal DB row type (raw shape coming out of better-sqlite3)              */
/* -------------------------------------------------------------------------- */

interface RawSkillDbRow {
  readonly name: string;
  readonly description: string;
  readonly version: string | null;
  readonly entrypoint: string | null;
  readonly capabilities: string; // JSON text
  readonly greek_letter: string | null;
  readonly body: string;
  readonly source_path: string;
  readonly frontmatter_json: string;
  readonly loaded_at: string;
}

/* -------------------------------------------------------------------------- */
/* Helpers                                                                    */
/* -------------------------------------------------------------------------- */

function decodeRow(raw: RawSkillDbRow): SkillRow {
  let capabilities: readonly string[] = [];
  try {
    const parsed = JSON.parse(raw.capabilities) as unknown;
    if (Array.isArray(parsed) && parsed.every((x) => typeof x === 'string')) {
      capabilities = parsed as string[];
    }
  } catch {
    // Malformed JSON → treat as empty. Should not happen if insert path is
    // the only writer, but defense-in-depth for rows loaded from other sources.
    capabilities = [];
  }
  return {
    name: raw.name,
    description: raw.description,
    version: raw.version,
    entrypoint: raw.entrypoint,
    capabilities,
    greek_letter: raw.greek_letter,
    body: raw.body,
    source_path: raw.source_path,
    frontmatter_json: raw.frontmatter_json,
    loaded_at: raw.loaded_at,
  };
}

function toPosix(rel: string): string {
  return rel.split(path.sep).join('/');
}

/* -------------------------------------------------------------------------- */
/* loadSkillsFromDisk                                                         */
/* -------------------------------------------------------------------------- */

export function loadSkillsFromDisk(
  db: Database.Database,
  skillsRoot: string,
  logger: Logger,
): LoadSkillsResult {
  const absRoot = path.resolve(skillsRoot);
  if (!fs.existsSync(absRoot)) {
    logger('[colibri] skills root missing:', absRoot);
    return { loaded: 0, skipped: 0, pruned: 0, total_on_disk: 0 };
  }

  const repoRoot = process.cwd();
  const entries = fs.readdirSync(absRoot, { withFileTypes: true });

  interface Upsert {
    readonly parsed: ParsedSkill;
    readonly source_path: string; // repo-relative, POSIX
  }
  const upserts: Upsert[] = [];
  let skipped = 0;
  let totalOnDisk = 0;

  for (const entry of entries) {
    if (entry.name.startsWith('.')) {
      continue;
    }
    if (!entry.isDirectory()) {
      continue;
    }
    const absSkillMd = path.join(absRoot, entry.name, 'SKILL.md');
    if (!fs.existsSync(absSkillMd)) {
      continue;
    }
    totalOnDisk += 1;
    try {
      const parsed = parseSkillFile(absSkillMd);
      const rel = path.relative(repoRoot, absSkillMd);
      upserts.push({ parsed, source_path: toPosix(rel) });
    } catch (err) {
      const msg =
        err instanceof SkillSchemaError
          ? err.message
          : err instanceof Error
            ? `${err.name}: ${err.message}`
            : String(err);
      logger('[colibri] skill skipped:', `${absSkillMd}:`, msg);
      skipped += 1;
    }
  }

  const loadedAt = new Date().toISOString();
  const keepNames = new Set(upserts.map((u) => u.parsed.frontmatter.name));

  const upsertStmt = db.prepare(`
    INSERT INTO skills (
      name, description, version, entrypoint, capabilities, greek_letter,
      body, source_path, frontmatter_json, loaded_at
    ) VALUES (
      @name, @description, @version, @entrypoint, @capabilities, @greek_letter,
      @body, @source_path, @frontmatter_json, @loaded_at
    )
    ON CONFLICT(name) DO UPDATE SET
      description = excluded.description,
      version = excluded.version,
      entrypoint = excluded.entrypoint,
      capabilities = excluded.capabilities,
      greek_letter = excluded.greek_letter,
      body = excluded.body,
      source_path = excluded.source_path,
      frontmatter_json = excluded.frontmatter_json,
      loaded_at = excluded.loaded_at
  `);

  // Compute pruning set: rows in DB whose `name` is not in keepNames.
  const existingNames = (
    db.prepare('SELECT name FROM skills').all() as Array<{ name: string }>
  ).map((r) => r.name);
  const toPrune = existingNames.filter((n) => !keepNames.has(n));

  let loaded = 0;
  let pruned = 0;

  const tx = db.transaction(() => {
    for (const { parsed, source_path } of upserts) {
      const fm = parsed.frontmatter;
      const capsRaw = fm['capabilities'];
      const caps: string[] = Array.isArray(capsRaw)
        ? capsRaw.filter((x): x is string => typeof x === 'string')
        : [];
      upsertStmt.run({
        name: fm.name,
        description: fm.description,
        version: typeof fm.version === 'string' ? fm.version : null,
        entrypoint: typeof fm.entrypoint === 'string' ? fm.entrypoint : null,
        capabilities: JSON.stringify(caps),
        greek_letter: typeof fm['greekLetter'] === 'string' ? fm['greekLetter'] : null,
        body: parsed.body,
        source_path,
        frontmatter_json: JSON.stringify(fm),
        loaded_at: loadedAt,
      });
      loaded += 1;
    }
    if (toPrune.length > 0) {
      const placeholders = toPrune.map(() => '?').join(',');
      const delStmt = db.prepare(`DELETE FROM skills WHERE name IN (${placeholders})`);
      delStmt.run(...toPrune);
      pruned = toPrune.length;
    }
  });
  tx();

  logger(
    `[colibri] skills loaded: ${String(loaded)}, skipped: ${String(skipped)}, pruned: ${String(pruned)}`,
  );
  return { loaded, skipped, pruned, total_on_disk: totalOnDisk };
}

/* -------------------------------------------------------------------------- */
/* getSkill                                                                   */
/* -------------------------------------------------------------------------- */

export function getSkill(
  db: Database.Database,
  name: string,
): SkillRow | null {
  const raw = db
    .prepare('SELECT * FROM skills WHERE name = ?')
    .get(name) as RawSkillDbRow | undefined;
  return raw === undefined ? null : decodeRow(raw);
}

/* -------------------------------------------------------------------------- */
/* listSkills                                                                 */
/* -------------------------------------------------------------------------- */

export function listSkills(
  db: Database.Database,
  filters: ListSkillsFilters = {},
): readonly SkillRow[] {
  const search = typeof filters.search === 'string' && filters.search.length > 0
    ? `%${filters.search.toLowerCase()}%`
    : null;
  const capability =
    typeof filters.capability === 'string' && filters.capability.length > 0
      ? `%"${filters.capability}"%`
      : null;

  let sql: string;
  let params: unknown[];
  if (search !== null && capability !== null) {
    sql =
      'SELECT * FROM skills WHERE (LOWER(name) LIKE ? OR LOWER(description) LIKE ?) AND capabilities LIKE ? ORDER BY name ASC';
    params = [search, search, capability];
  } else if (search !== null) {
    sql =
      'SELECT * FROM skills WHERE LOWER(name) LIKE ? OR LOWER(description) LIKE ? ORDER BY name ASC';
    params = [search, search];
  } else if (capability !== null) {
    sql = 'SELECT * FROM skills WHERE capabilities LIKE ? ORDER BY name ASC';
    params = [capability];
  } else {
    sql = 'SELECT * FROM skills ORDER BY name ASC';
    params = [];
  }

  const raws = db.prepare(sql).all(...params) as RawSkillDbRow[];
  return raws.map(decodeRow);
}

/* -------------------------------------------------------------------------- */
/* skill_list MCP tool                                                        */
/* -------------------------------------------------------------------------- */

const SkillListInputSchema = z.object({
  search: z.string().optional(),
  capability: z.string().optional(),
});

export function registerSkillTools(
  ctx: ColibriServerContext,
  db: Database.Database,
): void {
  registerColibriTool(
    ctx,
    'skill_list',
    {
      title: 'skill_list',
      description:
        'List all skills loaded from .agents/skills/*/SKILL.md into the ε Skill Registry. Supports optional substring search (name or description) and exact capability filter.',
      inputSchema: SkillListInputSchema,
    },
    ({ search, capability }): {
      skills: Array<{
        name: string;
        version: string | null;
        description: string;
        capabilities: readonly string[];
        greek_letter: string | null;
        path: string;
      }>;
      total_count: number;
    } => {
      const filters: ListSkillsFilters = {
        ...(search !== undefined ? { search } : {}),
        ...(capability !== undefined ? { capability } : {}),
      };
      const rows = listSkills(db, filters);
      const projection = rows.map((r) => ({
        name: r.name,
        version: r.version,
        description: r.description,
        capabilities: r.capabilities,
        greek_letter: r.greek_letter,
        path: r.source_path,
      }));
      return { skills: projection, total_count: projection.length };
    },
  );
}
```
### Notes on the skeleton
- `exactOptionalPropertyTypes: true` → never assign `undefined` to an optional key. The `...(search !== undefined ? { search } : {})` pattern is the canonical workaround (used throughout existing Phase 0 code).
- `noUncheckedIndexedAccess: true` → `fm['capabilities']` returns `unknown`; narrowed via `Array.isArray` and `filter((x): x is string => ...)`.
- No `console.*` calls — all logging goes through the injected `logger`.
- The raw-DB-row interface (`RawSkillDbRow`) exists so `.all()` results can be narrowed without `any`; `decodeRow` converts to the public `SkillRow` shape.
- `SkillSchemaError` is caught distinctly so the log line attributes the error cleanly; generic `Error`s fall back to `name: message`.
## P4. `src/startup.ts` diff

### P4.1 Imports (top of file)
Add:

```ts
import * as path from 'node:path';
```

and:

```ts
import {
  loadSkillsFromDisk,
  registerSkillTools,
} from './domains/skills/repository.js';
```
### P4.2 StartupOptions extension
Add one field to the `StartupOptions` interface, positioned next to the `dbPath` field for symmetry:

```ts
/** Override the skills root directory. Default `path.resolve(process.cwd(), '.agents/skills')`. */
readonly skillsRoot?: string;
```
### P4.3 Phase 2 body — before `return { ctx, db, elapsedMs }`
Current:

```ts
try {
  const db = initDbFn(dbPath);
  const elapsedMs = Math.floor(nowMs() - phase1StartMs);
  logger(`[Startup] Complete in ${elapsedMs}ms`);
  return { ctx, db, elapsedMs };
} catch (err) {
  // ...
```

New:

```ts
try {
  const db = initDbFn(dbPath);

  // Phase 2b — ε Skill Registry: load SKILL.md + register skill_list.
  const skillsRoot =
    options.skillsRoot ?? path.resolve(process.cwd(), '.agents', 'skills');
  loadSkillsFromDisk(db, skillsRoot, logger);
  registerSkillTools(ctx, db);

  const elapsedMs = Math.floor(nowMs() - phase1StartMs);
  logger(`[Startup] Complete in ${elapsedMs}ms`);
  return { ctx, db, elapsedMs };
} catch (err) {
  // ...
```
Rationale: both calls sit inside the same `try` block, so any throw routes to `shutdown('phase-2-failed')`. Order matters: DB init first (so `loadSkillsFromDisk` has something to write to), then the loader, then `registerSkillTools` (so the `skill_list` handler always queries a populated table).
## P5. `src/db/schema.sql` diff
Append to the existing comment block:
```sql
--
-- -----------------------------------------------------------------------------
-- ε Skill Registry (P0.6.2, migration 003_skills.sql)
-- -----------------------------------------------------------------------------
-- Table: skills
-- Purpose: searchable index over the .agents/skills/*/SKILL.md corpus.
-- Source of truth: SKILL.md on disk. The table is rebuilt at every boot
-- via loadSkillsFromDisk() (src/domains/skills/repository.ts). Rows that
-- no longer match a disk file are pruned inside the same transaction.
--
-- PK is `name` (the skill's kebab-case identifier). No synthetic id; no
-- foreign keys reference this table in Phase 0.
--
-- Columns `capabilities` and `frontmatter_json` store JSON-encoded text
-- (SQLite has no array / object type). `capabilities` defaults to '[]'.
```
No executable SQL added — schema.sql remains a shipped documentation asset per P0.2.2 (§header comment).
## P6. Test matrix — `src/__tests__/domains/skills/repository.test.ts`
File header + helper block:
```ts
/**
 * Tests for src/domains/skills/repository.ts (P0.6.2 ε Skill CRUD + Discovery).
 *
 * Test posture:
 * - Temp-file SQLite DB per test (same pattern as db-init.test.ts): the
 *   migration runner does not support `:memory:` by default, and the
 *   temp-file pattern is already proven.
 * - Each test builds a temp .agents-like fixture dir via fs.mkdtempSync
 *   containing N fake SKILL.md files. Cleanup in afterEach via fs.rmSync.
 * - No process.env mutation.
 */
import { randomUUID } from 'node:crypto';
import * as fs from 'node:fs';
import * as os from 'node:os';
import * as path from 'node:path';
import { fileURLToPath } from 'node:url';
import type Database from 'better-sqlite3';
import { z } from 'zod';
import { closeDb, initDb } from '../../../db/index.js';
import {
  getSkill,
  listSkills,
  loadSkillsFromDisk,
  registerSkillTools,
} from '../../../domains/skills/repository.js';
import {
  createServer,
  type AuditSink,
  type ColibriServerContext,
} from '../../../server.js';

const tempDirs: string[] = [];

function makeTempDb(): { db: Database.Database; path: string } {
  const dir = path.join(os.tmpdir(), `colibri-p0-6-2-${randomUUID()}`);
  fs.mkdirSync(dir, { recursive: true }); // ensure parent exists before the DB file is opened
  tempDirs.push(dir);
  const file = path.join(dir, 'test.db');
  const db = initDb(file);
  return { db, path: file };
}

function makeTempSkillsRoot(): string {
  const dir = fs.mkdtempSync(path.join(os.tmpdir(), `colibri-skills-${randomUUID()}-`));
  tempDirs.push(dir);
  return dir;
}

function writeSkill(
  skillsRoot: string,
  name: string,
  frontmatter: string,
  body: string,
): string {
  const skillDir = path.join(skillsRoot, name);
  fs.mkdirSync(skillDir, { recursive: true });
  const file = path.join(skillDir, 'SKILL.md');
  fs.writeFileSync(file, `---\n${frontmatter}---\n${body}`, 'utf8');
  return file;
}

function captureLogger(): { logger: (...a: unknown[]) => void; lines: string[] } {
  const lines: string[] = [];
  return {
    logger: (...a) => {
      lines.push(a.map((x) => (x instanceof Error ? x.message : String(x))).join(' '));
    },
    lines,
  };
}

afterEach(() => {
  closeDb();
  for (const d of tempDirs) {
    try {
      fs.rmSync(d, { recursive: true, force: true });
    } catch {
      /* swallow Windows locks */
    }
  }
  tempDirs.length = 0;
});
```
### P6.1 describe blocks (test matrix ≥ 32 cases)
| describe | it (summary) | Count |
|---|---|---|
| `loadSkillsFromDisk` — happy path | loads N fake skills; returns counters; idempotent rerun; no logger calls for skipped/pruned | 4 |
| `loadSkillsFromDisk` — skip on parse error | broken frontmatter → logger called with `skill skipped`, other skills load, counters correct | 3 |
| `loadSkillsFromDisk` — missing root | nonexistent skillsRoot → counters all zero, warn logged | 1 |
| `loadSkillsFromDisk` — no SKILL.md in dir | a directory without SKILL.md is silently skipped | 1 |
| `loadSkillsFromDisk` — hidden dir | a `.git/`-like dir is silently skipped | 1 |
| `loadSkillsFromDisk` — pruning | pre-seed row absent from fixture → pruned counter, row gone | 2 |
| `loadSkillsFromDisk` — persistence details | capabilities/version/greek_letter/entrypoint round-trip correctly; body preserved; source_path is repo-relative POSIX | 4 |
| `loadSkillsFromDisk` — collision of frontmatter name | two dirs both declaring `name: conflict` → last-writer wins (documented) | 1 |
| `getSkill` — happy | seed row → row returned | 1 |
| `getSkill` — miss | name not in table → null | 1 |
| `getSkill` — empty name | empty string → null (no throw) | 1 |
| `listSkills` — no filter | returns all, ordered by name ASC | 1 |
| `listSkills` — search (name) | substring match against name; case-insensitive | 2 |
| `listSkills` — search (description) | substring match against description | 1 |
| `listSkills` — search (no match) | returns [] | 1 |
| `listSkills` — capability filter | array containment: `"read"` matches skill with `["read"]` but not `["readwrite"]` | 3 |
| `listSkills` — both filters | AND semantics | 2 |
| `listSkills` — empty-string filter | empty filter treated as no-filter | 2 |
| `registerSkillTools` | registers exactly one tool, name `skill_list`, snake_case passes | 2 |
| `registerSkillTools` — duplicate | second call throws | 1 |
| `skill_list` handler — no args → all rows | end-to-end via wrapped handler, envelope shape matches | 1 |
| `skill_list` handler — search filter | narrows list | 1 |
| `skill_list` handler — capability filter | narrows list | 1 |
| `skill_list` handler — invalid input (bad type) | wrong-type search → `INVALID_PARAMS` envelope | 1 |
| `skill_list` middleware conformance | spy audit sink sees enter and exit events | 1 |
| Corpus load test (primary acceptance) | runs `loadSkillsFromDisk` against real `.agents/skills/`; loaded count matches `fs.readdirSync` count of dirs with SKILL.md | 2 |
Total: 42 cases (comfortably above the required 20). All tests hit both branches of every conditional in repository.ts.
### P6.2 Corpus test shape
```ts
describe('corpus load — real .agents/skills', () => {
  const __dirname = path.dirname(fileURLToPath(import.meta.url));
  const REPO_ROOT = path.resolve(__dirname, '..', '..', '..', '..');
  const SKILLS_ROOT = path.join(REPO_ROOT, '.agents', 'skills');

  function countOnDisk(): number {
    const entries = fs.readdirSync(SKILLS_ROOT, { withFileTypes: true });
    return entries
      .filter((e) => !e.name.startsWith('.') && e.isDirectory())
      .filter((e) => fs.existsSync(path.join(SKILLS_ROOT, e.name, 'SKILL.md')))
      .length;
  }

  it('loads every valid SKILL.md', () => {
    const { db } = makeTempDb();
    const cap = captureLogger();
    const result = loadSkillsFromDisk(db, SKILLS_ROOT, cap.logger);
    const onDisk = countOnDisk();
    expect(result.total_on_disk).toBe(onDisk);
    // All known-good skills parse; none of the 22 SKILL.md files in the
    // shipped corpus should fail.
    expect(result.skipped).toBe(0);
    expect(result.loaded).toBe(onDisk);
    expect(result.pruned).toBe(0);
  });

  it('indexes colibri-* and non-colibri skills', () => {
    const { db } = makeTempDb();
    loadSkillsFromDisk(db, SKILLS_ROOT, () => {});
    const rows = listSkills(db);
    const colibri = rows.filter((r) => r.name.startsWith('colibri-'));
    const other = rows.filter((r) => !r.name.startsWith('colibri-'));
    // At R75 baseline: 21 colibri-* + 1 repo-facing-polish = 22.
    expect(colibri.length).toBeGreaterThanOrEqual(21);
    expect(other.length).toBeGreaterThanOrEqual(1);
  });
});
```
If a future PR adds or removes a skill, the counts auto-adjust — the only fixed assertions are that `loaded == onDisk` and `skipped == 0`.
### P6.3 Middleware spy pattern
```ts
function createSpyAuditSink(): {
  sink: AuditSink;
  enters: unknown[];
  exits: unknown[];
} {
  const enters: unknown[] = [];
  const exits: unknown[] = [];
  return {
    sink: { enter: (e) => enters.push(e), exit: (e) => exits.push(e) },
    enters,
    exits,
  };
}

function createLocalCtx(auditSink?: AuditSink): ColibriServerContext {
  return createServer({
    ...(auditSink !== undefined ? { auditSink } : {}),
    version: '0.0.0-test',
    mode: 'FULL',
    installGlobalHandlers: false,
  });
}
```
The handler is invoked via `ctx.server` using the MCP SDK’s internal call path, the same pattern as server.test.ts. No transport is connected — tests use the registered tool’s wrapped handler directly.
## P7. Risks + mitigations
| Risk | Mitigation |
|---|---|
| Migration 003 prefix collision with P0.3.2/P0.7.2 parallel worktrees | Wave D lock assigns disjoint numbers; verified pre-Step 1. No other `003_*.sql` exists. Tests confirm migration runs on a clean temp DB. |
| `bootstrap()` 3-way merge conflict | Avoided by design — P0.6.2 registers in startup.ts Phase 2, not in `bootstrap()` (audit §4c). |
| startup.ts append-conflict with P0.2.4 or P0.7.2 | Both insertions are append-style (right before `return { ctx, db, elapsedMs }`). Git 3-way merge will order them trivially. Each call-site is on its own line. |
| Parallel-worktree file leak (Wave C class) | Step 1 pre-clean verified clean tree. Step 4 will rerun `git status` before commit. |
| `repo-facing-polish` frontmatter breakage | Verified by read in Step 1: minimal valid frontmatter. Loads. |
| ESLint `@typescript-eslint/no-unused-vars` | All imports consumed; `_` prefix rule available if needed. |
| ESLint `consistent-type-imports` | `import type Database from 'better-sqlite3'` (type only) + `import type { ColibriServerContext } ...` kept distinct from runtime imports. |
| `noUncheckedIndexedAccess` on `fm['capabilities']` | Narrowed via `Array.isArray` + `.filter`. |
| `exactOptionalPropertyTypes` on filters object | Use conditional spread pattern. |
| Windows test cleanup flakiness on WAL | `fs.rmSync(..., { force: true })` with swallow, matches db-init.test.ts. |
## P8. Verification plan (what Step 5 runs)
From worktree root:
```sh
npm ci                  # already ran during Step 1 audit prep
npm run lint            # eslint src — clean
npm test -- --coverage  # full suite, focus coverage on new files
npm run build           # tsc — no type errors
```
Expected:
- `npm test` passes the full pre-existing suite (modes, server, config, db, startup, task FSM, skill-schema, trail-schema, smoke, + new repository.test.ts). No regression.
- Coverage on `src/domains/skills/repository.ts`: ≥ 95% branch, ≥ 95% statement/function/line.
- `npm run lint`: zero errors, zero warnings.
- `npm run build`: `dist/domains/skills/repository.js` emitted with a `.d.ts` sibling.
On any failure: stop, fix, re-run. Do not push.
## P9. Commit plan detail
| # | Message | Files |
|---|---|---|
| 1 | `audit(p0-6-2-skill-crud): inventory ε parser + DB + startup + corpus` | docs/audits/p0-6-2-skill-crud-audit.md (shipped c977cf5c) |
| 2 | `contract(p0-6-2-skill-crud): behavioral contract` | docs/contracts/p0-6-2-skill-crud-contract.md (shipped 6611faf6) |
| 3 | `packet(p0-6-2-skill-crud): execution plan` | docs/packets/p0-6-2-skill-crud-packet.md (this) |
| 4 | `feat(p0-6-2-skill-crud): ε skill repository + loader + skill_list MCP tool` | src/db/migrations/003_skills.sql, src/db/schema.sql, src/domains/skills/repository.ts, src/__tests__/domains/skills/repository.test.ts, src/startup.ts |
| 5 | `verify(p0-6-2-skill-crud): test evidence` | docs/verification/p0-6-2-skill-crud-verification.md |
Packet locked. Step 4 (Implement) proceeds.