221 changes: 221 additions & 0 deletions .agents/skills/openfasttrace-spec-driven-development/SKILL.md
@@ -0,0 +1,221 @@
---
name: oft-spec-driven-development
description: Support spec-driven development in projects that use OpenFastTrace, a system-requirements document in `doc/system_requirements.md`, an arc42-style design in `doc/design.md`, and per-issue task plans in `doc/changesets/`. Use when an AI agent must turn a new GitHub issue or Jira ticket into a changeset plan, update traced requirements and design before or alongside code, and keep all implementation and verification work aligned with the project's quality requirements.
---

# OFT Spec-Driven Development

Work from specification to implementation. Treat the requirements, design, and quality documents as the source of truth for both planning and changes.

## Project Contract

Use these repository conventions unless the user explicitly says this project differs:

- System requirements live in `doc/system_requirements.md` and use OpenFastTrace notation.
- Design lives in `doc/design.md` and its referenced arc42 chapters under `doc/design/`.
- Per-issue implementation plans live in `doc/changesets/` using `<issue-number>-<short-kebab-title>.md`.
- Quality requirements are authoritative for code, tests, tracing, and build gates. See `doc/design/quality_requirements.md` or `doc/spec/design/quality_requirements.md`.

## OFT Reference

For OFT syntax, tracing behavior, selective tracing, and common error handling, read the upstream OpenFastTrace skill first:

`https://raw.githubusercontent.com/itsallcode/openfasttrace/refs/heads/main/skills/openfasttrace-skill/SKILL.md`

Do not restate OFT rules from memory when the upstream reference is available. Use it as the normative workflow for:

- specification item syntax and IDs
- `Needs`, `Covers`, forwarding, and coverage tags
- selective tracing and trace interpretation
- diagnosing broken trace links
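
For orientation only, a minimal specification item in OFT's Markdown notation might look like the sketch below. The item ID, covered feature, and wording are hypothetical; treat the upstream skill as the normative reference for the exact syntax:

```md
### CSV Export
`req~csv-export~1`

The tool exports trace results as CSV.

Covers:
* `feat~reporting~1`

Needs: dsn
```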

## Operating Mode

When the user gives you a new issue link, derive the work plan from:

1. the issue or ticket itself
2. `doc/system_requirements.md`
3. `doc/design.md` and the relevant linked design chapters
4. the quality requirements document
5. the current code and tests

Do not jump straight to code. First determine whether the issue changes:

- user-visible behavior
- traced requirements or scenarios
- design decisions, building blocks, runtime behavior, or constraints
- verification expectations from the quality requirements

If the ticket conflicts with the existing specification or design, surface that conflict and repair the documents before or together with implementation.

## Required Workflow

### 1. Read Before Planning

Before drafting or editing a changeset, inspect:

- `doc/system_requirements.md`
- `doc/design.md`
- the specific design chapters relevant to the issue
- `doc/design/quality_requirements.md`
- `doc/changesets/README.md`
- existing changesets that touch the same feature area

Read code only after you understand the spec and design impact.

### 2. Create Or Update The Changeset

For a new issue, create `doc/changesets/<issue-number>-<short-kebab-title>.md`.

Use the established repository style:

- title with tracker reference and short summary
- `Goal`
- `Scope` with explicit in-scope and out-of-scope bullets
- `Design References`
- optional `Strategy` when the issue needs implementation direction
- `Task List`

Keep the task list ordered. Put specification and design work before production-code work unless the issue is strictly internal and does not affect traced artifacts.

### 3. Derive The Plan From The Spec

When building the task list, map the issue onto the requirement hierarchy from the quality requirements:

- `feat`
- `req`
- `scn`
- `dsn`
- `impl`
- `utest` or `itest`

Use that mapping to decide what must change:

- If behavior changes for users, update or add `req` and `scn` items in `doc/system_requirements.md`.
- If architecture or technical behavior changes, update or add `dsn` items and relevant arc42 design sections.
- If a lower layer adds no new information, prefer OFT forwarding instead of redundant new items.
- Ensure each runtime design item covers only one scenario unless forwarding is the better fit.
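
As a sketch of the forwarding rule above, a single forward line lets a layer pass coverage downward without creating a redundant item. The artifact types and the item ID here are hypothetical; check the upstream OFT skill for the exact notation:

```md
dsn --> impl, utest : scn~csv-export~1
```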

### 4. Treat Quality Requirements As Non-Negotiable

Every plan and implementation must satisfy `doc/design/quality_requirements.md`.

Always derive verification tasks from that document, including as applicable:

- clean trace results for the requirement and design artifact types in scope
- tests that follow the naming, assertion, and parameter-validation rules
- coverage expectations
- plugin verification and build checks
- dependency-policy constraints
- static-analysis and security gates

If an issue requires violating a documented quality rule, stop and raise it explicitly instead of silently proceeding.

### 5. Implement In Spec-First Order

Preferred order:

1. requirements updates
2. design updates
3. changeset task-plan update
4. production code
5. tests
6. OFT trace and project verification

For small internal refactors, you may keep the docs unchanged only when the traced requirements and design remain fully accurate.

## Changeset Template

Use this template and adapt it to the issue:

```md
# GH-<number> <title>

## Goal

<What the issue achieves and why it matters.>

## Scope

In scope:

* <planned change>

Out of scope:

* <explicit non-goal>

## Design References

* [System Requirements](<path-to-specs>/system_requirements.md)
* [Quality Requirements](<path-to-specs>/design/quality_requirements.md)
* [<Relevant design chapter>](<path-to-specs>/design/<chapter>.md)

## Strategy

<Only include when useful. Describe the intended implementation direction.>

## Task List

- [ ] Create and checkout a new Git branch `<issue-type>/<issue-number>-<issue-title-lower-kebab-case>`

### Requirements And Design

- [ ] <system requirements update>
- [ ] Stop and ask user for a review of the system requirements
- [ ] <design update>
- [ ] Stop and ask user for a review of the design

### Implementation

- [ ] <production code task>

### Verification

- [ ] <tests derived from quality requirements>
- [ ] Keep the OpenFastTrace trace clean
- [ ] Keep required build and plugin verification tasks green

### Update user documentation

- [ ] Update the end user documentation in README.md

## Version and Changelog Update

- [ ] Raise the version to 0.3.0 (this is a feature release)
- [ ] Write the changelog entry for 0.3.0
```

Keep verification items concrete. Do not leave them as generic "run tests" placeholders when the quality requirements demand more specific checks.

## Issue Intake Rules

When the user provides a tracker link:

- extract the issue number for the changeset filename
- derive a short kebab-case title from the issue title
- summarize the requested outcome in repository terms
- identify impacted requirements, scenarios, design sections, code areas, and tests
- draft the changeset before substantial implementation
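
The filename derivation above can be sketched with ordinary shell tools. The issue number and title are hypothetical placeholders; in practice they come from the tracker link:

```shell
# Hypothetical tracker values; replace with data from the real issue.
issue_number=123
issue_title="Add CSV export for trace results"

# Derive the short kebab-case title: lower-case, squash runs of
# non-alphanumeric characters into hyphens, trim stray hyphens.
kebab_title=$(printf '%s' "$issue_title" \
  | tr '[:upper:]' '[:lower:]' \
  | sed -e 's/[^a-z0-9]\{1,\}/-/g' -e 's/^-\{1,\}//' -e 's/-\{1,\}$//')

echo "doc/changesets/${issue_number}-${kebab_title}.md"
# → doc/changesets/123-add-csv-export-for-trace-results.md
```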

If the issue text is unavailable, ask the user for the issue content or a short summary. Do not invent requirements from the URL alone.

## Editing Rules

- Preserve existing OFT IDs unless the semantics changed.
- When semantics change, revise the item thoughtfully instead of churning IDs arbitrarily.
- Keep requirements user-facing and design technical; do not duplicate the same text in both places.
- Prefer precise task items over broad epics in changesets.
- Keep comments in code minimal and let naming carry intent, consistent with the quality requirements.
- Avoid adding dependencies unless a documented design decision approves them.

## Verification Rules

Before closing work, verify at the level the issue changed:

- OFT artifacts stay syntactically valid and traceable.
- Requirement, scenario, design, implementation, and tests remain consistent.
- Automated tests cover the new behavior at the correct level.
- Repository quality gates required by `doc/design/quality_requirements.md` are either run or explicitly called out as not run.

If you cannot run a required verification step, state that clearly in the final report and leave the changeset task unchecked.
1 change: 1 addition & 0 deletions doc/changes/changes.md
@@ -1,5 +1,6 @@
# Changes

* [4.3.1](changes_4.3.1.md)
* [4.3.0](changes_4.3.0.md)
* [4.2.3](changes_4.2.3.md)
* [4.2.2](changes_4.2.2.md)
13 changes: 13 additions & 0 deletions doc/changes/changes_4.3.1.md
@@ -0,0 +1,13 @@
# OpenFastTrace 4.3.1, released 2026-05-11

Code name: Spec-driven Development Skill

## Summary

In this release we added the OpenFastTrace Spec-driven Development Skill. The skill guides AI agents through turning a GitHub issue into a changeset plan that can be reviewed before it gets executed by the user. The process uses a spec-driven development approach backed by an OFT spec.

We also moved the skills to `.agents/skills` for best cross-vendor compatibility.

## Documentation

* Added changeset skill
4 changes: 4 additions & 0 deletions doc/spec/design.md
@@ -1074,6 +1074,10 @@ Covers:

Needs: impl

# Quality Requirements

See [Quality Requirements](design/quality_requirements.md).

# Design Decisions

## How do we Implement the Command Line Interpreter
94 changes: 94 additions & 0 deletions doc/spec/design/quality_requirements.md
@@ -0,0 +1,94 @@
# Quality Requirements

This chapter documents architecture-relevant quality requirements and technical quality goals.

User-facing acceptance scenarios are defined in [System Requirements](../system_requirements.md).

Terms such as `plugin`, `OpenFastTrace`, and `OFT` use the definitions from [System Requirements](../system_requirements.md).

## Requirement Quality

We have the following requirement hierarchy in this project:

1. `feat`: Top level feature as you would find on a product sheet (limited number)
2. `req`: High level requirement
3. `scn`: Given-when-then scenario
4. `dsn`: Design requirement
   `constr`: Technical constraint
5. `impl`: Implementation
   `utest`: Unit test
   `itest`: Integration test

Runtime design requirements (`dsn`) can only cover one scenario (`scn`) at a time. Use OFT's forwarding notation when a technical requirement adds no new information to a user scenario.
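
A `dsn` item following this rule, covering exactly one scenario and delegating downward, could look like the hypothetical sketch below (the IDs and wording are illustrative, not taken from this project's spec):

```md
### CSV Export Writer
`dsn~csv-export-writer~1`

The report writer emits one CSV row per traced link.

Covers:
* `scn~csv-export~1`

Needs: impl, utest
```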

## Code Quality

1. Production code and test code follow clean-code principles.
2. The implementation prefers speaking names over explanatory comments. Comments are only acceptable when intent cannot be expressed clearly in code.
3. Abbreviations are only used where they are guaranteed common (meaning everyone knows them); otherwise all symbol names are written out.
4. Methods avoid side effects where possible.
5. Methods stay short and keep cyclomatic complexity low. When behavior grows beyond a small, readable unit, the design is refactored into smaller collaborating types or methods.
6. When the cyclomatic complexity is too high, use extract method refactoring.
7. APIs (not implementations) are documented with JavaDoc: interfaces, abstract classes, methods, parameters, return values, side effects (if any), and purpose.
8. All objects that can be, are immutable.
9. The implementation prefers static object allocation over dynamic allocation where possible.
10. All method parameters are final. Output parameters are only allowed when they are used in external libraries.

### Test Code Guideline

1. Test method names follow the given-when-then naming principle.
2. Method parameters are final.
3. The project uses Hamcrest matchers.
4. If possible each test method has only a single assert.
5. If multiple asserts are necessary and the latter asserts are not a follow-up symptom of the previous ones, the asserts must be wrapped in `assertAll`.
6. When exceptions are asserted, then both type and message are validated.
7. Where possible similar tests are bundled into parameterized tests (e.g., when multiple variants of input are tested against the same implementation method).
8. Prefer parameterized tests over tests that exercise different scenarios of the same method under test with multiple asserts.
9. Parameter validation is tested with multiple valid and invalid inputs. Testing valid and invalid input is done in separate test methods.

## Dependency Policy

The plugin uses the minimum set of dependencies required for:

* building and running a JetBrains plugin
* integrating the OpenFastTrace library

Additional libraries are not allowed by default. Any new third-party dependency requires an explicit design decision and approval before it is added to the build.

The Gradle dependency verification metadata in `gradle/verification-metadata.xml` is committed to source control. Maintainers update and review this metadata whenever dependency declarations, Gradle plugin versions, the IntelliJ Platform version, or other build inputs change the resolved dependency artifacts. The standard update command is `./gradlew --write-verification-metadata sha256 help`.

Generated local IntelliJ Platform Ivy metadata for bundled plugin and module artifacts is trusted without checksum verification because the IntelliJ Platform Gradle Plugin can generate environment-specific Ivy descriptors for the local IntelliJ Platform artifact repository. This exception applies only to `ivy-*.xml` descriptors in the `bundledPlugin` and `bundledModule` groups. Resolved JAR artifacts remain checksum verified.

## Static Analysis And Security Gates

Static code analysis runs in SonarQube Cloud and acts as a build breaker. A failing quality gate blocks integration until the reported issues are resolved or an approved exception exists.

Dependency vulnerability monitoring runs through GitHub Dependabot instead of the Gradle build. Dependabot alerts provide the repository's vulnerability check for known vulnerable dependencies, while local and CI builds no longer run OSS Index as an immediate build-breaking gate.

OpenFastTrace tracing runs as a build breaker for the specification artifacts in scope. The trace stays clean for the requirement and design artifact types used by the project.

The Gradle build applies the latest OpenFastTrace Gradle plugin version approved for the project and executes `traceRequirements` locally and in CI through the standard `check` lifecycle.

## Testability And Coverage

Automated tests use JUnit 5 together with Hamcrest matchers.

IntelliJ Platform light fixture tests may keep a JUnit 4 compatibility dependency when JetBrains test base classes still require the legacy `TestCase` API.

Path coverage across the code base stays at or above 80%. Coverage below that threshold fails the build unless a documented exception is accepted in advance.

The architecture favors testable units with clear boundaries so the coverage target can be met without relying mainly on brittle UI-level tests.

Automated plugin testing follows the IntelliJ Platform test strategy and uses real platform components in a headless environment instead of extensive mocking. Most plugin behavior is verified as model-level functional tests against test data files and expected results.

Plugin tests prefer light platform tests whenever possible because they reuse the project setup and run faster. Tests for plugin logic without Java-specific PSI use fixtures or base classes equivalent to `BasePlatformTestCase`. Heavier project-scoped tests are reserved for behavior that requires a fresh project, multiple modules, or project-level services that cannot be covered with light tests.

Editor-facing features such as syntax highlighting, inspections, annotators, references, and navigation use the IntelliJ test fixtures and test data conventions. Highlighting tests store expected results in test files and verify them with the platform highlighting fixture support.

End-to-end IDE startup and workflow checks use dedicated integration tests with the IntelliJ Platform Starter and Driver infrastructure. These tests run from a separate integration test task against the built plugin distribution, verify that the plugin loads into the target IDE, and cover only a small number of critical workflows because the UI driver APIs remain slower and more brittle than model-level tests.

Continuous integration runs plugin-specific verification tasks in addition to ordinary automated tests. The build verifies plugin packaging and descriptor validity and runs IntelliJ Plugin Verifier checks against the supported IDE builds to detect binary compatibility problems early.

## Platform Compatibility

The required Java version follows the standard Java requirement of the targeted JetBrains platform version. The project does not introduce a language level or bytecode target that exceeds that platform requirement.
2 changes: 1 addition & 1 deletion parent/pom.xml
@@ -10,7 +10,7 @@
<description>Free requirement tracking suite</description>
<url>https://github.com/itsallcode/openfasttrace</url>
<properties>
<revision>4.3.0</revision>
<revision>4.3.1</revision>
<java.version>17</java.version>
<junit.version>6.1.0-M1</junit.version>
<junit.version>6.0.3</junit.version>