
Commit bce1db0

refactor(docs): code analysis engine
changes:
  - file: cli_parser.py
    area: analyzer
    modified: [create_parser]
  - file: readme_exporter.py
    area: docs
    modified: [READMEExporter, _generate_readme_content]
stats:
  lines: "+295/-295 (net +0)"
  files: 11
  complexity: "Stable complexity"
1 parent 904e6fe commit bce1db0

File tree

17 files changed: +316 additions, -299 deletions

CHANGELOG.md

Lines changed: 17 additions & 0 deletions

@@ -1,5 +1,22 @@
 ## [Unreleased]
 
+## [0.5.86] - 2026-03-26
+
+### Docs
+- Update README.md
+- Update project/README.md
+- Update project/context.md
+
+### Other
+- Update code2llm/cli_parser.py
+- Update code2llm/exporters/readme_exporter.py
+- Update project/analysis.toon.yaml
+- Update project/calls.mmd
+- Update project/calls.png
+- Update project/flow.mmd
+- Update project/index.html
+- Update project/map.toon.yaml
+
 ## [0.5.85] - 2026-03-26
 
 ### Docs

README.md

Lines changed: 4 additions & 2 deletions

@@ -26,8 +26,8 @@ When you run `code2llm ./ -f all`, the following files are created:
 # Quick health check (TOON format only)
 code2llm ./ -f toon
 
-# Generate all formats (what created these files)
-code2llm ./ -f all
+# Generate all formats + prompt.txt (single-project mode)
+code2llm ./ -f all -o ./project --no-chunk
 
 # LLM-ready context only
 code2llm ./ -f context

@@ -142,8 +142,10 @@ grep -E "^ .*[0-9]{3,}$" project.toon.yaml | sort -t',' -k2 -n -r | head -10
 
 ### `prompt.txt` - Ready-to-Send LLM Prompt
 **Purpose**: Pre-formatted prompt listing all generated files for LLM conversation
+**Generation**: Written when `code2llm` runs with a source path and requests `-f all` (including `--no-chunk`) or `code2logic`
 **Contents**:
 - **Files section**: Lists all existing generated files with descriptions
+- **Source files section**: Highlights important source files such as `cli_exports/orchestrator.py`
 - **Missing section**: Shows which files weren't generated (if any)
 - **Task section**: Instructions for LLM analysis
 - **Requirements section**: Guidelines for suggested changes
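The second hunk's header quotes the README's health-check one-liner. As a quick sketch of what that pipeline does, it can be exercised against an invented `project.toon.yaml` excerpt (the rows below are made up; only the pipeline itself comes from the README):

```shell
# Hypothetical excerpt: two-space-indented "name,line-count" rows,
# mimicking the shape the README's one-liner expects.
cat > /tmp/project.toon.yaml <<'EOF'
  pipeline_detector,506
  html_dashboard,504
  metrics,501
  sample.ts,26
EOF

# Keep indented rows ending in a 3+ digit count, sort by that count
# (numeric, descending), and show the top 10 largest entries.
grep -E "^ .*[0-9]{3,}$" /tmp/project.toon.yaml | sort -t',' -k2 -n -r | head -10
```

Note that `sample.ts,26` is filtered out: the `[0-9]{3,}$` tail only admits entries whose line count has at least three digits.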

VERSION

Lines changed: 1 addition & 1 deletion

@@ -1 +1 @@
-0.5.85
+0.5.86

code2llm/__init__.py

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@
 and entity resolution with multilingual support.
 """
 
-__version__ = "0.5.85"
+__version__ = "0.5.86"
 __author__ = "STTS Project"
 
 # Core analysis components (lightweight, always needed)

code2llm/cli_parser.py

Lines changed: 2 additions & 2 deletions

@@ -14,7 +14,7 @@ def create_parser() -> argparse.ArgumentParser:
     epilog='''
 Examples:
   code2llm ./                          # Default: TOON diagnostics + README
-  code2llm ./ -f all -o ./docs         # All core formats to ./docs/
+  code2llm ./ -f all -o ./docs --no-chunk  # All core formats + prompt.txt to ./docs/
   code2llm ./ -f toon,map,evolution    # Consolidated diagnostics + structure + roadmap
   code2llm ./ -f context               # LLM narrative (context.md)
   code2llm ./ --streaming --strategy deep -f all  # Deep streaming analysis, all core outputs

@@ -43,7 +43,7 @@ def create_parser() -> argparse.ArgumentParser:
   flow         — Data-flow analysis (flow.toon) — legacy, explicit opt-in
   code2logic   — Generate project logic (legacy project.toon) via external code2logic
   project-yaml — Legacy project.yaml export (single source of truth) + generated views
-  all          — Generate core formats (analysis.toon, map.toon.yaml, evolution.toon.yaml, context, mermaid)
+  all          — Generate core formats (analysis.toon, map.toon.yaml, evolution.toon.yaml, context, mermaid) plus prompt.txt in single-project runs
 
 Strategy Options (--strategy):
   quick        — Fast overview, fewer files analyzed
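The diff above only touches `create_parser`'s help text, not its option wiring. As a minimal sketch of a parser shaped like the epilog's examples — the flag names `-f`, `-o`, and `--no-chunk` come from the examples shown, while the `dest` names, defaults, and help strings are assumptions, not the real `code2llm/cli_parser.py`:

```python
import argparse


def create_parser() -> argparse.ArgumentParser:
    # Illustrative stand-in for code2llm's create_parser(); dest names
    # (formats, output, no_chunk) and defaults are invented for this sketch.
    parser = argparse.ArgumentParser(
        prog="code2llm",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog="Examples:\n  code2llm ./ -f all -o ./docs --no-chunk",
    )
    parser.add_argument("path", help="source path to analyze")
    parser.add_argument("-f", "--formats", default="toon",
                        help="comma-separated formats, e.g. all or toon,map,evolution")
    parser.add_argument("-o", "--output", default="./project",
                        help="output directory for generated files")
    parser.add_argument("--no-chunk", action="store_true",
                        help="single-project mode (with -f all, also writes prompt.txt)")
    return parser


args = create_parser().parse_args(["./", "-f", "all", "-o", "./docs", "--no-chunk"])
print(args.formats, args.output, args.no_chunk)
```

`action="store_true"` is why `--no-chunk` takes no value in any of the documented invocations.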

code2llm/exporters/readme_exporter.py

Lines changed: 1 addition & 0 deletions

@@ -233,6 +233,7 @@ def _generate_readme_content(self, project_path: str, output_dir: Path,
 
 ### `prompt.txt` - Ready-to-Send LLM Prompt
 **Purpose**: Pre-formatted prompt listing all generated files for LLM conversation
+**Generation**: Written when `code2llm` runs with a source path and requests `-f all` (including `--no-chunk`) or `code2logic`
 **Contents**:
 - **Files section**: Lists all existing generated files with descriptions
 - **Source files section**: Highlights important source files such as `cli_exports/orchestrator.py`

code2llm/nlp/__init__.py

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@
 with multilingual support and fuzzy matching.
 """
 
-__version__ = "0.5.85"
+__version__ = "0.5.86"
 
 from .pipeline import NLPPipeline
 from .normalization import QueryNormalizer

project/README.md

Lines changed: 1 addition & 0 deletions

@@ -151,6 +151,7 @@ grep -E "^ .*[0-9]{3,}$" project.toon.yaml | sort -t',' -k2 -n -r | head -10
 
 ### `prompt.txt` - Ready-to-Send LLM Prompt
 **Purpose**: Pre-formatted prompt listing all generated files for LLM conversation
+**Generation**: Written when `code2llm` runs with a source path and requests `-f all` (including `--no-chunk`) or `code2logic`
 **Contents**:
 - **Files section**: Lists all existing generated files with descriptions
 - **Source files section**: Highlights important source files such as `cli_exports/orchestrator.py`

project/analysis.toon.yaml

Lines changed: 8 additions & 8 deletions

@@ -1,4 +1,4 @@
-# code2llm | 113f 21433L | python:105,shell:2,java:1 | 2026-03-26
+# code2llm | 113f 21434L | python:105,shell:2,php:1 | 2026-03-26
 # CC̄=4.6 | critical:10/932 | dups:0 | cycles:0
 
 HEALTH[10]:

@@ -17,15 +17,15 @@ REFACTOR[1]:
 1. split 10 high-CC methods (CC>15)
 
 PIPELINES[637]:
-[1] Src [read_readme]: read_readme
+[1] Src [run_benchmark]: run_benchmark → load_previous
 PURITY: 100% pure
-[2] Src [save_report]: save_report
+[2] Src [read_readme]: read_readme
 PURITY: 100% pure
 [3] Src [main]: main → load_file → is_toon_file
 PURITY: 100% pure
-[4] Src [run_benchmark]: run_benchmark → load_previous
+[4] Src [save_report]: save_report
 PURITY: 100% pure
-[5] Src [__init__]: __init__
+[5] Src [main]: main → create_html → get_shield_url
 PURITY: 100% pure
 
 LAYERS:

@@ -36,7 +36,7 @@ LAYERS:
 │ !! pipeline_detector 506L 3C 18m CC=13 ←0
 │ !! html_dashboard 504L 1C 14m CC=7 ←0
 │ !! metrics 501L 1C 27m CC=12 ←0
-│ readme_exporter 494L 1C 7m CC=13 ←0
+│ readme_exporter 495L 1C 7m CC=13 ←0
 │ large_repo 488L 2C 20m CC=9 ←1
 │ mermaid 485L 0C 16m CC=13 ←1
 │ llm_flow 472L 1C 24m CC=14 ←0

@@ -148,9 +148,9 @@ LAYERS:
 
 demo_langs/ CC̄=1.5 ←in:0 →out:0
 │ sample.rs 47L 1C 5m CC=2 ←0
-│ sample.java 47L 2C 7m CC=31
+│ sample.java 47L 2C 4m CC=20
 │ sample.go 46L 0C 4m CC=3 ←0
-│ sample.php 44L 1C 1m CC=10
+│ sample.php 44L 2C 4m CC=22
 │ sample 40L 1C 4m CC=3 ←0
 │ sample.ts 26L 1C 1m CC=1 ←0
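The first hunk of this file changes only the pipe-separated header line. As a throwaway illustration of what that line encodes — the field layout is inferred from this one visible line and may not generalize to other TOON headers:

```python
# Pick apart the TOON header shown in the hunk above:
# project name | file/line totals | per-language file counts | date.
header = "# code2llm | 113f 21434L | python:105,shell:2,php:1 | 2026-03-26"

name, size, langs, date = [part.strip() for part in header.lstrip("# ").split("|")]
files, lines = size.split()  # "113f" and "21434L"

# Turn "python:105,shell:2,php:1" into a dict of file counts per language.
lang_counts = {
    lang: int(count)
    for lang, count in (item.split(":") for item in langs.split(","))
}

print(name, int(files.rstrip("f")), int(lines.rstrip("L")), lang_counts, date)
```

Read this way, the header diff records one extra line of code overall (21433 → 21434) and the reclassification of one demo file from `java` to `php`, matching the `demo_langs/` hunk below it.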
