22 changes: 21 additions & 1 deletion README.md
@@ -104,7 +104,27 @@ var result = schema.validate(
// result.valid() => true
```

- Compatibility: runs the official 2020‑12 JSON Schema Test Suite on `verify`; in strict mode it currently passes about 71% of applicable cases.
+ Compatibility: runs the official 2020‑12 JSON Schema Test Suite on `verify`; **measured compatibility is 63.3%** (1,153 of 1,822 tests pass) with comprehensive metrics reporting.

### JSON Schema Test Suite Metrics

The validator now provides defensible compatibility statistics:

```bash
# Run with console metrics (default)
mvn verify -pl json-java21-schema

# Export detailed JSON metrics
mvn verify -pl json-java21-schema -Djson.schema.metrics=json

# Export CSV metrics for analysis
mvn verify -pl json-java21-schema -Djson.schema.metrics=csv
```

**Current measured compatibility**:
- **Overall**: 63.3% (1,153 of 1,822 tests pass)
- **Test coverage**: 420 test groups, 1,657 validation attempts
- **Skip breakdown**: 70 unsupported schema groups, 2 test exceptions, 504 lenient mismatches
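With the measured totals above, the one-line summary printed to the console at the end of `verify` looks like this (format taken from the suite's reporter; `failed` stays 0 in lenient mode because mismatches are counted as skips):

```text
JSON-SCHEMA SUITE (LENIENT): groups=420 testsScanned=1822 run=1657 passed=1153 failed=0 skipped={unsupported=70, exception=2, lenientMismatch=504}
```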

## Building

26 changes: 26 additions & 0 deletions json-java21-schema/AGENTS.md
@@ -47,6 +47,32 @@ The project uses `java.util.logging` with levels:
- **JSON Schema Test Suite**: Official tests from json-schema-org
- **Real-world schemas**: Complex nested validation scenarios
- **Performance tests**: Large schema compilation
- **Metrics reporting**: Comprehensive compatibility statistics with detailed skip categorization

### JSON Schema Test Suite Metrics

The integration test now provides defensible compatibility metrics:

```bash
# Run with console metrics (default)
mvnd verify -pl json-java21-schema

# Export detailed JSON metrics
mvnd verify -pl json-java21-schema -Djson.schema.metrics=json

# Export CSV metrics for analysis
mvnd verify -pl json-java21-schema -Djson.schema.metrics=csv
```

**Current measured compatibility** (as of implementation):
- **Overall**: 63.3% (1,153 of 1,822 tests pass)
- **Test coverage**: 420 test groups, 1,657 validation attempts
- **Skip breakdown**: 70 unsupported schema groups, 2 test exceptions, 504 lenient mismatches

The metrics distinguish between:
- **unsupportedSchemaGroup**: Whole groups skipped due to unsupported features (e.g., $ref, anchors)
- **testException**: Individual tests that threw exceptions during validation
- **lenientMismatch**: Expected≠actual results in lenient mode (counted as failures in strict mode)
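The arithmetic behind the headline number can be checked directly from these categories: every executed test either passes or becomes a lenient mismatch, and the headline rate divides passes by all discovered tests. A quick sketch (class name illustrative):

```java
import java.util.Locale;

public class CompatMath {
    public static void main(String[] args) {
        // Totals reported by the suite run above
        int testsDiscovered = 1822, passed = 1153, run = 1657, lenientMismatch = 504;
        // In lenient mode every executed test either passes or is a counted mismatch
        System.out.println(run == passed + lenientMismatch); // prints true
        // Headline compatibility: passes over all discovered tests
        System.out.printf(Locale.ROOT, "%.1f%%%n", 100.0 * passed / testsDiscovered); // prints 63.3%
    }
}
```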

#### Annotation Tests (`JsonSchemaAnnotationsTest.java`)
- **Annotation processing**: Compile-time schema generation
5 changes: 4 additions & 1 deletion json-java21-schema/README.md
@@ -22,7 +22,10 @@ Compatibility and verify

- The module runs the official JSON Schema Test Suite during Maven verify.
- Default mode is lenient: unsupported groups/tests are skipped to avoid build breaks while still logging.
- - Strict mode: enable with -Djson.schema.strict=true to enforce full assertions. In strict mode it currently passes about 71% of applicable cases.
+ - Strict mode: enable with -Djson.schema.strict=true to enforce full assertions.
- **Measured compatibility**: 63.3% (1,153 of 1,822 tests pass in lenient mode)
- **Test coverage**: 420 test groups, 1,657 validation attempts, 70 unsupported schema groups, 2 test exceptions
- Detailed metrics available via `-Djson.schema.metrics=json|csv`
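For reference, `-Djson.schema.metrics=json` writes `target/json-schema-compat.json` with roughly this shape; the totals below are the measured lenient-mode numbers, while the timestamp and the single per-file entry are illustrative placeholders:

```json
{
  "mode": "LENIENT",
  "timestamp": "2025-09-15T12:00:00Z",
  "totals": {
    "groupsDiscovered": 420,
    "testsDiscovered": 1822,
    "validationsRun": 1657,
    "passed": 1153,
    "failed": 0,
    "skipped": {
      "unsupportedSchemaGroup": 70,
      "testException": 2,
      "lenientMismatch": 504
    }
  },
  "perFile": [
    { "file": "type.json", "groups": 11, "tests": 68, "run": 68, "pass": 59,
      "fail": 0, "skipUnsupported": 0, "skipException": 0, "skipMismatch": 9 }
  ]
}
```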

How to run

@@ -5,11 +5,14 @@
import jdk.sandbox.java.util.json.Json;
import org.junit.jupiter.api.DynamicTest;
import org.junit.jupiter.api.TestFactory;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.Assumptions;

import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;
import java.util.stream.Stream;
import java.util.stream.StreamSupport;

@@ -25,6 +28,8 @@ public class JsonSchemaCheckIT {
new File("target/json-schema-test-suite/tests/draft2020-12");
private static final ObjectMapper MAPPER = new ObjectMapper();
private static final boolean STRICT = Boolean.getBoolean("json.schema.strict");
private static final String METRICS_FMT = System.getProperty("json.schema.metrics", "").trim();
private static final SuiteMetrics METRICS = new SuiteMetrics();

@SuppressWarnings("resource")
@TestFactory
@@ -37,6 +42,19 @@ Stream<DynamicTest> runOfficialSuite() throws Exception {
private Stream<DynamicTest> testsFromFile(Path file) {
try {
JsonNode root = MAPPER.readTree(file.toFile());

// Count groups and tests discovered
int groupCount = root.size();
METRICS.groupsDiscovered.add(groupCount);
perFile(file).groups.add(groupCount);

int testCount = 0;
for (JsonNode group : root) {
testCount += group.get("tests").size();
}
METRICS.testsDiscovered.add(testCount);
perFile(file).tests.add(testCount);

return StreamSupport.stream(root.spliterator(), false)
.flatMap(group -> {
String groupDesc = group.get("description").asText();
@@ -55,29 +73,62 @@ private Stream<DynamicTest> testsFromFile(Path file) {
try {
actual = schema.validate(
Json.parse(test.get("data").toString())).valid();

// Count validation attempt
METRICS.validationsRun.increment();
perFile(file).run.increment();
} catch (Exception e) {
String reason = e.getMessage() == null ? e.getClass().getSimpleName() : e.getMessage();
System.err.println("[JsonSchemaCheckIT] Skipping test due to exception: "
+ groupDesc + " — " + reason + " (" + file.getFileName() + ")");

// Count exception skip
METRICS.skipTestException.increment();
perFile(file).skipException.increment();

if (STRICT) throw e;
Assumptions.assumeTrue(false, "Skipped: " + reason);
return; // not reached when strict
}

if (STRICT) {
assertEquals(expected, actual);
try {
assertEquals(expected, actual);
// Count pass in strict mode
METRICS.passed.increment();
perFile(file).pass.increment();
} catch (AssertionError e) {
// Count failure in strict mode
METRICS.failed.increment();
perFile(file).fail.increment();
throw e;
}
} else if (expected != actual) {
System.err.println("[JsonSchemaCheckIT] Mismatch (ignored): "
+ groupDesc + " — expected=" + expected + ", actual=" + actual
+ " (" + file.getFileName() + ")");

// Count lenient mismatch skip
METRICS.skipLenientMismatch.increment();
perFile(file).skipMismatch.increment();

Assumptions.assumeTrue(false, "Mismatch ignored");
} else {
// Count pass in lenient mode
METRICS.passed.increment();
perFile(file).pass.increment();
}
}));
} catch (Exception ex) {
// Unsupported schema for this group; emit a single skipped test for visibility
String reason = ex.getMessage() == null ? ex.getClass().getSimpleName() : ex.getMessage();
System.err.println("[JsonSchemaCheckIT] Skipping group due to unsupported schema: "
+ groupDesc + " — " + reason + " (" + file.getFileName() + ")");

// Count unsupported group skip
METRICS.skipUnsupportedGroup.increment();
perFile(file).skipUnsupported.increment();

return Stream.of(DynamicTest.dynamicTest(
groupDesc + " – SKIPPED: " + reason,
() -> { if (STRICT) throw ex; Assumptions.assumeTrue(false, "Unsupported schema: " + reason); }
@@ -88,4 +139,146 @@ private Stream<DynamicTest> testsFromFile(Path file) {
throw new RuntimeException("Failed to process " + file, ex);
}
}

private static SuiteMetrics.FileCounters perFile(Path file) {
return METRICS.perFile.computeIfAbsent(file.getFileName().toString(), k -> new SuiteMetrics.FileCounters());
}

@AfterAll
static void printAndPersistMetrics() throws Exception {
var strict = STRICT;
var totalRun = METRICS.validationsRun.sum();
var passed = METRICS.passed.sum();
var failed = METRICS.failed.sum();
var skippedU = METRICS.skipUnsupportedGroup.sum();
var skippedE = METRICS.skipTestException.sum();
var skippedM = METRICS.skipLenientMismatch.sum();

System.out.printf(
"JSON-SCHEMA SUITE (%s): groups=%d testsScanned=%d run=%d passed=%d failed=%d skipped={unsupported=%d, exception=%d, lenientMismatch=%d}%n",
strict ? "STRICT" : "LENIENT",
METRICS.groupsDiscovered.sum(),
METRICS.testsDiscovered.sum(),
totalRun, passed, failed, skippedU, skippedE, skippedM
);

if (!METRICS_FMT.isEmpty()) {
var outDir = java.nio.file.Path.of("target");
java.nio.file.Files.createDirectories(outDir);
var ts = java.time.OffsetDateTime.now().toString();
> **Copilot AI** (Sep 15, 2025): The timestamp format from `OffsetDateTime.toString()` may contain characters that are problematic for filenames or JSON values. Consider using a more controlled format like `DateTimeFormatter.ISO_INSTANT` or a custom format.
if ("json".equalsIgnoreCase(METRICS_FMT)) {
var json = buildJsonSummary(strict, ts);
java.nio.file.Files.writeString(outDir.resolve("json-schema-compat.json"), json);
} else if ("csv".equalsIgnoreCase(METRICS_FMT)) {
var csv = buildCsvSummary(strict, ts);
java.nio.file.Files.writeString(outDir.resolve("json-schema-compat.csv"), csv);
}
}
}

private static String buildJsonSummary(boolean strict, String timestamp) {
var totals = new StringBuilder();
totals.append("{\n");
totals.append(" \"mode\": \"").append(strict ? "STRICT" : "LENIENT").append("\",\n");
totals.append(" \"timestamp\": \"").append(timestamp).append("\",\n");
totals.append(" \"totals\": {\n");
totals.append(" \"groupsDiscovered\": ").append(METRICS.groupsDiscovered.sum()).append(",\n");
totals.append(" \"testsDiscovered\": ").append(METRICS.testsDiscovered.sum()).append(",\n");
totals.append(" \"validationsRun\": ").append(METRICS.validationsRun.sum()).append(",\n");
totals.append(" \"passed\": ").append(METRICS.passed.sum()).append(",\n");
totals.append(" \"failed\": ").append(METRICS.failed.sum()).append(",\n");
totals.append(" \"skipped\": {\n");
totals.append(" \"unsupportedSchemaGroup\": ").append(METRICS.skipUnsupportedGroup.sum()).append(",\n");
totals.append(" \"testException\": ").append(METRICS.skipTestException.sum()).append(",\n");
totals.append(" \"lenientMismatch\": ").append(METRICS.skipLenientMismatch.sum()).append("\n");
totals.append(" }\n");
totals.append(" },\n");
totals.append(" \"perFile\": [\n");

var files = new java.util.ArrayList<String>(METRICS.perFile.keySet());
java.util.Collections.sort(files);
var first = true;
for (String file : files) {
var counters = METRICS.perFile.get(file);
if (!first) totals.append(",\n");
first = false;
totals.append(" {\n");
totals.append(" \"file\": \"").append(file).append("\",\n");
totals.append(" \"groups\": ").append(counters.groups.sum()).append(",\n");
totals.append(" \"tests\": ").append(counters.tests.sum()).append(",\n");
totals.append(" \"run\": ").append(counters.run.sum()).append(",\n");
totals.append(" \"pass\": ").append(counters.pass.sum()).append(",\n");
totals.append(" \"fail\": ").append(counters.fail.sum()).append(",\n");
totals.append(" \"skipUnsupported\": ").append(counters.skipUnsupported.sum()).append(",\n");
totals.append(" \"skipException\": ").append(counters.skipException.sum()).append(",\n");
totals.append(" \"skipMismatch\": ").append(counters.skipMismatch.sum()).append("\n");
totals.append(" }");
}
totals.append("\n ]\n");
totals.append("}\n");
> **Copilot AI** (Sep 15, 2025), on lines +180 to +220: Manual JSON string building is error-prone and hard to maintain. Consider using the Jackson `ObjectMapper` (already available as `MAPPER`) to serialize a proper data structure instead of manual string concatenation.
return totals.toString();
}

private static String buildCsvSummary(boolean strict, String timestamp) {
var csv = new StringBuilder();
csv.append("mode,timestamp,groupsDiscovered,testsDiscovered,validationsRun,passed,failed,skipUnsupportedGroup,skipTestException,skipLenientMismatch\n");
csv.append(strict ? "STRICT" : "LENIENT").append(",");
csv.append(timestamp).append(",");
csv.append(METRICS.groupsDiscovered.sum()).append(",");
csv.append(METRICS.testsDiscovered.sum()).append(",");
csv.append(METRICS.validationsRun.sum()).append(",");
csv.append(METRICS.passed.sum()).append(",");
csv.append(METRICS.failed.sum()).append(",");
csv.append(METRICS.skipUnsupportedGroup.sum()).append(",");
csv.append(METRICS.skipTestException.sum()).append(",");
csv.append(METRICS.skipLenientMismatch.sum()).append("\n");

csv.append("\nperFile breakdown:\n");
csv.append("file,groups,tests,run,pass,fail,skipUnsupported,skipException,skipMismatch\n");

var files = new java.util.ArrayList<String>(METRICS.perFile.keySet());
java.util.Collections.sort(files);
for (String file : files) {
var counters = METRICS.perFile.get(file);
csv.append(file).append(",");
csv.append(counters.groups.sum()).append(",");
csv.append(counters.tests.sum()).append(",");
csv.append(counters.run.sum()).append(",");
csv.append(counters.pass.sum()).append(",");
csv.append(counters.fail.sum()).append(",");
csv.append(counters.skipUnsupported.sum()).append(",");
csv.append(counters.skipException.sum()).append(",");
csv.append(counters.skipMismatch.sum()).append("\n");
}
return csv.toString();
}
}

/**
* Thread-safe metrics container for the JSON Schema Test Suite run.
*/
final class SuiteMetrics {
final LongAdder groupsDiscovered = new LongAdder();
final LongAdder testsDiscovered = new LongAdder();

final LongAdder validationsRun = new LongAdder(); // attempted validations
final LongAdder passed = new LongAdder();
final LongAdder failed = new LongAdder();

final LongAdder skipUnsupportedGroup = new LongAdder();
final LongAdder skipTestException = new LongAdder(); // lenient only
final LongAdder skipLenientMismatch = new LongAdder(); // lenient only

final ConcurrentHashMap<String, FileCounters> perFile = new ConcurrentHashMap<>();

static final class FileCounters {
final LongAdder groups = new LongAdder();
final LongAdder tests = new LongAdder();
final LongAdder run = new LongAdder();
final LongAdder pass = new LongAdder();
final LongAdder fail = new LongAdder();
final LongAdder skipUnsupported = new LongAdder();
final LongAdder skipException = new LongAdder();
final LongAdder skipMismatch = new LongAdder();
}
}
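The per-file counters combine `LongAdder` with `ConcurrentHashMap.computeIfAbsent`, which stays correct when JUnit executes dynamic tests concurrently: `computeIfAbsent` is atomic, and `LongAdder` absorbs contended increments without locking. A minimal standalone sketch of the pattern (class name and file names illustrative):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;
import java.util.stream.IntStream;

/** Illustrative standalone demo of the SuiteMetrics counter pattern. */
public class MetricsSketch {

    /** Record n events split across two per-file counters, concurrently. */
    static ConcurrentHashMap<String, LongAdder> simulate(int n) {
        var perFile = new ConcurrentHashMap<String, LongAdder>();
        IntStream.range(0, n).parallel().forEach(i ->
            // computeIfAbsent is atomic and LongAdder.increment() is lock-free,
            // so no external synchronization is needed
            perFile.computeIfAbsent(i % 2 == 0 ? "type.json" : "enum.json",
                                    k -> new LongAdder()).increment());
        return perFile;
    }

    public static void main(String[] args) {
        var counters = simulate(1000);
        System.out.println(counters.get("type.json").sum()); // prints 500
        System.out.println(counters.get("enum.json").sum()); // prints 500
    }
}
```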