Compare commits


97 commits
v2.1.0 ... main

Author SHA1 Message Date
Jozef Izso
ee446707ff
Merge pull request #692 from dorny/release/v2.3.0
2025-11-30 01:52:48 +01:00
Jozef Izso
fe45e95373
test-reporter release v2.3.0 2025-11-30 01:49:30 +01:00
Jozef Izso
e40a1da745
Merge pull request #682 from dorny/dependabot/npm_and_yarn/reports/mocha/multi-f14266366f 2025-11-30 01:01:42 +01:00
dependabot[bot]
3445860437
Bump js-yaml and mocha in /reports/mocha
Bumps [js-yaml](https://github.com/nodeca/js-yaml) to 4.1.1 and updates ancestor dependency [mocha](https://github.com/mochajs/mocha). These dependencies need to be updated together.


Updates `js-yaml` from 4.0.0 to 4.1.1
- [Changelog](https://github.com/nodeca/js-yaml/blob/master/CHANGELOG.md)
- [Commits](https://github.com/nodeca/js-yaml/compare/4.0.0...4.1.1)

Updates `mocha` from 8.3.0 to 11.7.5
- [Release notes](https://github.com/mochajs/mocha/releases)
- [Changelog](https://github.com/mochajs/mocha/blob/v11.7.5/CHANGELOG.md)
- [Commits](https://github.com/mochajs/mocha/compare/v8.3.0...v11.7.5)

---
updated-dependencies:
- dependency-name: js-yaml
  dependency-version: 4.1.1
  dependency-type: indirect
- dependency-name: mocha
  dependency-version: 11.7.5
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-29 23:45:48 +00:00
Jozef Izso
9ef5c136b2
Merge pull request #691 from dorny/fix/complete-documentation 2025-11-30 00:40:18 +01:00
Jozef Izso
83e20c1534
Merge pull request #685 from dorny/dependabot/npm_and_yarn/reports/jest/js-yaml-3.14.2 2025-11-30 00:37:29 +01:00
Jozef Izso
4331a3b620
Clarify the dotnet-nunit docs to require NUnit3TestAdapter for nunit logger 2025-11-23 15:26:03 +01:00
Jozef Izso
04232af26f
Complete documentation for all supported reporters
This commit addresses several documentation gaps to ensure all implemented
reporters are properly documented across action.yml and README.md.

Changes:
1. Updated action.yml description to include all supported languages:
   - Added: Go, Python (pytest, unittest), Ruby (RSpec), Swift

2. Added Ruby/RSpec to supported languages list in README.md

3. Added detailed documentation sections in README.md:
   - dotnet-nunit: Added section with NUnit3 XML format instructions
   - rspec-json: Added section with RSpec JSON formatter configuration

All reporters now have:
- Entry in action.yml description
- Entry in README supported languages list
- Entry in README usage documentation (reporter input)
- Detailed documentation section in README "Supported formats"
- Implementation in src/main.ts
- Tests in __tests__/

This ensures users can discover and use all available reporters without
confusion about what is supported.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-22 18:05:33 +01:00
Jozef Izso
cf146f4036
Merge pull request #690 from dorny/fix/add-golang-json-to-action-yml
2025-11-22 17:50:03 +01:00
Jozef Izso
33fc27cf09
Merge pull request #687 from dorny/dependabot/github_actions/actions/checkout-6 2025-11-22 17:49:02 +01:00
Jozef Izso
8fd5fc58ca
Add missing golang-json reporter to action.yml
The golang-json reporter has been fully implemented since earlier versions
but was missing from the action.yml documentation. This made it undiscoverable
for users looking for Go test support.

Changes:
- Added golang-json to the list of supported reporters in action.yml

This aligns the action.yml with:
- The actual implementation in src/main.ts (lines 264-265)
- The README.md documentation (line 145)
- The existing parser and tests

Fixes #689

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-22 17:47:11 +01:00
dependabot[bot]
fc80cb4400
Bump actions/checkout from 5 to 6
Bumps [actions/checkout](https://github.com/actions/checkout) from 5 to 6.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-21 23:07:16 +00:00
dependabot[bot]
79ea6a9d0e
Bump js-yaml from 3.14.0 to 3.14.2 in /reports/jest
Bumps [js-yaml](https://github.com/nodeca/js-yaml) from 3.14.0 to 3.14.2.
- [Changelog](https://github.com/nodeca/js-yaml/blob/master/CHANGELOG.md)
- [Commits](https://github.com/nodeca/js-yaml/compare/3.14.0...3.14.2)

---
updated-dependencies:
- dependency-name: js-yaml
  dependency-version: 3.14.2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-18 19:47:48 +00:00
Jozef Izso
aef3d726a6
Merge pull request #683 from micmarc/feature/python-pytest
2025-11-15 18:19:24 +01:00
Michael Marcus
c1a56edcfe Enhance pytest support
Add robust test schema for pytest report
Update README with sample pytest command
2025-11-15 11:55:41 -05:00
Jozef Izso
3b9dad208e
Merge pull request #681 from phactum-mnestler/main
Update sax.js to fix large XML file parsing #681
2025-11-15 11:24:15 +01:00
Jozef Izso
7c636a991c
Merge pull request #643 from micmarc/feature/python-support 2025-11-15 11:12:45 +01:00
Michael Nestler
cfce4bda71 Add saxjs to version overrides 2025-11-15 11:07:56 +01:00
Michael Marcus
fe87682515 Improve testing with robust schema for unittest report 2025-11-14 21:59:25 -05:00
Michael Marcus
9b8d3b002e Python support
Add python-xunit-parser.ts with associated case statement
Add python-xunit to reporter docs in action.yml
Add tests
Update README

Resolves #244
Resolves #633
2025-11-14 16:29:58 -05:00
Jozef Izso
e2f0ff6339
Merge pull request #645 from micmarc/fix/report-title-short-summary
2025-11-14 20:00:35 +01:00
Jozef Izso
bc8c29617e
test-reporter release v2.2.0
Merge pull request #679 from dorny/release/v2.2.0
2025-11-14 18:46:03 +01:00
Michael Marcus
9aef9d168f Remove info log 2025-11-14 12:01:42 -05:00
Michael Marcus
6b64465c34 Rebuild index.js after rebase from main 2025-11-14 11:59:46 -05:00
Michael Marcus
6617053f9c Fix short summary formatting when a report title is present 2025-11-14 11:58:16 -05:00
Michael Nestler
43a747d94c Update sax.js to fix large XML file parsing 2025-11-14 16:06:35 +01:00
Jozef Izso
7b7927aa7d
test-reporter release v2.2.0 2025-11-13 21:35:26 +01:00
Jozef Izso
eeac280b8e
Merge pull request #676 from dorny/dependabot/npm_and_yarn/js-yaml-4.1.1
2025-11-13 20:55:51 +01:00
dependabot[bot]
6939db53fb
Bump js-yaml from 4.1.0 to 4.1.1
Bumps [js-yaml](https://github.com/nodeca/js-yaml) from 4.1.0 to 4.1.1.
- [Changelog](https://github.com/nodeca/js-yaml/blob/master/CHANGELOG.md)
- [Commits](https://github.com/nodeca/js-yaml/compare/4.1.0...4.1.1)

---
updated-dependencies:
- dependency-name: js-yaml
  dependency-version: 4.1.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-12 23:17:28 +00:00
Jozef Izso
b3812e0f5b
Merge pull request #664 from pespinel/feature/collapsed-option
2025-11-12 18:03:25 +01:00
Jozef Izso
cd299561e7
tests: refactor input collapsed=auto to individual tests
Generated-by: Claude Sonnet 4.5
2025-11-12 14:42:24 +01:00
Jozef Izso
c7935221e6
feat: add validation for the collapsed input parameter
Generated-by: Claude Sonnet 4.5
2025-11-12 14:42:24 +01:00
Jozef Izso
5fb0582760
chore: run linter to fix code style issues 2025-11-12 14:42:24 +01:00
pespinel
7148297f02
test: fix linter and create tests 2025-11-12 14:42:23 +01:00
pespinel
828632acd0
feat: add collapsed option to control report visibility 2025-11-12 14:42:23 +01:00
Jozef Izso
4a41472ca4
Merge pull request #672 from dorny/bugfix/671-allow_hyphens_in_badge_image_names
2025-11-10 17:14:53 +01:00
Jozef Izso
22dc7b52f4
Rebuild dist/ code 2025-11-05 22:54:32 +01:00
Jozef Izso
bed521d765
Fix badge encoding for values including the _ underscore character 2025-11-05 22:54:32 +01:00
Jozef Izso
6079ce3d17
Add unit tests for getBadge() function to ensure values are encoded for img.shields.io service 2025-11-05 22:54:32 +01:00
Jozef Izso
de77f76b7e
Fix badge image by correctly encoding the URI components 2025-11-05 21:18:09 +01:00
Jozef Izso
c883ae9738
Merge pull request #668 from dorny/feature/update_packages
2025-10-25 11:54:17 +02:00
Jozef Izso
35be98f7e7
Update npm packages to latest minor updates
Update report:
 @types/node  ^20.19.13  →  ^20.19.23
 @vercel/ncc    ^0.38.3  →    ^0.38.4
 jest           ^30.1.3  →    ^30.2.0
 ts-jest        ^29.4.1  →    ^29.4.5
 typescript      ^5.9.2  →     ^5.9.3
2025-10-25 11:30:47 +02:00
Jozef Izso
f372a8338e
Merge pull request #662 from dorny/dependabot/github_actions/actions/setup-node-6 2025-10-25 11:09:08 +02:00
Jozef Izso
948dd03d7b
Merge pull request #665 from dorny/dependabot/github_actions/actions/upload-artifact-5 2025-10-25 11:06:43 +02:00
dependabot[bot]
cf9db500ed
Bump actions/upload-artifact from 4 to 5
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 4 to 5.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](https://github.com/actions/upload-artifact/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-version: '5'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-24 23:05:54 +00:00
dependabot[bot]
ba33405987
Bump actions/setup-node from 5 to 6
Bumps [actions/setup-node](https://github.com/actions/setup-node) from 5 to 6.
- [Release notes](https://github.com/actions/setup-node/releases)
- [Commits](https://github.com/actions/setup-node/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/setup-node
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-14 23:06:32 +00:00
Jozef Izso
34d8269ede
Merge pull request #658 from dorny/feature/github_actions 2025-09-12 13:33:58 +02:00
Jozef Izso
fd1c798d8d
Upgrade actions/setup-node to v5
This action runs using NodeJS 24 and requires GitHub Runner v2.327.1 and newer
https://github.com/actions/setup-node/releases/tag/v5.0.0
2025-09-12 13:16:20 +02:00
Jozef Izso
2211cf1035
Upgrade actions/checkout to v5
This action runs using NodeJS 24 and requires GitHub Runner v2.327.1 and newer
https://github.com/actions/checkout/releases/tag/v5.0.0
2025-09-12 13:15:16 +02:00
Jozef Izso
be3721d54a
Merge pull request #657 from dorny/feature/update_packages 2025-09-12 13:09:48 +02:00
Jozef Izso
d171d89cd4
Update dependencies to latest minor releases 2025-09-12 13:07:41 +02:00
Jozef Izso
661decd3af
Upgrade jest to v30.1.3
https://github.com/jestjs/jest/releases/tag/v30.1.0
2025-09-12 13:03:19 +02:00
Jozef Izso
bd9e36bf0c
Upgrade @types/picomatch to match the picomatch v4 used as main dependency 2025-09-12 13:00:45 +02:00
Jozef Izso
9642942c97
Upgrade @types/jest to match the jest v30 used as main dependency 2025-09-12 13:00:09 +02:00
Jozef Izso
aa953f36f9
Merge pull request #646 from dorny/dependabot/npm_and_yarn/typescript-5.9.2 2025-08-05 12:03:02 +02:00
dependabot[bot]
f686ce916a
Bump typescript from 5.8.3 to 5.9.2
Bumps [typescript](https://github.com/microsoft/TypeScript) from 5.8.3 to 5.9.2.
- [Release notes](https://github.com/microsoft/TypeScript/releases)
- [Changelog](https://github.com/microsoft/TypeScript/blob/main/azure-pipelines.release-publish.yml)
- [Commits](https://github.com/microsoft/TypeScript/compare/v5.8.3...v5.9.2)

---
updated-dependencies:
- dependency-name: typescript
  dependency-version: 5.9.2
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-31 23:43:00 +00:00
Jozef Izso
b14337a039
Merge pull request #632 from dorny/feature/631_jest_v30 2025-07-14 16:09:03 +02:00
Jozef Izso
ec1e910416
Merge pull request #630 from dorny/chore/shadowed_variables 2025-07-14 16:08:51 +02:00
Jozef Izso
353a438514
Merge pull request #637 from dorny/bugfix/610-report-title-not-working 2025-07-14 16:08:33 +02:00
Jozef Izso
dc3a92680f
test-reporter release v2.1.1
Merge pull request #638 from dorny/release/v2.1.1
2025-07-09 16:35:29 +02:00
Jozef Izso
e8e27361af
test-reporter release v2.1.1 2025-07-09 16:15:51 +02:00
Jozef Izso
ec9d9d2459
Merge pull request #623 from 0xced/xunitv3-trx 2025-07-09 16:07:47 +02:00
Jozef Izso
be36461fba
Fix code formatting in the dotnet-trx.tests.ts file 2025-07-09 16:00:13 +02:00
Jozef Izso
9c4a54379f
Upgrade jest to v30
Update `jest` package to v30.0.4
2025-07-09 14:16:56 +02:00
Jozef Izso
07e5c648b5
Define the report-title attribute in action definition
Fixes missing attribute from PR #568
2025-07-09 14:04:01 +02:00
Jozef Izso
4d84da17a1
Upgrade jest to v30 2025-06-29 16:55:36 +02:00
Jozef Izso
1c33c4c823
Rebuild dist/ code 2025-06-29 16:26:12 +02:00
Jozef Izso
eea8b67eb1
Refactor variable names
Fixes error `no-shadow`: Disallow variable declarations from shadowing variables declared in the outer scope.
2025-06-29 16:22:07 +02:00
Jozef Izso
8dd7047bf0
Merge pull request #628 from dorny/chore/update_packages 2025-06-29 12:28:49 +02:00
Jozef Izso
71814ae0cd
Update development dependencies 2025-06-28 11:32:23 +02:00
Cédric Luthi
4128d36b92 Use "Unclassified" when no class name is available
Fixes #556
2025-06-22 20:33:16 +02:00
Cédric Luthi
d1504ea554 Add test on a trx report where the className attribute of TestMethod is missing
This reproduces issue #556
2025-06-22 16:18:52 +02:00
Jozef Izso
18430db883
Merge pull request #615 from dboriichuk/trx-stack-trace-summary
Add stack trace from trx to summary
2025-06-18 13:28:01 +02:00
dboriichuk
ae8bd195f8 Add stack trace to summary 2025-06-18 14:09:49 +03:00
Jozef Izso
a1ac327414
Merge pull request #606 from dorny/bugfix/142-list_failed_tests_only 2025-06-13 13:07:08 +02:00
Jozef Izso
09bbc2665b
Merge pull request #605 from dorny/docs/markdown_linting 2025-06-13 13:06:38 +02:00
Jozef Izso
5456de96b0
Merge pull request #604 from dorny/feature/603_improve_code_quality 2025-06-13 13:06:16 +02:00
Jozef Izso
6adcc0c72a
Compile dist code 2025-06-08 16:44:51 +02:00
Siegfried Pammer
2312e637f3
List only failed tests
Fixes issue #142
2025-06-08 16:44:51 +02:00
Jozef Izso
3a1ec876a9
Improve alternative text of images showing test-reporter generated content 2025-06-08 14:18:39 +02:00
Jozef Izso
c4b9a11207
Generate alternative text for images showing test-reporter generated content
Text was generated by Copilot with GPT-4.1 model.
2025-06-08 14:05:26 +02:00
Jozef Izso
981f52cdc2
Configure permissive markdown linting rules 2025-06-08 14:00:55 +02:00
Jozef Izso
016f16f7b8
Do not lint markdown files in the __tests__ folder 2025-06-08 13:56:11 +02:00
Jozef Izso
6126f49c2c
Use typed arguments in the downloadStream event handlers
Issue #603
2025-06-08 13:21:12 +02:00
Jozef Izso
be2b975095
Use typed WorkflowRunEvent when parsing workflow_run payload
Issue #603
2025-06-08 13:21:12 +02:00
Jozef Izso
a6b3e93884
Merge pull request #588 from OlesGalatsan/features/add-links-to-report 2025-06-08 13:13:59 +02:00
Jozef Izso
223c6cd55b
Compile dist code 2025-06-08 13:09:53 +02:00
Oles Galatsan
b522d19cac
Return links to summary report 2025-06-08 13:09:27 +02:00
Jozef Izso
d56352b96c
Merge pull request #589 from OlesGalatsan/features/add-step-summary-short-summary 2025-06-08 12:59:33 +02:00
Jozef Izso
f078ba5e08
Merge pull request #599 from dorny/chore/update_packages 2025-06-07 13:17:49 +02:00
Jozef Izso
389794c9ad
Update packages to latest minor releases 2025-06-07 12:11:17 +02:00
Oles Galatsan
9934a5fbd4 Merge branch 'features/add-step-summary-short-summary' of github.com:OlesGalatsan/test-reporter into features/add-step-summary-short-summary 2025-05-20 14:46:19 +03:00
Oles Galatsan
0f25185fa5 Rebuild 2025-05-20 14:46:05 +03:00
Oles Galatsan
fb07f1b2a5
Merge branch 'dorny:main' into features/add-step-summary-short-summary 2025-05-20 14:25:01 +03:00
Oles Galatsan
364887ed35 Add short summary for step summary 2025-05-20 12:11:51 +03:00
Jozef Izso
0b4ea9b681
Merge pull request #576 from Vampire/use-if-always 2025-05-19 21:55:05 +02:00
Björn Kautler
302102c9a4 Use if: ${{ !cancelled() }} 2025-05-19 20:02:04 +02:00
68 changed files with 7795 additions and 4286 deletions

View file

@ -21,10 +21,10 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Set Node.js
uses: actions/setup-node@v4
uses: actions/setup-node@v6
with:
node-version-file: '.nvmrc'
@ -46,7 +46,7 @@ jobs:
id: diff
# If index.js was different than expected, upload the expected version as an artifact
- uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v5
if: ${{ failure() && steps.diff.conclusion == 'failure' }}
with:
name: dist

View file

@ -13,8 +13,8 @@ jobs:
name: Build & Test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
- uses: actions/checkout@v6
- uses: actions/setup-node@v6
with:
node-version-file: '.nvmrc'
- run: npm ci
@ -24,8 +24,8 @@ jobs:
- run: npm test
- name: Upload test results
if: success() || failure()
uses: actions/upload-artifact@v4
if: ${{ !cancelled() }}
uses: actions/upload-artifact@v5
with:
name: test-results
path: __tests__/__results__/*.xml

View file

@ -8,14 +8,14 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- run: npm ci
- run: npm run build
- run: npm test
- name: Create test report
uses: ./
if: success() || failure()
if: ${{ !cancelled() }}
with:
name: JEST Tests
path: __tests__/__results__/*.xml

View file

@ -11,7 +11,7 @@ jobs:
name: Workflow test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- uses: ./
with:
artifact: test-results

.markdownlint.json (new file, 13 lines)
View file

@ -0,0 +1,13 @@
{
"blanks-around-headings": false,
"blanks-around-lists": false,
"blanks-around-tables": false,
"blanks-around-fences": false,
"no-bare-urls": false,
"line-length": false,
"ul-style": false,
"no-inline-html": false,
"no-multiple-blanks": {
"maximum": 3
}
}

View file

@ -1,5 +1,30 @@
# Changelog
## 2.3.0
* Feature: Add Python support with `python-xunit` reporter (pytest) https://github.com/dorny/test-reporter/pull/643
* Feature: Add pytest traceback parsing and `directory-mapping` option https://github.com/dorny/test-reporter/pull/238
* Performance: Update sax.js to fix large XML file parsing https://github.com/dorny/test-reporter/pull/681
* Documentation: Complete documentation for all supported reporters https://github.com/dorny/test-reporter/pull/691
* Security: Bump js-yaml and mocha in /reports/mocha (fixes prototype pollution) https://github.com/dorny/test-reporter/pull/682
## 2.2.0
* Feature: Add collapsed option to control report summary visibility https://github.com/dorny/test-reporter/pull/664
* Fix badge encoding for values including underscore and hyphens https://github.com/dorny/test-reporter/pull/672
* Fix missing `report-title` attribute in action definition https://github.com/dorny/test-reporter/pull/637
* Refactor variable names to fix shadowing issues https://github.com/dorny/test-reporter/pull/630
## 2.1.1
* Fix error when a TestMethod element does not have a className attribute in a trx file https://github.com/dorny/test-reporter/pull/623
* Add stack trace from trx to summary https://github.com/dorny/test-reporter/pull/615
* List only failed tests https://github.com/dorny/test-reporter/pull/606
* Add type definitions to `github-utils.ts` https://github.com/dorny/test-reporter/pull/604
* Avoid split on undefined https://github.com/dorny/test-reporter/pull/258
* Return links to summary report https://github.com/dorny/test-reporter/pull/588
* Add step summary short summary https://github.com/dorny/test-reporter/pull/589
* Fix for empty TRX TestDefinitions https://github.com/dorny/test-reporter/pull/582
* Increase step summary limit to 1MiB https://github.com/dorny/test-reporter/pull/581
* Fix input description for list options https://github.com/dorny/test-reporter/pull/572
## 2.1.0
* Feature: Add summary title https://github.com/dorny/test-reporter/pull/568
* Feature: Add Golang test parser https://github.com/dorny/test-reporter/pull/571

View file

@ -9,7 +9,7 @@ This [Github Action](https://github.com/features/actions) displays test results
✔️ Provides final `conclusion` and counts of `passed`, `failed` and `skipped` tests as output parameters
**How it looks:**
|![](assets/fluent-validation-report.png)|![](assets/provider-error-summary.png)|![](assets/provider-error-details.png)|![](assets/mocha-groups.png)|
|![Summary showing test run with all tests passed, including details such as test file names, number of passed, failed, and skipped tests, and execution times. The interface is dark-themed and displays a green badge indicating 3527 passed and 4 skipped tests.](assets/fluent-validation-report.png)|![Summary showing test run with a failed unit test. The summary uses a dark background and highlights errors in red for quick identification.](assets/provider-error-summary.png)|![GitHub Actions annotation showing details of a failed unit test with a detailed error message, stack trace, and code annotation.](assets/provider-error-details.png)|![Test cases written in Mocha framework with a list of expectations for each test case. The table format and color-coded badges help users quickly assess test suite health.](assets/mocha-groups.png)|
|:--:|:--:|:--:|:--:|
**Supported languages / frameworks:**
@ -19,6 +19,8 @@ This [Github Action](https://github.com/features/actions) displays test results
- Go / [go test](https://pkg.go.dev/testing)
- Java / [JUnit](https://junit.org/)
- JavaScript / [JEST](https://jestjs.io/) / [Mocha](https://mochajs.org/)
- Python / [pytest](https://docs.pytest.org/en/stable/) / [unittest](https://docs.python.org/3/library/unittest.html)
- Ruby / [RSpec](https://rspec.info/)
- Swift / xUnit
For more information see [Supported formats](#supported-formats) section.
@ -50,7 +52,7 @@ jobs:
- name: Test Report
uses: dorny/test-reporter@v2
if: success() || failure() # run this step even if previous step failed
if: ${{ !cancelled() }} # run this step even if previous step failed
with:
name: JEST Tests # Name of the check run which will be created
path: reports/jest-*.xml # Path to test results
@ -79,7 +81,7 @@ jobs:
- run: npm ci # install packages
- run: npm test # run tests (configured to use jest-junit reporter)
- uses: actions/upload-artifact@v4 # upload test results
if: success() || failure() # run this step even if previous step failed
if: ${{ !cancelled() }} # run this step even if previous step failed
with:
name: test-results
path: jest-junit.xml
@ -145,7 +147,9 @@ jobs:
# java-junit
# jest-junit
# mocha-json
# python-xunit
# rspec-json
# swift-xunit
reporter: ''
# Allows you to generate only the summary.
@ -253,6 +257,20 @@ Supported testing frameworks:
For more information see [dotnet test](https://docs.microsoft.com/en-us/dotnet/core/tools/dotnet-test#examples)
</details>
<details>
<summary>dotnet-nunit</summary>
Test execution must be configured to generate [NUnit3](https://docs.nunit.org/articles/nunit/technical-notes/usage/Test-Result-XML-Format.html) XML test results.
Install the [NUnit3TestAdapter](https://www.nuget.org/packages/NUnit3TestAdapter) package (required; it registers the `nunit` logger for `dotnet test`), then run tests with:
`dotnet test --logger "nunit;LogFileName=test-results.xml"`
Supported testing frameworks:
- [NUnit](https://nunit.org/)
For more information see [dotnet test](https://docs.microsoft.com/en-us/dotnet/core/tools/dotnet-test#examples)
</details>
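For illustration only, a minimal workflow sketch that runs the tests and then publishes the NUnit3 results with this action. The step names and the `**/test-results.xml` glob are assumptions; adjust them to your project layout.
```yaml
steps:
  - uses: actions/checkout@v6
  # The nunit logger below is registered by the NUnit3TestAdapter package
  - run: dotnet test --logger "nunit;LogFileName=test-results.xml"
  - name: NUnit Tests                # assumed step/check run name
    uses: dorny/test-reporter@v2
    if: ${{ !cancelled() }}          # publish the report even when the test step fails
    with:
      name: NUnit Tests
      path: '**/test-results.xml'    # assumed location of the generated NUnit3 XML
      reporter: dotnet-nunit
```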
<details>
<summary>flutter-json</summary>
@ -349,6 +367,41 @@ Before version [v9.1.0](https://github.com/mochajs/mocha/releases/tag/v9.1.0), M
Please update Mocha to version [v9.1.0](https://github.com/mochajs/mocha/releases/tag/v9.1.0) or above if you encounter this issue.
</details>
<details>
<summary>python-xunit (Experimental)</summary>
Support for Python test results in xUnit format is experimental: it should work, but it has not been extensively tested.
For **pytest** support, configure [JUnit XML output](https://docs.pytest.org/en/stable/how-to/output.html#creating-junitxml-format-files) and run with the `--junit-xml` option, which also lets you specify the output path for test results.
```shell
pytest --junit-xml=test-report.xml
```
For **unittest** support, use a test runner that outputs the JUnit report format, such as [unittest-xml-reporting](https://pypi.org/project/unittest-xml-reporting/).
</details>
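A minimal sketch of the corresponding workflow steps, assuming the pytest command above writes `test-report.xml` at the repository root; the step names are illustrative.
```yaml
steps:
  - uses: actions/checkout@v6
  - run: pytest --junit-xml=test-report.xml
  - name: Pytest Report              # illustrative check run name
    uses: dorny/test-reporter@v2
    if: ${{ !cancelled() }}          # run even when the pytest step fails
    with:
      name: Pytest Report
      path: test-report.xml          # path produced by --junit-xml above
      reporter: python-xunit
```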
<details>
<summary>rspec-json</summary>
[RSpec](https://rspec.info/) testing framework support requires the use of the JSON formatter.
You can configure RSpec to output JSON format by using the `--format json` option and redirecting to a file:
```shell
rspec --format json --out rspec-results.json
```
Or configure it in `.rspec` file:
```
--format json
--out rspec-results.json
```
For more information see:
- [RSpec documentation](https://rspec.info/)
- [RSpec Formatters](https://relishapp.com/rspec/rspec-core/docs/formatters)
</details>
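As with the other formats, a hedged usage sketch. It assumes the `.rspec` configuration above is in place, so running RSpec writes `rspec-results.json`; the `bundle exec` invocation and step names are assumptions.
```yaml
steps:
  - uses: actions/checkout@v6
  - run: bundle exec rspec           # assumed invocation; writes rspec-results.json via the .rspec config above
  - name: RSpec Tests                # illustrative check run name
    uses: dorny/test-reporter@v2
    if: ${{ !cancelled() }}
    with:
      name: RSpec Tests
      path: rspec-results.json
      reporter: rspec-json
```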
<details>
<summary>swift-xunit (Experimental)</summary>

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-1%20passed%2C%204%20failed%2C%201%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/dart-json.json|1 ✅|4 ❌|1 ⚪|4s|
|[fixtures/dart-json.json](#user-content-r0)|1 ✅|4 ❌|1 ⚪|4s|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/dart-json.json</a>
**6** tests were completed in **4s** with **1** passed, **4** failed and **1** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-3%20passed%2C%205%20failed%2C%201%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/dotnet-nunit.xml|3 ✅|5 ❌|1 ⚪|230ms|
|[fixtures/dotnet-nunit.xml](#user-content-r0)|3 ✅|5 ❌|1 ⚪|230ms|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/dotnet-nunit.xml</a>
**9** tests were completed in **230ms** with **3** passed, **5** failed and **1** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -0,0 +1,34 @@
![Tests failed](https://img.shields.io/badge/tests-5%20passed%2C%205%20failed%2C%201%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|[fixtures/dotnet-trx.trx](#user-content-r0)|5 ✅|5 ❌|1 ⚪|1s|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/dotnet-trx.trx</a>
**11** tests were completed in **1s** with **5** passed, **5** failed and **1** skipped.
|Test suite|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|[DotnetTests.XUnitTests.CalculatorTests](#user-content-r0s0)|5 ✅|5 ❌|1 ⚪|118ms|
### ❌ <a id="user-content-r0s0" href="#user-content-r0s0">DotnetTests.XUnitTests.CalculatorTests</a>
```
❌ Exception_In_TargetTest
System.DivideByZeroException : Attempted to divide by zero.
at DotnetTests.Unit.Calculator.Div(Int32 a, Int32 b) in C:\Users\Michal\Workspace\dorny\test-reporter\reports\dotnet\DotnetTests.Unit\Calculator.cs:line 9
at DotnetTests.XUnitTests.CalculatorTests.Exception_In_TargetTest() in C:\Users\Michal\Workspace\dorny\test-reporter\reports\dotnet\DotnetTests.XUnitTests\CalculatorTests.cs:line 33
❌ Exception_In_Test
System.Exception : Test
at DotnetTests.XUnitTests.CalculatorTests.Exception_In_Test() in C:\Users\Michal\Workspace\dorny\test-reporter\reports\dotnet\DotnetTests.XUnitTests\CalculatorTests.cs:line 39
❌ Failing_Test
Assert.Equal() Failure
Expected: 3
Actual: 2
at DotnetTests.XUnitTests.CalculatorTests.Failing_Test() in C:\Users\Michal\Workspace\dorny\test-reporter\reports\dotnet\DotnetTests.XUnitTests\CalculatorTests.cs:line 27
❌ Is_Even_Number(i: 3)
Assert.True() Failure
Expected: True
Actual: False
at DotnetTests.XUnitTests.CalculatorTests.Is_Even_Number(Int32 i) in C:\Users\Michal\Workspace\dorny\test-reporter\reports\dotnet\DotnetTests.XUnitTests\CalculatorTests.cs:line 59
❌ Should be even number(i: 3)
Assert.True() Failure
Expected: True
Actual: False
at DotnetTests.XUnitTests.CalculatorTests.Theory_With_Custom_Name(Int32 i) in C:\Users\Michal\Workspace\dorny\test-reporter\reports\dotnet\DotnetTests.XUnitTests\CalculatorTests.cs:line 67
```

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-5%20passed%2C%205%20failed%2C%201%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/dotnet-trx.trx|5 ✅|5 ❌|1 ⚪|1s|
|[fixtures/dotnet-trx.trx](#user-content-r0)|5 ✅|5 ❌|1 ⚪|1s|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/dotnet-trx.trx</a>
**11** tests were completed in **1s** with **5** passed, **5** failed and **1** skipped.
|Test suite|Passed|Failed|Skipped|Time|
@ -12,23 +12,29 @@
✅ Custom Name
❌ Exception_In_TargetTest
System.DivideByZeroException : Attempted to divide by zero.
at DotnetTests.Unit.Calculator.Div(Int32 a, Int32 b) in C:\Users\Michal\Workspace\dorny\test-reporter\reports\dotnet\DotnetTests.Unit\Calculator.cs:line 9
at DotnetTests.XUnitTests.CalculatorTests.Exception_In_TargetTest() in C:\Users\Michal\Workspace\dorny\test-reporter\reports\dotnet\DotnetTests.XUnitTests\CalculatorTests.cs:line 33
❌ Exception_In_Test
System.Exception : Test
at DotnetTests.XUnitTests.CalculatorTests.Exception_In_Test() in C:\Users\Michal\Workspace\dorny\test-reporter\reports\dotnet\DotnetTests.XUnitTests\CalculatorTests.cs:line 39
❌ Failing_Test
Assert.Equal() Failure
Expected: 3
Actual: 2
at DotnetTests.XUnitTests.CalculatorTests.Failing_Test() in C:\Users\Michal\Workspace\dorny\test-reporter\reports\dotnet\DotnetTests.XUnitTests\CalculatorTests.cs:line 27
✅ Is_Even_Number(i: 2)
❌ Is_Even_Number(i: 3)
Assert.True() Failure
Expected: True
Actual: False
at DotnetTests.XUnitTests.CalculatorTests.Is_Even_Number(Int32 i) in C:\Users\Michal\Workspace\dorny\test-reporter\reports\dotnet\DotnetTests.XUnitTests\CalculatorTests.cs:line 59
✅ Passing_Test
✅ Should be even number(i: 2)
❌ Should be even number(i: 3)
Assert.True() Failure
Expected: True
Actual: False
at DotnetTests.XUnitTests.CalculatorTests.Theory_With_Custom_Name(Int32 i) in C:\Users\Michal\Workspace\dorny\test-reporter\reports\dotnet\DotnetTests.XUnitTests\CalculatorTests.cs:line 67
⚪ Skipped_Test
✅ Timeout_Test
```

View file

@ -0,0 +1,26 @@
![Tests failed](https://img.shields.io/badge/tests-1%20passed%2C%203%20failed-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|[fixtures/dotnet-xunitv3.trx](#user-content-r0)|1 ✅|3 ❌||267ms|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/dotnet-xunitv3.trx</a>
**4** tests were completed in **267ms** with **1** passed, **3** failed and **0** skipped.
|Test suite|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|[DotnetTests.XUnitV3Tests.FixtureTests](#user-content-r0s0)|1 ✅|1 ❌||18ms|
|[Unclassified](#user-content-r0s1)||2 ❌||0ms|
### ❌ <a id="user-content-r0s0" href="#user-content-r0s0">DotnetTests.XUnitV3Tests.FixtureTests</a>
```
❌ Failing_Test
Assert.Null() Failure: Value is not null
Expected: null
Actual: Fixture { }
at DotnetTests.XUnitV3Tests.FixtureTests.Failing_Test() in /_/reports/dotnet/DotnetTests.XUnitV3Tests/FixtureTests.cs:line 25
at System.RuntimeMethodHandle.InvokeMethod(Object target, Void** arguments, Signature sig, Boolean isConstructor)
at System.Reflection.MethodBaseInvoker.InvokeWithNoArgs(Object obj, BindingFlags invokeAttr)
✅ Passing_Test
```
### ❌ <a id="user-content-r0s1" href="#user-content-r0s1">Unclassified</a>
```
❌ [Test Class Cleanup Failure (DotnetTests.XUnitV3Tests.FixtureTests.Failing_Test)]
❌ [Test Class Cleanup Failure (DotnetTests.XUnitV3Tests.FixtureTests.Passing_Test)]
```

View file

@ -3,7 +3,7 @@
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/external/FluentValidation.Tests.trx|803 ✅||1 ⚪|4s|
|[fixtures/external/FluentValidation.Tests.trx](#user-content-r0)|803 ✅||1 ⚪|4s|
## ✅ <a id="user-content-r0" href="#user-content-r0">fixtures/external/FluentValidation.Tests.trx</a>
**804** tests were completed in **4s** with **803** passed, **0** failed and **1** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-5%20passed%2C%206%20failed%2C%201%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/golang-json.json|5 ✅|6 ❌|1 ⚪|6s|
|[fixtures/golang-json.json](#user-content-r0)|5 ✅|6 ❌|1 ⚪|6s|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/golang-json.json</a>
**12** tests were completed in **6s** with **5** passed, **6** failed and **1** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -3,7 +3,7 @@
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/jest-junit-eslint.xml|1 ✅|||0ms|
|[fixtures/jest-junit-eslint.xml](#user-content-r0)|1 ✅|||0ms|
## ✅ <a id="user-content-r0" href="#user-content-r0">fixtures/jest-junit-eslint.xml</a>
**1** tests were completed in **0ms** with **1** passed, **0** failed and **0** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-1%20passed%2C%204%20failed%2C%201%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/jest-junit.xml|1 ✅|4 ❌|1 ⚪|1s|
|[fixtures/jest-junit.xml](#user-content-r0)|1 ✅|4 ❌|1 ⚪|1s|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/jest-junit.xml</a>
**6** tests were completed in **1s** with **1** passed, **4** failed and **1** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -3,7 +3,7 @@
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/external/jest/jest-react-component-test-results.xml|1 ✅|||1000ms|
|[fixtures/external/jest/jest-react-component-test-results.xml](#user-content-r0)|1 ✅|||1000ms|
## ✅ <a id="user-content-r0" href="#user-content-r0">fixtures/external/jest/jest-react-component-test-results.xml</a>
**1** tests were completed in **1000ms** with **1** passed, **0** failed and **0** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-4207%20passed%2C%202%20failed%2C%2030%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/external/jest/jest-test-results.xml|4207 ✅|2 ❌|30 ⚪|166s|
|[fixtures/external/jest/jest-test-results.xml](#user-content-r0)|4207 ✅|2 ❌|30 ⚪|166s|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/external/jest/jest-test-results.xml</a>
**4239** tests were completed in **166s** with **4207** passed, **2** failed and **30** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-1%20failed-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/junit-with-message.xml||1 ❌||1ms|
|[fixtures/junit-with-message.xml](#user-content-r0)||1 ❌||1ms|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/junit-with-message.xml</a>
**1** tests were completed in **1ms** with **0** passed, **1** failed and **0** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-1%20passed%2C%204%20failed%2C%201%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/mocha-json.json|1 ✅|4 ❌|1 ⚪|12ms|
|[fixtures/mocha-json.json](#user-content-r0)|1 ✅|4 ❌|1 ⚪|12ms|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/mocha-json.json</a>
**6** tests were completed in **12ms** with **1** passed, **4** failed and **1** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -3,7 +3,7 @@
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/external/mocha/mocha-test-results.json|833 ✅||6 ⚪|6s|
|[fixtures/external/mocha/mocha-test-results.json](#user-content-r0)|833 ✅||6 ⚪|6s|
## ✅ <a id="user-content-r0" href="#user-content-r0">fixtures/external/mocha/mocha-test-results.json</a>
**839** tests were completed in **6s** with **833** passed, **0** failed and **6** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-268%20passed%2C%201%20failed-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/external/flutter/provider-test-results.json|268 ✅|1 ❌||0ms|
|[fixtures/external/flutter/provider-test-results.json](#user-content-r0)|268 ✅|1 ❌||0ms|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/external/flutter/provider-test-results.json</a>
**269** tests were completed in **0ms** with **268** passed, **1** failed and **0** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-1%20failed%2C%201%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/external/java/TEST-org.apache.pulsar.AddMissingPatchVersionTest.xml||1 ❌|1 ⚪|116ms|
|[fixtures/external/java/TEST-org.apache.pulsar.AddMissingPatchVersionTest.xml](#user-content-r0)||1 ❌|1 ⚪|116ms|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/external/java/TEST-org.apache.pulsar.AddMissingPatchVersionTest.xml</a>
**2** tests were completed in **116ms** with **0** passed, **1** failed and **1** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-793%20passed%2C%201%20failed%2C%2014%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/external/java/pulsar-test-report.xml|793 ✅|1 ❌|14 ⚪|2127s|
|[fixtures/external/java/pulsar-test-report.xml](#user-content-r0)|793 ✅|1 ❌|14 ⚪|2127s|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/external/java/pulsar-test-report.xml</a>
**808** tests were completed in **2127s** with **793** passed, **1** failed and **14** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -0,0 +1,26 @@
![Tests failed](https://img.shields.io/badge/tests-6%20passed%2C%202%20failed%2C%202%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|[fixtures/python-xunit-pytest.xml](#user-content-r0)|6 ✅|2 ❌|2 ⚪|19ms|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/python-xunit-pytest.xml</a>
**10** tests were completed in **19ms** with **6** passed, **2** failed and **2** skipped.
|Test suite|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|[pytest](#user-content-r0s0)|6 ✅|2 ❌|2 ⚪|19ms|
### ❌ <a id="user-content-r0s0" href="#user-content-r0s0">pytest</a>
```
tests.test_lib
✅ test_always_pass
✅ test_with_subtests
✅ test_parameterized[param1]
✅ test_parameterized[param2]
⚪ test_always_skip
❌ test_always_fail
assert False
⚪ test_expected_failure
❌ test_error
Exception: error
✅ test_with_record_property
custom_classname
✅ test_with_record_xml_attribute
```

View file

@ -0,0 +1,23 @@
![Tests failed](https://img.shields.io/badge/tests-4%20passed%2C%202%20failed%2C%202%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|[fixtures/python-xunit-unittest.xml](#user-content-r0)|4 ✅|2 ❌|2 ⚪|1ms|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/python-xunit-unittest.xml</a>
**8** tests were completed in **1ms** with **4** passed, **2** failed and **2** skipped.
|Test suite|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|[TestAcme-20251114214921](#user-content-r0s0)|4 ✅|2 ❌|2 ⚪|1ms|
### ❌ <a id="user-content-r0s0" href="#user-content-r0s0">TestAcme-20251114214921</a>
```
TestAcme
✅ test_always_pass
✅ test_parameterized_0_param1
✅ test_parameterized_1_param2
✅ test_with_subtests
❌ test_always_fail
AssertionError: failed
❌ test_error
Exception: error
⚪ test_always_skip
⚪ test_expected_failure
```

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-1%20passed%2C%201%20failed%2C%201%20skipped-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/rspec-json.json|1 ✅|1 ❌|1 ⚪|0ms|
|[fixtures/rspec-json.json](#user-content-r0)|1 ✅|1 ❌|1 ⚪|0ms|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/rspec-json.json</a>
**3** tests were completed in **0ms** with **1** passed, **1** failed and **1** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -3,7 +3,7 @@
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/external/SilentNotes.trx|67 ✅||12 ⚪|1s|
|[fixtures/external/SilentNotes.trx](#user-content-r0)|67 ✅||12 ⚪|1s|
## ✅ <a id="user-content-r0" href="#user-content-r0">fixtures/external/SilentNotes.trx</a>
**79** tests were completed in **1s** with **67** passed, **0** failed and **12** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -1,7 +1,7 @@
![Tests failed](https://img.shields.io/badge/tests-2%20passed%2C%201%20failed-critical)
|Report|Passed|Failed|Skipped|Time|
|:---|---:|---:|---:|---:|
|fixtures/swift-xunit.xml|2 ✅|1 ❌||220ms|
|[fixtures/swift-xunit.xml](#user-content-r0)|2 ✅|1 ❌||220ms|
## ❌ <a id="user-content-r0" href="#user-content-r0">fixtures/swift-xunit.xml</a>
**3** tests were completed in **220ms** with **2** passed, **1** failed and **0** skipped.
|Test suite|Passed|Failed|Skipped|Time|

View file

@ -1,4 +1,4 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
// Jest Snapshot v1, https://jestjs.io/docs/snapshot-testing
exports[`dart-json tests matches report snapshot 1`] = `
TestRunResult {

View file

@ -1,4 +1,4 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
// Jest Snapshot v1, https://jestjs.io/docs/snapshot-testing
exports[`dotnet-nunit tests report from ./reports/dotnet test results matches snapshot 1`] = `
TestRunResult {

View file

@ -1,6 +1,6 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
// Jest Snapshot v1, https://jestjs.io/docs/snapshot-testing
exports[`dotnet-trx tests matches report snapshot 1`] = `
exports[`dotnet-trx tests matches dotnet-trx report snapshot 1`] = `
TestRunResult {
"path": "fixtures/dotnet-trx.trx",
"suites": [
@ -21,7 +21,9 @@ TestRunResult {
at DotnetTests.Unit.Calculator.Div(Int32 a, Int32 b) in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.Unit\\Calculator.cs:line 9
at DotnetTests.XUnitTests.CalculatorTests.Exception_In_TargetTest() in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 33",
"line": 9,
"message": "System.DivideByZeroException : Attempted to divide by zero.",
"message": "System.DivideByZeroException : Attempted to divide by zero.
at DotnetTests.Unit.Calculator.Div(Int32 a, Int32 b) in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.Unit\\Calculator.cs:line 9
at DotnetTests.XUnitTests.CalculatorTests.Exception_In_TargetTest() in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 33",
"path": "DotnetTests.Unit/Calculator.cs",
},
"name": "Exception_In_TargetTest",
@ -33,7 +35,8 @@ TestRunResult {
"details": "System.Exception : Test
at DotnetTests.XUnitTests.CalculatorTests.Exception_In_Test() in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 39",
"line": 39,
"message": "System.Exception : Test",
"message": "System.Exception : Test
at DotnetTests.XUnitTests.CalculatorTests.Exception_In_Test() in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 39",
"path": "DotnetTests.XUnitTests/CalculatorTests.cs",
},
"name": "Exception_In_Test",
@ -49,7 +52,8 @@ Actual: 2
"line": 27,
"message": "Assert.Equal() Failure
Expected: 3
Actual: 2",
Actual: 2
at DotnetTests.XUnitTests.CalculatorTests.Failing_Test() in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 27",
"path": "DotnetTests.XUnitTests/CalculatorTests.cs",
},
"name": "Failing_Test",
@ -71,7 +75,8 @@ Actual: False
"line": 59,
"message": "Assert.True() Failure
Expected: True
Actual: False",
Actual: False
at DotnetTests.XUnitTests.CalculatorTests.Is_Even_Number(Int32 i) in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 59",
"path": "DotnetTests.XUnitTests/CalculatorTests.cs",
},
"name": "Is_Even_Number(i: 3)",
@ -99,7 +104,213 @@ Actual: False
"line": 67,
"message": "Assert.True() Failure
Expected: True
Actual: False",
Actual: False
at DotnetTests.XUnitTests.CalculatorTests.Theory_With_Custom_Name(Int32 i) in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 67",
"path": "DotnetTests.XUnitTests/CalculatorTests.cs",
},
"name": "Should be even number(i: 3)",
"result": "failed",
"time": 0.6537000000000001,
},
TestCaseResult {
"error": undefined,
"name": "Skipped_Test",
"result": "skipped",
"time": 1,
},
TestCaseResult {
"error": undefined,
"name": "Timeout_Test",
"result": "success",
"time": 108.42580000000001,
},
],
},
],
"name": "DotnetTests.XUnitTests.CalculatorTests",
"totalTime": undefined,
},
],
"totalTime": 1116,
}
`;
exports[`dotnet-trx tests matches dotnet-xunitv3 report snapshot 1`] = `
TestRunResult {
"path": "fixtures/dotnet-xunitv3.trx",
"suites": [
TestSuiteResult {
"groups": [
TestGroupResult {
"name": null,
"tests": [
TestCaseResult {
"error": {
"details": "Assert.Null() Failure: Value is not null
Expected: null
Actual: Fixture { }
at DotnetTests.XUnitV3Tests.FixtureTests.Failing_Test() in /_/reports/dotnet/DotnetTests.XUnitV3Tests/FixtureTests.cs:line 25
at System.RuntimeMethodHandle.InvokeMethod(Object target, Void** arguments, Signature sig, Boolean isConstructor)
at System.Reflection.MethodBaseInvoker.InvokeWithNoArgs(Object obj, BindingFlags invokeAttr)",
"line": 25,
"message": "Assert.Null() Failure: Value is not null
Expected: null
Actual: Fixture { }
at DotnetTests.XUnitV3Tests.FixtureTests.Failing_Test() in /_/reports/dotnet/DotnetTests.XUnitV3Tests/FixtureTests.cs:line 25
at System.RuntimeMethodHandle.InvokeMethod(Object target, Void** arguments, Signature sig, Boolean isConstructor)
at System.Reflection.MethodBaseInvoker.InvokeWithNoArgs(Object obj, BindingFlags invokeAttr)",
"path": "DotnetTests.XUnitV3Tests/FixtureTests.cs",
},
"name": "Failing_Test",
"result": "failed",
"time": 17.0545,
},
TestCaseResult {
"error": undefined,
"name": "Passing_Test",
"result": "success",
"time": 0.8786,
},
],
},
],
"name": "DotnetTests.XUnitV3Tests.FixtureTests",
"totalTime": undefined,
},
TestSuiteResult {
"groups": [
TestGroupResult {
"name": null,
"tests": [
TestCaseResult {
"error": undefined,
"name": "[Test Class Cleanup Failure (DotnetTests.XUnitV3Tests.FixtureTests.Failing_Test)]",
"result": "failed",
"time": 0,
},
TestCaseResult {
"error": undefined,
"name": "[Test Class Cleanup Failure (DotnetTests.XUnitV3Tests.FixtureTests.Passing_Test)]",
"result": "failed",
"time": 0,
},
],
},
],
"name": "Unclassified",
"totalTime": undefined,
},
],
"totalTime": 267,
}
`;
exports[`dotnet-trx tests matches report snapshot (only failed tests) 1`] = `
TestRunResult {
"path": "fixtures/dotnet-trx.trx",
"suites": [
TestSuiteResult {
"groups": [
TestGroupResult {
"name": null,
"tests": [
TestCaseResult {
"error": undefined,
"name": "Custom Name",
"result": "success",
"time": 0.1371,
},
TestCaseResult {
"error": {
"details": "System.DivideByZeroException : Attempted to divide by zero.
at DotnetTests.Unit.Calculator.Div(Int32 a, Int32 b) in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.Unit\\Calculator.cs:line 9
at DotnetTests.XUnitTests.CalculatorTests.Exception_In_TargetTest() in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 33",
"line": 9,
"message": "System.DivideByZeroException : Attempted to divide by zero.
at DotnetTests.Unit.Calculator.Div(Int32 a, Int32 b) in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.Unit\\Calculator.cs:line 9
at DotnetTests.XUnitTests.CalculatorTests.Exception_In_TargetTest() in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 33",
"path": "DotnetTests.Unit/Calculator.cs",
},
"name": "Exception_In_TargetTest",
"result": "failed",
"time": 0.8377,
},
TestCaseResult {
"error": {
"details": "System.Exception : Test
at DotnetTests.XUnitTests.CalculatorTests.Exception_In_Test() in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 39",
"line": 39,
"message": "System.Exception : Test
at DotnetTests.XUnitTests.CalculatorTests.Exception_In_Test() in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 39",
"path": "DotnetTests.XUnitTests/CalculatorTests.cs",
},
"name": "Exception_In_Test",
"result": "failed",
"time": 2.5175,
},
TestCaseResult {
"error": {
"details": "Assert.Equal() Failure
Expected: 3
Actual: 2
at DotnetTests.XUnitTests.CalculatorTests.Failing_Test() in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 27",
"line": 27,
"message": "Assert.Equal() Failure
Expected: 3
Actual: 2
at DotnetTests.XUnitTests.CalculatorTests.Failing_Test() in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 27",
"path": "DotnetTests.XUnitTests/CalculatorTests.cs",
},
"name": "Failing_Test",
"result": "failed",
"time": 3.8697,
},
TestCaseResult {
"error": undefined,
"name": "Is_Even_Number(i: 2)",
"result": "success",
"time": 0.0078,
},
TestCaseResult {
"error": {
"details": "Assert.True() Failure
Expected: True
Actual: False
at DotnetTests.XUnitTests.CalculatorTests.Is_Even_Number(Int32 i) in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 59",
"line": 59,
"message": "Assert.True() Failure
Expected: True
Actual: False
at DotnetTests.XUnitTests.CalculatorTests.Is_Even_Number(Int32 i) in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 59",
"path": "DotnetTests.XUnitTests/CalculatorTests.cs",
},
"name": "Is_Even_Number(i: 3)",
"result": "failed",
"time": 0.41409999999999997,
},
TestCaseResult {
"error": undefined,
"name": "Passing_Test",
"result": "success",
"time": 0.1365,
},
TestCaseResult {
"error": undefined,
"name": "Should be even number(i: 2)",
"result": "success",
"time": 0.0097,
},
TestCaseResult {
"error": {
"details": "Assert.True() Failure
Expected: True
Actual: False
at DotnetTests.XUnitTests.CalculatorTests.Theory_With_Custom_Name(Int32 i) in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 67",
"line": 67,
"message": "Assert.True() Failure
Expected: True
Actual: False
at DotnetTests.XUnitTests.CalculatorTests.Theory_With_Custom_Name(Int32 i) in C:\\Users\\Michal\\Workspace\\dorny\\test-reporter\\reports\\dotnet\\DotnetTests.XUnitTests\\CalculatorTests.cs:line 67",
"path": "DotnetTests.XUnitTests/CalculatorTests.cs",
},
"name": "Should be even number(i: 3)",

View file

@ -1,4 +1,4 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
// Jest Snapshot v1, https://jestjs.io/docs/snapshot-testing
exports[`golang-json tests report from ./reports/dotnet test results matches snapshot 1`] = `
TestRunResult {

View file

@ -1,4 +1,4 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
// Jest Snapshot v1, https://jestjs.io/docs/snapshot-testing
exports[`java-junit tests report from apache/pulsar single suite test results matches snapshot 1`] = `
TestRunResult {

View file

@ -1,4 +1,4 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
// Jest Snapshot v1, https://jestjs.io/docs/snapshot-testing
exports[`jest-junit tests parsing ESLint report without timing information works - PR #134 1`] = `
TestRunResult {

View file

@ -1,4 +1,4 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
// Jest Snapshot v1, https://jestjs.io/docs/snapshot-testing
exports[`mocha-json tests report from ./reports/mocha-json test results matches snapshot 1`] = `
TestRunResult {

View file

@ -0,0 +1,192 @@
// Jest Snapshot v1, https://jestjs.io/docs/snapshot-testing
exports[`python-xunit pytest report report from python test results matches snapshot 1`] = `
TestRunResult {
"path": "fixtures/python-xunit-pytest.xml",
"suites": [
TestSuiteResult {
"groups": [
TestGroupResult {
"name": "tests.test_lib",
"tests": [
TestCaseResult {
"error": undefined,
"name": "test_always_pass",
"result": "success",
"time": 2,
},
TestCaseResult {
"error": undefined,
"name": "test_with_subtests",
"result": "success",
"time": 5,
},
TestCaseResult {
"error": undefined,
"name": "test_parameterized[param1]",
"result": "success",
"time": 0,
},
TestCaseResult {
"error": undefined,
"name": "test_parameterized[param2]",
"result": "success",
"time": 0,
},
TestCaseResult {
"error": undefined,
"name": "test_always_skip",
"result": "skipped",
"time": 0,
},
TestCaseResult {
"error": {
"details": "def test_always_fail():
> assert False
E assert False
tests/test_lib.py:25: AssertionError
",
"line": undefined,
"message": "assert False",
"path": undefined,
},
"name": "test_always_fail",
"result": "failed",
"time": 0,
},
TestCaseResult {
"error": undefined,
"name": "test_expected_failure",
"result": "skipped",
"time": 0,
},
TestCaseResult {
"error": {
"details": "def test_error():
> raise Exception("error")
E Exception: error
tests/test_lib.py:32: Exception
",
"line": undefined,
"message": "Exception: error",
"path": undefined,
},
"name": "test_error",
"result": "failed",
"time": 0,
},
TestCaseResult {
"error": undefined,
"name": "test_with_record_property",
"result": "success",
"time": 0,
},
],
},
TestGroupResult {
"name": "custom_classname",
"tests": [
TestCaseResult {
"error": undefined,
"name": "test_with_record_xml_attribute",
"result": "success",
"time": 0,
},
],
},
],
"name": "pytest",
"totalTime": 19,
},
],
"totalTime": undefined,
}
`;
exports[`python-xunit unittest report report from python test results matches snapshot 1`] = `
TestRunResult {
"path": "fixtures/python-xunit-unittest.xml",
"suites": [
TestSuiteResult {
"groups": [
TestGroupResult {
"name": "TestAcme",
"tests": [
TestCaseResult {
"error": undefined,
"name": "test_always_pass",
"result": "success",
"time": 0,
},
TestCaseResult {
"error": undefined,
"name": "test_parameterized_0_param1",
"result": "success",
"time": 1,
},
TestCaseResult {
"error": undefined,
"name": "test_parameterized_1_param2",
"result": "success",
"time": 0,
},
TestCaseResult {
"error": undefined,
"name": "test_with_subtests",
"result": "success",
"time": 0,
},
TestCaseResult {
"error": {
"details": "Traceback (most recent call last):
File "/Users/foo/Projects/python-test/tests/test_lib.py", line 24, in test_always_fail
self.fail("failed")
AssertionError: failed
",
"line": undefined,
"message": "AssertionError: failed",
"path": undefined,
},
"name": "test_always_fail",
"result": "failed",
"time": 0,
},
TestCaseResult {
"error": {
"details": "Traceback (most recent call last):
File "/Users/foo/Projects/python-test/tests/test_lib.py", line 31, in test_error
raise Exception("error")
Exception: error
",
"line": undefined,
"message": "Exception: error",
"path": undefined,
},
"name": "test_error",
"result": "failed",
"time": 0,
},
TestCaseResult {
"error": undefined,
"name": "test_always_skip",
"result": "skipped",
"time": 0,
},
TestCaseResult {
"error": undefined,
"name": "test_expected_failure",
"result": "skipped",
"time": 0,
},
],
},
],
"name": "TestAcme-20251114214921",
"totalTime": 1,
},
],
"totalTime": 1,
}
`;

View file

@ -1,4 +1,4 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
// Jest Snapshot v1, https://jestjs.io/docs/snapshot-testing
exports[`rspec-json tests report from ./reports/rspec-json test results matches snapshot 1`] = `
TestRunResult {

View file

@ -1,4 +1,4 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
// Jest Snapshot v1, https://jestjs.io/docs/snapshot-testing
exports[`swift-xunit tests report from swift test results matches snapshot 1`] = `
TestRunResult {

View file

@ -3,7 +3,7 @@ import * as path from 'path'
import {DotnetTrxParser} from '../src/parsers/dotnet-trx/dotnet-trx-parser'
import {ParseOptions} from '../src/test-parser'
import {DEFAULT_OPTIONS, getReport} from '../src/report/get-report'
import {DEFAULT_OPTIONS, getReport, ReportOptions} from '../src/report/get-report'
import {normalizeFilePath} from '../src/utils/path-utils'
describe('dotnet-trx tests', () => {
@ -39,9 +39,34 @@ describe('dotnet-trx tests', () => {
expect(result.result).toBe('success')
})
it('matches report snapshot', async () => {
it.each([['dotnet-trx'], ['dotnet-xunitv3']])('matches %s report snapshot', async reportName => {
const fixturePath = path.join(__dirname, 'fixtures', `${reportName}.trx`)
const outputPath = path.join(__dirname, '__outputs__', `${reportName}.md`)
const filePath = normalizeFilePath(path.relative(__dirname, fixturePath))
const fileContent = fs.readFileSync(fixturePath, {encoding: 'utf8'})
const opts: ParseOptions = {
parseErrors: true,
trackedFiles: [
'DotnetTests.Unit/Calculator.cs',
'DotnetTests.XUnitTests/CalculatorTests.cs',
'DotnetTests.XUnitV3Tests/FixtureTests.cs'
]
//workDir: 'C:/Users/Michal/Workspace/dorny/test-check/reports/dotnet/'
}
const parser = new DotnetTrxParser(opts)
const result = await parser.parse(filePath, fileContent)
expect(result).toMatchSnapshot()
const report = getReport([result])
fs.mkdirSync(path.dirname(outputPath), {recursive: true})
fs.writeFileSync(outputPath, report)
})
it('matches report snapshot (only failed tests)', async () => {
const fixturePath = path.join(__dirname, 'fixtures', 'dotnet-trx.trx')
const outputPath = path.join(__dirname, '__outputs__', 'dotnet-trx.md')
const outputPath = path.join(__dirname, '__outputs__', 'dotnet-trx-only-failed.md')
const filePath = normalizeFilePath(path.relative(__dirname, fixturePath))
const fileContent = fs.readFileSync(fixturePath, {encoding: 'utf8'})
@ -55,7 +80,12 @@ describe('dotnet-trx tests', () => {
const result = await parser.parse(filePath, fileContent)
expect(result).toMatchSnapshot()
const report = getReport([result])
const reportOptions: ReportOptions = {
...DEFAULT_OPTIONS,
listSuites: 'all',
listTests: 'failed'
}
const report = getReport([result], reportOptions)
fs.mkdirSync(path.dirname(outputPath), {recursive: true})
fs.writeFileSync(outputPath, report)
})

View file

@ -0,0 +1,60 @@
<?xml version="1.0" encoding="utf-8"?>
<TestRun id="54e29175-539e-48a3-a634-3a1855a0ed38" name="@Asterix 2025-06-22 14:17:12.022" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
<Times creation="2025-06-22T14:17:11.756535Z" queuing="2025-06-22T14:17:11.756535Z" start="2025-06-22T14:17:11.756535Z" finish="2025-06-22T14:17:12.023063Z" />
<TestSettings name="default" id="932e6c6f-3e5b-4392-ad65-e04c1ef476b5">
<Deployment runDeploymentRoot="_Asterix_2025-06-22_14_17_12.022" />
</TestSettings>
<Results>
<UnitTestResult executionId="37242a1f-ca3e-44b3-8142-71e510480975" testId="f846a1e6-0b68-2ac6-9a66-f417926e3238" testName="DotnetTests.XUnitV3Tests.FixtureTests.Failing_Test" computerName="Asterix" duration="00:00:00.0170545" startTime="2025-06-22T14:17:11.9339840+00:00" endTime="2025-06-22T14:17:11.9750850+00:00" testType="13CDC9D9-DDB5-4fa4-A97D-D965CCFC6D4B" outcome="Failed" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" relativeResultsDirectory="37242a1f-ca3e-44b3-8142-71e510480975">
<Output>
<ErrorInfo>
<Message>Assert.Null() Failure: Value is not null
Expected: null
Actual: Fixture { }</Message>
<StackTrace> at DotnetTests.XUnitV3Tests.FixtureTests.Failing_Test() in /_/reports/dotnet/DotnetTests.XUnitV3Tests/FixtureTests.cs:line 25
at System.RuntimeMethodHandle.InvokeMethod(Object target, Void** arguments, Signature sig, Boolean isConstructor)
at System.Reflection.MethodBaseInvoker.InvokeWithNoArgs(Object obj, BindingFlags invokeAttr)</StackTrace>
</ErrorInfo>
</Output>
</UnitTestResult>
<UnitTestResult executionId="592aaafb-4dc0-49dc-b3c7-bcd81218d58a" testId="3ee930dd-8a75-92a0-0d90-373833166db1" testName="DotnetTests.XUnitV3Tests.FixtureTests.Passing_Test" computerName="Asterix" duration="00:00:00.0008786" startTime="2025-06-22T14:17:11.9819890+00:00" endTime="2025-06-22T14:17:11.9833560+00:00" testType="13CDC9D9-DDB5-4fa4-A97D-D965CCFC6D4B" outcome="Passed" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" relativeResultsDirectory="592aaafb-4dc0-49dc-b3c7-bcd81218d58a" />
<UnitTestResult executionId="19c42d36-f4d7-4046-bcc6-dd9b85c9ca2b" testId="372fb60f-1f5b-a52e-032e-41a7556021e8" testName="[Test Class Cleanup Failure (DotnetTests.XUnitV3Tests.FixtureTests.Passing_Test)]" computerName="Asterix" duration="00:00:00" startTime="2025-06-22T14:17:12.0320280+00:00" endTime="2025-06-22T14:17:12.0320290+00:00" testType="13CDC9D9-DDB5-4fa4-A97D-D965CCFC6D4B" outcome="Failed" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" relativeResultsDirectory="19c42d36-f4d7-4046-bcc6-dd9b85c9ca2b" />
<UnitTestResult executionId="b7f40170-1e2c-45ce-b5e4-5bf49fd4c360" testId="a69083a1-56b4-3da3-2d7c-66fda374fd8e" testName="[Test Class Cleanup Failure (DotnetTests.XUnitV3Tests.FixtureTests.Failing_Test)]" computerName="Asterix" duration="00:00:00" startTime="2025-06-22T14:17:12.0320420+00:00" endTime="2025-06-22T14:17:12.0320430+00:00" testType="13CDC9D9-DDB5-4fa4-A97D-D965CCFC6D4B" outcome="Failed" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" relativeResultsDirectory="b7f40170-1e2c-45ce-b5e4-5bf49fd4c360" />
</Results>
<TestDefinitions>
<UnitTest name="DotnetTests.XUnitV3Tests.FixtureTests.Failing_Test" storage="~/test-reporter/reports/dotnet/dotnettests.xunitv3tests/bin/debug/net8.0/dotnettests.xunitv3tests.dll" id="f846a1e6-0b68-2ac6-9a66-f417926e3238">
<Execution id="37242a1f-ca3e-44b3-8142-71e510480975" />
<TestMethod codeBase="~/test-reporter/reports/dotnet/DotnetTests.XUnitV3Tests/bin/Debug/net8.0/DotnetTests.XUnitV3Tests.dll" adapterTypeName="executor://30ea7c6e-dd24-4152-a360-1387158cd41d/2.0.3" className="DotnetTests.XUnitV3Tests.FixtureTests" name="DotnetTests.XUnitV3Tests.FixtureTests.Failing_Test" />
</UnitTest>
<UnitTest name="DotnetTests.XUnitV3Tests.FixtureTests.Passing_Test" storage="~/test-reporter/reports/dotnet/dotnettests.xunitv3tests/bin/debug/net8.0/dotnettests.xunitv3tests.dll" id="3ee930dd-8a75-92a0-0d90-373833166db1">
<Execution id="592aaafb-4dc0-49dc-b3c7-bcd81218d58a" />
<TestMethod codeBase="~/test-reporter/reports/dotnet/DotnetTests.XUnitV3Tests/bin/Debug/net8.0/DotnetTests.XUnitV3Tests.dll" adapterTypeName="executor://30ea7c6e-dd24-4152-a360-1387158cd41d/2.0.3" className="DotnetTests.XUnitV3Tests.FixtureTests" name="DotnetTests.XUnitV3Tests.FixtureTests.Passing_Test" />
</UnitTest>
<UnitTest name="[Test Class Cleanup Failure (DotnetTests.XUnitV3Tests.FixtureTests.Passing_Test)]" storage="~/test-reporter/reports/dotnet/dotnettests.xunitv3tests/bin/debug/net8.0/dotnettests.xunitv3tests.dll" id="372fb60f-1f5b-a52e-032e-41a7556021e8">
<Execution id="19c42d36-f4d7-4046-bcc6-dd9b85c9ca2b" />
<TestMethod codeBase="~/test-reporter/reports/dotnet/DotnetTests.XUnitV3Tests/bin/Debug/net8.0/DotnetTests.XUnitV3Tests.dll" adapterTypeName="executor://30ea7c6e-dd24-4152-a360-1387158cd41d/2.0.3" name="[Test Class Cleanup Failure (DotnetTests.XUnitV3Tests.FixtureTests.Passing_Test)]" />
</UnitTest>
<UnitTest name="[Test Class Cleanup Failure (DotnetTests.XUnitV3Tests.FixtureTests.Failing_Test)]" storage="~/test-reporter/reports/dotnet/dotnettests.xunitv3tests/bin/debug/net8.0/dotnettests.xunitv3tests.dll" id="a69083a1-56b4-3da3-2d7c-66fda374fd8e">
<Execution id="b7f40170-1e2c-45ce-b5e4-5bf49fd4c360" />
<TestMethod codeBase="~/test-reporter/reports/dotnet/DotnetTests.XUnitV3Tests/bin/Debug/net8.0/DotnetTests.XUnitV3Tests.dll" adapterTypeName="executor://30ea7c6e-dd24-4152-a360-1387158cd41d/2.0.3" name="[Test Class Cleanup Failure (DotnetTests.XUnitV3Tests.FixtureTests.Failing_Test)]" />
</UnitTest>
</TestDefinitions>
<TestEntries>
<TestEntry testId="f846a1e6-0b68-2ac6-9a66-f417926e3238" executionId="37242a1f-ca3e-44b3-8142-71e510480975" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" />
<TestEntry testId="3ee930dd-8a75-92a0-0d90-373833166db1" executionId="592aaafb-4dc0-49dc-b3c7-bcd81218d58a" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" />
<TestEntry testId="372fb60f-1f5b-a52e-032e-41a7556021e8" executionId="19c42d36-f4d7-4046-bcc6-dd9b85c9ca2b" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" />
<TestEntry testId="a69083a1-56b4-3da3-2d7c-66fda374fd8e" executionId="b7f40170-1e2c-45ce-b5e4-5bf49fd4c360" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" />
</TestEntries>
<TestLists>
<TestList name="Results Not in a List" id="8C84FA94-04C1-424b-9868-57A2D4851A1D" />
<TestList name="All Loaded Results" id="19431567-8539-422a-85d7-44ee4e166bda" />
</TestLists>
<ResultSummary outcome="Failed">
<Counters total="4" executed="4" passed="1" failed="3" error="0" timeout="0" aborted="0" inconclusive="0" passedButRunAborted="0" notRunnable="0" notExecuted="0" disconnected="0" warning="0" completed="0" inProgress="0" pending="0" />
<RunInfos>
<RunInfo computerName="Asterix" outcome="Error" timestamp="2025-06-22T14:17:12.033401">
<Text>Exit code indicates failure: '2'. Please refer to https://aka.ms/testingplatform/exitcodes for more information.</Text>
</RunInfo>
</RunInfos>
</ResultSummary>
</TestRun>

View file

@ -0,0 +1,42 @@
<?xml version="1.0" encoding="utf-8"?>
<testsuites name="pytest tests">
<testsuite name="pytest" errors="0" failures="2" skipped="2" tests="15" time="0.019"
timestamp="2025-11-15T11:51:49.548396-05:00" hostname="Mac.hsd1.va.comcast.net">
<properties>
<property name="custom_prop" value="custom_val"/>
</properties>
<testcase classname="tests.test_lib" name="test_always_pass" time="0.002"/>
<testcase classname="tests.test_lib" name="test_with_subtests" time="0.005"/>
<testcase classname="tests.test_lib" name="test_parameterized[param1]" time="0.000"/>
<testcase classname="tests.test_lib" name="test_parameterized[param2]" time="0.000"/>
<testcase classname="tests.test_lib" name="test_always_skip" time="0.000">
<skipped type="pytest.skip" message="skipped">/Users/mike/Projects/python-test/tests/test_lib.py:20: skipped
</skipped>
</testcase>
<testcase classname="tests.test_lib" name="test_always_fail" time="0.000">
<failure message="assert False">def test_always_fail():
&gt; assert False
E assert False
tests/test_lib.py:25: AssertionError
</failure>
</testcase>
<testcase classname="tests.test_lib" name="test_expected_failure" time="0.000">
<skipped type="pytest.xfail" message=""/>
</testcase>
<testcase classname="tests.test_lib" name="test_error" time="0.000">
<failure message="Exception: error">def test_error():
&gt; raise Exception("error")
E Exception: error
tests/test_lib.py:32: Exception
</failure>
</testcase>
<testcase classname="tests.test_lib" name="test_with_record_property" time="0.000">
<properties>
<property name="example_key" value="1"/>
</properties>
</testcase>
<testcase classname="custom_classname" name="test_with_record_xml_attribute" time="0.000"/>
</testsuite>
</testsuites>

View file

@ -0,0 +1,27 @@
<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="TestAcme-20251114214921" tests="8" file=".py" time="0.001" timestamp="2025-11-14T21:49:22" failures="1" errors="1" skipped="2">
<testcase classname="TestAcme" name="test_always_pass" time="0.000" timestamp="2025-11-14T21:49:22" file="tests/test_lib.py" line="8"/>
<testcase classname="TestAcme" name="test_parameterized_0_param1" time="0.001" timestamp="2025-11-14T21:49:22" file="tests/test_lib.py" line="618"/>
<testcase classname="TestAcme" name="test_parameterized_1_param2" time="0.000" timestamp="2025-11-14T21:49:22" file="tests/test_lib.py" line="618"/>
<testcase classname="TestAcme" name="test_with_subtests" time="0.000" timestamp="2025-11-14T21:49:22" file="tests/test_lib.py" line="11"/>
<testcase classname="TestAcme" name="test_always_fail" time="0.000" timestamp="2025-11-14T21:49:22" file="tests/test_lib.py" line="23">
<failure type="AssertionError" message="failed"><![CDATA[Traceback (most recent call last):
File "/Users/foo/Projects/python-test/tests/test_lib.py", line 24, in test_always_fail
self.fail("failed")
AssertionError: failed
]]></failure>
</testcase>
<testcase classname="TestAcme" name="test_error" time="0.000" timestamp="2025-11-14T21:49:22" file="tests/test_lib.py" line="30">
<error type="Exception" message="error"><![CDATA[Traceback (most recent call last):
File "/Users/foo/Projects/python-test/tests/test_lib.py", line 31, in test_error
raise Exception("error")
Exception: error
]]></error>
</testcase>
<testcase classname="TestAcme" name="test_always_skip" time="0.000" timestamp="2025-11-14T21:49:22" file="tests/test_lib.py" line="20">
<skipped type="skip" message="skipped"/>
</testcase>
<testcase classname="TestAcme" name="test_expected_failure" time="0.000" timestamp="2025-11-14T21:49:22" file="tests/test_lib.py" line="26">
<skipped type="XFAIL" message="expected failure: (&lt;class 'AssertionError'&gt;, AssertionError('expected failure'), &lt;traceback object at 0x100c125c0&gt;)"/>
</testcase>
</testsuite>

View file

@ -207,4 +207,143 @@ describe('jest-junit tests', () => {
// Report should have the title as the first line
expect(report).toMatch(/^# My Custom Title\n/)
})
it('report can be collapsed when configured', async () => {
const fixturePath = path.join(__dirname, 'fixtures', 'jest-junit.xml')
const filePath = normalizeFilePath(path.relative(__dirname, fixturePath))
const fileContent = fs.readFileSync(fixturePath, {encoding: 'utf8'})
const opts: ParseOptions = {
parseErrors: true,
trackedFiles: []
}
const parser = new JestJunitParser(opts)
const result = await parser.parse(filePath, fileContent)
const report = getReport([result], {
...DEFAULT_OPTIONS,
collapsed: 'always'
})
// Report should include collapsible details
expect(report).toContain('<details><summary>Expand for details</summary>')
expect(report).toContain('</details>')
})
it('report is not collapsed when configured to never', async () => {
const fixturePath = path.join(__dirname, 'fixtures', 'jest-junit.xml')
const filePath = normalizeFilePath(path.relative(__dirname, fixturePath))
const fileContent = fs.readFileSync(fixturePath, {encoding: 'utf8'})
const opts: ParseOptions = {
parseErrors: true,
trackedFiles: []
}
const parser = new JestJunitParser(opts)
const result = await parser.parse(filePath, fileContent)
const report = getReport([result], {
...DEFAULT_OPTIONS,
collapsed: 'never'
})
// Report should not include collapsible details
expect(report).not.toContain('<details><summary>Expand for details</summary>')
expect(report).not.toContain('</details>')
})
it('report auto-collapses when all tests pass', async () => {
// Test with a fixture that has all passing tests (no failures)
const fixturePath = path.join(__dirname, 'fixtures', 'jest-junit-eslint.xml')
const filePath = normalizeFilePath(path.relative(__dirname, fixturePath))
const fileContent = fs.readFileSync(fixturePath, {encoding: 'utf8'})
const opts: ParseOptions = {
parseErrors: true,
trackedFiles: []
}
const parser = new JestJunitParser(opts)
const result = await parser.parse(filePath, fileContent)
// Verify this fixture has no failures
expect(result.failed).toBe(0)
const report = getReport([result], {
...DEFAULT_OPTIONS,
collapsed: 'auto'
})
// Should collapse when all tests pass
expect(report).toContain('<details><summary>Expand for details</summary>')
expect(report).toContain('</details>')
})
it('report does not auto-collapse when tests fail', async () => {
// Test with a fixture that has failing tests
const fixturePath = path.join(__dirname, 'fixtures', 'jest-junit.xml')
const filePath = normalizeFilePath(path.relative(__dirname, fixturePath))
const fileContent = fs.readFileSync(fixturePath, {encoding: 'utf8'})
const opts: ParseOptions = {
parseErrors: true,
trackedFiles: []
}
const parser = new JestJunitParser(opts)
const result = await parser.parse(filePath, fileContent)
// Verify this fixture has failures
expect(result.failed).toBeGreaterThan(0)
const report = getReport([result], {
...DEFAULT_OPTIONS,
collapsed: 'auto'
})
// Should not collapse when there are failures
expect(report).not.toContain('<details><summary>Expand for details</summary>')
expect(report).not.toContain('</details>')
})
it('report includes the short summary', async () => {
const fixturePath = path.join(__dirname, 'fixtures', 'jest-junit.xml')
const filePath = normalizeFilePath(path.relative(__dirname, fixturePath))
const fileContent = fs.readFileSync(fixturePath, {encoding: 'utf8'})
const opts: ParseOptions = {
parseErrors: true,
trackedFiles: []
}
const parser = new JestJunitParser(opts)
const result = await parser.parse(filePath, fileContent)
const shortSummary = '1 passed, 4 failed and 1 skipped'
const report = getReport([result], DEFAULT_OPTIONS, shortSummary)
// Report should have the short summary as the first line
expect(report).toMatch(/^## 1 passed, 4 failed and 1 skipped\n/)
})
it('report includes a custom report title and short summary', async () => {
const fixturePath = path.join(__dirname, 'fixtures', 'jest-junit.xml')
const filePath = normalizeFilePath(path.relative(__dirname, fixturePath))
const fileContent = fs.readFileSync(fixturePath, {encoding: 'utf8'})
const opts: ParseOptions = {
parseErrors: true,
trackedFiles: []
}
const parser = new JestJunitParser(opts)
const result = await parser.parse(filePath, fileContent)
const shortSummary = '1 passed, 4 failed and 1 skipped'
const report = getReport(
[result],
{
...DEFAULT_OPTIONS,
reportTitle: 'My Custom Title'
},
shortSummary
)
// Report should have the title as the first line
expect(report).toMatch(/^# My Custom Title\n## 1 passed, 4 failed and 1 skipped\n/)
})
})

View file

@ -0,0 +1,93 @@
import * as fs from 'fs'
import * as path from 'path'
import {PythonXunitParser} from '../src/parsers/python-xunit/python-xunit-parser'
import {ParseOptions} from '../src/test-parser'
import {DEFAULT_OPTIONS, getReport} from '../src/report/get-report'
import {normalizeFilePath} from '../src/utils/path-utils'
const defaultOpts: ParseOptions = {
parseErrors: true,
trackedFiles: []
}
describe('python-xunit unittest report', () => {
const fixturePath = path.join(__dirname, 'fixtures', 'python-xunit-unittest.xml')
const filePath = normalizeFilePath(path.relative(__dirname, fixturePath))
const fileContent = fs.readFileSync(fixturePath, {encoding: 'utf8'})
const outputPath = path.join(__dirname, '__outputs__', 'python-xunit-unittest.md')
it('report from python test results matches snapshot', async () => {
const trackedFiles = ['tests/test_lib.py']
const opts: ParseOptions = {
...defaultOpts,
trackedFiles
}
const parser = new PythonXunitParser(opts)
const result = await parser.parse(filePath, fileContent)
expect(result).toMatchSnapshot()
const report = getReport([result])
fs.mkdirSync(path.dirname(outputPath), {recursive: true})
fs.writeFileSync(outputPath, report)
})
it('report does not include a title by default', async () => {
const parser = new PythonXunitParser(defaultOpts)
const result = await parser.parse(filePath, fileContent)
const report = getReport([result])
// Report should have the badge as the first line
expect(report).toMatch(/^!\[Tests failed]/)
})
it.each([
['empty string', ''],
['space', ' '],
['tab', '\t'],
['newline', '\n']
])('report does not include a title when configured value is %s', async (_, reportTitle) => {
const parser = new PythonXunitParser(defaultOpts)
const result = await parser.parse(filePath, fileContent)
const report = getReport([result], {
...DEFAULT_OPTIONS,
reportTitle
})
// Report should have the badge as the first line
expect(report).toMatch(/^!\[Tests failed]/)
})
it('report includes a custom report title', async () => {
const parser = new PythonXunitParser(defaultOpts)
const result = await parser.parse(filePath, fileContent)
const report = getReport([result], {
...DEFAULT_OPTIONS,
reportTitle: 'My Custom Title'
})
// Report should have the title as the first line
expect(report).toMatch(/^# My Custom Title\n/)
})
})
describe('python-xunit pytest report', () => {
const fixturePath = path.join(__dirname, 'fixtures', 'python-xunit-pytest.xml')
const filePath = normalizeFilePath(path.relative(__dirname, fixturePath))
const fileContent = fs.readFileSync(fixturePath, {encoding: 'utf8'})
const outputPath = path.join(__dirname, '__outputs__', 'python-xunit-pytest.md')
it('report from python test results matches snapshot', async () => {
const trackedFiles = ['tests/test_lib.py']
const opts: ParseOptions = {
...defaultOpts,
trackedFiles
}
const parser = new PythonXunitParser(opts)
const result = await parser.parse(filePath, fileContent)
expect(result).toMatchSnapshot()
const report = getReport([result])
fs.mkdirSync(path.dirname(outputPath), {recursive: true})
fs.writeFileSync(outputPath, report)
})
})

View file

@ -0,0 +1,120 @@
import {getBadge, DEFAULT_OPTIONS, ReportOptions} from '../../src/report/get-report'
describe('getBadge', () => {
describe('URI encoding with special characters', () => {
it('generates correct URI with simple badge title', () => {
const options: ReportOptions = {
...DEFAULT_OPTIONS,
badgeTitle: 'tests'
}
const badge = getBadge(5, 0, 1, options)
expect(badge).toBe('![Tests passed successfully](https://img.shields.io/badge/tests-5%20passed%2C%201%20skipped-success)')
})
it('handles badge title with single hyphen', () => {
const options: ReportOptions = {
...DEFAULT_OPTIONS,
badgeTitle: 'unit-tests'
}
const badge = getBadge(3, 0, 0, options)
// The hyphen in the badge title should be encoded as --
expect(badge).toBe('![Tests passed successfully](https://img.shields.io/badge/unit--tests-3%20passed-success)')
})
it('handles badge title with multiple hyphens', () => {
const options: ReportOptions = {
...DEFAULT_OPTIONS,
badgeTitle: 'integration-api-tests'
}
const badge = getBadge(10, 0, 0, options)
// All hyphens in the title should be encoded as --
expect(badge).toBe('![Tests passed successfully](https://img.shields.io/badge/integration--api--tests-10%20passed-success)')
})
it('handles badge title with multiple underscores', () => {
const options: ReportOptions = {
...DEFAULT_OPTIONS,
badgeTitle: 'my_integration_test'
}
const badge = getBadge(10, 0, 0, options)
// All underscores in the title should be encoded as __
expect(badge).toBe('![Tests passed successfully](https://img.shields.io/badge/my__integration__test-10%20passed-success)')
})
it('handles badge title with version format containing hyphen', () => {
const options: ReportOptions = {
...DEFAULT_OPTIONS,
badgeTitle: 'MariaDb 12.0-ubi database tests'
}
const badge = getBadge(1, 0, 0, options)
// The hyphen in "12.0-ubi" should be encoded as --
expect(badge).toBe('![Tests passed successfully](https://img.shields.io/badge/MariaDb%2012.0--ubi%20database%20tests-1%20passed-success)')
})
it('handles badge title with dots and hyphens', () => {
const options: ReportOptions = {
...DEFAULT_OPTIONS,
badgeTitle: 'v1.2.3-beta-test'
}
const badge = getBadge(4, 1, 0, options)
expect(badge).toBe('![Tests failed](https://img.shields.io/badge/v1.2.3--beta--test-4%20passed%2C%201%20failed-critical)')
})
it('preserves structural hyphens between label and message', () => {
const options: ReportOptions = {
...DEFAULT_OPTIONS,
badgeTitle: 'test-suite'
}
const badge = getBadge(2, 3, 1, options)
// The URI should have literal hyphens separating title-message-color
expect(badge).toBe('![Tests failed](https://img.shields.io/badge/test--suite-2%20passed%2C%203%20failed%2C%201%20skipped-critical)')
})
})
describe('generates test outcome as color name for imgshields', () => {
it('uses success color when all tests pass', () => {
const options: ReportOptions = {...DEFAULT_OPTIONS}
const badge = getBadge(5, 0, 0, options)
expect(badge).toContain('-success)')
})
it('uses critical color when tests fail', () => {
const options: ReportOptions = {...DEFAULT_OPTIONS}
const badge = getBadge(5, 2, 0, options)
expect(badge).toContain('-critical)')
})
it('uses yellow color when no tests found', () => {
const options: ReportOptions = {...DEFAULT_OPTIONS}
const badge = getBadge(0, 0, 0, options)
expect(badge).toContain('-yellow)')
})
})
describe('badge message composition', () => {
it('includes only passed count when no failures or skips', () => {
const options: ReportOptions = {...DEFAULT_OPTIONS}
const badge = getBadge(5, 0, 0, options)
expect(badge).toBe('![Tests passed successfully](https://img.shields.io/badge/tests-5%20passed-success)')
})
it('includes passed and failed counts', () => {
const options: ReportOptions = {...DEFAULT_OPTIONS}
const badge = getBadge(5, 2, 0, options)
expect(badge).toBe('![Tests failed](https://img.shields.io/badge/tests-5%20passed%2C%202%20failed-critical)')
})
it('includes passed, failed and skipped counts', () => {
const options: ReportOptions = {...DEFAULT_OPTIONS}
const badge = getBadge(5, 2, 1, options)
expect(badge).toBe('![Tests failed](https://img.shields.io/badge/tests-5%20passed%2C%202%20failed%2C%201%20skipped-critical)')
})
it('uses "none" message when no tests', () => {
const options: ReportOptions = {...DEFAULT_OPTIONS}
const badge = getBadge(0, 0, 0, options)
expect(badge).toBe('![Tests passed successfully](https://img.shields.io/badge/tests-none-yellow)')
})
})
})
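The tests above pin down the shields.io escaping rule implemented by the new `encodeImgShieldsURIComponent` helper added to `get-report.ts` later in this diff: literal `-` and `_` inside a badge segment are doubled (shields.io treats them as structural separators), while spaces, commas and the like are percent-encoded. A minimal standalone sketch of that behaviour, not part of the diff itself:

```typescript
// Mirrors encodeImgShieldsURIComponent from get-report.ts further down.
function escapeShieldsSegment(segment: string): string {
  return encodeURIComponent(segment).replace(/-/g, '--').replace(/_/g, '__')
}

// "unit-tests" / "3 passed" / "success" =>
// https://img.shields.io/badge/unit--tests-3%20passed-success
const badgeUrl = `https://img.shields.io/badge/${escapeShieldsSegment(
  'unit-tests'
)}-${escapeShieldsSegment('3 passed')}-${escapeShieldsSegment('success')}`
```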

View file

@ -32,6 +32,6 @@ describe('parseNetDuration', () => {
})
it('throws when string has invalid format', () => {
expect(() => parseNetDuration('12:34:56 not a duration')).toThrowError(/^Invalid format/)
expect(() => parseNetDuration('12:34:56 not a duration')).toThrow(/^Invalid format/)
})
})

View file

@ -1,6 +1,5 @@
name: Test Reporter
description: |
Shows test results in GitHub UI: .NET (xUnit, NUnit, MSTest), Dart, Flutter, Java (JUnit), JavaScript (JEST, Mocha)
description: Displays test results from popular testing frameworks directly in GitHub
author: Michal Dorner <dorner.michal@gmail.com>
inputs:
artifact:
@ -29,9 +28,11 @@ inputs:
- dotnet-nunit
- dotnet-trx
- flutter-json
- golang-json
- java-junit
- jest-junit
- mocha-json
- python-xunit
- rspec-json
- swift-xunit
required: true
@ -68,6 +69,10 @@ inputs:
working-directory:
description: Relative path under $GITHUB_WORKSPACE where the repository was checked out
required: false
report-title:
description: Title for the test report summary
required: false
default: ''
only-summary:
description: |
Allows you to generate only the summary.
@ -85,6 +90,14 @@ inputs:
description: Customize badge title
required: false
default: 'tests'
collapsed:
description: |
Controls whether test report details are collapsed or expanded. Supported options:
- auto: Collapse only if all tests pass (default behavior)
- always: Always collapse the report details
- never: Always expand the report details
required: false
default: 'auto'
token:
description: GitHub Access Token
required: false
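The new `report-title` and `collapsed` inputs above feed straight into report generation. A rough sketch of that flow, written in the style of the jest-junit tests earlier in this diff (the fixture path and option values are illustrative, not part of the change):

```typescript
import * as fs from 'fs'
import * as path from 'path'
import {JestJunitParser} from '../src/parsers/jest-junit/jest-junit-parser'
import {DEFAULT_OPTIONS, getReport} from '../src/report/get-report'

async function renderExampleReport(): Promise<string> {
  const fixturePath = path.join(__dirname, 'fixtures', 'jest-junit.xml')
  const fileContent = fs.readFileSync(fixturePath, {encoding: 'utf8'})
  const parser = new JestJunitParser({parseErrors: true, trackedFiles: []})
  const result = await parser.parse(fixturePath, fileContent)
  // 'always' wraps run details in <details>…</details>, 'never' keeps them
  // expanded, and 'auto' collapses only when no test failed.
  return getReport([result], {
    ...DEFAULT_OPTIONS,
    reportTitle: 'My Custom Title',
    collapsed: 'always'
  })
}
```

In the action itself, `src/main.ts` (see its hunk below) reads both inputs with `core.getInput` and rejects any `collapsed` value other than `auto`, `always` or `never` before passing them through.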

976
dist/index.js generated vendored

File diff suppressed because it is too large

80
dist/licenses.txt generated vendored
View file

@ -1350,48 +1350,62 @@ CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
sax
ISC
The ISC License
BlueOak-1.0.0
# Blue Oak Model License
Copyright (c) Isaac Z. Schlueter and Contributors
Version 1.0.0
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
## Purpose
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
This license gives everyone as much permission to work with
this software as possible, while protecting contributors
from liability.
====
## Acceptance
`String.fromCodePoint` by Mathias Bynens used according to terms of MIT
License, as follows:
In order to receive this license, you must agree to its
rules. The rules of this license are both obligations
under that agreement and conditions to your license.
You must not do anything with this software that triggers
a rule that you cannot or will not follow.
Copyright Mathias Bynens <https://mathiasbynens.be/>
## Copyright
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
Each contributor licenses you to do everything with this
software that would otherwise infringe that contributor's
copyright in it.
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
## Notices
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
You must ensure that everyone who gets a copy of
any part of this software from you, with or without
changes, also gets the text of this license or a link to
<https://blueoakcouncil.org/license/1.0.0>.
## Excuse
If anyone notifies you in writing that you have not
complied with [Notices](#notices), you can keep your
license by taking all practical steps to comply within 30
days after the notice. If you do not do so, your license
ends immediately.
## Patent
Each contributor licenses you to do everything with this
software that would otherwise infringe any patent claims
they can license or become able to license.
## Reliability
No contributor can revoke this license.
## No Liability
***As far as the law allows, this software comes as is,
without any warranty or condition, and no contributor
will be liable to anyone for any damages related to this
software or this license, under any kind of legal claim.***
to-regex-range

3259
package-lock.json generated

File diff suppressed because it is too large

View file

@ -1,6 +1,6 @@
{
"name": "test-reporter",
"version": "2.1.0",
"version": "2.3.0",
"private": true,
"description": "Presents test results from popular testing frameworks as Github check run",
"main": "lib/main.js",
@ -16,6 +16,7 @@
"all": "npm run build && npm run format && npm run lint && npm run package && npm test",
"dart-fixture": "cd \"reports/dart\" && dart test --file-reporter=\"json:../../__tests__/fixtures/dart-json.json\"",
"dotnet-fixture": "dotnet test reports/dotnet/DotnetTests.XUnitTests --logger \"trx;LogFileName=../../../../__tests__/fixtures/dotnet-trx.trx\"",
"dotnet-xunitv3-fixture": "dotnet run --project reports/dotnet/DotnetTests.XUnitV3Tests/DotnetTests.XUnitV3Tests.csproj --report-trx --report-trx-filename dotnet-xunitv3.trx --results-directory __tests__/fixtures/",
"dotnet-nunit-fixture": "nunit.exe reports/dotnet/DotnetTests.NUnitV3Tests/bin/Debug/netcoreapp3.1/DotnetTests.NUnitV3Tests.dll --result=__tests__/fixtures/dotnet-nunit.xml",
"dotnet-nunit-legacy-fixture": "nunit-console.exe reports/dotnet-nunit-legacy/NUnitLegacy.sln --result=__tests__/fixtures/dotnet-nunit-legacy.xml",
"golang-json-fixture": "go test -v -json -timeout 5s ./reports/go | tee __tests__/fixtures/golang-json.json",
@ -41,33 +42,35 @@
"adm-zip": "^0.5.16",
"fast-glob": "^3.3.3",
"got": "^11.8.6",
"picomatch": "^4.0.2",
"picomatch": "^4.0.3",
"xml2js": "^0.6.2"
},
"devDependencies": {
"@octokit/webhooks-types": "^7.6.1",
"@types/adm-zip": "^0.5.7",
"@types/jest": "^29.5.14",
"@types/node": "^20.17.47",
"@types/picomatch": "^2.3.4",
"@types/jest": "^30.0.0",
"@types/node": "^20.19.23",
"@types/picomatch": "^4.0.2",
"@types/xml2js": "^0.4.14",
"@typescript-eslint/eslint-plugin": "^7.18.0",
"@typescript-eslint/parser": "^7.18.0",
"@vercel/ncc": "^0.38.3",
"@vercel/ncc": "^0.38.4",
"eol-converter-cli": "^1.1.0",
"eslint": "^8.57.1",
"eslint-import-resolver-typescript": "^3.10.1",
"eslint-plugin-github": "^4.10.2",
"eslint-plugin-import": "^2.31.0",
"eslint-plugin-jest": "^28.11.0",
"eslint-plugin-prettier": "^5.4.0",
"jest": "^29.7.0",
"jest-circus": "^29.7.0",
"eslint-plugin-import": "^2.32.0",
"eslint-plugin-jest": "^28.14.0",
"eslint-plugin-prettier": "^5.5.4",
"jest": "^30.2.0",
"jest-junit": "^16.0.0",
"js-yaml": "^4.1.0",
"prettier": "^3.5.3",
"ts-jest": "^29.3.4",
"typescript": "^5.8.3"
"js-yaml": "^4.1.1",
"prettier": "^3.6.2",
"ts-jest": "^29.4.5",
"typescript": "^5.9.3"
},
"overrides": {
"sax": "^1.4.3"
},
"jest-junit": {
"suiteName": "jest tests",
@ -81,5 +84,10 @@
},
"engines": {
"node": ">=20"
},
"markdownlint-cli2": {
"ignores": [
"__tests__/**/*"
]
}
}

View file

@ -40,7 +40,7 @@ namespace DotnetTests.XUnitTests
}
[Test]
[Timeout(1)]
[CancelAfter(1)]
public void Timeout_Test()
{
Thread.Sleep(100);
@ -58,7 +58,7 @@ namespace DotnetTests.XUnitTests
[TestCase(3)]
public void Is_Even_Number(int i)
{
Assert.True(i % 2 == 0);
Assert.That(i % 2 == 0);
}
}
}

View file

@ -1,14 +1,13 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netcoreapp3.1</TargetFramework>
<IsPackable>false</IsPackable>
<TargetFramework>net8.0</TargetFramework>
<DeterministicSourcePaths>true</DeterministicSourcePaths>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="NUnit" Version="3.13.2" />
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="16.5.0" />
<PackageReference Include="NUnit" Version="4.3.2" />
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.14.1" />
</ItemGroup>
<ItemGroup>

View file

@ -2,6 +2,7 @@
<PropertyGroup>
<TargetFramework>netstandard2.0</TargetFramework>
<DeterministicSourcePaths>true</DeterministicSourcePaths>
</PropertyGroup>
</Project>

View file

@ -1,16 +1,14 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netcoreapp3.1</TargetFramework>
<IsPackable>false</IsPackable>
<TargetFramework>net8.0</TargetFramework>
<DeterministicSourcePaths>true</DeterministicSourcePaths>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="16.5.0" />
<PackageReference Include="xunit" Version="2.4.0" />
<PackageReference Include="xunit.runner.visualstudio" Version="2.4.0" />
<PackageReference Include="coverlet.collector" Version="1.2.0" />
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.14.1" />
<PackageReference Include="xunit" Version="2.9.3" />
<PackageReference Include="xunit.runner.visualstudio" Version="3.1.1" />
</ItemGroup>
<ItemGroup>

View file

@ -0,0 +1,15 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<OutputType>exe</OutputType>
<DeterministicSourcePaths>true</DeterministicSourcePaths>
<UseMicrosoftTestingPlatformRunner>true</UseMicrosoftTestingPlatformRunner>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Testing.Extensions.TrxReport" Version="1.7.3" />
<PackageReference Include="xunit.v3" Version="2.0.3" />
</ItemGroup>
</Project>

View file

@ -0,0 +1,27 @@
using System;
using Xunit;
namespace DotnetTests.XUnitV3Tests;
public sealed class Fixture : IDisposable
{
public void Dispose()
{
throw new InvalidOperationException("Failure during fixture disposal");
}
}
public class FixtureTests(Fixture fixture) : IClassFixture<Fixture>
{
[Fact]
public void Passing_Test()
{
Assert.NotNull(fixture);
}
[Fact]
public void Failing_Test()
{
Assert.Null(fixture);
}
}

View file

@ -11,6 +11,8 @@ Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "DotnetTests.XUnitTests", "D
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "DotnetTests.NUnitV3Tests", "DotnetTests.NUnitV3Tests\DotnetTests.NUnitV3Tests.csproj", "{81023ED7-56CB-47E9-86C5-9125A0873C55}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "DotnetTests.XUnitV3Tests", "DotnetTests.XUnitV3Tests\DotnetTests.XUnitV3Tests.csproj", "{D35E65DC-62EF-4612-9FF3-66F5600BFB74}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Any CPU = Debug|Any CPU
@ -29,6 +31,10 @@ Global
{81023ED7-56CB-47E9-86C5-9125A0873C55}.Debug|Any CPU.Build.0 = Debug|Any CPU
{81023ED7-56CB-47E9-86C5-9125A0873C55}.Release|Any CPU.ActiveCfg = Release|Any CPU
{81023ED7-56CB-47E9-86C5-9125A0873C55}.Release|Any CPU.Build.0 = Release|Any CPU
{D35E65DC-62EF-4612-9FF3-66F5600BFB74}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{D35E65DC-62EF-4612-9FF3-66F5600BFB74}.Debug|Any CPU.Build.0 = Debug|Any CPU
{D35E65DC-62EF-4612-9FF3-66F5600BFB74}.Release|Any CPU.ActiveCfg = Release|Any CPU
{D35E65DC-62EF-4612-9FF3-66F5600BFB74}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
@ -36,6 +42,7 @@ Global
GlobalSection(NestedProjects) = preSolution
{F8607EDB-D25D-47AA-8132-38ACA242E845} = {BCAC3B31-ADB1-4221-9D5B-182EE868648C}
{81023ED7-56CB-47E9-86C5-9125A0873C55} = {BCAC3B31-ADB1-4221-9D5B-182EE868648C}
{D35E65DC-62EF-4612-9FF3-66F5600BFB74} = {BCAC3B31-ADB1-4221-9D5B-182EE868648C}
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
SolutionGuid = {6ED5543C-74AA-4B21-8050-943550F3F66E}

File diff suppressed because it is too large

File diff suppressed because it is too large

View file

@ -9,6 +9,6 @@
"author": "Michal Dorner <dorner.michal@gmail.com>",
"license": "MIT",
"devDependencies": {
"mocha": "^8.3.0"
"mocha": "^11.7.5"
}
}

View file

@ -17,9 +17,9 @@ import {GolangJsonParser} from './parsers/golang-json/golang-json-parser'
import {JavaJunitParser} from './parsers/java-junit/java-junit-parser'
import {JestJunitParser} from './parsers/jest-junit/jest-junit-parser'
import {MochaJsonParser} from './parsers/mocha-json/mocha-json-parser'
import {PythonXunitParser} from './parsers/python-xunit/python-xunit-parser'
import {RspecJsonParser} from './parsers/rspec-json/rspec-json-parser'
import {SwiftXunitParser} from './parsers/swift-xunit/swift-xunit-parser'
import {normalizeDirPath, normalizeFilePath} from './utils/path-utils'
import {getCheckRunContext} from './utils/github-utils'
@ -49,6 +49,7 @@ class TestReporter {
readonly useActionsSummary = core.getInput('use-actions-summary', {required: false}) === 'true'
readonly badgeTitle = core.getInput('badge-title', {required: false})
readonly reportTitle = core.getInput('report-title', {required: false})
readonly collapsed = core.getInput('collapsed', {required: false}) as 'auto' | 'always' | 'never'
readonly token = core.getInput('token', {required: true})
readonly octokit: InstanceType<typeof GitHub>
readonly context = getCheckRunContext()
@ -66,6 +67,11 @@ class TestReporter {
return
}
if (this.collapsed !== 'auto' && this.collapsed !== 'always' && this.collapsed !== 'never') {
core.setFailed(`Input parameter 'collapsed' has invalid value`)
return
}
if (isNaN(this.maxAnnotations) || this.maxAnnotations < 0 || this.maxAnnotations > 50) {
core.setFailed(`Input parameter 'max-annotations' has invalid value`)
return
@ -166,19 +172,29 @@ class TestReporter {
}
}
const {listSuites, listTests, onlySummary, useActionsSummary, badgeTitle, reportTitle} = this
const {listSuites, listTests, onlySummary, useActionsSummary, badgeTitle, reportTitle, collapsed} = this
const passed = results.reduce((sum, tr) => sum + tr.passed, 0)
const failed = results.reduce((sum, tr) => sum + tr.failed, 0)
const skipped = results.reduce((sum, tr) => sum + tr.skipped, 0)
const shortSummary = `${passed} passed, ${failed} failed and ${skipped} skipped `
let baseUrl = ''
if (this.useActionsSummary) {
const summary = getReport(results, {
listSuites,
listTests,
baseUrl,
onlySummary,
useActionsSummary,
badgeTitle,
reportTitle
})
const summary = getReport(
results,
{
listSuites,
listTests,
baseUrl,
onlySummary,
useActionsSummary,
badgeTitle,
reportTitle,
collapsed
},
shortSummary
)
core.info('Summary content:')
core.info(summary)
@ -205,7 +221,8 @@ class TestReporter {
onlySummary,
useActionsSummary,
badgeTitle,
reportTitle
reportTitle,
collapsed
})
core.info('Creating annotations')
@ -214,11 +231,6 @@ class TestReporter {
const isFailed = this.failOnError && results.some(tr => tr.result === 'failed')
const conclusion = isFailed ? 'failure' : 'success'
const passed = results.reduce((sum, tr) => sum + tr.passed, 0)
const failed = results.reduce((sum, tr) => sum + tr.failed, 0)
const skipped = results.reduce((sum, tr) => sum + tr.skipped, 0)
const shortSummary = `${passed} passed, ${failed} failed and ${skipped} skipped `
core.info(`Updating check run conclusion (${conclusion}) and output`)
const resp = await this.octokit.rest.checks.update({
check_run_id: createResp.data.id,
@ -259,6 +271,8 @@ class TestReporter {
return new JestJunitParser(options)
case 'mocha-json':
return new MochaJsonParser(options)
case 'python-xunit':
return new PythonXunitParser(options)
case 'rspec-json':
return new RspecJsonParser(options)
case 'swift-xunit':

View file

@ -77,13 +77,13 @@ export class DotnetNunitParser implements TestParser {
.join('.')
const groupName = suitesWithoutTheories[suitesWithoutTheories.length - 1].$.name
let existingSuite = result.find(existingSuite => existingSuite.name === suiteName)
let existingSuite = result.find(suite => suite.name === suiteName)
if (existingSuite === undefined) {
existingSuite = new TestSuiteResult(suiteName, [])
result.push(existingSuite)
}
let existingGroup = existingSuite.groups.find(existingGroup => existingGroup.name === groupName)
let existingGroup = existingSuite.groups.find(group => group.name === groupName)
if (existingGroup === undefined) {
existingGroup = new TestGroupResult(groupName, [])
existingSuite.groups.push(existingGroup)

View file

@ -81,7 +81,7 @@ export class DotnetTrxParser implements TestParser {
const testClasses: {[name: string]: TestClass} = {}
for (const r of unitTestsResults) {
const className = r.test.TestMethod[0].$.className
const className = r.test.TestMethod[0].$.className ?? "Unclassified"
let tc = testClasses[className]
if (tc === undefined) {
tc = new TestClass(className)
@ -146,8 +146,8 @@ export class DotnetTrxParser implements TestParser {
return undefined
}
const message = test.error.Message[0]
const stackTrace = test.error.StackTrace[0]
const message = `${test.error.Message[0]}\n${stackTrace}`
let path
let line
@ -161,7 +161,7 @@ export class DotnetTrxParser implements TestParser {
path,
line,
message,
details: `${message}\n${stackTrace}`
details: `${message}`
}
}

View file

@ -0,0 +1,8 @@
import {ParseOptions} from '../../test-parser'
import {JavaJunitParser} from '../java-junit/java-junit-parser'
export class PythonXunitParser extends JavaJunitParser {
constructor(readonly options: ParseOptions) {
super(options)
}
}

View file

@ -16,6 +16,7 @@ export interface ReportOptions {
useActionsSummary: boolean
badgeTitle: string
reportTitle: string
collapsed: 'auto' | 'always' | 'never'
}
export const DEFAULT_OPTIONS: ReportOptions = {
@ -25,16 +26,19 @@ export const DEFAULT_OPTIONS: ReportOptions = {
onlySummary: false,
useActionsSummary: true,
badgeTitle: 'tests',
reportTitle: ''
reportTitle: '',
collapsed: 'auto'
}
export function getReport(results: TestRunResult[], options: ReportOptions = DEFAULT_OPTIONS): string {
core.info('Generating check run summary')
export function getReport(
results: TestRunResult[],
options: ReportOptions = DEFAULT_OPTIONS,
shortSummary = ''
): string {
applySort(results)
const opts = {...options}
let lines = renderReport(results, opts)
let lines = renderReport(results, opts, shortSummary)
let report = lines.join('\n')
if (getByteLength(report) <= getMaxReportLength(options)) {
@ -44,7 +48,7 @@ export function getReport(results: TestRunResult[], options: ReportOptions = DEF
if (opts.listTests === 'all') {
core.info("Test report summary is too big - setting 'listTests' to 'failed'")
opts.listTests = 'failed'
lines = renderReport(results, opts)
lines = renderReport(results, opts, shortSummary)
report = lines.join('\n')
if (getByteLength(report) <= getMaxReportLength(options)) {
return report
@ -101,7 +105,7 @@ function getByteLength(text: string): number {
return Buffer.byteLength(text, 'utf8')
}
function renderReport(results: TestRunResult[], options: ReportOptions): string[] {
function renderReport(results: TestRunResult[], options: ReportOptions, shortSummary: string): string[] {
const sections: string[] = []
const reportTitle: string = options.reportTitle.trim()
@ -109,6 +113,10 @@ function renderReport(results: TestRunResult[], options: ReportOptions): string[
sections.push(`# ${reportTitle}`)
}
if (shortSummary) {
sections.push(`## ${shortSummary}`)
}
const badge = getReportBadge(results, options)
sections.push(badge)
@ -125,7 +133,7 @@ function getReportBadge(results: TestRunResult[], options: ReportOptions): strin
return getBadge(passed, failed, skipped, options)
}
function getBadge(passed: number, failed: number, skipped: number, options: ReportOptions): string {
export function getBadge(passed: number, failed: number, skipped: number, options: ReportOptions): string {
const text = []
if (passed > 0) {
text.push(`${passed} passed`)
@ -145,28 +153,37 @@ function getBadge(passed: number, failed: number, skipped: number, options: Repo
color = 'yellow'
}
const hint = failed > 0 ? 'Tests failed' : 'Tests passed successfully'
const uri = encodeURIComponent(`${options.badgeTitle}-${message}-${color}`)
return `![${hint}](https://img.shields.io/badge/${uri})`
const encodedBadgeTitle = encodeImgShieldsURIComponent(options.badgeTitle)
const encodedMessage = encodeImgShieldsURIComponent(message)
const encodedColor = encodeImgShieldsURIComponent(color)
return `![${hint}](https://img.shields.io/badge/${encodedBadgeTitle}-${encodedMessage}-${encodedColor})`
}
function getTestRunsReport(testRuns: TestRunResult[], options: ReportOptions): string[] {
const sections: string[] = []
const totalFailed = testRuns.reduce((sum, tr) => sum + tr.failed, 0)
if (totalFailed === 0) {
// Determine if report should be collapsed based on collapsed option
const shouldCollapse = options.collapsed === 'always' || (options.collapsed === 'auto' && totalFailed === 0)
if (shouldCollapse) {
sections.push(`<details><summary>Expand for details</summary>`)
sections.push(` `)
}
if (testRuns.length > 0 || options.onlySummary) {
const tableData = testRuns
.filter(tr => tr.passed > 0 || tr.failed > 0 || tr.skipped > 0)
.map(tr => {
.map((tr, originalIndex) => ({tr, originalIndex}))
.filter(({tr}) => tr.passed > 0 || tr.failed > 0 || tr.skipped > 0)
.map(({tr, originalIndex}) => {
const time = formatTime(tr.time)
const name = tr.path
const addr = options.baseUrl + makeRunSlug(originalIndex, options).link
const nameLink = link(name, addr)
const passed = tr.passed > 0 ? `${tr.passed} ${Icon.success}` : ''
const failed = tr.failed > 0 ? `${tr.failed} ${Icon.fail}` : ''
const skipped = tr.skipped > 0 ? `${tr.skipped} ${Icon.skip}` : ''
return [name, passed, failed, skipped, time]
return [nameLink, passed, failed, skipped, time]
})
const resultsTable = table(
@ -182,7 +199,7 @@ function getTestRunsReport(testRuns: TestRunResult[], options: ReportOptions): s
sections.push(...suitesReports)
}
if (totalFailed === 0) {
if (shouldCollapse) {
sections.push(`</details>`)
}
return sections
@ -260,6 +277,9 @@ function getTestsReport(ts: TestSuiteResult, runIndex: number, suiteIndex: numbe
}
const space = grp.name ? ' ' : ''
for (const tc of grp.tests) {
if (options.listTests === 'failed' && tc.result !== 'failed') {
continue
}
const result = getResultIcon(tc.result)
sections.push(`${space}${result} ${tc.name}`)
if (tc.error) {
@ -299,3 +319,7 @@ function getResultIcon(result: TestExecutionResult): string {
return ''
}
}
function encodeImgShieldsURIComponent(component: string): string {
return encodeURIComponent(component).replace(/-/g, '--').replace(/_/g, '__')
}
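With this refactor, `getReport` also accepts an optional `shortSummary` string (composed in `main.ts` from the aggregated passed/failed/skipped counts) and `renderReport` emits it as a second-level heading below the optional report title. Illustrative only, the head of a report generated with both set would look roughly like this, matching the jest-junit expectation `/^# My Custom Title\n## 1 passed, 4 failed and 1 skipped\n/` earlier in the diff:

```typescript
// Illustrative shape only; the counts come from the jest-junit fixture used
// in the tests earlier in this diff.
const expectedHead =
  '# My Custom Title\n' + // options.reportTitle, when non-blank
  '## 1 passed, 4 failed and 1 skipped\n' // shortSummary from main.ts
// ...followed by the shields.io badge and the per-run results table.
```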

View file

@ -2,16 +2,17 @@ import {createWriteStream} from 'fs'
import * as core from '@actions/core'
import * as github from '@actions/github'
import {GitHub} from '@actions/github/lib/utils'
import type {PullRequest} from '@octokit/webhooks-types'
import type {PullRequest, WorkflowRunEvent} from '@octokit/webhooks-types'
import {IncomingMessage} from 'http'
import * as stream from 'stream'
import {promisify} from 'util'
import got from 'got'
import got, {Progress} from 'got'
const asyncStream = promisify(stream.pipeline)
export function getCheckRunContext(): {sha: string; runId: number} {
if (github.context.eventName === 'workflow_run') {
core.info('Action was triggered by workflow_run: using SHA and RUN_ID from triggering workflow')
const event = github.context.payload
const event = github.context.payload as WorkflowRunEvent
if (!event.workflow_run) {
throw new Error("Event of type 'workflow_run' is missing 'workflow_run' field")
}
@ -54,11 +55,11 @@ export async function downloadArtifact(
const downloadStream = got.stream(req.url, {headers})
const fileWriterStream = createWriteStream(fileName)
downloadStream.on('redirect', response => {
downloadStream.on('redirect', (response: IncomingMessage) => {
core.info(`Downloading ${response.headers.location}`)
})
downloadStream.on('downloadProgress', ({transferred}) => {
core.info(`Progress: ${transferred} B`)
downloadStream.on('downloadProgress', (progress: Progress) => {
core.info(`Progress: ${progress.transferred} B`)
})
await asyncStream(downloadStream, fileWriterStream)