Compare commits

...

218 Commits

Author SHA1 Message Date
Koen Vlaswinkel
07bb2b932c Update changelog and version for v3.28.20 2025-07-21 11:24:11 +02:00
Koen Vlaswinkel
f6c7f63bda Ignore pre-release parts when comparing GHES versions 2025-07-21 11:19:24 +02:00
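A minimal sketch of the version-comparison tweak above, assuming the action's existing `semver` dependency; the helper name and range-based API are illustrative, not the action's actual code.

```ts
import * as semver from "semver";

// Drop pre-release/build parts of a GHES version string (e.g. "3.18.0.pre1")
// before comparing it against a version range, so pre-releases compare like
// their base version.
function ghesVersionSatisfies(ghesVersion: string, range: string): boolean {
  const coerced = semver.coerce(ghesVersion); // strips pre-release parts
  return coerced !== null && semver.satisfies(coerced, range);
}
```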
Koen Vlaswinkel
6370c01206 Move comment to JSDoc 2025-07-21 11:19:07 +02:00
Koen Vlaswinkel
dd627a9af6 Fix parsing of GHES pre-release versions 2025-07-21 11:19:02 +02:00
Koen Vlaswinkel
bcdb4ecb96 Unconditionally disable combining SARIF files for GHES 3.18 2025-07-21 11:18:49 +02:00
Koen Vlaswinkel
c0809df981 Remove support for combining SARIF runs with non-unique categories 2025-07-21 11:18:34 +02:00
Chris Smowton
ff0a06e83c Merge pull request #2896 from github/update-v3.28.18-b86edfc27
Merge main into releases/v3
2025-05-16 11:14:47 +01:00
github-actions[bot]
a41e0844be Update changelog for v3.28.18 2025-05-16 09:36:50 +00:00
Chris Smowton
b86edfc27a Merge pull request #2893 from github/update-bundle/codeql-bundle-v2.21.3
Update default bundle to 2.21.3
2025-05-15 12:40:00 +01:00
Henry Mercer
e93b90025f Merge branch 'main' into update-bundle/codeql-bundle-v2.21.3 2025-05-14 19:57:41 +01:00
Henry Mercer
510dfa3460 Merge pull request #2894 from github/henrymercer/skip-validating-codeql-sarif
Skip validating SARIF produced by CodeQL
2025-05-14 19:55:03 +01:00
Henry Mercer
492d783245 Merge branch 'main' into henrymercer/skip-validating-codeql-sarif 2025-05-14 19:16:54 +01:00
Henry Mercer
83bdf3b7f9 Merge pull request #2859 from github/update-supported-enterprise-server-versions
Update supported GitHub Enterprise Server versions
2025-05-14 19:15:31 +01:00
Andrew Eisenberg
cffc916774 Merge pull request #2891 from austinpray-mixpanel/patch-1
Allow configuring CODEQL_THREADS with an env var
2025-05-14 14:00:23 -04:00
Henry Mercer
4420887272 Add deprecation warning for CodeQL 2.16.5 and earlier 2025-05-14 17:13:10 +01:00
Henry Mercer
4e178c5841 Update supported versions table in README 2025-05-14 17:12:44 +01:00
Henry Mercer
05446e4bbf Merge branch 'main' into update-supported-enterprise-server-versions 2025-05-14 16:58:40 +01:00
Austin Pray
bb9fc01aa6 Update CHANGELOG.md 2025-05-14 10:44:35 -05:00
Austin Pray
3dce55ac70 rebuild 2025-05-14 15:41:39 +00:00
github-actions[bot]
bacf5fe7c2 Rebuild 2025-05-14 14:23:08 +00:00
Henry Mercer
15f19ac220 Improve docstring
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-05-14 15:21:38 +01:00
Henry Mercer
f7ab654551 Add changelog note 2025-05-14 15:12:22 +01:00
Henry Mercer
2f70a988e7 Skip validating SARIF produced by CodeQL 2025-05-14 15:11:16 +01:00
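A minimal sketch of the idea behind the commit above: SARIF produced by the CodeQL CLI is trusted and not re-validated against the SARIF JSON schema, while third-party SARIF still is. How the action actually detects CodeQL-produced SARIF is not shown in this log, so the tool-driver-name check here is an assumption.

```ts
interface SarifFile {
  runs?: Array<{ tool?: { driver?: { name?: string } } }>;
}

// Only validate SARIF that did not come from the CodeQL CLI itself.
function shouldValidateSarif(sarif: SarifFile): boolean {
  const runs = sarif.runs ?? [];
  const producedByCodeQL =
    runs.length > 0 && runs.every((run) => run.tool?.driver?.name === "CodeQL");
  return !producedByCodeQL;
}
```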
Henry Mercer
f681ad69a7 Add utility function to get testing environment 2025-05-14 14:10:19 +01:00
github-actions[bot]
15447f393e Add changelog note 2025-05-13 22:28:13 +00:00
github-actions[bot]
ded79fc5fd Update default bundle to codeql-bundle-v2.21.3 2025-05-13 22:28:10 +00:00
Austin Pray
77ae18dc82 Revert "threads defaults to CODEQL_THREADS env var"
This reverts commit df7d681f04.
2025-05-13 22:19:47 +00:00
Austin Pray
df7d681f04 threads defaults to CODEQL_THREADS env var 2025-05-13 20:13:00 +00:00
Nick Fyson
15bce5bb14 Merge pull request #2892 from github/dependabot/npm_and_yarn/npm-9a9ecb9151
build(deps): bump the npm group across 1 directory with 4 updates
2025-05-13 11:35:20 +01:00
Nick Fyson
c64095f75e Merge pull request #2889 from github/dependabot/github_actions/actions-b37916a4ef
build(deps): bump the actions group with 2 updates
2025-05-13 11:16:24 +01:00
nickfyson
07dbe6f6f7 update generated workflows 2025-05-13 11:02:59 +01:00
github-actions[bot]
3d97729508 Update checked-in dependencies 2025-05-12 18:01:08 +00:00
dependabot[bot]
d5e9ae3f8b build(deps): bump the npm group across 1 directory with 4 updates
Bumps the npm group with 4 updates in the / directory: [semver](https://github.com/npm/node-semver), [@eslint/js](https://github.com/eslint/eslint/tree/HEAD/packages/js), [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin) and [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser).


Updates `semver` from 7.7.1 to 7.7.2
- [Release notes](https://github.com/npm/node-semver/releases)
- [Changelog](https://github.com/npm/node-semver/blob/main/CHANGELOG.md)
- [Commits](https://github.com/npm/node-semver/compare/v7.7.1...v7.7.2)

Updates `@eslint/js` from 9.25.1 to 9.26.0
- [Release notes](https://github.com/eslint/eslint/releases)
- [Changelog](https://github.com/eslint/eslint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslint/commits/v9.26.0/packages/js)

Updates `@typescript-eslint/eslint-plugin` from 8.31.1 to 8.32.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.32.1/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.31.1 to 8.32.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.32.1/packages/parser)

---
updated-dependencies:
- dependency-name: semver
  dependency-version: 7.7.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@eslint/js"
  dependency-version: 9.26.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-version: 8.32.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-version: 8.32.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-12 18:00:33 +00:00
Austin Pray
c41b278fa8 Allow configuring CODEQL_THREADS with an env var
ref https://github.com/github/codeql-action/issues/2890
2025-05-05 21:28:43 -05:00
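A minimal sketch of the behaviour described above, following the changelog entry further down in this compare (the `CODEQL_THREADS` environment variable overrides the `threads` input); the helper name is an assumption.

```ts
// Resolve the threads setting: the runner-level environment variable wins over
// the workflow input when it is set.
function getThreadsSetting(threadsInput: string | undefined): string | undefined {
  return process.env.CODEQL_THREADS || threadsInput;
}
```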
dependabot[bot]
7657741c79 build(deps): bump the actions group with 2 updates
Bumps the actions group with 2 updates: [ruby/setup-ruby](https://github.com/ruby/setup-ruby) and [actions/create-github-app-token](https://github.com/actions/create-github-app-token).


Updates `ruby/setup-ruby` from 1.230.0 to 1.237.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](e5ac7b085f...eaecf785f6)

Updates `actions/create-github-app-token` from 2.0.2 to 2.0.6
- [Release notes](https://github.com/actions/create-github-app-token/releases)
- [Commits](https://github.com/actions/create-github-app-token/compare/v2.0.2...v2.0.6)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-version: 1.237.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
- dependency-name: actions/create-github-app-token
  dependency-version: 2.0.6
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-05 18:27:10 +00:00
Nick Rolfe
5eb3ed6614 Merge pull request #2887 from github/mergeback/v3.28.17-to-main-60168efe
Mergeback v3.28.17 refs/heads/releases/v3 into main
2025-05-02 11:26:39 +01:00
github-actions[bot]
213a8a5a44 Update checked-in dependencies 2025-05-02 09:30:05 +00:00
github-actions[bot]
c46165d67e Update changelog and version after v3.28.17 2025-05-02 09:27:21 +00:00
Nick Rolfe
60168efe1c Merge pull request #2886 from github/update-v3.28.17-97a2bfd2a
Merge main into releases/v3
2025-05-02 10:26:47 +01:00
github-actions[bot]
0d5a3115da Update changelog for v3.28.17 2025-05-02 09:10:30 +00:00
Nick Rolfe
97a2bfd2a3 Merge pull request #2872 from github/update-bundle/codeql-bundle-v2.21.2
Update default bundle to 2.21.2
2025-05-01 13:31:16 +01:00
Nick Rolfe
9aba20e4c9 Merge branch 'main' into update-bundle/codeql-bundle-v2.21.2 2025-05-01 13:16:31 +01:00
Henry Mercer
81a9508deb Merge pull request #2876 from github/henrymercer/fix-diff-informed-multiple-analyze
Do not fail diff informed analyses when analyze is run twice in the same job
2025-05-01 13:07:58 +01:00
Henry Mercer
1569f4c145 Disable diff-informed queries in code scanning config tests 2025-05-01 12:14:34 +01:00
Henry Mercer
62fbeb66b3 Merge branch 'main' into henrymercer/fix-diff-informed-multiple-analyze 2025-05-01 12:05:02 +01:00
Henry Mercer
f122d1dc9e Address test failures from computing temporary directory too early
These relied on the RUNNER_TEMP environment variable that does not necessarily exist when running locally.
2025-05-01 12:01:22 +01:00
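A minimal sketch of the fix described above: look up `RUNNER_TEMP` only when the temporary directory is actually needed, rather than at module load time, so local test runs without the variable do not fail on import. The helper name and the local fallback are assumptions.

```ts
import * as os from "os";

// On an Actions runner RUNNER_TEMP is always set; fall back to the OS temp
// directory when running locally (assumed fallback, for illustration only).
function getTempDir(): string {
  return process.env.RUNNER_TEMP ?? os.tmpdir();
}
```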
Henry Mercer
083772aae4 Do not fail diff informed analyses when analyze is run twice in the same job 2025-05-01 12:00:46 +01:00
Nick Rolfe
5db14d0471 Merge branch 'main' into update-bundle/codeql-bundle-v2.21.2 2025-05-01 10:28:59 +01:00
Andrew Eisenberg
40e16edda1 Merge pull request #2874 from github/aeisenberg/add-actions-telemetry
Add actions-specific telemetry fields
2025-04-30 08:02:38 -07:00
Andrew Eisenberg
3ca9a88941 Add actions-specific telemetry fields 2025-04-29 16:14:46 -07:00
Henry Mercer
ed51cb5abd Merge pull request #2873 from github/dependabot/npm_and_yarn/npm-a5e2fd638a
build(deps-dev): bump the npm group with 2 updates
2025-04-29 11:36:38 +01:00
Andrew Eisenberg
8ccb6b16a6 Merge pull request #2861 from github/dependabot/github_actions/actions-0553007f0f
build(deps): bump ruby/setup-ruby from 1.229.0 to 1.230.0 in the actions group
2025-04-29 03:21:43 -07:00
github-actions[bot]
1817a33c8b Update checked-in dependencies 2025-04-28 18:49:27 +00:00
dependabot[bot]
6893d12604 build(deps-dev): bump the npm group with 2 updates
Bumps the npm group with 2 updates: [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin) and [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser).


Updates `@typescript-eslint/eslint-plugin` from 8.31.0 to 8.31.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.31.1/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.31.0 to 8.31.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.31.1/packages/parser)

---
updated-dependencies:
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-version: 8.31.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-version: 8.31.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-04-28 18:48:49 +00:00
Henry Mercer
83605b3ce2 Merge pull request #2864 from github/dependabot/npm_and_yarn/npm-cac24ffe08
build(deps): bump the npm group across 1 directory with 7 updates
2025-04-28 18:34:13 +01:00
github-actions[bot]
6a3cfab0e9 Add changelog note 2025-04-28 15:20:43 +00:00
github-actions[bot]
4b7eecf8a7 Update default bundle to codeql-bundle-v2.21.2 2025-04-28 15:20:40 +00:00
Michael B. Gale
018ac1a585 Merge pull request #2834 from github/mbg/private-registry/goproxy
Go: Support `GOPROXY` via the Dependabot proxy
2025-04-28 11:11:41 +01:00
Michael B. Gale
6ad5d99ccc Add goproxy_server to LANGUAGE_TO_REGISTRY_TYPE 2025-04-25 16:56:36 +01:00
Michael B. Gale
f843d94177 Merge pull request #2869 from github/mbg/proxy/use-2.21.1-artifacts
Use proxy artifacts for `v2.21.1`
2025-04-25 16:50:50 +01:00
Michael B. Gale
2264a4ecc1 Merge branch 'main' into mbg/proxy/use-2.21.1-artifacts 2025-04-25 14:25:57 +01:00
Michael B. Gale
d3b65fcaf0 Merge pull request #2870 from github/mbg/ci/retire-ubuntu-20.04
Remove ubuntu-20.04 and add ubuntu-24.04
2025-04-25 14:25:40 +01:00
Michael B. Gale
eea52ddc4e Remove ubuntu-20.04 and add ubuntu-24.04 2025-04-25 13:03:25 +01:00
Michael B. Gale
6ef9b921b1 Use proxy artifacts for v2.21.1 2025-04-24 18:20:31 +01:00
Ian Lynagh
4ffa2364a0 Merge pull request #2867 from github/mergeback/v3.28.16-to-main-28deaeda
Mergeback v3.28.16 refs/heads/releases/v3 into main
2025-04-23 13:34:31 +01:00
github-actions[bot]
7e00290d34 Update checked-in dependencies 2025-04-23 12:17:11 +00:00
github-actions[bot]
259434501f Update changelog and version after v3.28.16 2025-04-23 12:10:49 +00:00
Ian Lynagh
28deaeda66 Merge pull request #2865 from github/update-v3.28.16-2a8cbadc0
Merge main into releases/v3
2025-04-23 13:10:18 +01:00
github-actions[bot]
03c5d71c11 Update changelog for v3.28.16 2025-04-23 10:40:48 +00:00
Ian Lynagh
2a8cbadc02 Merge pull request #2863 from github/update-bundle/codeql-bundle-v2.21.1
Update default bundle to 2.21.1
2025-04-22 12:30:12 +01:00
github-actions[bot]
95d52b7807 Update checked-in dependencies 2025-04-21 18:01:41 +00:00
dependabot[bot]
c9f0d30a86 build(deps): bump the npm group across 1 directory with 7 updates
Bumps the npm group with 7 updates in the / directory:

| Package | From | To |
| --- | --- | --- |
| [@octokit/types](https://github.com/octokit/types.ts) | `13.10.0` | `14.0.0` |
| [long](https://github.com/dcodeIO/long.js) | `5.3.1` | `5.3.2` |
| [octokit](https://github.com/octokit/octokit.js) | `4.1.2` | `4.1.3` |
| [@eslint/js](https://github.com/eslint/eslint/tree/HEAD/packages/js) | `9.24.0` | `9.25.1` |
| [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin) | `8.29.0` | `8.31.0` |
| [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser) | `8.29.0` | `8.31.0` |
| [nock](https://github.com/nock/nock) | `14.0.3` | `14.0.4` |



Updates `@octokit/types` from 13.10.0 to 14.0.0
- [Release notes](https://github.com/octokit/types.ts/releases)
- [Commits](https://github.com/octokit/types.ts/compare/v13.10.0...v14.0.0)

Updates `long` from 5.3.1 to 5.3.2
- [Release notes](https://github.com/dcodeIO/long.js/releases)
- [Commits](https://github.com/dcodeIO/long.js/compare/v5.3.1...v5.3.2)

Updates `octokit` from 4.1.2 to 4.1.3
- [Release notes](https://github.com/octokit/octokit.js/releases)
- [Commits](https://github.com/octokit/octokit.js/compare/v4.1.2...v4.1.3)

Updates `@eslint/js` from 9.24.0 to 9.25.1
- [Release notes](https://github.com/eslint/eslint/releases)
- [Changelog](https://github.com/eslint/eslint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslint/commits/v9.25.1/packages/js)

Updates `@typescript-eslint/eslint-plugin` from 8.29.0 to 8.31.0
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.31.0/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.29.0 to 8.31.0
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.31.0/packages/parser)

Updates `nock` from 14.0.3 to 14.0.4
- [Release notes](https://github.com/nock/nock/releases)
- [Changelog](https://github.com/nock/nock/blob/main/CHANGELOG.md)
- [Commits](https://github.com/nock/nock/compare/v14.0.3...v14.0.4)

---
updated-dependencies:
- dependency-name: "@octokit/types"
  dependency-version: 14.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: npm
- dependency-name: long
  dependency-version: 5.3.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: octokit
  dependency-version: 4.1.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@eslint/js"
  dependency-version: 9.25.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-version: 8.31.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-version: 8.31.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: nock
  dependency-version: 14.0.4
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-04-21 18:01:03 +00:00
github-actions[bot]
f76eaf51a6 Add changelog note 2025-04-16 16:54:18 +00:00
github-actions[bot]
e63b3f5166 Update default bundle to codeql-bundle-v2.21.1 2025-04-16 16:54:11 +00:00
Andrew Eisenberg
c0cffae534 Update checks file 2025-04-14 14:00:02 -07:00
dependabot[bot]
7eaba0dbc6 build(deps): bump ruby/setup-ruby in the actions group
Bumps the actions group with 1 update: [ruby/setup-ruby](https://github.com/ruby/setup-ruby).


Updates `ruby/setup-ruby` from 1.229.0 to 1.230.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](354a1ad156...e5ac7b085f)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-version: 1.230.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-04-14 17:57:59 +00:00
github-actions[bot]
d1c7d49753 Update supported GitHub Enterprise Server versions 2025-04-11 00:16:14 +00:00
Andrew Eisenberg
4c3e536282 Merge pull request #2853 from github/dependabot/npm_and_yarn/npm-7d84c66b66
build(deps-dev): bump the npm group with 3 updates
2025-04-10 16:31:21 -07:00
Nick Fyson
56dd02f26d Merge pull request #2852 from github/dependabot/github_actions/actions-4575878e06
build(deps): bump actions/create-github-app-token from 1.12.0 to 2.0.2 in the actions group
2025-04-09 17:18:03 +01:00
Nick Fyson
192406dd84 Merge branch 'main' into dependabot/github_actions/actions-4575878e06 2025-04-09 16:59:59 +01:00
Nick Fyson
c7dbb2084e Merge pull request #2857 from github/nickfyson/address-vulns
move use of input variables into env vars
2025-04-09 16:05:04 +01:00
nickfyson
9a45cd8c50 move use of input variables into env vars 2025-04-09 14:13:35 +01:00
Andrew Eisenberg
d26c46acea Merge pull request #2855 from github/mergeback/v3.28.15-to-main-45775bd8
Mergeback v3.28.15 refs/heads/releases/v3 into main
2025-04-07 14:48:19 -07:00
github-actions[bot]
51c83e1588 Update checked-in dependencies 2025-04-07 21:34:58 +00:00
github-actions[bot]
8774e3f945 Update changelog and version after v3.28.15 2025-04-07 21:32:19 +00:00
Andrew Eisenberg
45775bd823 Merge pull request #2854 from github/update-v3.28.15-a35ae8c38
Merge main into releases/v3
2025-04-07 14:31:50 -07:00
Andrew Eisenberg
dd78aab407 Update CHANGELOG.md with bug fix details 2025-04-07 14:15:05 -07:00
github-actions[bot]
e40af59174 Update changelog for v3.28.15 2025-04-07 21:05:03 +00:00
Chuan-kai Lin
a35ae8c380 Merge pull request #2843 from github/cklin/diff-informed-compat
Set checkPresence in diff-range data extension
2025-04-07 13:29:16 -07:00
github-actions[bot]
5bddbeb2bf Update checked-in dependencies 2025-04-07 17:59:50 +00:00
dependabot[bot]
c7102cdca1 build(deps-dev): bump the npm group with 3 updates
Bumps the npm group with 3 updates: [@eslint/js](https://github.com/eslint/eslint/tree/HEAD/packages/js), [nock](https://github.com/nock/nock) and [typescript](https://github.com/microsoft/TypeScript).


Updates `@eslint/js` from 9.23.0 to 9.24.0
- [Release notes](https://github.com/eslint/eslint/releases)
- [Changelog](https://github.com/eslint/eslint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslint/commits/v9.24.0/packages/js)

Updates `nock` from 14.0.2 to 14.0.3
- [Release notes](https://github.com/nock/nock/releases)
- [Changelog](https://github.com/nock/nock/blob/main/CHANGELOG.md)
- [Commits](https://github.com/nock/nock/compare/v14.0.2...v14.0.3)

Updates `typescript` from 5.8.2 to 5.8.3
- [Release notes](https://github.com/microsoft/TypeScript/releases)
- [Changelog](https://github.com/microsoft/TypeScript/blob/main/azure-pipelines.release-publish.yml)
- [Commits](https://github.com/microsoft/TypeScript/commits)

---
updated-dependencies:
- dependency-name: "@eslint/js"
  dependency-version: 9.24.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: nock
  dependency-version: 14.0.3
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: typescript
  dependency-version: 5.8.3
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-04-07 17:59:17 +00:00
dependabot[bot]
a1ca4846bc build(deps): bump actions/create-github-app-token in the actions group
Bumps the actions group with 1 update: [actions/create-github-app-token](https://github.com/actions/create-github-app-token).


Updates `actions/create-github-app-token` from 1.12.0 to 2.0.2
- [Release notes](https://github.com/actions/create-github-app-token/releases)
- [Commits](https://github.com/actions/create-github-app-token/compare/v1.12.0...v2.0.2)

---
updated-dependencies:
- dependency-name: actions/create-github-app-token
  dependency-version: 2.0.2
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-04-07 17:53:11 +00:00
Andrew Eisenberg
bb59df6c17 Merge pull request #2842 from github/henrymercer/zip64
Raise the file limit for debug artifacts by producing zip64 files where necessary
2025-04-07 10:50:46 -07:00
Arthur Baars
4b508f5964 Merge pull request #2845 from github/mergeback/v3.28.14-to-main-fc7e4a0f
Mergeback v3.28.14 refs/heads/releases/v3 into main
2025-04-07 13:04:29 +02:00
github-actions[bot]
ca00afb5f1 Update checked-in dependencies 2025-04-07 09:33:21 +00:00
github-actions[bot]
2969c78ce0 Update changelog and version after v3.28.14 2025-04-07 09:27:28 +00:00
Arthur Baars
fc7e4a0fa0 Merge pull request #2844 from github/update-v3.28.14-362ef4ce2
Merge main into releases/v3
2025-04-07 11:26:56 +02:00
github-actions[bot]
be0175c800 Update changelog for v3.28.14 2025-04-07 09:09:01 +00:00
Andrew Eisenberg
a8be43c24e Don't throw error for ENOENT 2025-04-04 13:42:00 -07:00
Chuan-kai Lin
94102d99b0 Set checkPresence in diff-range data extension
This commit updates the diff-range data extension to use the new
checkPresence field being introduced in CodeQL CLI 2.21.0, so that
diff-informed analysis no longer fails when a query pack does not have
the restrictAlertsTo extensible predicate.
2025-04-04 08:41:50 -07:00
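A minimal sketch of the data extension change above. Only the `checkPresence` field and the `restrictAlertsTo` extensible predicate come from the commit message; the pack name, YAML layout, and helper name are illustrative assumptions.

```ts
interface DiffRange {
  path: string;
  startLine: number;
  endLine: number;
}

// Render a diff-range data extension as YAML text.
function renderDiffRangeExtension(ranges: DiffRange[]): string {
  const rows = ranges
    .map((r) => `      - ["${r.path}", ${r.startLine}, ${r.endLine}]`)
    .join("\n");
  return [
    "extensions:",
    "  - addsTo:",
    "      pack: codeql/util", // assumed pack name
    "      extensible: restrictAlertsTo",
    // With CodeQL CLI 2.21.0+, checkPresence: false means the extension is
    // ignored rather than causing an error when the predicate does not exist.
    "      checkPresence: false",
    "    data:",
    rows,
  ].join("\n");
}
```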
github-actions[bot]
fd8685f16e Update checked-in dependencies 2025-04-04 13:46:53 +00:00
Henry Mercer
56feaac968 Raise file limit in debug artifacts by using zip64 2025-04-04 14:40:53 +01:00
Arthur Baars
362ef4ce20 Merge pull request #2838 from github/update-bundle/codeql-bundle-v2.21.0
Update default bundle to 2.21.0
2025-04-03 15:40:24 +02:00
Arthur Baars
2b85c00718 Merge branch 'main' into update-bundle/codeql-bundle-v2.21.0 2025-04-03 15:28:09 +02:00
Angela P Wen
41aa437638 Merge pull request #2841 from github/angelapwen/log-init-post-telemetry
Add logs around status report telemetry in `init-post` step
2025-04-03 14:51:03 +02:00
Angela P Wen
92864f48b0 Add logs around status report telemetry in init-post step 2025-04-03 14:37:27 +02:00
Fotis Koutoulakis
e13fe0dd2d Merge pull request #2833 from github/NlightNFotis/reclassify_upload_sarif_issues
feat: further error re-classification
2025-04-02 20:09:36 +01:00
Fotis Koutoulakis
06703ce3e5 Merge branch 'main' into NlightNFotis/reclassify_upload_sarif_issues 2025-04-02 19:06:45 +01:00
Fotis Koutoulakis (@NlightNFotis)
676a422916 review-comments: nest validateSarifFileSchema into try-catch block to better discriminate error thrown 2025-04-02 19:06:31 +01:00
Fotis Koutoulakis (@NlightNFotis)
498c7f37e8 review-comments: unwrap error in upload-sarif-action and re-classify as ConfigurationError if in known error category 2025-04-02 15:20:03 +01:00
Fotis Koutoulakis (@NlightNFotis)
efd29bef22 refactor: revert getActionsStatus taking an extra argument 2025-04-02 15:13:00 +01:00
Angela P Wen
dab8a02091 Merge pull request #2836 from github/dependabot/github_actions/actions-02c935407f
build(deps): bump the actions group with 2 updates
2025-04-02 14:57:29 +02:00
Angela P Wen
10771737a9 Merge pull request #2840 from github/dependabot/npm_and_yarn/npm-05c8aca45e
build(deps-dev): bump the npm group across 1 directory with 4 updates
2025-04-02 14:56:55 +02:00
Angela P Wen
17379bcd20 Manually update PR check workflow 2025-04-02 14:43:55 +02:00
github-actions[bot]
dbb232a3d8 Update checked-in dependencies 2025-04-02 12:43:14 +00:00
dependabot[bot]
4b72bef651 build(deps-dev): bump the npm group across 1 directory with 4 updates
Bumps the npm group with 4 updates in the / directory: [@types/semver](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/semver), [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin), [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser) and [nock](https://github.com/nock/nock).


Updates `@types/semver` from 7.5.8 to 7.7.0
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/semver)

Updates `@typescript-eslint/eslint-plugin` from 8.28.0 to 8.29.0
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.29.0/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.28.0 to 8.29.0
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.29.0/packages/parser)

Updates `nock` from 14.0.1 to 14.0.2
- [Release notes](https://github.com/nock/nock/releases)
- [Changelog](https://github.com/nock/nock/blob/main/CHANGELOG.md)
- [Commits](https://github.com/nock/nock/compare/v14.0.1...v14.0.2)

---
updated-dependencies:
- dependency-name: "@types/semver"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: nock
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-04-02 12:42:37 +00:00
Fotis Koutoulakis (@NlightNFotis)
b53826d56d review-comments: remove syntax-error handling for SARIF from upload-lib 2025-04-01 15:10:16 +01:00
Fotis Koutoulakis (@NlightNFotis)
55ee663d5f review-comments: refactor getActionsStatus to accept an extra parameter designating if the analysis is third-party 2025-04-01 14:58:59 +01:00
github-actions[bot]
a27e401674 Add changelog note 2025-04-01 13:51:07 +00:00
github-actions[bot]
a69f5113b7 Update default bundle to codeql-bundle-v2.21.0 2025-04-01 13:51:03 +00:00
dependabot[bot]
b6f76bd566 build(deps): bump the actions group with 2 updates
Bumps the actions group with 2 updates: [ruby/setup-ruby](https://github.com/ruby/setup-ruby) and [actions/create-github-app-token](https://github.com/actions/create-github-app-token).


Updates `ruby/setup-ruby` from 1.227.0 to 1.229.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](1a615958ad...354a1ad156)

Updates `actions/create-github-app-token` from 1.11.7 to 1.12.0
- [Release notes](https://github.com/actions/create-github-app-token/releases)
- [Commits](https://github.com/actions/create-github-app-token/compare/v1.11.7...v1.12.0)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
- dependency-name: actions/create-github-app-token
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-31 17:16:14 +00:00
Fotis Koutoulakis (@NlightNFotis)
01f1a1f2c9 Merge branch 'main' into NlightNFotis/reclassify_upload_sarif_issues 2025-03-31 16:29:02 +01:00
Chuan-kai Lin
efffb483ec Merge pull request #2831 from github/cklin/diff-informed-query-filtering
Respect `exclude-from-incremental` query tag for diff-informed analysis
2025-03-31 08:00:50 -07:00
Fotis Koutoulakis (@NlightNFotis)
f21cf0bbd7 feat: reclassify InvalidSarifUploadError as a user-error when final status report is produced 2025-03-31 12:22:18 +01:00
Fotis Koutoulakis (@NlightNFotis)
72a2b1295e feat: classify some observed SARIF errors as InvalidSarifUploadError 2025-03-31 12:17:23 +01:00
Fotis Koutoulakis (@NlightNFotis)
a022653e2d feat: classify more HTTP errors as configuration errors in api-client 2025-03-31 11:54:16 +01:00
Fotis Koutoulakis (@NlightNFotis)
3c42562190 fix: update comment for test to state correct expected outcome 2025-03-31 11:51:11 +01:00
Chuan-kai Lin
e4ca874973 build: refresh js files 2025-03-28 12:30:40 -07:00
Chuan-kai Lin
e7f67e2e61 Redefine shouldPerformDiffInformedAnalysis()
This commit renames the original shouldPerformDiffInformedAnalysis(),
which returns `PullRequestBranches | undefined`, to
getDiffInformedAnalysisBranches(). It also adds a new
shouldPerformDiffInformedAnalysis() function that returns boolean.

Separating these two functions makes it clear what the intended uses and
return values should be for each.
2025-03-28 12:29:28 -07:00
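A self-contained sketch of the split described in the commit message above. The parameter lists are simplified placeholders; only the function names and their return types (`PullRequestBranches | undefined` versus `boolean`) come from the commit.

```ts
interface PullRequestBranches {
  base: string;
  head: string;
}

// Renamed from the original shouldPerformDiffInformedAnalysis(): returns the
// PR branches when a diff-informed analysis can be performed, else undefined.
function getDiffInformedAnalysisBranches(
  baseRef: string | undefined,
  headRef: string | undefined,
): PullRequestBranches | undefined {
  if (baseRef && headRef) {
    return { base: baseRef, head: headRef };
  }
  return undefined;
}

// New boolean wrapper for callers that only need a yes/no answer.
function shouldPerformDiffInformedAnalysis(
  baseRef: string | undefined,
  headRef: string | undefined,
): boolean {
  return getDiffInformedAnalysisBranches(baseRef, headRef) !== undefined;
}
```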
Fotis Koutoulakis
9f45e7498b Merge pull request #2832 from github/NlightNFotis/fix_config_error_classification
fix: change regex matching for API error to not contain regex boundaries
2025-03-28 15:18:02 +00:00
Fotis Koutoulakis (@NlightNFotis)
73c938dbc0 fix: fix issue where wrapApiConfigurationError would fail to regex match a string due to boundary constraints on the regex 2025-03-28 14:38:06 +00:00
Fotis Koutoulakis (@NlightNFotis)
2be6da694a test: add tests for the wrapApiConfigurationError function 2025-03-28 14:37:10 +00:00
Fotis Koutoulakis (@NlightNFotis)
76f9ed9cd9 test: add tests to validate getActionsStatus' behaviour 2025-03-28 14:37:10 +00:00
Chuan-kai Lin
71ab101d38 Set default query filter for diff-informed analysis 2025-03-27 14:06:40 -07:00
Chuan-kai Lin
da967b1ade AugmentationProperties: add defaultQueryFilters
This commit adds a defaultQueryFilters field to AugmentationProperties
and incorporates its value into the augmented Code Scanning config.
However, in this commit defaultQueryFilters is always empty, so there is
not yet any actual behavior change.
2025-03-27 13:44:47 -07:00
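A minimal sketch of the field addition above; the filter type and merge order are assumptions, and only the `defaultQueryFilters` field name and its "empty for now" behaviour come from the commit message.

```ts
type QueryFilter = Record<string, unknown>;

interface AugmentationProperties {
  // ...other augmentation fields elided...
  defaultQueryFilters: QueryFilter[];
}

// Incorporate the default filters into the augmented Code Scanning config.
// While defaultQueryFilters is always empty, the result is unchanged.
function combineQueryFilters(
  augmentation: AugmentationProperties,
  userFilters: QueryFilter[],
): QueryFilter[] {
  return [...augmentation.defaultQueryFilters, ...userFilters];
}
```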
Chuan-kai Lin
3c4533916b Call shouldPerformDiffInformedAnalysis() outside setupDiffInformedQueryRun() 2025-03-27 10:27:24 -07:00
Chuan-kai Lin
1994ea768e Move shouldPerformDiffInformedAnalysis() 2025-03-27 10:27:24 -07:00
Chuan-kai Lin
534bc63d5e Rename diff-filtering-utils.ts to diff-informed-analysis-utils.ts 2025-03-27 10:27:23 -07:00
Chuan-kai Lin
3fbee52426 Extract shouldPerformDiffInformedAnalysis() 2025-03-27 10:27:23 -07:00
Chuan-kai Lin
9bd18b486f Merge pull request #2830 from github/cklin/code-scanning-repo
getFileDiffsWithBasehead(): use CODE_SCANNING_REPOSITORY if present
2025-03-27 10:25:27 -07:00
Chuan-kai Lin
0afd488dc1 build: refresh js files 2025-03-27 08:50:55 -07:00
Chuan-kai Lin
c1fc897eb2 getFileDiffsWithBasehead(): use CODE_SCANNING_REPOSITORY if present 2025-03-27 08:50:31 -07:00
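A minimal sketch of the repository selection described by this commit and the `getRepositoryNwo()` commits below: prefer `CODE_SCANNING_REPOSITORY` when present, otherwise fall back to the workflow's own repository. The fallback to `GITHUB_REPOSITORY` and the exact helper shapes are assumptions.

```ts
interface RepositoryNwo {
  owner: string;
  repo: string;
}

function parseRepositoryNwo(nwo: string): RepositoryNwo {
  const [owner, repo] = nwo.split("/");
  if (!owner || !repo) {
    throw new Error(`Invalid repository name with owner: ${nwo}`);
  }
  return { owner, repo };
}

// Repository used for diff queries such as getFileDiffsWithBasehead().
function getRepositoryNwo(): RepositoryNwo {
  const nwo =
    process.env.CODE_SCANNING_REPOSITORY || process.env.GITHUB_REPOSITORY || "";
  return parseRepositoryNwo(nwo);
}
```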
Chuan-kai Lin
f88459c0a3 Use getRepositoryNwo() 2025-03-26 10:18:40 -07:00
Chuan-kai Lin
b22f3341fe Add getRepositoryNwo() helper functions 2025-03-26 08:11:16 -07:00
Henry Mercer
486ab5a292 Merge pull request #2827 from github/dependabot/npm_and_yarn/npm-6956921c2d
build(deps): bump the npm group with 8 updates
2025-03-24 21:40:41 +00:00
github-actions[bot]
5275714183 Update checked-in dependencies 2025-03-24 21:18:42 +00:00
dependabot[bot]
08e5c8d618 build(deps): bump the npm group with 8 updates
Bumps the npm group with 8 updates:

| Package | From | To |
| --- | --- | --- |
| [@actions/cache](https://github.com/actions/toolkit/tree/HEAD/packages/cache) | `4.0.2` | `4.0.3` |
| [@octokit/types](https://github.com/octokit/types.ts) | `13.8.0` | `13.10.0` |
| [@eslint/eslintrc](https://github.com/eslint/eslintrc) | `3.3.0` | `3.3.1` |
| [@eslint/js](https://github.com/eslint/eslint/tree/HEAD/packages/js) | `9.22.0` | `9.23.0` |
| [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin) | `8.26.1` | `8.28.0` |
| [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser) | `8.26.1` | `8.28.0` |
| [eslint-import-resolver-typescript](https://github.com/import-js/eslint-import-resolver-typescript) | `3.8.3` | `3.8.7` |
| [sinon](https://github.com/sinonjs/sinon) | `19.0.2` | `20.0.0` |


Updates `@actions/cache` from 4.0.2 to 4.0.3
- [Changelog](https://github.com/actions/toolkit/blob/main/packages/cache/RELEASES.md)
- [Commits](https://github.com/actions/toolkit/commits/HEAD/packages/cache)

Updates `@octokit/types` from 13.8.0 to 13.10.0
- [Release notes](https://github.com/octokit/types.ts/releases)
- [Commits](https://github.com/octokit/types.ts/compare/v13.8.0...v13.10.0)

Updates `@eslint/eslintrc` from 3.3.0 to 3.3.1
- [Release notes](https://github.com/eslint/eslintrc/releases)
- [Changelog](https://github.com/eslint/eslintrc/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslintrc/compare/v3.3.0...v3.3.1)

Updates `@eslint/js` from 9.22.0 to 9.23.0
- [Release notes](https://github.com/eslint/eslint/releases)
- [Changelog](https://github.com/eslint/eslint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslint/commits/v9.23.0/packages/js)

Updates `@typescript-eslint/eslint-plugin` from 8.26.1 to 8.28.0
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.28.0/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.26.1 to 8.28.0
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.28.0/packages/parser)

Updates `eslint-import-resolver-typescript` from 3.8.3 to 3.8.7
- [Release notes](https://github.com/import-js/eslint-import-resolver-typescript/releases)
- [Changelog](https://github.com/import-js/eslint-import-resolver-typescript/blob/master/CHANGELOG.md)
- [Commits](https://github.com/import-js/eslint-import-resolver-typescript/compare/v3.8.3...v3.8.7)

Updates `sinon` from 19.0.2 to 20.0.0
- [Release notes](https://github.com/sinonjs/sinon/releases)
- [Changelog](https://github.com/sinonjs/sinon/blob/main/docs/changelog.md)
- [Commits](https://github.com/sinonjs/sinon/compare/v19.0.2...v20.0.0)

---
updated-dependencies:
- dependency-name: "@actions/cache"
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@octokit/types"
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@eslint/eslintrc"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@eslint/js"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: eslint-import-resolver-typescript
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: sinon
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-24 21:18:05 +00:00
Andrew Eisenberg
be853de3b7 Merge pull request #2822 from github/dependabot/github_actions/actions-cbe19e082f
build(deps): bump the actions group with 2 updates
2025-03-24 12:03:54 -07:00
Andrew Eisenberg
502426aa6b Also update checks/rubocop-multi-language.yml 2025-03-24 11:50:24 -07:00
github-actions[bot]
4cdde5c397 Rebuild 2025-03-24 18:43:49 +00:00
dependabot[bot]
6ceaf4460c build(deps): bump the actions group with 2 updates
Bumps the actions group with 2 updates: [ruby/setup-ruby](https://github.com/ruby/setup-ruby) and [actions/create-github-app-token](https://github.com/actions/create-github-app-token).


Updates `ruby/setup-ruby` from 1.226.0 to 1.227.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](922ebc4c52...1a615958ad)

Updates `actions/create-github-app-token` from 1.11.6 to 1.11.7
- [Release notes](https://github.com/actions/create-github-app-token/releases)
- [Commits](https://github.com/actions/create-github-app-token/compare/v1.11.6...v1.11.7)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
- dependency-name: actions/create-github-app-token
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-24 18:06:50 +00:00
Chuan-kai Lin
f15aac3db1 Merge pull request #2820 from github/mergeback/v3.28.13-to-main-1b549b92
Mergeback v3.28.13 refs/heads/releases/v3 into main
2025-03-24 07:41:49 -07:00
github-actions[bot]
e149e39832 Update checked-in dependencies 2025-03-24 13:48:13 +00:00
github-actions[bot]
f313d62247 Update changelog and version after v3.28.13 2025-03-24 13:43:41 +00:00
Chuan-kai Lin
1b549b9259 Merge pull request #2819 from github/update-v3.28.13-e0ea14102
Merge main into releases/v3
2025-03-24 06:42:41 -07:00
github-actions[bot]
82630c85f3 Update changelog for v3.28.13 2025-03-24 13:18:07 +00:00
Chuan-kai Lin
e0ea141027 Merge pull request #2818 from github/cklin/empty-pr-diff-range
Diff-informed analysis: fix empty PR handling
2025-03-21 16:04:38 -07:00
Chuan-kai Lin
b361a91508 Diff-informed analysis: fix empty PR handling 2025-03-21 14:18:25 -07:00
Chuan-kai Lin
bd1d9ab4ed Merge pull request #2816 from github/cklin/overlay-file-list
Overlay databases: use --overlay-changes
2025-03-21 12:30:26 -07:00
Chuan-kai Lin
b98ae6ca52 Add overlay-database-utils tests 2025-03-21 11:31:28 -07:00
Chuan-kai Lin
9825184a0a Add getFileOidsUnderPath() tests 2025-03-21 10:53:21 -07:00
Chuan-kai Lin
ac67cffe5c Merge pull request #2817 from github/cklin/default-setup-diff-informed
Support diff-informed queries under Default Setup
2025-03-21 09:47:20 -07:00
Chuan-kai Lin
9c674ba4f5 build: refresh js files 2025-03-21 09:25:30 -07:00
Chuan-kai Lin
d109dd5d33 Detect PR branches for Default Setup 2025-03-21 09:25:08 -07:00
Chuan-kai Lin
3e5446c3d2 Introduce PullRequestBranches 2025-03-21 09:24:16 -07:00
Chuan-kai Lin
6adda79888 Move PR branch detection into setupDiffInformedQueryRun() 2025-03-20 09:51:17 -07:00
Chuan-kai Lin
6be6984cc1 Overlay databases: use --overlay-changes
This commit changes overlay database creation to use the
--overlay-changes flag. It also implements Git-based file change
detection to generate the list of files to extract for the overlay
database.
2025-03-19 11:38:45 -07:00
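A minimal sketch of the Git-based change detection described above. Only the `--overlay-changes` flag name comes from the commit message; the helper names and the exact shape of the flag's argument are assumptions.

```ts
import { execFileSync } from "child_process";
import { writeFileSync } from "fs";
import { join } from "path";

// List files that changed relative to the overlay-base commit.
function getChangedFiles(checkoutPath: string, baseCommit: string): string[] {
  const output = execFileSync(
    "git",
    ["diff", "--name-only", baseCommit, "HEAD"],
    { cwd: checkoutPath, encoding: "utf8" },
  );
  return output.split("\n").filter((line) => line.length > 0);
}

// Write the change list to a file and return the flag to pass when creating
// the overlay database (argument format assumed).
function getOverlayChangesFlag(
  checkoutPath: string,
  baseCommit: string,
  tempDir: string,
): string {
  const changesFile = join(tempDir, "overlay-changes.txt");
  writeFileSync(changesFile, getChangedFiles(checkoutPath, baseCommit).join("\n"));
  return `--overlay-changes=${changesFile}`;
}
```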
Andrew Eisenberg
c50c157cc3 Merge pull request #2813 from github/NlightNFotis/enhance_justfile
build: sync some utility just instructions I had locally
2025-03-19 10:57:36 -07:00
Fotis Koutoulakis
c74c378e29 Update justfile
Co-authored-by: Andrew Eisenberg <aeisenberg@github.com>
2025-03-19 17:11:02 +00:00
Fotis Koutoulakis
d271bde0ec Update justfile
Co-authored-by: Andrew Eisenberg <aeisenberg@github.com>
2025-03-19 17:10:52 +00:00
Chris Smowton
df9f80e0f0 Merge pull request #2815 from github/mergeback/v3.28.12-to-main-5f8171a6
Mergeback v3.28.12 refs/heads/releases/v3 into main
2025-03-19 13:42:24 +00:00
github-actions[bot]
46371933a7 Update checked-in dependencies 2025-03-19 12:43:51 +00:00
github-actions[bot]
ee6a063cbd Update changelog and version after v3.28.12 2025-03-19 12:41:18 +00:00
Chris Smowton
5f8171a638 Merge pull request #2814 from github/update-v3.28.12-6349095d1
Merge main into releases/v3
2025-03-19 12:40:51 +00:00
github-actions[bot]
bb59f7707d Update changelog for v3.28.12 2025-03-19 12:17:24 +00:00
Fotis Koutoulakis (@NlightNFotis)
8b0dccd066 build: sync some utility just instructions I had locally 2025-03-19 11:56:11 +00:00
Chris Smowton
6349095d19 Merge pull request #2810 from github/update-bundle/codeql-bundle-v2.20.7
Update default bundle to 2.20.7
2025-03-18 12:35:37 +00:00
github-actions[bot]
d7d03fda12 Add changelog note 2025-03-18 12:21:54 +00:00
github-actions[bot]
4e3a5342c5 Update default bundle to codeql-bundle-v2.20.7 2025-03-18 12:21:54 +00:00
Michael B. Gale
55f023701c Merge pull request #2802 from github/mbg/dependency-caching/java-buildless
Set and cache dependency directory for Java `build-mode: none`
2025-03-18 10:28:36 +00:00
Angela P Wen
6a151cd774 Merge pull request #2811 from github/dependabot/github_actions/actions-c2c311daa1
build(deps): bump ruby/setup-ruby from 1.222.0 to 1.226.0 in the actions group
2025-03-17 12:15:27 -07:00
Angela P Wen
7866bcdb1b Manually bump workflow to match autogenerated file 2025-03-17 12:00:05 -07:00
dependabot[bot]
611289e0b0 build(deps): bump ruby/setup-ruby in the actions group
Bumps the actions group with 1 update: [ruby/setup-ruby](https://github.com/ruby/setup-ruby).


Updates `ruby/setup-ruby` from 1.222.0 to 1.226.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](277ba2a127...922ebc4c52)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-17 18:11:32 +00:00
Michael B. Gale
4c409a5b66 Remove temporary dependency directory in analyze post action 2025-03-17 11:34:09 +00:00
Andrew Eisenberg
70df9def86 Merge pull request #2808 from github/aeisenberg/fix-dependabot
Fix dependabot errors
2025-03-14 13:49:58 -07:00
Andrew Eisenberg
5f98c40063 Fix dependabot errors
I explicitly had to downgrade "@octokit/plugin-retry" to "^6.0.0". Other
dependencies were upgraded.
2025-03-14 13:13:56 -07:00
Chuan-kai Lin
f338ec87a3 Merge pull request #2806 from github/cklin/delete-unused-git-utils
git-utils: deleted unused functions
2025-03-13 11:51:05 -07:00
Chuan-kai Lin
c31f6c89e8 git-utils: deleted unused functions 2025-03-13 10:45:14 -07:00
Michael B. Gale
251c7fdf5d Update changelog 2025-03-13 11:50:11 +00:00
Michael B. Gale
afa3ed33bb Add more documentation 2025-03-13 11:45:27 +00:00
Michael B. Gale
f8367fb063 Set and cache dependency directory for Java build-mode: none 2025-03-13 11:39:39 +00:00
Andrew Eisenberg
dc49dcabdb Merge pull request #2800 from github/aeisenberg/remove-minimatch
Minimally remove micromatch
2025-03-11 16:01:07 -07:00
Andrew Eisenberg
7254660adc Merge pull request #2804 from github/dependabot/github_actions/actions-96d25c356e
build(deps): bump ruby/setup-ruby from 1.221.0 to 1.222.0 in the actions group
2025-03-11 08:53:45 -07:00
Chuan-kai Lin
13f2f96cdd Merge pull request #2801 from github/cklin/overlay-databases
Basic support for overlay databases
2025-03-11 08:33:33 -07:00
Chuan-kai Lin
0efe12d12c build: refresh js files 2025-03-10 13:31:46 -07:00
Chuan-kai Lin
ff5f0b9efd Support overlay database creation
This commit adds support for creating overlay-base and overlay
databases, controlled via the CODEQL_OVERLAY_DATABASE_MODE environment
variable.
2025-03-10 13:25:46 -07:00
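A minimal sketch of reading the environment variable named above. The set of mode values is an assumption; only the `CODEQL_OVERLAY_DATABASE_MODE` variable name comes from the commit message.

```ts
type OverlayDatabaseMode = "none" | "overlay-base" | "overlay";

// Map the environment variable to a mode, defaulting to no overlay behaviour.
function getOverlayDatabaseMode(): OverlayDatabaseMode {
  switch (process.env.CODEQL_OVERLAY_DATABASE_MODE) {
    case "overlay-base":
      return "overlay-base";
    case "overlay":
      return "overlay";
    default:
      return "none";
  }
}
```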
Chuan-kai Lin
270886f805 Pass overlay mode into databaseInitCluster()
This commit adds a OverlayDatabaseMode parameter to
databaseInitCluster(). The parameter controls the "codeql database init"
flags concerning overlay database creation.

There is no behavior change in this commit because we always pass
OverlayDatabaseMode.None to databaseInitCluster(). That will change in
the next commit.
2025-03-10 13:22:24 -07:00
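A minimal sketch of the parameter threading described above. The mode is modelled as the same string union as in the previous sketch, the other parameters are placeholders, and the overlay-specific flags are deliberately left abstract since they are not shown in this log.

```ts
type OverlayDatabaseMode = "none" | "overlay-base" | "overlay";

async function databaseInitCluster(
  dbLocation: string,
  sourceRoot: string,
  overlayDatabaseMode: OverlayDatabaseMode, // new parameter
): Promise<void> {
  const extraArgs: string[] = [];
  if (overlayDatabaseMode !== "none") {
    // Overlay-related `codeql database init` flags would be added to extraArgs
    // here; they are not shown in this log, so this branch stays abstract.
  }
  console.log(
    `codeql database init --source-root ${sourceRoot} ${extraArgs.join(" ")} ${dbLocation}`,
  );
}

// All current call sites pass "none", so this commit changes no behaviour:
// await databaseInitCluster("/path/to/db", process.cwd(), "none");
```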
Andrew Eisenberg
d3762699d1 Update pr-check 2025-03-10 11:22:58 -07:00
Henry Mercer
b46b37a8a3 Merge pull request #2803 from github/dependabot/npm_and_yarn/npm-129f0c3752
build(deps-dev): bump the npm group with 3 updates
2025-03-10 18:01:08 +00:00
dependabot[bot]
aecf01557d build(deps): bump ruby/setup-ruby in the actions group
Bumps the actions group with 1 update: [ruby/setup-ruby](https://github.com/ruby/setup-ruby).


Updates `ruby/setup-ruby` from 1.221.0 to 1.222.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](32110d4e31...277ba2a127)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-10 17:57:35 +00:00
github-actions[bot]
053e2184a0 Update checked-in dependencies 2025-03-10 17:42:57 +00:00
dependabot[bot]
248ab9b811 build(deps-dev): bump the npm group with 3 updates
Bumps the npm group with 3 updates: [@eslint/js](https://github.com/eslint/eslint/tree/HEAD/packages/js), [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin) and [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser).


Updates `@eslint/js` from 9.21.0 to 9.22.0
- [Release notes](https://github.com/eslint/eslint/releases)
- [Changelog](https://github.com/eslint/eslint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslint/commits/v9.22.0/packages/js)

Updates `@typescript-eslint/eslint-plugin` from 8.26.0 to 8.26.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.26.1/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.26.0 to 8.26.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.26.1/packages/parser)

---
updated-dependencies:
- dependency-name: "@eslint/js"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-10 17:42:05 +00:00
Chuan-kai Lin
d76f393713 Do not set --expect-discarded-cache on "cleanup-level: overlay"
When a user specifies "cleanup-level: overlay", it suggests that the
user wishes to preserve the evaluation cache for future use. So in this
case we should not set --expect-discarded-cache when running queries.
2025-03-10 10:32:13 -07:00
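A minimal sketch of the condition described above: only pass `--expect-discarded-cache` when the user is not preserving the evaluation cache for overlay use. Apart from the flag and the "overlay" cleanup level, the names here are assumptions.

```ts
// Flags to add when running queries, based on the requested cleanup level.
function getQueryRunFlags(cleanupLevel: string): string[] {
  const flags: string[] = [];
  if (cleanupLevel !== "overlay") {
    // The evaluation cache will be discarded after the run, so tell the
    // evaluator not to keep it in a reusable state.
    flags.push("--expect-discarded-cache");
  }
  return flags;
}
```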
Andrew Eisenberg
88676f2b14 Minimally remove micromatch 2025-03-07 10:07:08 -08:00
Chuan-kai Lin
b2e6519679 Merge pull request #2799 from github/mergeback/v3.28.11-to-main-6bb031af
Mergeback v3.28.11 refs/heads/releases/v3 into main
2025-03-07 08:34:57 -08:00
github-actions[bot]
ff91c9db25 Update checked-in dependencies 2025-03-07 16:12:00 +00:00
github-actions[bot]
d1b3f740d8 Update changelog and version after v3.28.11 2025-03-07 16:09:54 +00:00
Chuan-kai Lin
6bb031afdd Merge pull request #2798 from github/update-v3.28.11-56b25d5d5
Merge main into releases/v3
2025-03-07 08:09:23 -08:00
github-actions[bot]
6bca7dd940 Update changelog for v3.28.11 2025-03-07 14:28:04 +00:00
Chuan-kai Lin
56b25d5d52 Merge pull request #2793 from github/update-bundle/codeql-bundle-v2.20.6
Update default bundle to 2.20.6
2025-03-06 07:12:12 -08:00
Chuan-kai Lin
256aa16582 Merge branch 'main' into update-bundle/codeql-bundle-v2.20.6 2025-03-06 06:59:38 -08:00
Nick Fyson
911d845ab6 Merge pull request #2796 from github/nickfyson/adjust-rate-error-string
adjust string for handling rate limit error
2025-03-06 10:45:00 +00:00
nickfyson
7b7ed63503 adjust string for handling rate limit error 2025-03-06 10:33:25 +00:00
Henry Mercer
608ccd6cd9 Merge pull request #2794 from github/update-supported-enterprise-server-versions
Update supported GitHub Enterprise Server versions
2025-03-05 14:41:52 +00:00
github-actions[bot]
35d04d3627 Update supported GitHub Enterprise Server versions 2025-03-05 00:15:30 +00:00
Chuan-kai Lin
ec3b22164b Update supported GitHub Enterprise Server versions 2025-03-03 13:06:35 -08:00
github-actions[bot]
8dc01f6342 Add changelog note 2025-03-03 20:54:07 +00:00
github-actions[bot]
b378daf0bc Update default bundle to codeql-bundle-v2.20.6 2025-03-03 20:54:03 +00:00
3873 changed files with 858691 additions and 559140 deletions

View File

@@ -29,24 +29,27 @@ runs:
   - id: get-url
     name: Determine URL
     shell: bash
+    env:
+      VERSION: ${{ inputs.version }}
+      USE_ALL_PLATFORM_BUNDLE: ${{ inputs.use-all-platform-bundle }}
     run: |
       set -e # Fail this Action if `gh release list` fails.
-      if [[ ${{ inputs.version }} == "linked" ]]; then
+      if [[ "$VERSION" == "linked" ]]; then
        echo "tools-url=linked" >> "$GITHUB_OUTPUT"
        exit 0
-      elif [[ ${{ inputs.version }} == "default" ]]; then
+      elif [[ "$VERSION" == "default" ]]; then
        echo "tools-url=" >> "$GITHUB_OUTPUT"
        exit 0
       fi
-      if [[ ${{ inputs.version }} == "nightly-latest" && "$RUNNER_OS" != "Windows" ]]; then
+      if [[ "$VERSION" == "nightly-latest" && "$RUNNER_OS" != "Windows" ]]; then
        extension="tar.zst"
       else
        extension="tar.gz"
       fi
-      if [[ ${{ inputs.use-all-platform-bundle }} == "true" ]]; then
+      if [[ "$USE_ALL_PLATFORM_BUNDLE" == "true" ]]; then
        artifact_name="codeql-bundle.$extension"
       elif [[ "$RUNNER_OS" == "Linux" ]]; then
        artifact_name="codeql-bundle-linux64.$extension"
@@ -59,14 +62,14 @@
        exit 1
       fi
-      if [[ ${{ inputs.version }} == "nightly-latest" ]]; then
+      if [[ "$VERSION" == "nightly-latest" ]]; then
        tag=`gh release list --repo dsp-testing/codeql-cli-nightlies -L 1 | cut -f 3`
        echo "tools-url=https://github.com/dsp-testing/codeql-cli-nightlies/releases/download/$tag/$artifact_name" >> $GITHUB_OUTPUT
-      elif [[ ${{ inputs.version }} == *"nightly"* ]]; then
-        version=`echo ${{ inputs.version }} | sed -e 's/^.*\-//'`
+      elif [[ "$VERSION" == *"nightly"* ]]; then
+        version=`echo "$VERSION" | sed -e 's/^.*\-//'`
        echo "tools-url=https://github.com/dsp-testing/codeql-cli-nightlies/releases/download/codeql-bundle-$version/$artifact_name" >> $GITHUB_OUTPUT
-      elif [[ ${{ inputs.version }} == *"stable"* ]]; then
-        version=`echo ${{ inputs.version }} | sed -e 's/^.*\-//'`
+      elif [[ "$VERSION" == *"stable"* ]]; then
+        version=`echo "$VERSION" | sed -e 's/^.*\-//'`
        echo "tools-url=https://github.com/github/codeql-action/releases/download/codeql-bundle-$version/$artifact_name" >> $GITHUB_OUTPUT
       else
        echo "::error::Unrecognized version specified!"

View File

@@ -18,8 +18,11 @@ runs:
   using: "composite"
   steps:
     - id: branches
+      env:
+        MAJOR_VERSION: ${{ inputs.major_version }}
+        LATEST_TAG: ${{ inputs.latest_tag }}
       run: |
         python ${{ github.action_path }}/release-branches.py \
-          --major-version ${{ inputs.major_version }} \
-          --latest-tag ${{ inputs.latest_tag }}
+          --major-version "$MAJOR_VERSION" \
+          --latest-tag "$LATEST_TAG"
       shell: bash

View File

@@ -46,7 +46,7 @@ jobs:
          use-all-platform-bundle: 'false'
          setup-kotlin: 'true'
      - name: Set up Ruby
-        uses: ruby/setup-ruby@32110d4e311bd8996b2a82bf2a43b714ccc91777 # v1.221.0
+        uses: ruby/setup-ruby@e5ac7b085f6e63d49c8973eb0c6e04d876b881f1 # v1.230.0
        with:
          ruby-version: 2.6
      - name: Install Code Scanning integration

View File

@@ -75,7 +75,7 @@ jobs:
    strategy:
      fail-fast: false
      matrix:
-        os: [ubuntu-20.04,ubuntu-22.04,windows-2019,windows-2022,macos-13,macos-14]
+        os: [ubuntu-22.04,ubuntu-24.04,windows-2019,windows-2022,macos-13,macos-14]
        tools: ${{ fromJson(needs.check-codeql-versions.outputs.versions) }}
    runs-on: ${{ matrix.os }}

View File

@@ -3,6 +3,9 @@
 name: Code-Scanning config CLI tests
 env:
   GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+  # Diff informed queries add an additional query filter which is not yet
+  # taken into account by these tests.
+  CODEQL_ACTION_DIFF_INFORMED_QUERIES: false
 on:
   push:

View File

@@ -168,7 +168,7 @@ jobs:
           --draft
      - name: Generate token
-        uses: actions/create-github-app-token@v1.11.6
+        uses: actions/create-github-app-token@v2.0.6
        id: app-token
        with:
          app-id: ${{ vars.AUTOMATION_APP_ID }}

View File

@@ -124,7 +124,7 @@ jobs:
      pull-requests: write # needed to create pull request
    steps:
      - name: Generate token
-        uses: actions/create-github-app-token@v1.11.6
+        uses: actions/create-github-app-token@v2.0.6
        id: app-token
        with:
          app-id: ${{ vars.AUTOMATION_APP_ID }}

View File

@@ -2,10 +2,45 @@
See the [releases page](https://github.com/github/codeql-action/releases) for the relevant changes to the CodeQL CLI and language packs.
## [UNRELEASED]
## 3.28.20 - 21 July 2025
- Remove support for combining SARIF files from a single upload for GHES 3.18, see [the changelog post](https://github.blog/changelog/2024-05-06-code-scanning-will-stop-combining-runs-from-a-single-upload/). [#2959](https://github.com/github/codeql-action/pull/2959)
## 3.28.18 - 16 May 2025
- Update default CodeQL bundle version to 2.21.3. [#2893](https://github.com/github/codeql-action/pull/2893)
- Skip validating SARIF produced by CodeQL for improved performance. [#2894](https://github.com/github/codeql-action/pull/2894)
- The number of threads and amount of RAM used by CodeQL can now be set via the `CODEQL_THREADS` and `CODEQL_RAM` runner environment variables. If set, these environment variables override the `threads` and `ram` inputs respectively. [#2891](https://github.com/github/codeql-action/pull/2891)
## 3.28.17 - 02 May 2025
- Update default CodeQL bundle version to 2.21.2. [#2872](https://github.com/github/codeql-action/pull/2872)
## 3.28.16 - 23 Apr 2025
- Update default CodeQL bundle version to 2.21.1. [#2863](https://github.com/github/codeql-action/pull/2863)
## 3.28.15 - 07 Apr 2025
- Fix bug where the action would fail if it tried to produce a debug artifact with more than 65535 files. [#2842](https://github.com/github/codeql-action/pull/2842)
## 3.28.14 - 07 Apr 2025
- Update default CodeQL bundle version to 2.21.0. [#2838](https://github.com/github/codeql-action/pull/2838)
## 3.28.13 - 24 Mar 2025
No user facing changes.
## 3.28.12 - 19 Mar 2025
- Dependency caching should now cache more dependencies for Java `build-mode: none` extractions. This should speed up workflows and avoid inconsistent alerts in some cases.
- Update default CodeQL bundle version to 2.20.7. [#2810](https://github.com/github/codeql-action/pull/2810)
## 3.28.11 - 07 Mar 2025
- Update default CodeQL bundle version to 2.20.6. [#2793](https://github.com/github/codeql-action/pull/2793)
## 3.28.10 - 21 Feb 2025
- Update default CodeQL bundle version to 2.20.5. [#2772](https://github.com/github/codeql-action/pull/2772)
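As a usage illustration for the `CODEQL_THREADS` and `CODEQL_RAM` entry under 3.28.18 above, a minimal workflow sketch might set the variables at the job level; the values, step layout, and action reference here are assumptions rather than content from this diff.

# Minimal sketch: runner environment variables that override the threads/ram inputs when set
env:
  CODEQL_THREADS: "4"
  CODEQL_RAM: "6144"
steps:
  - uses: github/codeql-action/analyze@v3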


@@ -70,10 +70,11 @@ We typically release new minor versions of the CodeQL Action and Bundle when a n
| Minimum CodeQL Action | Minimum CodeQL Bundle Version | GitHub Environment | Notes |
|-----------------------|-------------------------------|--------------------|-------|
| `v3.26.6` | `2.18.4` | Enterprise Server 3.15 | |
| `v3.25.11` | `2.17.6` | Enterprise Server 3.14 | |
| `v3.24.11` | `2.16.6` | Enterprise Server 3.13 | |
| `v3.22.12` | `2.15.5` | Enterprise Server 3.12 | |
| `v3.28.12` | `2.20.7` | Enterprise Server 3.17 | |
| `v3.28.6` | `2.20.3` | Enterprise Server 3.16 | |
| `v3.28.6` | `2.20.3` | Enterprise Server 3.15 | |
| `v3.28.6` | `2.20.3` | Enterprise Server 3.14 | |
| `v3.28.6` | `2.20.3` | Enterprise Server 3.13 | |
See the full list of GHES release and deprecation dates at [GitHub Enterprise Server releases](https://docs.github.com/en/enterprise-server/admin/all-releases#releases-of-github-enterprise-server).
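To make the updated table above concrete, a workflow targeting GHES 3.17 would need at least the listed minimum Action version; the exact pin, job layout, and language below are illustrative assumptions (GHES setups often use the Action version bundled with the instance instead).

# Illustrative pin matching the GHES 3.17 row of the table above
steps:
  - uses: github/codeql-action/init@v3.28.12
    with:
      languages: javascript
  - uses: github/codeql-action/analyze@v3.28.12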


@@ -3,7 +3,7 @@ all: lint sync
# Lint source typescript
lint:
npm run lint -- --fix
npm run lint-fix
# Sync generated files (javascript and PR checks)
sync: build update-pr-checks
@@ -15,3 +15,16 @@ update-pr-checks:
# Transpile typescript code into javascript
build:
npm run build
# Build then run all the tests
test: build
npm run test
# Run the tests for a single file
test_file filename: build
npx ava --verbose {{filename}}
[doc("Refresh the .js build artefacts in the lib directory")]
[confirm]
refresh-lib:
rm -rf lib && npm run build


@@ -38,12 +38,14 @@ Object.defineProperty(exports, "__esModule", { value: true });
* It will run after all the steps in this job, in reverse order in relation to
* other `post:` hooks.
*/
const fs = __importStar(require("fs"));
const core = __importStar(require("@actions/core"));
const actionsUtil = __importStar(require("./actions-util"));
const api_client_1 = require("./api-client");
const codeql_1 = require("./codeql");
const config_utils_1 = require("./config-utils");
const debugArtifacts = __importStar(require("./debug-artifacts"));
const dependency_caching_1 = require("./dependency-caching");
const environment_1 = require("./environment");
const logging_1 = require("./logging");
const util_1 = require("./util");
@@ -63,6 +65,18 @@ async function runWrapper() {
await debugArtifacts.uploadCombinedSarifArtifacts(logger, config.gitHubVersion.type, version.version);
}
}
// If we analysed Java in build-mode: none, we may have downloaded dependencies
// to the temp directory. Clean these up so they don't persist unnecessarily
// long on self-hosted runners.
const javaTempDependencyDir = (0, dependency_caching_1.getJavaTempDependencyDir)();
if (fs.existsSync(javaTempDependencyDir)) {
try {
fs.rmSync(javaTempDependencyDir, { recursive: true });
}
catch (error) {
logger.info(`Failed to remove temporary Java dependencies directory: ${(0, util_1.getErrorMessage)(error)}`);
}
}
}
catch (error) {
core.setFailed(`analyze post-action step failed: ${(0, util_1.getErrorMessage)(error)}`);


@@ -1 +1 @@
{"version":3,"file":"analyze-action-post.js","sourceRoot":"","sources":["../src/analyze-action-post.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA;;;;GAIG;AACH,oDAAsC;AAEtC,4DAA8C;AAC9C,6CAAgD;AAChD,qCAAqC;AACrC,iDAA2C;AAC3C,kEAAoD;AACpD,+CAAuC;AACvC,uCAA6C;AAC7C,iCAAoE;AAEpE,KAAK,UAAU,UAAU;IACvB,IAAI,CAAC;QACH,WAAW,CAAC,aAAa,EAAE,CAAC;QAC5B,MAAM,MAAM,GAAG,IAAA,0BAAgB,GAAE,CAAC;QAClC,MAAM,aAAa,GAAG,MAAM,IAAA,6BAAgB,GAAE,CAAC;QAC/C,IAAA,gCAAyB,EAAC,aAAa,EAAE,MAAM,CAAC,CAAC;QAEjD,kFAAkF;QAClF,wFAAwF;QACxF,IAAI,OAAO,CAAC,GAAG,CAAC,oBAAM,CAAC,mBAAmB,CAAC,KAAK,MAAM,EAAE,CAAC;YACvD,MAAM,MAAM,GAAG,MAAM,IAAA,wBAAS,EAC5B,WAAW,CAAC,qBAAqB,EAAE,EACnC,MAAM,CACP,CAAC;YACF,IAAI,MAAM,KAAK,SAAS,EAAE,CAAC;gBACzB,MAAM,MAAM,GAAG,MAAM,IAAA,kBAAS,EAAC,MAAM,CAAC,SAAS,CAAC,CAAC;gBACjD,MAAM,OAAO,GAAG,MAAM,MAAM,CAAC,UAAU,EAAE,CAAC;gBAC1C,MAAM,cAAc,CAAC,4BAA4B,CAC/C,MAAM,EACN,MAAM,CAAC,aAAa,CAAC,IAAI,EACzB,OAAO,CAAC,OAAO,CAChB,CAAC;YACJ,CAAC;QACH,CAAC;IACH,CAAC;IAAC,OAAO,KAAK,EAAE,CAAC;QACf,IAAI,CAAC,SAAS,CACZ,oCAAoC,IAAA,sBAAe,EAAC,KAAK,CAAC,EAAE,CAC7D,CAAC;IACJ,CAAC;AACH,CAAC;AAED,KAAK,UAAU,EAAE,CAAC"}
{"version":3,"file":"analyze-action-post.js","sourceRoot":"","sources":["../src/analyze-action-post.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA;;;;GAIG;AACH,uCAAyB;AAEzB,oDAAsC;AAEtC,4DAA8C;AAC9C,6CAAgD;AAChD,qCAAqC;AACrC,iDAA2C;AAC3C,kEAAoD;AACpD,6DAAgE;AAChE,+CAAuC;AACvC,uCAA6C;AAC7C,iCAAoE;AAEpE,KAAK,UAAU,UAAU;IACvB,IAAI,CAAC;QACH,WAAW,CAAC,aAAa,EAAE,CAAC;QAC5B,MAAM,MAAM,GAAG,IAAA,0BAAgB,GAAE,CAAC;QAClC,MAAM,aAAa,GAAG,MAAM,IAAA,6BAAgB,GAAE,CAAC;QAC/C,IAAA,gCAAyB,EAAC,aAAa,EAAE,MAAM,CAAC,CAAC;QAEjD,kFAAkF;QAClF,wFAAwF;QACxF,IAAI,OAAO,CAAC,GAAG,CAAC,oBAAM,CAAC,mBAAmB,CAAC,KAAK,MAAM,EAAE,CAAC;YACvD,MAAM,MAAM,GAAG,MAAM,IAAA,wBAAS,EAC5B,WAAW,CAAC,qBAAqB,EAAE,EACnC,MAAM,CACP,CAAC;YACF,IAAI,MAAM,KAAK,SAAS,EAAE,CAAC;gBACzB,MAAM,MAAM,GAAG,MAAM,IAAA,kBAAS,EAAC,MAAM,CAAC,SAAS,CAAC,CAAC;gBACjD,MAAM,OAAO,GAAG,MAAM,MAAM,CAAC,UAAU,EAAE,CAAC;gBAC1C,MAAM,cAAc,CAAC,4BAA4B,CAC/C,MAAM,EACN,MAAM,CAAC,aAAa,CAAC,IAAI,EACzB,OAAO,CAAC,OAAO,CAChB,CAAC;YACJ,CAAC;QACH,CAAC;QAED,+EAA+E;QAC/E,4EAA4E;QAC5E,+BAA+B;QAC/B,MAAM,qBAAqB,GAAG,IAAA,6CAAwB,GAAE,CAAC;QACzD,IAAI,EAAE,CAAC,UAAU,CAAC,qBAAqB,CAAC,EAAE,CAAC;YACzC,IAAI,CAAC;gBACH,EAAE,CAAC,MAAM,CAAC,qBAAqB,EAAE,EAAE,SAAS,EAAE,IAAI,EAAE,CAAC,CAAC;YACxD,CAAC;YAAC,OAAO,KAAK,EAAE,CAAC;gBACf,MAAM,CAAC,IAAI,CACT,2DAA2D,IAAA,sBAAe,EAAC,KAAK,CAAC,EAAE,CACpF,CAAC;YACJ,CAAC;QACH,CAAC;IACH,CAAC;IAAC,OAAO,KAAK,EAAE,CAAC;QACf,IAAI,CAAC,SAAS,CACZ,oCAAoC,IAAA,sBAAe,EAAC,KAAK,CAAC,EAAE,CAC7D,CAAC;IACJ,CAAC;AACH,CAAC;AAED,KAAK,UAAU,EAAE,CAAC"}

lib/analyze-action.js (generated)

@@ -41,7 +41,6 @@ const fs = __importStar(require("fs"));
const path_1 = __importDefault(require("path"));
const perf_hooks_1 = require("perf_hooks");
const core = __importStar(require("@actions/core"));
const github = __importStar(require("@actions/github"));
const actionsUtil = __importStar(require("./actions-util"));
const analyze_1 = require("./analyze");
const api_client_1 = require("./api-client");
@@ -51,6 +50,7 @@ const codeql_1 = require("./codeql");
const config_utils_1 = require("./config-utils");
const database_upload_1 = require("./database-upload");
const dependency_caching_1 = require("./dependency-caching");
const diff_informed_analysis_utils_1 = require("./diff-informed-analysis-utils");
const environment_1 = require("./environment");
const feature_flags_1 = require("./feature-flags");
const languages_1 = require("./languages");
@@ -189,22 +189,24 @@ async function run() {
const outputDir = actionsUtil.getRequiredInput("output");
core.exportVariable(environment_1.EnvVar.SARIF_RESULTS_OUTPUT_DIR, outputDir);
const threads = util.getThreadsFlag(actionsUtil.getOptionalInput("threads") || process.env["CODEQL_THREADS"], logger);
const repositoryNwo = (0, repository_1.parseRepositoryNwo)(util.getRequiredEnvParam("GITHUB_REPOSITORY"));
const repositoryNwo = (0, repository_1.getRepositoryNwo)();
const gitHubVersion = await (0, api_client_1.getGitHubVersion)();
util.checkActionVersion(actionsUtil.getActionVersion(), gitHubVersion);
const features = new feature_flags_1.Features(gitHubVersion, repositoryNwo, actionsUtil.getTemporaryDirectory(), logger);
const memory = util.getMemoryFlag(actionsUtil.getOptionalInput("ram") || process.env["CODEQL_RAM"], logger);
const pull_request = github.context.payload.pull_request;
const diffRangePackDir = pull_request &&
(await (0, analyze_1.setupDiffInformedQueryRun)(pull_request.base.ref, pull_request.head.label, codeql, logger, features));
const branches = await (0, diff_informed_analysis_utils_1.getDiffInformedAnalysisBranches)(codeql, features, logger);
const diffRangePackDir = branches
? await (0, analyze_1.setupDiffInformedQueryRun)(branches, logger)
: undefined;
await (0, analyze_1.warnIfGoInstalledAfterInit)(config, logger);
await runAutobuildIfLegacyGoWorkflow(config, logger);
dbCreationTimings = await (0, analyze_1.runFinalize)(outputDir, threads, memory, codeql, config, logger);
const cleanupLevel = actionsUtil.getOptionalInput("cleanup-level") || "brutal";
if (actionsUtil.getRequiredInput("skip-queries") !== "true") {
runStats = await (0, analyze_1.runQueries)(outputDir, memory, util.getAddSnippetsFlag(actionsUtil.getRequiredInput("add-snippets")), threads, diffRangePackDir, actionsUtil.getOptionalInput("category"), config, logger, features);
runStats = await (0, analyze_1.runQueries)(outputDir, memory, util.getAddSnippetsFlag(actionsUtil.getRequiredInput("add-snippets")), threads, cleanupLevel, diffRangePackDir, actionsUtil.getOptionalInput("category"), config, logger, features);
}
if (actionsUtil.getOptionalInput("cleanup-level") !== "none") {
await (0, analyze_1.runCleanup)(config, actionsUtil.getOptionalInput("cleanup-level") || "brutal", logger);
if (cleanupLevel !== "none") {
await (0, analyze_1.runCleanup)(config, cleanupLevel, logger);
}
const dbLocations = {};
for (const language of config.languages) {
@@ -238,7 +240,7 @@ async function run() {
}
else if (uploadResult !== undefined &&
actionsUtil.getRequiredInput("wait-for-processing") === "true") {
await uploadLib.waitForProcessing((0, repository_1.parseRepositoryNwo)(util.getRequiredEnvParam("GITHUB_REPOSITORY")), uploadResult.sarifID, (0, logging_1.getActionsLogger)());
await uploadLib.waitForProcessing((0, repository_1.getRepositoryNwo)(), uploadResult.sarifID, (0, logging_1.getActionsLogger)());
}
// If we did not throw an error yet here, but we expect one, throw it.
if (actionsUtil.getOptionalInput("expect-error") === "true") {

File diff suppressed because one or more lines are too long

lib/analyze.js (generated)

@@ -54,15 +54,16 @@ const actionsUtil = __importStar(require("./actions-util"));
const api_client_1 = require("./api-client");
const autobuild_1 = require("./autobuild");
const codeql_1 = require("./codeql");
const dependency_caching_1 = require("./dependency-caching");
const diagnostics_1 = require("./diagnostics");
const diff_filtering_utils_1 = require("./diff-filtering-utils");
const diff_informed_analysis_utils_1 = require("./diff-informed-analysis-utils");
const environment_1 = require("./environment");
const feature_flags_1 = require("./feature-flags");
const languages_1 = require("./languages");
const logging_1 = require("./logging");
const repository_1 = require("./repository");
const tools_features_1 = require("./tools-features");
const tracer_config_1 = require("./tracer-config");
const upload_lib_1 = require("./upload-lib");
const util = __importStar(require("./util"));
const util_1 = require("./util");
class CodeQLAnalysisError extends Error {
@@ -102,6 +103,14 @@ async function runExtraction(codeql, config, logger) {
config.buildMode === util_1.BuildMode.Autobuild) {
await (0, autobuild_1.setupCppAutobuild)(codeql, logger);
}
// The Java `build-mode: none` extractor places dependencies (.jar files) in the
// database scratch directory by default. For dependency caching purposes, we want
// a stable path that caches can be restored into and that we can cache at the
// end of the workflow (i.e. that does not get removed when the scratch directory is).
if (language === languages_1.Language.java && config.buildMode === util_1.BuildMode.None) {
process.env["CODEQL_EXTRACTOR_JAVA_OPTION_BUILDLESS_DEPENDENCY_DIR"] =
(0, dependency_caching_1.getJavaTempDependencyDir)();
}
await codeql.extractUsingBuildMode(config, language);
}
else {
@@ -152,21 +161,13 @@ async function finalizeDatabaseCreation(codeql, config, threadsFlag, memoryFlag,
/**
* Set up the diff-informed analysis feature.
*
* @param baseRef The base branch name, used for calculating the diff range.
* @param headLabel The label that uniquely identifies the head branch across
* repositories, used for calculating the diff range.
* @param codeql
* @param logger
* @param features
* @returns Absolute path to the directory containing the extension pack for
* the diff range information, or `undefined` if the feature is disabled.
*/
async function setupDiffInformedQueryRun(baseRef, headLabel, codeql, logger, features) {
if (!(await features.getValue(feature_flags_1.Feature.DiffInformedQueries, codeql))) {
return undefined;
}
async function setupDiffInformedQueryRun(branches, logger) {
return await (0, logging_1.withGroupAsync)("Generating diff range extension pack", async () => {
const diffRanges = await getPullRequestEditedDiffRanges(baseRef, headLabel, logger);
logger.info(`Calculating diff ranges for ${branches.base}...${branches.head}`);
const diffRanges = await getPullRequestEditedDiffRanges(branches, logger);
const packDir = writeDiffRangeDataExtensionPack(logger, diffRanges);
if (packDir === undefined) {
logger.warning("Cannot create diff range extension pack for diff-informed queries; " +
@@ -181,17 +182,15 @@ async function setupDiffInformedQueryRun(baseRef, headLabel, codeql, logger, fea
/**
* Return the file line ranges that were added or modified in the pull request.
*
* @param baseRef The base branch name, used for calculating the diff range.
* @param headLabel The label that uniquely identifies the head branch across
* repositories, used for calculating the diff range.
* @param branches The base and head branches of the pull request.
* @param logger
* @returns An array of tuples, where each tuple contains the absolute path of a
* file, the start line and the end line (both 1-based and inclusive) of an
* added or modified range in that file. Returns `undefined` if the action was
* not triggered by a pull request or if there was an error.
*/
async function getPullRequestEditedDiffRanges(baseRef, headLabel, logger) {
const fileDiffs = await getFileDiffsWithBasehead(baseRef, headLabel, logger);
async function getPullRequestEditedDiffRanges(branches, logger) {
const fileDiffs = await getFileDiffsWithBasehead(branches, logger);
if (fileDiffs === undefined) {
return undefined;
}
@@ -214,15 +213,15 @@ async function getPullRequestEditedDiffRanges(baseRef, headLabel, logger) {
}
return results;
}
async function getFileDiffsWithBasehead(baseRef, headLabel, logger) {
const ownerRepo = util.getRequiredEnvParam("GITHUB_REPOSITORY").split("/");
const owner = ownerRepo[0];
const repo = ownerRepo[1];
const basehead = `${baseRef}...${headLabel}`;
async function getFileDiffsWithBasehead(branches, logger) {
// Check CODE_SCANNING_REPOSITORY first. If it is empty or not set, fall back
// to GITHUB_REPOSITORY.
const repositoryNwo = (0, repository_1.getRepositoryNwoFromEnv)("CODE_SCANNING_REPOSITORY", "GITHUB_REPOSITORY");
const basehead = `${branches.base}...${branches.head}`;
try {
const response = await (0, api_client_1.getApiClient)().rest.repos.compareCommitsWithBasehead({
owner,
repo,
owner: repositoryNwo.owner,
repo: repositoryNwo.repo,
basehead,
per_page: 1,
});
@@ -334,8 +333,21 @@ function writeDiffRangeDataExtensionPack(logger, ranges) {
if (ranges === undefined) {
return undefined;
}
if (ranges.length === 0) {
// An empty diff range means that there are no added or modified lines in
// the pull request. But the `restrictAlertsTo` extensible predicate
// interprets an empty data extension differently, as an indication that
// all alerts should be included. So we need to specifically set the diff
// range to a non-empty list that cannot match any alert location.
ranges = [{ path: "", startLine: 0, endLine: 0 }];
}
const diffRangeDir = path.join(actionsUtil.getTemporaryDirectory(), "pr-diff-range");
fs.mkdirSync(diffRangeDir);
// We expect the Actions temporary directory to already exist, so are mainly
// using `recursive: true` to avoid errors if the directory already exists,
// for example if the analyze Action is run multiple times in the same job.
// This is not really something that is supported, but we make use of it in
// tests.
fs.mkdirSync(diffRangeDir, { recursive: true });
fs.writeFileSync(path.join(diffRangeDir, "qlpack.yml"), `
name: codeql-action/pr-diff-range
version: 0.0.0
@@ -350,6 +362,7 @@ extensions:
- addsTo:
pack: codeql/util
extensible: restrictAlertsTo
checkPresence: false
data:
`;
let data = ranges
@@ -371,24 +384,25 @@ extensions:
logger.debug(`Wrote pr-diff-range extension pack to ${extensionFilePath}:\n${extensionContents}`);
// Write the diff ranges to a JSON file, for action-side alert filtering by the
// upload-lib module.
(0, diff_filtering_utils_1.writeDiffRangesJsonFile)(logger, ranges);
(0, diff_informed_analysis_utils_1.writeDiffRangesJsonFile)(logger, ranges);
return diffRangeDir;
}
// Runs queries and creates sarif files in the given folder
async function runQueries(sarifFolder, memoryFlag, addSnippetsFlag, threadsFlag, diffRangePackDir, automationDetailsId, config, logger, features) {
async function runQueries(sarifFolder, memoryFlag, addSnippetsFlag, threadsFlag, cleanupLevel, diffRangePackDir, automationDetailsId, config, logger, features) {
const statusReport = {};
const queryFlags = [memoryFlag, threadsFlag];
if (cleanupLevel !== "overlay") {
queryFlags.push("--expect-discarded-cache");
}
statusReport.analysis_is_diff_informed = diffRangePackDir !== undefined;
const dataExtensionFlags = diffRangePackDir
? [
`--additional-packs=${diffRangePackDir}`,
"--extension-packs=codeql-action/pr-diff-range",
]
: [];
if (diffRangePackDir) {
queryFlags.push(`--additional-packs=${diffRangePackDir}`);
queryFlags.push("--extension-packs=codeql-action/pr-diff-range");
}
const sarifRunPropertyFlag = diffRangePackDir
? "--sarif-run-property=incrementalMode=diff-informed"
: undefined;
const codeql = await (0, codeql_1.getCodeQL)(config.codeQLCmd);
const queryFlags = [memoryFlag, threadsFlag, ...dataExtensionFlags];
for (const language of config.languages) {
try {
const sarifFile = path.join(sarifFolder, `${language}.sarif`);
@@ -414,7 +428,7 @@ async function runQueries(sarifFolder, memoryFlag, addSnippetsFlag, threadsFlag,
logger.endGroup();
logger.info(analysisSummary);
if (await features.getValue(feature_flags_1.Feature.QaTelemetryEnabled)) {
const perQueryAlertCounts = getPerQueryAlertCounts(sarifFile, logger);
const perQueryAlertCounts = getPerQueryAlertCounts(sarifFile);
const perQueryAlertCountEventReport = {
event: "codeql database interpret-results",
started_at: startTimeInterpretResults.toISOString(),
@@ -442,8 +456,7 @@ async function runQueries(sarifFolder, memoryFlag, addSnippetsFlag, threadsFlag,
return await codeql.databaseInterpretResults(databasePath, queries, sarifFile, addSnippetsFlag, threadsFlag, enableDebugLogging ? "-vv" : "-v", sarifRunPropertyFlag, automationDetailsId, config, features);
}
/** Get an object with all queries and their counts parsed from a SARIF file path. */
function getPerQueryAlertCounts(sarifPath, log) {
(0, upload_lib_1.validateSarifFileSchema)(sarifPath, log);
function getPerQueryAlertCounts(sarifPath) {
const sarifObject = JSON.parse(fs.readFileSync(sarifPath, "utf8"));
// We do not need to compute fingerprints because we are not sending data based off of locations.
// Generate the query: alert count object

File diff suppressed because one or more lines are too long

lib/analyze.test.js (generated)

@@ -114,7 +114,7 @@ const util = __importStar(require("./util"));
fs.mkdirSync(util.getCodeQLDatabasePath(config, language), {
recursive: true,
});
const statusReport = await (0, analyze_1.runQueries)(tmpDir, memoryFlag, addSnippetsFlag, threadsFlag, undefined, undefined, config, (0, logging_1.getRunnerLogger)(true), (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.QaTelemetryEnabled]));
const statusReport = await (0, analyze_1.runQueries)(tmpDir, memoryFlag, addSnippetsFlag, threadsFlag, "brutal", undefined, undefined, config, (0, logging_1.getRunnerLogger)(true), (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.QaTelemetryEnabled]));
t.deepEqual(Object.keys(statusReport).sort(), [
"analysis_is_diff_informed",
`analyze_builtin_queries_${language}_duration_ms`,

File diff suppressed because one or more lines are too long

lib/api-client.js (generated)

@@ -122,14 +122,12 @@ async function getGitHubVersion() {
* Get the path of the currently executing workflow relative to the repository root.
*/
async function getWorkflowRelativePath() {
const repo_nwo = (0, util_1.getRequiredEnvParam)("GITHUB_REPOSITORY").split("/");
const owner = repo_nwo[0];
const repo = repo_nwo[1];
const repo_nwo = (0, repository_1.getRepositoryNwo)();
const run_id = Number((0, util_1.getRequiredEnvParam)("GITHUB_RUN_ID"));
const apiClient = getApiClient();
const runsResponse = await apiClient.request("GET /repos/:owner/:repo/actions/runs/:run_id?exclude_pull_requests=true", {
owner,
repo,
owner: repo_nwo.owner,
repo: repo_nwo.repo,
run_id,
});
const workflowUrl = runsResponse.data.workflow_url;
@@ -187,7 +185,7 @@ function computeAutomationID(analysis_key, environment) {
}
/** List all Actions cache entries matching the provided key and ref. */
async function listActionsCaches(key, ref) {
const repositoryNwo = (0, repository_1.parseRepositoryNwo)((0, util_1.getRequiredEnvParam)("GITHUB_REPOSITORY"));
const repositoryNwo = (0, repository_1.getRepositoryNwo)();
return await getApiClient().paginate("GET /repos/{owner}/{repo}/actions/caches", {
owner: repositoryNwo.owner,
repo: repositoryNwo.repo,
@@ -197,7 +195,7 @@ async function listActionsCaches(key, ref) {
}
/** Delete an Actions cache item by its ID. */
async function deleteActionsCache(id) {
const repositoryNwo = (0, repository_1.parseRepositoryNwo)((0, util_1.getRequiredEnvParam)("GITHUB_REPOSITORY"));
const repositoryNwo = (0, repository_1.getRepositoryNwo)();
await getApiClient().rest.actions.deleteActionsCacheById({
owner: repositoryNwo.owner,
repo: repositoryNwo.repo,
@@ -206,11 +204,16 @@ async function deleteActionsCache(id) {
}
function wrapApiConfigurationError(e) {
if ((0, util_1.isHTTPError)(e)) {
if (e.message.includes("API rate limit exceeded for site ID installation") ||
if (e.message.includes("API rate limit exceeded for installation") ||
e.message.includes("commit not found") ||
/^ref .* not found in this repository$/.test(e.message)) {
e.message.includes("Resource not accessible by integration") ||
/ref .* not found in this repository/.test(e.message)) {
return new util_1.ConfigurationError(e.message);
}
else if (e.message.includes("Bad credentials") ||
e.message.includes("Not Found")) {
return new util_1.ConfigurationError("Please check that your token is valid and has the required permissions: contents: read, security-events: write");
}
}
return e;
}

File diff suppressed because one or more lines are too long

lib/api-client.test.js (generated)

@@ -120,4 +120,40 @@ function mockGetMetaVersionHeader(versionHeader) {
});
t.deepEqual({ type: util.GitHubVariant.GHE_DOTCOM }, gheDotcom);
});
(0, ava_1.default)("wrapApiConfigurationError correctly wraps specific configuration errors", (t) => {
// We don't reclassify arbitrary errors
const arbitraryError = new Error("arbitrary error");
let res = api.wrapApiConfigurationError(arbitraryError);
t.is(res, arbitraryError);
// Same goes for errors that are already configuration errors
const configError = new util.ConfigurationError("arbitrary error");
res = api.wrapApiConfigurationError(configError);
t.is(res, configError);
// If an HTTP error doesn't contain a specific error message, we don't
// wrap it as an API error.
const httpError = new util.HTTPError("arbitrary HTTP error", 456);
res = api.wrapApiConfigurationError(httpError);
t.is(res, httpError);
// For other HTTP errors, we wrap them as Configuration errors if they contain
// specific error messages.
const httpNotFoundError = new util.HTTPError("commit not found", 404);
res = api.wrapApiConfigurationError(httpNotFoundError);
t.deepEqual(res, new util.ConfigurationError("commit not found"));
const refNotFoundError = new util.HTTPError("ref 'refs/heads/jitsi' not found in this repository - https://docs.github.com/rest", 404);
res = api.wrapApiConfigurationError(refNotFoundError);
t.deepEqual(res, new util.ConfigurationError("ref 'refs/heads/jitsi' not found in this repository - https://docs.github.com/rest"));
const apiRateLimitError = new util.HTTPError("API rate limit exceeded for installation", 403);
res = api.wrapApiConfigurationError(apiRateLimitError);
t.deepEqual(res, new util.ConfigurationError("API rate limit exceeded for installation"));
const tokenSuggestionMessage = "Please check that your token is valid and has the required permissions: contents: read, security-events: write";
const badCredentialsError = new util.HTTPError("Bad credentials", 401);
res = api.wrapApiConfigurationError(badCredentialsError);
t.deepEqual(res, new util.ConfigurationError(tokenSuggestionMessage));
const notFoundError = new util.HTTPError("Not Found", 404);
res = api.wrapApiConfigurationError(notFoundError);
t.deepEqual(res, new util.ConfigurationError(tokenSuggestionMessage));
const resourceNotAccessibleError = new util.HTTPError("Resource not accessible by integration", 403);
res = api.wrapApiConfigurationError(resourceNotAccessibleError);
t.deepEqual(res, new util.ConfigurationError("Resource not accessible by integration"));
});
//# sourceMappingURL=api-client.test.js.map

File diff suppressed because one or more lines are too long


@@ -1 +1 @@
{ "maximumVersion": "3.16", "minimumVersion": "3.12" }
{ "maximumVersion": "3.17", "minimumVersion": "3.13" }

lib/autobuild.js (generated)

@@ -123,7 +123,7 @@ async function setupCppAutobuild(codeql, logger) {
const envVar = feature_flags_1.featureConfig[feature_flags_1.Feature.CppDependencyInstallation].envVar;
const featureName = "C++ automatic installation of dependencies";
const gitHubVersion = await (0, api_client_1.getGitHubVersion)();
const repositoryNwo = (0, repository_1.parseRepositoryNwo)((0, util_1.getRequiredEnvParam)("GITHUB_REPOSITORY"));
const repositoryNwo = (0, repository_1.getRepositoryNwo)();
const features = new feature_flags_1.Features(gitHubVersion, repositoryNwo, (0, actions_util_1.getTemporaryDirectory)(), logger);
if (await features.getValue(feature_flags_1.Feature.CppDependencyInstallation, codeql)) {
// disable autoinstall on self-hosted runners unless explicitly requested


@@ -1 +1 @@
{"version":3,"file":"autobuild.js","sourceRoot":"","sources":["../src/autobuild.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAeA,kEAkGC;AAED,8CAqCC;AAED,oCAsBC;AAhLD,oDAAsC;AAEtC,iDAA6E;AAC7E,6CAAgD;AAChD,qCAA6C;AAE7C,uCAAmC;AACnC,+CAAuC;AACvC,mDAAmE;AACnE,2CAAyD;AAEzD,6CAAkD;AAClD,qDAAgD;AAChD,iCAAwD;AAEjD,KAAK,UAAU,2BAA2B,CAC/C,MAAc,EACd,MAA0B,EAC1B,MAAc;IAEd,IACE,CAAC,MAAM,CAAC,SAAS,KAAK,gBAAS,CAAC,IAAI;QAClC,CAAC,MAAM,MAAM,CAAC,eAAe,CAAC,6BAAY,CAAC,wBAAwB,CAAC,CAAC,CAAC;QACxE,MAAM,CAAC,SAAS,KAAK,gBAAS,CAAC,MAAM,EACrC,CAAC;QACD,MAAM,CAAC,IAAI,CACT,qBAAqB,MAAM,CAAC,SAAS,2BAA2B;YAC9D,OAAO,gBAAM,CAAC,kBAAkB,wBAAwB,CAC3D,CAAC;QACF,OAAO,SAAS,CAAC;IACnB,CAAC;IAED,0CAA0C;IAC1C,mFAAmF;IACnF,oFAAoF;IACpF,4EAA4E;IAC5E,MAAM,kBAAkB,GAAG,MAAM,CAAC,SAAS,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,EAAE,CACvD,IAAA,4BAAgB,EAAC,CAAC,CAAC,CACpB,CAAC;IAEF,IAAI,CAAC,kBAAkB,EAAE,CAAC;QACxB,MAAM,CAAC,IAAI,CACT,iEAAiE,CAClE,CAAC;QACF,OAAO,SAAS,CAAC;IACnB,CAAC;IAED;;;;;;;;;;;;;;;;;;;;;;;;;;OA0BG;IACH,MAAM,2BAA2B,GAAG,kBAAkB,CAAC,MAAM,CAC3D,CAAC,CAAC,EAAE,EAAE,CAAC,CAAC,KAAK,oBAAQ,CAAC,EAAE,CACzB,CAAC;IAEF,MAAM,SAAS,GAAe,EAAE,CAAC;IACjC,yEAAyE;IACzE,UAAU;IACV,IAAI,2BAA2B,CAAC,CAAC,CAAC,KAAK,SAAS,EAAE,CAAC;QACjD,SAAS,CAAC,IAAI,CAAC,2BAA2B,CAAC,CAAC,CAAC,CAAC,CAAC;IACjD,CAAC;IACD,uEAAuE;IACvE,wCAAwC;IACxC,IAAI,kBAAkB,CAAC,MAAM,KAAK,2BAA2B,CAAC,MAAM,EAAE,CAAC;QACrE,SAAS,CAAC,IAAI,CAAC,oBAAQ,CAAC,EAAE,CAAC,CAAC;IAC9B,CAAC;IAED,MAAM,CAAC,KAAK,CAAC,kBAAkB,SAAS,CAAC,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC;IAE3D,2EAA2E;IAC3E,4EAA4E;IAC5E,2CAA2C;IAC3C,uEAAuE;IACvE,2EAA2E;IAC3E,uEAAuE;IACvE,yCAAyC;IACzC,IAAI,2BAA2B,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;QAC3C,MAAM,CAAC,OAAO,CACZ,oCAAoC,SAAS,CAAC,IAAI,CAChD,OAAO,CACR,8BAA8B,2BAA2B;aACvD,KAAK,CAAC,CAAC,CAAC;aACR,IAAI,CACH,OAAO,CACR,kFAAkF;YACnF,OAAO,gBAAM,CAAC,4BAA4B,wBAAwB,CACrE,CAAC;IACJ,CAAC;IAED,OAAO,SAAS,CAAC;AACnB,CAAC;AAEM,KAAK,UAAU,iBAAiB,CAAC,MAAc,EAAE,MAAc;IACpE,MAAM,MAAM,GAAG,6BAAa,CAAC,uBAAO,CAAC,yBAAyB,CAAC,CAAC,MAAM,CAAC;IACvE,MAAM,WAAW,GAAG,4CAA4C,CAAC;IACjE,MAAM,aAAa,GAAG,MAAM,IAAA,6BAAgB,GAAE,CAAC;IAC/C,MAAM,aAAa,GAAG,IAAA,+BAAkB,EACtC,IAAA,0BAAmB,EAAC,mBAAmB,CAAC,CACzC,CAAC;IACF,MAAM,QAAQ,GAAG,IAAI,wBAAQ,CAC3B,aAAa,EACb,aAAa,EACb,IAAA,oCAAqB,GAAE,EACvB,MAAM,CACP,CAAC;IACF,IAAI,MAAM,QAAQ,CAAC,QAAQ,CAAC,uBAAO,CAAC,yBAAyB,EAAE,MAAM,CAAC,EAAE,CAAC;QACvE,yEAAyE;QACzE,IACE,OAAO,CAAC,GAAG,CAAC,oBAAoB,CAAC,KAAK,aAAa;YACnD,OAAO,CAAC,GAAG,CAAC,MAAM,CAAC,KAAK,MAAM,EAC9B,CAAC;YACD,MAAM,CAAC,IAAI,CACT,aAAa,WAAW,sCACtB,IAAA,mCAAoB,GAAE,KAAK,SAAS;gBAClC,CAAC,CAAC,8BAA8B,MAAM,yDAAyD,gBAAM,CAAC,oBAAoB,wBAAwB;gBAClJ,CAAC,CAAC,EACN,EAAE,CACH,CAAC;YACF,IAAI,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;QACvC,CAAC;aAAM,CAAC;YACN,MAAM,CAAC,IAAI,CACT,YAAY,WAAW,yCAAyC,MAAM,yCAAyC,gBAAM,CAAC,oBAAoB,wBAAwB,CACnK,CAAC;YACF,IAAI,CAAC,cAAc,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;QACtC,CAAC;IACH,CAAC;SAAM,CAAC;QACN,MAAM,CAAC,IAAI,CAAC,aAAa,WAAW,GAAG,CAAC,CAAC;QACzC,IAAI,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;IACvC,CAAC;AACH,CAAC;AAEM,KAAK,UAAU,YAAY,CAChC,MAA0B,EAC1B,QAAkB,EAClB,MAAc;IAEd,MAAM,CAAC,UAAU,CAAC,qCAAqC,QAAQ,OAAO,CAAC,CAAC;IACxE,MAAM,MAAM,GAAG,MAAM,IAAA,kBAAS,EAAC,MAAM,CAAC,SAAS,CAAC,CAAC;IACjD,IAAI,QAAQ,KAAK,oBAAQ,CAAC,GAAG,EAAE,CAAC;QAC9B,MAAM,iBAAiB,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IAC1C,CAAC;IACD,IACE,MAAM,CAAC,SAAS;QAChB,CAAC,MAAM,MAAM,CAAC,eAAe,CAAC,6BAAY,CAAC,wBAAwB,CAAC,CAAC,EACrE,CAAC;QACD,MAAM,MAAM,CAAC,qBAAqB,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;IACvD,CAAC;SAAM,CAAC;QACN,MAAM,MAAM,CAAC,YAAY,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;IAC9C,CAAC;IACD,IAAI,QAAQ,KAAK,oBAAQ,CAAC,EAAE,EAAE,CAAC;QAC7B,IAAI,CAAC,cAAc,C
AAC,oBAAM,CAAC,oBAAoB,EAAE,MAAM,CAAC,CAAC;IAC3D,CAAC;IACD,MAAM,CAAC,QAAQ,EAAE,CAAC;AACpB,CAAC"}
{"version":3,"file":"autobuild.js","sourceRoot":"","sources":["../src/autobuild.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAeA,kEAkGC;AAED,8CAmCC;AAED,oCAsBC;AA9KD,oDAAsC;AAEtC,iDAA6E;AAC7E,6CAAgD;AAChD,qCAA6C;AAE7C,uCAAmC;AACnC,+CAAuC;AACvC,mDAAmE;AACnE,2CAAyD;AAEzD,6CAAgD;AAChD,qDAAgD;AAChD,iCAAmC;AAE5B,KAAK,UAAU,2BAA2B,CAC/C,MAAc,EACd,MAA0B,EAC1B,MAAc;IAEd,IACE,CAAC,MAAM,CAAC,SAAS,KAAK,gBAAS,CAAC,IAAI;QAClC,CAAC,MAAM,MAAM,CAAC,eAAe,CAAC,6BAAY,CAAC,wBAAwB,CAAC,CAAC,CAAC;QACxE,MAAM,CAAC,SAAS,KAAK,gBAAS,CAAC,MAAM,EACrC,CAAC;QACD,MAAM,CAAC,IAAI,CACT,qBAAqB,MAAM,CAAC,SAAS,2BAA2B;YAC9D,OAAO,gBAAM,CAAC,kBAAkB,wBAAwB,CAC3D,CAAC;QACF,OAAO,SAAS,CAAC;IACnB,CAAC;IAED,0CAA0C;IAC1C,mFAAmF;IACnF,oFAAoF;IACpF,4EAA4E;IAC5E,MAAM,kBAAkB,GAAG,MAAM,CAAC,SAAS,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,EAAE,CACvD,IAAA,4BAAgB,EAAC,CAAC,CAAC,CACpB,CAAC;IAEF,IAAI,CAAC,kBAAkB,EAAE,CAAC;QACxB,MAAM,CAAC,IAAI,CACT,iEAAiE,CAClE,CAAC;QACF,OAAO,SAAS,CAAC;IACnB,CAAC;IAED;;;;;;;;;;;;;;;;;;;;;;;;;;OA0BG;IACH,MAAM,2BAA2B,GAAG,kBAAkB,CAAC,MAAM,CAC3D,CAAC,CAAC,EAAE,EAAE,CAAC,CAAC,KAAK,oBAAQ,CAAC,EAAE,CACzB,CAAC;IAEF,MAAM,SAAS,GAAe,EAAE,CAAC;IACjC,yEAAyE;IACzE,UAAU;IACV,IAAI,2BAA2B,CAAC,CAAC,CAAC,KAAK,SAAS,EAAE,CAAC;QACjD,SAAS,CAAC,IAAI,CAAC,2BAA2B,CAAC,CAAC,CAAC,CAAC,CAAC;IACjD,CAAC;IACD,uEAAuE;IACvE,wCAAwC;IACxC,IAAI,kBAAkB,CAAC,MAAM,KAAK,2BAA2B,CAAC,MAAM,EAAE,CAAC;QACrE,SAAS,CAAC,IAAI,CAAC,oBAAQ,CAAC,EAAE,CAAC,CAAC;IAC9B,CAAC;IAED,MAAM,CAAC,KAAK,CAAC,kBAAkB,SAAS,CAAC,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC;IAE3D,2EAA2E;IAC3E,4EAA4E;IAC5E,2CAA2C;IAC3C,uEAAuE;IACvE,2EAA2E;IAC3E,uEAAuE;IACvE,yCAAyC;IACzC,IAAI,2BAA2B,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;QAC3C,MAAM,CAAC,OAAO,CACZ,oCAAoC,SAAS,CAAC,IAAI,CAChD,OAAO,CACR,8BAA8B,2BAA2B;aACvD,KAAK,CAAC,CAAC,CAAC;aACR,IAAI,CACH,OAAO,CACR,kFAAkF;YACnF,OAAO,gBAAM,CAAC,4BAA4B,wBAAwB,CACrE,CAAC;IACJ,CAAC;IAED,OAAO,SAAS,CAAC;AACnB,CAAC;AAEM,KAAK,UAAU,iBAAiB,CAAC,MAAc,EAAE,MAAc;IACpE,MAAM,MAAM,GAAG,6BAAa,CAAC,uBAAO,CAAC,yBAAyB,CAAC,CAAC,MAAM,CAAC;IACvE,MAAM,WAAW,GAAG,4CAA4C,CAAC;IACjE,MAAM,aAAa,GAAG,MAAM,IAAA,6BAAgB,GAAE,CAAC;IAC/C,MAAM,aAAa,GAAG,IAAA,6BAAgB,GAAE,CAAC;IACzC,MAAM,QAAQ,GAAG,IAAI,wBAAQ,CAC3B,aAAa,EACb,aAAa,EACb,IAAA,oCAAqB,GAAE,EACvB,MAAM,CACP,CAAC;IACF,IAAI,MAAM,QAAQ,CAAC,QAAQ,CAAC,uBAAO,CAAC,yBAAyB,EAAE,MAAM,CAAC,EAAE,CAAC;QACvE,yEAAyE;QACzE,IACE,OAAO,CAAC,GAAG,CAAC,oBAAoB,CAAC,KAAK,aAAa;YACnD,OAAO,CAAC,GAAG,CAAC,MAAM,CAAC,KAAK,MAAM,EAC9B,CAAC;YACD,MAAM,CAAC,IAAI,CACT,aAAa,WAAW,sCACtB,IAAA,mCAAoB,GAAE,KAAK,SAAS;gBAClC,CAAC,CAAC,8BAA8B,MAAM,yDAAyD,gBAAM,CAAC,oBAAoB,wBAAwB;gBAClJ,CAAC,CAAC,EACN,EAAE,CACH,CAAC;YACF,IAAI,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;QACvC,CAAC;aAAM,CAAC;YACN,MAAM,CAAC,IAAI,CACT,YAAY,WAAW,yCAAyC,MAAM,yCAAyC,gBAAM,CAAC,oBAAoB,wBAAwB,CACnK,CAAC;YACF,IAAI,CAAC,cAAc,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;QACtC,CAAC;IACH,CAAC;SAAM,CAAC;QACN,MAAM,CAAC,IAAI,CAAC,aAAa,WAAW,GAAG,CAAC,CAAC;QACzC,IAAI,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;IACvC,CAAC;AACH,CAAC;AAEM,KAAK,UAAU,YAAY,CAChC,MAA0B,EAC1B,QAAkB,EAClB,MAAc;IAEd,MAAM,CAAC,UAAU,CAAC,qCAAqC,QAAQ,OAAO,CAAC,CAAC;IACxE,MAAM,MAAM,GAAG,MAAM,IAAA,kBAAS,EAAC,MAAM,CAAC,SAAS,CAAC,CAAC;IACjD,IAAI,QAAQ,KAAK,oBAAQ,CAAC,GAAG,EAAE,CAAC;QAC9B,MAAM,iBAAiB,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IAC1C,CAAC;IACD,IACE,MAAM,CAAC,SAAS;QAChB,CAAC,MAAM,MAAM,CAAC,eAAe,CAAC,6BAAY,CAAC,wBAAwB,CAAC,CAAC,EACrE,CAAC;QACD,MAAM,MAAM,CAAC,qBAAqB,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;IACvD,CAAC;SAAM,CAAC;QACN,MAAM,MAAM,CAAC,YAAY,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;IAC9C,CAAC;IACD,IAAI,QAAQ,KAAK,oBAAQ,CAAC,EAAE,EAAE,CAAC;QAC7B,IAAI,CAAC,cAAc,CAAC,oBAAM,CAAC,oBAAoB,EAAE,MAAM,CAA
C,CAAC;IAC3D,CAAC;IACD,MAAM,CAAC,QAAQ,EAAE,CAAC;AACpB,CAAC"}

lib/codeql.js (generated)

@@ -55,6 +55,7 @@ const environment_1 = require("./environment");
const feature_flags_1 = require("./feature-flags");
const git_utils_1 = require("./git-utils");
const languages_1 = require("./languages");
const overlay_database_utils_1 = require("./overlay-database-utils");
const setupCodeql = __importStar(require("./setup-codeql"));
const tools_features_1 = require("./tools-features");
const tracer_config_1 = require("./tracer-config");
@@ -77,15 +78,15 @@ const CODEQL_MINIMUM_VERSION = "2.15.5";
/**
* This version will shortly become the oldest version of CodeQL that the Action will run with.
*/
const CODEQL_NEXT_MINIMUM_VERSION = "2.15.5";
const CODEQL_NEXT_MINIMUM_VERSION = "2.16.6";
/**
* This is the version of GHES that was most recently deprecated.
*/
const GHES_VERSION_MOST_RECENTLY_DEPRECATED = "3.11";
const GHES_VERSION_MOST_RECENTLY_DEPRECATED = "3.12";
/**
* This is the deprecation date for the version of GHES that was most recently deprecated.
*/
const GHES_MOST_RECENT_DEPRECATION_DATE = "2024-12-19";
const GHES_MOST_RECENT_DEPRECATION_DATE = "2025-04-03";
/** The CLI verbosity level to use for extraction in debug mode. */
const EXTRACTION_DEBUG_MODE_VERBOSITY = "progress++";
/*
@@ -254,7 +255,7 @@ async function getCodeQLForCmd(cmd, checkVersion) {
async supportsFeature(feature) {
return (0, tools_features_1.isSupportedToolsFeature)(await this.getVersion(), feature);
},
async databaseInitCluster(config, sourceRoot, processName, qlconfigFile, logger) {
async databaseInitCluster(config, sourceRoot, processName, qlconfigFile, overlayDatabaseMode, logger) {
const extraArgs = config.languages.map((language) => `--language=${language}`);
if (await (0, tracer_config_1.shouldEnableIndirectTracing)(codeql, config)) {
extraArgs.push("--begin-tracing");
@@ -290,10 +291,19 @@ async function getCodeQLForCmd(cmd, checkVersion) {
const overwriteFlag = (0, tools_features_1.isSupportedToolsFeature)(await this.getVersion(), tools_features_1.ToolsFeature.ForceOverwrite)
? "--force-overwrite"
: "--overwrite";
if (overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.Overlay) {
const overlayChangesFile = await (0, overlay_database_utils_1.writeOverlayChangesFile)(config, sourceRoot, logger);
extraArgs.push(`--overlay-changes=${overlayChangesFile}`);
}
else if (overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.OverlayBase) {
extraArgs.push("--overlay-base");
}
await runCli(cmd, [
"database",
"init",
overwriteFlag,
...(overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.Overlay
? []
: [overwriteFlag]),
"--db-cluster",
config.dbLocation,
`--source-root=${sourceRoot}`,
@@ -305,6 +315,9 @@ async function getCodeQLForCmd(cmd, checkVersion) {
ignoringOptions: ["--overwrite"],
}),
], { stdin: externalRepositoryToken });
if (overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.OverlayBase) {
await (0, overlay_database_utils_1.writeBaseDatabaseOidsFile)(config, sourceRoot);
}
},
async runAutobuild(config, language) {
applyAutobuildAzurePipelinesTimeoutFix();
@@ -458,7 +471,6 @@ async function getCodeQLForCmd(cmd, checkVersion) {
"run-queries",
...flags,
databasePath,
"--expect-discarded-cache",
"--intra-layer-parallelism",
"--min-disk-free=1024", // Try to leave at least 1GB free
"-v",
@@ -797,6 +809,13 @@ async function generateCodeScanningConfig(config, logger) {
if (Array.isArray(augmentedConfig.packs) && !augmentedConfig.packs.length) {
delete augmentedConfig.packs;
}
augmentedConfig["query-filters"] = [
...(config.augmentationProperties.defaultQueryFilters || []),
...(augmentedConfig["query-filters"] || []),
];
if (augmentedConfig["query-filters"]?.length === 0) {
delete augmentedConfig["query-filters"];
}
logger.info(`Writing augmented user configuration file to ${codeScanningConfigFile}`);
logger.startGroup("Augmented user configuration file contents");
logger.info(yaml.dump(augmentedConfig));

File diff suppressed because one or more lines are too long

lib/codeql.test.js (generated)

@@ -53,6 +53,7 @@ const defaults = __importStar(require("./defaults.json"));
const doc_url_1 = require("./doc-url");
const languages_1 = require("./languages");
const logging_1 = require("./logging");
const overlay_database_utils_1 = require("./overlay-database-utils");
const setup_codeql_1 = require("./setup-codeql");
const testing_utils_1 = require("./testing-utils");
const tools_features_1 = require("./tools-features");
@@ -335,7 +336,7 @@ const injectedConfigMacro = ava_1.default.macro({
tempDir,
augmentationProperties,
};
await codeqlObject.databaseInitCluster(thisStubConfig, "", undefined, undefined, (0, logging_1.getRunnerLogger)(true));
await codeqlObject.databaseInitCluster(thisStubConfig, "", undefined, undefined, overlay_database_utils_1.OverlayDatabaseMode.None, (0, logging_1.getRunnerLogger)(true));
const args = runnerConstructorStub.firstCall.args[1];
// should have used a config file
const configArg = args.find((arg) => arg.startsWith("--codescanning-config="));
@@ -471,7 +472,7 @@ const injectedConfigMacro = ava_1.default.macro({
const runnerConstructorStub = stubToolRunnerConstructor();
const codeqlObject = await codeql.getCodeQLForTesting();
sinon.stub(codeqlObject, "getVersion").resolves((0, testing_utils_1.makeVersionInfo)("2.17.6"));
await codeqlObject.databaseInitCluster({ ...stubConfig, tempDir }, "", undefined, "/path/to/qlconfig.yml", (0, logging_1.getRunnerLogger)(true));
await codeqlObject.databaseInitCluster({ ...stubConfig, tempDir }, "", undefined, "/path/to/qlconfig.yml", overlay_database_utils_1.OverlayDatabaseMode.None, (0, logging_1.getRunnerLogger)(true));
const args = runnerConstructorStub.firstCall.args[1];
// should have used a config file
const hasCodeScanningConfigArg = args.some((arg) => arg.startsWith("--codescanning-config="));
@@ -487,7 +488,7 @@ const injectedConfigMacro = ava_1.default.macro({
const codeqlObject = await codeql.getCodeQLForTesting();
sinon.stub(codeqlObject, "getVersion").resolves((0, testing_utils_1.makeVersionInfo)("2.17.6"));
await codeqlObject.databaseInitCluster({ ...stubConfig, tempDir }, "", undefined, undefined, // undefined qlconfigFile
(0, logging_1.getRunnerLogger)(true));
overlay_database_utils_1.OverlayDatabaseMode.None, (0, logging_1.getRunnerLogger)(true));
const args = runnerConstructorStub.firstCall.args[1];
const hasQlconfigArg = args.some((arg) => arg.startsWith("--qlconfig-file="));
t.false(hasQlconfigArg, "should NOT have injected a qlconfig");
@@ -612,7 +613,7 @@ for (const { codeqlVersion, flagPassed, githubVersion, negativeFlagPassed, } of
sinon.stub(io, "which").resolves("");
await t.throwsAsync(async () => await codeqlObject.databaseRunQueries(stubConfig.dbLocation, []), {
instanceOf: cli_errors_1.CliError,
message: `Encountered a fatal error while running "codeql-for-testing database run-queries --expect-discarded-cache --intra-layer-parallelism --min-disk-free=1024 -v". Exit code was 1 and error was: Oops! A fatal internal error occurred. Details:
message: `Encountered a fatal error while running "codeql-for-testing database run-queries --intra-layer-parallelism --min-disk-free=1024 -v". Exit code was 1 and error was: Oops! A fatal internal error occurred. Details:
com.semmle.util.exception.CatastrophicError: An error occurred while evaluating ControlFlowGraph::ControlFlow::Root.isRootOf/1#dispred#f610e6ed/2@86282cc8
Severe disk cache trouble (corruption or out of space) at /home/runner/work/_temp/codeql_databases/go/db-go/default/cache/pages/28/33.pack: Failed to write item to disk. See the logs for more details.`,
});
@@ -638,7 +639,7 @@ for (const { codeqlVersion, flagPassed, githubVersion, negativeFlagPassed, } of
sinon.stub(io, "which").resolves("");
process.env["CODEQL_ACTION_EXTRA_OPTIONS"] =
'{ "database": { "init": ["--overwrite"] } }';
await codeqlObject.databaseInitCluster(stubConfig, "sourceRoot", undefined, undefined, (0, logging_1.getRunnerLogger)(false));
await codeqlObject.databaseInitCluster(stubConfig, "sourceRoot", undefined, undefined, overlay_database_utils_1.OverlayDatabaseMode.None, (0, logging_1.getRunnerLogger)(false));
t.true(runnerConstructorStub.calledOnce);
const args = runnerConstructorStub.firstCall.args[1];
t.is(args.filter((option) => option === "--overwrite").length, 1, "--overwrite should only be passed once");

File diff suppressed because one or more lines are too long

lib/config-utils.js (generated)

@@ -64,6 +64,7 @@ const yaml = __importStar(require("js-yaml"));
const semver = __importStar(require("semver"));
const api = __importStar(require("./api-client"));
const caching_utils_1 = require("./caching-utils");
const diff_informed_analysis_utils_1 = require("./diff-informed-analysis-utils");
const feature_flags_1 = require("./feature-flags");
const languages_1 = require("./languages");
const trap_caching_1 = require("./trap-caching");
@@ -79,6 +80,7 @@ exports.defaultAugmentationProperties = {
packsInputCombines: false,
packsInput: undefined,
queriesInput: undefined,
defaultQueryFilters: [],
};
function getPacksStrInvalid(packStr, configFile) {
return configFile
@@ -227,7 +229,7 @@ async function getRawLanguages(languagesInput, repository, logger) {
async function getDefaultConfig({ languagesInput, queriesInput, packsInput, buildModeInput, dbLocation, trapCachingEnabled, dependencyCachingEnabled, debugMode, debugArtifactName, debugDatabaseName, repository, tempDir, codeql, githubVersion, features, logger, }) {
const languages = await getLanguages(codeql, languagesInput, repository, logger);
const buildMode = await parseBuildModeInput(buildModeInput, languages, features, logger);
const augmentationProperties = calculateAugmentation(packsInput, queriesInput, languages);
const augmentationProperties = await calculateAugmentation(codeql, features, packsInput, queriesInput, languages, logger);
const { trapCaches, trapCacheDownloadTime } = await downloadCacheWithTime(trapCachingEnabled, codeql, languages, logger);
return {
languages,
@@ -277,7 +279,7 @@ async function loadConfig({ languagesInput, queriesInput, packsInput, buildModeI
}
const languages = await getLanguages(codeql, languagesInput, repository, logger);
const buildMode = await parseBuildModeInput(buildModeInput, languages, features, logger);
const augmentationProperties = calculateAugmentation(packsInput, queriesInput, languages);
const augmentationProperties = await calculateAugmentation(codeql, features, packsInput, queriesInput, languages, logger);
const { trapCaches, trapCacheDownloadTime } = await downloadCacheWithTime(trapCachingEnabled, codeql, languages, logger);
return {
languages,
@@ -303,11 +305,14 @@ async function loadConfig({ languagesInput, queriesInput, packsInput, buildModeI
* and the CLI does not know about these inputs so we need to inject them into
* the config file sent to the CLI.
*
* @param codeql The CodeQL object.
* @param features The feature enablement object.
* @param rawPacksInput The packs input from the action configuration.
* @param rawQueriesInput The queries input from the action configuration.
* @param languages The languages that the config file is for. If the packs input
* is non-empty, then there must be exactly one language. Otherwise, an
* error is thrown.
* @param logger The logger to use for logging.
*
* @returns The properties that need to be augmented in the config file.
*
@@ -315,16 +320,21 @@ async function loadConfig({ languagesInput, queriesInput, packsInput, buildModeI
* not have exactly one language.
*/
// exported for testing.
function calculateAugmentation(rawPacksInput, rawQueriesInput, languages) {
async function calculateAugmentation(codeql, features, rawPacksInput, rawQueriesInput, languages, logger) {
const packsInputCombines = shouldCombine(rawPacksInput);
const packsInput = parsePacksFromInput(rawPacksInput, languages, packsInputCombines);
const queriesInputCombines = shouldCombine(rawQueriesInput);
const queriesInput = parseQueriesFromInput(rawQueriesInput, queriesInputCombines);
const defaultQueryFilters = [];
if (await (0, diff_informed_analysis_utils_1.shouldPerformDiffInformedAnalysis)(codeql, features, logger)) {
defaultQueryFilters.push({ exclude: { tags: "exclude-from-incremental" } });
}
return {
packsInputCombines,
packsInput: packsInput?.[languages[0]],
queriesInput,
queriesInputCombines,
defaultQueryFilters,
};
}
function parseQueriesFromInput(rawQueriesInput, queriesInputCombines) {

File diff suppressed because one or more lines are too long


@@ -624,7 +624,7 @@ const packSpecPrettyPrintingMacro = ava_1.default.macro({
const mockLogger = (0, logging_1.getRunnerLogger)(true);
const calculateAugmentationMacro = ava_1.default.macro({
exec: async (t, _title, rawPacksInput, rawQueriesInput, languages, expectedAugmentationProperties) => {
const actualAugmentationProperties = configUtils.calculateAugmentation(rawPacksInput, rawQueriesInput, languages);
const actualAugmentationProperties = await configUtils.calculateAugmentation((0, codeql_1.getCachedCodeQL)(), (0, testing_utils_1.createFeatures)([]), rawPacksInput, rawQueriesInput, languages, mockLogger);
t.deepEqual(actualAugmentationProperties, expectedAugmentationProperties);
},
title: (_, title) => `Calculate Augmentation: ${title}`,
@@ -634,34 +634,39 @@ const calculateAugmentationMacro = ava_1.default.macro({
queriesInput: undefined,
packsInputCombines: false,
packsInput: undefined,
defaultQueryFilters: [],
});
(0, ava_1.default)(calculateAugmentationMacro, "With queries", undefined, " a, b , c, d", [languages_1.Language.javascript], {
queriesInputCombines: false,
queriesInput: [{ uses: "a" }, { uses: "b" }, { uses: "c" }, { uses: "d" }],
packsInputCombines: false,
packsInput: undefined,
defaultQueryFilters: [],
});
(0, ava_1.default)(calculateAugmentationMacro, "With queries combining", undefined, " + a, b , c, d ", [languages_1.Language.javascript], {
queriesInputCombines: true,
queriesInput: [{ uses: "a" }, { uses: "b" }, { uses: "c" }, { uses: "d" }],
packsInputCombines: false,
packsInput: undefined,
defaultQueryFilters: [],
});
(0, ava_1.default)(calculateAugmentationMacro, "With packs", " codeql/a , codeql/b , codeql/c , codeql/d ", undefined, [languages_1.Language.javascript], {
queriesInputCombines: false,
queriesInput: undefined,
packsInputCombines: false,
packsInput: ["codeql/a", "codeql/b", "codeql/c", "codeql/d"],
defaultQueryFilters: [],
});
(0, ava_1.default)(calculateAugmentationMacro, "With packs combining", " + codeql/a, codeql/b, codeql/c, codeql/d", undefined, [languages_1.Language.javascript], {
queriesInputCombines: false,
queriesInput: undefined,
packsInputCombines: true,
packsInput: ["codeql/a", "codeql/b", "codeql/c", "codeql/d"],
defaultQueryFilters: [],
});
const calculateAugmentationErrorMacro = ava_1.default.macro({
exec: async (t, _title, rawPacksInput, rawQueriesInput, languages, expectedError) => {
t.throws(() => configUtils.calculateAugmentation(rawPacksInput, rawQueriesInput, languages), { message: expectedError });
await t.throwsAsync(() => configUtils.calculateAugmentation((0, codeql_1.getCachedCodeQL)(), (0, testing_utils_1.createFeatures)([]), rawPacksInput, rawQueriesInput, languages, mockLogger), { message: expectedError });
},
title: (_, title) => `Calculate Augmentation Error: ${title}`,
});

File diff suppressed because one or more lines are too long

lib/debug-artifacts.js (generated)

@@ -46,7 +46,7 @@ const path = __importStar(require("path"));
const artifact = __importStar(require("@actions/artifact"));
const artifactLegacy = __importStar(require("@actions/artifact-legacy"));
const core = __importStar(require("@actions/core"));
const adm_zip_1 = __importDefault(require("adm-zip"));
const archiver_1 = __importDefault(require("archiver"));
const del_1 = __importDefault(require("del"));
const actions_util_1 = require("./actions-util");
const analyze_1 = require("./analyze");
@@ -250,9 +250,20 @@ async function createPartialDatabaseBundle(config, language) {
if (fs.existsSync(databaseBundlePath)) {
await (0, del_1.default)(databaseBundlePath, { force: true });
}
const zip = new adm_zip_1.default();
zip.addLocalFolder(databasePath);
zip.writeZip(databaseBundlePath);
const output = fs.createWriteStream(databaseBundlePath);
const zip = (0, archiver_1.default)("zip");
zip.on("error", (err) => {
throw err;
});
zip.on("warning", (err) => {
// Ignore ENOENT warnings. There's nothing anyone can do about it.
if (err.code !== "ENOENT") {
throw err;
}
});
zip.pipe(output);
zip.directory(databasePath, false);
await zip.finalize();
return databaseBundlePath;
}
/**

File diff suppressed because one or more lines are too long


@@ -1,6 +1,6 @@
{
"bundleVersion": "codeql-bundle-v2.20.5",
"cliVersion": "2.20.5",
"priorBundleVersion": "codeql-bundle-v2.20.4",
"priorCliVersion": "2.20.4"
"bundleVersion": "codeql-bundle-v2.21.3",
"cliVersion": "2.21.3",
"priorBundleVersion": "codeql-bundle-v2.21.2",
"priorCliVersion": "2.21.2"
}


@@ -33,54 +33,68 @@ var __importStar = (this && this.__importStar) || (function () {
};
})();
Object.defineProperty(exports, "__esModule", { value: true });
exports.getJavaTempDependencyDir = getJavaTempDependencyDir;
exports.downloadDependencyCaches = downloadDependencyCaches;
exports.uploadDependencyCaches = uploadDependencyCaches;
const os = __importStar(require("os"));
const path_1 = require("path");
const actionsCache = __importStar(require("@actions/cache"));
const glob = __importStar(require("@actions/glob"));
const actions_util_1 = require("./actions-util");
const caching_utils_1 = require("./caching-utils");
const environment_1 = require("./environment");
const util_1 = require("./util");
const CODEQL_DEPENDENCY_CACHE_PREFIX = "codeql-dependencies";
const CODEQL_DEPENDENCY_CACHE_VERSION = 1;
/**
* Returns a path to a directory intended to be used to store .jar files
* for the Java `build-mode: none` extractor.
* @returns The path to the directory that should be used by the `build-mode: none` extractor.
*/
function getJavaTempDependencyDir() {
return (0, path_1.join)((0, actions_util_1.getTemporaryDirectory)(), "codeql_java", "repository");
}
/**
* Default caching configurations per language.
*/
const CODEQL_DEFAULT_CACHE_CONFIG = {
java: {
paths: [
// Maven
(0, path_1.join)(os.homedir(), ".m2", "repository"),
// Gradle
(0, path_1.join)(os.homedir(), ".gradle", "caches"),
],
hash: [
// Maven
"**/pom.xml",
// Gradle
"**/*.gradle*",
"**/gradle-wrapper.properties",
"buildSrc/**/Versions.kt",
"buildSrc/**/Dependencies.kt",
"gradle/*.versions.toml",
"**/versions.properties",
],
},
csharp: {
paths: [(0, path_1.join)(os.homedir(), ".nuget", "packages")],
hash: [
// NuGet
"**/packages.lock.json",
// Paket
"**/paket.lock",
],
},
go: {
paths: [(0, path_1.join)(os.homedir(), "go", "pkg", "mod")],
hash: ["**/go.sum"],
},
};
function getDefaultCacheConfig() {
return {
java: {
paths: [
// Maven
(0, path_1.join)(os.homedir(), ".m2", "repository"),
// Gradle
(0, path_1.join)(os.homedir(), ".gradle", "caches"),
// CodeQL Java build-mode: none
getJavaTempDependencyDir(),
],
hash: [
// Maven
"**/pom.xml",
// Gradle
"**/*.gradle*",
"**/gradle-wrapper.properties",
"buildSrc/**/Versions.kt",
"buildSrc/**/Dependencies.kt",
"gradle/*.versions.toml",
"**/versions.properties",
],
},
csharp: {
paths: [(0, path_1.join)(os.homedir(), ".nuget", "packages")],
hash: [
// NuGet
"**/packages.lock.json",
// Paket
"**/paket.lock",
],
},
go: {
paths: [(0, path_1.join)(os.homedir(), "go", "pkg", "mod")],
hash: ["**/go.sum"],
},
};
}
async function makeGlobber(patterns) {
return glob.create(patterns.join("\n"));
}
@@ -94,7 +108,7 @@ async function makeGlobber(patterns) {
async function downloadDependencyCaches(languages, logger) {
const restoredCaches = [];
for (const language of languages) {
const cacheConfig = CODEQL_DEFAULT_CACHE_CONFIG[language];
const cacheConfig = getDefaultCacheConfig()[language];
if (cacheConfig === undefined) {
logger.info(`Skipping download of dependency cache for ${language} as we have no caching configuration for it.`);
continue;
@@ -128,7 +142,7 @@ async function downloadDependencyCaches(languages, logger) {
*/
async function uploadDependencyCaches(config, logger) {
for (const language of config.languages) {
const cacheConfig = CODEQL_DEFAULT_CACHE_CONFIG[language];
const cacheConfig = getDefaultCacheConfig()[language];
if (cacheConfig === undefined) {
logger.info(`Skipping upload of dependency cache for ${language} as we have no caching configuration for it.`);
continue;


@@ -1 +1 @@
{"version":3,"file":"dependency-caching.js","sourceRoot":"","sources":["../src/dependency-caching.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AA+EA,4DAmDC;AAQD,wDAiEC;AA3MD,uCAAyB;AACzB,+BAA4B;AAE5B,6DAA+C;AAC/C,oDAAsC;AAEtC,mDAAoD;AAEpD,+CAAuC;AAGvC,iCAA6C;AAgB7C,MAAM,8BAA8B,GAAG,qBAAqB,CAAC;AAC7D,MAAM,+BAA+B,GAAG,CAAC,CAAC;AAE1C;;GAEG;AACH,MAAM,2BAA2B,GAAwC;IACvE,IAAI,EAAE;QACJ,KAAK,EAAE;YACL,QAAQ;YACR,IAAA,WAAI,EAAC,EAAE,CAAC,OAAO,EAAE,EAAE,KAAK,EAAE,YAAY,CAAC;YACvC,SAAS;YACT,IAAA,WAAI,EAAC,EAAE,CAAC,OAAO,EAAE,EAAE,SAAS,EAAE,QAAQ,CAAC;SACxC;QACD,IAAI,EAAE;YACJ,QAAQ;YACR,YAAY;YACZ,SAAS;YACT,cAAc;YACd,8BAA8B;YAC9B,yBAAyB;YACzB,6BAA6B;YAC7B,wBAAwB;YACxB,wBAAwB;SACzB;KACF;IACD,MAAM,EAAE;QACN,KAAK,EAAE,CAAC,IAAA,WAAI,EAAC,EAAE,CAAC,OAAO,EAAE,EAAE,QAAQ,EAAE,UAAU,CAAC,CAAC;QACjD,IAAI,EAAE;YACJ,QAAQ;YACR,uBAAuB;YACvB,QAAQ;YACR,eAAe;SAChB;KACF;IACD,EAAE,EAAE;QACF,KAAK,EAAE,CAAC,IAAA,WAAI,EAAC,EAAE,CAAC,OAAO,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE,KAAK,CAAC,CAAC;QAC/C,IAAI,EAAE,CAAC,WAAW,CAAC;KACpB;CACF,CAAC;AAEF,KAAK,UAAU,WAAW,CAAC,QAAkB;IAC3C,OAAO,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC;AAC1C,CAAC;AAED;;;;;;GAMG;AACI,KAAK,UAAU,wBAAwB,CAC5C,SAAqB,EACrB,MAAc;IAEd,MAAM,cAAc,GAAe,EAAE,CAAC;IAEtC,KAAK,MAAM,QAAQ,IAAI,SAAS,EAAE,CAAC;QACjC,MAAM,WAAW,GAAG,2BAA2B,CAAC,QAAQ,CAAC,CAAC;QAE1D,IAAI,WAAW,KAAK,SAAS,EAAE,CAAC;YAC9B,MAAM,CAAC,IAAI,CACT,6CAA6C,QAAQ,8CAA8C,CACpG,CAAC;YACF,SAAS;QACX,CAAC;QAED,gGAAgG;QAChG,wBAAwB;QACxB,MAAM,OAAO,GAAG,MAAM,WAAW,CAAC,WAAW,CAAC,IAAI,CAAC,CAAC;QAEpD,IAAI,CAAC,MAAM,OAAO,CAAC,IAAI,EAAE,CAAC,CAAC,MAAM,KAAK,CAAC,EAAE,CAAC;YACxC,MAAM,CAAC,IAAI,CACT,6CAA6C,QAAQ,mDAAmD,CACzG,CAAC;YACF,SAAS;QACX,CAAC;QAED,MAAM,UAAU,GAAG,MAAM,QAAQ,CAAC,QAAQ,EAAE,WAAW,CAAC,CAAC;QACzD,MAAM,WAAW,GAAa,CAAC,MAAM,WAAW,CAAC,QAAQ,CAAC,CAAC,CAAC;QAE5D,MAAM,CAAC,IAAI,CACT,yBAAyB,QAAQ,aAAa,UAAU,qBAAqB,WAAW,CAAC,IAAI,CAC3F,IAAI,CACL,EAAE,CACJ,CAAC;QAEF,MAAM,MAAM,GAAG,MAAM,YAAY,CAAC,YAAY,CAC5C,WAAW,CAAC,KAAK,EACjB,UAAU,EACV,WAAW,CACZ,CAAC;QAEF,IAAI,MAAM,KAAK,SAAS,EAAE,CAAC;YACzB,MAAM,CAAC,IAAI,CAAC,oBAAoB,MAAM,QAAQ,QAAQ,GAAG,CAAC,CAAC;YAC3D,cAAc,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;QAChC,CAAC;aAAM,CAAC;YACN,MAAM,CAAC,IAAI,CAAC,+BAA+B,QAAQ,GAAG,CAAC,CAAC;QAC1D,CAAC;IACH,CAAC;IAED,OAAO,cAAc,CAAC;AACxB,CAAC;AAED;;;;;GAKG;AACI,KAAK,UAAU,sBAAsB,CAAC,MAAc,EAAE,MAAc;IACzE,KAAK,MAAM,QAAQ,IAAI,MAAM,CAAC,SAAS,EAAE,CAAC;QACxC,MAAM,WAAW,GAAG,2BAA2B,CAAC,QAAQ,CAAC,CAAC;QAE1D,IAAI,WAAW,KAAK,SAAS,EAAE,CAAC;YAC9B,MAAM,CAAC,IAAI,CACT,2CAA2C,QAAQ,8CAA8C,CAClG,CAAC;YACF,SAAS;QACX,CAAC;QAED,gGAAgG;QAChG,wBAAwB;QACxB,MAAM,OAAO,GAAG,MAAM,WAAW,CAAC,WAAW,CAAC,IAAI,CAAC,CAAC;QAEpD,IAAI,CAAC,MAAM,OAAO,CAAC,IAAI,EAAE,CAAC,CAAC,MAAM,KAAK,CAAC,EAAE,CAAC;YACxC,MAAM,CAAC,IAAI,CACT,2CAA2C,QAAQ,mDAAmD,CACvG,CAAC;YACF,SAAS;QACX,CAAC;QAED,yGAAyG;QACzG,uGAAuG;QACvG,uCAAuC;QACvC,uGAAuG;QACvG,uGAAuG;QACvG,sCAAsC;QACtC,uGAAuG;QACvG,sGAAsG;QACtG,sGAAsG;QACtG,4CAA4C;QAC5C,MAAM,IAAI,GAAG,MAAM,IAAA,iCAAiB,EAAC,WAAW,CAAC,KAAK,EAAE,MAAM,EAAE,IAAI,CAAC,CAAC;QAEtE,iCAAiC;QACjC,IAAI,IAAI,KAAK,CAAC,EAAE,CAAC;YACf,MAAM,CAAC,IAAI,CACT,2CAA2C,QAAQ,qBAAqB,CACzE,CAAC;YACF,SAAS;QACX,CAAC;QAED,MAAM,GAAG,GAAG,MAAM,QAAQ,CAAC,QAAQ,EAAE,WAAW,CAAC,CAAC;QAElD,MAAM,CAAC,IAAI,CACT,2BAA2B,IAAI,QAAQ,QAAQ,aAAa,GAAG,KAAK,CACrE,CAAC;QAEF,IAAI,CAAC;YACH,MAAM,YAAY,CAAC,SAAS,CAAC,WAAW,CAAC,KAAK,EAAE,GAAG,CAAC,CAAC;QACvD,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,yFAAyF;YACzF,uFAAuF;YACvF,gCAAgC;YAChC,IAAI,KAAK,YAAY,YAAY,CAAC,iBAAiB,EAAE,CAAC;gBACpD,MAAM,CAAC,IAAI,CACT,2BAA2B,QAAQ,aAAa,GAAG,qBAAqB,CACzE,CAAC;gBACF,MAAM,CAAC,KAAK,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;YAC9B,CAA
C;iBAAM,CAAC;gBACN,kCAAkC;gBAClC,MAAM,KAAK,CAAC;YACd,CAAC;QACH,CAAC;IACH,CAAC;AACH,CAAC;AAED;;;;;;GAMG;AACH,KAAK,UAAU,QAAQ,CACrB,QAAkB,EAClB,WAAwB;IAExB,MAAM,IAAI,GAAG,MAAM,IAAI,CAAC,SAAS,CAAC,WAAW,CAAC,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC;IAC/D,OAAO,GAAG,MAAM,WAAW,CAAC,QAAQ,CAAC,GAAG,IAAI,EAAE,CAAC;AACjD,CAAC;AAED;;;;;;GAMG;AACH,KAAK,UAAU,WAAW,CAAC,QAAkB;IAC3C,MAAM,QAAQ,GAAG,IAAA,0BAAmB,EAAC,WAAW,CAAC,CAAC;IAClD,MAAM,YAAY,GAAG,OAAO,CAAC,GAAG,CAAC,oBAAM,CAAC,yBAAyB,CAAC,CAAC;IACnE,IAAI,MAAM,GAAG,8BAA8B,CAAC;IAE5C,IAAI,YAAY,KAAK,SAAS,IAAI,YAAY,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;QAC1D,MAAM,GAAG,GAAG,MAAM,IAAI,YAAY,EAAE,CAAC;IACvC,CAAC;IAED,OAAO,GAAG,MAAM,IAAI,+BAA+B,IAAI,QAAQ,IAAI,QAAQ,GAAG,CAAC;AACjF,CAAC"}
{"version":3,"file":"dependency-caching.js","sourceRoot":"","sources":["../src/dependency-caching.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAoCA,4DAEC;AAuDD,4DAmDC;AAQD,wDAiEC;AAzND,uCAAyB;AACzB,+BAA4B;AAE5B,6DAA+C;AAC/C,oDAAsC;AAEtC,iDAAuD;AACvD,mDAAoD;AAEpD,+CAAuC;AAGvC,iCAA6C;AAgB7C,MAAM,8BAA8B,GAAG,qBAAqB,CAAC;AAC7D,MAAM,+BAA+B,GAAG,CAAC,CAAC;AAE1C;;;;GAIG;AACH,SAAgB,wBAAwB;IACtC,OAAO,IAAA,WAAI,EAAC,IAAA,oCAAqB,GAAE,EAAE,aAAa,EAAE,YAAY,CAAC,CAAC;AACpE,CAAC;AAED;;GAEG;AACH,SAAS,qBAAqB;IAC5B,OAAO;QACL,IAAI,EAAE;YACJ,KAAK,EAAE;gBACL,QAAQ;gBACR,IAAA,WAAI,EAAC,EAAE,CAAC,OAAO,EAAE,EAAE,KAAK,EAAE,YAAY,CAAC;gBACvC,SAAS;gBACT,IAAA,WAAI,EAAC,EAAE,CAAC,OAAO,EAAE,EAAE,SAAS,EAAE,QAAQ,CAAC;gBACvC,+BAA+B;gBAC/B,wBAAwB,EAAE;aAC3B;YACD,IAAI,EAAE;gBACJ,QAAQ;gBACR,YAAY;gBACZ,SAAS;gBACT,cAAc;gBACd,8BAA8B;gBAC9B,yBAAyB;gBACzB,6BAA6B;gBAC7B,wBAAwB;gBACxB,wBAAwB;aACzB;SACF;QACD,MAAM,EAAE;YACN,KAAK,EAAE,CAAC,IAAA,WAAI,EAAC,EAAE,CAAC,OAAO,EAAE,EAAE,QAAQ,EAAE,UAAU,CAAC,CAAC;YACjD,IAAI,EAAE;gBACJ,QAAQ;gBACR,uBAAuB;gBACvB,QAAQ;gBACR,eAAe;aAChB;SACF;QACD,EAAE,EAAE;YACF,KAAK,EAAE,CAAC,IAAA,WAAI,EAAC,EAAE,CAAC,OAAO,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE,KAAK,CAAC,CAAC;YAC/C,IAAI,EAAE,CAAC,WAAW,CAAC;SACpB;KACF,CAAC;AACJ,CAAC;AAED,KAAK,UAAU,WAAW,CAAC,QAAkB;IAC3C,OAAO,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC;AAC1C,CAAC;AAED;;;;;;GAMG;AACI,KAAK,UAAU,wBAAwB,CAC5C,SAAqB,EACrB,MAAc;IAEd,MAAM,cAAc,GAAe,EAAE,CAAC;IAEtC,KAAK,MAAM,QAAQ,IAAI,SAAS,EAAE,CAAC;QACjC,MAAM,WAAW,GAAG,qBAAqB,EAAE,CAAC,QAAQ,CAAC,CAAC;QAEtD,IAAI,WAAW,KAAK,SAAS,EAAE,CAAC;YAC9B,MAAM,CAAC,IAAI,CACT,6CAA6C,QAAQ,8CAA8C,CACpG,CAAC;YACF,SAAS;QACX,CAAC;QAED,gGAAgG;QAChG,wBAAwB;QACxB,MAAM,OAAO,GAAG,MAAM,WAAW,CAAC,WAAW,CAAC,IAAI,CAAC,CAAC;QAEpD,IAAI,CAAC,MAAM,OAAO,CAAC,IAAI,EAAE,CAAC,CAAC,MAAM,KAAK,CAAC,EAAE,CAAC;YACxC,MAAM,CAAC,IAAI,CACT,6CAA6C,QAAQ,mDAAmD,CACzG,CAAC;YACF,SAAS;QACX,CAAC;QAED,MAAM,UAAU,GAAG,MAAM,QAAQ,CAAC,QAAQ,EAAE,WAAW,CAAC,CAAC;QACzD,MAAM,WAAW,GAAa,CAAC,MAAM,WAAW,CAAC,QAAQ,CAAC,CAAC,CAAC;QAE5D,MAAM,CAAC,IAAI,CACT,yBAAyB,QAAQ,aAAa,UAAU,qBAAqB,WAAW,CAAC,IAAI,CAC3F,IAAI,CACL,EAAE,CACJ,CAAC;QAEF,MAAM,MAAM,GAAG,MAAM,YAAY,CAAC,YAAY,CAC5C,WAAW,CAAC,KAAK,EACjB,UAAU,EACV,WAAW,CACZ,CAAC;QAEF,IAAI,MAAM,KAAK,SAAS,EAAE,CAAC;YACzB,MAAM,CAAC,IAAI,CAAC,oBAAoB,MAAM,QAAQ,QAAQ,GAAG,CAAC,CAAC;YAC3D,cAAc,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;QAChC,CAAC;aAAM,CAAC;YACN,MAAM,CAAC,IAAI,CAAC,+BAA+B,QAAQ,GAAG,CAAC,CAAC;QAC1D,CAAC;IACH,CAAC;IAED,OAAO,cAAc,CAAC;AACxB,CAAC;AAED;;;;;GAKG;AACI,KAAK,UAAU,sBAAsB,CAAC,MAAc,EAAE,MAAc;IACzE,KAAK,MAAM,QAAQ,IAAI,MAAM,CAAC,SAAS,EAAE,CAAC;QACxC,MAAM,WAAW,GAAG,qBAAqB,EAAE,CAAC,QAAQ,CAAC,CAAC;QAEtD,IAAI,WAAW,KAAK,SAAS,EAAE,CAAC;YAC9B,MAAM,CAAC,IAAI,CACT,2CAA2C,QAAQ,8CAA8C,CAClG,CAAC;YACF,SAAS;QACX,CAAC;QAED,gGAAgG;QAChG,wBAAwB;QACxB,MAAM,OAAO,GAAG,MAAM,WAAW,CAAC,WAAW,CAAC,IAAI,CAAC,CAAC;QAEpD,IAAI,CAAC,MAAM,OAAO,CAAC,IAAI,EAAE,CAAC,CAAC,MAAM,KAAK,CAAC,EAAE,CAAC;YACxC,MAAM,CAAC,IAAI,CACT,2CAA2C,QAAQ,mDAAmD,CACvG,CAAC;YACF,SAAS;QACX,CAAC;QAED,yGAAyG;QACzG,uGAAuG;QACvG,uCAAuC;QACvC,uGAAuG;QACvG,uGAAuG;QACvG,sCAAsC;QACtC,uGAAuG;QACvG,sGAAsG;QACtG,sGAAsG;QACtG,4CAA4C;QAC5C,MAAM,IAAI,GAAG,MAAM,IAAA,iCAAiB,EAAC,WAAW,CAAC,KAAK,EAAE,MAAM,EAAE,IAAI,CAAC,CAAC;QAEtE,iCAAiC;QACjC,IAAI,IAAI,KAAK,CAAC,EAAE,CAAC;YACf,MAAM,CAAC,IAAI,CACT,2CAA2C,QAAQ,qBAAqB,CACzE,CAAC;YACF,SAAS;QACX,CAAC;QAED,MAAM,GAAG,GAAG,MAAM,QAAQ,CAAC,QAAQ,EAAE,WAAW,CAAC,CAAC;QAElD,MAAM,CAAC,IAAI,CACT,2BAA2B,IAAI,QAAQ,QAAQ,aAAa,GAAG,KAAK,CACrE,CAAC;QAEF,IAAI,CAAC;YACH,MAAM,YAAY,CAAC,SAAS,CAAC,WAAW,CAAC,KAAK,EAAE,GAAG,CAAC,CAAC;QACvD,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC
;YACf,yFAAyF;YACzF,uFAAuF;YACvF,gCAAgC;YAChC,IAAI,KAAK,YAAY,YAAY,CAAC,iBAAiB,EAAE,CAAC;gBACpD,MAAM,CAAC,IAAI,CACT,2BAA2B,QAAQ,aAAa,GAAG,qBAAqB,CACzE,CAAC;gBACF,MAAM,CAAC,KAAK,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;YAC9B,CAAC;iBAAM,CAAC;gBACN,kCAAkC;gBAClC,MAAM,KAAK,CAAC;YACd,CAAC;QACH,CAAC;IACH,CAAC;AACH,CAAC;AAED;;;;;;GAMG;AACH,KAAK,UAAU,QAAQ,CACrB,QAAkB,EAClB,WAAwB;IAExB,MAAM,IAAI,GAAG,MAAM,IAAI,CAAC,SAAS,CAAC,WAAW,CAAC,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC;IAC/D,OAAO,GAAG,MAAM,WAAW,CAAC,QAAQ,CAAC,GAAG,IAAI,EAAE,CAAC;AACjD,CAAC;AAED;;;;;;GAMG;AACH,KAAK,UAAU,WAAW,CAAC,QAAkB;IAC3C,MAAM,QAAQ,GAAG,IAAA,0BAAmB,EAAC,WAAW,CAAC,CAAC;IAClD,MAAM,YAAY,GAAG,OAAO,CAAC,GAAG,CAAC,oBAAM,CAAC,yBAAyB,CAAC,CAAC;IACnE,IAAI,MAAM,GAAG,8BAA8B,CAAC;IAE5C,IAAI,YAAY,KAAK,SAAS,IAAI,YAAY,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;QAC1D,MAAM,GAAG,GAAG,MAAM,IAAI,YAAY,EAAE,CAAC;IACvC,CAAC;IAED,OAAO,GAAG,MAAM,IAAI,+BAA+B,IAAI,QAAQ,IAAI,QAAQ,GAAG,CAAC;AACjF,CAAC"}

lib/diff-filtering-utils.js generated

@@ -1,60 +0,0 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
var ownKeys = function(o) {
ownKeys = Object.getOwnPropertyNames || function (o) {
var ar = [];
for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
return ar;
};
return ownKeys(o);
};
return function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
__setModuleDefault(result, mod);
return result;
};
})();
Object.defineProperty(exports, "__esModule", { value: true });
exports.writeDiffRangesJsonFile = writeDiffRangesJsonFile;
exports.readDiffRangesJsonFile = readDiffRangesJsonFile;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const actionsUtil = __importStar(require("./actions-util"));
function getDiffRangesJsonFilePath() {
return path.join(actionsUtil.getTemporaryDirectory(), "pr-diff-range.json");
}
function writeDiffRangesJsonFile(logger, ranges) {
const jsonContents = JSON.stringify(ranges, null, 2);
const jsonFilePath = getDiffRangesJsonFilePath();
fs.writeFileSync(jsonFilePath, jsonContents);
logger.debug(`Wrote pr-diff-range JSON file to ${jsonFilePath}:\n${jsonContents}`);
}
function readDiffRangesJsonFile(logger) {
const jsonFilePath = getDiffRangesJsonFilePath();
if (!fs.existsSync(jsonFilePath)) {
logger.debug(`Diff ranges JSON file does not exist at ${jsonFilePath}`);
return undefined;
}
const jsonContents = fs.readFileSync(jsonFilePath, "utf8");
logger.debug(`Read pr-diff-range JSON file from ${jsonFilePath}:\n${jsonContents}`);
return JSON.parse(jsonContents);
}
//# sourceMappingURL=diff-filtering-utils.js.map

@@ -1 +0,0 @@
{"version":3,"file":"diff-filtering-utils.js","sourceRoot":"","sources":["../src/diff-filtering-utils.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAgBA,0DAUC;AAED,wDAaC;AAzCD,uCAAyB;AACzB,2CAA6B;AAE7B,4DAA8C;AAS9C,SAAS,yBAAyB;IAChC,OAAO,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,qBAAqB,EAAE,EAAE,oBAAoB,CAAC,CAAC;AAC9E,CAAC;AAED,SAAgB,uBAAuB,CACrC,MAAc,EACd,MAAwB;IAExB,MAAM,YAAY,GAAG,IAAI,CAAC,SAAS,CAAC,MAAM,EAAE,IAAI,EAAE,CAAC,CAAC,CAAC;IACrD,MAAM,YAAY,GAAG,yBAAyB,EAAE,CAAC;IACjD,EAAE,CAAC,aAAa,CAAC,YAAY,EAAE,YAAY,CAAC,CAAC;IAC7C,MAAM,CAAC,KAAK,CACV,oCAAoC,YAAY,MAAM,YAAY,EAAE,CACrE,CAAC;AACJ,CAAC;AAED,SAAgB,sBAAsB,CACpC,MAAc;IAEd,MAAM,YAAY,GAAG,yBAAyB,EAAE,CAAC;IACjD,IAAI,CAAC,EAAE,CAAC,UAAU,CAAC,YAAY,CAAC,EAAE,CAAC;QACjC,MAAM,CAAC,KAAK,CAAC,2CAA2C,YAAY,EAAE,CAAC,CAAC;QACxE,OAAO,SAAS,CAAC;IACnB,CAAC;IACD,MAAM,YAAY,GAAG,EAAE,CAAC,YAAY,CAAC,YAAY,EAAE,MAAM,CAAC,CAAC;IAC3D,MAAM,CAAC,KAAK,CACV,qCAAqC,YAAY,MAAM,YAAY,EAAE,CACtE,CAAC;IACF,OAAO,IAAI,CAAC,KAAK,CAAC,YAAY,CAAqB,CAAC;AACtD,CAAC"}

lib/diff-informed-analysis-utils.js generated Normal file (114 lines changed)

@@ -0,0 +1,114 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
var ownKeys = function(o) {
ownKeys = Object.getOwnPropertyNames || function (o) {
var ar = [];
for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
return ar;
};
return ownKeys(o);
};
return function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
__setModuleDefault(result, mod);
return result;
};
})();
Object.defineProperty(exports, "__esModule", { value: true });
exports.shouldPerformDiffInformedAnalysis = shouldPerformDiffInformedAnalysis;
exports.getDiffInformedAnalysisBranches = getDiffInformedAnalysisBranches;
exports.writeDiffRangesJsonFile = writeDiffRangesJsonFile;
exports.readDiffRangesJsonFile = readDiffRangesJsonFile;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const github = __importStar(require("@actions/github"));
const actionsUtil = __importStar(require("./actions-util"));
const feature_flags_1 = require("./feature-flags");
function getPullRequestBranches() {
const pullRequest = github.context.payload.pull_request;
if (pullRequest) {
return {
base: pullRequest.base.ref,
// We use the head label instead of the head ref here, because the head
// ref lacks owner information and by itself does not uniquely identify
// the head branch (which may be in a forked repository).
head: pullRequest.head.label,
};
}
// PR analysis under Default Setup does not have the pull_request context,
// but it should set CODE_SCANNING_REF and CODE_SCANNING_BASE_BRANCH.
const codeScanningRef = process.env.CODE_SCANNING_REF;
const codeScanningBaseBranch = process.env.CODE_SCANNING_BASE_BRANCH;
if (codeScanningRef && codeScanningBaseBranch) {
return {
base: codeScanningBaseBranch,
// PR analysis under Default Setup analyzes the PR head commit instead of
// the merge commit, so we can use the provided ref directly.
head: codeScanningRef,
};
}
return undefined;
}
/**
* Check if the action should perform diff-informed analysis.
*/
async function shouldPerformDiffInformedAnalysis(codeql, features, logger) {
return ((await getDiffInformedAnalysisBranches(codeql, features, logger)) !==
undefined);
}
/**
* Get the branches to use for diff-informed analysis.
*
* @returns If the action should perform diff-informed analysis, return
* the base and head branches that should be used to compute the diff ranges.
* Otherwise return `undefined`.
*/
async function getDiffInformedAnalysisBranches(codeql, features, logger) {
if (!(await features.getValue(feature_flags_1.Feature.DiffInformedQueries, codeql))) {
return undefined;
}
const branches = getPullRequestBranches();
if (!branches) {
logger.info("Not performing diff-informed analysis " +
"because we are not analyzing a pull request.");
}
return branches;
}
function getDiffRangesJsonFilePath() {
return path.join(actionsUtil.getTemporaryDirectory(), "pr-diff-range.json");
}
function writeDiffRangesJsonFile(logger, ranges) {
const jsonContents = JSON.stringify(ranges, null, 2);
const jsonFilePath = getDiffRangesJsonFilePath();
fs.writeFileSync(jsonFilePath, jsonContents);
logger.debug(`Wrote pr-diff-range JSON file to ${jsonFilePath}:\n${jsonContents}`);
}
function readDiffRangesJsonFile(logger) {
const jsonFilePath = getDiffRangesJsonFilePath();
if (!fs.existsSync(jsonFilePath)) {
logger.debug(`Diff ranges JSON file does not exist at ${jsonFilePath}`);
return undefined;
}
const jsonContents = fs.readFileSync(jsonFilePath, "utf8");
logger.debug(`Read pr-diff-range JSON file from ${jsonFilePath}:\n${jsonContents}`);
return JSON.parse(jsonContents);
}
//# sourceMappingURL=diff-informed-analysis-utils.js.map
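
A note on the branch resolution in the new file above: getPullRequestBranches prefers the pull_request payload (using head.label so heads in forked repositories stay unambiguous) and only falls back to the CODE_SCANNING_REF / CODE_SCANNING_BASE_BRANCH variables that Default Setup sets. A minimal standalone sketch of that fallback, with a made-up payload object standing in for github.context.payload (illustrative only, not the Action's exported API):

// Sketch: `payload` mimics github.context.payload; the env variable names are
// the ones referenced in the code above.
function resolveBranches(payload, env) {
    if (payload.pull_request) {
        return {
            base: payload.pull_request.base.ref,
            head: payload.pull_request.head.label, // label, not ref, to keep the fork owner
        };
    }
    if (env.CODE_SCANNING_REF && env.CODE_SCANNING_BASE_BRANCH) {
        return { base: env.CODE_SCANNING_BASE_BRANCH, head: env.CODE_SCANNING_REF };
    }
    return undefined;
}

console.log(resolveBranches(
    { pull_request: { base: { ref: "main" }, head: { label: "octo-org:feature" } } },
    process.env,
)); // => { base: 'main', head: 'octo-org:feature' }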

@@ -0,0 +1 @@
{"version":3,"file":"diff-informed-analysis-utils.js","sourceRoot":"","sources":["../src/diff-informed-analysis-utils.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AA6CA,8EASC;AASD,0EAiBC;AAYD,0DAUC;AAED,wDAaC;AArHD,uCAAyB;AACzB,2CAA6B;AAE7B,wDAA0C;AAE1C,4DAA8C;AAE9C,mDAA6D;AAQ7D,SAAS,sBAAsB;IAC7B,MAAM,WAAW,GAAG,MAAM,CAAC,OAAO,CAAC,OAAO,CAAC,YAAY,CAAC;IACxD,IAAI,WAAW,EAAE,CAAC;QAChB,OAAO;YACL,IAAI,EAAE,WAAW,CAAC,IAAI,CAAC,GAAG;YAC1B,uEAAuE;YACvE,uEAAuE;YACvE,yDAAyD;YACzD,IAAI,EAAE,WAAW,CAAC,IAAI,CAAC,KAAK;SAC7B,CAAC;IACJ,CAAC;IAED,0EAA0E;IAC1E,qEAAqE;IACrE,MAAM,eAAe,GAAG,OAAO,CAAC,GAAG,CAAC,iBAAiB,CAAC;IACtD,MAAM,sBAAsB,GAAG,OAAO,CAAC,GAAG,CAAC,yBAAyB,CAAC;IACrE,IAAI,eAAe,IAAI,sBAAsB,EAAE,CAAC;QAC9C,OAAO;YACL,IAAI,EAAE,sBAAsB;YAC5B,yEAAyE;YACzE,6DAA6D;YAC7D,IAAI,EAAE,eAAe;SACtB,CAAC;IACJ,CAAC;IACD,OAAO,SAAS,CAAC;AACnB,CAAC;AAED;;GAEG;AACI,KAAK,UAAU,iCAAiC,CACrD,MAAc,EACd,QAA2B,EAC3B,MAAc;IAEd,OAAO,CACL,CAAC,MAAM,+BAA+B,CAAC,MAAM,EAAE,QAAQ,EAAE,MAAM,CAAC,CAAC;QACjE,SAAS,CACV,CAAC;AACJ,CAAC;AAED;;;;;;GAMG;AACI,KAAK,UAAU,+BAA+B,CACnD,MAAc,EACd,QAA2B,EAC3B,MAAc;IAEd,IAAI,CAAC,CAAC,MAAM,QAAQ,CAAC,QAAQ,CAAC,uBAAO,CAAC,mBAAmB,EAAE,MAAM,CAAC,CAAC,EAAE,CAAC;QACpE,OAAO,SAAS,CAAC;IACnB,CAAC;IAED,MAAM,QAAQ,GAAG,sBAAsB,EAAE,CAAC;IAC1C,IAAI,CAAC,QAAQ,EAAE,CAAC;QACd,MAAM,CAAC,IAAI,CACT,wCAAwC;YACtC,8CAA8C,CACjD,CAAC;IACJ,CAAC;IACD,OAAO,QAAQ,CAAC;AAClB,CAAC;AAQD,SAAS,yBAAyB;IAChC,OAAO,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,qBAAqB,EAAE,EAAE,oBAAoB,CAAC,CAAC;AAC9E,CAAC;AAED,SAAgB,uBAAuB,CACrC,MAAc,EACd,MAAwB;IAExB,MAAM,YAAY,GAAG,IAAI,CAAC,SAAS,CAAC,MAAM,EAAE,IAAI,EAAE,CAAC,CAAC,CAAC;IACrD,MAAM,YAAY,GAAG,yBAAyB,EAAE,CAAC;IACjD,EAAE,CAAC,aAAa,CAAC,YAAY,EAAE,YAAY,CAAC,CAAC;IAC7C,MAAM,CAAC,KAAK,CACV,oCAAoC,YAAY,MAAM,YAAY,EAAE,CACrE,CAAC;AACJ,CAAC;AAED,SAAgB,sBAAsB,CACpC,MAAc;IAEd,MAAM,YAAY,GAAG,yBAAyB,EAAE,CAAC;IACjD,IAAI,CAAC,EAAE,CAAC,UAAU,CAAC,YAAY,CAAC,EAAE,CAAC;QACjC,MAAM,CAAC,KAAK,CAAC,2CAA2C,YAAY,EAAE,CAAC,CAAC;QACxE,OAAO,SAAS,CAAC;IACnB,CAAC;IACD,MAAM,YAAY,GAAG,EAAE,CAAC,YAAY,CAAC,YAAY,EAAE,MAAM,CAAC,CAAC;IAC3D,MAAM,CAAC,KAAK,CACV,qCAAqC,YAAY,MAAM,YAAY,EAAE,CACtE,CAAC;IACF,OAAO,IAAI,CAAC,KAAK,CAAC,YAAY,CAAqB,CAAC;AACtD,CAAC"}

lib/feature-flags.js generated (7 lines changed)

@@ -61,6 +61,7 @@ var Feature;
Feature["CppBuildModeNone"] = "cpp_build_mode_none";
Feature["CppDependencyInstallation"] = "cpp_dependency_installation_enabled";
Feature["DiffInformedQueries"] = "diff_informed_queries";
Feature["DisableCombineSarifFiles"] = "disable_combine_sarif_files";
Feature["DisableCsharpBuildless"] = "disable_csharp_buildless";
Feature["DisableJavaBuildlessEnabled"] = "disable_java_buildless_enabled";
Feature["DisableKotlinAnalysisEnabled"] = "disable_kotlin_analysis_enabled";
@@ -96,8 +97,12 @@ exports.featureConfig = {
[Feature.DiffInformedQueries]: {
defaultValue: false,
envVar: "CODEQL_ACTION_DIFF_INFORMED_QUERIES",
minimumVersion: "2.21.0",
},
[Feature.DisableCombineSarifFiles]: {
defaultValue: false,
envVar: "CODEQL_ACTION_DISABLE_COMBINE_SARIF_FILES",
minimumVersion: undefined,
toolsFeature: tools_features_1.ToolsFeature.DatabaseInterpretResultsSupportsSarifRunProperty,
},
[Feature.DisableCsharpBuildless]: {
defaultValue: false,

File diff suppressed because one or more lines are too long
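
The new DisableCombineSarifFiles entry above follows the same shape as the other feature flags: a default value, an environment-variable override, and optional gates on the CLI version or on a tool feature. The Action's real Features class also consults the remote feature-flag API and the toolsFeature check, so the following is only a rough, assumption-laden sketch of how one entry's envVar / minimumVersion / defaultValue fields interact; isFeatureEnabled is a name invented here, and it assumes the semver package is installed.

const semver = require("semver");

// Hypothetical evaluator for a single featureConfig entry; it ignores the
// remote feature-flag API and toolsFeature gating that the real class performs.
function isFeatureEnabled(entry, codeqlVersion) {
    const override = process.env[entry.envVar];
    if (override === "false") return false; // explicit opt-out
    if (entry.minimumVersion && semver.lt(codeqlVersion, entry.minimumVersion)) {
        return false; // CLI too old for the feature
    }
    if (override === "true") return true; // explicit opt-in
    return entry.defaultValue;
}

console.log(isFeatureEnabled(
    { envVar: "CODEQL_ACTION_DISABLE_COMBINE_SARIF_FILES", minimumVersion: undefined, defaultValue: false },
    "2.21.3",
));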

lib/git-utils.js generated (126 lines changed)

@@ -33,7 +33,7 @@ var __importStar = (this && this.__importStar) || (function () {
};
})();
Object.defineProperty(exports, "__esModule", { value: true });
exports.decodeGitFilePath = exports.getGitDiffHunkHeaders = exports.getAllGitMergeBases = exports.gitRepack = exports.gitFetch = exports.deepenGitHistory = exports.determineBaseBranchHeadCommitOid = exports.getCommitOid = void 0;
exports.getFileOidsUnderPath = exports.getGitRoot = exports.decodeGitFilePath = exports.gitRepack = exports.gitFetch = exports.deepenGitHistory = exports.determineBaseBranchHeadCommitOid = exports.getCommitOid = exports.runGitCommand = void 0;
exports.getRef = getRef;
exports.isAnalyzingDefaultBranch = isAnalyzingDefaultBranch;
const core = __importStar(require("@actions/core"));
@@ -41,7 +41,7 @@ const toolrunner = __importStar(require("@actions/exec/lib/toolrunner"));
const io = __importStar(require("@actions/io"));
const actions_util_1 = require("./actions-util");
const util_1 = require("./util");
async function runGitCommand(checkoutPath, args, customErrorMessage) {
const runGitCommand = async function (workingDirectory, args, customErrorMessage) {
let stdout = "";
let stderr = "";
core.debug(`Running git command: git ${args.join(" ")}`);
@@ -56,7 +56,7 @@ async function runGitCommand(checkoutPath, args, customErrorMessage) {
stderr += data.toString();
},
},
cwd: checkoutPath,
cwd: workingDirectory,
}).exec();
return stdout;
}
@@ -69,7 +69,8 @@ async function runGitCommand(checkoutPath, args, customErrorMessage) {
core.info(`git call failed. ${customErrorMessage} Error: ${reason}`);
throw error;
}
}
};
exports.runGitCommand = runGitCommand;
/**
* Gets the SHA of the commit that is currently checked out.
*/
@@ -82,7 +83,7 @@ const getCommitOid = async function (checkoutPath, ref = "HEAD") {
// Even if this does go wrong, it's not a huge problem for the alerts to
// reported on the merge commit.
try {
const stdout = await runGitCommand(checkoutPath, ["rev-parse", ref], "Continuing with commit SHA from user input or environment.");
const stdout = await (0, exports.runGitCommand)(checkoutPath, ["rev-parse", ref], "Continuing with commit SHA from user input or environment.");
return stdout.trim();
}
catch {
@@ -106,7 +107,7 @@ const determineBaseBranchHeadCommitOid = async function (checkoutPathOverride) {
let commitOid = "";
let baseOid = "";
let headOid = "";
const stdout = await runGitCommand(checkoutPath, ["show", "-s", "--format=raw", mergeSha], "Will calculate the base branch SHA on the server.");
const stdout = await (0, exports.runGitCommand)(checkoutPath, ["show", "-s", "--format=raw", mergeSha], "Will calculate the base branch SHA on the server.");
for (const data of stdout.split("\n")) {
if (data.startsWith("commit ") && commitOid === "") {
commitOid = data.substring(7);
@@ -141,7 +142,7 @@ exports.determineBaseBranchHeadCommitOid = determineBaseBranchHeadCommitOid;
*/
const deepenGitHistory = async function () {
try {
await runGitCommand((0, actions_util_1.getOptionalInput)("checkout_path"), [
await (0, exports.runGitCommand)((0, actions_util_1.getOptionalInput)("checkout_path"), [
"fetch",
"origin",
"HEAD",
@@ -163,7 +164,7 @@ exports.deepenGitHistory = deepenGitHistory;
*/
const gitFetch = async function (branch, extraFlags) {
try {
await runGitCommand((0, actions_util_1.getOptionalInput)("checkout_path"), ["fetch", "--no-tags", ...extraFlags, "origin", `${branch}:${branch}`], `Cannot fetch ${branch}.`);
await (0, exports.runGitCommand)((0, actions_util_1.getOptionalInput)("checkout_path"), ["fetch", "--no-tags", ...extraFlags, "origin", `${branch}:${branch}`], `Cannot fetch ${branch}.`);
}
catch {
// Errors are already logged by runGitCommand()
@@ -178,68 +179,13 @@ exports.gitFetch = gitFetch;
*/
const gitRepack = async function (flags) {
try {
await runGitCommand((0, actions_util_1.getOptionalInput)("checkout_path"), ["repack", ...flags], "Cannot repack the repository.");
await (0, exports.runGitCommand)((0, actions_util_1.getOptionalInput)("checkout_path"), ["repack", ...flags], "Cannot repack the repository.");
}
catch {
// Errors are already logged by runGitCommand()
}
};
exports.gitRepack = gitRepack;
/**
* Compute the all merge bases between the given refs. Returns an empty array
* if no merge base is found, or if there is an error.
*
* This function uses the `checkout_path` to determine the repository path and
* works only when called from `analyze` or `upload-sarif`.
*/
const getAllGitMergeBases = async function (refs) {
try {
const stdout = await runGitCommand((0, actions_util_1.getOptionalInput)("checkout_path"), ["merge-base", "--all", ...refs], `Cannot get merge base of ${refs}.`);
return stdout.trim().split("\n");
}
catch {
return [];
}
};
exports.getAllGitMergeBases = getAllGitMergeBases;
/**
* Compute the diff hunk headers between the two given refs.
*
* This function uses the `checkout_path` to determine the repository path and
* works only when called from `analyze` or `upload-sarif`.
*
* @returns an array of diff hunk headers (one element per line), or undefined
* if the action was not triggered by a pull request, or if the diff could not
* be determined.
*/
const getGitDiffHunkHeaders = async function (fromRef, toRef) {
let stdout = "";
try {
stdout = await runGitCommand((0, actions_util_1.getOptionalInput)("checkout_path"), [
"-c",
"core.quotePath=false",
"diff",
"--no-renames",
"--irreversible-delete",
"-U0",
fromRef,
toRef,
], `Cannot get diff from ${fromRef} to ${toRef}.`);
}
catch {
return undefined;
}
const headers = [];
for (const line of stdout.split("\n")) {
if (line.startsWith("--- ") ||
line.startsWith("+++ ") ||
line.startsWith("@@ ")) {
headers.push(line);
}
}
return headers;
};
exports.getGitDiffHunkHeaders = getGitDiffHunkHeaders;
/**
* Decode, if necessary, a file path produced by Git. See
* https://git-scm.com/docs/git-config#Documentation/git-config.txt-corequotePath
@@ -285,6 +231,58 @@ const decodeGitFilePath = function (filePath) {
return filePath;
};
exports.decodeGitFilePath = decodeGitFilePath;
/**
* Get the root of the Git repository.
*
* @param sourceRoot The source root of the code being analyzed.
* @returns The root of the Git repository.
*/
const getGitRoot = async function (sourceRoot) {
try {
const stdout = await (0, exports.runGitCommand)(sourceRoot, ["rev-parse", "--show-toplevel"], `Cannot find Git repository root from the source root ${sourceRoot}.`);
return stdout.trim();
}
catch {
// Errors are already logged by runGitCommand()
return undefined;
}
};
exports.getGitRoot = getGitRoot;
/**
* Returns the Git OIDs of all tracked files (in the index and in the working
* tree) that are under the given base path, including files in active
* submodules. Untracked files and files not under the given base path are
* ignored.
*
* @param basePath A path into the Git repository.
* @returns a map from file paths (relative to `basePath`) to Git OIDs.
* @throws {Error} if "git ls-tree" produces unexpected output.
*/
const getFileOidsUnderPath = async function (basePath) {
// Without the --full-name flag, the path is relative to the current working
// directory of the git command, which is basePath.
const stdout = await (0, exports.runGitCommand)(basePath, ["ls-files", "--recurse-submodules", "--format=%(objectname)_%(path)"], "Cannot list Git OIDs of tracked files.");
const fileOidMap = {};
// With --format=%(objectname)_%(path), the output is a list of lines like:
// 30d998ded095371488be3a729eb61d86ed721a18_lib/git-utils.js
// d89514599a9a99f22b4085766d40af7b99974827_lib/git-utils.js.map
const regex = /^([0-9a-f]{40})_(.+)$/;
for (const line of stdout.split("\n")) {
if (line) {
const match = line.match(regex);
if (match) {
const oid = match[1];
const path = (0, exports.decodeGitFilePath)(match[2]);
fileOidMap[path] = oid;
}
else {
throw new Error(`Unexpected "git ls-files" output: ${line}`);
}
}
}
return fileOidMap;
};
exports.getFileOidsUnderPath = getFileOidsUnderPath;
function getRefFromEnv() {
// To workaround a limitation of Actions dynamic workflows not setting
// the GITHUB_REF in some cases, we accept also the ref within the

File diff suppressed because one or more lines are too long
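
Much of the churn above is the mechanical rewrite of internal runGitCommand(...) calls into (0, exports.runGitCommand)(...). Routing internal calls through the exports object is what lets the new tests below replace the function with a sinon stub. A tiny standalone illustration of the pattern (not the Action's code; assumes sinon is installed):

const sinon = require("sinon");

const mod = {
    greet: (name) => `hello ${name}`,
    // Internal call goes through the shared object, like (0, exports.runGitCommand)(...).
    greetLoudly: (name) => mod.greet(name).toUpperCase(),
};

const stub = sinon.stub(mod, "greet").returns("stubbed");
console.log(mod.greetLoudly("world")); // "STUBBED": internal callers see the stub
stub.restore();
console.log(mod.greetLoudly("world")); // "HELLO WORLD"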

lib/git-utils.test.js generated (71 lines changed)

@@ -265,4 +265,75 @@ const util_1 = require("./util");
t.deepEqual(gitUtils.decodeGitFilePath('"foo\\vbar"'), "foo\vbar");
t.deepEqual(gitUtils.decodeGitFilePath('"\\a\\b\\f\\n\\r\\t\\v"'), "\x07\b\f\n\r\t\v");
});
(0, ava_1.default)("getFileOidsUnderPath returns correct file mapping", async (t) => {
const runGitCommandStub = sinon
.stub(gitUtils, "runGitCommand")
.resolves("30d998ded095371488be3a729eb61d86ed721a18_lib/git-utils.js\n" +
"d89514599a9a99f22b4085766d40af7b99974827_lib/git-utils.js.map\n" +
"a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96_src/git-utils.ts");
try {
const result = await gitUtils.getFileOidsUnderPath("/fake/path");
t.deepEqual(result, {
"lib/git-utils.js": "30d998ded095371488be3a729eb61d86ed721a18",
"lib/git-utils.js.map": "d89514599a9a99f22b4085766d40af7b99974827",
"src/git-utils.ts": "a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96",
});
t.deepEqual(runGitCommandStub.firstCall.args, [
"/fake/path",
["ls-files", "--recurse-submodules", "--format=%(objectname)_%(path)"],
"Cannot list Git OIDs of tracked files.",
]);
}
finally {
runGitCommandStub.restore();
}
});
(0, ava_1.default)("getFileOidsUnderPath handles quoted paths", async (t) => {
const runGitCommandStub = sinon
.stub(gitUtils, "runGitCommand")
.resolves("30d998ded095371488be3a729eb61d86ed721a18_lib/normal-file.js\n" +
'd89514599a9a99f22b4085766d40af7b99974827_"lib/file with spaces.js"\n' +
'a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96_"lib/file\\twith\\ttabs.js"');
try {
const result = await gitUtils.getFileOidsUnderPath("/fake/path");
t.deepEqual(result, {
"lib/normal-file.js": "30d998ded095371488be3a729eb61d86ed721a18",
"lib/file with spaces.js": "d89514599a9a99f22b4085766d40af7b99974827",
"lib/file\twith\ttabs.js": "a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96",
});
}
finally {
runGitCommandStub.restore();
}
});
(0, ava_1.default)("getFileOidsUnderPath handles empty output", async (t) => {
const runGitCommandStub = sinon
.stub(gitUtils, "runGitCommand")
.resolves("");
try {
const result = await gitUtils.getFileOidsUnderPath("/fake/path");
t.deepEqual(result, {});
}
finally {
runGitCommandStub.restore();
}
});
(0, ava_1.default)("getFileOidsUnderPath throws on unexpected output format", async (t) => {
const runGitCommandStub = sinon
.stub(gitUtils, "runGitCommand")
.resolves("30d998ded095371488be3a729eb61d86ed721a18_lib/git-utils.js\n" +
"invalid-line-format\n" +
"a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96_src/git-utils.ts");
try {
await t.throwsAsync(async () => {
await gitUtils.getFileOidsUnderPath("/fake/path");
}, {
instanceOf: Error,
message: 'Unexpected "git ls-files" output: invalid-line-format',
});
}
finally {
runGitCommandStub.restore();
}
});
//# sourceMappingURL=git-utils.test.js.map

File diff suppressed because one or more lines are too long

@@ -173,7 +173,7 @@ async function removeUploadedSarif(uploadFailedSarifResult, logger) {
logger.info(`In test mode, therefore deleting the failed analysis to avoid impacting tool status for the Action repository. SARIF ID to delete: ${sarifID}.`);
const client = (0, api_client_1.getApiClient)();
try {
const repositoryNwo = (0, repository_1.parseRepositoryNwo)((0, util_1.getRequiredEnvParam)("GITHUB_REPOSITORY"));
const repositoryNwo = (0, repository_1.getRepositoryNwo)();
// Wait to make sure the analysis is ready for download before requesting it.
await (0, util_1.delay)(5000);
// Get the analysis associated with the uploaded sarif

File diff suppressed because one or more lines are too long
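
Several hunks in this comparison replace parseRepositoryNwo((0, util_1.getRequiredEnvParam)("GITHUB_REPOSITORY")) with a single getRepositoryNwo() call. The value involved is still the "name with owner" pair that the old code read from the GITHUB_REPOSITORY variable; the sketch below shows roughly what such a helper has to do, but the function name and return shape are illustrative, not the Action's API.

function repositoryNwoFromEnv() {
    const raw = process.env.GITHUB_REPOSITORY; // e.g. "github/codeql-action"
    if (!raw) throw new Error("GITHUB_REPOSITORY is not set");
    const [owner, repo] = raw.split("/");
    if (!owner || !repo) throw new Error(`Invalid GITHUB_REPOSITORY value: ${raw}`);
    return { owner, repo };
}

console.log(repositoryNwoFromEnv());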

lib/init-action-post.js generated

@@ -59,7 +59,7 @@ async function runWrapper() {
(0, actions_util_1.restoreInputs)();
const gitHubVersion = await (0, api_client_1.getGitHubVersion)();
(0, util_1.checkGitHubVersionInRange)(gitHubVersion, logger);
const repositoryNwo = (0, repository_1.parseRepositoryNwo)((0, util_1.getRequiredEnvParam)("GITHUB_REPOSITORY"));
const repositoryNwo = (0, repository_1.getRepositoryNwo)();
const features = new feature_flags_1.Features(gitHubVersion, repositoryNwo, (0, actions_util_1.getTemporaryDirectory)(), logger);
config = await (0, config_utils_1.getConfig)((0, actions_util_1.getTemporaryDirectory)(), logger);
if (config === undefined) {
@@ -87,7 +87,9 @@ async function runWrapper() {
...uploadFailedSarifResult,
job_status: initActionPostHelper.getFinalJobStatus(),
};
logger.info("Sending status report for init-post step.");
await (0, status_report_1.sendStatusReport)(statusReport);
logger.info("Status report sent for init-post step.");
}
}
void runWrapper();

@@ -1 +1 @@
{"version":3,"file":"init-action-post.js","sourceRoot":"","sources":["../src/init-action-post.ts"],"names":[],"mappings":";AAAA;;;;GAIG;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAEH,oDAAsC;AAEtC,iDAIwB;AACxB,6CAAgD;AAChD,iDAAmD;AACnD,kEAAoD;AACpD,mDAA2C;AAC3C,gFAAkE;AAClE,uCAA6C;AAC7C,6CAAkD;AAClD,mDAOyB;AACzB,iCAKgB;AAOhB,KAAK,UAAU,UAAU;IACvB,MAAM,MAAM,GAAG,IAAA,0BAAgB,GAAE,CAAC;IAClC,MAAM,SAAS,GAAG,IAAI,IAAI,EAAE,CAAC;IAC7B,IAAI,MAA0B,CAAC;IAC/B,IAAI,uBAES,CAAC;IACd,IAAI,CAAC;QACH,qCAAqC;QACrC,IAAA,4BAAa,GAAE,CAAC;QAEhB,MAAM,aAAa,GAAG,MAAM,IAAA,6BAAgB,GAAE,CAAC;QAC/C,IAAA,gCAAyB,EAAC,aAAa,EAAE,MAAM,CAAC,CAAC;QAEjD,MAAM,aAAa,GAAG,IAAA,+BAAkB,EACtC,IAAA,0BAAmB,EAAC,mBAAmB,CAAC,CACzC,CAAC;QACF,MAAM,QAAQ,GAAG,IAAI,wBAAQ,CAC3B,aAAa,EACb,aAAa,EACb,IAAA,oCAAqB,GAAE,EACvB,MAAM,CACP,CAAC;QAEF,MAAM,GAAG,MAAM,IAAA,wBAAS,EAAC,IAAA,oCAAqB,GAAE,EAAE,MAAM,CAAC,CAAC;QAC1D,IAAI,MAAM,KAAK,SAAS,EAAE,CAAC;YACzB,MAAM,CAAC,OAAO,CACZ,iGAAiG,CAClG,CAAC;QACJ,CAAC;aAAM,CAAC;YACN,uBAAuB,GAAG,MAAM,oBAAoB,CAAC,GAAG,CACtD,cAAc,CAAC,mCAAmC,EAClD,6BAAc,EACd,MAAM,EACN,aAAa,EACb,QAAQ,EACR,MAAM,CACP,CAAC;QACJ,CAAC;IACH,CAAC;IAAC,OAAO,cAAc,EAAE,CAAC;QACxB,MAAM,KAAK,GAAG,IAAA,gBAAS,EAAC,cAAc,CAAC,CAAC;QACxC,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;QAE9B,MAAM,gBAAgB,GAAG,MAAM,IAAA,sCAAsB,EACnD,0BAAU,CAAC,QAAQ,EACnB,IAAA,gCAAgB,EAAC,KAAK,CAAC,EACvB,SAAS,EACT,MAAM,EACN,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,EACN,KAAK,CAAC,OAAO,EACb,KAAK,CAAC,KAAK,CACZ,CAAC;QACF,IAAI,gBAAgB,KAAK,SAAS,EAAE,CAAC;YACnC,MAAM,IAAA,gCAAgB,EAAC,gBAAgB,CAAC,CAAC;QAC3C,CAAC;QACD,OAAO;IACT,CAAC;IACD,MAAM,SAAS,GAAG,oBAAoB,CAAC,iBAAiB,EAAE,CAAC;IAC3D,MAAM,CAAC,IAAI,CAAC,yBAAyB,IAAA,uCAAuB,EAAC,SAAS,CAAC,GAAG,CAAC,CAAC;IAE5E,MAAM,gBAAgB,GAAG,MAAM,IAAA,sCAAsB,EACnD,0BAAU,CAAC,QAAQ,EACnB,SAAS,EACT,SAAS,EACT,MAAM,EACN,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,CACP,CAAC;IACF,IAAI,gBAAgB,KAAK,SAAS,EAAE,CAAC;QACnC,MAAM,YAAY,GAAyB;YACzC,GAAG,gBAAgB;YACnB,GAAG,uBAAuB;YAC1B,UAAU,EAAE,oBAAoB,CAAC,iBAAiB,EAAE;SACrD,CAAC;QACF,MAAM,IAAA,gCAAgB,EAAC,YAAY,CAAC,CAAC;IACvC,CAAC;AACH,CAAC;AAED,KAAK,UAAU,EAAE,CAAC"}
{"version":3,"file":"init-action-post.js","sourceRoot":"","sources":["../src/init-action-post.ts"],"names":[],"mappings":";AAAA;;;;GAIG;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAEH,oDAAsC;AAEtC,iDAIwB;AACxB,6CAAgD;AAChD,iDAAmD;AACnD,kEAAoD;AACpD,mDAA2C;AAC3C,gFAAkE;AAClE,uCAA6C;AAC7C,6CAAgD;AAChD,mDAOyB;AACzB,iCAA8E;AAO9E,KAAK,UAAU,UAAU;IACvB,MAAM,MAAM,GAAG,IAAA,0BAAgB,GAAE,CAAC;IAClC,MAAM,SAAS,GAAG,IAAI,IAAI,EAAE,CAAC;IAC7B,IAAI,MAA0B,CAAC;IAC/B,IAAI,uBAES,CAAC;IACd,IAAI,CAAC;QACH,qCAAqC;QACrC,IAAA,4BAAa,GAAE,CAAC;QAEhB,MAAM,aAAa,GAAG,MAAM,IAAA,6BAAgB,GAAE,CAAC;QAC/C,IAAA,gCAAyB,EAAC,aAAa,EAAE,MAAM,CAAC,CAAC;QAEjD,MAAM,aAAa,GAAG,IAAA,6BAAgB,GAAE,CAAC;QACzC,MAAM,QAAQ,GAAG,IAAI,wBAAQ,CAC3B,aAAa,EACb,aAAa,EACb,IAAA,oCAAqB,GAAE,EACvB,MAAM,CACP,CAAC;QAEF,MAAM,GAAG,MAAM,IAAA,wBAAS,EAAC,IAAA,oCAAqB,GAAE,EAAE,MAAM,CAAC,CAAC;QAC1D,IAAI,MAAM,KAAK,SAAS,EAAE,CAAC;YACzB,MAAM,CAAC,OAAO,CACZ,iGAAiG,CAClG,CAAC;QACJ,CAAC;aAAM,CAAC;YACN,uBAAuB,GAAG,MAAM,oBAAoB,CAAC,GAAG,CACtD,cAAc,CAAC,mCAAmC,EAClD,6BAAc,EACd,MAAM,EACN,aAAa,EACb,QAAQ,EACR,MAAM,CACP,CAAC;QACJ,CAAC;IACH,CAAC;IAAC,OAAO,cAAc,EAAE,CAAC;QACxB,MAAM,KAAK,GAAG,IAAA,gBAAS,EAAC,cAAc,CAAC,CAAC;QACxC,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;QAE9B,MAAM,gBAAgB,GAAG,MAAM,IAAA,sCAAsB,EACnD,0BAAU,CAAC,QAAQ,EACnB,IAAA,gCAAgB,EAAC,KAAK,CAAC,EACvB,SAAS,EACT,MAAM,EACN,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,EACN,KAAK,CAAC,OAAO,EACb,KAAK,CAAC,KAAK,CACZ,CAAC;QACF,IAAI,gBAAgB,KAAK,SAAS,EAAE,CAAC;YACnC,MAAM,IAAA,gCAAgB,EAAC,gBAAgB,CAAC,CAAC;QAC3C,CAAC;QACD,OAAO;IACT,CAAC;IACD,MAAM,SAAS,GAAG,oBAAoB,CAAC,iBAAiB,EAAE,CAAC;IAC3D,MAAM,CAAC,IAAI,CAAC,yBAAyB,IAAA,uCAAuB,EAAC,SAAS,CAAC,GAAG,CAAC,CAAC;IAE5E,MAAM,gBAAgB,GAAG,MAAM,IAAA,sCAAsB,EACnD,0BAAU,CAAC,QAAQ,EACnB,SAAS,EACT,SAAS,EACT,MAAM,EACN,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,CACP,CAAC;IACF,IAAI,gBAAgB,KAAK,SAAS,EAAE,CAAC;QACnC,MAAM,YAAY,GAAyB;YACzC,GAAG,gBAAgB;YACnB,GAAG,uBAAuB;YAC1B,UAAU,EAAE,oBAAoB,CAAC,iBAAiB,EAAE;SACrD,CAAC;QACF,MAAM,CAAC,IAAI,CAAC,2CAA2C,CAAC,CAAC;QACzD,MAAM,IAAA,gCAAgB,EAAC,YAAY,CAAC,CAAC;QACrC,MAAM,CAAC,IAAI,CAAC,wCAAwC,CAAC,CAAC;IACxD,CAAC;AACH,CAAC;AAED,KAAK,UAAU,EAAE,CAAC"}

lib/init-action.js generated (16 lines changed)

@@ -50,6 +50,7 @@ const feature_flags_1 = require("./feature-flags");
const init_1 = require("./init");
const languages_1 = require("./languages");
const logging_1 = require("./logging");
const overlay_database_utils_1 = require("./overlay-database-utils");
const repository_1 = require("./repository");
const setup_codeql_1 = require("./setup-codeql");
const status_report_1 = require("./status-report");
@@ -159,7 +160,7 @@ async function run() {
const gitHubVersion = await (0, api_client_1.getGitHubVersion)();
(0, util_1.checkGitHubVersionInRange)(gitHubVersion, logger);
(0, util_1.checkActionVersion)((0, actions_util_1.getActionVersion)(), gitHubVersion);
const repositoryNwo = (0, repository_1.parseRepositoryNwo)((0, util_1.getRequiredEnvParam)("GITHUB_REPOSITORY"));
const repositoryNwo = (0, repository_1.getRepositoryNwo)();
const features = new feature_flags_1.Features(gitHubVersion, repositoryNwo, (0, actions_util_1.getTemporaryDirectory)(), logger);
const jobRunUuid = (0, uuid_1.v4)();
logger.info(`Job run UUID is ${jobRunUuid}.`);
@@ -228,7 +229,12 @@ async function run() {
return;
}
try {
(0, init_1.cleanupDatabaseClusterDirectory)(config, logger);
const sourceRoot = path.resolve((0, util_1.getRequiredEnvParam)("GITHUB_WORKSPACE"), (0, actions_util_1.getOptionalInput)("source-root") || "");
const overlayDatabaseMode = await (0, init_1.getOverlayDatabaseMode)((await codeql.getVersion()).version, config, sourceRoot, logger);
logger.info(`Using overlay database mode: ${overlayDatabaseMode}`);
if (overlayDatabaseMode !== overlay_database_utils_1.OverlayDatabaseMode.Overlay) {
(0, init_1.cleanupDatabaseClusterDirectory)(config, logger);
}
if (zstdAvailability) {
await recordZstdAvailability(config, zstdAvailability);
}
@@ -313,7 +319,8 @@ async function run() {
// for details.
core.exportVariable("CODEQL_RAM", process.env["CODEQL_RAM"] ||
(0, util_1.getMemoryFlagValue)((0, actions_util_1.getOptionalInput)("ram"), logger).toString());
core.exportVariable("CODEQL_THREADS", (0, util_1.getThreadsFlagValue)((0, actions_util_1.getOptionalInput)("threads"), logger).toString());
core.exportVariable("CODEQL_THREADS", process.env["CODEQL_THREADS"] ||
(0, util_1.getThreadsFlagValue)((0, actions_util_1.getOptionalInput)("threads"), logger).toString());
// Disable Kotlin extractor if feature flag set
if (await features.getValue(feature_flags_1.Feature.DisableKotlinAnalysisEnabled)) {
core.exportVariable("CODEQL_EXTRACTOR_JAVA_AGENT_DISABLE_KOTLIN", "true");
@@ -408,8 +415,7 @@ async function run() {
core.exportVariable("CODEQL_EXTRACTOR_PYTHON_EXTRACT_STDLIB", "true");
}
}
const sourceRoot = path.resolve((0, util_1.getRequiredEnvParam)("GITHUB_WORKSPACE"), (0, actions_util_1.getOptionalInput)("source-root") || "");
const tracerConfig = await (0, init_1.runInit)(codeql, config, sourceRoot, "Runner.Worker.exe", (0, actions_util_1.getOptionalInput)("registries"), apiDetails, logger);
const tracerConfig = await (0, init_1.runInit)(codeql, config, sourceRoot, "Runner.Worker.exe", (0, actions_util_1.getOptionalInput)("registries"), apiDetails, overlayDatabaseMode, logger);
if (tracerConfig !== undefined) {
for (const [key, value] of Object.entries(tracerConfig.env)) {
core.exportVariable(key, value);

File diff suppressed because one or more lines are too long
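
The CODEQL_RAM and CODEQL_THREADS exports above now keep a value that is already present in the environment instead of always overwriting it with the computed flag value. A simplified sketch of the resulting precedence; threadsFallback is a stand-in invented here, and the real getThreadsFlagValue also validates the input and caps it at the available cores.

const os = require("os");

// Stand-in for getThreadsFlagValue: use the action input if given, else all cores.
function threadsFallback(threadsInput) {
    return threadsInput ? Number(threadsInput) : os.cpus().length;
}

// Mirrors: process.env["CODEQL_THREADS"] || getThreadsFlagValue(input, logger).toString()
function effectiveThreads(threadsInput) {
    return process.env.CODEQL_THREADS || threadsFallback(threadsInput).toString();
}

console.log(effectiveThreads("2")); // "2", unless CODEQL_THREADS was already set in the env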

lib/init.js generated (34 lines changed)

@@ -35,6 +35,7 @@ var __importStar = (this && this.__importStar) || (function () {
Object.defineProperty(exports, "__esModule", { value: true });
exports.initCodeQL = initCodeQL;
exports.initConfig = initConfig;
exports.getOverlayDatabaseMode = getOverlayDatabaseMode;
exports.runInit = runInit;
exports.printPathFiltersWarning = printPathFiltersWarning;
exports.checkInstallPython311 = checkInstallPython311;
@@ -43,10 +44,13 @@ const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const toolrunner = __importStar(require("@actions/exec/lib/toolrunner"));
const io = __importStar(require("@actions/io"));
const semver = __importStar(require("semver"));
const actions_util_1 = require("./actions-util");
const codeql_1 = require("./codeql");
const configUtils = __importStar(require("./config-utils"));
const git_utils_1 = require("./git-utils");
const languages_1 = require("./languages");
const overlay_database_utils_1 = require("./overlay-database-utils");
const tools_features_1 = require("./tools-features");
const tracer_config_1 = require("./tracer-config");
const util = __importStar(require("./util"));
@@ -73,7 +77,33 @@ async function initConfig(inputs, codeql) {
logger.endGroup();
return config;
}
async function runInit(codeql, config, sourceRoot, processName, registriesInput, apiDetails, logger) {
async function getOverlayDatabaseMode(codeqlVersion, config, sourceRoot, logger) {
const overlayDatabaseMode = process.env.CODEQL_OVERLAY_DATABASE_MODE;
if (overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.Overlay ||
overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.OverlayBase) {
if (config.buildMode !== util.BuildMode.None) {
logger.warning(`Cannot build an ${overlayDatabaseMode} database because ` +
`build-mode is set to "${config.buildMode}" instead of "none". ` +
"Falling back to creating a normal full database instead.");
return overlay_database_utils_1.OverlayDatabaseMode.None;
}
if (semver.lt(codeqlVersion, overlay_database_utils_1.CODEQL_OVERLAY_MINIMUM_VERSION)) {
logger.warning(`Cannot build an ${overlayDatabaseMode} database because ` +
`the CodeQL CLI is older than ${overlay_database_utils_1.CODEQL_OVERLAY_MINIMUM_VERSION}. ` +
"Falling back to creating a normal full database instead.");
return overlay_database_utils_1.OverlayDatabaseMode.None;
}
if ((await (0, git_utils_1.getGitRoot)(sourceRoot)) === undefined) {
logger.warning(`Cannot build an ${overlayDatabaseMode} database because ` +
`the source root "${sourceRoot}" is not inside a git repository. ` +
"Falling back to creating a normal full database instead.");
return overlay_database_utils_1.OverlayDatabaseMode.None;
}
return overlayDatabaseMode;
}
return overlay_database_utils_1.OverlayDatabaseMode.None;
}
async function runInit(codeql, config, sourceRoot, processName, registriesInput, apiDetails, overlayDatabaseMode, logger) {
fs.mkdirSync(config.dbLocation, { recursive: true });
const { registriesAuthTokens, qlconfigFile } = await configUtils.generateRegistries(registriesInput, config.tempDir, logger);
await configUtils.wrapEnvironment({
@@ -81,7 +111,7 @@ async function runInit(codeql, config, sourceRoot, processName, registriesInput,
CODEQL_REGISTRIES_AUTH: registriesAuthTokens,
},
// Init a database cluster
async () => await codeql.databaseInitCluster(config, sourceRoot, processName, qlconfigFile, logger));
async () => await codeql.databaseInitCluster(config, sourceRoot, processName, qlconfigFile, overlayDatabaseMode, logger));
return await (0, tracer_config_1.getCombinedTracerConfig)(codeql, config);
}
function printPathFiltersWarning(config, logger) {
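
getOverlayDatabaseMode above only honours CODEQL_OVERLAY_DATABASE_MODE when the build mode is "none", the CLI is at least CODEQL_OVERLAY_MINIMUM_VERSION, and the source root sits inside a git checkout; otherwise it warns and falls back to a normal full database. A condensed standalone restatement of that decision (same constant value as overlay-database-utils.js below; decideOverlayMode is a name invented here, and semver is assumed to be installed):

const semver = require("semver");
const CODEQL_OVERLAY_MINIMUM_VERSION = "2.20.5"; // matches overlay-database-utils.js

function decideOverlayMode({ requested, buildMode, cliVersion, inGitRepo }) {
    if (requested !== "overlay" && requested !== "overlay-base") return "none";
    if (buildMode !== "none") return "none";                                  // warns: wrong build-mode
    if (semver.lt(cliVersion, CODEQL_OVERLAY_MINIMUM_VERSION)) return "none"; // warns: CLI too old
    if (!inGitRepo) return "none";                                            // warns: no git root found
    return requested;
}

console.log(decideOverlayMode({
    requested: process.env.CODEQL_OVERLAY_DATABASE_MODE ?? "none",
    buildMode: "none",
    cliVersion: "2.21.3",
    inGitRepo: true,
}));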

@@ -1 +1 @@
{"version":3,"file":"init.js","sourceRoot":"","sources":["../src/init.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAoBA,gCAyCC;AAED,gCAgBC;AAED,0BAkCC;AAED,0DAeC;AAMD,sDAkBC;AAED,0EAkDC;AAhND,uCAAyB;AACzB,2CAA6B;AAE7B,yEAA2D;AAC3D,gDAAkC;AAElC,iDAAsE;AAEtE,qCAA+C;AAC/C,4DAA8C;AAE9C,2CAA0D;AAK1D,qDAAgD;AAChD,mDAAwE;AACxE,6CAA+B;AAExB,KAAK,UAAU,UAAU,CAC9B,UAA8B,EAC9B,UAA4B,EAC5B,OAAe,EACf,OAA2B,EAC3B,iBAA2C,EAC3C,QAA2B,EAC3B,MAAc;IAQd,MAAM,CAAC,UAAU,CAAC,oBAAoB,CAAC,CAAC;IACxC,MAAM,EACJ,MAAM,EACN,yBAAyB,EACzB,WAAW,EACX,YAAY,EACZ,gBAAgB,GACjB,GAAG,MAAM,IAAA,oBAAW,EACnB,UAAU,EACV,UAAU,EACV,OAAO,EACP,OAAO,EACP,iBAAiB,EACjB,MAAM,EACN,QAAQ,EACR,IAAI,CACL,CAAC;IACF,MAAM,MAAM,CAAC,YAAY,EAAE,CAAC;IAC5B,MAAM,CAAC,QAAQ,EAAE,CAAC;IAClB,OAAO;QACL,MAAM;QACN,yBAAyB;QACzB,WAAW;QACX,YAAY;QACZ,gBAAgB;KACjB,CAAC;AACJ,CAAC;AAEM,KAAK,UAAU,UAAU,CAC9B,MAAoC,EACpC,MAAc;IAEd,MAAM,MAAM,GAAG,MAAM,CAAC,MAAM,CAAC;IAC7B,MAAM,CAAC,UAAU,CAAC,6BAA6B,CAAC,CAAC;IACjD,MAAM,MAAM,GAAG,MAAM,WAAW,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC;IACpD,IACE,CAAC,CAAC,MAAM,MAAM,CAAC,eAAe,CAC5B,6BAAY,CAAC,kCAAkC,CAChD,CAAC,EACF,CAAC;QACD,uBAAuB,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IAC1C,CAAC;IACD,MAAM,CAAC,QAAQ,EAAE,CAAC;IAClB,OAAO,MAAM,CAAC;AAChB,CAAC;AAEM,KAAK,UAAU,OAAO,CAC3B,MAAc,EACd,MAA0B,EAC1B,UAAkB,EAClB,WAA+B,EAC/B,eAAmC,EACnC,UAAoC,EACpC,MAAc;IAEd,EAAE,CAAC,SAAS,CAAC,MAAM,CAAC,UAAU,EAAE,EAAE,SAAS,EAAE,IAAI,EAAE,CAAC,CAAC;IAErD,MAAM,EAAE,oBAAoB,EAAE,YAAY,EAAE,GAC1C,MAAM,WAAW,CAAC,kBAAkB,CAClC,eAAe,EACf,MAAM,CAAC,OAAO,EACd,MAAM,CACP,CAAC;IACJ,MAAM,WAAW,CAAC,eAAe,CAC/B;QACE,YAAY,EAAE,UAAU,CAAC,IAAI;QAC7B,sBAAsB,EAAE,oBAAoB;KAC7C;IAED,0BAA0B;IAC1B,KAAK,IAAI,EAAE,CACT,MAAM,MAAM,CAAC,mBAAmB,CAC9B,MAAM,EACN,UAAU,EACV,WAAW,EACX,YAAY,EACZ,MAAM,CACP,CACJ,CAAC;IACF,OAAO,MAAM,IAAA,uCAAuB,EAAC,MAAM,EAAE,MAAM,CAAC,CAAC;AACvD,CAAC;AAED,SAAgB,uBAAuB,CACrC,MAA0B,EAC1B,MAAc;IAEd,qEAAqE;IACrE,sEAAsE;IACtE,IACE,CAAC,MAAM,CAAC,iBAAiB,CAAC,KAAK,EAAE,MAAM;QACrC,MAAM,CAAC,iBAAiB,CAAC,cAAc,CAAC,EAAE,MAAM,CAAC;QACnD,CAAC,MAAM,CAAC,SAAS,CAAC,KAAK,CAAC,6BAAiB,CAAC,EAC1C,CAAC;QACD,MAAM,CAAC,OAAO,CACZ,mGAAmG,CACpG,CAAC;IACJ,CAAC;AACH,CAAC;AAED;;;GAGG;AACI,KAAK,UAAU,qBAAqB,CACzC,SAAqB,EACrB,MAAc;IAEd,IACE,SAAS,CAAC,QAAQ,CAAC,oBAAQ,CAAC,MAAM,CAAC;QACnC,OAAO,CAAC,QAAQ,KAAK,OAAO;QAC5B,CAAC,CAAC,MAAM,MAAM,CAAC,UAAU,EAAE,CAAC,CAAC,QAAQ,EAAE,iBAAiB,EACxD,CAAC;QACD,MAAM,MAAM,GAAG,IAAI,CAAC,OAAO,CACzB,SAAS,EACT,iBAAiB,EACjB,oBAAoB,CACrB,CAAC;QACF,MAAM,IAAI,UAAU,CAAC,UAAU,CAAC,MAAM,EAAE,CAAC,KAAK,CAAC,YAAY,EAAE,IAAI,CAAC,EAAE;YAClE,MAAM;SACP,CAAC,CAAC,IAAI,EAAE,CAAC;IACZ,CAAC;AACH,CAAC;AAED,SAAgB,+BAA+B,CAC7C,MAA0B,EAC1B,MAAc;AACd,+FAA+F;AAC/F,eAAe;AACf,MAAM,GAAG,EAAE,CAAC,MAAM;IAElB,IACE,EAAE,CAAC,UAAU,CAAC,MAAM,CAAC,UAAU,CAAC;QAChC,CAAC,EAAE,CAAC,QAAQ,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,MAAM,EAAE;YACtC,EAAE,CAAC,WAAW,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC,EAC3C,CAAC;QACD,MAAM,CAAC,OAAO,CACZ,kCAAkC,MAAM,CAAC,UAAU,4CAA4C,CAChG,CAAC;QACF,IAAI,CAAC;YACH,MAAM,CAAC,MAAM,CAAC,UAAU,EAAE;gBACxB,KAAK,EAAE,IAAI;gBACX,UAAU,EAAE,CAAC;gBACb,SAAS,EAAE,IAAI;aAChB,CAAC,CAAC;YAEH,MAAM,CAAC,IAAI,CACT,yCAAyC,MAAM,CAAC,UAAU,GAAG,CAC9D,CAAC;QACJ,CAAC;QAAC,OAAO,CAAC,EAAE,CAAC;YACX,MAAM,KAAK,GAAG,mEACZ,IAAA,+BAAgB,EAAC,aAAa,CAAC;gBAC7B,CAAC,CAAC,sCAAsC,MAAM,CAAC,UAAU,IAAI;gBAC7D,CAAC,CAAC,kCAAkC,MAAM,CAAC,UAAU,IAAI;oBACvD,yEACN,iEAAiE,CAAC;YAElE,kGAAkG;YAClG,IAAI,IAAA,iCAAkB,GAAE,EAAE,CAAC;gBACzB,MAAM,IAAI,IAAI,CAAC,kBAAkB,CAC/B,GAAG,KAAK,4GAA4G;oBAClH,sEAAsE,IAAI,CAAC,eAAe,CACxF,CAAC,CACF,EAAE,CACN,CAAC;YACJ,CAAC;iBAAM,CAAC;gBACN,MAAM,IAAI,KAAK,CACb,GAAG,KAAK,sDAAsD;oBAC5D,+EAA+E;oBAC/E,yCAAyC,IAA
I,CAAC,eAAe,CAAC,CAAC,CAAC,EAAE,CACrE,CAAC;YACJ,CAAC;QACH,CAAC;IACH,CAAC;AACH,CAAC"}
{"version":3,"file":"init.js","sourceRoot":"","sources":["../src/init.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AA0BA,gCAyCC;AAED,gCAgBC;AAED,wDAuCC;AAED,0BAoCC;AAED,0DAeC;AAMD,sDAkBC;AAED,0EAkDC;AAjQD,uCAAyB;AACzB,2CAA6B;AAE7B,yEAA2D;AAC3D,gDAAkC;AAClC,+CAAiC;AAEjC,iDAAsE;AAEtE,qCAA+C;AAC/C,4DAA8C;AAE9C,2CAAyC;AACzC,2CAA0D;AAE1D,qEAGkC;AAIlC,qDAAgD;AAChD,mDAAwE;AACxE,6CAA+B;AAExB,KAAK,UAAU,UAAU,CAC9B,UAA8B,EAC9B,UAA4B,EAC5B,OAAe,EACf,OAA2B,EAC3B,iBAA2C,EAC3C,QAA2B,EAC3B,MAAc;IAQd,MAAM,CAAC,UAAU,CAAC,oBAAoB,CAAC,CAAC;IACxC,MAAM,EACJ,MAAM,EACN,yBAAyB,EACzB,WAAW,EACX,YAAY,EACZ,gBAAgB,GACjB,GAAG,MAAM,IAAA,oBAAW,EACnB,UAAU,EACV,UAAU,EACV,OAAO,EACP,OAAO,EACP,iBAAiB,EACjB,MAAM,EACN,QAAQ,EACR,IAAI,CACL,CAAC;IACF,MAAM,MAAM,CAAC,YAAY,EAAE,CAAC;IAC5B,MAAM,CAAC,QAAQ,EAAE,CAAC;IAClB,OAAO;QACL,MAAM;QACN,yBAAyB;QACzB,WAAW;QACX,YAAY;QACZ,gBAAgB;KACjB,CAAC;AACJ,CAAC;AAEM,KAAK,UAAU,UAAU,CAC9B,MAAoC,EACpC,MAAc;IAEd,MAAM,MAAM,GAAG,MAAM,CAAC,MAAM,CAAC;IAC7B,MAAM,CAAC,UAAU,CAAC,6BAA6B,CAAC,CAAC;IACjD,MAAM,MAAM,GAAG,MAAM,WAAW,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC;IACpD,IACE,CAAC,CAAC,MAAM,MAAM,CAAC,eAAe,CAC5B,6BAAY,CAAC,kCAAkC,CAChD,CAAC,EACF,CAAC;QACD,uBAAuB,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IAC1C,CAAC;IACD,MAAM,CAAC,QAAQ,EAAE,CAAC;IAClB,OAAO,MAAM,CAAC;AAChB,CAAC;AAEM,KAAK,UAAU,sBAAsB,CAC1C,aAAqB,EACrB,MAA0B,EAC1B,UAAkB,EAClB,MAAc;IAEd,MAAM,mBAAmB,GAAG,OAAO,CAAC,GAAG,CAAC,4BAA4B,CAAC;IAErE,IACE,mBAAmB,KAAK,4CAAmB,CAAC,OAAO;QACnD,mBAAmB,KAAK,4CAAmB,CAAC,WAAW,EACvD,CAAC;QACD,IAAI,MAAM,CAAC,SAAS,KAAK,IAAI,CAAC,SAAS,CAAC,IAAI,EAAE,CAAC;YAC7C,MAAM,CAAC,OAAO,CACZ,mBAAmB,mBAAmB,oBAAoB;gBACxD,yBAAyB,MAAM,CAAC,SAAS,uBAAuB;gBAChE,0DAA0D,CAC7D,CAAC;YACF,OAAO,4CAAmB,CAAC,IAAI,CAAC;QAClC,CAAC;QACD,IAAI,MAAM,CAAC,EAAE,CAAC,aAAa,EAAE,uDAA8B,CAAC,EAAE,CAAC;YAC7D,MAAM,CAAC,OAAO,CACZ,mBAAmB,mBAAmB,oBAAoB;gBACxD,gCAAgC,uDAA8B,IAAI;gBAClE,0DAA0D,CAC7D,CAAC;YACF,OAAO,4CAAmB,CAAC,IAAI,CAAC;QAClC,CAAC;QACD,IAAI,CAAC,MAAM,IAAA,sBAAU,EAAC,UAAU,CAAC,CAAC,KAAK,SAAS,EAAE,CAAC;YACjD,MAAM,CAAC,OAAO,CACZ,mBAAmB,mBAAmB,oBAAoB;gBACxD,oBAAoB,UAAU,oCAAoC;gBAClE,0DAA0D,CAC7D,CAAC;YACF,OAAO,4CAAmB,CAAC,IAAI,CAAC;QAClC,CAAC;QACD,OAAO,mBAA0C,CAAC;IACpD,CAAC;IACD,OAAO,4CAAmB,CAAC,IAAI,CAAC;AAClC,CAAC;AAEM,KAAK,UAAU,OAAO,CAC3B,MAAc,EACd,MAA0B,EAC1B,UAAkB,EAClB,WAA+B,EAC/B,eAAmC,EACnC,UAAoC,EACpC,mBAAwC,EACxC,MAAc;IAEd,EAAE,CAAC,SAAS,CAAC,MAAM,CAAC,UAAU,EAAE,EAAE,SAAS,EAAE,IAAI,EAAE,CAAC,CAAC;IAErD,MAAM,EAAE,oBAAoB,EAAE,YAAY,EAAE,GAC1C,MAAM,WAAW,CAAC,kBAAkB,CAClC,eAAe,EACf,MAAM,CAAC,OAAO,EACd,MAAM,CACP,CAAC;IACJ,MAAM,WAAW,CAAC,eAAe,CAC/B;QACE,YAAY,EAAE,UAAU,CAAC,IAAI;QAC7B,sBAAsB,EAAE,oBAAoB;KAC7C;IAED,0BAA0B;IAC1B,KAAK,IAAI,EAAE,CACT,MAAM,MAAM,CAAC,mBAAmB,CAC9B,MAAM,EACN,UAAU,EACV,WAAW,EACX,YAAY,EACZ,mBAAmB,EACnB,MAAM,CACP,CACJ,CAAC;IACF,OAAO,MAAM,IAAA,uCAAuB,EAAC,MAAM,EAAE,MAAM,CAAC,CAAC;AACvD,CAAC;AAED,SAAgB,uBAAuB,CACrC,MAA0B,EAC1B,MAAc;IAEd,qEAAqE;IACrE,sEAAsE;IACtE,IACE,CAAC,MAAM,CAAC,iBAAiB,CAAC,KAAK,EAAE,MAAM;QACrC,MAAM,CAAC,iBAAiB,CAAC,cAAc,CAAC,EAAE,MAAM,CAAC;QACnD,CAAC,MAAM,CAAC,SAAS,CAAC,KAAK,CAAC,6BAAiB,CAAC,EAC1C,CAAC;QACD,MAAM,CAAC,OAAO,CACZ,mGAAmG,CACpG,CAAC;IACJ,CAAC;AACH,CAAC;AAED;;;GAGG;AACI,KAAK,UAAU,qBAAqB,CACzC,SAAqB,EACrB,MAAc;IAEd,IACE,SAAS,CAAC,QAAQ,CAAC,oBAAQ,CAAC,MAAM,CAAC;QACnC,OAAO,CAAC,QAAQ,KAAK,OAAO;QAC5B,CAAC,CAAC,MAAM,MAAM,CAAC,UAAU,EAAE,CAAC,CAAC,QAAQ,EAAE,iBAAiB,EACxD,CAAC;QACD,MAAM,MAAM,GAAG,IAAI,CAAC,OAAO,CACzB,SAAS,EACT,iBAAiB,EACjB,oBAAoB,CACrB,CAAC;QACF,MAAM,IAAI,UAAU,CAAC,UAAU,CAAC,MAAM,EAAE,CAAC,KAAK,CAAC,YAAY,EAAE,IAAI,CAAC,EAAE;YAClE,MAAM;SACP,CAAC,CAAC,IAAI,EAAE,CAAC;IACZ,CAAC;AACH,CAAC;AAED,SAAgB,+
BAA+B,CAC7C,MAA0B,EAC1B,MAAc;AACd,+FAA+F;AAC/F,eAAe;AACf,MAAM,GAAG,EAAE,CAAC,MAAM;IAElB,IACE,EAAE,CAAC,UAAU,CAAC,MAAM,CAAC,UAAU,CAAC;QAChC,CAAC,EAAE,CAAC,QAAQ,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,MAAM,EAAE;YACtC,EAAE,CAAC,WAAW,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC,EAC3C,CAAC;QACD,MAAM,CAAC,OAAO,CACZ,kCAAkC,MAAM,CAAC,UAAU,4CAA4C,CAChG,CAAC;QACF,IAAI,CAAC;YACH,MAAM,CAAC,MAAM,CAAC,UAAU,EAAE;gBACxB,KAAK,EAAE,IAAI;gBACX,UAAU,EAAE,CAAC;gBACb,SAAS,EAAE,IAAI;aAChB,CAAC,CAAC;YAEH,MAAM,CAAC,IAAI,CACT,yCAAyC,MAAM,CAAC,UAAU,GAAG,CAC9D,CAAC;QACJ,CAAC;QAAC,OAAO,CAAC,EAAE,CAAC;YACX,MAAM,KAAK,GAAG,mEACZ,IAAA,+BAAgB,EAAC,aAAa,CAAC;gBAC7B,CAAC,CAAC,sCAAsC,MAAM,CAAC,UAAU,IAAI;gBAC7D,CAAC,CAAC,kCAAkC,MAAM,CAAC,UAAU,IAAI;oBACvD,yEACN,iEAAiE,CAAC;YAElE,kGAAkG;YAClG,IAAI,IAAA,iCAAkB,GAAE,EAAE,CAAC;gBACzB,MAAM,IAAI,IAAI,CAAC,kBAAkB,CAC/B,GAAG,KAAK,4GAA4G;oBAClH,sEAAsE,IAAI,CAAC,eAAe,CACxF,CAAC,CACF,EAAE,CACN,CAAC;YACJ,CAAC;iBAAM,CAAC;gBACN,MAAM,IAAI,KAAK,CACb,GAAG,KAAK,sDAAsD;oBAC5D,+EAA+E;oBAC/E,yCAAyC,IAAI,CAAC,eAAe,CAAC,CAAC,CAAC,EAAE,CACrE,CAAC;YACJ,CAAC;QACH,CAAC;IACH,CAAC;AACH,CAAC"}

lib/overlay-database-utils.js generated Normal file (129 lines changed)

@@ -0,0 +1,129 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
var ownKeys = function(o) {
ownKeys = Object.getOwnPropertyNames || function (o) {
var ar = [];
for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
return ar;
};
return ownKeys(o);
};
return function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
__setModuleDefault(result, mod);
return result;
};
})();
Object.defineProperty(exports, "__esModule", { value: true });
exports.CODEQL_OVERLAY_MINIMUM_VERSION = exports.OverlayDatabaseMode = void 0;
exports.writeBaseDatabaseOidsFile = writeBaseDatabaseOidsFile;
exports.writeOverlayChangesFile = writeOverlayChangesFile;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const actions_util_1 = require("./actions-util");
const git_utils_1 = require("./git-utils");
var OverlayDatabaseMode;
(function (OverlayDatabaseMode) {
OverlayDatabaseMode["Overlay"] = "overlay";
OverlayDatabaseMode["OverlayBase"] = "overlay-base";
OverlayDatabaseMode["None"] = "none";
})(OverlayDatabaseMode || (exports.OverlayDatabaseMode = OverlayDatabaseMode = {}));
exports.CODEQL_OVERLAY_MINIMUM_VERSION = "2.20.5";
/**
* Writes a JSON file containing Git OIDs for all tracked files (represented
* by path relative to the source root) under the source root. The file is
* written into the database location specified in the config.
*
* @param config The configuration object containing the database location
* @param sourceRoot The root directory containing the source files to process
* @throws {Error} If the Git repository root cannot be determined
*/
async function writeBaseDatabaseOidsFile(config, sourceRoot) {
const gitFileOids = await (0, git_utils_1.getFileOidsUnderPath)(sourceRoot);
const gitFileOidsJson = JSON.stringify(gitFileOids);
const baseDatabaseOidsFilePath = getBaseDatabaseOidsFilePath(config);
await fs.promises.writeFile(baseDatabaseOidsFilePath, gitFileOidsJson);
}
/**
* Reads and parses the JSON file containing the base database Git OIDs.
* This file contains the mapping of file paths to their corresponding Git OIDs
* that was previously written by writeBaseDatabaseOidsFile().
*
* @param config The configuration object containing the database location
* @param logger The logger instance to use for error reporting
* @returns An object mapping file paths (relative to source root) to their Git OIDs
* @throws {Error} If the file cannot be read or parsed
*/
async function readBaseDatabaseOidsFile(config, logger) {
const baseDatabaseOidsFilePath = getBaseDatabaseOidsFilePath(config);
try {
const contents = await fs.promises.readFile(baseDatabaseOidsFilePath, "utf-8");
return JSON.parse(contents);
}
catch (e) {
logger.error("Failed to read overlay-base file OIDs from " +
`${baseDatabaseOidsFilePath}: ${e.message || e}`);
throw e;
}
}
function getBaseDatabaseOidsFilePath(config) {
return path.join(config.dbLocation, "base-database-oids.json");
}
/**
* Writes a JSON file containing the source-root-relative paths of files under
* `sourceRoot` that have changed (added, removed, or modified) from the overlay
* base database.
*
* This function uses the Git index to determine which files have changed, so it
* requires the following preconditions, both when this function is called and
* when the overlay-base database was initialized:
*
* - It requires that `sourceRoot` is inside a Git repository.
* - It assumes that all changes in the working tree are staged in the index.
* - It assumes that all files of interest are tracked by Git, e.g. not covered
* by `.gitignore`.
*/
async function writeOverlayChangesFile(config, sourceRoot, logger) {
const baseFileOids = await readBaseDatabaseOidsFile(config, logger);
const overlayFileOids = await (0, git_utils_1.getFileOidsUnderPath)(sourceRoot);
const changedFiles = computeChangedFiles(baseFileOids, overlayFileOids);
logger.info(`Found ${changedFiles.length} changed file(s) under ${sourceRoot}.`);
const changedFilesJson = JSON.stringify({ changes: changedFiles });
const overlayChangesFile = path.join((0, actions_util_1.getTemporaryDirectory)(), "overlay-changes.json");
logger.debug(`Writing overlay changed files to ${overlayChangesFile}: ${changedFilesJson}`);
await fs.promises.writeFile(overlayChangesFile, changedFilesJson);
return overlayChangesFile;
}
function computeChangedFiles(baseFileOids, overlayFileOids) {
const changes = [];
for (const [file, oid] of Object.entries(overlayFileOids)) {
if (!(file in baseFileOids) || baseFileOids[file] !== oid) {
changes.push(file);
}
}
for (const file of Object.keys(baseFileOids)) {
if (!(file in overlayFileOids)) {
changes.push(file);
}
}
return changes;
}
//# sourceMappingURL=overlay-database-utils.js.map
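
computeChangedFiles above reports a path whenever its OID differs between the two snapshots or it is present on only one side, so the changes file ends up listing added, modified, and deleted files alike. A worked example using the same function body and the made-up OIDs that the test file below uses as fixtures:

function computeChangedFiles(baseFileOids, overlayFileOids) {
    const changes = [];
    for (const [file, oid] of Object.entries(overlayFileOids)) {
        if (!(file in baseFileOids) || baseFileOids[file] !== oid) {
            changes.push(file); // added or modified relative to the overlay base
        }
    }
    for (const file of Object.keys(baseFileOids)) {
        if (!(file in overlayFileOids)) {
            changes.push(file); // deleted relative to the overlay base
        }
    }
    return changes;
}

console.log(computeChangedFiles(
    { "unchanged.js": "aaa111", "modified.js": "bbb222", "deleted.js": "ccc333" },
    { "unchanged.js": "aaa111", "modified.js": "ddd444", "added.js": "eee555" },
)); // => [ 'modified.js', 'added.js', 'deleted.js' ]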

@@ -0,0 +1 @@
{"version":3,"file":"overlay-database-utils.js","sourceRoot":"","sources":["../src/overlay-database-utils.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAyBA,8DAQC;AAkDD,0DAsBC;AAzGD,uCAAyB;AACzB,2CAA6B;AAE7B,iDAAuD;AAEvD,2CAAmD;AAGnD,IAAY,mBAIX;AAJD,WAAY,mBAAmB;IAC7B,0CAAmB,CAAA;IACnB,mDAA4B,CAAA;IAC5B,oCAAa,CAAA;AACf,CAAC,EAJW,mBAAmB,mCAAnB,mBAAmB,QAI9B;AAEY,QAAA,8BAA8B,GAAG,QAAQ,CAAC;AAEvD;;;;;;;;GAQG;AACI,KAAK,UAAU,yBAAyB,CAC7C,MAAc,EACd,UAAkB;IAElB,MAAM,WAAW,GAAG,MAAM,IAAA,gCAAoB,EAAC,UAAU,CAAC,CAAC;IAC3D,MAAM,eAAe,GAAG,IAAI,CAAC,SAAS,CAAC,WAAW,CAAC,CAAC;IACpD,MAAM,wBAAwB,GAAG,2BAA2B,CAAC,MAAM,CAAC,CAAC;IACrE,MAAM,EAAE,CAAC,QAAQ,CAAC,SAAS,CAAC,wBAAwB,EAAE,eAAe,CAAC,CAAC;AACzE,CAAC;AAED;;;;;;;;;GASG;AACH,KAAK,UAAU,wBAAwB,CACrC,MAAc,EACd,MAAc;IAEd,MAAM,wBAAwB,GAAG,2BAA2B,CAAC,MAAM,CAAC,CAAC;IACrE,IAAI,CAAC;QACH,MAAM,QAAQ,GAAG,MAAM,EAAE,CAAC,QAAQ,CAAC,QAAQ,CACzC,wBAAwB,EACxB,OAAO,CACR,CAAC;QACF,OAAO,IAAI,CAAC,KAAK,CAAC,QAAQ,CAA8B,CAAC;IAC3D,CAAC;IAAC,OAAO,CAAC,EAAE,CAAC;QACX,MAAM,CAAC,KAAK,CACV,6CAA6C;YAC3C,GAAG,wBAAwB,KAAM,CAAS,CAAC,OAAO,IAAI,CAAC,EAAE,CAC5D,CAAC;QACF,MAAM,CAAC,CAAC;IACV,CAAC;AACH,CAAC;AAED,SAAS,2BAA2B,CAAC,MAAc;IACjD,OAAO,IAAI,CAAC,IAAI,CAAC,MAAM,CAAC,UAAU,EAAE,yBAAyB,CAAC,CAAC;AACjE,CAAC;AAED;;;;;;;;;;;;;GAaG;AACI,KAAK,UAAU,uBAAuB,CAC3C,MAAc,EACd,UAAkB,EAClB,MAAc;IAEd,MAAM,YAAY,GAAG,MAAM,wBAAwB,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IACpE,MAAM,eAAe,GAAG,MAAM,IAAA,gCAAoB,EAAC,UAAU,CAAC,CAAC;IAC/D,MAAM,YAAY,GAAG,mBAAmB,CAAC,YAAY,EAAE,eAAe,CAAC,CAAC;IACxE,MAAM,CAAC,IAAI,CACT,SAAS,YAAY,CAAC,MAAM,0BAA0B,UAAU,GAAG,CACpE,CAAC;IAEF,MAAM,gBAAgB,GAAG,IAAI,CAAC,SAAS,CAAC,EAAE,OAAO,EAAE,YAAY,EAAE,CAAC,CAAC;IACnE,MAAM,kBAAkB,GAAG,IAAI,CAAC,IAAI,CAClC,IAAA,oCAAqB,GAAE,EACvB,sBAAsB,CACvB,CAAC;IACF,MAAM,CAAC,KAAK,CACV,oCAAoC,kBAAkB,KAAK,gBAAgB,EAAE,CAC9E,CAAC;IACF,MAAM,EAAE,CAAC,QAAQ,CAAC,SAAS,CAAC,kBAAkB,EAAE,gBAAgB,CAAC,CAAC;IAClE,OAAO,kBAAkB,CAAC;AAC5B,CAAC;AAED,SAAS,mBAAmB,CAC1B,YAAuC,EACvC,eAA0C;IAE1C,MAAM,OAAO,GAAa,EAAE,CAAC;IAC7B,KAAK,MAAM,CAAC,IAAI,EAAE,GAAG,CAAC,IAAI,MAAM,CAAC,OAAO,CAAC,eAAe,CAAC,EAAE,CAAC;QAC1D,IAAI,CAAC,CAAC,IAAI,IAAI,YAAY,CAAC,IAAI,YAAY,CAAC,IAAI,CAAC,KAAK,GAAG,EAAE,CAAC;YAC1D,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QACrB,CAAC;IACH,CAAC;IACD,KAAK,MAAM,IAAI,IAAI,MAAM,CAAC,IAAI,CAAC,YAAY,CAAC,EAAE,CAAC;QAC7C,IAAI,CAAC,CAAC,IAAI,IAAI,eAAe,CAAC,EAAE,CAAC;YAC/B,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QACrB,CAAC;IACH,CAAC;IACD,OAAO,OAAO,CAAC;AACjB,CAAC"}

lib/overlay-database-utils.test.js generated Normal file (94 lines changed)

@@ -0,0 +1,94 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
var ownKeys = function(o) {
ownKeys = Object.getOwnPropertyNames || function (o) {
var ar = [];
for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
return ar;
};
return ownKeys(o);
};
return function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
__setModuleDefault(result, mod);
return result;
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const ava_1 = __importDefault(require("ava"));
const sinon = __importStar(require("sinon"));
const actionsUtil = __importStar(require("./actions-util"));
const gitUtils = __importStar(require("./git-utils"));
const logging_1 = require("./logging");
const overlay_database_utils_1 = require("./overlay-database-utils");
const testing_utils_1 = require("./testing-utils");
const util_1 = require("./util");
(0, testing_utils_1.setupTests)(ava_1.default);
(0, ava_1.default)("writeOverlayChangesFile generates correct changes file", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const dbLocation = path.join(tmpDir, "db");
await fs.promises.mkdir(dbLocation, { recursive: true });
const sourceRoot = path.join(tmpDir, "src");
await fs.promises.mkdir(sourceRoot, { recursive: true });
const tempDir = path.join(tmpDir, "temp");
await fs.promises.mkdir(tempDir, { recursive: true });
const logger = (0, logging_1.getRunnerLogger)(true);
const config = (0, testing_utils_1.createTestConfig)({ dbLocation });
// Mock the getFileOidsUnderPath function to return base OIDs
const baseOids = {
"unchanged.js": "aaa111",
"modified.js": "bbb222",
"deleted.js": "ccc333",
};
const getFileOidsStubForBase = sinon
.stub(gitUtils, "getFileOidsUnderPath")
.resolves(baseOids);
// Write the base database OIDs file
await (0, overlay_database_utils_1.writeBaseDatabaseOidsFile)(config, sourceRoot);
getFileOidsStubForBase.restore();
// Mock the getFileOidsUnderPath function to return overlay OIDs
const currentOids = {
"unchanged.js": "aaa111",
"modified.js": "ddd444", // Changed OID
"added.js": "eee555", // New file
};
const getFileOidsStubForOverlay = sinon
.stub(gitUtils, "getFileOidsUnderPath")
.resolves(currentOids);
// Write the overlay changes file, which uses the mocked overlay OIDs
// and the base database OIDs file
const getTempDirStub = sinon
.stub(actionsUtil, "getTemporaryDirectory")
.returns(tempDir);
const changesFilePath = await (0, overlay_database_utils_1.writeOverlayChangesFile)(config, sourceRoot, logger);
getFileOidsStubForOverlay.restore();
getTempDirStub.restore();
const fileContent = await fs.promises.readFile(changesFilePath, "utf-8");
const parsedContent = JSON.parse(fileContent);
t.deepEqual(parsedContent.changes.sort(), ["added.js", "deleted.js", "modified.js"], "Should identify added, deleted, and modified files");
});
});
//# sourceMappingURL=overlay-database-utils.test.js.map

View File

@@ -0,0 +1 @@
{"version":3,"file":"overlay-database-utils.test.js","sourceRoot":"","sources":["../src/overlay-database-utils.test.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,uCAAyB;AACzB,2CAA6B;AAE7B,8CAAuB;AACvB,6CAA+B;AAE/B,4DAA8C;AAC9C,sDAAwC;AACxC,uCAA4C;AAC5C,qEAGkC;AAClC,mDAA+D;AAC/D,iCAAoC;AAEpC,IAAA,0BAAU,EAAC,aAAI,CAAC,CAAC;AAEjB,IAAA,aAAI,EAAC,wDAAwD,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACzE,MAAM,IAAA,iBAAU,EAAC,KAAK,EAAE,MAAM,EAAE,EAAE;QAChC,MAAM,UAAU,GAAG,IAAI,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,CAAC;QAC3C,MAAM,EAAE,CAAC,QAAQ,CAAC,KAAK,CAAC,UAAU,EAAE,EAAE,SAAS,EAAE,IAAI,EAAE,CAAC,CAAC;QACzD,MAAM,UAAU,GAAG,IAAI,CAAC,IAAI,CAAC,MAAM,EAAE,KAAK,CAAC,CAAC;QAC5C,MAAM,EAAE,CAAC,QAAQ,CAAC,KAAK,CAAC,UAAU,EAAE,EAAE,SAAS,EAAE,IAAI,EAAE,CAAC,CAAC;QACzD,MAAM,OAAO,GAAG,IAAI,CAAC,IAAI,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;QAC1C,MAAM,EAAE,CAAC,QAAQ,CAAC,KAAK,CAAC,OAAO,EAAE,EAAE,SAAS,EAAE,IAAI,EAAE,CAAC,CAAC;QAEtD,MAAM,MAAM,GAAG,IAAA,yBAAe,EAAC,IAAI,CAAC,CAAC;QACrC,MAAM,MAAM,GAAG,IAAA,gCAAgB,EAAC,EAAE,UAAU,EAAE,CAAC,CAAC;QAEhD,6DAA6D;QAC7D,MAAM,QAAQ,GAAG;YACf,cAAc,EAAE,QAAQ;YACxB,aAAa,EAAE,QAAQ;YACvB,YAAY,EAAE,QAAQ;SACvB,CAAC;QACF,MAAM,sBAAsB,GAAG,KAAK;aACjC,IAAI,CAAC,QAAQ,EAAE,sBAAsB,CAAC;aACtC,QAAQ,CAAC,QAAQ,CAAC,CAAC;QAEtB,oCAAoC;QACpC,MAAM,IAAA,kDAAyB,EAAC,MAAM,EAAE,UAAU,CAAC,CAAC;QACpD,sBAAsB,CAAC,OAAO,EAAE,CAAC;QAEjC,gEAAgE;QAChE,MAAM,WAAW,GAAG;YAClB,cAAc,EAAE,QAAQ;YACxB,aAAa,EAAE,QAAQ,EAAE,cAAc;YACvC,UAAU,EAAE,QAAQ,EAAE,WAAW;SAClC,CAAC;QACF,MAAM,yBAAyB,GAAG,KAAK;aACpC,IAAI,CAAC,QAAQ,EAAE,sBAAsB,CAAC;aACtC,QAAQ,CAAC,WAAW,CAAC,CAAC;QAEzB,qEAAqE;QACrE,kCAAkC;QAClC,MAAM,cAAc,GAAG,KAAK;aACzB,IAAI,CAAC,WAAW,EAAE,uBAAuB,CAAC;aAC1C,OAAO,CAAC,OAAO,CAAC,CAAC;QACpB,MAAM,eAAe,GAAG,MAAM,IAAA,gDAAuB,EACnD,MAAM,EACN,UAAU,EACV,MAAM,CACP,CAAC;QACF,yBAAyB,CAAC,OAAO,EAAE,CAAC;QACpC,cAAc,CAAC,OAAO,EAAE,CAAC;QAEzB,MAAM,WAAW,GAAG,MAAM,EAAE,CAAC,QAAQ,CAAC,QAAQ,CAAC,eAAe,EAAE,OAAO,CAAC,CAAC;QACzE,MAAM,aAAa,GAAG,IAAI,CAAC,KAAK,CAAC,WAAW,CAA0B,CAAC;QAEvE,CAAC,CAAC,SAAS,CACT,aAAa,CAAC,OAAO,CAAC,IAAI,EAAE,EAC5B,CAAC,UAAU,EAAE,YAAY,EAAE,aAAa,CAAC,EACzC,oDAAoD,CACrD,CAAC;IACJ,CAAC,CAAC,CAAC;AACL,CAAC,CAAC,CAAC"}

26
lib/repository.js generated
View File

@@ -1,7 +1,33 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.getRepositoryNwo = getRepositoryNwo;
exports.getRepositoryNwoFromEnv = getRepositoryNwoFromEnv;
exports.parseRepositoryNwo = parseRepositoryNwo;
const util_1 = require("./util");
/**
* Get the repository name with owner from the environment variable
* `GITHUB_REPOSITORY`.
*
* @returns The repository name with owner.
*/
function getRepositoryNwo() {
return getRepositoryNwoFromEnv("GITHUB_REPOSITORY");
}
/**
* Get the repository name with owner from the first environment variable that
* is set and non-empty.
*
* @param envVarNames The names of the environment variables to check.
* @returns The repository name with owner.
* @throws ConfigurationError if none of the environment variables are set.
*/
function getRepositoryNwoFromEnv(...envVarNames) {
const envVarName = envVarNames.find((name) => process.env[name]);
if (!envVarName) {
throw new util_1.ConfigurationError(`None of the env vars ${envVarNames.join(", ")} are set`);
}
return parseRepositoryNwo((0, util_1.getRequiredEnvParam)(envVarName));
}
function parseRepositoryNwo(input) {
const parts = input.split("/");
if (parts.length !== 2) {

View File

@@ -1 +1 @@
{"version":3,"file":"repository.js","sourceRoot":"","sources":["../src/repository.ts"],"names":[],"mappings":";;AAQA,gDASC;AAjBD,iCAA4C;AAQ5C,SAAgB,kBAAkB,CAAC,KAAa;IAC9C,MAAM,KAAK,GAAG,KAAK,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;IAC/B,IAAI,KAAK,CAAC,MAAM,KAAK,CAAC,EAAE,CAAC;QACvB,MAAM,IAAI,yBAAkB,CAAC,IAAI,KAAK,kCAAkC,CAAC,CAAC;IAC5E,CAAC;IACD,OAAO;QACL,KAAK,EAAE,KAAK,CAAC,CAAC,CAAC;QACf,IAAI,EAAE,KAAK,CAAC,CAAC,CAAC;KACf,CAAC;AACJ,CAAC"}
{"version":3,"file":"repository.js","sourceRoot":"","sources":["../src/repository.ts"],"names":[],"mappings":";;AAcA,4CAEC;AAUD,0DAUC;AAED,gDASC;AA/CD,iCAAiE;AAQjE;;;;;GAKG;AACH,SAAgB,gBAAgB;IAC9B,OAAO,uBAAuB,CAAC,mBAAmB,CAAC,CAAC;AACtD,CAAC;AAED;;;;;;;GAOG;AACH,SAAgB,uBAAuB,CACrC,GAAG,WAAqB;IAExB,MAAM,UAAU,GAAG,WAAW,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAAC,OAAO,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,CAAC;IACjE,IAAI,CAAC,UAAU,EAAE,CAAC;QAChB,MAAM,IAAI,yBAAkB,CAC1B,wBAAwB,WAAW,CAAC,IAAI,CAAC,IAAI,CAAC,UAAU,CACzD,CAAC;IACJ,CAAC;IACD,OAAO,kBAAkB,CAAC,IAAA,0BAAmB,EAAC,UAAU,CAAC,CAAC,CAAC;AAC7D,CAAC;AAED,SAAgB,kBAAkB,CAAC,KAAa;IAC9C,MAAM,KAAK,GAAG,KAAK,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;IAC/B,IAAI,KAAK,CAAC,MAAM,KAAK,CAAC,EAAE,CAAC;QACvB,MAAM,IAAI,yBAAkB,CAAC,IAAI,KAAK,kCAAkC,CAAC,CAAC;IAC5E,CAAC;IACD,OAAO;QACL,KAAK,EAAE,KAAK,CAAC,CAAC,CAAC;QACf,IAAI,EAAE,KAAK,CAAC,CAAC,CAAC;KACf,CAAC;AACJ,CAAC"}

View File

@@ -43,8 +43,8 @@ const logging_1 = require("./logging");
const start_proxy_1 = require("./start-proxy");
const util = __importStar(require("./util"));
const UPDATEJOB_PROXY = "update-job-proxy";
const UPDATEJOB_PROXY_VERSION = "v2.0.20241023203727";
const UPDATEJOB_PROXY_URL_PREFIX = "https://github.com/github/codeql-action/releases/download/codeql-bundle-v2.18.1/";
const UPDATEJOB_PROXY_VERSION = "v2.0.20250424171100";
const UPDATEJOB_PROXY_URL_PREFIX = "https://github.com/github/codeql-action/releases/download/codeql-bundle-v2.21.1/";
const KEY_SIZE = 2048;
const KEY_EXPIRY_YEARS = 2;
const CERT_SUBJECT = [

2
lib/start-proxy.js generated
View File

@@ -10,10 +10,10 @@ const LANGUAGE_TO_REGISTRY_TYPE = {
python: "python_index",
ruby: "rubygems_server",
rust: "cargo_registry",
go: "goproxy_server",
// We do not have an established proxy type for these languages, thus leaving empty.
actions: "",
cpp: "",
go: "",
swift: "",
};
// getCredentials returns registry credentials from action inputs.

View File

@@ -1 +1 @@
{"version":3,"file":"start-proxy.js","sourceRoot":"","sources":["../src/start-proxy.ts"],"names":[],"mappings":";;AA8BA,wCA2EC;AAzGD,2CAAsD;AAEtD,iCAA4C;AAW5C,MAAM,yBAAyB,GAA6B;IAC1D,IAAI,EAAE,kBAAkB;IACxB,MAAM,EAAE,YAAY;IACpB,UAAU,EAAE,cAAc;IAC1B,MAAM,EAAE,cAAc;IACtB,IAAI,EAAE,iBAAiB;IACvB,IAAI,EAAE,gBAAgB;IACtB,oFAAoF;IACpF,OAAO,EAAE,EAAE;IACX,GAAG,EAAE,EAAE;IACP,EAAE,EAAE,EAAE;IACN,KAAK,EAAE,EAAE;CACD,CAAC;AAEX,kEAAkE;AAClE,+DAA+D;AAC/D,gDAAgD;AAChD,SAAgB,cAAc,CAC5B,MAAc,EACd,eAAmC,EACnC,qBAAyC,EACzC,cAAkC;IAElC,MAAM,QAAQ,GAAG,cAAc,CAAC,CAAC,CAAC,IAAA,yBAAa,EAAC,cAAc,CAAC,CAAC,CAAC,CAAC,SAAS,CAAC;IAC5E,MAAM,uBAAuB,GAAG,QAAQ;QACtC,CAAC,CAAC,yBAAyB,CAAC,QAAQ,CAAC;QACrC,CAAC,CAAC,SAAS,CAAC;IAEd,IAAI,cAAsB,CAAC;IAC3B,IAAI,qBAAqB,KAAK,SAAS,EAAE,CAAC;QACxC,MAAM,CAAC,IAAI,CAAC,qCAAqC,CAAC,CAAC;QACnD,cAAc,GAAG,MAAM,CAAC,IAAI,CAAC,qBAAqB,EAAE,QAAQ,CAAC,CAAC,QAAQ,EAAE,CAAC;IAC3E,CAAC;SAAM,IAAI,eAAe,KAAK,SAAS,EAAE,CAAC;QACzC,MAAM,CAAC,IAAI,CAAC,+BAA+B,CAAC,CAAC;QAC7C,cAAc,GAAG,eAAe,CAAC;IACnC,CAAC;SAAM,CAAC;QACN,MAAM,CAAC,IAAI,CAAC,yBAAyB,CAAC,CAAC;QACvC,OAAO,EAAE,CAAC;IACZ,CAAC;IAED,qCAAqC;IACrC,IAAI,MAAoB,CAAC;IACzB,IAAI,CAAC;QACH,MAAM,GAAG,IAAI,CAAC,KAAK,CAAC,cAAc,CAAiB,CAAC;IACtD,CAAC;IAAC,MAAM,CAAC;QACP,oEAAoE;QACpE,MAAM,CAAC,KAAK,CAAC,uCAAuC,CAAC,CAAC;QACtD,MAAM,IAAI,yBAAkB,CAAC,6BAA6B,CAAC,CAAC;IAC9D,CAAC;IAED,MAAM,GAAG,GAAiB,EAAE,CAAC;IAC7B,KAAK,MAAM,CAAC,IAAI,MAAM,EAAE,CAAC;QACvB,IAAI,CAAC,CAAC,GAAG,KAAK,SAAS,IAAI,CAAC,CAAC,IAAI,KAAK,SAAS,EAAE,CAAC;YAChD,yFAAyF;YACzF,MAAM,IAAI,yBAAkB,CAC1B,gDAAgD,CACjD,CAAC;QACJ,CAAC;QAED,kFAAkF;QAClF,iEAAiE;QACjE,IAAI,uBAAuB,IAAI,CAAC,CAAC,IAAI,KAAK,uBAAuB,EAAE,CAAC;YAClE,SAAS;QACX,CAAC;QAED,MAAM,WAAW,GAAG,CAAC,GAAuB,EAAW,EAAE;YACvD,OAAO,GAAG,CAAC,CAAC,CAAC,gBAAgB,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC;QACjD,CAAC,CAAC;QAEF,IACE,CAAC,WAAW,CAAC,CAAC,CAAC,IAAI,CAAC;YACpB,CAAC,WAAW,CAAC,CAAC,CAAC,IAAI,CAAC;YACpB,CAAC,WAAW,CAAC,CAAC,CAAC,GAAG,CAAC;YACnB,CAAC,WAAW,CAAC,CAAC,CAAC,QAAQ,CAAC;YACxB,CAAC,WAAW,CAAC,CAAC,CAAC,QAAQ,CAAC;YACxB,CAAC,WAAW,CAAC,CAAC,CAAC,KAAK,CAAC,EACrB,CAAC;YACD,MAAM,IAAI,yBAAkB,CAC1B,qEAAqE,CACtE,CAAC;QACJ,CAAC;QAED,GAAG,CAAC,IAAI,CAAC;YACP,IAAI,EAAE,CAAC,CAAC,IAAI;YACZ,IAAI,EAAE,CAAC,CAAC,IAAI;YACZ,GAAG,EAAE,CAAC,CAAC,GAAG;YACV,QAAQ,EAAE,CAAC,CAAC,QAAQ;YACpB,QAAQ,EAAE,CAAC,CAAC,QAAQ;YACpB,KAAK,EAAE,CAAC,CAAC,KAAK;SACf,CAAC,CAAC;IACL,CAAC;IACD,OAAO,GAAG,CAAC;AACb,CAAC"}
{"version":3,"file":"start-proxy.js","sourceRoot":"","sources":["../src/start-proxy.ts"],"names":[],"mappings":";;AA8BA,wCA2EC;AAzGD,2CAAsD;AAEtD,iCAA4C;AAW5C,MAAM,yBAAyB,GAA6B;IAC1D,IAAI,EAAE,kBAAkB;IACxB,MAAM,EAAE,YAAY;IACpB,UAAU,EAAE,cAAc;IAC1B,MAAM,EAAE,cAAc;IACtB,IAAI,EAAE,iBAAiB;IACvB,IAAI,EAAE,gBAAgB;IACtB,EAAE,EAAE,gBAAgB;IACpB,oFAAoF;IACpF,OAAO,EAAE,EAAE;IACX,GAAG,EAAE,EAAE;IACP,KAAK,EAAE,EAAE;CACD,CAAC;AAEX,kEAAkE;AAClE,+DAA+D;AAC/D,gDAAgD;AAChD,SAAgB,cAAc,CAC5B,MAAc,EACd,eAAmC,EACnC,qBAAyC,EACzC,cAAkC;IAElC,MAAM,QAAQ,GAAG,cAAc,CAAC,CAAC,CAAC,IAAA,yBAAa,EAAC,cAAc,CAAC,CAAC,CAAC,CAAC,SAAS,CAAC;IAC5E,MAAM,uBAAuB,GAAG,QAAQ;QACtC,CAAC,CAAC,yBAAyB,CAAC,QAAQ,CAAC;QACrC,CAAC,CAAC,SAAS,CAAC;IAEd,IAAI,cAAsB,CAAC;IAC3B,IAAI,qBAAqB,KAAK,SAAS,EAAE,CAAC;QACxC,MAAM,CAAC,IAAI,CAAC,qCAAqC,CAAC,CAAC;QACnD,cAAc,GAAG,MAAM,CAAC,IAAI,CAAC,qBAAqB,EAAE,QAAQ,CAAC,CAAC,QAAQ,EAAE,CAAC;IAC3E,CAAC;SAAM,IAAI,eAAe,KAAK,SAAS,EAAE,CAAC;QACzC,MAAM,CAAC,IAAI,CAAC,+BAA+B,CAAC,CAAC;QAC7C,cAAc,GAAG,eAAe,CAAC;IACnC,CAAC;SAAM,CAAC;QACN,MAAM,CAAC,IAAI,CAAC,yBAAyB,CAAC,CAAC;QACvC,OAAO,EAAE,CAAC;IACZ,CAAC;IAED,qCAAqC;IACrC,IAAI,MAAoB,CAAC;IACzB,IAAI,CAAC;QACH,MAAM,GAAG,IAAI,CAAC,KAAK,CAAC,cAAc,CAAiB,CAAC;IACtD,CAAC;IAAC,MAAM,CAAC;QACP,oEAAoE;QACpE,MAAM,CAAC,KAAK,CAAC,uCAAuC,CAAC,CAAC;QACtD,MAAM,IAAI,yBAAkB,CAAC,6BAA6B,CAAC,CAAC;IAC9D,CAAC;IAED,MAAM,GAAG,GAAiB,EAAE,CAAC;IAC7B,KAAK,MAAM,CAAC,IAAI,MAAM,EAAE,CAAC;QACvB,IAAI,CAAC,CAAC,GAAG,KAAK,SAAS,IAAI,CAAC,CAAC,IAAI,KAAK,SAAS,EAAE,CAAC;YAChD,yFAAyF;YACzF,MAAM,IAAI,yBAAkB,CAC1B,gDAAgD,CACjD,CAAC;QACJ,CAAC;QAED,kFAAkF;QAClF,iEAAiE;QACjE,IAAI,uBAAuB,IAAI,CAAC,CAAC,IAAI,KAAK,uBAAuB,EAAE,CAAC;YAClE,SAAS;QACX,CAAC;QAED,MAAM,WAAW,GAAG,CAAC,GAAuB,EAAW,EAAE;YACvD,OAAO,GAAG,CAAC,CAAC,CAAC,gBAAgB,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC;QACjD,CAAC,CAAC;QAEF,IACE,CAAC,WAAW,CAAC,CAAC,CAAC,IAAI,CAAC;YACpB,CAAC,WAAW,CAAC,CAAC,CAAC,IAAI,CAAC;YACpB,CAAC,WAAW,CAAC,CAAC,CAAC,GAAG,CAAC;YACnB,CAAC,WAAW,CAAC,CAAC,CAAC,QAAQ,CAAC;YACxB,CAAC,WAAW,CAAC,CAAC,CAAC,QAAQ,CAAC;YACxB,CAAC,WAAW,CAAC,CAAC,CAAC,KAAK,CAAC,EACrB,CAAC;YACD,MAAM,IAAI,yBAAkB,CAC1B,qEAAqE,CACtE,CAAC;QACJ,CAAC;QAED,GAAG,CAAC,IAAI,CAAC;YACP,IAAI,EAAE,CAAC,CAAC,IAAI;YACZ,IAAI,EAAE,CAAC,CAAC,IAAI;YACZ,GAAG,EAAE,CAAC,CAAC,GAAG;YACV,QAAQ,EAAE,CAAC,CAAC,QAAQ;YACpB,QAAQ,EAAE,CAAC,CAAC,QAAQ;YACpB,KAAK,EAAE,CAAC,CAAC,KAAK;SACf,CAAC,CAAC;IACL,CAAC;IACD,OAAO,GAAG,CAAC;AACb,CAAC"}

21
lib/status-report.js generated
View File

@@ -35,6 +35,7 @@ var __importStar = (this && this.__importStar) || (function () {
Object.defineProperty(exports, "__esModule", { value: true });
exports.JobStatus = exports.ActionName = void 0;
exports.isFirstPartyAnalysis = isFirstPartyAnalysis;
exports.isThirdPartyAnalysis = isThirdPartyAnalysis;
exports.getActionsStatus = getActionsStatus;
exports.getJobStatusDisplayName = getJobStatusDisplayName;
exports.createStatusReportBase = createStatusReportBase;
@@ -46,6 +47,7 @@ const api_client_1 = require("./api-client");
const doc_url_1 = require("./doc-url");
const environment_1 = require("./environment");
const git_utils_1 = require("./git-utils");
const repository_1 = require("./repository");
const util_1 = require("./util");
var ActionName;
(function (ActionName) {
@@ -70,6 +72,12 @@ function isFirstPartyAnalysis(actionName) {
}
return process.env[environment_1.EnvVar.INIT_ACTION_HAS_RUN] === "true";
}
/**
* @returns true if the analysis is considered to be third party.
*/
function isThirdPartyAnalysis(actionName) {
return !isFirstPartyAnalysis(actionName);
}
/** Overall status of the entire job. String values match the Hydro schema. */
var JobStatus;
(function (JobStatus) {
@@ -141,10 +149,10 @@ async function createStatusReportBase(actionName, status, actionStartedAt, confi
const runnerOs = (0, util_1.getRequiredEnvParam)("RUNNER_OS");
const codeQlCliVersion = (0, util_1.getCachedCodeQlVersion)();
const actionRef = process.env["GITHUB_ACTION_REF"] || "";
const testingEnvironment = process.env[environment_1.EnvVar.TESTING_ENVIRONMENT] || "";
const testingEnvironment = (0, util_1.getTestingEnvironment)();
// re-export the testing environment variable so that it is available to subsequent steps,
// even if it was only set for this step
if (testingEnvironment !== "") {
if (testingEnvironment) {
core.exportVariable(environment_1.EnvVar.TESTING_ENVIRONMENT, testingEnvironment);
}
const isSteadyStateDefaultSetupRun = process.env["CODE_SCANNING_IS_STEADY_STATE_DEFAULT_SETUP"] === "true";
@@ -165,7 +173,7 @@ async function createStatusReportBase(actionName, status, actionStartedAt, confi
started_at: workflowStartedAt,
status,
steady_state_default_setup: isSteadyStateDefaultSetupRun,
testing_environment: testingEnvironment,
testing_environment: testingEnvironment || "",
workflow_name: workflowName,
workflow_run_attempt: workflowRunAttempt,
workflow_run_id: workflowRunID,
@@ -248,13 +256,12 @@ async function sendStatusReport(statusReport) {
core.debug("In test mode. Status reports are not uploaded.");
return;
}
const nwo = (0, util_1.getRequiredEnvParam)("GITHUB_REPOSITORY");
const [owner, repo] = nwo.split("/");
const nwo = (0, repository_1.getRepositoryNwo)();
const client = (0, api_client_1.getApiClient)();
try {
await client.request("PUT /repos/:owner/:repo/code-scanning/analysis/status", {
owner,
repo,
owner: nwo.owner,
repo: nwo.repo,
data: statusReportJSON,
});
}

File diff suppressed because one or more lines are too long

View File

@@ -109,4 +109,14 @@ function setupEnvironmentAndStub(tmpDir) {
t.is((await (0, status_report_1.createStatusReportBase)(status_report_1.ActionName.Analyze, "failure", new Date("May 19, 2023 05:19:00"), (0, testing_utils_1.createTestConfig)({}), { numAvailableBytes: 100, numTotalBytes: 500 }, (0, logging_1.getRunnerLogger)(false), "failure cause", "exception stack trace"))?.first_party_analysis, true);
});
});
(0, ava_1.default)("getActionStatus handling correctly various types of errors", (t) => {
t.is((0, status_report_1.getActionsStatus)(new Error("arbitrary error")), "failure", "We categorise an arbitrary error as a failure");
t.is((0, status_report_1.getActionsStatus)(new util_1.ConfigurationError("arbitrary error")), "user-error", "We categorise a ConfigurationError as a user error");
t.is((0, status_report_1.getActionsStatus)(new Error("exit code 1"), "multiple things went wrong"), "failure", "getActionsStatus should return failure if passed an arbitrary error and an additional failure cause");
t.is((0, status_report_1.getActionsStatus)(new util_1.ConfigurationError("exit code 1"), "multiple things went wrong"), "user-error", "getActionsStatus should return user-error if passed a configuration error and an additional failure cause");
t.is((0, status_report_1.getActionsStatus)(), "success", "getActionsStatus should return success if no error is passed");
t.is((0, status_report_1.getActionsStatus)(new Object()), "failure", "getActionsStatus should return failure if passed an arbitrary object");
t.is((0, status_report_1.getActionsStatus)(null, "an error occurred"), "failure", "getActionsStatus should return failure if passed null and an additional failure cause");
t.is((0, status_report_1.getActionsStatus)((0, util_1.wrapError)(new util_1.ConfigurationError("arbitrary error"))), "user-error", "We still recognise a wrapped ConfigurationError as a user error");
});
//# sourceMappingURL=status-report.test.js.map

File diff suppressed because one or more lines are too long

99
lib/upload-lib.js generated
View File

@@ -38,12 +38,16 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
Object.defineProperty(exports, "__esModule", { value: true });
exports.InvalidSarifUploadError = void 0;
exports.shouldShowCombineSarifFilesDeprecationWarning = shouldShowCombineSarifFilesDeprecationWarning;
exports.throwIfCombineSarifFilesDisabled = throwIfCombineSarifFilesDisabled;
exports.populateRunAutomationDetails = populateRunAutomationDetails;
exports.findSarifFilesInDir = findSarifFilesInDir;
exports.readSarifFile = readSarifFile;
exports.validateSarifFileSchema = validateSarifFileSchema;
exports.buildPayload = buildPayload;
exports.uploadFiles = uploadFiles;
exports.waitForProcessing = waitForProcessing;
exports.shouldConsiderConfigurationError = shouldConsiderConfigurationError;
exports.shouldConsiderInvalidRequest = shouldConsiderInvalidRequest;
exports.validateUniqueCategory = validateUniqueCategory;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
@@ -51,15 +55,15 @@ const zlib_1 = __importDefault(require("zlib"));
const core = __importStar(require("@actions/core"));
const file_url_1 = __importDefault(require("file-url"));
const jsonschema = __importStar(require("jsonschema"));
const semver = __importStar(require("semver"));
const actionsUtil = __importStar(require("./actions-util"));
const actions_util_1 = require("./actions-util");
const api = __importStar(require("./api-client"));
const api_client_1 = require("./api-client");
const codeql_1 = require("./codeql");
const config_utils_1 = require("./config-utils");
const diff_filtering_utils_1 = require("./diff-filtering-utils");
const diff_informed_analysis_utils_1 = require("./diff-informed-analysis-utils");
const environment_1 = require("./environment");
const feature_flags_1 = require("./feature-flags");
const fingerprints = __importStar(require("./fingerprints"));
const gitUtils = __importStar(require("./git-utils"));
const init_1 = require("./init");
@@ -133,7 +137,7 @@ function areAllRunsUnique(sarifObjects) {
async function shouldShowCombineSarifFilesDeprecationWarning(sarifObjects, githubVersion) {
// Do not show this warning on GHES versions before 3.14.0
if (githubVersion.type === util_1.GitHubVariant.GHES &&
semver.lt(githubVersion.version, "3.14.0")) {
(0, util_1.satisfiesGHESVersion)(githubVersion.version, "<3.14", true)) {
return false;
}
// Only give a deprecation warning when not all runs are unique and
@@ -141,6 +145,36 @@ async function shouldShowCombineSarifFilesDeprecationWarning(sarifObjects, githu
return (!areAllRunsUnique(sarifObjects) &&
!process.env.CODEQL_MERGE_SARIF_DEPRECATION_WARNING);
}
async function throwIfCombineSarifFilesDisabled(sarifObjects, features, githubVersion) {
if (!(await shouldDisableCombineSarifFiles(sarifObjects, features, githubVersion))) {
return;
}
// TODO: Update this changelog URL to the correct one when it's published.
const deprecationMoreInformationMessage = "For more information, see https://github.blog/changelog/2024-05-06-code-scanning-will-stop-combining-runs-from-a-single-upload";
throw new util_1.ConfigurationError(`The CodeQL Action does not support uploading multiple SARIF runs with the same category. Please update your workflow to upload a single run per category. ${deprecationMoreInformationMessage}`);
}
// Checks whether combining SARIF files should be disabled.
async function shouldDisableCombineSarifFiles(sarifObjects, features, githubVersion) {
if (githubVersion.type === util_1.GitHubVariant.GHES) {
// Never block on GHES versions before 3.18.
if ((0, util_1.satisfiesGHESVersion)(githubVersion.version, "<3.18", true)) {
return false;
}
}
else {
// Never block when the feature flag is disabled.
if (!(await features.getValue(feature_flags_1.Feature.DisableCombineSarifFiles))) {
return false;
}
}
if (areAllRunsUnique(sarifObjects)) {
// If all runs are unique, we can safely combine them.
return false;
}
// Combining SARIF files is not supported and Code Scanning will return an
// error if multiple runs with the same category are uploaded.
return true;
}
// Takes a list of paths to sarif files and combines them together using the
// CLI `github merge-results` command when all SARIF files are produced by
// CodeQL. Otherwise, it will fall back to combining the files in the action.
@@ -155,9 +189,10 @@ async function combineSarifFilesUsingCLI(sarifFiles, gitHubVersion, features, lo
});
const deprecationWarningMessage = gitHubVersion.type === util_1.GitHubVariant.GHES
? "and will be removed in GitHub Enterprise Server 3.18"
: "and will be removed on June 4, 2025";
: "and will be removed in July 2025";
const deprecationMoreInformationMessage = "For more information, see https://github.blog/changelog/2024-05-06-code-scanning-will-stop-combining-runs-from-a-single-upload";
if (!areAllRunsProducedByCodeQL(sarifObjects)) {
await throwIfCombineSarifFilesDisabled(sarifObjects, features, gitHubVersion);
logger.debug("Not all SARIF files were produced by CodeQL. Merging files in the action.");
if (await shouldShowCombineSarifFilesDeprecationWarning(sarifObjects, gitHubVersion)) {
logger.warning(`Uploading multiple SARIF runs with the same category is deprecated ${deprecationWarningMessage}. Please update your workflow to upload a single run per category. ${deprecationMoreInformationMessage}`);
@@ -189,6 +224,7 @@ async function combineSarifFilesUsingCLI(sarifFiles, gitHubVersion, features, lo
codeQL = initCodeQLResult.codeql;
}
if (!(await codeQL.supportsFeature(tools_features_1.ToolsFeature.SarifMergeRunsFromEqualCategory))) {
await throwIfCombineSarifFilesDisabled(sarifObjects, features, gitHubVersion);
logger.warning("The CodeQL CLI does not support merging SARIF files. Merging files in the action.");
if (await shouldShowCombineSarifFilesDeprecationWarning(sarifObjects, gitHubVersion)) {
logger.warning(`Uploading multiple CodeQL runs with the same category is deprecated ${deprecationWarningMessage} for CodeQL CLI 2.16.6 and earlier. Please update your CodeQL CLI version or update your workflow to set a distinct category for each CodeQL run. ${deprecationMoreInformationMessage}`);
@@ -322,17 +358,24 @@ function countResultsInSarif(sarif) {
}
return numResults;
}
// Validates that the given file path refers to a valid SARIF file.
// Throws an error if the file is invalid.
function validateSarifFileSchema(sarifFilePath, logger) {
logger.info(`Validating ${sarifFilePath}`);
let sarif;
function readSarifFile(sarifFilePath) {
try {
sarif = JSON.parse(fs.readFileSync(sarifFilePath, "utf8"));
return JSON.parse(fs.readFileSync(sarifFilePath, "utf8"));
}
catch (e) {
throw new InvalidSarifUploadError(`Invalid SARIF. JSON syntax error: ${(0, util_1.getErrorMessage)(e)}`);
}
}
// Validates the given SARIF object and throws an error if the SARIF object is invalid.
// The file path is only used in error messages to improve clarity.
function validateSarifFileSchema(sarif, sarifFilePath, logger) {
if (areAllRunsProducedByCodeQL([sarif]) &&
// We want to validate CodeQL SARIF in testing environments.
!util.getTestingEnvironment()) {
logger.debug(`Skipping SARIF schema validation for ${sarifFilePath} as all runs are produced by CodeQL.`);
return;
}
logger.info(`Validating ${sarifFilePath}`);
// eslint-disable-next-line @typescript-eslint/no-require-imports
const schema = require("../src/sarif-schema-2.1.0.json");
const result = new jsonschema.Validator().validate(sarif, schema);
@@ -400,19 +443,28 @@ function buildPayload(commitOid, ref, analysisKey, analysisName, zippedSarif, wo
return payloadObj;
}
/**
* Uploads a single SARIF file or a directory of SARIF files depending on what `sarifPath` refers
* Uploads a single SARIF file or a directory of SARIF files depending on what `inputSarifPath` refers
* to.
*/
async function uploadFiles(sarifPath, checkoutPath, category, features, logger) {
const sarifFiles = getSarifFilePaths(sarifPath);
async function uploadFiles(inputSarifPath, checkoutPath, category, features, logger) {
const sarifPaths = getSarifFilePaths(inputSarifPath);
logger.startGroup("Uploading results");
logger.info(`Processing sarif files: ${JSON.stringify(sarifFiles)}`);
logger.info(`Processing sarif files: ${JSON.stringify(sarifPaths)}`);
const gitHubVersion = await (0, api_client_1.getGitHubVersion)();
// Validate that the files we were asked to upload are all valid SARIF files
for (const file of sarifFiles) {
validateSarifFileSchema(file, logger);
let sarif;
if (sarifPaths.length > 1) {
// Validate that the files we were asked to upload are all valid SARIF files
for (const sarifPath of sarifPaths) {
const parsedSarif = readSarifFile(sarifPath);
validateSarifFileSchema(parsedSarif, sarifPath, logger);
}
sarif = await combineSarifFilesUsingCLI(sarifPaths, gitHubVersion, features, logger);
}
else {
const sarifPath = sarifPaths[0];
sarif = readSarifFile(sarifPath);
validateSarifFileSchema(sarif, sarifPath, logger);
}
let sarif = await combineSarifFilesUsingCLI(sarifFiles, gitHubVersion, features, logger);
sarif = filterAlertsByDiffRange(logger, sarif);
sarif = await fingerprints.addFingerprints(sarif, checkoutPath, logger);
const analysisKey = await api.getAnalysisKey();
@@ -435,7 +487,7 @@ async function uploadFiles(sarifPath, checkoutPath, category, features, logger)
const numResultInSarif = countResultsInSarif(sarifPayload);
logger.debug(`Number of results in upload: ${numResultInSarif}`);
// Make the upload
const sarifID = await uploadPayload(payload, (0, repository_1.parseRepositoryNwo)(util.getRequiredEnvParam("GITHUB_REPOSITORY")), logger);
const sarifID = await uploadPayload(payload, (0, repository_1.getRepositoryNwo)(), logger);
logger.endGroup();
return {
statusReport: {
@@ -524,9 +576,12 @@ async function waitForProcessing(repositoryNwo, sarifID, logger, options = {
* Returns whether the provided processing errors are a configuration error.
*/
function shouldConsiderConfigurationError(processingErrors) {
const expectedConfigErrors = [
"CodeQL analyses from advanced configurations cannot be processed when the default setup is enabled",
"rejecting delivery as the repository has too many logical alerts",
];
return (processingErrors.length === 1 &&
processingErrors[0] ===
"CodeQL analyses from advanced configurations cannot be processed when the default setup is enabled");
expectedConfigErrors.some((msg) => processingErrors[0].includes(msg)));
}
/**
* Returns whether the provided processing errors are the result of an invalid SARIF upload request.
@@ -610,7 +665,7 @@ class InvalidSarifUploadError extends Error {
}
exports.InvalidSarifUploadError = InvalidSarifUploadError;
function filterAlertsByDiffRange(logger, sarif) {
const diffRanges = (0, diff_filtering_utils_1.readDiffRangesJsonFile)(logger);
const diffRanges = (0, diff_informed_analysis_utils_1.readDiffRangesJsonFile)(logger);
if (!diffRanges?.length) {
return sarif;
}

File diff suppressed because one or more lines are too long
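The combining-SARIF changes above boil down to one gating decision. Below is a condensed sketch of that decision, with the action's feature-flag and GHES-version helpers replaced by plain booleans; it illustrates the control flow, not the action's actual API.

type Variant = "dotcom" | "ghes";

function shouldBlockCombiningSarif(opts: {
  variant: Variant;
  ghesBefore318: boolean;      // satisfiesGHESVersion(version, "<3.18", true)
  disableCombineFlag: boolean; // Feature.DisableCombineSarifFiles
  allRunsUnique: boolean;      // every run has a distinct (category, tool) pair
}): boolean {
  if (opts.variant === "ghes") {
    if (opts.ghesBefore318) return false;      // never block on GHES releases before 3.18
  } else if (!opts.disableCombineFlag) {
    return false;                              // on dotcom, blocking sits behind a feature flag
  }
  return !opts.allRunsUnique;                  // unique runs can still be combined safely
}

// GHES 3.18 (including pre-releases) with duplicate categories -> the upload is rejected.
console.log(shouldBlockCombiningSarif({
  variant: "ghes", ghesBefore318: false, disableCombineFlag: false, allRunsUnique: false,
})); // true

When this returns true, throwIfCombineSarifFilesDisabled raises a ConfigurationError asking the workflow to upload a single SARIF run per category.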

123
lib/upload-lib.test.js generated
View File

@@ -39,6 +39,7 @@ Object.defineProperty(exports, "__esModule", { value: true });
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const ava_1 = __importDefault(require("ava"));
const feature_flags_1 = require("./feature-flags");
const logging_1 = require("./logging");
const testing_utils_1 = require("./testing-utils");
const uploadLib = __importStar(require("./upload-lib"));
@@ -49,11 +50,11 @@ ava_1.default.beforeEach(() => {
});
(0, ava_1.default)("validateSarifFileSchema - valid", (t) => {
const inputFile = `${__dirname}/../src/testdata/valid-sarif.sarif`;
t.notThrows(() => uploadLib.validateSarifFileSchema(inputFile, (0, logging_1.getRunnerLogger)(true)));
t.notThrows(() => uploadLib.validateSarifFileSchema(uploadLib.readSarifFile(inputFile), inputFile, (0, logging_1.getRunnerLogger)(true)));
});
(0, ava_1.default)("validateSarifFileSchema - invalid", (t) => {
const inputFile = `${__dirname}/../src/testdata/invalid-sarif.sarif`;
t.throws(() => uploadLib.validateSarifFileSchema(inputFile, (0, logging_1.getRunnerLogger)(true)));
t.throws(() => uploadLib.validateSarifFileSchema(uploadLib.readSarifFile(inputFile), inputFile, (0, logging_1.getRunnerLogger)(true)));
});
(0, ava_1.default)("validate correct payload used for push, PR merge commit, and PR head", async (t) => {
process.env["GITHUB_EVENT_NAME"] = "push";
@@ -202,7 +203,7 @@ ava_1.default.beforeEach(() => {
},
};
const sarifFile = `${__dirname}/../src/testdata/with-invalid-uri.sarif`;
uploadLib.validateSarifFileSchema(sarifFile, mockLogger);
uploadLib.validateSarifFileSchema(uploadLib.readSarifFile(sarifFile), sarifFile, mockLogger);
t.deepEqual(loggedMessages.length, 3);
t.deepEqual(loggedMessages[1], "Warning: 'not a valid URI' is not a valid URI in 'instance.runs[0].tool.driver.rules[0].helpUri'.", "Warning: 'not a valid URI' is not a valid URI in 'instance.runs[0].results[0].locations[0].physicalLocation.artifactLocation.uri'.");
});
@@ -223,6 +224,12 @@ ava_1.default.beforeEach(() => {
version: "3.14.0",
}));
});
(0, ava_1.default)("shouldShowCombineSarifFilesDeprecationWarning when on GHES 3.16 pre", async (t) => {
t.true(await uploadLib.shouldShowCombineSarifFilesDeprecationWarning([createMockSarif("abc", "def"), createMockSarif("abc", "def")], {
type: util_1.GitHubVariant.GHES,
version: "3.16.0.pre1",
}));
});
(0, ava_1.default)("shouldShowCombineSarifFilesDeprecationWarning with only 1 run", async (t) => {
t.false(await uploadLib.shouldShowCombineSarifFilesDeprecationWarning([createMockSarif("abc", "def")], {
type: util_1.GitHubVariant.DOTCOM,
@@ -244,6 +251,116 @@ ava_1.default.beforeEach(() => {
type: util_1.GitHubVariant.DOTCOM,
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on dotcom with feature flag", async (t) => {
await t.throwsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.DisableCombineSarifFiles]), {
type: util_1.GitHubVariant.DOTCOM,
}), {
message: /The CodeQL Action does not support uploading multiple SARIF runs with the same category/,
});
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on dotcom without feature flag", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.DOTCOM,
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.13", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.13.2",
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.14", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.14.0",
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.17", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.17.0",
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.18 pre", async (t) => {
await t.throwsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.18.0.pre1",
}), {
message: /The CodeQL Action does not support uploading multiple SARIF runs with the same category/,
});
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.18 alpha", async (t) => {
await t.throwsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.18.0-alpha.1",
}), {
message: /The CodeQL Action does not support uploading multiple SARIF runs with the same category/,
});
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.18", async (t) => {
await t.throwsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.18.0",
}), {
message: /The CodeQL Action does not support uploading multiple SARIF runs with the same category/,
});
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled with an invalid GHES version", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "foobar",
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled with only 1 run", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.DisableCombineSarifFiles]), {
type: util_1.GitHubVariant.DOTCOM,
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled with distinct categories", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("def", "def")], (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.DisableCombineSarifFiles]), {
type: util_1.GitHubVariant.DOTCOM,
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled with distinct tools", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "abc"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.DisableCombineSarifFiles]), {
type: util_1.GitHubVariant.DOTCOM,
}));
});
(0, ava_1.default)("shouldConsiderConfigurationError correctly detects configuration errors", (t) => {
const error1 = [
"CodeQL analyses from advanced configurations cannot be processed when the default setup is enabled",
];
t.true(uploadLib.shouldConsiderConfigurationError(error1));
const error2 = [
"rejecting delivery as the repository has too many logical alerts",
];
t.true(uploadLib.shouldConsiderConfigurationError(error2));
// We fail cases where we get > 1 error messages back
const error3 = [
"rejecting delivery as the repository has too many alerts",
"extra error message",
];
t.false(uploadLib.shouldConsiderConfigurationError(error3));
});
(0, ava_1.default)("shouldConsiderInvalidRequest returns correct recognises processing errors", (t) => {
const error1 = [
"rejecting SARIF",
"an invalid URI was provided as a SARIF location",
];
t.true(uploadLib.shouldConsiderInvalidRequest(error1));
const error2 = [
"locationFromSarifResult: expected artifact location",
"an invalid URI was provided as a SARIF location",
];
t.true(uploadLib.shouldConsiderInvalidRequest(error2));
// We expect ALL errors to be of processing errors, for the outcome to be classified as
// an invalid SARIF upload error.
const error3 = [
"could not convert rules: invalid security severity value, is not a number",
"an unknown error occurred",
];
t.false(uploadLib.shouldConsiderInvalidRequest(error3));
});
function createMockSarif(id, tool) {
return {
runs: [

File diff suppressed because one or more lines are too long

View File

@@ -61,7 +61,7 @@ async function run() {
(0, util_1.checkActionVersion)((0, actions_util_1.getActionVersion)(), gitHubVersion);
// Make inputs accessible in the `post` step.
actionsUtil.persistInputs();
const repositoryNwo = (0, repository_1.parseRepositoryNwo)((0, util_1.getRequiredEnvParam)("GITHUB_REPOSITORY"));
const repositoryNwo = (0, repository_1.getRepositoryNwo)();
const features = new feature_flags_1.Features(gitHubVersion, repositoryNwo, (0, actions_util_1.getTemporaryDirectory)(), logger);
const startingStatusReportBase = await (0, status_report_1.createStatusReportBase)(status_report_1.ActionName.UploadSarif, "starting", startedAt, undefined, await (0, util_1.checkDiskUsage)(logger), logger);
if (startingStatusReportBase !== undefined) {
@@ -75,12 +75,12 @@ async function run() {
core.debug("In test mode. Waiting for processing is disabled.");
}
else if (actionsUtil.getRequiredInput("wait-for-processing") === "true") {
await upload_lib.waitForProcessing((0, repository_1.parseRepositoryNwo)((0, util_1.getRequiredEnvParam)("GITHUB_REPOSITORY")), uploadResult.sarifID, logger);
await upload_lib.waitForProcessing((0, repository_1.getRepositoryNwo)(), uploadResult.sarifID, logger);
}
await sendSuccessStatusReport(startedAt, uploadResult.statusReport, logger);
}
catch (unwrappedError) {
const error = !(0, status_report_1.isFirstPartyAnalysis)(status_report_1.ActionName.UploadSarif) &&
const error = (0, status_report_1.isThirdPartyAnalysis)(status_report_1.ActionName.UploadSarif) &&
unwrappedError instanceof upload_lib.InvalidSarifUploadError
? new util_1.ConfigurationError(unwrappedError.message)
: (0, util_1.wrapError)(unwrappedError);

View File

@@ -1 +1 @@
{"version":3,"file":"upload-sarif-action.js","sourceRoot":"","sources":["../src/upload-sarif-action.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,oDAAsC;AAEtC,4DAA8C;AAC9C,iDAAyE;AACzE,6CAAgD;AAChD,mDAA2C;AAC3C,uCAAqD;AACrD,6CAAkD;AAClD,mDAOyB;AACzB,yDAA2C;AAC3C,iCASgB;AAMhB,KAAK,UAAU,uBAAuB,CACpC,SAAe,EACf,WAA0C,EAC1C,MAAc;IAEd,MAAM,gBAAgB,GAAG,MAAM,IAAA,sCAAsB,EACnD,0BAAU,CAAC,WAAW,EACtB,SAAS,EACT,SAAS,EACT,SAAS,EACT,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,CACP,CAAC;IACF,IAAI,gBAAgB,KAAK,SAAS,EAAE,CAAC;QACnC,MAAM,YAAY,GAA4B;YAC5C,GAAG,gBAAgB;YACnB,GAAG,WAAW;SACf,CAAC;QACF,MAAM,IAAA,gCAAgB,EAAC,YAAY,CAAC,CAAC;IACvC,CAAC;AACH,CAAC;AAED,KAAK,UAAU,GAAG;IAChB,MAAM,SAAS,GAAG,IAAI,IAAI,EAAE,CAAC;IAC7B,MAAM,MAAM,GAAG,IAAA,0BAAgB,GAAE,CAAC;IAClC,IAAA,4BAAqB,EAAC,IAAA,+BAAgB,GAAE,CAAC,CAAC;IAE1C,MAAM,aAAa,GAAG,MAAM,IAAA,6BAAgB,GAAE,CAAC;IAC/C,IAAA,yBAAkB,EAAC,IAAA,+BAAgB,GAAE,EAAE,aAAa,CAAC,CAAC;IAEtD,6CAA6C;IAC7C,WAAW,CAAC,aAAa,EAAE,CAAC;IAE5B,MAAM,aAAa,GAAG,IAAA,+BAAkB,EACtC,IAAA,0BAAmB,EAAC,mBAAmB,CAAC,CACzC,CAAC;IACF,MAAM,QAAQ,GAAG,IAAI,wBAAQ,CAC3B,aAAa,EACb,aAAa,EACb,IAAA,oCAAqB,GAAE,EACvB,MAAM,CACP,CAAC;IAEF,MAAM,wBAAwB,GAAG,MAAM,IAAA,sCAAsB,EAC3D,0BAAU,CAAC,WAAW,EACtB,UAAU,EACV,SAAS,EACT,SAAS,EACT,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,CACP,CAAC;IACF,IAAI,wBAAwB,KAAK,SAAS,EAAE,CAAC;QAC3C,MAAM,IAAA,gCAAgB,EAAC,wBAAwB,CAAC,CAAC;IACnD,CAAC;IAED,IAAI,CAAC;QACH,MAAM,YAAY,GAAG,MAAM,UAAU,CAAC,WAAW,CAC/C,WAAW,CAAC,gBAAgB,CAAC,YAAY,CAAC,EAC1C,WAAW,CAAC,gBAAgB,CAAC,eAAe,CAAC,EAC7C,WAAW,CAAC,gBAAgB,CAAC,UAAU,CAAC,EACxC,QAAQ,EACR,MAAM,CACP,CAAC;QACF,IAAI,CAAC,SAAS,CAAC,UAAU,EAAE,YAAY,CAAC,OAAO,CAAC,CAAC;QAEjD,qEAAqE;QACrE,IAAI,IAAA,mBAAY,GAAE,EAAE,CAAC;YACnB,IAAI,CAAC,KAAK,CAAC,mDAAmD,CAAC,CAAC;QAClE,CAAC;aAAM,IAAI,WAAW,CAAC,gBAAgB,CAAC,qBAAqB,CAAC,KAAK,MAAM,EAAE,CAAC;YAC1E,MAAM,UAAU,CAAC,iBAAiB,CAChC,IAAA,+BAAkB,EAAC,IAAA,0BAAmB,EAAC,mBAAmB,CAAC,CAAC,EAC5D,YAAY,CAAC,OAAO,EACpB,MAAM,CACP,CAAC;QACJ,CAAC;QACD,MAAM,uBAAuB,CAAC,SAAS,EAAE,YAAY,CAAC,YAAY,EAAE,MAAM,CAAC,CAAC;IAC9E,CAAC;IAAC,OAAO,cAAc,EAAE,CAAC;QACxB,MAAM,KAAK,GACT,CAAC,IAAA,oCAAoB,EAAC,0BAAU,CAAC,WAAW,CAAC;YAC7C,cAAc,YAAY,UAAU,CAAC,uBAAuB;YAC1D,CAAC,CAAC,IAAI,yBAAkB,CAAC,cAAc,CAAC,OAAO,CAAC;YAChD,CAAC,CAAC,IAAA,gBAAS,EAAC,cAAc,CAAC,CAAC;QAChC,MAAM,OAAO,GAAG,KAAK,CAAC,OAAO,CAAC;QAC9B,IAAI,CAAC,SAAS,CAAC,OAAO,CAAC,CAAC;QAExB,MAAM,qBAAqB,GAAG,MAAM,IAAA,sCAAsB,EACxD,0BAAU,CAAC,WAAW,EACtB,IAAA,gCAAgB,EAAC,KAAK,CAAC,EACvB,SAAS,EACT,SAAS,EACT,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,EACN,OAAO,EACP,KAAK,CAAC,KAAK,CACZ,CAAC;QACF,IAAI,qBAAqB,KAAK,SAAS,EAAE,CAAC;YACxC,MAAM,IAAA,gCAAgB,EAAC,qBAAqB,CAAC,CAAC;QAChD,CAAC;QACD,OAAO;IACT,CAAC;AACH,CAAC;AAED,KAAK,UAAU,UAAU;IACvB,IAAI,CAAC;QACH,MAAM,GAAG,EAAE,CAAC;IACd,CAAC;IAAC,OAAO,KAAK,EAAE,CAAC;QACf,IAAI,CAAC,SAAS,CACZ,sCAAsC,IAAA,sBAAe,EAAC,KAAK,CAAC,EAAE,CAC/D,CAAC;IACJ,CAAC;AACH,CAAC;AAED,KAAK,UAAU,EAAE,CAAC"}
{"version":3,"file":"upload-sarif-action.js","sourceRoot":"","sources":["../src/upload-sarif-action.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,oDAAsC;AAEtC,4DAA8C;AAC9C,iDAAyE;AACzE,6CAAgD;AAChD,mDAA2C;AAC3C,uCAAqD;AACrD,6CAAgD;AAChD,mDAOyB;AACzB,yDAA2C;AAC3C,iCAQgB;AAMhB,KAAK,UAAU,uBAAuB,CACpC,SAAe,EACf,WAA0C,EAC1C,MAAc;IAEd,MAAM,gBAAgB,GAAG,MAAM,IAAA,sCAAsB,EACnD,0BAAU,CAAC,WAAW,EACtB,SAAS,EACT,SAAS,EACT,SAAS,EACT,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,CACP,CAAC;IACF,IAAI,gBAAgB,KAAK,SAAS,EAAE,CAAC;QACnC,MAAM,YAAY,GAA4B;YAC5C,GAAG,gBAAgB;YACnB,GAAG,WAAW;SACf,CAAC;QACF,MAAM,IAAA,gCAAgB,EAAC,YAAY,CAAC,CAAC;IACvC,CAAC;AACH,CAAC;AAED,KAAK,UAAU,GAAG;IAChB,MAAM,SAAS,GAAG,IAAI,IAAI,EAAE,CAAC;IAC7B,MAAM,MAAM,GAAG,IAAA,0BAAgB,GAAE,CAAC;IAClC,IAAA,4BAAqB,EAAC,IAAA,+BAAgB,GAAE,CAAC,CAAC;IAE1C,MAAM,aAAa,GAAG,MAAM,IAAA,6BAAgB,GAAE,CAAC;IAC/C,IAAA,yBAAkB,EAAC,IAAA,+BAAgB,GAAE,EAAE,aAAa,CAAC,CAAC;IAEtD,6CAA6C;IAC7C,WAAW,CAAC,aAAa,EAAE,CAAC;IAE5B,MAAM,aAAa,GAAG,IAAA,6BAAgB,GAAE,CAAC;IACzC,MAAM,QAAQ,GAAG,IAAI,wBAAQ,CAC3B,aAAa,EACb,aAAa,EACb,IAAA,oCAAqB,GAAE,EACvB,MAAM,CACP,CAAC;IAEF,MAAM,wBAAwB,GAAG,MAAM,IAAA,sCAAsB,EAC3D,0BAAU,CAAC,WAAW,EACtB,UAAU,EACV,SAAS,EACT,SAAS,EACT,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,CACP,CAAC;IACF,IAAI,wBAAwB,KAAK,SAAS,EAAE,CAAC;QAC3C,MAAM,IAAA,gCAAgB,EAAC,wBAAwB,CAAC,CAAC;IACnD,CAAC;IAED,IAAI,CAAC;QACH,MAAM,YAAY,GAAG,MAAM,UAAU,CAAC,WAAW,CAC/C,WAAW,CAAC,gBAAgB,CAAC,YAAY,CAAC,EAC1C,WAAW,CAAC,gBAAgB,CAAC,eAAe,CAAC,EAC7C,WAAW,CAAC,gBAAgB,CAAC,UAAU,CAAC,EACxC,QAAQ,EACR,MAAM,CACP,CAAC;QACF,IAAI,CAAC,SAAS,CAAC,UAAU,EAAE,YAAY,CAAC,OAAO,CAAC,CAAC;QAEjD,qEAAqE;QACrE,IAAI,IAAA,mBAAY,GAAE,EAAE,CAAC;YACnB,IAAI,CAAC,KAAK,CAAC,mDAAmD,CAAC,CAAC;QAClE,CAAC;aAAM,IAAI,WAAW,CAAC,gBAAgB,CAAC,qBAAqB,CAAC,KAAK,MAAM,EAAE,CAAC;YAC1E,MAAM,UAAU,CAAC,iBAAiB,CAChC,IAAA,6BAAgB,GAAE,EAClB,YAAY,CAAC,OAAO,EACpB,MAAM,CACP,CAAC;QACJ,CAAC;QACD,MAAM,uBAAuB,CAAC,SAAS,EAAE,YAAY,CAAC,YAAY,EAAE,MAAM,CAAC,CAAC;IAC9E,CAAC;IAAC,OAAO,cAAc,EAAE,CAAC;QACxB,MAAM,KAAK,GACT,IAAA,oCAAoB,EAAC,0BAAU,CAAC,WAAW,CAAC;YAC5C,cAAc,YAAY,UAAU,CAAC,uBAAuB;YAC1D,CAAC,CAAC,IAAI,yBAAkB,CAAC,cAAc,CAAC,OAAO,CAAC;YAChD,CAAC,CAAC,IAAA,gBAAS,EAAC,cAAc,CAAC,CAAC;QAChC,MAAM,OAAO,GAAG,KAAK,CAAC,OAAO,CAAC;QAC9B,IAAI,CAAC,SAAS,CAAC,OAAO,CAAC,CAAC;QAExB,MAAM,qBAAqB,GAAG,MAAM,IAAA,sCAAsB,EACxD,0BAAU,CAAC,WAAW,EACtB,IAAA,gCAAgB,EAAC,KAAK,CAAC,EACvB,SAAS,EACT,SAAS,EACT,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,EACN,OAAO,EACP,KAAK,CAAC,KAAK,CACZ,CAAC;QACF,IAAI,qBAAqB,KAAK,SAAS,EAAE,CAAC;YACxC,MAAM,IAAA,gCAAgB,EAAC,qBAAqB,CAAC,CAAC;QAChD,CAAC;QACD,OAAO;IACT,CAAC;AACH,CAAC;AAED,KAAK,UAAU,UAAU;IACvB,IAAI,CAAC;QACH,MAAM,GAAG,EAAE,CAAC;IACd,CAAC;IAAC,OAAO,KAAK,EAAE,CAAC;QACf,IAAI,CAAC,SAAS,CACZ,sCAAsC,IAAA,sBAAe,EAAC,KAAK,CAAC,EAAE,CAC/D,CAAC;IACJ,CAAC;AACH,CAAC;AAED,KAAK,UAAU,EAAE,CAAC"}

38
lib/util.js generated
View File

@@ -62,6 +62,7 @@ exports.bundleDb = bundleDb;
exports.delay = delay;
exports.isGoodVersion = isGoodVersion;
exports.isInTestMode = isInTestMode;
exports.getTestingEnvironment = getTestingEnvironment;
exports.doesDirectoryExist = doesDirectoryExist;
exports.listFolder = listFolder;
exports.tryGetFolderBytes = tryGetFolderBytes;
@@ -76,6 +77,7 @@ exports.getErrorMessage = getErrorMessage;
exports.prettyPrintPack = prettyPrintPack;
exports.checkDiskUsage = checkDiskUsage;
exports.checkActionVersion = checkActionVersion;
exports.satisfiesGHESVersion = satisfiesGHESVersion;
exports.cloneObject = cloneObject;
exports.checkSipEnablement = checkSipEnablement;
exports.cleanUpGlob = cleanUpGlob;
@@ -577,15 +579,27 @@ async function delay(milliseconds, opts) {
function isGoodVersion(versionSpec) {
return !BROKEN_VERSIONS.includes(versionSpec);
}
/*
* Returns whether we are in test mode.
/**
* Returns whether we are in test mode. This is used by CodeQL Action PR checks.
*
* In test mode, we don't upload SARIF results or status reports to the GitHub API.
*/
function isInTestMode() {
return process.env[environment_1.EnvVar.TEST_MODE] === "true";
}
/*
/**
* Get the testing environment.
*
* This is set if the CodeQL Action is running in a non-production environment.
*/
function getTestingEnvironment() {
const testingEnvironment = process.env[environment_1.EnvVar.TESTING_ENVIRONMENT] || "";
if (testingEnvironment === "") {
return undefined;
}
return testingEnvironment;
}
/**
* Returns whether the path in the argument represents an existing directory.
*/
function doesDirectoryExist(dirPath) {
@@ -874,6 +888,24 @@ function checkActionVersion(version, githubVersion) {
}
}
}
/**
* This will check whether the given GitHub version satisfies the given range,
* taking into account that a range like >=3.18 will also match the GHES 3.18
* pre-release/RC versions.
*
* When the given `githubVersion` is not a GHES version, or if the version
* is invalid, this will return `defaultIfInvalid`.
*/
function satisfiesGHESVersion(ghesVersion, range, defaultIfInvalid) {
const semverVersion = semver.coerce(ghesVersion);
if (semverVersion === null) {
return defaultIfInvalid;
}
// We always drop the pre-release part of the version, since anything that
// applies to GHES 3.18.0 should also apply to GHES 3.18.0.pre1.
semverVersion.prerelease = [];
return semver.satisfies(semverVersion, range);
}
/**
* Supported build modes.
*

File diff suppressed because one or more lines are too long
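The pre-release handling in satisfiesGHESVersion is easiest to see with concrete values. Below is a rough stand-alone equivalent using the same semver package; the sample versions come from the upload-lib tests earlier in this diff.

import * as semver from "semver";

// Rough equivalent of satisfiesGHESVersion: coerce the raw GHES version,
// drop any pre-release suffix, then test the result against the range.
function ghesVersionSatisfies(ghesVersion: string, range: string, defaultIfInvalid: boolean): boolean {
  const coerced = semver.coerce(ghesVersion); // "3.18.0.pre1" -> 3.18.0, "foobar" -> null
  if (coerced === null) {
    return defaultIfInvalid;
  }
  return semver.satisfies(`${coerced.major}.${coerced.minor}.${coerced.patch}`, range);
}

console.log(ghesVersionSatisfies("3.18.0.pre1", "<3.18", true)); // false: a 3.18 pre-release counts as 3.18
console.log(ghesVersionSatisfies("3.17.0", "<3.18", true));      // true
console.log(ghesVersionSatisfies("foobar", "<3.18", true));      // true: invalid versions fall back to the default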

4
lib/workflow.js generated
View File

@@ -51,7 +51,6 @@ const zlib_1 = __importDefault(require("zlib"));
const core = __importStar(require("@actions/core"));
const yaml = __importStar(require("js-yaml"));
const api = __importStar(require("./api-client"));
const environment_1 = require("./environment");
const util_1 = require("./util");
function toCodedErrors(errors) {
return Object.entries(errors).reduce((acc, [code, message]) => {
@@ -274,8 +273,7 @@ function getInputOrThrow(workflow, jobName, actionName, inputName, matrixVars) {
* This allows us to test workflow parsing functionality as a CodeQL Action PR check.
*/
function getAnalyzeActionName() {
if ((0, util_1.isInTestMode)() ||
process.env[environment_1.EnvVar.TESTING_ENVIRONMENT] === "codeql-action-pr-checks") {
if ((0, util_1.isInTestMode)() || (0, util_1.getTestingEnvironment)() === "codeql-action-pr-checks") {
return "./analyze";
}
else {

File diff suppressed because one or more lines are too long

1
node_modules/.bin/dot-object generated vendored
View File

@@ -1 +0,0 @@
../dot-object/bin/dot-object

View File

@@ -1 +0,0 @@
../twirp-ts/protoc-gen-twirp_ts

1
node_modules/.bin/tldts generated vendored Symbolic link
View File

@@ -0,0 +1 @@
../tldts/bin/cli.js

2356
node_modules/.package-lock.json generated vendored

File diff suppressed because it is too large

View File

@@ -1,4 +1,4 @@
export * from './google/protobuf/timestamp';
export * from './google/protobuf/wrappers';
export * from './results/api/v1/artifact';
export * from './results/api/v1/artifact.twirp';
export * from './results/api/v1/artifact.twirp-client';

View File

@@ -17,5 +17,5 @@ Object.defineProperty(exports, "__esModule", { value: true });
__exportStar(require("./google/protobuf/timestamp"), exports);
__exportStar(require("./google/protobuf/wrappers"), exports);
__exportStar(require("./results/api/v1/artifact"), exports);
__exportStar(require("./results/api/v1/artifact.twirp"), exports);
__exportStar(require("./results/api/v1/artifact.twirp-client"), exports);
//# sourceMappingURL=index.js.map

View File

@@ -1 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/generated/index.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;AAAA,8DAA2C;AAC3C,6DAA0C;AAC1C,4DAAyC;AACzC,kEAA+C"}
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/generated/index.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;AAAA,8DAA2C;AAC3C,6DAA0C;AAC1C,4DAAyC;AACzC,yEAAsD"}

View File

@@ -8,6 +8,66 @@ import { MessageType } from "@protobuf-ts/runtime";
import { Int64Value } from "../../../google/protobuf/wrappers";
import { StringValue } from "../../../google/protobuf/wrappers";
import { Timestamp } from "../../../google/protobuf/timestamp";
/**
* @generated from protobuf message github.actions.results.api.v1.MigrateArtifactRequest
*/
export interface MigrateArtifactRequest {
/**
* @generated from protobuf field: string workflow_run_backend_id = 1;
*/
workflowRunBackendId: string;
/**
* @generated from protobuf field: string name = 2;
*/
name: string;
/**
* @generated from protobuf field: google.protobuf.Timestamp expires_at = 3;
*/
expiresAt?: Timestamp;
}
/**
* @generated from protobuf message github.actions.results.api.v1.MigrateArtifactResponse
*/
export interface MigrateArtifactResponse {
/**
* @generated from protobuf field: bool ok = 1;
*/
ok: boolean;
/**
* @generated from protobuf field: string signed_upload_url = 2;
*/
signedUploadUrl: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactRequest
*/
export interface FinalizeMigratedArtifactRequest {
/**
* @generated from protobuf field: string workflow_run_backend_id = 1;
*/
workflowRunBackendId: string;
/**
* @generated from protobuf field: string name = 2;
*/
name: string;
/**
* @generated from protobuf field: int64 size = 3;
*/
size: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactResponse
*/
export interface FinalizeMigratedArtifactResponse {
/**
* @generated from protobuf field: bool ok = 1;
*/
ok: boolean;
/**
* @generated from protobuf field: int64 artifact_id = 2;
*/
artifactId: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.CreateArtifactRequest
*/
@@ -162,6 +222,12 @@ export interface ListArtifactsResponse_MonolithArtifact {
* @generated from protobuf field: google.protobuf.Timestamp created_at = 6;
*/
createdAt?: Timestamp;
/**
* The SHA-256 digest of the artifact, calculated on upload for upload-artifact v4 & newer
*
* @generated from protobuf field: google.protobuf.StringValue digest = 7;
*/
digest?: StringValue;
}
/**
* @generated from protobuf message github.actions.results.api.v1.GetSignedArtifactURLRequest
@@ -219,6 +285,46 @@ export interface DeleteArtifactResponse {
*/
artifactId: string;
}
declare class MigrateArtifactRequest$Type extends MessageType<MigrateArtifactRequest> {
constructor();
create(value?: PartialMessage<MigrateArtifactRequest>): MigrateArtifactRequest;
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: MigrateArtifactRequest): MigrateArtifactRequest;
internalBinaryWrite(message: MigrateArtifactRequest, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter;
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactRequest
*/
export declare const MigrateArtifactRequest: MigrateArtifactRequest$Type;
declare class MigrateArtifactResponse$Type extends MessageType<MigrateArtifactResponse> {
constructor();
create(value?: PartialMessage<MigrateArtifactResponse>): MigrateArtifactResponse;
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: MigrateArtifactResponse): MigrateArtifactResponse;
internalBinaryWrite(message: MigrateArtifactResponse, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter;
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactResponse
*/
export declare const MigrateArtifactResponse: MigrateArtifactResponse$Type;
declare class FinalizeMigratedArtifactRequest$Type extends MessageType<FinalizeMigratedArtifactRequest> {
constructor();
create(value?: PartialMessage<FinalizeMigratedArtifactRequest>): FinalizeMigratedArtifactRequest;
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: FinalizeMigratedArtifactRequest): FinalizeMigratedArtifactRequest;
internalBinaryWrite(message: FinalizeMigratedArtifactRequest, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter;
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactRequest
*/
export declare const FinalizeMigratedArtifactRequest: FinalizeMigratedArtifactRequest$Type;
declare class FinalizeMigratedArtifactResponse$Type extends MessageType<FinalizeMigratedArtifactResponse> {
constructor();
create(value?: PartialMessage<FinalizeMigratedArtifactResponse>): FinalizeMigratedArtifactResponse;
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: FinalizeMigratedArtifactResponse): FinalizeMigratedArtifactResponse;
internalBinaryWrite(message: FinalizeMigratedArtifactResponse, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter;
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactResponse
*/
export declare const FinalizeMigratedArtifactResponse: FinalizeMigratedArtifactResponse$Type;
declare class CreateArtifactRequest$Type extends MessageType<CreateArtifactRequest> {
constructor();
create(value?: PartialMessage<CreateArtifactRequest>): CreateArtifactRequest;

View File

@@ -1,6 +1,6 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.ArtifactService = exports.DeleteArtifactResponse = exports.DeleteArtifactRequest = exports.GetSignedArtifactURLResponse = exports.GetSignedArtifactURLRequest = exports.ListArtifactsResponse_MonolithArtifact = exports.ListArtifactsResponse = exports.ListArtifactsRequest = exports.FinalizeArtifactResponse = exports.FinalizeArtifactRequest = exports.CreateArtifactResponse = exports.CreateArtifactRequest = void 0;
exports.ArtifactService = exports.DeleteArtifactResponse = exports.DeleteArtifactRequest = exports.GetSignedArtifactURLResponse = exports.GetSignedArtifactURLRequest = exports.ListArtifactsResponse_MonolithArtifact = exports.ListArtifactsResponse = exports.ListArtifactsRequest = exports.FinalizeArtifactResponse = exports.FinalizeArtifactRequest = exports.CreateArtifactResponse = exports.CreateArtifactRequest = exports.FinalizeMigratedArtifactResponse = exports.FinalizeMigratedArtifactRequest = exports.MigrateArtifactResponse = exports.MigrateArtifactRequest = void 0;
// @generated by protobuf-ts 2.9.1 with parameter long_type_string,client_none,generate_dependencies
// @generated from protobuf file "results/api/v1/artifact.proto" (package "github.actions.results.api.v1", syntax proto3)
// tslint:disable
@@ -14,6 +14,236 @@ const wrappers_1 = require("../../../google/protobuf/wrappers");
const wrappers_2 = require("../../../google/protobuf/wrappers");
const timestamp_1 = require("../../../google/protobuf/timestamp");
// @generated message type with reflection information, may provide speed optimized methods
class MigrateArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.MigrateArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "expires_at", kind: "message", T: () => timestamp_1.Timestamp }
]);
}
create(value) {
const message = { workflowRunBackendId: "", name: "" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string name */ 2:
message.name = reader.string();
break;
case /* google.protobuf.Timestamp expires_at */ 3:
message.expiresAt = timestamp_1.Timestamp.internalBinaryRead(reader, reader.uint32(), options, message.expiresAt);
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, runtime_1.WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string name = 2; */
if (message.name !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.name);
/* google.protobuf.Timestamp expires_at = 3; */
if (message.expiresAt)
timestamp_1.Timestamp.internalBinaryWrite(message.expiresAt, writer.tag(3, runtime_1.WireType.LengthDelimited).fork(), options).join();
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactRequest
*/
exports.MigrateArtifactRequest = new MigrateArtifactRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class MigrateArtifactResponse$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.MigrateArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "signed_upload_url", kind: "scalar", T: 9 /*ScalarType.STRING*/ }
]);
}
create(value) {
const message = { ok: false, signedUploadUrl: "" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* string signed_upload_url */ 2:
message.signedUploadUrl = reader.string();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, runtime_1.WireType.Varint).bool(message.ok);
/* string signed_upload_url = 2; */
if (message.signedUploadUrl !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.signedUploadUrl);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactResponse
*/
exports.MigrateArtifactResponse = new MigrateArtifactResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class FinalizeMigratedArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.FinalizeMigratedArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "size", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value) {
const message = { workflowRunBackendId: "", name: "", size: "0" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string name */ 2:
message.name = reader.string();
break;
case /* int64 size */ 3:
message.size = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, runtime_1.WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string name = 2; */
if (message.name !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.name);
/* int64 size = 3; */
if (message.size !== "0")
writer.tag(3, runtime_1.WireType.Varint).int64(message.size);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactRequest
*/
exports.FinalizeMigratedArtifactRequest = new FinalizeMigratedArtifactRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class FinalizeMigratedArtifactResponse$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.FinalizeMigratedArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "artifact_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value) {
const message = { ok: false, artifactId: "0" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* int64 artifact_id */ 2:
message.artifactId = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, runtime_1.WireType.Varint).bool(message.ok);
/* int64 artifact_id = 2; */
if (message.artifactId !== "0")
writer.tag(2, runtime_1.WireType.Varint).int64(message.artifactId);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactResponse
*/
exports.FinalizeMigratedArtifactResponse = new FinalizeMigratedArtifactResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class CreateArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.CreateArtifactRequest", [
@@ -395,7 +625,8 @@ class ListArtifactsResponse_MonolithArtifact$Type extends runtime_5.MessageType
{ no: 3, name: "database_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ },
{ no: 4, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 5, name: "size", kind: "scalar", T: 3 /*ScalarType.INT64*/ },
{ no: 6, name: "created_at", kind: "message", T: () => timestamp_1.Timestamp }
{ no: 6, name: "created_at", kind: "message", T: () => timestamp_1.Timestamp },
{ no: 7, name: "digest", kind: "message", T: () => wrappers_2.StringValue }
]);
}
create(value) {
@@ -428,6 +659,9 @@ class ListArtifactsResponse_MonolithArtifact$Type extends runtime_5.MessageType
case /* google.protobuf.Timestamp created_at */ 6:
message.createdAt = timestamp_1.Timestamp.internalBinaryRead(reader, reader.uint32(), options, message.createdAt);
break;
case /* google.protobuf.StringValue digest */ 7:
message.digest = wrappers_2.StringValue.internalBinaryRead(reader, reader.uint32(), options, message.digest);
break;
default:
let u = options.readUnknownField;
if (u === "throw")
@@ -458,6 +692,9 @@ class ListArtifactsResponse_MonolithArtifact$Type extends runtime_5.MessageType
/* google.protobuf.Timestamp created_at = 6; */
if (message.createdAt)
timestamp_1.Timestamp.internalBinaryWrite(message.createdAt, writer.tag(6, runtime_1.WireType.LengthDelimited).fork(), options).join();
/* google.protobuf.StringValue digest = 7; */
if (message.digest)
wrappers_2.StringValue.internalBinaryWrite(message.digest, writer.tag(7, runtime_1.WireType.LengthDelimited).fork(), options).join();
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
@@ -699,6 +936,8 @@ exports.ArtifactService = new runtime_rpc_1.ServiceType("github.actions.results.
{ name: "FinalizeArtifact", options: {}, I: exports.FinalizeArtifactRequest, O: exports.FinalizeArtifactResponse },
{ name: "ListArtifacts", options: {}, I: exports.ListArtifactsRequest, O: exports.ListArtifactsResponse },
{ name: "GetSignedArtifactURL", options: {}, I: exports.GetSignedArtifactURLRequest, O: exports.GetSignedArtifactURLResponse },
{ name: "DeleteArtifact", options: {}, I: exports.DeleteArtifactRequest, O: exports.DeleteArtifactResponse }
{ name: "DeleteArtifact", options: {}, I: exports.DeleteArtifactRequest, O: exports.DeleteArtifactResponse },
{ name: "MigrateArtifact", options: {}, I: exports.MigrateArtifactRequest, O: exports.MigrateArtifactResponse },
{ name: "FinalizeMigratedArtifact", options: {}, I: exports.FinalizeMigratedArtifactRequest, O: exports.FinalizeMigratedArtifactResponse }
]);
//# sourceMappingURL=artifact.js.map
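
The generated serializers above omit proto3 default values on the wire (ok === false, empty strings, and size === "0" are skipped) and restore them on read. A small round-trip sketch using the exported MessageType instances; the import path is illustrative for this vendored module:

import {MigrateArtifactResponse} from './artifact'

const response = MigrateArtifactResponse.create({
  ok: true,
  signedUploadUrl: 'https://example.test/upload' // illustrative URL
})
const bytes = MigrateArtifactResponse.toBinary(response)  // default-valued fields would be left off the wire
const decoded = MigrateArtifactResponse.fromBinary(bytes) // absent fields come back as their defaults
console.log(decoded.ok, decoded.signedUploadUrl)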

File diff suppressed because one or more lines are too long

@@ -0,0 +1,30 @@
import { CreateArtifactRequest, CreateArtifactResponse, FinalizeArtifactRequest, FinalizeArtifactResponse, ListArtifactsRequest, ListArtifactsResponse, GetSignedArtifactURLRequest, GetSignedArtifactURLResponse, DeleteArtifactRequest, DeleteArtifactResponse } from "./artifact";
interface Rpc {
request(service: string, method: string, contentType: "application/json" | "application/protobuf", data: object | Uint8Array): Promise<object | Uint8Array>;
}
export interface ArtifactServiceClient {
CreateArtifact(request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export declare class ArtifactServiceClientJSON implements ArtifactServiceClient {
private readonly rpc;
constructor(rpc: Rpc);
CreateArtifact(request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export declare class ArtifactServiceClientProtobuf implements ArtifactServiceClient {
private readonly rpc;
constructor(rpc: Rpc);
CreateArtifact(request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export {};
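
The Rpc interface above is module-private, but anything with a matching request(service, method, contentType, data) method can be passed to the clients it declares. A minimal fetch-based transport sketch; the base URL, Twirp route prefix, bearer token, and environment variable names are illustrative assumptions, not values taken from this diff:

const resultsBaseUrl = process.env['ACTIONS_RESULTS_URL'] ?? 'https://example.test' // assumed for illustration
const token = process.env['ACTIONS_RUNTIME_TOKEN'] ?? ''                            // assumed for illustration

const jsonRpc = {
  async request(
    service: string,
    method: string,
    contentType: 'application/json' | 'application/protobuf',
    data: object | Uint8Array
  ): Promise<object | Uint8Array> {
    const res = await fetch(`${resultsBaseUrl}/twirp/${service}/${method}`, {
      method: 'POST',
      headers: {'Content-Type': contentType, Authorization: `Bearer ${token}`},
      body: contentType === 'application/json' ? JSON.stringify(data) : (data as Uint8Array)
    })
    if (!res.ok) {
      throw new Error(`${service}/${method} failed with HTTP ${res.status}`)
    }
    return contentType === 'application/json'
      ? ((await res.json()) as object)
      : new Uint8Array(await res.arrayBuffer())
  }
}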

@@ -0,0 +1,100 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.ArtifactServiceClientProtobuf = exports.ArtifactServiceClientJSON = void 0;
const artifact_1 = require("./artifact");
class ArtifactServiceClientJSON {
constructor(rpc) {
this.rpc = rpc;
this.CreateArtifact.bind(this);
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(request) {
const data = artifact_1.CreateArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "CreateArtifact", "application/json", data);
return promise.then((data) => artifact_1.CreateArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
FinalizeArtifact(request) {
const data = artifact_1.FinalizeArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "FinalizeArtifact", "application/json", data);
return promise.then((data) => artifact_1.FinalizeArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
ListArtifacts(request) {
const data = artifact_1.ListArtifactsRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "ListArtifacts", "application/json", data);
return promise.then((data) => artifact_1.ListArtifactsResponse.fromJson(data, { ignoreUnknownFields: true }));
}
GetSignedArtifactURL(request) {
const data = artifact_1.GetSignedArtifactURLRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "GetSignedArtifactURL", "application/json", data);
return promise.then((data) => artifact_1.GetSignedArtifactURLResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
DeleteArtifact(request) {
const data = artifact_1.DeleteArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "DeleteArtifact", "application/json", data);
return promise.then((data) => artifact_1.DeleteArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
}
exports.ArtifactServiceClientJSON = ArtifactServiceClientJSON;
class ArtifactServiceClientProtobuf {
constructor(rpc) {
this.rpc = rpc;
this.CreateArtifact.bind(this);
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(request) {
const data = artifact_1.CreateArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "CreateArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.CreateArtifactResponse.fromBinary(data));
}
FinalizeArtifact(request) {
const data = artifact_1.FinalizeArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "FinalizeArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.FinalizeArtifactResponse.fromBinary(data));
}
ListArtifacts(request) {
const data = artifact_1.ListArtifactsRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "ListArtifacts", "application/protobuf", data);
return promise.then((data) => artifact_1.ListArtifactsResponse.fromBinary(data));
}
GetSignedArtifactURL(request) {
const data = artifact_1.GetSignedArtifactURLRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "GetSignedArtifactURL", "application/protobuf", data);
return promise.then((data) => artifact_1.GetSignedArtifactURLResponse.fromBinary(data));
}
DeleteArtifact(request) {
const data = artifact_1.DeleteArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "DeleteArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.DeleteArtifactResponse.fromBinary(data));
}
}
exports.ArtifactServiceClientProtobuf = ArtifactServiceClientProtobuf;
//# sourceMappingURL=artifact.twirp-client.js.map
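
A hedged usage sketch of the regenerated JSON client with a transport shaped like the jsonRpc sketch earlier (any object exposing a compatible request method works); the backend IDs are illustrative placeholders and the import path assumes the vendored module layout:

import {ArtifactServiceClientJSON} from './artifact.twirp-client'

async function listArtifactsForRun(
  rpc: ConstructorParameters<typeof ArtifactServiceClientJSON>[0]
): Promise<void> {
  const client = new ArtifactServiceClientJSON(rpc)
  const {artifacts} = await client.ListArtifacts({
    workflowRunBackendId: 'run-backend-id',   // illustrative placeholder
    workflowJobRunBackendId: 'job-backend-id' // illustrative placeholder
  })
  for (const artifact of artifacts) {
    console.log(artifact.name, artifact.digest?.value) // digest is the new StringValue field
  }
}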

@@ -0,0 +1 @@
{"version":3,"file":"artifact.twirp-client.js","sourceRoot":"","sources":["../../../../../src/generated/results/api/v1/artifact.twirp-client.ts"],"names":[],"mappings":";;;AAAA,yCAWoB;AA+BpB,MAAa,yBAAyB;IAEpC,YAAY,GAAQ;QAClB,IAAI,CAAC,GAAG,GAAG,GAAG,CAAC;QACf,IAAI,CAAC,cAAc,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QAC/B,IAAI,CAAC,gBAAgB,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QACjC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QAC9B,IAAI,CAAC,oBAAoB,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QACrC,IAAI,CAAC,cAAc,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;IACjC,CAAC;IACD,cAAc,CACZ,OAA8B;QAE9B,MAAM,IAAI,GAAG,gCAAqB,CAAC,MAAM,CAAC,OAAO,EAAE;YACjD,iBAAiB,EAAE,IAAI;YACvB,iBAAiB,EAAE,KAAK;SACzB,CAAC,CAAC;QACH,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,gBAAgB,EAChB,kBAAkB,EAClB,IAAc,CACf,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,iCAAsB,CAAC,QAAQ,CAAC,IAAW,EAAE;YAC3C,mBAAmB,EAAE,IAAI;SAC1B,CAAC,CACH,CAAC;IACJ,CAAC;IAED,gBAAgB,CACd,OAAgC;QAEhC,MAAM,IAAI,GAAG,kCAAuB,CAAC,MAAM,CAAC,OAAO,EAAE;YACnD,iBAAiB,EAAE,IAAI;YACvB,iBAAiB,EAAE,KAAK;SACzB,CAAC,CAAC;QACH,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,kBAAkB,EAClB,kBAAkB,EAClB,IAAc,CACf,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,mCAAwB,CAAC,QAAQ,CAAC,IAAW,EAAE;YAC7C,mBAAmB,EAAE,IAAI;SAC1B,CAAC,CACH,CAAC;IACJ,CAAC;IAED,aAAa,CAAC,OAA6B;QACzC,MAAM,IAAI,GAAG,+BAAoB,CAAC,MAAM,CAAC,OAAO,EAAE;YAChD,iBAAiB,EAAE,IAAI;YACvB,iBAAiB,EAAE,KAAK;SACzB,CAAC,CAAC;QACH,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,eAAe,EACf,kBAAkB,EAClB,IAAc,CACf,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,gCAAqB,CAAC,QAAQ,CAAC,IAAW,EAAE,EAAE,mBAAmB,EAAE,IAAI,EAAE,CAAC,CAC3E,CAAC;IACJ,CAAC;IAED,oBAAoB,CAClB,OAAoC;QAEpC,MAAM,IAAI,GAAG,sCAA2B,CAAC,MAAM,CAAC,OAAO,EAAE;YACvD,iBAAiB,EAAE,IAAI;YACvB,iBAAiB,EAAE,KAAK;SACzB,CAAC,CAAC;QACH,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,sBAAsB,EACtB,kBAAkB,EAClB,IAAc,CACf,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,uCAA4B,CAAC,QAAQ,CAAC,IAAW,EAAE;YACjD,mBAAmB,EAAE,IAAI;SAC1B,CAAC,CACH,CAAC;IACJ,CAAC;IAED,cAAc,CACZ,OAA8B;QAE9B,MAAM,IAAI,GAAG,gCAAqB,CAAC,MAAM,CAAC,OAAO,EAAE;YACjD,iBAAiB,EAAE,IAAI;YACvB,iBAAiB,EAAE,KAAK;SACzB,CAAC,CAAC;QACH,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,gBAAgB,EAChB,kBAAkB,EAClB,IAAc,CACf,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,iCAAsB,CAAC,QAAQ,CAAC,IAAW,EAAE;YAC3C,mBAAmB,EAAE,IAAI;SAC1B,CAAC,CACH,CAAC;IACJ,CAAC;CACF;AAzGD,8DAyGC;AAED,MAAa,6BAA6B;IAExC,YAAY,GAAQ;QAClB,IAAI,CAAC,GAAG,GAAG,GAAG,CAAC;QACf,IAAI,CAAC,cAAc,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QAC/B,IAAI,CAAC,gBAAgB,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QACjC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QAC9B,IAAI,CAAC,oBAAoB,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QACrC,IAAI,CAAC,cAAc,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;IACjC,CAAC;IACD,cAAc,CACZ,OAA8B;QAE9B,MAAM,IAAI,GAAG,gCAAqB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC;QACrD,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,gBAAgB,EAChB,sBAAsB,EACtB,IAAI,CACL,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,iCAAsB,CAAC,UAAU,CAAC,IAAkB,CAAC,CACtD,CAAC;IACJ,CAAC;IAED,gBAAgB,CACd,OAAgC;QAEhC,MAAM,IAAI,GAAG,kCAAuB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC;QACvD,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,kBAAkB,EAClB,sBAAsB,EACtB,IAAI,CACL,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,mCAAwB,CAAC,UAAU,CAAC,IAAkB,CAAC,CACxD,CAAC;IACJ,CAAC;IAED,aAAa,CAAC,OAA6B;QACzC,MAAM,IAAI,GAAG,+BAAoB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC;QACpD,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,eAAe,EACf,sBAAsB,EACtB,
IAAI,CACL,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,gCAAqB,CAAC,UAAU,CAAC,IAAkB,CAAC,CACrD,CAAC;IACJ,CAAC;IAED,oBAAoB,CAClB,OAAoC;QAEpC,MAAM,IAAI,GAAG,sCAA2B,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC;QAC3D,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,sBAAsB,EACtB,sBAAsB,EACtB,IAAI,CACL,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,uCAA4B,CAAC,UAAU,CAAC,IAAkB,CAAC,CAC5D,CAAC;IACJ,CAAC;IAED,cAAc,CACZ,OAA8B;QAE9B,MAAM,IAAI,GAAG,gCAAqB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC;QACrD,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,gBAAgB,EAChB,sBAAsB,EACtB,IAAI,CACL,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,iCAAsB,CAAC,UAAU,CAAC,IAAkB,CAAC,CACtD,CAAC;IACJ,CAAC;CACF;AAlFD,sEAkFC"}

@@ -1,48 +0,0 @@
/// <reference types="node" />
import { TwirpContext, TwirpServer } from "twirp-ts";
import { CreateArtifactRequest, CreateArtifactResponse, FinalizeArtifactRequest, FinalizeArtifactResponse, ListArtifactsRequest, ListArtifactsResponse, GetSignedArtifactURLRequest, GetSignedArtifactURLResponse, DeleteArtifactRequest, DeleteArtifactResponse } from "./artifact";
interface Rpc {
request(service: string, method: string, contentType: "application/json" | "application/protobuf", data: object | Uint8Array): Promise<object | Uint8Array>;
}
export interface ArtifactServiceClient {
CreateArtifact(request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export declare class ArtifactServiceClientJSON implements ArtifactServiceClient {
private readonly rpc;
constructor(rpc: Rpc);
CreateArtifact(request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export declare class ArtifactServiceClientProtobuf implements ArtifactServiceClient {
private readonly rpc;
constructor(rpc: Rpc);
CreateArtifact(request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export interface ArtifactServiceTwirp<T extends TwirpContext = TwirpContext> {
CreateArtifact(ctx: T, request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(ctx: T, request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(ctx: T, request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(ctx: T, request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(ctx: T, request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export declare enum ArtifactServiceMethod {
CreateArtifact = "CreateArtifact",
FinalizeArtifact = "FinalizeArtifact",
ListArtifacts = "ListArtifacts",
GetSignedArtifactURL = "GetSignedArtifactURL",
DeleteArtifact = "DeleteArtifact"
}
export declare const ArtifactServiceMethodList: ArtifactServiceMethod[];
export declare function createArtifactServiceServer<T extends TwirpContext = TwirpContext>(service: ArtifactServiceTwirp<T>): TwirpServer<ArtifactServiceTwirp<TwirpContext<import("http").IncomingMessage, import("http").ServerResponse<import("http").IncomingMessage>>>, T>;
export {};

@@ -1,508 +0,0 @@
"use strict";
var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
return new (P || (P = Promise))(function (resolve, reject) {
function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }
step((generator = generator.apply(thisArg, _arguments || [])).next());
});
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.createArtifactServiceServer = exports.ArtifactServiceMethodList = exports.ArtifactServiceMethod = exports.ArtifactServiceClientProtobuf = exports.ArtifactServiceClientJSON = void 0;
const twirp_ts_1 = require("twirp-ts");
const artifact_1 = require("./artifact");
class ArtifactServiceClientJSON {
constructor(rpc) {
this.rpc = rpc;
this.CreateArtifact.bind(this);
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(request) {
const data = artifact_1.CreateArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "CreateArtifact", "application/json", data);
return promise.then((data) => artifact_1.CreateArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
FinalizeArtifact(request) {
const data = artifact_1.FinalizeArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "FinalizeArtifact", "application/json", data);
return promise.then((data) => artifact_1.FinalizeArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
ListArtifacts(request) {
const data = artifact_1.ListArtifactsRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "ListArtifacts", "application/json", data);
return promise.then((data) => artifact_1.ListArtifactsResponse.fromJson(data, { ignoreUnknownFields: true }));
}
GetSignedArtifactURL(request) {
const data = artifact_1.GetSignedArtifactURLRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "GetSignedArtifactURL", "application/json", data);
return promise.then((data) => artifact_1.GetSignedArtifactURLResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
DeleteArtifact(request) {
const data = artifact_1.DeleteArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "DeleteArtifact", "application/json", data);
return promise.then((data) => artifact_1.DeleteArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
}
exports.ArtifactServiceClientJSON = ArtifactServiceClientJSON;
class ArtifactServiceClientProtobuf {
constructor(rpc) {
this.rpc = rpc;
this.CreateArtifact.bind(this);
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(request) {
const data = artifact_1.CreateArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "CreateArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.CreateArtifactResponse.fromBinary(data));
}
FinalizeArtifact(request) {
const data = artifact_1.FinalizeArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "FinalizeArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.FinalizeArtifactResponse.fromBinary(data));
}
ListArtifacts(request) {
const data = artifact_1.ListArtifactsRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "ListArtifacts", "application/protobuf", data);
return promise.then((data) => artifact_1.ListArtifactsResponse.fromBinary(data));
}
GetSignedArtifactURL(request) {
const data = artifact_1.GetSignedArtifactURLRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "GetSignedArtifactURL", "application/protobuf", data);
return promise.then((data) => artifact_1.GetSignedArtifactURLResponse.fromBinary(data));
}
DeleteArtifact(request) {
const data = artifact_1.DeleteArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "DeleteArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.DeleteArtifactResponse.fromBinary(data));
}
}
exports.ArtifactServiceClientProtobuf = ArtifactServiceClientProtobuf;
var ArtifactServiceMethod;
(function (ArtifactServiceMethod) {
ArtifactServiceMethod["CreateArtifact"] = "CreateArtifact";
ArtifactServiceMethod["FinalizeArtifact"] = "FinalizeArtifact";
ArtifactServiceMethod["ListArtifacts"] = "ListArtifacts";
ArtifactServiceMethod["GetSignedArtifactURL"] = "GetSignedArtifactURL";
ArtifactServiceMethod["DeleteArtifact"] = "DeleteArtifact";
})(ArtifactServiceMethod || (exports.ArtifactServiceMethod = ArtifactServiceMethod = {}));
exports.ArtifactServiceMethodList = [
ArtifactServiceMethod.CreateArtifact,
ArtifactServiceMethod.FinalizeArtifact,
ArtifactServiceMethod.ListArtifacts,
ArtifactServiceMethod.GetSignedArtifactURL,
ArtifactServiceMethod.DeleteArtifact,
];
function createArtifactServiceServer(service) {
return new twirp_ts_1.TwirpServer({
service,
packageName: "github.actions.results.api.v1",
serviceName: "ArtifactService",
methodList: exports.ArtifactServiceMethodList,
matchRoute: matchArtifactServiceRoute,
});
}
exports.createArtifactServiceServer = createArtifactServiceServer;
function matchArtifactServiceRoute(method, events) {
switch (method) {
case "CreateArtifact":
return (ctx, service, data, interceptors) => __awaiter(this, void 0, void 0, function* () {
ctx = Object.assign(Object.assign({}, ctx), { methodName: "CreateArtifact" });
yield events.onMatch(ctx);
return handleArtifactServiceCreateArtifactRequest(ctx, service, data, interceptors);
});
case "FinalizeArtifact":
return (ctx, service, data, interceptors) => __awaiter(this, void 0, void 0, function* () {
ctx = Object.assign(Object.assign({}, ctx), { methodName: "FinalizeArtifact" });
yield events.onMatch(ctx);
return handleArtifactServiceFinalizeArtifactRequest(ctx, service, data, interceptors);
});
case "ListArtifacts":
return (ctx, service, data, interceptors) => __awaiter(this, void 0, void 0, function* () {
ctx = Object.assign(Object.assign({}, ctx), { methodName: "ListArtifacts" });
yield events.onMatch(ctx);
return handleArtifactServiceListArtifactsRequest(ctx, service, data, interceptors);
});
case "GetSignedArtifactURL":
return (ctx, service, data, interceptors) => __awaiter(this, void 0, void 0, function* () {
ctx = Object.assign(Object.assign({}, ctx), { methodName: "GetSignedArtifactURL" });
yield events.onMatch(ctx);
return handleArtifactServiceGetSignedArtifactURLRequest(ctx, service, data, interceptors);
});
case "DeleteArtifact":
return (ctx, service, data, interceptors) => __awaiter(this, void 0, void 0, function* () {
ctx = Object.assign(Object.assign({}, ctx), { methodName: "DeleteArtifact" });
yield events.onMatch(ctx);
return handleArtifactServiceDeleteArtifactRequest(ctx, service, data, interceptors);
});
default:
events.onNotFound();
const msg = `no handler found`;
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceCreateArtifactRequest(ctx, service, data, interceptors) {
switch (ctx.contentType) {
case twirp_ts_1.TwirpContentType.JSON:
return handleArtifactServiceCreateArtifactJSON(ctx, service, data, interceptors);
case twirp_ts_1.TwirpContentType.Protobuf:
return handleArtifactServiceCreateArtifactProtobuf(ctx, service, data, interceptors);
default:
const msg = "unexpected Content-Type";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceFinalizeArtifactRequest(ctx, service, data, interceptors) {
switch (ctx.contentType) {
case twirp_ts_1.TwirpContentType.JSON:
return handleArtifactServiceFinalizeArtifactJSON(ctx, service, data, interceptors);
case twirp_ts_1.TwirpContentType.Protobuf:
return handleArtifactServiceFinalizeArtifactProtobuf(ctx, service, data, interceptors);
default:
const msg = "unexpected Content-Type";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceListArtifactsRequest(ctx, service, data, interceptors) {
switch (ctx.contentType) {
case twirp_ts_1.TwirpContentType.JSON:
return handleArtifactServiceListArtifactsJSON(ctx, service, data, interceptors);
case twirp_ts_1.TwirpContentType.Protobuf:
return handleArtifactServiceListArtifactsProtobuf(ctx, service, data, interceptors);
default:
const msg = "unexpected Content-Type";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceGetSignedArtifactURLRequest(ctx, service, data, interceptors) {
switch (ctx.contentType) {
case twirp_ts_1.TwirpContentType.JSON:
return handleArtifactServiceGetSignedArtifactURLJSON(ctx, service, data, interceptors);
case twirp_ts_1.TwirpContentType.Protobuf:
return handleArtifactServiceGetSignedArtifactURLProtobuf(ctx, service, data, interceptors);
default:
const msg = "unexpected Content-Type";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceDeleteArtifactRequest(ctx, service, data, interceptors) {
switch (ctx.contentType) {
case twirp_ts_1.TwirpContentType.JSON:
return handleArtifactServiceDeleteArtifactJSON(ctx, service, data, interceptors);
case twirp_ts_1.TwirpContentType.Protobuf:
return handleArtifactServiceDeleteArtifactProtobuf(ctx, service, data, interceptors);
default:
const msg = "unexpected Content-Type";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceCreateArtifactJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
const body = JSON.parse(data.toString() || "{}");
request = artifact_1.CreateArtifactRequest.fromJson(body, {
ignoreUnknownFields: true,
});
}
catch (e) {
if (e instanceof Error) {
const msg = "the json request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.CreateArtifact(ctx, inputReq);
});
}
else {
response = yield service.CreateArtifact(ctx, request);
}
return JSON.stringify(artifact_1.CreateArtifactResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false,
}));
});
}
function handleArtifactServiceFinalizeArtifactJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
const body = JSON.parse(data.toString() || "{}");
request = artifact_1.FinalizeArtifactRequest.fromJson(body, {
ignoreUnknownFields: true,
});
}
catch (e) {
if (e instanceof Error) {
const msg = "the json request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.FinalizeArtifact(ctx, inputReq);
});
}
else {
response = yield service.FinalizeArtifact(ctx, request);
}
return JSON.stringify(artifact_1.FinalizeArtifactResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false,
}));
});
}
function handleArtifactServiceListArtifactsJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
const body = JSON.parse(data.toString() || "{}");
request = artifact_1.ListArtifactsRequest.fromJson(body, {
ignoreUnknownFields: true,
});
}
catch (e) {
if (e instanceof Error) {
const msg = "the json request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.ListArtifacts(ctx, inputReq);
});
}
else {
response = yield service.ListArtifacts(ctx, request);
}
return JSON.stringify(artifact_1.ListArtifactsResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false,
}));
});
}
function handleArtifactServiceGetSignedArtifactURLJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
const body = JSON.parse(data.toString() || "{}");
request = artifact_1.GetSignedArtifactURLRequest.fromJson(body, {
ignoreUnknownFields: true,
});
}
catch (e) {
if (e instanceof Error) {
const msg = "the json request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.GetSignedArtifactURL(ctx, inputReq);
});
}
else {
response = yield service.GetSignedArtifactURL(ctx, request);
}
return JSON.stringify(artifact_1.GetSignedArtifactURLResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false,
}));
});
}
function handleArtifactServiceDeleteArtifactJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
const body = JSON.parse(data.toString() || "{}");
request = artifact_1.DeleteArtifactRequest.fromJson(body, {
ignoreUnknownFields: true,
});
}
catch (e) {
if (e instanceof Error) {
const msg = "the json request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.DeleteArtifact(ctx, inputReq);
});
}
else {
response = yield service.DeleteArtifact(ctx, request);
}
return JSON.stringify(artifact_1.DeleteArtifactResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false,
}));
});
}
function handleArtifactServiceCreateArtifactProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
request = artifact_1.CreateArtifactRequest.fromBinary(data);
}
catch (e) {
if (e instanceof Error) {
const msg = "the protobuf request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.CreateArtifact(ctx, inputReq);
});
}
else {
response = yield service.CreateArtifact(ctx, request);
}
return Buffer.from(artifact_1.CreateArtifactResponse.toBinary(response));
});
}
function handleArtifactServiceFinalizeArtifactProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
request = artifact_1.FinalizeArtifactRequest.fromBinary(data);
}
catch (e) {
if (e instanceof Error) {
const msg = "the protobuf request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.FinalizeArtifact(ctx, inputReq);
});
}
else {
response = yield service.FinalizeArtifact(ctx, request);
}
return Buffer.from(artifact_1.FinalizeArtifactResponse.toBinary(response));
});
}
function handleArtifactServiceListArtifactsProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
request = artifact_1.ListArtifactsRequest.fromBinary(data);
}
catch (e) {
if (e instanceof Error) {
const msg = "the protobuf request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.ListArtifacts(ctx, inputReq);
});
}
else {
response = yield service.ListArtifacts(ctx, request);
}
return Buffer.from(artifact_1.ListArtifactsResponse.toBinary(response));
});
}
function handleArtifactServiceGetSignedArtifactURLProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
request = artifact_1.GetSignedArtifactURLRequest.fromBinary(data);
}
catch (e) {
if (e instanceof Error) {
const msg = "the protobuf request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.GetSignedArtifactURL(ctx, inputReq);
});
}
else {
response = yield service.GetSignedArtifactURL(ctx, request);
}
return Buffer.from(artifact_1.GetSignedArtifactURLResponse.toBinary(response));
});
}
function handleArtifactServiceDeleteArtifactProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
request = artifact_1.DeleteArtifactRequest.fromBinary(data);
}
catch (e) {
if (e instanceof Error) {
const msg = "the protobuf request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.DeleteArtifact(ctx, inputReq);
});
}
else {
response = yield service.DeleteArtifact(ctx, request);
}
return Buffer.from(artifact_1.DeleteArtifactResponse.toBinary(response));
});
}
//# sourceMappingURL=artifact.twirp.js.map

File diff suppressed because one or more lines are too long

@@ -1,4 +1,4 @@
import { DownloadArtifactOptions, DownloadArtifactResponse } from '../shared/interfaces';
export declare function streamExtractExternal(url: string, directory: string): Promise<void>;
import { DownloadArtifactOptions, DownloadArtifactResponse, StreamExtractResponse } from '../shared/interfaces';
export declare function streamExtractExternal(url: string, directory: string): Promise<StreamExtractResponse>;
export declare function downloadArtifactPublic(artifactId: number, repositoryOwner: string, repositoryName: string, token: string, options?: DownloadArtifactOptions): Promise<DownloadArtifactResponse>;
export declare function downloadArtifactInternal(artifactId: number, options?: DownloadArtifactOptions): Promise<DownloadArtifactResponse>;
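
The new StreamExtractResponse type itself is not shown in this hunk; judging from the implementation below, which resolves an object of the form { sha256Digest: 'sha256:<hex>' }, its shape is presumably something like:

// Assumed shape, reconstructed from the resolve() call in streamExtractExternal below.
interface StreamExtractResponse {
  sha256Digest?: string // 'sha256:<hex>' of the downloaded archive, when it was computed
}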

@@ -37,6 +37,8 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
Object.defineProperty(exports, "__esModule", { value: true });
exports.downloadArtifactInternal = exports.downloadArtifactPublic = exports.streamExtractExternal = void 0;
const promises_1 = __importDefault(require("fs/promises"));
const crypto = __importStar(require("crypto"));
const stream = __importStar(require("stream"));
const github = __importStar(require("@actions/github"));
const core = __importStar(require("@actions/core"));
const httpClient = __importStar(require("@actions/http-client"));
@@ -73,8 +75,7 @@ function streamExtract(url, directory) {
let retryCount = 0;
while (retryCount < 5) {
try {
yield streamExtractExternal(url, directory);
return;
return yield streamExtractExternal(url, directory);
}
catch (error) {
retryCount++;
@@ -94,12 +95,18 @@ function streamExtractExternal(url, directory) {
throw new Error(`Unexpected HTTP response from blob storage: ${response.message.statusCode} ${response.message.statusMessage}`);
}
const timeout = 30 * 1000; // 30 seconds
let sha256Digest = undefined;
return new Promise((resolve, reject) => {
const timerFn = () => {
response.message.destroy(new Error(`Blob storage chunk did not respond in ${timeout}ms`));
};
const timer = setTimeout(timerFn, timeout);
response.message
const hashStream = crypto.createHash('sha256').setEncoding('hex');
const passThrough = new stream.PassThrough();
response.message.pipe(passThrough);
passThrough.pipe(hashStream);
const extractStream = passThrough;
extractStream
.on('data', () => {
timer.refresh();
})
@@ -111,7 +118,12 @@ function streamExtractExternal(url, directory) {
.pipe(unzip_stream_1.default.Extract({ path: directory }))
.on('close', () => {
clearTimeout(timer);
resolve();
if (hashStream) {
hashStream.end();
sha256Digest = hashStream.read();
core.info(`SHA256 digest of downloaded artifact is ${sha256Digest}`);
}
resolve({ sha256Digest: `sha256:${sha256Digest}` });
})
.on('error', (error) => {
reject(error);
@@ -124,6 +136,7 @@ function downloadArtifactPublic(artifactId, repositoryOwner, repositoryName, tok
return __awaiter(this, void 0, void 0, function* () {
const downloadPath = yield resolveOrCreateDirectory(options === null || options === void 0 ? void 0 : options.path);
const api = github.getOctokit(token);
let digestMismatch = false;
core.info(`Downloading artifact '${artifactId}' from '${repositoryOwner}/${repositoryName}'`);
const { headers, status } = yield api.rest.actions.downloadArtifact({
owner: repositoryOwner,
@@ -144,13 +157,20 @@ function downloadArtifactPublic(artifactId, repositoryOwner, repositoryName, tok
core.info(`Redirecting to blob download url: ${scrubQueryParameters(location)}`);
try {
core.info(`Starting download of artifact to: ${downloadPath}`);
yield streamExtract(location, downloadPath);
const extractResponse = yield streamExtract(location, downloadPath);
core.info(`Artifact download completed successfully.`);
if (options === null || options === void 0 ? void 0 : options.expectedHash) {
if ((options === null || options === void 0 ? void 0 : options.expectedHash) !== extractResponse.sha256Digest) {
digestMismatch = true;
core.debug(`Computed digest: ${extractResponse.sha256Digest}`);
core.debug(`Expected digest: ${options.expectedHash}`);
}
}
}
catch (error) {
throw new Error(`Unable to download and extract artifact: ${error.message}`);
}
return { downloadPath };
return { downloadPath, digestMismatch };
});
}
exports.downloadArtifactPublic = downloadArtifactPublic;
@@ -158,6 +178,7 @@ function downloadArtifactInternal(artifactId, options) {
return __awaiter(this, void 0, void 0, function* () {
const downloadPath = yield resolveOrCreateDirectory(options === null || options === void 0 ? void 0 : options.path);
const artifactClient = (0, artifact_twirp_client_1.internalArtifactTwirpClient)();
let digestMismatch = false;
const { workflowRunBackendId, workflowJobRunBackendId } = (0, util_1.getBackendIdsFromToken)();
const listReq = {
workflowRunBackendId,
@@ -180,13 +201,20 @@ function downloadArtifactInternal(artifactId, options) {
core.info(`Redirecting to blob download url: ${scrubQueryParameters(signedUrl)}`);
try {
core.info(`Starting download of artifact to: ${downloadPath}`);
yield streamExtract(signedUrl, downloadPath);
const extractResponse = yield streamExtract(signedUrl, downloadPath);
core.info(`Artifact download completed successfully.`);
if (options === null || options === void 0 ? void 0 : options.expectedHash) {
if ((options === null || options === void 0 ? void 0 : options.expectedHash) !== extractResponse.sha256Digest) {
digestMismatch = true;
core.debug(`Computed digest: ${extractResponse.sha256Digest}`);
core.debug(`Expected digest: ${options.expectedHash}`);
}
}
}
catch (error) {
throw new Error(`Unable to download and extract artifact: ${error.message}`);
}
return { downloadPath };
return { downloadPath, digestMismatch };
});
}
exports.downloadArtifactInternal = downloadArtifactInternal;
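
The streamExtractExternal change above tees the response through a PassThrough so a SHA-256 digest can be computed alongside extraction, and both download paths now report digestMismatch when an expectedHash is supplied. A standalone sketch of the same tee-and-hash pattern, with illustrative file paths:

import * as crypto from 'crypto'
import * as fs from 'fs'
import {PassThrough} from 'stream'

// Copy src to dest while computing the SHA-256 of the bytes that flowed through.
function copyWithDigest(src: string, dest: string): Promise<string> {
  return new Promise((resolve, reject) => {
    const hashStream = crypto.createHash('sha256').setEncoding('hex')
    const tee = new PassThrough()
    fs.createReadStream(src).on('error', reject).pipe(tee)
    tee.pipe(hashStream) // one branch feeds the hash...
    tee
      .pipe(fs.createWriteStream(dest)) // ...the other feeds the real consumer
      .on('close', () => {
        hashStream.end()
        resolve(`sha256:${hashStream.read()}`) // hex digest is readable once the hash stream ends
      })
      .on('error', reject)
  })
}

// e.g. copyWithDigest('artifact.zip', 'copy.zip').then(digest => console.log(digest))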

File diff suppressed because one or more lines are too long

@@ -80,13 +80,17 @@ function getArtifactPublic(artifactName, workflowRunId, repositoryOwner, reposit
name: artifact.name,
id: artifact.id,
size: artifact.size_in_bytes,
createdAt: artifact.created_at ? new Date(artifact.created_at) : undefined
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: artifact.digest
}
};
});
}
exports.getArtifactPublic = getArtifactPublic;
function getArtifactInternal(artifactName) {
var _a;
return __awaiter(this, void 0, void 0, function* () {
const artifactClient = (0, artifact_twirp_client_1.internalArtifactTwirpClient)();
const { workflowRunBackendId, workflowJobRunBackendId } = (0, util_1.getBackendIdsFromToken)();
@@ -113,7 +117,8 @@ function getArtifactInternal(artifactName) {
size: Number(artifact.size),
createdAt: artifact.createdAt
? generated_1.Timestamp.toDate(artifact.createdAt)
: undefined
: undefined,
digest: (_a = artifact.digest) === null || _a === void 0 ? void 0 : _a.value
}
};
});
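
Taken together with the download changes, getArtifact now surfaces the artifact digest and downloadArtifact can verify it. A hedged end-to-end sketch, assuming the package-level DefaultArtifactClient wires these internals through unchanged and that the code runs inside a workflow job; the result field names follow the diffs above:

import {DefaultArtifactClient} from '@actions/artifact'

async function downloadVerified(name: string) {
  const client = new DefaultArtifactClient()
  const {artifact} = await client.getArtifact(name)
  const result = await client.downloadArtifact(artifact.id, {
    // artifact.digest is the new field; the download path compares it against
    // the sha256:<hex> it computes while extracting and reports any mismatch.
    expectedHash: artifact.digest
  })
  if (result.digestMismatch) {
    throw new Error(`digest mismatch for artifact '${name}'`)
  }
  return result.downloadPath
}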

@@ -1 +1 @@
{"version":3,"file":"get-artifact.js","sourceRoot":"","sources":["../../../src/internal/find/get-artifact.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,4CAA0C;AAC1C,wDAA2C;AAC3C,oDAAqC;AAErC,qDAA0E;AAC1E,mDAA+C;AAC/C,oEAAsD;AAEtD,yCAAqD;AACrD,qDAAuD;AACvD,2EAA2E;AAC3E,+CAA4E;AAC5E,6CAA4E;AAE5E,SAAsB,iBAAiB,CACrC,YAAoB,EACpB,aAAqB,EACrB,eAAuB,EACvB,cAAsB,EACtB,KAAa;;;QAEb,MAAM,CAAC,SAAS,EAAE,WAAW,CAAC,GAAG,IAAA,+BAAe,EAAC,gBAAoB,CAAC,CAAA;QAEtE,MAAM,IAAI,GAAmB;YAC3B,GAAG,EAAE,SAAS;YACd,SAAS,EAAE,IAAA,+BAAkB,GAAE;YAC/B,QAAQ,EAAE,SAAS;YACnB,KAAK,EAAE,SAAS;YAChB,OAAO,EAAE,WAAW;SACrB,CAAA;QAED,MAAM,MAAM,GAAG,IAAA,mBAAU,EAAC,KAAK,EAAE,IAAI,EAAE,oBAAK,EAAE,+BAAU,CAAC,CAAA;QAEzD,MAAM,eAAe,GAAG,MAAM,MAAM,CAAC,OAAO,CAC1C,kEAAkE,EAClE;YACE,KAAK,EAAE,eAAe;YACtB,IAAI,EAAE,cAAc;YACpB,MAAM,EAAE,aAAa;YACrB,IAAI,EAAE,YAAY;SACnB,CACF,CAAA;QAED,IAAI,eAAe,CAAC,MAAM,KAAK,GAAG,EAAE;YAClC,MAAM,IAAI,6BAAoB,CAC5B,qCAAqC,eAAe,CAAC,MAAM,KAAK,MAAA,eAAe,aAAf,eAAe,uBAAf,eAAe,CAAE,OAAO,0CAAG,qBAAqB,CAAC,GAAG,CACrH,CAAA;SACF;QAED,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;YAC/C,MAAM,IAAI,8BAAqB,CAC7B,gCAAgC,YAAY;;yIAEuF,CACpI,CAAA;SACF;QAED,IAAI,QAAQ,GAAG,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAA;QAChD,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YAC7C,QAAQ,GAAG,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAA;YACxE,IAAI,CAAC,KAAK,CACR,yEAAyE,QAAQ,CAAC,EAAE,GAAG,CACxF,CAAA;SACF;QAED,OAAO;YACL,QAAQ,EAAE;gBACR,IAAI,EAAE,QAAQ,CAAC,IAAI;gBACnB,EAAE,EAAE,QAAQ,CAAC,EAAE;gBACf,IAAI,EAAE,QAAQ,CAAC,aAAa;gBAC5B,SAAS,EAAE,QAAQ,CAAC,UAAU,CAAC,CAAC,CAAC,IAAI,IAAI,CAAC,QAAQ,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,SAAS;aAC3E;SACF,CAAA;;CACF;AA3DD,8CA2DC;AAED,SAAsB,mBAAmB,CACvC,YAAoB;;QAEpB,MAAM,cAAc,GAAG,IAAA,mDAA2B,GAAE,CAAA;QAEpD,MAAM,EAAC,oBAAoB,EAAE,uBAAuB,EAAC,GACnD,IAAA,6BAAsB,GAAE,CAAA;QAE1B,MAAM,GAAG,GAAyB;YAChC,oBAAoB;YACpB,uBAAuB;YACvB,UAAU,EAAE,uBAAW,CAAC,MAAM,CAAC,EAAC,KAAK,EAAE,YAAY,EAAC,CAAC;SACtD,CAAA;QAED,MAAM,GAAG,GAAG,MAAM,cAAc,CAAC,aAAa,CAAC,GAAG,CAAC,CAAA;QAEnD,IAAI,GAAG,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;YAC9B,MAAM,IAAI,8BAAqB,CAC7B,gCAAgC,YAAY;;yIAEuF,CACpI,CAAA;SACF;QAED,IAAI,QAAQ,GAAG,GAAG,CAAC,SAAS,CAAC,CAAC,CAAC,CAAA;QAC/B,IAAI,GAAG,CAAC,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YAC5B,QAAQ,GAAG,GAAG,CAAC,SAAS,CAAC,IAAI,CAC3B,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,MAAM,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,MAAM,CAAC,CAAC,CAAC,UAAU,CAAC,CACtD,CAAC,CAAC,CAAC,CAAA;YAEJ,IAAI,CAAC,KAAK,CACR,yEAAyE,QAAQ,CAAC,UAAU,GAAG,CAChG,CAAA;SACF;QAED,OAAO;YACL,QAAQ,EAAE;gBACR,IAAI,EAAE,QAAQ,CAAC,IAAI;gBACnB,EAAE,EAAE,MAAM,CAAC,QAAQ,CAAC,UAAU,CAAC;gBAC/B,IAAI,EAAE,MAAM,CAAC,QAAQ,CAAC,IAAI,CAAC;gBAC3B,SAAS,EAAE,QAAQ,CAAC,SAAS;oBAC3B,CAAC,CAAC,qBAAS,CAAC,MAAM,CAAC,QAAQ,CAAC,SAAS,CAAC;oBACtC,CAAC,CAAC,SAAS;aACd;SACF,CAAA;IACH,CAAC;CAAA;AA7CD,kDA6CC"}
{"version":3,"file":"get-artifact.js","sourceRoot":"","sources":["../../../src/internal/find/get-artifact.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,4CAA0C;AAC1C,wDAA2C;AAC3C,oDAAqC;AAErC,qDAA0E;AAC1E,mDAA+C;AAC/C,oEAAsD;AAEtD,yCAAqD;AACrD,qDAAuD;AACvD,2EAA2E;AAC3E,+CAA4E;AAC5E,6CAA4E;AAE5E,SAAsB,iBAAiB,CACrC,YAAoB,EACpB,aAAqB,EACrB,eAAuB,EACvB,cAAsB,EACtB,KAAa;;;QAEb,MAAM,CAAC,SAAS,EAAE,WAAW,CAAC,GAAG,IAAA,+BAAe,EAAC,gBAAoB,CAAC,CAAA;QAEtE,MAAM,IAAI,GAAmB;YAC3B,GAAG,EAAE,SAAS;YACd,SAAS,EAAE,IAAA,+BAAkB,GAAE;YAC/B,QAAQ,EAAE,SAAS;YACnB,KAAK,EAAE,SAAS;YAChB,OAAO,EAAE,WAAW;SACrB,CAAA;QAED,MAAM,MAAM,GAAG,IAAA,mBAAU,EAAC,KAAK,EAAE,IAAI,EAAE,oBAAK,EAAE,+BAAU,CAAC,CAAA;QAEzD,MAAM,eAAe,GAAG,MAAM,MAAM,CAAC,OAAO,CAC1C,kEAAkE,EAClE;YACE,KAAK,EAAE,eAAe;YACtB,IAAI,EAAE,cAAc;YACpB,MAAM,EAAE,aAAa;YACrB,IAAI,EAAE,YAAY;SACnB,CACF,CAAA;QAED,IAAI,eAAe,CAAC,MAAM,KAAK,GAAG,EAAE;YAClC,MAAM,IAAI,6BAAoB,CAC5B,qCAAqC,eAAe,CAAC,MAAM,KAAK,MAAA,eAAe,aAAf,eAAe,uBAAf,eAAe,CAAE,OAAO,0CAAG,qBAAqB,CAAC,GAAG,CACrH,CAAA;SACF;QAED,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;YAC/C,MAAM,IAAI,8BAAqB,CAC7B,gCAAgC,YAAY;;yIAEuF,CACpI,CAAA;SACF;QAED,IAAI,QAAQ,GAAG,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAA;QAChD,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YAC7C,QAAQ,GAAG,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAA;YACxE,IAAI,CAAC,KAAK,CACR,yEAAyE,QAAQ,CAAC,EAAE,GAAG,CACxF,CAAA;SACF;QAED,OAAO;YACL,QAAQ,EAAE;gBACR,IAAI,EAAE,QAAQ,CAAC,IAAI;gBACnB,EAAE,EAAE,QAAQ,CAAC,EAAE;gBACf,IAAI,EAAE,QAAQ,CAAC,aAAa;gBAC5B,SAAS,EAAE,QAAQ,CAAC,UAAU;oBAC5B,CAAC,CAAC,IAAI,IAAI,CAAC,QAAQ,CAAC,UAAU,CAAC;oBAC/B,CAAC,CAAC,SAAS;gBACb,MAAM,EAAE,QAAQ,CAAC,MAAM;aACxB;SACF,CAAA;;CACF;AA9DD,8CA8DC;AAED,SAAsB,mBAAmB,CACvC,YAAoB;;;QAEpB,MAAM,cAAc,GAAG,IAAA,mDAA2B,GAAE,CAAA;QAEpD,MAAM,EAAC,oBAAoB,EAAE,uBAAuB,EAAC,GACnD,IAAA,6BAAsB,GAAE,CAAA;QAE1B,MAAM,GAAG,GAAyB;YAChC,oBAAoB;YACpB,uBAAuB;YACvB,UAAU,EAAE,uBAAW,CAAC,MAAM,CAAC,EAAC,KAAK,EAAE,YAAY,EAAC,CAAC;SACtD,CAAA;QAED,MAAM,GAAG,GAAG,MAAM,cAAc,CAAC,aAAa,CAAC,GAAG,CAAC,CAAA;QAEnD,IAAI,GAAG,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;YAC9B,MAAM,IAAI,8BAAqB,CAC7B,gCAAgC,YAAY;;yIAEuF,CACpI,CAAA;SACF;QAED,IAAI,QAAQ,GAAG,GAAG,CAAC,SAAS,CAAC,CAAC,CAAC,CAAA;QAC/B,IAAI,GAAG,CAAC,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YAC5B,QAAQ,GAAG,GAAG,CAAC,SAAS,CAAC,IAAI,CAC3B,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,MAAM,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,MAAM,CAAC,CAAC,CAAC,UAAU,CAAC,CACtD,CAAC,CAAC,CAAC,CAAA;YAEJ,IAAI,CAAC,KAAK,CACR,yEAAyE,QAAQ,CAAC,UAAU,GAAG,CAChG,CAAA;SACF;QAED,OAAO;YACL,QAAQ,EAAE;gBACR,IAAI,EAAE,QAAQ,CAAC,IAAI;gBACnB,EAAE,EAAE,MAAM,CAAC,QAAQ,CAAC,UAAU,CAAC;gBAC/B,IAAI,EAAE,MAAM,CAAC,QAAQ,CAAC,IAAI,CAAC;gBAC3B,SAAS,EAAE,QAAQ,CAAC,SAAS;oBAC3B,CAAC,CAAC,qBAAS,CAAC,MAAM,CAAC,QAAQ,CAAC,SAAS,CAAC;oBACtC,CAAC,CAAC,SAAS;gBACb,MAAM,EAAE,MAAA,QAAQ,CAAC,MAAM,0CAAE,KAAK;aAC/B;SACF,CAAA;;CACF;AA9CD,kDA8CC"}

Some files were not shown because too many files have changed in this diff.