Compare commits

..

134 Commits

Author SHA1 Message Date
Michael B. Gale
4e828ff8d4 Merge pull request #2989 from github/update-v3.29.4-37264dc0b
Merge main into releases/v3
2025-07-23 14:15:56 +01:00
github-actions[bot]
b3114b8965 Update changelog for v3.29.4 2025-07-23 13:00:50 +00:00
Koen Vlaswinkel
37264dc0b3 Merge pull request #2988 from github/koesie10/disable-combine-single-file
Disable combining runs within a single file
2025-07-23 14:17:59 +02:00
Koen Vlaswinkel
5a29823d01 Merge remote-tracking branch 'origin/main' into koesie10/disable-combine-single-file 2025-07-23 14:03:16 +02:00
Michael B. Gale
5a2327a6fd Merge pull request #2987 from github/mbg/combine-sarif-error
Treat processing error for multiple runs with the same category as configuration error
2025-07-23 13:02:32 +01:00
Koen Vlaswinkel
287d421cf3 Disable combining runs within a single file 2025-07-23 13:51:13 +02:00
Michael B. Gale
43afe6ec0b Treat processing error for multiple runs with the same category as configuration error
This will result in it being reported as a user error rather than a failure
2025-07-23 12:48:44 +01:00
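For context, a minimal sketch of the error-classification pattern this commit describes: a duplicate-category condition found during SARIF processing is rethrown as a configuration-style error so it is reported as a user error rather than an action failure. The `ConfigurationError` class and the checking function below are illustrative stand-ins, not the action's actual code.

```typescript
// Sketch only: rethrow a duplicate-category condition as a configuration
// error so status reporting treats it as a user error, not a failure.
class ConfigurationError extends Error {}

function checkRunCategories(runCountByCategory: Map<string, number>): void {
  for (const [category, count] of runCountByCategory) {
    if (count > 1) {
      throw new ConfigurationError(
        `Multiple runs use the category '${category}'; ` +
          `give each analysis a unique category before uploading.`,
      );
    }
  }
}
```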
Michael B. Gale
8f2e63676d Merge pull request #2981 from github/dependabot/npm_and_yarn/npm-fe13dfda46
Bump the npm group with 5 updates
2025-07-23 09:29:24 +01:00
Michael B. Gale
76bf77db0b Merge pull request #2980 from github/dependabot/github_actions/actions-504b6cee34
Bump ruby/setup-ruby from 1.245.0 to 1.247.0 in the actions group
2025-07-22 18:24:17 +01:00
Michael B. Gale
9e7d13dd99 Merge pull request #2983 from github/koesie10/update-changelog-link
Update combining SARIF runs changelog post URL
2025-07-22 18:09:52 +01:00
Michael B. Gale
2b952be91d Update workflow template 2025-07-22 13:31:35 +01:00
Koen Vlaswinkel
48ce740f61 Update combining SARIF runs changelog post URL 2025-07-22 11:51:12 +02:00
github-actions[bot]
4749491b98 Update checked-in dependencies 2025-07-21 19:50:38 +00:00
dependabot[bot]
b7a5452764 Bump the npm group with 5 updates
Bumps the npm group with 5 updates:

| Package | From | To |
| --- | --- | --- |
| [@types/node-forge](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node-forge) | `1.3.12` | `1.3.13` |
| [@eslint/js](https://github.com/eslint/eslint/tree/HEAD/packages/js) | `9.30.1` | `9.31.0` |
| [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin) | `8.35.1` | `8.38.0` |
| [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser) | `8.35.1` | `8.38.0` |
| [nock](https://github.com/nock/nock) | `14.0.5` | `14.0.6` |


Updates `@types/node-forge` from 1.3.12 to 1.3.13
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node-forge)

Updates `@eslint/js` from 9.30.1 to 9.31.0
- [Release notes](https://github.com/eslint/eslint/releases)
- [Changelog](https://github.com/eslint/eslint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslint/commits/v9.31.0/packages/js)

Updates `@typescript-eslint/eslint-plugin` from 8.35.1 to 8.38.0
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.38.0/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.35.1 to 8.38.0
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.38.0/packages/parser)

Updates `nock` from 14.0.5 to 14.0.6
- [Release notes](https://github.com/nock/nock/releases)
- [Changelog](https://github.com/nock/nock/blob/main/CHANGELOG.md)
- [Commits](https://github.com/nock/nock/compare/v14.0.5...v14.0.6)

---
updated-dependencies:
- dependency-name: "@types/node-forge"
  dependency-version: 1.3.13
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@eslint/js"
  dependency-version: 9.31.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-version: 8.38.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-version: 8.38.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: nock
  dependency-version: 14.0.6
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-21 19:49:59 +00:00
dependabot[bot]
20477a3fe1 Bump ruby/setup-ruby from 1.245.0 to 1.247.0 in the actions group
Bumps the actions group with 1 update: [ruby/setup-ruby](https://github.com/ruby/setup-ruby).


Updates `ruby/setup-ruby` from 1.245.0 to 1.247.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](a4effe49ee...4727905401)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-version: 1.247.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-21 18:34:27 +00:00
Chuan-kai Lin
eefe1b5db9 Merge pull request #2975 from github/cklin/overlay-telemetry
Overlay: report telemetry
2025-07-21 06:23:15 -07:00
Koen Vlaswinkel
b6332872af Merge pull request #2979 from github/koesie10/v3.28.20-changelog
Add changelog entry for v3.28.20 backport
2025-07-21 14:56:14 +02:00
Koen Vlaswinkel
8e442bc480 Merge pull request #2978 from github/mergeback/v3.29.3-to-main-d6bbdef4
Mergeback v3.29.3 refs/heads/releases/v3 into main
2025-07-21 13:49:06 +02:00
Koen Vlaswinkel
a7cb1b8b39 Add changelog entry for v3.28.20 backport 2025-07-21 13:38:40 +02:00
github-actions[bot]
b195e1bfc6 Update checked-in dependencies 2025-07-21 11:35:49 +00:00
github-actions[bot]
df82387698 Update changelog and version after v3.29.3 2025-07-21 11:33:16 +00:00
Koen Vlaswinkel
d6bbdef45e Merge pull request #2977 from github/update-v3.29.3-7710ed11e
Merge main into releases/v3
2025-07-21 13:32:49 +02:00
github-actions[bot]
210cc9bfa2 Update changelog for v3.29.3 2025-07-21 09:29:13 +00:00
Chuan-kai Lin
39b0524b50 build: refresh js files 2025-07-18 07:45:45 -07:00
Chuan-kai Lin
c3bbcab41b Add downloadOverlayBaseDatabaseFromCache tests 2025-07-18 07:44:43 -07:00
Chuan-kai Lin
e37b293334 Overlay: report overlay-base database stats 2025-07-18 07:44:22 -07:00
Chuan-kai Lin
19075c4376 Overlay: report overlay analysis mode 2025-07-18 07:18:38 -07:00
Chuan-kai Lin
7710ed11e3 Merge pull request #2970 from github/cklin/diff-informed-feature-enable
Enable Feature.DiffInformedQueries
2025-07-17 08:21:08 -07:00
Chuan-kai Lin
6a49a8cbce build: refresh js files 2025-07-17 06:17:30 -07:00
Chuan-kai Lin
3aef4108d1 Add diff-informed-analysis-utils.test.ts 2025-07-17 06:14:37 -07:00
Chuan-kai Lin
614b64c6ec Diff-informed analysis: disable for GHES below 3.19 2025-07-17 06:10:14 -07:00
Chuan-kai Lin
aefb854fe5 Feature.DiffInformedQueries: default to true 2025-07-17 06:03:52 -07:00
Chuan-kai Lin
03a2a17e75 Merge pull request #2967 from github/cklin/overlay-feature-flags
Overlay: additional feature flags
2025-07-17 05:54:21 -07:00
Koen Vlaswinkel
07455ed3c3 Merge pull request #2972 from github/koesie10/ghes-satisfies
Ignore pre-release parts when comparing GHES versions
2025-07-17 10:35:33 +02:00
Chuan-kai Lin
3fb562ddcc build: refresh js files 2025-07-16 07:10:40 -07:00
Chuan-kai Lin
709cf22a66 Limit Code Scanning API to 25 features per request 2025-07-16 07:07:44 -07:00
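A rough illustration of the batching the commit subject describes, splitting the feature list into requests of at most 25; the `fetchFeatures` callback and its shape are assumptions, not the action's real API client.

```typescript
// Sketch: query features in batches of 25 to stay under the API limit.
async function fetchFeaturesInBatches(
  featureNames: string[],
  fetchFeatures: (batch: string[]) => Promise<Record<string, boolean>>,
): Promise<Record<string, boolean>> {
  const BATCH_SIZE = 25;
  const result: Record<string, boolean> = {};
  for (let i = 0; i < featureNames.length; i += BATCH_SIZE) {
    const batch = featureNames.slice(i, i + BATCH_SIZE);
    Object.assign(result, await fetchFeatures(batch));
  }
  return result;
}
```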
Chuan-kai Lin
3eaefb4deb Replicate "too many feature flags" error in test 2025-07-16 07:06:52 -07:00
Koen Vlaswinkel
e30db30685 Ignore pre-release parts when comparing GHES versions 2025-07-16 11:51:53 +02:00
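The idea in this commit, sketched with the semver package: drop pre-release/build parts of a GHES version before evaluating a range. The helper name and the example range are illustrative only.

```typescript
import * as semver from "semver";

// Illustrative helper: coerce() strips pre-release/build parts, so
// "3.19.0-rc1" and "3.19.0.pre1" both compare as "3.19.0".
function ghesVersionSatisfies(ghesVersion: string, range: string): boolean {
  const coerced = semver.coerce(ghesVersion);
  return coerced !== null && semver.satisfies(coerced, range);
}

// Example (range chosen arbitrarily for illustration):
// ghesVersionSatisfies("3.19.0-rc1", ">=3.19.0") === true
```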
Arthur Baars
0d17ea4843 Merge pull request #2963 from github/dependabot/npm_and_yarn/npm-d16eacb461
Bump the npm group across 1 directory with 7 updates
2025-07-15 14:45:25 +02:00
Arthur Baars
38fdaed818 npm run build 2025-07-15 07:33:26 +00:00
github-actions[bot]
37e3c3113a Update checked-in dependencies 2025-07-15 07:33:26 +00:00
Arthur Baars
15605b194f Make eslint happy 2025-07-15 07:31:22 +00:00
Arthur Baars
0b8d278f47 Run: npx update-browserslist-db@latest 2025-07-15 07:30:36 +00:00
Arthur Baars
ca53360d04 Fix tests 2025-07-15 07:25:49 +00:00
Arthur Baars
bbf184bd4c Update ava 2025-07-15 07:25:49 +00:00
dependabot[bot]
0c2ac60444 Bump the npm group across 1 directory with 7 updates
Bumps the npm group with 6 updates in the / directory:

| Package | From | To |
| --- | --- | --- |
| [@types/node-forge](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node-forge) | `1.3.11` | `1.3.12` |
| [@ava/typescript](https://github.com/avajs/typescript) | `4.1.0` | `6.0.0` |
| [@eslint/compat](https://github.com/eslint/rewrite/tree/HEAD/packages/compat) | `1.1.1` | `1.3.1` |
| [@eslint/js](https://github.com/eslint/eslint/tree/HEAD/packages/js) | `9.28.0` | `9.30.1` |
| [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin) | `8.33.1` | `8.35.1` |
| [sinon](https://github.com/sinonjs/sinon) | `20.0.0` | `21.0.0` |



Updates `@types/node-forge` from 1.3.11 to 1.3.12
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node-forge)

Updates `@ava/typescript` from 4.1.0 to 6.0.0
- [Release notes](https://github.com/avajs/typescript/releases)
- [Commits](https://github.com/avajs/typescript/compare/v4.1.0...v6.0.0)

Updates `@eslint/compat` from 1.1.1 to 1.3.1
- [Release notes](https://github.com/eslint/rewrite/releases)
- [Changelog](https://github.com/eslint/rewrite/blob/main/packages/compat/CHANGELOG.md)
- [Commits](https://github.com/eslint/rewrite/commits/compat-v1.3.1/packages/compat)

Updates `@eslint/js` from 9.28.0 to 9.30.1
- [Release notes](https://github.com/eslint/eslint/releases)
- [Changelog](https://github.com/eslint/eslint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslint/commits/v9.30.1/packages/js)

Updates `@typescript-eslint/eslint-plugin` from 8.33.1 to 8.35.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.35.1/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.33.1 to 8.35.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.35.1/packages/parser)

Updates `sinon` from 20.0.0 to 21.0.0
- [Release notes](https://github.com/sinonjs/sinon/releases)
- [Changelog](https://github.com/sinonjs/sinon/blob/main/docs/changelog.md)
- [Commits](https://github.com/sinonjs/sinon/commits)

---
updated-dependencies:
- dependency-name: "@types/node-forge"
  dependency-version: 1.3.12
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@ava/typescript"
  dependency-version: 6.0.0
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: npm
- dependency-name: "@eslint/compat"
  dependency-version: 1.3.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@eslint/js"
  dependency-version: 9.30.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-version: 8.35.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-version: 8.35.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: sinon
  dependency-version: 21.0.0
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-14 20:52:48 +00:00
Koen Vlaswinkel
6f936b5c2d Merge pull request #2969 from github/koesie10/fix-ghes-version-parsing
Fix parsing of GHES pre-release versions
2025-07-14 13:42:48 +02:00
Koen Vlaswinkel
c6a6c1490f Move comment to JSDoc 2025-07-14 13:18:38 +02:00
Michael B. Gale
4e20239e7b Merge pull request #2951 from github/update-supported-enterprise-server-versions
Update supported GitHub Enterprise Server versions
2025-07-14 10:39:53 +01:00
Koen Vlaswinkel
59d67fc4bf Fix parsing of GHES pre-release versions 2025-07-14 11:25:20 +02:00
Chuan-kai Lin
b37e7e2c5d Move initializeFeatures() to testing-utils
This change eliminates the need for setup-codeql.test to import from
feature-flags.test, an import which made the former run all tests defined
in the latter.
2025-07-11 09:54:40 -07:00
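A hypothetical sketch of the structure this commit moves towards: the helper lives in a module that defines no tests, so importing it does not drag another file's ava test definitions along.

```typescript
// testing-utils.ts — helpers only; no ava tests are registered here, so any
// test file can import from it safely.
export function initializeFeatures(initialValue: boolean): Record<string, boolean> {
  // Hypothetical body: enable/disable every known feature flag.
  return {
    overlay_analysis: initialValue,
    diff_informed_queries: initialValue,
  };
}

// setup-codeql.test.ts — imports the helper from testing-utils instead of
// from feature-flags.test, which would also have run that file's tests:
// import { initializeFeatures } from "./testing-utils";
```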
Chuan-kai Lin
90d7727554 Overlay: check code-scanning features 2025-07-10 14:16:19 -07:00
Chuan-kai Lin
fb771764cb Extract generateCodeScanningConfig() 2025-07-10 14:14:46 -07:00
Chuan-kai Lin
d799ff5e6a Overlay: check per-language features 2025-07-10 14:14:14 -07:00
Chuan-kai Lin
9f70a5fc86 Overlay: define language-specific features 2025-07-10 11:09:28 -07:00
Chuan-kai Lin
55cb6b8b94 Extract isOverlayAnalysisFeatureEnabled() 2025-07-10 10:48:43 -07:00
Chuan-kai Lin
4bdb7fe04f Overlay database mode tests: list features
Before we introduce additional features for controlling overlay analysis
enablement, change the unit tests to specify features directly instead
of through an isFeatureEnabled boolean field.
2025-07-10 10:46:32 -07:00
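As a hedged illustration of the test-shape change described above, each case might list the enabled features explicitly rather than flipping a single boolean; the field names and values below are assumptions.

```typescript
// Hypothetical test-case shape: features are listed per case.
interface OverlayModeTestCase {
  description: string;
  features: string[]; // names of enabled feature flags
  expectedMode: "overlay" | "overlay-base" | "none";
}

const testCases: OverlayModeTestCase[] = [
  {
    description: "overlay analysis feature enabled on a pull request",
    features: ["overlay_analysis"],
    expectedMode: "overlay",
  },
  {
    description: "no overlay-related features enabled",
    features: [],
    expectedMode: "none",
  },
];
```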
Chuan-kai Lin
64fce5856f Use exclude-from-incremental also for overlay analysis 2025-07-09 14:32:05 -07:00
Chuan-kai Lin
fe7205c739 Move getOverlayDatabaseMode() call into initConfig()
In an upcoming change, getOverlayDatabaseMode() will depend on the
contents of Config. As a result, getOverlayDatabaseMode() needs to be
called after the rest of Config has already been populated.

This commit performs that refactoring, moving the getOverlayDatabaseMode()
call into initConfig() after the rest of Config has been populated.
2025-07-09 14:32:05 -07:00
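A minimal sketch of the ordering constraint this commit describes: the rest of Config is populated first, and only then is getOverlayDatabaseMode() called, because it reads the populated Config. The types, helper names, and placeholder decision logic are simplified assumptions.

```typescript
interface Config {
  languages: string[];
  augmentationProperties: { overlayDatabaseMode?: string };
}

// Stand-in for the part of initConfig() that fills in everything else.
async function populateBaseConfig(): Promise<Config> {
  return { languages: ["javascript"], augmentationProperties: {} };
}

// Placeholder decision: the real logic also consults feature flags and the
// analysis context; what matters for this sketch is that it reads Config.
function getOverlayDatabaseMode(config: Config): string {
  return config.languages.length > 0 ? "overlay-base" : "none";
}

async function initConfig(): Promise<Config> {
  const config = await populateBaseConfig();
  // Must run after Config is populated, since it depends on its contents.
  config.augmentationProperties.overlayDatabaseMode = getOverlayDatabaseMode(config);
  return config;
}
```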
Chuan-kai Lin
4cd7a721f7 Remove loadConfig()
The loadConfig() function is mostly the same as getDefaultConfig(),
except that it calls loadUserConfig() and stores the results in
originalUserInput.

This refactoring commit replaces the loadConfig() call with
getDefaultConfig() and loadUserConfig(), which allows deleting a large
amount of duplicated code.
2025-07-09 14:32:05 -07:00
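A sketch of the shape this refactoring leaves behind, under assumed signatures: getDefaultConfig() plus a separate loadUserConfig() whose result is stored in originalUserInput, instead of a combined loadConfig().

```typescript
interface UserConfig { queries?: Array<{ uses: string }> }
interface Config { originalUserInput?: UserConfig }

async function getDefaultConfig(): Promise<Config> {
  return {};
}

async function loadUserConfig(configFile: string): Promise<UserConfig> {
  // ...read and parse the user's config file (omitted)...
  return {};
}

async function buildConfig(configFile: string): Promise<Config> {
  const config = await getDefaultConfig();
  config.originalUserInput = await loadUserConfig(configFile);
  return config;
}
```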
Chuan-kai Lin
f4358b38d1 Extract loadUserConfig() 2025-07-09 14:32:05 -07:00
Koen Vlaswinkel
f53ec7c550 Merge pull request #2961 from github/koesie10/disable-combine-sarif-files-ghes
Unconditionally disable combining SARIF files for GHES 3.18
2025-07-08 10:01:06 +02:00
Chuan-kai Lin
624d0bca90 Merge pull request #2945 from github/cklin/overlay-analysis
Basic support for overlay PR analysis
2025-07-07 08:41:24 -07:00
Chuan-kai Lin
ec836d6b8a build: refresh js files 2025-07-07 08:15:20 -07:00
Chuan-kai Lin
95a1b7e2bf Add getOverlayDatabaseMode() tests 2025-07-07 08:14:41 -07:00
Chuan-kai Lin
8c5122ea75 Add getPullRequestBranches() tests 2025-07-07 08:13:06 -07:00
Koen Vlaswinkel
aafbeb29bc Unconditionally disable combining SARIF files for GHES 3.18 2025-07-04 15:24:36 +02:00
Chuan-kai Lin
6a51e635a5 Add "overlay" to SARIF incrementalMode run property 2025-07-03 12:35:25 -07:00
Chuan-kai Lin
42835b3971 Override cleanup-level for overlay-base database 2025-07-03 12:35:25 -07:00
Chuan-kai Lin
2fc04c80cc Download overlay-base database from actions cache 2025-07-03 12:35:25 -07:00
Chuan-kai Lin
b95402dae1 Extract checkOverlayBaseDatabase() 2025-07-03 12:35:24 -07:00
Chuan-kai Lin
6ca06f41c4 Upload overlay-base database to actions cache 2025-07-03 12:35:24 -07:00
Chuan-kai Lin
d42ce71087 Add AugmentationProperties.useOverlayDatabaseCaching
This commit adds useOverlayDatabaseCaching to AugmentationProperties to
indicate whether the action should upload overlay-base databases to the
actions cache and to download a cached overlay-base database when
creating an overlay database.
2025-07-03 12:35:24 -07:00
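How such a flag might gate the two cache interactions mentioned in this message, as a rough sketch; the function names mirror identifiers that appear elsewhere in this comparison, but the wiring is an assumption.

```typescript
interface AugmentationProperties {
  overlayDatabaseMode: "overlay" | "overlay-base" | "none";
  useOverlayDatabaseCaching: boolean;
}

async function uploadOverlayBaseDatabaseToCache(): Promise<void> {
  /* e.g. save the overlay-base database with the actions cache */
}

async function downloadOverlayBaseDatabaseFromCache(): Promise<void> {
  /* e.g. restore a cached overlay-base database */
}

async function maybeUseOverlayCache(props: AugmentationProperties): Promise<void> {
  if (!props.useOverlayDatabaseCaching) {
    return; // caching disabled: neither upload nor download
  }
  if (props.overlayDatabaseMode === "overlay-base") {
    await uploadOverlayBaseDatabaseToCache();
  } else if (props.overlayDatabaseMode === "overlay") {
    await downloadOverlayBaseDatabaseFromCache();
  }
}
```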
Chuan-kai Lin
b4425372ef Limit OverlayAnalysis to internal repos 2025-07-03 12:35:24 -07:00
Chuan-kai Lin
93e8729640 getOverlayDatabaseMode: use Feature.OverlayAnalysis
This commit changes getOverlayDatabaseMode so that, when
Feature.OverlayAnalysis is enabled, it calculates the overlay database
mode automatically based on analysis metadata. If we are analyzing the
default branch, use OverlayBase, and if we are analyzing a PR, use
Overlay.

If CODEQL_OVERLAY_DATABASE_MODE is set to a valid overlay database mode,
that environment variable still takes precedence.
2025-07-03 12:35:24 -07:00
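Condensing the decision this message describes into a sketch: a valid CODEQL_OVERLAY_DATABASE_MODE environment variable wins; otherwise, with the feature enabled, the default branch gets OverlayBase and a PR gets Overlay. The enum string values and the helper's parameters are assumptions, not the action's exact signature.

```typescript
enum OverlayDatabaseMode {
  Overlay = "overlay",
  OverlayBase = "overlay-base",
  None = "none",
}

function getOverlayDatabaseMode(
  overlayAnalysisEnabled: boolean,
  analyzingDefaultBranch: boolean,
  analyzingPullRequest: boolean,
): OverlayDatabaseMode {
  // An explicitly set, valid environment variable takes precedence.
  const fromEnv = process.env.CODEQL_OVERLAY_DATABASE_MODE;
  if (
    fromEnv === OverlayDatabaseMode.Overlay ||
    fromEnv === OverlayDatabaseMode.OverlayBase ||
    fromEnv === OverlayDatabaseMode.None
  ) {
    return fromEnv;
  }
  if (overlayAnalysisEnabled) {
    if (analyzingDefaultBranch) return OverlayDatabaseMode.OverlayBase;
    if (analyzingPullRequest) return OverlayDatabaseMode.Overlay;
  }
  return OverlayDatabaseMode.None;
}
```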
Chuan-kai Lin
da758dc0cd Add Feature.OverlayAnalysis 2025-07-03 12:35:24 -07:00
Chuan-kai Lin
60a2a7d623 Add isAnalyzingPullRequest() 2025-07-03 12:35:24 -07:00
Chuan-kai Lin
a336faa497 databaseInitCluster: use overlayDatabaseMode from config
This commit changes databaseInitCluster() to use overlayDatabaseMode
from AugmentationProperties instead of the overlayDatabaseMode
parameter. There is no behavior change because both overlayDatabaseMode
values are computed the same way.

The commit then cleans up the overlayDatabaseMode parameter and the code
paths that feed into it.
2025-07-03 12:35:24 -07:00
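In short, and with assumed surrounding types, the cleanup described above means the initialization code reads the mode off the config object instead of taking it as its own parameter:

```typescript
interface Config {
  augmentationProperties: { overlayDatabaseMode: string };
}

// Before (roughly): databaseInitCluster(config, overlayDatabaseMode, ...)
// After: the mode comes from the config itself.
async function databaseInitCluster(config: Config): Promise<void> {
  const overlayDatabaseMode = config.augmentationProperties.overlayDatabaseMode;
  console.log(`Initializing CodeQL databases in ${overlayDatabaseMode} mode`);
}
```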
Chuan-kai Lin
ee8a8c4e0b config-utils: populate getOverlayDatabaseMode()
This commit populates getOverlayDatabaseMode() in config-utils with the
same code from getOverlayDatabaseMode() in init.
2025-07-03 12:35:24 -07:00
Chuan-kai Lin
9022c7382c Add AugmentationProperties.overlayDatabaseMode
This commit adds overlayDatabaseMode to AugmentationProperties and
creates a placeholder getOverlayDatabaseMode() function, with the
necessary inputs, to populate it.
2025-07-03 12:35:24 -07:00
Michael B. Gale
b69421388d Merge pull request #2956 from github/mbg/start-proxy/validation-improvements
Improve JSON validation in `start-proxy` action
2025-07-03 12:23:56 +01:00
Koen Vlaswinkel
33f84897c3 Merge pull request #2959 from github/koesie10/remove-combine-runs
Remove support for combining SARIF runs with non-unique categories
2025-07-02 14:34:01 +02:00
Koen Vlaswinkel
612df8d91c Remove support for combining SARIF runs with non-unique categories 2025-07-01 15:20:28 +02:00
Michael B. Gale
dcc1a6637b Merge pull request #2958 from github/mergeback/v3.29.2-to-main-181d5eef
Mergeback v3.29.2 refs/heads/releases/v3 into main
2025-06-30 14:15:46 +01:00
github-actions[bot]
144d3b8f62 Update checked-in dependencies 2025-06-30 13:02:41 +00:00
github-actions[bot]
6881d2cdc1 Update changelog and version after v3.29.2 2025-06-30 13:01:12 +00:00
Michael B. Gale
181d5eefc2 Merge pull request #2957 from github/update-v3.29.2-4c57370d0
Merge main into releases/v3
2025-06-30 14:00:45 +01:00
Michael B. Gale
c77386a9db Fix changelog PR number 2025-06-30 13:48:01 +01:00
github-actions[bot]
8d43d4ecec Update changelog for v3.29.2 2025-06-30 12:44:54 +00:00
Michael B. Gale
9281048a40 Include goproxy_server in configuration filtering tests 2025-06-27 14:32:16 +01:00
Michael B. Gale
6b83dc33ed Check for null in addition to undefined; extend tests accordingly 2025-06-27 14:32:16 +01:00
Michael B. Gale
ca0540d370 Check that individual proxy configurations are objects 2025-06-27 14:32:16 +01:00
Michael B. Gale
e9938e34d5 Check that proxy configurations are an array 2025-06-27 14:32:15 +01:00
Michael B. Gale
4c57370d03 Merge pull request #2935 from github/mbg/interpret-cq-results
Produce separate SARIF file for `quality-queries` alerts
2025-06-27 14:03:38 +01:00
Michael B. Gale
2830b750e5 Add changelog entry 2025-06-27 13:49:45 +01:00
Michael B. Gale
aa72ddaead Merge branch 'main' into mbg/interpret-cq-results 2025-06-27 13:45:51 +01:00
Michael B. Gale
65d1e45f0b Rename SARIF_UPLOAD_ENDPOINT members 2025-06-27 13:45:14 +01:00
Michael B. Gale
362ebf85da Check both SARIF files in quality-queries.yml test 2025-06-27 12:32:56 +01:00
Michael B. Gale
10a3e4b17d Fix formatting 2025-06-27 12:32:56 +01:00
Arthur Baars
8593ea65e2 Merge pull request #2954 from github/mergeback/v3.29.1-to-main-39edc492
Mergeback v3.29.1 refs/heads/releases/v3 into main
2025-06-27 13:11:54 +02:00
Michael B. Gale
3e95091e3b Add test workflow for upload-sarif with quality results 2025-06-27 12:11:12 +01:00
Michael B. Gale
7b3d150883 Use findSarifFilesInDir in upload-sarif to avoid error when there are no quality.sarif files 2025-06-27 12:08:40 +01:00
github-actions[bot]
2e3a72539c Update checked-in dependencies 2025-06-27 10:52:35 +00:00
github-actions[bot]
baf20c9b52 Update changelog and version after v3.29.1 2025-06-27 10:44:54 +00:00
Arthur Baars
39edc492db Merge pull request #2953 from github/update-v3.29.1-428aea55f
Merge main into releases/v3
2025-06-27 12:44:25 +02:00
github-actions[bot]
27c4fb1eef Update changelog for v3.29.1 2025-06-27 10:15:45 +00:00
Mads Navntoft
428aea55f5 Merge pull request #2952 from github/redsun82/fix-swift-test
Swift: recreate a default Swift package to fix test
2025-06-27 07:27:03 +02:00
Paolo Tranquilli
973250f3d2 Swift: recreate a default Swift package to fix test 2025-06-26 17:41:45 +02:00
Michael B. Gale
ad6046ff97 Avoid default arguments with historical values 2025-06-26 13:51:08 +01:00
Michael B. Gale
9ec0bb9605 Fix incorrect getSarifFilePaths call in upload-sarif action 2025-06-26 12:22:08 +01:00
Arthur Baars
8ef17824cf Merge pull request #2950 from github/update-bundle/codeql-bundle-v2.22.1
Update default bundle to 2.22.1
2025-06-26 12:53:13 +02:00
Michael B. Gale
08955dbc0d Move .sarif predicates into UploadTarget instances and rename 2025-06-26 11:43:36 +01:00
Michael B. Gale
71dd63398f Rename SARIF_UPLOAD_TARGET 2025-06-26 11:38:45 +01:00
Michael B. Gale
27db6cb5d6 Document queries parameter for databaseRunQueries 2025-06-26 11:37:10 +01:00
Michael B. Gale
768fc170da Rename resolveQuerySuiteAlias parameter 2025-06-26 11:32:48 +01:00
Michael B. Gale
79049d92c6 Fix config-queries.qls location 2025-06-25 14:42:24 +01:00
Michael B. Gale
e382508853 Prototyping adding quality queries when running queries 2025-06-25 14:24:34 +01:00
Michael B. Gale
2c76207fa4 Upload .quality.sarif files to CQ service in upload-sarif action 2025-06-25 13:43:39 +01:00
github-actions[bot]
83de9b082b Update supported GitHub Enterprise Server versions 2025-06-25 00:17:41 +00:00
github-actions[bot]
f3bfb98603 Add changelog note 2025-06-24 14:13:14 +00:00
github-actions[bot]
2b4afc20b6 Update default bundle to codeql-bundle-v2.22.1 2025-06-24 14:13:10 +00:00
Michael B. Gale
86f47e8b74 Add some more comments 2025-06-24 13:59:46 +01:00
Michael B. Gale
9b9286a835 Add test for resolveQuerySuiteAlias 2025-06-24 13:42:52 +01:00
Michael B. Gale
af32bc6d6f Add test for modified validateUniqueCategory 2025-06-24 13:26:34 +01:00
Michael B. Gale
51891595a7 Add test for modified findSarifFilesInDir 2025-06-24 13:24:04 +01:00
Michael B. Gale
f7fbaa019f Support all default query suites and resolve them 2025-06-24 13:08:56 +01:00
Michael B. Gale
6abacdb184 Fix getSarifFilePaths not using right filter 2025-06-23 18:19:43 +01:00
Michael B. Gale
f1834221f2 Allow the same category once for each type of upload 2025-06-23 18:19:43 +01:00
Michael B. Gale
45b3bec064 Upload quality SARIFs to CQ endpoint 2025-06-23 18:19:42 +01:00
Michael B. Gale
22444a650f Add ability to use different filters in findSarifFilesInDir 2025-06-23 18:19:42 +01:00
Michael B. Gale
320f7b0fd6 Resolve code-quality alias 2025-06-23 18:19:42 +01:00
Michael B. Gale
3a7544ea8f Check SARIF with quality results for expected configuration 2025-06-23 18:19:42 +01:00
Michael B. Gale
aba8788d12 Upload both SARIF files in quality-queries check 2025-06-23 18:19:42 +01:00
Michael B. Gale
3963bf423a Interpret results for quality queries and store as separate SARIF file 2025-06-23 18:19:40 +01:00
2362 changed files with 85437 additions and 22956 deletions


@@ -64,37 +64,54 @@ jobs:
with:
output: ${{ runner.temp }}/results
upload-database: false
- name: Upload SARIF
- name: Upload security SARIF
uses: actions/upload-artifact@v4
with:
name: config-export-${{ matrix.os }}-${{ matrix.version }}.sarif.json
name: quality-queries-${{ matrix.os }}-${{ matrix.version }}.sarif.json
path: ${{ runner.temp }}/results/javascript.sarif
retention-days: 7
- name: Check config properties appear in SARIF
- name: Upload quality SARIF
uses: actions/upload-artifact@v4
with:
name: quality-queries-${{ matrix.os }}-${{ matrix.version }}.quality.sarif.json
path: ${{ runner.temp }}/results/javascript.quality.sarif
retention-days: 7
- name: Check quality query does not appear in security SARIF
uses: actions/github-script@v7
env:
SARIF_PATH: ${{ runner.temp }}/results/javascript.sarif
EXPECT_PRESENT: 'false'
with:
script: |
const fs = require('fs');
const sarif = JSON.parse(fs.readFileSync(process.env['SARIF_PATH'], 'utf8'));
const run = sarif.runs[0];
const configSummary = run.properties.codeqlConfigSummary;
if (configSummary === undefined) {
core.setFailed('`codeqlConfigSummary` property not found in the SARIF run property bag.');
}
if (configSummary.disableDefaultQueries !== false) {
core.setFailed('`disableDefaultQueries` property incorrect: expected false, got ' +
`${JSON.stringify(configSummary.disableDefaultQueries)}.`);
}
const expectedQueries = [{ type: 'builtinSuite', uses: 'code-quality' }];
// Use JSON.stringify to deep-equal the arrays.
if (JSON.stringify(configSummary.queries) !== JSON.stringify(expectedQueries)) {
core.setFailed(`\`queries\` property incorrect: expected ${JSON.stringify(expectedQueries)}, got ` +
`${JSON.stringify(configSummary.queries)}.`);
}
core.info('Finished config export tests.');
script: ${{ env.CHECK_SCRIPT }}
- name: Check quality query appears in quality SARIF
uses: actions/github-script@v7
env:
SARIF_PATH: ${{ runner.temp }}/results/javascript.quality.sarif
EXPECT_PRESENT: 'true'
with:
script: ${{ env.CHECK_SCRIPT }}
env:
CHECK_SCRIPT: |
const fs = require('fs');
const sarif = JSON.parse(fs.readFileSync(process.env['SARIF_PATH'], 'utf8'));
const expectPresent = JSON.parse(process.env['EXPECT_PRESENT']);
const run = sarif.runs[0];
const extensions = run.tool.extensions;
if (extensions === undefined) {
core.setFailed('`extensions` property not found in the SARIF run property bag.');
}
// ID of a query we want to check the presence for
const targetId = 'js/regex/always-matches';
const found = extensions.find(extension => extension.rules && extension.rules.find(rule => rule.id === targetId));
if (found && expectPresent) {
console.log(`Found rule with id '${targetId}'.`);
} else if (!found && !expectPresent) {
console.log(`Rule with id '${targetId}' was not found.`);
} else {
core.setFailed(`${ found ? "Found" : "Didn't find" } rule ${targetId}`);
}
CODEQL_ACTION_TEST_MODE: true


@@ -46,7 +46,7 @@ jobs:
use-all-platform-bundle: 'false'
setup-kotlin: 'true'
- name: Set up Ruby
uses: ruby/setup-ruby@a4effe49ee8ee5b8b5091268c473a4628afb5651 # v1.245.0
uses: ruby/setup-ruby@472790540115ce5bd69d399a020189a8c87d641f # v1.247.0
with:
ruby-version: 2.6
- name: Install Code Scanning integration

78 .github/workflows/__upload-quality-sarif.yml generated vendored Normal file

@@ -0,0 +1,78 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# (cd pr-checks; pip install ruamel.yaml@0.17.31 && python3 sync.py)
# to regenerate this file.
name: 'PR Check - Upload-sarif: code quality endpoint'
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GO111MODULE: auto
on:
push:
branches:
- main
- releases/v*
pull_request:
types:
- opened
- synchronize
- reopened
- ready_for_review
schedule:
- cron: '0 5 * * *'
workflow_dispatch: {}
jobs:
upload-quality-sarif:
strategy:
fail-fast: false
matrix:
include:
- os: ubuntu-latest
version: default
- os: macos-latest
version: default
- os: windows-latest
version: default
name: 'Upload-sarif: code quality endpoint'
permissions:
contents: read
security-events: read
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
- name: Check out repository
uses: actions/checkout@v4
- name: Prepare test
id: prepare-test
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
use-all-platform-bundle: 'false'
setup-kotlin: 'true'
- name: Install Go
uses: actions/setup-go@v5
with:
go-version: '>=1.21.0'
cache: false
- uses: ./../action/init
with:
tools: ${{ steps.prepare-test.outputs.tools-url }}
languages: cpp,csharp,java,javascript,python
config-file: ${{ github.repository }}/tests/multi-language-repo/.github/codeql/custom-queries.yml@${{
github.sha }}
quality-queries: code-quality
- name: Build code
shell: bash
run: ./build.sh
# Generate some SARIF we can upload with the upload-sarif step
- uses: ./../action/analyze
with:
ref: refs/heads/main
sha: 5e235361806c361d4d3f8859e3c897658025a9a2
upload: never
- uses: ./../action/upload-sarif
with:
ref: refs/heads/main
sha: 5e235361806c361d4d3f8859e3c897658025a9a2
env:
CODEQL_ACTION_TEST_MODE: true


@@ -2,15 +2,32 @@
See the [releases page](https://github.com/github/codeql-action/releases) for the relevant changes to the CodeQL CLI and language packs.
## [UNRELEASED]
## 3.29.4 - 23 Jul 2025
No user facing changes.
## 3.29.3 - 21 Jul 2025
No user facing changes.
## 3.29.2 - 30 Jun 2025
- Experimental: When the `quality-queries` input for the `init` action is provided with an argument, separate `.quality.sarif` files are produced and uploaded for each language with the results of the specified queries. Do not use this in production as it is part of an internal experiment and subject to change at any time. [#2935](https://github.com/github/codeql-action/pull/2935)
## 3.29.1 - 27 Jun 2025
- Fix bug in PR analysis where user-provided `include` query filter fails to exclude non-included queries. [#2938](https://github.com/github/codeql-action/pull/2938)
- Update default CodeQL bundle version to 2.22.1. [#2950](https://github.com/github/codeql-action/pull/2950)
## 3.29.0 - 11 Jun 2025
- Update default CodeQL bundle version to 2.22.0. [#2925](https://github.com/github/codeql-action/pull/2925)
- Bump minimum CodeQL bundle version to 2.16.6. [#2912](https://github.com/github/codeql-action/pull/2912)
## 3.28.20 - 21 July 2025
- Remove support for combining SARIF files from a single upload for GHES 3.18, see [the changelog post](https://github.blog/changelog/2024-05-06-code-scanning-will-stop-combining-runs-from-a-single-upload/). [#2959](https://github.com/github/codeql-action/pull/2959)
## 3.28.19 - 03 Jun 2025
- The CodeQL Action no longer includes its own copy of the extractor for the `actions` language, which is currently in public preview.

40 lib/actions-util.js generated

@@ -49,10 +49,13 @@ exports.isDefaultSetup = isDefaultSetup;
exports.prettyPrintInvocation = prettyPrintInvocation;
exports.ensureEndsInPeriod = ensureEndsInPeriod;
exports.runTool = runTool;
exports.getPullRequestBranches = getPullRequestBranches;
exports.isAnalyzingPullRequest = isAnalyzingPullRequest;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const core = __importStar(require("@actions/core"));
const toolrunner = __importStar(require("@actions/exec/lib/toolrunner"));
const github = __importStar(require("@actions/github"));
const io = __importStar(require("@actions/io"));
const util_1 = require("./util");
// eslint-disable-next-line import/no-commonjs, @typescript-eslint/no-require-imports
@@ -352,4 +355,41 @@ const restoreInputs = function () {
}
};
exports.restoreInputs = restoreInputs;
/**
* Returns the base and head branches of the pull request being analyzed.
*
* @returns the base and head branches of the pull request, or undefined if
* we are not analyzing a pull request.
*/
function getPullRequestBranches() {
const pullRequest = github.context.payload.pull_request;
if (pullRequest) {
return {
base: pullRequest.base.ref,
// We use the head label instead of the head ref here, because the head
// ref lacks owner information and by itself does not uniquely identify
// the head branch (which may be in a forked repository).
head: pullRequest.head.label,
};
}
// PR analysis under Default Setup does not have the pull_request context,
// but it should set CODE_SCANNING_REF and CODE_SCANNING_BASE_BRANCH.
const codeScanningRef = process.env.CODE_SCANNING_REF;
const codeScanningBaseBranch = process.env.CODE_SCANNING_BASE_BRANCH;
if (codeScanningRef && codeScanningBaseBranch) {
return {
base: codeScanningBaseBranch,
// PR analysis under Default Setup analyzes the PR head commit instead of
// the merge commit, so we can use the provided ref directly.
head: codeScanningRef,
};
}
return undefined;
}
/**
* Returns whether we are analyzing a pull request.
*/
function isAnalyzingPullRequest() {
return getPullRequestBranches() !== undefined;
}
//# sourceMappingURL=actions-util.js.map

File diff suppressed because one or more lines are too long

136 lib/actions-util.test.js generated

@@ -1,14 +1,78 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
var ownKeys = function(o) {
ownKeys = Object.getOwnPropertyNames || function (o) {
var ar = [];
for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
return ar;
};
return ownKeys(o);
};
return function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
__setModuleDefault(result, mod);
return result;
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
const github = __importStar(require("@actions/github"));
const ava_1 = __importDefault(require("ava"));
const actions_util_1 = require("./actions-util");
const api_client_1 = require("./api-client");
const environment_1 = require("./environment");
const testing_utils_1 = require("./testing-utils");
const util_1 = require("./util");
(0, testing_utils_1.setupTests)(ava_1.default);
function withMockedContext(mockPayload, testFn) {
const originalPayload = github.context.payload;
github.context.payload = mockPayload;
try {
return testFn();
}
finally {
github.context.payload = originalPayload;
}
}
function withMockedEnv(envVars, testFn) {
const originalEnv = { ...process.env };
// Apply environment changes
for (const [key, value] of Object.entries(envVars)) {
if (value === undefined) {
delete process.env[key];
}
else {
process.env[key] = value;
}
}
try {
return testFn();
}
finally {
// Restore original environment
process.env = originalEnv;
}
}
(0, ava_1.default)("computeAutomationID()", async (t) => {
let actualAutomationID = (0, api_client_1.computeAutomationID)(".github/workflows/codeql-analysis.yml:analyze", '{"language": "javascript", "os": "linux"}');
t.deepEqual(actualAutomationID, ".github/workflows/codeql-analysis.yml:analyze/language:javascript/os:linux/");
@@ -25,6 +89,78 @@ const util_1 = require("./util");
actualAutomationID = (0, api_client_1.computeAutomationID)(".github/workflows/codeql-analysis.yml:analyze", undefined);
t.deepEqual(actualAutomationID, ".github/workflows/codeql-analysis.yml:analyze/");
});
(0, ava_1.default)("getPullRequestBranches() with pull request context", (t) => {
withMockedContext({
pull_request: {
number: 123,
base: { ref: "main" },
head: { label: "user:feature-branch" },
},
}, () => {
t.deepEqual((0, actions_util_1.getPullRequestBranches)(), {
base: "main",
head: "user:feature-branch",
});
t.is((0, actions_util_1.isAnalyzingPullRequest)(), true);
});
});
(0, ava_1.default)("getPullRequestBranches() returns undefined with push context", (t) => {
withMockedContext({
push: {
ref: "refs/heads/main",
},
}, () => {
t.is((0, actions_util_1.getPullRequestBranches)(), undefined);
t.is((0, actions_util_1.isAnalyzingPullRequest)(), false);
});
});
(0, ava_1.default)("getPullRequestBranches() with Default Setup environment variables", (t) => {
withMockedContext({}, () => {
withMockedEnv({
CODE_SCANNING_REF: "refs/heads/feature-branch",
CODE_SCANNING_BASE_BRANCH: "main",
}, () => {
t.deepEqual((0, actions_util_1.getPullRequestBranches)(), {
base: "main",
head: "refs/heads/feature-branch",
});
t.is((0, actions_util_1.isAnalyzingPullRequest)(), true);
});
});
});
(0, ava_1.default)("getPullRequestBranches() returns undefined when only CODE_SCANNING_REF is set", (t) => {
withMockedContext({}, () => {
withMockedEnv({
CODE_SCANNING_REF: "refs/heads/feature-branch",
CODE_SCANNING_BASE_BRANCH: undefined,
}, () => {
t.is((0, actions_util_1.getPullRequestBranches)(), undefined);
t.is((0, actions_util_1.isAnalyzingPullRequest)(), false);
});
});
});
(0, ava_1.default)("getPullRequestBranches() returns undefined when only CODE_SCANNING_BASE_BRANCH is set", (t) => {
withMockedContext({}, () => {
withMockedEnv({
CODE_SCANNING_REF: undefined,
CODE_SCANNING_BASE_BRANCH: "main",
}, () => {
t.is((0, actions_util_1.getPullRequestBranches)(), undefined);
t.is((0, actions_util_1.isAnalyzingPullRequest)(), false);
});
});
});
(0, ava_1.default)("getPullRequestBranches() returns undefined when no PR context", (t) => {
withMockedContext({}, () => {
withMockedEnv({
CODE_SCANNING_REF: undefined,
CODE_SCANNING_BASE_BRANCH: undefined,
}, () => {
t.is((0, actions_util_1.getPullRequestBranches)(), undefined);
t.is((0, actions_util_1.isAnalyzingPullRequest)(), false);
});
});
});
(0, ava_1.default)("initializeEnvironment", (t) => {
(0, util_1.initializeEnvironment)("1.2.3");
t.deepEqual(process.env[environment_1.EnvVar.VERSION], "1.2.3");


@@ -1 +1 @@
{"version":3,"file":"actions-util.test.js","sourceRoot":"","sources":["../src/actions-util.test.ts"],"names":[],"mappings":";;;;;AAAA,8CAAuB;AAEvB,6CAAmD;AACnD,+CAAuC;AACvC,mDAA6C;AAC7C,iCAA+C;AAE/C,IAAA,0BAAU,EAAC,aAAI,CAAC,CAAC;AAEjB,IAAA,aAAI,EAAC,uBAAuB,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACxC,IAAI,kBAAkB,GAAG,IAAA,gCAAmB,EAC1C,+CAA+C,EAC/C,2CAA2C,CAC5C,CAAC;IACF,CAAC,CAAC,SAAS,CACT,kBAAkB,EAClB,6EAA6E,CAC9E,CAAC;IAEF,gCAAgC;IAChC,kBAAkB,GAAG,IAAA,gCAAmB,EACtC,+CAA+C,EAC/C,2CAA2C,CAC5C,CAAC;IACF,CAAC,CAAC,SAAS,CACT,kBAAkB,EAClB,6EAA6E,CAC9E,CAAC;IAEF,6DAA6D;IAC7D,kBAAkB,GAAG,IAAA,gCAAmB,EACtC,+CAA+C,EAC/C,IAAI,CACL,CAAC;IACF,CAAC,CAAC,SAAS,CACT,kBAAkB,EAClB,gDAAgD,CACjD,CAAC;IAEF,sCAAsC;IACtC,kBAAkB,GAAG,IAAA,gCAAmB,EACtC,+CAA+C,EAC/C,qDAAqD,CACtD,CAAC;IACF,CAAC,CAAC,SAAS,CACT,kBAAkB,EAClB,gEAAgE,CACjE,CAAC;IAEF,8BAA8B;IAC9B,kBAAkB,GAAG,IAAA,gCAAmB,EACtC,+CAA+C,EAC/C,SAAS,CACV,CAAC;IACF,CAAC,CAAC,SAAS,CACT,kBAAkB,EAClB,gDAAgD,CACjD,CAAC;AACJ,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,uBAAuB,EAAE,CAAC,CAAC,EAAE,EAAE;IAClC,IAAA,4BAAqB,EAAC,OAAO,CAAC,CAAC;IAC/B,CAAC,CAAC,SAAS,CAAC,OAAO,CAAC,GAAG,CAAC,oBAAM,CAAC,OAAO,CAAC,EAAE,OAAO,CAAC,CAAC;AACpD,CAAC,CAAC,CAAC"}
{"version":3,"file":"actions-util.test.js","sourceRoot":"","sources":["../src/actions-util.test.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,wDAA0C;AAC1C,8CAAuB;AAEvB,iDAAgF;AAChF,6CAAmD;AACnD,+CAAuC;AACvC,mDAA6C;AAC7C,iCAA+C;AAE/C,IAAA,0BAAU,EAAC,aAAI,CAAC,CAAC;AAEjB,SAAS,iBAAiB,CAAI,WAAgB,EAAE,MAAe;IAC7D,MAAM,eAAe,GAAG,MAAM,CAAC,OAAO,CAAC,OAAO,CAAC;IAC/C,MAAM,CAAC,OAAO,CAAC,OAAO,GAAG,WAAW,CAAC;IACrC,IAAI,CAAC;QACH,OAAO,MAAM,EAAE,CAAC;IAClB,CAAC;YAAS,CAAC;QACT,MAAM,CAAC,OAAO,CAAC,OAAO,GAAG,eAAe,CAAC;IAC3C,CAAC;AACH,CAAC;AAED,SAAS,aAAa,CACpB,OAA2C,EAC3C,MAAe;IAEf,MAAM,WAAW,GAAG,EAAE,GAAG,OAAO,CAAC,GAAG,EAAE,CAAC;IAEvC,4BAA4B;IAC5B,KAAK,MAAM,CAAC,GAAG,EAAE,KAAK,CAAC,IAAI,MAAM,CAAC,OAAO,CAAC,OAAO,CAAC,EAAE,CAAC;QACnD,IAAI,KAAK,KAAK,SAAS,EAAE,CAAC;YACxB,OAAO,OAAO,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC;QAC1B,CAAC;aAAM,CAAC;YACN,OAAO,CAAC,GAAG,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC;QAC3B,CAAC;IACH,CAAC;IAED,IAAI,CAAC;QACH,OAAO,MAAM,EAAE,CAAC;IAClB,CAAC;YAAS,CAAC;QACT,+BAA+B;QAC/B,OAAO,CAAC,GAAG,GAAG,WAAW,CAAC;IAC5B,CAAC;AACH,CAAC;AAED,IAAA,aAAI,EAAC,uBAAuB,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACxC,IAAI,kBAAkB,GAAG,IAAA,gCAAmB,EAC1C,+CAA+C,EAC/C,2CAA2C,CAC5C,CAAC;IACF,CAAC,CAAC,SAAS,CACT,kBAAkB,EAClB,6EAA6E,CAC9E,CAAC;IAEF,gCAAgC;IAChC,kBAAkB,GAAG,IAAA,gCAAmB,EACtC,+CAA+C,EAC/C,2CAA2C,CAC5C,CAAC;IACF,CAAC,CAAC,SAAS,CACT,kBAAkB,EAClB,6EAA6E,CAC9E,CAAC;IAEF,6DAA6D;IAC7D,kBAAkB,GAAG,IAAA,gCAAmB,EACtC,+CAA+C,EAC/C,IAAI,CACL,CAAC;IACF,CAAC,CAAC,SAAS,CACT,kBAAkB,EAClB,gDAAgD,CACjD,CAAC;IAEF,sCAAsC;IACtC,kBAAkB,GAAG,IAAA,gCAAmB,EACtC,+CAA+C,EAC/C,qDAAqD,CACtD,CAAC;IACF,CAAC,CAAC,SAAS,CACT,kBAAkB,EAClB,gEAAgE,CACjE,CAAC;IAEF,8BAA8B;IAC9B,kBAAkB,GAAG,IAAA,gCAAmB,EACtC,+CAA+C,EAC/C,SAAS,CACV,CAAC;IACF,CAAC,CAAC,SAAS,CACT,kBAAkB,EAClB,gDAAgD,CACjD,CAAC;AACJ,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,oDAAoD,EAAE,CAAC,CAAC,EAAE,EAAE;IAC/D,iBAAiB,CACf;QACE,YAAY,EAAE;YACZ,MAAM,EAAE,GAAG;YACX,IAAI,EAAE,EAAE,GAAG,EAAE,MAAM,EAAE;YACrB,IAAI,EAAE,EAAE,KAAK,EAAE,qBAAqB,EAAE;SACvC;KACF,EACD,GAAG,EAAE;QACH,CAAC,CAAC,SAAS,CAAC,IAAA,qCAAsB,GAAE,EAAE;YACpC,IAAI,EAAE,MAAM;YACZ,IAAI,EAAE,qBAAqB;SAC5B,CAAC,CAAC;QACH,CAAC,CAAC,EAAE,CAAC,IAAA,qCAAsB,GAAE,EAAE,IAAI,CAAC,CAAC;IACvC,CAAC,CACF,CAAC;AACJ,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,8DAA8D,EAAE,CAAC,CAAC,EAAE,EAAE;IACzE,iBAAiB,CACf;QACE,IAAI,EAAE;YACJ,GAAG,EAAE,iBAAiB;SACvB;KACF,EACD,GAAG,EAAE;QACH,CAAC,CAAC,EAAE,CAAC,IAAA,qCAAsB,GAAE,EAAE,SAAS,CAAC,CAAC;QAC1C,CAAC,CAAC,EAAE,CAAC,IAAA,qCAAsB,GAAE,EAAE,KAAK,CAAC,CAAC;IACxC,CAAC,CACF,CAAC;AACJ,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,mEAAmE,EAAE,CAAC,CAAC,EAAE,EAAE;IAC9E,iBAAiB,CAAC,EAAE,EAAE,GAAG,EAAE;QACzB,aAAa,CACX;YACE,iBAAiB,EAAE,2BAA2B;YAC9C,yBAAyB,EAAE,MAAM;SAClC,EACD,GAAG,EAAE;YACH,CAAC,CAAC,SAAS,CAAC,IAAA,qCAAsB,GAAE,EAAE;gBACpC,IAAI,EAAE,MAAM;gBACZ,IAAI,EAAE,2BAA2B;aAClC,CAAC,CAAC;YACH,CAAC,CAAC,EAAE,CAAC,IAAA,qCAAsB,GAAE,EAAE,IAAI,CAAC,CAAC;QACvC,CAAC,CACF,CAAC;IACJ,CAAC,CAAC,CAAC;AACL,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,+EAA+E,EAAE,CAAC,CAAC,EAAE,EAAE;IAC1F,iBAAiB,CAAC,EAAE,EAAE,GAAG,EAAE;QACzB,aAAa,CACX;YACE,iBAAiB,EAAE,2BAA2B;YAC9C,yBAAyB,EAAE,SAAS;SACrC,EACD,GAAG,EAAE;YACH,CAAC,CAAC,EAAE,CAAC,IAAA,qCAAsB,GAAE,EAAE,SAAS,CAAC,CAAC;YAC1C,CAAC,CAAC,EAAE,CAAC,IAAA,qCAAsB,GAAE,EAAE,KAAK,CAAC,CAAC;QACxC,CAAC,CACF,CAAC;IACJ,CAAC,CAAC,CAAC;AACL,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,uFAAuF,EAAE,CAAC,CAAC,EAAE,EAAE;IAClG,iBAAiB,CAAC,EAAE,EAAE,GAAG,EAAE;QACzB,aAAa,CACX;YACE,iBAAiB,EAAE,SAAS;YAC5B,yBAAyB,EAAE,MAAM;SAClC,EACD,GAAG,EAAE;YACH,CAAC,CAAC,EAAE,CAAC,IAAA,qCAAsB,GAAE,EAAE,SAAS,CAAC,CAAC;YAC1C,CAAC,CAAC,EAAE,CAAC,IAAA,qCAAsB,GAAE,EAA
E,KAAK,CAAC,CAAC;QACxC,CAAC,CACF,CAAC;IACJ,CAAC,CAAC,CAAC;AACL,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,+DAA+D,EAAE,CAAC,CAAC,EAAE,EAAE;IAC1E,iBAAiB,CAAC,EAAE,EAAE,GAAG,EAAE;QACzB,aAAa,CACX;YACE,iBAAiB,EAAE,SAAS;YAC5B,yBAAyB,EAAE,SAAS;SACrC,EACD,GAAG,EAAE;YACH,CAAC,CAAC,EAAE,CAAC,IAAA,qCAAsB,GAAE,EAAE,SAAS,CAAC,CAAC;YAC1C,CAAC,CAAC,EAAE,CAAC,IAAA,qCAAsB,GAAE,EAAE,KAAK,CAAC,CAAC;QACxC,CAAC,CACF,CAAC;IACJ,CAAC,CAAC,CAAC;AACL,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,uBAAuB,EAAE,CAAC,CAAC,EAAE,EAAE;IAClC,IAAA,4BAAqB,EAAC,OAAO,CAAC,CAAC;IAC/B,CAAC,CAAC,SAAS,CAAC,OAAO,CAAC,GAAG,CAAC,oBAAM,CAAC,OAAO,CAAC,EAAE,OAAO,CAAC,CAAC;AACpD,CAAC,CAAC,CAAC"}


@@ -68,6 +68,7 @@ const util = __importStar(require("./util"));
};
sinon.stub(configUtils, "getConfig").resolves({
gitHubVersion,
augmentationProperties: {},
languages: [],
packs: [],
trapCaches: {},
@@ -75,6 +76,7 @@ const util = __importStar(require("./util"));
const requiredInputStub = sinon.stub(actionsUtil, "getRequiredInput");
requiredInputStub.withArgs("token").returns("fake-token");
requiredInputStub.withArgs("upload-database").returns("false");
requiredInputStub.withArgs("output").returns("out");
const optionalInputStub = sinon.stub(actionsUtil, "getOptionalInput");
optionalInputStub.withArgs("cleanup-level").returns("none");
optionalInputStub.withArgs("expect-error").returns("false");


@@ -1 +1 @@
{"version":3,"file":"analyze-action-env.test.js","sourceRoot":"","sources":["../src/analyze-action-env.test.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,8CAAuB;AACvB,6CAA+B;AAE/B,4DAA8C;AAC9C,mDAAqC;AACrC,kDAAoC;AACpC,4DAA8C;AAC9C,sDAAwC;AACxC,8DAAgD;AAChD,mDAIyB;AACzB,6CAA+B;AAE/B,IAAA,0BAAU,EAAC,aAAI,CAAC,CAAC;AAEjB,4EAA4E;AAC5E,4EAA4E;AAC5E,+EAA+E;AAC/E,+EAA+E;AAC/E,gFAAgF;AAChF,iCAAiC;AAEjC,IAAA,aAAI,EAAC,8DAA8D,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IAC/E,MAAM,IAAI,CAAC,UAAU,CAAC,KAAK,EAAE,MAAM,EAAE,EAAE;QACrC,OAAO,CAAC,GAAG,CAAC,mBAAmB,CAAC,GAAG,IAAI,CAAC,iBAAiB,CAAC;QAC1D,OAAO,CAAC,GAAG,CAAC,mBAAmB,CAAC,GAAG,sCAAsC,CAAC;QAC1E,OAAO,CAAC,GAAG,CAAC,gBAAgB,CAAC,GAAG,wBAAwB,CAAC;QACzD,KAAK;aACF,IAAI,CAAC,YAAY,EAAE,wBAAwB,CAAC;aAC5C,QAAQ,CAAC,EAAmC,CAAC,CAAC;QACjD,KAAK,CAAC,IAAI,CAAC,YAAY,EAAE,kBAAkB,CAAC,CAAC,QAAQ,EAAE,CAAC;QACxD,KAAK,CAAC,IAAI,CAAC,QAAQ,EAAE,0BAA0B,CAAC,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC;QAEhE,MAAM,aAAa,GAAuB;YACxC,IAAI,EAAE,IAAI,CAAC,aAAa,CAAC,MAAM;SAChC,CAAC;QACF,KAAK,CAAC,IAAI,CAAC,WAAW,EAAE,WAAW,CAAC,CAAC,QAAQ,CAAC;YAC5C,aAAa;YACb,SAAS,EAAE,EAAE;YACb,KAAK,EAAE,EAAE;YACT,UAAU,EAAE,EAAE;SACkB,CAAC,CAAC;QACpC,MAAM,iBAAiB,GAAG,KAAK,CAAC,IAAI,CAAC,WAAW,EAAE,kBAAkB,CAAC,CAAC;QACtE,iBAAiB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC,OAAO,CAAC,YAAY,CAAC,CAAC;QAC1D,iBAAiB,CAAC,QAAQ,CAAC,iBAAiB,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;QAC/D,MAAM,iBAAiB,GAAG,KAAK,CAAC,IAAI,CAAC,WAAW,EAAE,kBAAkB,CAAC,CAAC;QACtE,iBAAiB,CAAC,QAAQ,CAAC,eAAe,CAAC,CAAC,OAAO,CAAC,MAAM,CAAC,CAAC;QAC5D,iBAAiB,CAAC,QAAQ,CAAC,cAAc,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;QAC5D,KAAK,CAAC,IAAI,CAAC,GAAG,EAAE,kBAAkB,CAAC,CAAC,QAAQ,CAAC,aAAa,CAAC,CAAC;QAC5D,IAAA,gCAAgB,EAAC,MAAM,EAAE,MAAM,CAAC,CAAC;QACjC,IAAA,0CAA0B,EAAC,GAAG,EAAE,EAAE,CAAC,CAAC;QAEpC,uEAAuE;QACvE,0EAA0E;QAC1E,iBAAiB;QACjB,OAAO,CAAC,GAAG,CAAC,gBAAgB,CAAC,GAAG,IAAI,CAAC;QACrC,OAAO,CAAC,GAAG,CAAC,YAAY,CAAC,GAAG,MAAM,CAAC;QAEnC,MAAM,eAAe,GAAG,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,aAAa,CAAC,CAAC;QAC3D,MAAM,cAAc,GAAG,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,YAAY,CAAC,CAAC;QACzD,iEAAiE;QACjE,MAAM,aAAa,GAAG,OAAO,CAAC,kBAAkB,CAAC,CAAC;QAElD,uEAAuE;QACvE,oEAAoE;QACpE,4EAA4E;QAC5E,wEAAwE;QACxE,MAAM,aAAa,CAAC,UAAU,CAAC;QAE/B,CAAC,CAAC,SAAS,CAAC,eAAe,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,cAAc,CAAC,CAAC;QAC/D,CAAC,CAAC,SAAS,CAAC,eAAe,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,YAAY,CAAC,CAAC;QAC7D,CAAC,CAAC,SAAS,CAAC,cAAc,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,cAAc,CAAC,CAAC;QAC9D,CAAC,CAAC,SAAS,CAAC,cAAc,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,YAAY,CAAC,CAAC;IAC9D,CAAC,CAAC,CAAC;AACL,CAAC,CAAC,CAAC"}
{"version":3,"file":"analyze-action-env.test.js","sourceRoot":"","sources":["../src/analyze-action-env.test.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,8CAAuB;AACvB,6CAA+B;AAE/B,4DAA8C;AAC9C,mDAAqC;AACrC,kDAAoC;AACpC,4DAA8C;AAC9C,sDAAwC;AACxC,8DAAgD;AAChD,mDAIyB;AACzB,6CAA+B;AAE/B,IAAA,0BAAU,EAAC,aAAI,CAAC,CAAC;AAEjB,4EAA4E;AAC5E,4EAA4E;AAC5E,+EAA+E;AAC/E,+EAA+E;AAC/E,gFAAgF;AAChF,iCAAiC;AAEjC,IAAA,aAAI,EAAC,8DAA8D,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IAC/E,MAAM,IAAI,CAAC,UAAU,CAAC,KAAK,EAAE,MAAM,EAAE,EAAE;QACrC,OAAO,CAAC,GAAG,CAAC,mBAAmB,CAAC,GAAG,IAAI,CAAC,iBAAiB,CAAC;QAC1D,OAAO,CAAC,GAAG,CAAC,mBAAmB,CAAC,GAAG,sCAAsC,CAAC;QAC1E,OAAO,CAAC,GAAG,CAAC,gBAAgB,CAAC,GAAG,wBAAwB,CAAC;QACzD,KAAK;aACF,IAAI,CAAC,YAAY,EAAE,wBAAwB,CAAC;aAC5C,QAAQ,CAAC,EAAmC,CAAC,CAAC;QACjD,KAAK,CAAC,IAAI,CAAC,YAAY,EAAE,kBAAkB,CAAC,CAAC,QAAQ,EAAE,CAAC;QACxD,KAAK,CAAC,IAAI,CAAC,QAAQ,EAAE,0BAA0B,CAAC,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC;QAEhE,MAAM,aAAa,GAAuB;YACxC,IAAI,EAAE,IAAI,CAAC,aAAa,CAAC,MAAM;SAChC,CAAC;QACF,KAAK,CAAC,IAAI,CAAC,WAAW,EAAE,WAAW,CAAC,CAAC,QAAQ,CAAC;YAC5C,aAAa;YACb,sBAAsB,EAAE,EAAE;YAC1B,SAAS,EAAE,EAAE;YACb,KAAK,EAAE,EAAE;YACT,UAAU,EAAE,EAAE;SACkB,CAAC,CAAC;QACpC,MAAM,iBAAiB,GAAG,KAAK,CAAC,IAAI,CAAC,WAAW,EAAE,kBAAkB,CAAC,CAAC;QACtE,iBAAiB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC,OAAO,CAAC,YAAY,CAAC,CAAC;QAC1D,iBAAiB,CAAC,QAAQ,CAAC,iBAAiB,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;QAC/D,iBAAiB,CAAC,QAAQ,CAAC,QAAQ,CAAC,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC;QACpD,MAAM,iBAAiB,GAAG,KAAK,CAAC,IAAI,CAAC,WAAW,EAAE,kBAAkB,CAAC,CAAC;QACtE,iBAAiB,CAAC,QAAQ,CAAC,eAAe,CAAC,CAAC,OAAO,CAAC,MAAM,CAAC,CAAC;QAC5D,iBAAiB,CAAC,QAAQ,CAAC,cAAc,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;QAC5D,KAAK,CAAC,IAAI,CAAC,GAAG,EAAE,kBAAkB,CAAC,CAAC,QAAQ,CAAC,aAAa,CAAC,CAAC;QAC5D,IAAA,gCAAgB,EAAC,MAAM,EAAE,MAAM,CAAC,CAAC;QACjC,IAAA,0CAA0B,EAAC,GAAG,EAAE,EAAE,CAAC,CAAC;QAEpC,uEAAuE;QACvE,0EAA0E;QAC1E,iBAAiB;QACjB,OAAO,CAAC,GAAG,CAAC,gBAAgB,CAAC,GAAG,IAAI,CAAC;QACrC,OAAO,CAAC,GAAG,CAAC,YAAY,CAAC,GAAG,MAAM,CAAC;QAEnC,MAAM,eAAe,GAAG,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,aAAa,CAAC,CAAC;QAC3D,MAAM,cAAc,GAAG,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,YAAY,CAAC,CAAC;QACzD,iEAAiE;QACjE,MAAM,aAAa,GAAG,OAAO,CAAC,kBAAkB,CAAC,CAAC;QAElD,uEAAuE;QACvE,oEAAoE;QACpE,4EAA4E;QAC5E,wEAAwE;QACxE,MAAM,aAAa,CAAC,UAAU,CAAC;QAE/B,CAAC,CAAC,SAAS,CAAC,eAAe,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,cAAc,CAAC,CAAC;QAC/D,CAAC,CAAC,SAAS,CAAC,eAAe,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,YAAY,CAAC,CAAC;QAC7D,CAAC,CAAC,SAAS,CAAC,cAAc,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,cAAc,CAAC,CAAC;QAC9D,CAAC,CAAC,SAAS,CAAC,cAAc,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,YAAY,CAAC,CAAC;IAC9D,CAAC,CAAC,CAAC;AACL,CAAC,CAAC,CAAC"}


@@ -67,6 +67,7 @@ const util = __importStar(require("./util"));
};
sinon.stub(configUtils, "getConfig").resolves({
gitHubVersion,
augmentationProperties: {},
languages: [],
packs: [],
trapCaches: {},
@@ -74,6 +75,7 @@ const util = __importStar(require("./util"));
const requiredInputStub = sinon.stub(actionsUtil, "getRequiredInput");
requiredInputStub.withArgs("token").returns("fake-token");
requiredInputStub.withArgs("upload-database").returns("false");
requiredInputStub.withArgs("output").returns("out");
const optionalInputStub = sinon.stub(actionsUtil, "getOptionalInput");
optionalInputStub.withArgs("cleanup-level").returns("none");
optionalInputStub.withArgs("expect-error").returns("false");


@@ -1 +1 @@
{"version":3,"file":"analyze-action-input.test.js","sourceRoot":"","sources":["../src/analyze-action-input.test.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,8CAAuB;AACvB,6CAA+B;AAE/B,4DAA8C;AAC9C,mDAAqC;AACrC,kDAAoC;AACpC,4DAA8C;AAC9C,sDAAwC;AACxC,8DAAgD;AAChD,mDAIyB;AACzB,6CAA+B;AAE/B,IAAA,0BAAU,EAAC,aAAI,CAAC,CAAC;AAEjB,4EAA4E;AAC5E,4EAA4E;AAC5E,+EAA+E;AAC/E,+EAA+E;AAC/E,gFAAgF;AAChF,iCAAiC;AAEjC,IAAA,aAAI,EAAC,sDAAsD,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACvE,MAAM,IAAI,CAAC,UAAU,CAAC,KAAK,EAAE,MAAM,EAAE,EAAE;QACrC,OAAO,CAAC,GAAG,CAAC,mBAAmB,CAAC,GAAG,IAAI,CAAC,iBAAiB,CAAC;QAC1D,OAAO,CAAC,GAAG,CAAC,mBAAmB,CAAC,GAAG,sCAAsC,CAAC;QAC1E,OAAO,CAAC,GAAG,CAAC,gBAAgB,CAAC,GAAG,wBAAwB,CAAC;QACzD,KAAK;aACF,IAAI,CAAC,YAAY,EAAE,wBAAwB,CAAC;aAC5C,QAAQ,CAAC,EAAmC,CAAC,CAAC;QACjD,KAAK,CAAC,IAAI,CAAC,YAAY,EAAE,kBAAkB,CAAC,CAAC,QAAQ,EAAE,CAAC;QACxD,MAAM,aAAa,GAAuB;YACxC,IAAI,EAAE,IAAI,CAAC,aAAa,CAAC,MAAM;SAChC,CAAC;QACF,KAAK,CAAC,IAAI,CAAC,WAAW,EAAE,WAAW,CAAC,CAAC,QAAQ,CAAC;YAC5C,aAAa;YACb,SAAS,EAAE,EAAE;YACb,KAAK,EAAE,EAAE;YACT,UAAU,EAAE,EAAE;SACkB,CAAC,CAAC;QACpC,MAAM,iBAAiB,GAAG,KAAK,CAAC,IAAI,CAAC,WAAW,EAAE,kBAAkB,CAAC,CAAC;QACtE,iBAAiB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC,OAAO,CAAC,YAAY,CAAC,CAAC;QAC1D,iBAAiB,CAAC,QAAQ,CAAC,iBAAiB,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;QAC/D,MAAM,iBAAiB,GAAG,KAAK,CAAC,IAAI,CAAC,WAAW,EAAE,kBAAkB,CAAC,CAAC;QACtE,iBAAiB,CAAC,QAAQ,CAAC,eAAe,CAAC,CAAC,OAAO,CAAC,MAAM,CAAC,CAAC;QAC5D,iBAAiB,CAAC,QAAQ,CAAC,cAAc,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;QAC5D,KAAK,CAAC,IAAI,CAAC,GAAG,EAAE,kBAAkB,CAAC,CAAC,QAAQ,CAAC,aAAa,CAAC,CAAC;QAC5D,KAAK,CAAC,IAAI,CAAC,QAAQ,EAAE,0BAA0B,CAAC,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC;QAChE,IAAA,gCAAgB,EAAC,MAAM,EAAE,MAAM,CAAC,CAAC;QACjC,IAAA,0CAA0B,EAAC,GAAG,EAAE,EAAE,CAAC,CAAC;QAEpC,OAAO,CAAC,GAAG,CAAC,gBAAgB,CAAC,GAAG,GAAG,CAAC;QACpC,OAAO,CAAC,GAAG,CAAC,YAAY,CAAC,GAAG,MAAM,CAAC;QAEnC,4DAA4D;QAC5D,iBAAiB,CAAC,QAAQ,CAAC,SAAS,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;QACpD,iBAAiB,CAAC,QAAQ,CAAC,KAAK,CAAC,CAAC,OAAO,CAAC,MAAM,CAAC,CAAC;QAElD,MAAM,eAAe,GAAG,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,aAAa,CAAC,CAAC;QAC3D,MAAM,cAAc,GAAG,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,YAAY,CAAC,CAAC;QACzD,iEAAiE;QACjE,MAAM,aAAa,GAAG,OAAO,CAAC,kBAAkB,CAAC,CAAC;QAElD,uEAAuE;QACvE,oEAAoE;QACpE,4EAA4E;QAC5E,wEAAwE;QACxE,MAAM,aAAa,CAAC,UAAU,CAAC;QAE/B,CAAC,CAAC,SAAS,CAAC,eAAe,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,cAAc,CAAC,CAAC;QAC/D,CAAC,CAAC,SAAS,CAAC,eAAe,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,YAAY,CAAC,CAAC;QAC7D,CAAC,CAAC,SAAS,CAAC,cAAc,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,cAAc,CAAC,CAAC;QAC9D,CAAC,CAAC,SAAS,CAAC,cAAc,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,YAAY,CAAC,CAAC;IAC9D,CAAC,CAAC,CAAC;AACL,CAAC,CAAC,CAAC"}
{"version":3,"file":"analyze-action-input.test.js","sourceRoot":"","sources":["../src/analyze-action-input.test.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,8CAAuB;AACvB,6CAA+B;AAE/B,4DAA8C;AAC9C,mDAAqC;AACrC,kDAAoC;AACpC,4DAA8C;AAC9C,sDAAwC;AACxC,8DAAgD;AAChD,mDAIyB;AACzB,6CAA+B;AAE/B,IAAA,0BAAU,EAAC,aAAI,CAAC,CAAC;AAEjB,4EAA4E;AAC5E,4EAA4E;AAC5E,+EAA+E;AAC/E,+EAA+E;AAC/E,gFAAgF;AAChF,iCAAiC;AAEjC,IAAA,aAAI,EAAC,sDAAsD,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACvE,MAAM,IAAI,CAAC,UAAU,CAAC,KAAK,EAAE,MAAM,EAAE,EAAE;QACrC,OAAO,CAAC,GAAG,CAAC,mBAAmB,CAAC,GAAG,IAAI,CAAC,iBAAiB,CAAC;QAC1D,OAAO,CAAC,GAAG,CAAC,mBAAmB,CAAC,GAAG,sCAAsC,CAAC;QAC1E,OAAO,CAAC,GAAG,CAAC,gBAAgB,CAAC,GAAG,wBAAwB,CAAC;QACzD,KAAK;aACF,IAAI,CAAC,YAAY,EAAE,wBAAwB,CAAC;aAC5C,QAAQ,CAAC,EAAmC,CAAC,CAAC;QACjD,KAAK,CAAC,IAAI,CAAC,YAAY,EAAE,kBAAkB,CAAC,CAAC,QAAQ,EAAE,CAAC;QACxD,MAAM,aAAa,GAAuB;YACxC,IAAI,EAAE,IAAI,CAAC,aAAa,CAAC,MAAM;SAChC,CAAC;QACF,KAAK,CAAC,IAAI,CAAC,WAAW,EAAE,WAAW,CAAC,CAAC,QAAQ,CAAC;YAC5C,aAAa;YACb,sBAAsB,EAAE,EAAE;YAC1B,SAAS,EAAE,EAAE;YACb,KAAK,EAAE,EAAE;YACT,UAAU,EAAE,EAAE;SACkB,CAAC,CAAC;QACpC,MAAM,iBAAiB,GAAG,KAAK,CAAC,IAAI,CAAC,WAAW,EAAE,kBAAkB,CAAC,CAAC;QACtE,iBAAiB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC,OAAO,CAAC,YAAY,CAAC,CAAC;QAC1D,iBAAiB,CAAC,QAAQ,CAAC,iBAAiB,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;QAC/D,iBAAiB,CAAC,QAAQ,CAAC,QAAQ,CAAC,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC;QACpD,MAAM,iBAAiB,GAAG,KAAK,CAAC,IAAI,CAAC,WAAW,EAAE,kBAAkB,CAAC,CAAC;QACtE,iBAAiB,CAAC,QAAQ,CAAC,eAAe,CAAC,CAAC,OAAO,CAAC,MAAM,CAAC,CAAC;QAC5D,iBAAiB,CAAC,QAAQ,CAAC,cAAc,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;QAC5D,KAAK,CAAC,IAAI,CAAC,GAAG,EAAE,kBAAkB,CAAC,CAAC,QAAQ,CAAC,aAAa,CAAC,CAAC;QAC5D,KAAK,CAAC,IAAI,CAAC,QAAQ,EAAE,0BAA0B,CAAC,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC;QAChE,IAAA,gCAAgB,EAAC,MAAM,EAAE,MAAM,CAAC,CAAC;QACjC,IAAA,0CAA0B,EAAC,GAAG,EAAE,EAAE,CAAC,CAAC;QAEpC,OAAO,CAAC,GAAG,CAAC,gBAAgB,CAAC,GAAG,GAAG,CAAC;QACpC,OAAO,CAAC,GAAG,CAAC,YAAY,CAAC,GAAG,MAAM,CAAC;QAEnC,4DAA4D;QAC5D,iBAAiB,CAAC,QAAQ,CAAC,SAAS,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;QACpD,iBAAiB,CAAC,QAAQ,CAAC,KAAK,CAAC,CAAC,OAAO,CAAC,MAAM,CAAC,CAAC;QAElD,MAAM,eAAe,GAAG,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,aAAa,CAAC,CAAC;QAC3D,MAAM,cAAc,GAAG,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,YAAY,CAAC,CAAC;QACzD,iEAAiE;QACjE,MAAM,aAAa,GAAG,OAAO,CAAC,kBAAkB,CAAC,CAAC;QAElD,uEAAuE;QACvE,oEAAoE;QACpE,4EAA4E;QAC5E,wEAAwE;QACxE,MAAM,aAAa,CAAC,UAAU,CAAC;QAE/B,CAAC,CAAC,SAAS,CAAC,eAAe,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,cAAc,CAAC,CAAC;QAC/D,CAAC,CAAC,SAAS,CAAC,eAAe,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,YAAY,CAAC,CAAC;QAC7D,CAAC,CAAC,SAAS,CAAC,cAAc,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,cAAc,CAAC,CAAC;QAC9D,CAAC,CAAC,SAAS,CAAC,cAAc,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,YAAY,CAAC,CAAC;IAC9D,CAAC,CAAC,CAAC;AACL,CAAC,CAAC,CAAC"}

18 lib/analyze-action.js generated

@@ -55,6 +55,7 @@ const environment_1 = require("./environment");
const feature_flags_1 = require("./feature-flags");
const languages_1 = require("./languages");
const logging_1 = require("./logging");
const overlay_database_utils_1 = require("./overlay-database-utils");
const repository_1 = require("./repository");
const statusReport = __importStar(require("./status-report"));
const status_report_1 = require("./status-report");
@@ -201,7 +202,14 @@ async function run() {
await (0, analyze_1.warnIfGoInstalledAfterInit)(config, logger);
await runAutobuildIfLegacyGoWorkflow(config, logger);
dbCreationTimings = await (0, analyze_1.runFinalize)(outputDir, threads, memory, codeql, config, logger);
const cleanupLevel = actionsUtil.getOptionalInput("cleanup-level") || "brutal";
// An overlay-base database should always use the 'overlay' cleanup level
// to preserve the cached intermediate results.
//
// Note that we may be overriding the 'cleanup-level' input parameter.
const cleanupLevel = config.augmentationProperties.overlayDatabaseMode ===
overlay_database_utils_1.OverlayDatabaseMode.OverlayBase
? "overlay"
: actionsUtil.getOptionalInput("cleanup-level") || "brutal";
if (actionsUtil.getRequiredInput("skip-queries") !== "true") {
runStats = await (0, analyze_1.runQueries)(outputDir, memory, util.getAddSnippetsFlag(actionsUtil.getRequiredInput("add-snippets")), threads, cleanupLevel, diffRangePackDir, actionsUtil.getOptionalInput("category"), config, logger, features);
}
@@ -216,14 +224,20 @@ async function run() {
core.setOutput("sarif-output", path_1.default.resolve(outputDir));
const uploadInput = actionsUtil.getOptionalInput("upload");
if (runStats && actionsUtil.getUploadValue(uploadInput) === "always") {
uploadResult = await uploadLib.uploadFiles(outputDir, actionsUtil.getRequiredInput("checkout_path"), actionsUtil.getOptionalInput("category"), features, logger);
uploadResult = await uploadLib.uploadFiles(outputDir, actionsUtil.getRequiredInput("checkout_path"), actionsUtil.getOptionalInput("category"), features, logger, uploadLib.CodeScanningTarget);
core.setOutput("sarif-id", uploadResult.sarifID);
if (config.augmentationProperties.qualityQueriesInput !== undefined) {
const qualityUploadResult = await uploadLib.uploadFiles(outputDir, actionsUtil.getRequiredInput("checkout_path"), actionsUtil.getOptionalInput("category"), features, logger, uploadLib.CodeQualityTarget);
core.setOutput("quality-sarif-id", qualityUploadResult.sarifID);
}
}
else {
logger.info("Not uploading results");
}
// Possibly upload the database bundles for remote queries
await (0, database_upload_1.uploadDatabases)(repositoryNwo, config, apiDetails, logger);
// Possibly upload the overlay-base database to actions cache
await (0, overlay_database_utils_1.uploadOverlayBaseDatabaseToCache)(codeql, config, logger);
// Possibly upload the TRAP caches for later re-use
const trapCacheUploadStartTime = perf_hooks_1.performance.now();
didUploadTrapCaches = await (0, trap_caching_1.uploadTrapCaches)(codeql, config, logger);
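The hunk above changes how the cleanup level is chosen: when the action is building an overlay-base database it forces the "overlay" cleanup level so the cached intermediate results survive, and otherwise it falls back to the cleanup-level input (defaulting to "brutal"). A minimal TypeScript sketch of that decision follows; the OverlayDatabaseMode enum is a simplified stand-in for the one in overlay-database-utils, and getOptionalInput is a hypothetical reader standing in for actionsUtil.getOptionalInput.

// Simplified stand-in for the OverlayDatabaseMode enum in overlay-database-utils.
enum OverlayDatabaseMode {
  Overlay = "overlay",
  OverlayBase = "overlay-base",
  None = "none",
}

// Hypothetical input reader; the real action goes through actionsUtil.getOptionalInput.
function getOptionalInput(name: string): string | undefined {
  return process.env[`INPUT_${name.toUpperCase()}`] || undefined;
}

// An overlay-base database must keep its cached intermediate results, so the
// "cleanup-level" input is overridden with "overlay" in that mode.
function chooseCleanupLevel(mode: OverlayDatabaseMode): string {
  return mode === OverlayDatabaseMode.OverlayBase
    ? "overlay"
    : getOptionalInput("cleanup-level") || "brutal";
}

console.log(chooseCleanupLevel(OverlayDatabaseMode.OverlayBase)); // "overlay"
console.log(chooseCleanupLevel(OverlayDatabaseMode.None));        // input value or "brutal"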

File diff suppressed because one or more lines are too long

65
lib/analyze.js generated

@@ -36,10 +36,11 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.exportedForTesting = exports.CodeQLAnalysisError = void 0;
exports.exportedForTesting = exports.defaultSuites = exports.CodeQLAnalysisError = void 0;
exports.runExtraction = runExtraction;
exports.dbIsFinalized = dbIsFinalized;
exports.setupDiffInformedQueryRun = setupDiffInformedQueryRun;
exports.resolveQuerySuiteAlias = resolveQuerySuiteAlias;
exports.runQueries = runQueries;
exports.runFinalize = runFinalize;
exports.warnIfGoInstalledAfterInit = warnIfGoInstalledAfterInit;
@@ -50,7 +51,7 @@ const perf_hooks_1 = require("perf_hooks");
const io = __importStar(require("@actions/io"));
const del_1 = __importDefault(require("del"));
const yaml = __importStar(require("js-yaml"));
const actionsUtil = __importStar(require("./actions-util"));
const actions_util_1 = require("./actions-util");
const api_client_1 = require("./api-client");
const autobuild_1 = require("./autobuild");
const codeql_1 = require("./codeql");
@@ -61,6 +62,7 @@ const environment_1 = require("./environment");
const feature_flags_1 = require("./feature-flags");
const languages_1 = require("./languages");
const logging_1 = require("./logging");
const overlay_database_utils_1 = require("./overlay-database-utils");
const repository_1 = require("./repository");
const tracer_config_1 = require("./tracer-config");
const util = __importStar(require("./util"));
@@ -245,7 +247,7 @@ function getDiffRanges(fileDiff, logger) {
// uses forward slashes as the path separator, so on Windows we need to
// replace any backslashes with forward slashes.
const filename = path
.join(actionsUtil.getRequiredInput("checkout_path"), fileDiff.filename)
.join((0, actions_util_1.getRequiredInput)("checkout_path"), fileDiff.filename)
.replaceAll(path.sep, "/");
if (fileDiff.patch === undefined) {
if (fileDiff.changes === 0) {
@@ -339,7 +341,7 @@ function writeDiffRangeDataExtensionPack(logger, ranges) {
// range to a non-empty list that cannot match any alert location.
ranges = [{ path: "", startLine: 0, endLine: 0 }];
}
const diffRangeDir = path.join(actionsUtil.getTemporaryDirectory(), "pr-diff-range");
const diffRangeDir = path.join((0, actions_util_1.getTemporaryDirectory)(), "pr-diff-range");
// We expect the Actions temporary directory to already exist, so are mainly
// using `recursive: true` to avoid errors if the directory already exists,
// for example if the analyze Action is run multiple times in the same job.
@@ -385,10 +387,33 @@ extensions:
(0, diff_informed_analysis_utils_1.writeDiffRangesJsonFile)(logger, ranges);
return diffRangeDir;
}
// A set of default query suite names that are understood by the CLI.
exports.defaultSuites = new Set([
"security-experimental",
"security-extended",
"security-and-quality",
"code-quality",
"code-scanning",
]);
/**
* If `maybeSuite` is the name of a default query suite, it is resolved into the corresponding
* query suite name for the given `language`. Otherwise, `maybeSuite` is returned as is.
*
* @param language The language for which to resolve the default query suite name.
* @param maybeSuite The string that potentially contains the name of a default query suite.
* @returns Returns the resolved query suite name, or the unmodified input.
*/
function resolveQuerySuiteAlias(language, maybeSuite) {
if (exports.defaultSuites.has(maybeSuite)) {
return `${language}-${maybeSuite}.qls`;
}
return maybeSuite;
}
// Runs queries and creates sarif files in the given folder
async function runQueries(sarifFolder, memoryFlag, addSnippetsFlag, threadsFlag, cleanupLevel, diffRangePackDir, automationDetailsId, config, logger, features) {
const statusReport = {};
const queryFlags = [memoryFlag, threadsFlag];
const incrementalMode = [];
if (cleanupLevel !== "overlay") {
queryFlags.push("--expect-discarded-cache");
}
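resolveQuerySuiteAlias in the hunk above maps the default suite names in defaultSuites to language-qualified .qls file names and passes anything else through unchanged. A small self-contained sketch of the same mapping, re-declaring the set locally instead of importing it from analyze:

// Default query suite names understood by the CodeQL CLI (copied from the diff above).
const defaultSuites = new Set([
  "security-experimental",
  "security-extended",
  "security-and-quality",
  "code-quality",
  "code-scanning",
]);

// Resolve a default suite name into a language-specific .qls name;
// anything that is not a default suite is returned unchanged.
function resolveQuerySuiteAlias(language: string, maybeSuite: string): string {
  return defaultSuites.has(maybeSuite)
    ? `${language}-${maybeSuite}.qls`
    : maybeSuite;
}

console.log(resolveQuerySuiteAlias("go", "code-quality"));      // "go-code-quality.qls"
console.log(resolveQuerySuiteAlias("go", "codeql/go-queries")); // returned unchanged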
@@ -396,14 +421,33 @@ async function runQueries(sarifFolder, memoryFlag, addSnippetsFlag, threadsFlag,
if (diffRangePackDir) {
queryFlags.push(`--additional-packs=${diffRangePackDir}`);
queryFlags.push("--extension-packs=codeql-action/pr-diff-range");
incrementalMode.push("diff-informed");
}
const sarifRunPropertyFlag = diffRangePackDir
? "--sarif-run-property=incrementalMode=diff-informed"
statusReport.analysis_is_overlay =
config.augmentationProperties.overlayDatabaseMode ===
overlay_database_utils_1.OverlayDatabaseMode.Overlay;
statusReport.analysis_builds_overlay_base_database =
config.augmentationProperties.overlayDatabaseMode ===
overlay_database_utils_1.OverlayDatabaseMode.OverlayBase;
if (config.augmentationProperties.overlayDatabaseMode ===
overlay_database_utils_1.OverlayDatabaseMode.Overlay) {
incrementalMode.push("overlay");
}
const sarifRunPropertyFlag = incrementalMode.length > 0
? `--sarif-run-property=incrementalMode=${incrementalMode.join(",")}`
: undefined;
const codeql = await (0, codeql_1.getCodeQL)(config.codeQLCmd);
for (const language of config.languages) {
try {
const sarifFile = path.join(sarifFolder, `${language}.sarif`);
const queries = [];
if (config.augmentationProperties.qualityQueriesInput !== undefined) {
queries.push(path.join(util.getCodeQLDatabasePath(config, language), "temp", "config-queries.qls"));
for (const qualityQuery of config.augmentationProperties
.qualityQueriesInput) {
queries.push(resolveQuerySuiteAlias(language, qualityQuery.uses));
}
}
// The work needed to generate the query suites
// is done in the CLI. We just need to make a single
// call to run all the queries for each language and
@@ -411,7 +455,7 @@ async function runQueries(sarifFolder, memoryFlag, addSnippetsFlag, threadsFlag,
logger.startGroup(`Running queries for ${language}`);
const startTimeRunQueries = new Date().getTime();
const databasePath = util.getCodeQLDatabasePath(config, language);
await codeql.databaseRunQueries(databasePath, queryFlags);
await codeql.databaseRunQueries(databasePath, queryFlags, queries);
logger.debug(`Finished running queries for ${language}.`);
// TODO should not be using `builtin` here. We should be using `all` instead.
// The status report does not support `all` yet.
@@ -420,6 +464,13 @@ async function runQueries(sarifFolder, memoryFlag, addSnippetsFlag, threadsFlag,
logger.startGroup(`Interpreting results for ${language}`);
const startTimeInterpretResults = new Date();
const analysisSummary = await runInterpretResults(language, undefined, sarifFile, config.debugMode);
if (config.augmentationProperties.qualityQueriesInput !== undefined) {
logger.info(`Interpreting quality results for ${language}`);
const qualitySarifFile = path.join(sarifFolder, `${language}.quality.sarif`);
const qualityAnalysisSummary = await runInterpretResults(language, config.augmentationProperties.qualityQueriesInput.map((i) => resolveQuerySuiteAlias(language, i.uses)), qualitySarifFile, config.debugMode);
// TODO: move
logger.info(qualityAnalysisSummary);
}
const endTimeInterpretResults = new Date();
statusReport[`interpret_results_${language}_duration_ms`] =
endTimeInterpretResults.getTime() - startTimeInterpretResults.getTime();
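The runQueries changes collect the active incremental-analysis kinds (diff-informed and/or overlay) and, when any are present, emit them as a single SARIF run property. A hedged sketch of how that flag string is assembled, with the mode detection reduced to two booleans for illustration:

// Build the --sarif-run-property flag from the incremental analysis kinds in effect;
// returns undefined when the analysis is neither diff-informed nor overlay.
function buildSarifRunPropertyFlag(
  diffInformed: boolean,
  overlay: boolean,
): string | undefined {
  const incrementalMode: string[] = [];
  if (diffInformed) incrementalMode.push("diff-informed");
  if (overlay) incrementalMode.push("overlay");
  return incrementalMode.length > 0
    ? `--sarif-run-property=incrementalMode=${incrementalMode.join(",")}`
    : undefined;
}

console.log(buildSarifRunPropertyFlag(true, true));
// --sarif-run-property=incrementalMode=diff-informed,overlay
console.log(buildSarifRunPropertyFlag(false, false)); // undefined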

File diff suppressed because one or more lines are too long

15
lib/analyze.test.js generated

@@ -116,7 +116,9 @@ const util = __importStar(require("./util"));
});
const statusReport = await (0, analyze_1.runQueries)(tmpDir, memoryFlag, addSnippetsFlag, threadsFlag, "brutal", undefined, undefined, config, (0, logging_1.getRunnerLogger)(true), (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.QaTelemetryEnabled]));
t.deepEqual(Object.keys(statusReport).sort(), [
"analysis_builds_overlay_base_database",
"analysis_is_diff_informed",
"analysis_is_overlay",
`analyze_builtin_queries_${language}_duration_ms`,
"event_reports",
`interpret_results_${language}_duration_ms`,
@@ -313,4 +315,17 @@ function runGetDiffRanges(changes, patch) {
const diffRanges = runGetDiffRanges(2, ["@@ 30 +50,2 @@", "+1", "+2"]);
t.deepEqual(diffRanges, undefined);
});
(0, ava_1.default)("resolveQuerySuiteAlias", (t) => {
// default query suite names should resolve to something language-specific ending in `.qls`.
for (const suite of analyze_1.defaultSuites) {
const resolved = (0, analyze_1.resolveQuerySuiteAlias)(languages_1.Language.go, suite);
t.assert(resolved.endsWith(".qls"), "Resolved default suite doesn't end in .qls");
t.assert(resolved.indexOf(languages_1.Language.go) >= 0, "Resolved default suite doesn't contain language name");
}
// other inputs should be returned unchanged
const names = ["foo", "bar", "codeql/go-queries@1.0"];
for (const name of names) {
t.deepEqual((0, analyze_1.resolveQuerySuiteAlias)(languages_1.Language.go, name), name);
}
});
//# sourceMappingURL=analyze.test.js.map

File diff suppressed because one or more lines are too long


@@ -1 +1 @@
{ "maximumVersion": "3.18", "minimumVersion": "3.14" }
{ "maximumVersion": "3.18", "minimumVersion": "3.13" }

63
lib/codeql.js generated

@@ -50,6 +50,7 @@ const toolrunner = __importStar(require("@actions/exec/lib/toolrunner"));
const yaml = __importStar(require("js-yaml"));
const actions_util_1 = require("./actions-util");
const cli_errors_1 = require("./cli-errors");
const config_utils_1 = require("./config-utils");
const doc_url_1 = require("./doc-url");
const environment_1 = require("./environment");
const feature_flags_1 = require("./feature-flags");
@@ -254,14 +255,14 @@ async function getCodeQLForCmd(cmd, checkVersion) {
async supportsFeature(feature) {
return (0, tools_features_1.isSupportedToolsFeature)(await this.getVersion(), feature);
},
async databaseInitCluster(config, sourceRoot, processName, qlconfigFile, overlayDatabaseMode, logger) {
async databaseInitCluster(config, sourceRoot, processName, qlconfigFile, logger) {
const extraArgs = config.languages.map((language) => `--language=${language}`);
if (await (0, tracer_config_1.shouldEnableIndirectTracing)(codeql, config)) {
extraArgs.push("--begin-tracing");
extraArgs.push(...(await getTrapCachingExtractorConfigArgs(config)));
extraArgs.push(`--trace-process-name=${processName}`);
}
const codeScanningConfigFile = await generateCodeScanningConfig(config, logger);
const codeScanningConfigFile = await writeCodeScanningConfigFile(config, logger);
const externalRepositoryToken = (0, actions_util_1.getOptionalInput)("external-repository-token");
extraArgs.push(`--codescanning-config=${codeScanningConfigFile}`);
if (externalRepositoryToken) {
@@ -276,6 +277,7 @@ async function getCodeQLForCmd(cmd, checkVersion) {
const overwriteFlag = (0, tools_features_1.isSupportedToolsFeature)(await this.getVersion(), tools_features_1.ToolsFeature.ForceOverwrite)
? "--force-overwrite"
: "--overwrite";
const overlayDatabaseMode = config.augmentationProperties.overlayDatabaseMode;
if (overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.Overlay) {
const overlayChangesFile = await (0, overlay_database_utils_1.writeOverlayChangesFile)(config, sourceRoot, logger);
extraArgs.push(`--overlay-changes=${overlayChangesFile}`);
@@ -450,7 +452,7 @@ async function getCodeQLForCmd(cmd, checkVersion) {
throw new Error(`Unexpected output from codeql resolve build-environment: ${e} in\n${output}`);
}
},
async databaseRunQueries(databasePath, flags) {
async databaseRunQueries(databasePath, flags, queries = []) {
const codeqlArgs = [
"database",
"run-queries",
@@ -459,6 +461,7 @@ async function getCodeQLForCmd(cmd, checkVersion) {
"--intra-layer-parallelism",
"--min-disk-free=1024", // Try to leave at least 1GB free
"-v",
...queries,
...getExtraOptionsFromEnv(["database", "run-queries"], {
ignoringOptions: ["--expect-discarded-cache"],
}),
@@ -754,59 +757,9 @@ async function runCli(cmd, args = [], opts = {}) {
* @param config The configuration to use.
* @returns the path to the generated user configuration file.
*/
async function generateCodeScanningConfig(config, logger) {
async function writeCodeScanningConfigFile(config, logger) {
const codeScanningConfigFile = getGeneratedCodeScanningConfigPath(config);
// make a copy so we can modify it
const augmentedConfig = (0, util_1.cloneObject)(config.originalUserInput);
// Inject the queries from the input
if (config.augmentationProperties.queriesInput ||
config.augmentationProperties.qualityQueriesInput) {
const queryInputs = (config.augmentationProperties.queriesInput || []).concat(config.augmentationProperties.qualityQueriesInput || []);
if (config.augmentationProperties.queriesInputCombines) {
augmentedConfig.queries = (augmentedConfig.queries || []).concat(queryInputs);
}
else {
augmentedConfig.queries = queryInputs;
}
}
if (augmentedConfig.queries?.length === 0) {
delete augmentedConfig.queries;
}
// Inject the packs from the input
if (config.augmentationProperties.packsInput) {
if (config.augmentationProperties.packsInputCombines) {
// At this point, we already know that this is a single-language analysis
if (Array.isArray(augmentedConfig.packs)) {
augmentedConfig.packs = (augmentedConfig.packs || []).concat(config.augmentationProperties.packsInput);
}
else if (!augmentedConfig.packs) {
augmentedConfig.packs = config.augmentationProperties.packsInput;
}
else {
// At this point, we know there is only one language.
// If there were more than one language, an error would already have been thrown.
const language = Object.keys(augmentedConfig.packs)[0];
augmentedConfig.packs[language] = augmentedConfig.packs[language].concat(config.augmentationProperties.packsInput);
}
}
else {
augmentedConfig.packs = config.augmentationProperties.packsInput;
}
}
if (Array.isArray(augmentedConfig.packs) && !augmentedConfig.packs.length) {
delete augmentedConfig.packs;
}
augmentedConfig["query-filters"] = [
// Ordering matters. If the first filter is an inclusion, it implicitly
// excludes all queries that are not included. If it is an exclusion,
// it implicitly includes all queries that are not excluded. So user
// filters (if any) should always be first to preserve intent.
...(augmentedConfig["query-filters"] || []),
...(config.augmentationProperties.extraQueryExclusions || []),
];
if (augmentedConfig["query-filters"]?.length === 0) {
delete augmentedConfig["query-filters"];
}
const augmentedConfig = (0, config_utils_1.generateCodeScanningConfig)(config.originalUserInput, config.augmentationProperties);
logger.info(`Writing augmented user configuration file to ${codeScanningConfigFile}`);
logger.startGroup("Augmented user configuration file contents");
logger.info(yaml.dump(augmentedConfig));
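databaseRunQueries now accepts an optional list of queries that is spliced into the CLI invocation after the standard flags; the analyze step uses this to pass the resolved quality query suites alongside the generated config queries. A sketch of how such an argument vector could be assembled; only the flags visible in the diff are taken from it, and the exact ordering of the hidden middle arguments is illustrative.

// Assemble the argument vector for "codeql database run-queries", appending any
// explicitly requested query suites after the standard flags (as in the diff above).
function buildRunQueriesArgs(
  databasePath: string,
  flags: string[],
  queries: string[] = [],
): string[] {
  return [
    "database",
    "run-queries",
    ...flags,
    databasePath,
    "--intra-layer-parallelism",
    "--min-disk-free=1024", // Try to leave at least 1GB free
    "-v",
    ...queries,
  ];
}

console.log(
  buildRunQueriesArgs("/tmp/db/javascript", ["--ram=4096"], [
    "javascript-code-quality.qls",
  ]),
);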

File diff suppressed because one or more lines are too long

9
lib/codeql.test.js generated

@@ -54,7 +54,6 @@ const defaults = __importStar(require("./defaults.json"));
const doc_url_1 = require("./doc-url");
const languages_1 = require("./languages");
const logging_1 = require("./logging");
const overlay_database_utils_1 = require("./overlay-database-utils");
const setup_codeql_1 = require("./setup-codeql");
const testing_utils_1 = require("./testing-utils");
const tools_features_1 = require("./tools-features");
@@ -337,7 +336,7 @@ const injectedConfigMacro = ava_1.default.macro({
tempDir,
augmentationProperties,
};
await codeqlObject.databaseInitCluster(thisStubConfig, "", undefined, undefined, overlay_database_utils_1.OverlayDatabaseMode.None, (0, logging_1.getRunnerLogger)(true));
await codeqlObject.databaseInitCluster(thisStubConfig, "", undefined, undefined, (0, logging_1.getRunnerLogger)(true));
const args = runnerConstructorStub.firstCall.args[1];
// should have used a config file
const configArg = args.find((arg) => arg.startsWith("--codescanning-config="));
@@ -470,7 +469,7 @@ const injectedConfigMacro = ava_1.default.macro({
const runnerConstructorStub = stubToolRunnerConstructor();
const codeqlObject = await codeql.getCodeQLForTesting();
sinon.stub(codeqlObject, "getVersion").resolves((0, testing_utils_1.makeVersionInfo)("2.17.6"));
await codeqlObject.databaseInitCluster({ ...stubConfig, tempDir }, "", undefined, "/path/to/qlconfig.yml", overlay_database_utils_1.OverlayDatabaseMode.None, (0, logging_1.getRunnerLogger)(true));
await codeqlObject.databaseInitCluster({ ...stubConfig, tempDir }, "", undefined, "/path/to/qlconfig.yml", (0, logging_1.getRunnerLogger)(true));
const args = runnerConstructorStub.firstCall.args[1];
// should have used a config file
const hasCodeScanningConfigArg = args.some((arg) => arg.startsWith("--codescanning-config="));
@@ -486,7 +485,7 @@ const injectedConfigMacro = ava_1.default.macro({
const codeqlObject = await codeql.getCodeQLForTesting();
sinon.stub(codeqlObject, "getVersion").resolves((0, testing_utils_1.makeVersionInfo)("2.17.6"));
await codeqlObject.databaseInitCluster({ ...stubConfig, tempDir }, "", undefined, undefined, // undefined qlconfigFile
overlay_database_utils_1.OverlayDatabaseMode.None, (0, logging_1.getRunnerLogger)(true));
(0, logging_1.getRunnerLogger)(true));
const args = runnerConstructorStub.firstCall.args[1];
const hasQlconfigArg = args.some((arg) => arg.startsWith("--qlconfig-file="));
t.false(hasQlconfigArg, "should NOT have injected a qlconfig");
@@ -637,7 +636,7 @@ for (const { codeqlVersion, flagPassed, githubVersion, negativeFlagPassed, } of
sinon.stub(io, "which").resolves("");
process.env["CODEQL_ACTION_EXTRA_OPTIONS"] =
'{ "database": { "init": ["--overwrite"] } }';
await codeqlObject.databaseInitCluster(stubConfig, "sourceRoot", undefined, undefined, overlay_database_utils_1.OverlayDatabaseMode.None, (0, logging_1.getRunnerLogger)(false));
await codeqlObject.databaseInitCluster(stubConfig, "sourceRoot", undefined, undefined, (0, logging_1.getRunnerLogger)(false));
t.true(runnerConstructorStub.calledOnce);
const args = runnerConstructorStub.firstCall.args[1];
t.is(args.filter((option) => option === "--overwrite").length, 1, "--overwrite should only be passed once");

File diff suppressed because one or more lines are too long

270
lib/config-utils.js generated

@@ -47,6 +47,7 @@ exports.getLanguages = getLanguages;
exports.getRawLanguages = getRawLanguages;
exports.getDefaultConfig = getDefaultConfig;
exports.calculateAugmentation = calculateAugmentation;
exports.getOverlayDatabaseMode = getOverlayDatabaseMode;
exports.parsePacksFromInput = parsePacksFromInput;
exports.parsePacksSpecification = parsePacksSpecification;
exports.validatePackSpecification = validatePackSpecification;
@@ -57,16 +58,20 @@ exports.getConfig = getConfig;
exports.generateRegistries = generateRegistries;
exports.wrapEnvironment = wrapEnvironment;
exports.parseBuildModeInput = parseBuildModeInput;
exports.generateCodeScanningConfig = generateCodeScanningConfig;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const perf_hooks_1 = require("perf_hooks");
const yaml = __importStar(require("js-yaml"));
const semver = __importStar(require("semver"));
const actions_util_1 = require("./actions-util");
const api = __importStar(require("./api-client"));
const caching_utils_1 = require("./caching-utils");
const diff_informed_analysis_utils_1 = require("./diff-informed-analysis-utils");
const feature_flags_1 = require("./feature-flags");
const git_utils_1 = require("./git-utils");
const languages_1 = require("./languages");
const overlay_database_utils_1 = require("./overlay-database-utils");
const trap_caching_1 = require("./trap-caching");
const util_1 = require("./util");
// Property names from the user-supplied config file.
@@ -82,6 +87,8 @@ exports.defaultAugmentationProperties = {
queriesInput: undefined,
qualityQueriesInput: undefined,
extraQueryExclusions: [],
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
};
function getPacksStrInvalid(packStr, configFile) {
return configFile
@@ -225,12 +232,12 @@ async function getRawLanguages(languagesInput, repository, logger) {
return { rawLanguages, autodetected };
}
/**
* Get the default config for when the user has not supplied one.
* Get the default config, populated without a user configuration file.
*/
async function getDefaultConfig({ languagesInput, queriesInput, qualityQueriesInput, packsInput, buildModeInput, dbLocation, trapCachingEnabled, dependencyCachingEnabled, debugMode, debugArtifactName, debugDatabaseName, repository, tempDir, codeql, githubVersion, features, logger, }) {
const languages = await getLanguages(codeql, languagesInput, repository, logger);
const buildMode = await parseBuildModeInput(buildModeInput, languages, features, logger);
const augmentationProperties = await calculateAugmentation(codeql, features, packsInput, queriesInput, qualityQueriesInput, languages, logger);
const augmentationProperties = await calculateAugmentation(packsInput, queriesInput, qualityQueriesInput, languages);
const { trapCaches, trapCacheDownloadTime } = await downloadCacheWithTime(trapCachingEnabled, codeql, languages, logger);
return {
languages,
@@ -259,11 +266,7 @@ async function downloadCacheWithTime(trapCachingEnabled, codeQL, languages, logg
}
return { trapCaches, trapCacheDownloadTime };
}
/**
* Load the config from the given file.
*/
async function loadConfig({ languagesInput, queriesInput, qualityQueriesInput, packsInput, buildModeInput, configFile, dbLocation, trapCachingEnabled, dependencyCachingEnabled, debugMode, debugArtifactName, debugDatabaseName, repository, tempDir, codeql, workspacePath, githubVersion, apiDetails, features, logger, }) {
let parsedYAML;
async function loadUserConfig(configFile, workspacePath, apiDetails, tempDir) {
if (isLocal(configFile)) {
if (configFile !== userConfigFromActionPath(tempDir)) {
// If the config file is not generated by the Action, it should be relative to the workspace.
@@ -273,31 +276,11 @@ async function loadConfig({ languagesInput, queriesInput, qualityQueriesInput, p
throw new util_1.ConfigurationError(getConfigFileOutsideWorkspaceErrorMessage(configFile));
}
}
parsedYAML = getLocalConfig(configFile);
return getLocalConfig(configFile);
}
else {
parsedYAML = await getRemoteConfig(configFile, apiDetails);
return await getRemoteConfig(configFile, apiDetails);
}
const languages = await getLanguages(codeql, languagesInput, repository, logger);
const buildMode = await parseBuildModeInput(buildModeInput, languages, features, logger);
const augmentationProperties = await calculateAugmentation(codeql, features, packsInput, queriesInput, qualityQueriesInput, languages, logger);
const { trapCaches, trapCacheDownloadTime } = await downloadCacheWithTime(trapCachingEnabled, codeql, languages, logger);
return {
languages,
buildMode,
originalUserInput: parsedYAML,
tempDir,
codeQLCmd: codeql.getPath(),
gitHubVersion: githubVersion,
dbLocation: dbLocationOrDefault(dbLocation, tempDir),
debugMode,
debugArtifactName,
debugDatabaseName,
augmentationProperties,
trapCaches,
trapCacheDownloadTime,
dependencyCachingEnabled: (0, caching_utils_1.getCachingKind)(dependencyCachingEnabled),
};
}
/**
* Calculates how the codeql config file needs to be augmented before passing
@@ -306,14 +289,11 @@ async function loadConfig({ languagesInput, queriesInput, qualityQueriesInput, p
* and the CLI does not know about these inputs so we need to inject them into
* the config file sent to the CLI.
*
* @param codeql The CodeQL object.
* @param features The feature enablement object.
* @param rawPacksInput The packs input from the action configuration.
* @param rawQueriesInput The queries input from the action configuration.
* @param languages The languages that the config file is for. If the packs input
* is non-empty, then there must be exactly one language. Otherwise, an
* error is thrown.
* @param logger The logger to use for logging.
*
* @returns The properties that need to be augmented in the config file.
*
@@ -321,25 +301,21 @@ async function loadConfig({ languagesInput, queriesInput, qualityQueriesInput, p
* not have exactly one language.
*/
// exported for testing.
async function calculateAugmentation(codeql, features, rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages, logger) {
async function calculateAugmentation(rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages) {
const packsInputCombines = shouldCombine(rawPacksInput);
const packsInput = parsePacksFromInput(rawPacksInput, languages, packsInputCombines);
const queriesInputCombines = shouldCombine(rawQueriesInput);
const queriesInput = parseQueriesFromInput(rawQueriesInput, queriesInputCombines);
const qualityQueriesInput = parseQueriesFromInput(rawQualityQueriesInput, false);
const extraQueryExclusions = [];
if (await (0, diff_informed_analysis_utils_1.shouldPerformDiffInformedAnalysis)(codeql, features, logger)) {
extraQueryExclusions.push({
exclude: { tags: "exclude-from-incremental" },
});
}
return {
packsInputCombines,
packsInput: packsInput?.[languages[0]],
queriesInput,
queriesInputCombines,
qualityQueriesInput,
extraQueryExclusions,
extraQueryExclusions: [],
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
};
}
function parseQueriesFromInput(rawQueriesInput, queriesInputCombines) {
@@ -354,6 +330,142 @@ function parseQueriesFromInput(rawQueriesInput, queriesInputCombines) {
}
return trimmedInput.split(",").map((query) => ({ uses: query.trim() }));
}
const OVERLAY_ANALYSIS_FEATURES = {
actions: feature_flags_1.Feature.OverlayAnalysisActions,
cpp: feature_flags_1.Feature.OverlayAnalysisCpp,
csharp: feature_flags_1.Feature.OverlayAnalysisCsharp,
go: feature_flags_1.Feature.OverlayAnalysisGo,
java: feature_flags_1.Feature.OverlayAnalysisJava,
javascript: feature_flags_1.Feature.OverlayAnalysisJavascript,
python: feature_flags_1.Feature.OverlayAnalysisPython,
ruby: feature_flags_1.Feature.OverlayAnalysisRuby,
rust: feature_flags_1.Feature.OverlayAnalysisRust,
swift: feature_flags_1.Feature.OverlayAnalysisSwift,
};
const OVERLAY_ANALYSIS_CODE_SCANNING_FEATURES = {
actions: feature_flags_1.Feature.OverlayAnalysisCodeScanningActions,
cpp: feature_flags_1.Feature.OverlayAnalysisCodeScanningCpp,
csharp: feature_flags_1.Feature.OverlayAnalysisCodeScanningCsharp,
go: feature_flags_1.Feature.OverlayAnalysisCodeScanningGo,
java: feature_flags_1.Feature.OverlayAnalysisCodeScanningJava,
javascript: feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
python: feature_flags_1.Feature.OverlayAnalysisCodeScanningPython,
ruby: feature_flags_1.Feature.OverlayAnalysisCodeScanningRuby,
rust: feature_flags_1.Feature.OverlayAnalysisCodeScanningRust,
swift: feature_flags_1.Feature.OverlayAnalysisCodeScanningSwift,
};
async function isOverlayAnalysisFeatureEnabled(repository, features, codeql, languages, codeScanningConfig) {
// TODO: Remove the repository owner check once support for overlay analysis
// stabilizes, and no more backward-incompatible changes are expected.
if (!["github", "dsp-testing"].includes(repository.owner)) {
return false;
}
if (!(await features.getValue(feature_flags_1.Feature.OverlayAnalysis, codeql))) {
return false;
}
let enableForCodeScanningOnly = false;
for (const language of languages) {
const feature = OVERLAY_ANALYSIS_FEATURES[language];
if (feature && (await features.getValue(feature, codeql))) {
continue;
}
const codeScanningFeature = OVERLAY_ANALYSIS_CODE_SCANNING_FEATURES[language];
if (codeScanningFeature &&
(await features.getValue(codeScanningFeature, codeql))) {
enableForCodeScanningOnly = true;
continue;
}
return false;
}
if (enableForCodeScanningOnly) {
// A code-scanning configuration runs only the (default) code-scanning suite
// if the default queries are not disabled, and no packs, queries, or
// query-filters are specified.
return (codeScanningConfig["disable-default-queries"] !== true &&
codeScanningConfig.packs === undefined &&
codeScanningConfig.queries === undefined &&
codeScanningConfig["query-filters"] === undefined);
}
return true;
}
/**
* Calculate and validate the overlay database mode and caching to use.
*
* - If the environment variable `CODEQL_OVERLAY_DATABASE_MODE` is set, use it.
* In this case, the workflow is responsible for managing database storage and
* retrieval, and the action will not perform overlay database caching. Think
* of it as a "manual control" mode where the calling workflow is responsible
* for making sure that everything is set up correctly.
* - Otherwise, if `Feature.OverlayAnalysis` is enabled, calculate the mode
* based on what we are analyzing. Think of it as an "automatic control" mode
* where the action will do the right thing by itself.
* - If we are analyzing a pull request, use `Overlay` with caching.
* - If we are analyzing the default branch, use `OverlayBase` with caching.
* - Otherwise, use `None`.
*
* For `Overlay` and `OverlayBase`, the function performs further checks and
* reverts to `None` if any check should fail.
*
* @returns An object containing the overlay database mode and whether the
* action should perform overlay-base database caching.
*/
async function getOverlayDatabaseMode(codeql, repository, features, languages, sourceRoot, buildMode, codeScanningConfig, logger) {
let overlayDatabaseMode = overlay_database_utils_1.OverlayDatabaseMode.None;
let useOverlayDatabaseCaching = false;
const modeEnv = process.env.CODEQL_OVERLAY_DATABASE_MODE;
// Any unrecognized CODEQL_OVERLAY_DATABASE_MODE value will be ignored and
// treated as if the environment variable was not set.
if (modeEnv === overlay_database_utils_1.OverlayDatabaseMode.Overlay ||
modeEnv === overlay_database_utils_1.OverlayDatabaseMode.OverlayBase ||
modeEnv === overlay_database_utils_1.OverlayDatabaseMode.None) {
overlayDatabaseMode = modeEnv;
logger.info(`Setting overlay database mode to ${overlayDatabaseMode} ` +
"from the CODEQL_OVERLAY_DATABASE_MODE environment variable.");
}
else if (await isOverlayAnalysisFeatureEnabled(repository, features, codeql, languages, codeScanningConfig)) {
if ((0, actions_util_1.isAnalyzingPullRequest)()) {
overlayDatabaseMode = overlay_database_utils_1.OverlayDatabaseMode.Overlay;
useOverlayDatabaseCaching = true;
logger.info(`Setting overlay database mode to ${overlayDatabaseMode} ` +
"with caching because we are analyzing a pull request.");
}
else if (await (0, git_utils_1.isAnalyzingDefaultBranch)()) {
overlayDatabaseMode = overlay_database_utils_1.OverlayDatabaseMode.OverlayBase;
useOverlayDatabaseCaching = true;
logger.info(`Setting overlay database mode to ${overlayDatabaseMode} ` +
"with caching because we are analyzing the default branch.");
}
}
const nonOverlayAnalysis = {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
};
if (overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.None) {
return nonOverlayAnalysis;
}
if (buildMode !== util_1.BuildMode.None && languages.some(languages_1.isTracedLanguage)) {
logger.warning(`Cannot build an ${overlayDatabaseMode} database because ` +
`build-mode is set to "${buildMode}" instead of "none". ` +
"Falling back to creating a normal full database instead.");
return nonOverlayAnalysis;
}
if (!(await (0, util_1.codeQlVersionAtLeast)(codeql, overlay_database_utils_1.CODEQL_OVERLAY_MINIMUM_VERSION))) {
logger.warning(`Cannot build an ${overlayDatabaseMode} database because ` +
`the CodeQL CLI is older than ${overlay_database_utils_1.CODEQL_OVERLAY_MINIMUM_VERSION}. ` +
"Falling back to creating a normal full database instead.");
return nonOverlayAnalysis;
}
if ((await (0, git_utils_1.getGitRoot)(sourceRoot)) === undefined) {
logger.warning(`Cannot build an ${overlayDatabaseMode} database because ` +
`the source root "${sourceRoot}" is not inside a git repository. ` +
"Falling back to creating a normal full database instead.");
return nonOverlayAnalysis;
}
return {
overlayDatabaseMode,
useOverlayDatabaseCaching,
};
}
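getOverlayDatabaseMode above first honours a CODEQL_OVERLAY_DATABASE_MODE environment override (without caching), and otherwise derives the mode from the overlay feature flags: Overlay with caching for pull requests, OverlayBase with caching on the default branch, and None elsewhere, falling back to None for traced-language builds, old CLIs, or a missing git root. A condensed, self-contained sketch of just that decision table; the checks are reduced to booleans, and the real function also inspects the CodeQL version and git root.

enum OverlayDatabaseMode {
  Overlay = "overlay",
  OverlayBase = "overlay-base",
  None = "none",
}

interface OverlayDecision {
  overlayDatabaseMode: OverlayDatabaseMode;
  useOverlayDatabaseCaching: boolean;
}

// Condensed decision logic: an explicit environment override wins (no caching);
// otherwise the feature flag plus the kind of analysis picks the mode.
function decideOverlayMode(
  envOverride: string | undefined,
  featureEnabled: boolean,
  isPullRequest: boolean,
  isDefaultBranch: boolean,
): OverlayDecision {
  if (
    envOverride === OverlayDatabaseMode.Overlay ||
    envOverride === OverlayDatabaseMode.OverlayBase ||
    envOverride === OverlayDatabaseMode.None
  ) {
    return { overlayDatabaseMode: envOverride, useOverlayDatabaseCaching: false };
  }
  if (featureEnabled && isPullRequest) {
    return { overlayDatabaseMode: OverlayDatabaseMode.Overlay, useOverlayDatabaseCaching: true };
  }
  if (featureEnabled && isDefaultBranch) {
    return { overlayDatabaseMode: OverlayDatabaseMode.OverlayBase, useOverlayDatabaseCaching: true };
  }
  return { overlayDatabaseMode: OverlayDatabaseMode.None, useOverlayDatabaseCaching: false };
}

console.log(decideOverlayMode(undefined, true, true, false));
// { overlayDatabaseMode: 'overlay', useOverlayDatabaseCaching: true }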
/**
* Pack names must be in the form of `scope/name`, with only alpha-numeric characters,
* and `-` allowed as long as not the first or last char.
@@ -490,7 +602,6 @@ function userConfigFromActionPath(tempDir) {
* a default config. The parsed config is then stored to a known location.
*/
async function initConfig(inputs) {
let config;
const { logger, tempDir } = inputs;
// if configInput is set, it takes precedence over configFile
if (inputs.configInput) {
@@ -501,14 +612,31 @@ async function initConfig(inputs) {
fs.writeFileSync(inputs.configFile, inputs.configInput);
logger.debug(`Using config from action input: ${inputs.configFile}`);
}
// If no config file was provided create an empty one
let userConfig = {};
if (!inputs.configFile) {
logger.debug("No configuration file was provided");
config = await getDefaultConfig(inputs);
}
else {
// Convince the type checker that inputs.configFile is defined.
config = await loadConfig({ ...inputs, configFile: inputs.configFile });
logger.debug(`Using configuration file: ${inputs.configFile}`);
userConfig = await loadUserConfig(inputs.configFile, inputs.workspacePath, inputs.apiDetails, tempDir);
}
const config = await getDefaultConfig(inputs);
const augmentationProperties = config.augmentationProperties;
config.originalUserInput = userConfig;
// The choice of overlay database mode depends on the selection of languages
// and queries, which in turn depends on the user config and the augmentation
// properties. So we need to calculate the overlay database mode after the
// rest of the config has been populated.
const { overlayDatabaseMode, useOverlayDatabaseCaching } = await getOverlayDatabaseMode(inputs.codeql, inputs.repository, inputs.features, config.languages, inputs.sourceRoot, config.buildMode, generateCodeScanningConfig(userConfig, augmentationProperties), logger);
logger.info(`Using overlay database mode: ${overlayDatabaseMode} ` +
`${useOverlayDatabaseCaching ? "with" : "without"} caching.`);
augmentationProperties.overlayDatabaseMode = overlayDatabaseMode;
augmentationProperties.useOverlayDatabaseCaching = useOverlayDatabaseCaching;
if (overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.Overlay ||
(await (0, diff_informed_analysis_utils_1.shouldPerformDiffInformedAnalysis)(inputs.codeql, inputs.features, logger))) {
augmentationProperties.extraQueryExclusions.push({
exclude: { tags: "exclude-from-incremental" },
});
}
// Save the config so we can easily access it again in the future
await saveConfig(config, logger);
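The initConfig changes compute the overlay database mode only after the rest of the config is known, and then add an extra query exclusion for queries tagged exclude-from-incremental whenever the analysis is an overlay analysis or a diff-informed analysis. A small sketch of that final step; the filter object shape is taken from the diff, everything else is simplified to booleans.

interface QueryFilter {
  exclude?: Record<string, string>;
  include?: Record<string, string>;
}

// Queries tagged "exclude-from-incremental" are skipped for both overlay and
// diff-informed analyses, since both only look at a slice of the code.
function extraQueryExclusions(
  isOverlayAnalysis: boolean,
  isDiffInformed: boolean,
): QueryFilter[] {
  if (isOverlayAnalysis || isDiffInformed) {
    return [{ exclude: { tags: "exclude-from-incremental" } }];
  }
  return [];
}

console.log(extraQueryExclusions(true, false));
// [ { exclude: { tags: 'exclude-from-incremental' } } ]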
@@ -711,4 +839,56 @@ async function parseBuildModeInput(input, languages, features, logger) {
}
return input;
}
function generateCodeScanningConfig(originalUserInput, augmentationProperties) {
// make a copy so we can modify it
const augmentedConfig = (0, util_1.cloneObject)(originalUserInput);
// Inject the queries from the input
if (augmentationProperties.queriesInput) {
if (augmentationProperties.queriesInputCombines) {
augmentedConfig.queries = (augmentedConfig.queries || []).concat(augmentationProperties.queriesInput);
}
else {
augmentedConfig.queries = augmentationProperties.queriesInput;
}
}
if (augmentedConfig.queries?.length === 0) {
delete augmentedConfig.queries;
}
// Inject the packs from the input
if (augmentationProperties.packsInput) {
if (augmentationProperties.packsInputCombines) {
// At this point, we already know that this is a single-language analysis
if (Array.isArray(augmentedConfig.packs)) {
augmentedConfig.packs = (augmentedConfig.packs || []).concat(augmentationProperties.packsInput);
}
else if (!augmentedConfig.packs) {
augmentedConfig.packs = augmentationProperties.packsInput;
}
else {
// At this point, we know there is only one language.
// If there were more than one language, an error would already have been thrown.
const language = Object.keys(augmentedConfig.packs)[0];
augmentedConfig.packs[language] = augmentedConfig.packs[language].concat(augmentationProperties.packsInput);
}
}
else {
augmentedConfig.packs = augmentationProperties.packsInput;
}
}
if (Array.isArray(augmentedConfig.packs) && !augmentedConfig.packs.length) {
delete augmentedConfig.packs;
}
augmentedConfig["query-filters"] = [
// Ordering matters. If the first filter is an inclusion, it implicitly
// excludes all queries that are not included. If it is an exclusion,
// it implicitly includes all queries that are not excluded. So user
// filters (if any) should always be first to preserve intent.
...(augmentedConfig["query-filters"] || []),
...augmentationProperties.extraQueryExclusions,
];
if (augmentedConfig["query-filters"]?.length === 0) {
delete augmentedConfig["query-filters"];
}
return augmentedConfig;
}
//# sourceMappingURL=config-utils.js.map
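generateCodeScanningConfig now lives in config-utils and, among other things, appends the action's extra query exclusions after any user-supplied query-filters; as the comment in the diff explains, ordering matters because a leading include implicitly excludes everything else. A sketch of just that merge step, using a hypothetical UserConfig shape limited to the query-filters key:

type QueryFilter = Record<string, Record<string, string>>;

interface UserConfig {
  "query-filters"?: QueryFilter[];
}

// User-supplied filters stay first so their include/exclude intent is preserved;
// the action's extra exclusions are appended after them. An empty list is dropped.
function mergeQueryFilters(
  config: UserConfig,
  extraQueryExclusions: QueryFilter[],
): UserConfig {
  const merged: UserConfig = {
    ...config,
    "query-filters": [
      ...(config["query-filters"] ?? []),
      ...extraQueryExclusions,
    ],
  };
  if (merged["query-filters"]?.length === 0) {
    delete merged["query-filters"];
  }
  return merged;
}

console.log(
  mergeQueryFilters(
    { "query-filters": [{ include: { "security-severity": "high" } }] },
    [{ exclude: { tags: "exclude-from-incremental" } }],
  ),
);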

File diff suppressed because one or more lines are too long

395
lib/config-utils.test.js generated

@@ -42,13 +42,16 @@ const github = __importStar(require("@actions/github"));
const ava_1 = __importDefault(require("ava"));
const yaml = __importStar(require("js-yaml"));
const sinon = __importStar(require("sinon"));
const actionsUtil = __importStar(require("./actions-util"));
const api = __importStar(require("./api-client"));
const caching_utils_1 = require("./caching-utils");
const codeql_1 = require("./codeql");
const configUtils = __importStar(require("./config-utils"));
const feature_flags_1 = require("./feature-flags");
const gitUtils = __importStar(require("./git-utils"));
const languages_1 = require("./languages");
const logging_1 = require("./logging");
const overlay_database_utils_1 = require("./overlay-database-utils");
const repository_1 = require("./repository");
const testing_utils_1 = require("./testing-utils");
const util_1 = require("./util");
@@ -73,6 +76,7 @@ function createTestInitConfigInputs(overrides) {
tempDir: "",
codeql: {},
workspacePath: "",
sourceRoot: "",
githubVersion,
apiDetails: {
auth: "token",
@@ -625,7 +629,7 @@ const packSpecPrettyPrintingMacro = ava_1.default.macro({
const mockLogger = (0, logging_1.getRunnerLogger)(true);
const calculateAugmentationMacro = ava_1.default.macro({
exec: async (t, _title, rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages, expectedAugmentationProperties) => {
const actualAugmentationProperties = await configUtils.calculateAugmentation((0, codeql_1.getCachedCodeQL)(), (0, testing_utils_1.createFeatures)([]), rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages, mockLogger);
const actualAugmentationProperties = await configUtils.calculateAugmentation(rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages);
t.deepEqual(actualAugmentationProperties, expectedAugmentationProperties);
},
title: (_, title) => `Calculate Augmentation: ${title}`,
@@ -672,7 +676,7 @@ const calculateAugmentationMacro = ava_1.default.macro({
});
const calculateAugmentationErrorMacro = ava_1.default.macro({
exec: async (t, _title, rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages, expectedError) => {
await t.throwsAsync(() => configUtils.calculateAugmentation((0, codeql_1.getCachedCodeQL)(), (0, testing_utils_1.createFeatures)([]), rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages, mockLogger), { message: expectedError });
await t.throwsAsync(() => configUtils.calculateAugmentation(rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages), { message: expectedError });
},
title: (_, title) => `Calculate Augmentation Error: ${title}`,
});
@@ -824,4 +828,391 @@ for (const { displayName, language, feature } of [
]);
});
}
const defaultOverlayDatabaseModeTestSetup = {
overlayDatabaseEnvVar: undefined,
features: [],
isPullRequest: false,
isDefaultBranch: false,
repositoryOwner: "github",
buildMode: util_1.BuildMode.None,
languages: [languages_1.Language.javascript],
codeqlVersion: "2.21.0",
gitRoot: "/some/git/root",
codeScanningConfig: {},
};
const getOverlayDatabaseModeMacro = ava_1.default.macro({
exec: async (t, _title, setupOverrides, expected) => {
return await (0, util_1.withTmpDir)(async (tempDir) => {
const messages = [];
const logger = (0, testing_utils_1.getRecordingLogger)(messages);
// Save the original environment
const originalEnv = { ...process.env };
try {
const setup = {
...defaultOverlayDatabaseModeTestSetup,
...setupOverrides,
};
// Set up environment variable if specified
delete process.env.CODEQL_OVERLAY_DATABASE_MODE;
if (setup.overlayDatabaseEnvVar !== undefined) {
process.env.CODEQL_OVERLAY_DATABASE_MODE =
setup.overlayDatabaseEnvVar;
}
// Mock feature flags
const features = (0, testing_utils_1.createFeatures)(setup.features);
// Mock isAnalyzingPullRequest function
sinon
.stub(actionsUtil, "isAnalyzingPullRequest")
.returns(setup.isPullRequest);
// Mock repository owner
const repository = {
owner: setup.repositoryOwner,
repo: "test-repo",
};
// Set up CodeQL mock
const codeql = (0, testing_utils_1.mockCodeQLVersion)(setup.codeqlVersion);
// Mock git root detection
if (setup.gitRoot !== undefined) {
sinon.stub(gitUtils, "getGitRoot").resolves(setup.gitRoot);
}
// Mock default branch detection
sinon
.stub(gitUtils, "isAnalyzingDefaultBranch")
.resolves(setup.isDefaultBranch);
const result = await configUtils.getOverlayDatabaseMode(codeql, repository, features, setup.languages, tempDir, // sourceRoot
setup.buildMode, setup.codeScanningConfig, logger);
t.deepEqual(result, expected);
}
finally {
// Restore the original environment
process.env = originalEnv;
}
});
},
title: (_, title) => `getOverlayDatabaseMode: ${title}`,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Environment variable override - Overlay", {
overlayDatabaseEnvVar: "overlay",
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Environment variable override - OverlayBase", {
overlayDatabaseEnvVar: "overlay-base",
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.OverlayBase,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Environment variable override - None", {
overlayDatabaseEnvVar: "none",
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Ignore invalid environment variable", {
overlayDatabaseEnvVar: "invalid-mode",
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Ignore feature flag when analyzing non-default branch", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay-base database on default branch when feature enabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.OverlayBase,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay-base database on default branch when feature enabled with custom analysis", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
codeScanningConfig: {
packs: ["some-custom-pack@1.0.0"],
},
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.OverlayBase,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay-base database on default branch when code-scanning feature enabled", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.OverlayBase,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when code-scanning feature enabled with disable-default-queries", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
"disable-default-queries": true,
},
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when code-scanning feature enabled with packs", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
packs: ["some-custom-pack@1.0.0"],
},
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when code-scanning feature enabled with queries", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
queries: [{ uses: "some-query.ql" }],
},
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when code-scanning feature enabled with query-filters", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
"query-filters": [{ include: { "security-severity": "high" } }],
},
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when only language-specific feature enabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysisJavascript],
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when only code-scanning feature enabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript],
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when language-specific feature disabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis],
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay analysis on PR when feature enabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay analysis on PR when feature enabled with custom analysis", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
codeScanningConfig: {
packs: ["some-custom-pack@1.0.0"],
},
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay analysis on PR when code-scanning feature enabled", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when code-scanning feature enabled with disable-default-queries", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
"disable-default-queries": true,
},
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when code-scanning feature enabled with packs", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
packs: ["some-custom-pack@1.0.0"],
},
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when code-scanning feature enabled with queries", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
queries: [{ uses: "some-query.ql" }],
},
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when code-scanning feature enabled with query-filters", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
"query-filters": [{ include: { "security-severity": "high" } }],
},
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when only language-specific feature enabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysisJavascript],
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when only code-scanning feature enabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript],
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when language-specific feature disabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis],
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay PR analysis by env for dsp-testing", {
overlayDatabaseEnvVar: "overlay",
repositoryOwner: "dsp-testing",
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay PR analysis by env for other-org", {
overlayDatabaseEnvVar: "overlay",
repositoryOwner: "other-org",
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay PR analysis by feature flag for dsp-testing", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
isPullRequest: true,
repositoryOwner: "dsp-testing",
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay PR analysis by feature flag for other-org", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
isPullRequest: true,
repositoryOwner: "other-org",
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Fallback due to autobuild with traced language", {
overlayDatabaseEnvVar: "overlay",
buildMode: util_1.BuildMode.Autobuild,
languages: [languages_1.Language.java],
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Fallback due to no build mode with traced language", {
overlayDatabaseEnvVar: "overlay",
buildMode: undefined,
languages: [languages_1.Language.java],
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Fallback due to old CodeQL version", {
overlayDatabaseEnvVar: "overlay",
codeqlVersion: "2.14.0",
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Fallback due to missing git root", {
overlayDatabaseEnvVar: "overlay",
gitRoot: undefined,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
// Exercise language-specific overlay analysis features code paths
for (const language in languages_1.Language) {
(0, ava_1.default)(getOverlayDatabaseModeMacro, `Check default overlay analysis feature for ${language}`, {
languages: [language],
features: [feature_flags_1.Feature.OverlayAnalysis],
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
}
//# sourceMappingURL=config-utils.test.js.map
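
The macro cases above pin down when overlay PR analysis is expected to kick in. As a reading aid only, here is a rough TypeScript sketch of the decision those expectations describe; the helper below is hypothetical and is derived from the test expectations, not from the actual implementation of getOverlayDatabaseMode:

// Hypothetical reading aid, not the real getOverlayDatabaseMode: it only
// restates what the macro cases above expect for feature-flag-driven runs.
function expectedOverlayDecision(opts: {
  isPullRequest: boolean;
  enabledFeatures: string[]; // e.g. ["overlay_analysis", "overlay_analysis_javascript"]
  language: string;          // e.g. "javascript"
  repositoryOwner: string;
}): { overlayDatabaseMode: "overlay" | "none"; useOverlayDatabaseCaching: boolean } {
  const umbrellaFlag = opts.enabledFeatures.includes("overlay_analysis");
  const languageFlag = opts.enabledFeatures.includes(`overlay_analysis_${opts.language}`);
  if (
    opts.isPullRequest &&
    umbrellaFlag &&
    languageFlag &&
    opts.repositoryOwner === "dsp-testing"
  ) {
    return { overlayDatabaseMode: "overlay", useOverlayDatabaseCaching: true };
  }
  return { overlayDatabaseMode: "none", useOverlayDatabaseCaching: false };
}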

File diff suppressed because one or more lines are too long

View File

@@ -1,6 +1,6 @@
{
"bundleVersion": "codeql-bundle-v2.22.0",
"cliVersion": "2.22.0",
"priorBundleVersion": "codeql-bundle-v2.21.4",
"priorCliVersion": "2.21.4"
"bundleVersion": "codeql-bundle-v2.22.1",
"cliVersion": "2.22.1",
"priorBundleVersion": "codeql-bundle-v2.22.0",
"priorCliVersion": "2.22.0"
}

View File

@@ -39,34 +39,10 @@ exports.writeDiffRangesJsonFile = writeDiffRangesJsonFile;
exports.readDiffRangesJsonFile = readDiffRangesJsonFile;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const github = __importStar(require("@actions/github"));
const actionsUtil = __importStar(require("./actions-util"));
const api_client_1 = require("./api-client");
const feature_flags_1 = require("./feature-flags");
function getPullRequestBranches() {
const pullRequest = github.context.payload.pull_request;
if (pullRequest) {
return {
base: pullRequest.base.ref,
// We use the head label instead of the head ref here, because the head
// ref lacks owner information and by itself does not uniquely identify
// the head branch (which may be in a forked repository).
head: pullRequest.head.label,
};
}
// PR analysis under Default Setup does not have the pull_request context,
// but it should set CODE_SCANNING_REF and CODE_SCANNING_BASE_BRANCH.
const codeScanningRef = process.env.CODE_SCANNING_REF;
const codeScanningBaseBranch = process.env.CODE_SCANNING_BASE_BRANCH;
if (codeScanningRef && codeScanningBaseBranch) {
return {
base: codeScanningBaseBranch,
// PR analysis under Default Setup analyzes the PR head commit instead of
// the merge commit, so we can use the provided ref directly.
head: codeScanningRef,
};
}
return undefined;
}
const util_1 = require("./util");
/**
* Check if the action should perform diff-informed analysis.
*/
@@ -85,7 +61,12 @@ async function getDiffInformedAnalysisBranches(codeql, features, logger) {
if (!(await features.getValue(feature_flags_1.Feature.DiffInformedQueries, codeql))) {
return undefined;
}
const branches = getPullRequestBranches();
const gitHubVersion = await (0, api_client_1.getGitHubVersion)();
if (gitHubVersion.type === util_1.GitHubVariant.GHES &&
(0, util_1.satisfiesGHESVersion)(gitHubVersion.version, "<3.19", true)) {
return undefined;
}
const branches = actionsUtil.getPullRequestBranches();
if (!branches) {
logger.info("Not performing diff-informed analysis " +
"because we are not analyzing a pull request.");

View File

File diff suppressed because one or more lines are too long

130
lib/diff-informed-analysis-utils.test.js generated Normal file
View File

@@ -0,0 +1,130 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
var ownKeys = function(o) {
ownKeys = Object.getOwnPropertyNames || function (o) {
var ar = [];
for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
return ar;
};
return ownKeys(o);
};
return function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
__setModuleDefault(result, mod);
return result;
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
const ava_1 = __importDefault(require("ava"));
const sinon = __importStar(require("sinon"));
const actionsUtil = __importStar(require("./actions-util"));
const apiClient = __importStar(require("./api-client"));
const diff_informed_analysis_utils_1 = require("./diff-informed-analysis-utils");
const feature_flags_1 = require("./feature-flags");
const logging_1 = require("./logging");
const repository_1 = require("./repository");
const testing_utils_1 = require("./testing-utils");
const util_1 = require("./util");
(0, testing_utils_1.setupTests)(ava_1.default);
const defaultTestCase = {
featureEnabled: true,
gitHubVersion: {
type: util_1.GitHubVariant.DOTCOM,
},
pullRequestBranches: {
base: "main",
head: "feature-branch",
},
codeQLVersion: "2.21.0",
};
const testShouldPerformDiffInformedAnalysis = ava_1.default.macro({
exec: async (t, _title, partialTestCase, expectedResult) => {
return await (0, util_1.withTmpDir)(async (tmpDir) => {
(0, testing_utils_1.setupActionsVars)(tmpDir, tmpDir);
const testCase = { ...defaultTestCase, ...partialTestCase };
const logger = (0, logging_1.getRunnerLogger)(true);
const codeql = (0, testing_utils_1.mockCodeQLVersion)(testCase.codeQLVersion);
if (testCase.diffInformedQueriesEnvVar !== undefined) {
process.env.CODEQL_ACTION_DIFF_INFORMED_QUERIES =
testCase.diffInformedQueriesEnvVar.toString();
}
else {
delete process.env.CODEQL_ACTION_DIFF_INFORMED_QUERIES;
}
const features = new feature_flags_1.Features(testCase.gitHubVersion, (0, repository_1.parseRepositoryNwo)("github/example"), tmpDir, logger);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, {
[feature_flags_1.Feature.DiffInformedQueries]: testCase.featureEnabled,
});
const getGitHubVersionStub = sinon
.stub(apiClient, "getGitHubVersion")
.resolves(testCase.gitHubVersion);
const getPullRequestBranchesStub = sinon
.stub(actionsUtil, "getPullRequestBranches")
.returns(testCase.pullRequestBranches);
const result = await (0, diff_informed_analysis_utils_1.shouldPerformDiffInformedAnalysis)(codeql, features, logger);
t.is(result, expectedResult);
delete process.env.CODEQL_ACTION_DIFF_INFORMED_QUERIES;
getGitHubVersionStub.restore();
getPullRequestBranchesStub.restore();
});
},
title: (_, title) => `shouldPerformDiffInformedAnalysis: ${title}`,
});
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns true in the default test case", {}, true);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns false when feature flag is disabled from the API", {
featureEnabled: false,
}, false);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns false when CODEQL_ACTION_DIFF_INFORMED_QUERIES is set to false", {
featureEnabled: true,
diffInformedQueriesEnvVar: false,
}, false);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns true when CODEQL_ACTION_DIFF_INFORMED_QUERIES is set to true", {
featureEnabled: false,
diffInformedQueriesEnvVar: true,
}, true);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns false for CodeQL version 2.20.0", {
codeQLVersion: "2.20.0",
}, false);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns false for invalid GHES version", {
gitHubVersion: {
type: util_1.GitHubVariant.GHES,
version: "invalid-version",
},
}, false);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns false for GHES version 3.18.5", {
gitHubVersion: {
type: util_1.GitHubVariant.GHES,
version: "3.18.5",
},
}, false);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns true for GHES version 3.19.0", {
gitHubVersion: {
type: util_1.GitHubVariant.GHES,
version: "3.19.0",
},
}, true);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns false when not a pull request", {
pullRequestBranches: undefined,
}, false);
//# sourceMappingURL=diff-informed-analysis-utils.test.js.map

View File

File diff suppressed because one or more lines are too long

159
lib/feature-flags.js generated
View File

@@ -39,6 +39,7 @@ const path = __importStar(require("path"));
const semver = __importStar(require("semver"));
const api_client_1 = require("./api-client");
const defaults = __importStar(require("./defaults.json"));
const overlay_database_utils_1 = require("./overlay-database-utils");
const tools_features_1 = require("./tools-features");
const util = __importStar(require("./util"));
const DEFAULT_VERSION_FEATURE_FLAG_PREFIX = "default_codeql_version_";
@@ -61,11 +62,33 @@ var Feature;
Feature["CppBuildModeNone"] = "cpp_build_mode_none";
Feature["CppDependencyInstallation"] = "cpp_dependency_installation_enabled";
Feature["DiffInformedQueries"] = "diff_informed_queries";
Feature["DisableCombineSarifFiles"] = "disable_combine_sarif_files";
Feature["DisableCsharpBuildless"] = "disable_csharp_buildless";
Feature["DisableJavaBuildlessEnabled"] = "disable_java_buildless_enabled";
Feature["DisableKotlinAnalysisEnabled"] = "disable_kotlin_analysis_enabled";
Feature["ExportDiagnosticsEnabled"] = "export_diagnostics_enabled";
Feature["ExtractToToolcache"] = "extract_to_toolcache";
Feature["OverlayAnalysis"] = "overlay_analysis";
Feature["OverlayAnalysisActions"] = "overlay_analysis_actions";
Feature["OverlayAnalysisCodeScanningActions"] = "overlay_analysis_code_scanning_actions";
Feature["OverlayAnalysisCodeScanningCpp"] = "overlay_analysis_code_scanning_cpp";
Feature["OverlayAnalysisCodeScanningCsharp"] = "overlay_analysis_code_scanning_csharp";
Feature["OverlayAnalysisCodeScanningGo"] = "overlay_analysis_code_scanning_go";
Feature["OverlayAnalysisCodeScanningJava"] = "overlay_analysis_code_scanning_java";
Feature["OverlayAnalysisCodeScanningJavascript"] = "overlay_analysis_code_scanning_javascript";
Feature["OverlayAnalysisCodeScanningPython"] = "overlay_analysis_code_scanning_python";
Feature["OverlayAnalysisCodeScanningRuby"] = "overlay_analysis_code_scanning_ruby";
Feature["OverlayAnalysisCodeScanningRust"] = "overlay_analysis_code_scanning_rust";
Feature["OverlayAnalysisCodeScanningSwift"] = "overlay_analysis_code_scanning_swift";
Feature["OverlayAnalysisCpp"] = "overlay_analysis_cpp";
Feature["OverlayAnalysisCsharp"] = "overlay_analysis_csharp";
Feature["OverlayAnalysisGo"] = "overlay_analysis_go";
Feature["OverlayAnalysisJava"] = "overlay_analysis_java";
Feature["OverlayAnalysisJavascript"] = "overlay_analysis_javascript";
Feature["OverlayAnalysisPython"] = "overlay_analysis_python";
Feature["OverlayAnalysisRuby"] = "overlay_analysis_ruby";
Feature["OverlayAnalysisRust"] = "overlay_analysis_rust";
Feature["OverlayAnalysisSwift"] = "overlay_analysis_swift";
Feature["PythonDefaultIsToNotExtractStdlib"] = "python_default_is_to_not_extract_stdlib";
Feature["QaTelemetryEnabled"] = "qa_telemetry_enabled";
Feature["RustAnalysis"] = "rust_analysis";
@@ -94,10 +117,15 @@ exports.featureConfig = {
minimumVersion: "2.15.0",
},
[Feature.DiffInformedQueries]: {
defaultValue: false,
defaultValue: true,
envVar: "CODEQL_ACTION_DIFF_INFORMED_QUERIES",
minimumVersion: "2.21.0",
},
[Feature.DisableCombineSarifFiles]: {
defaultValue: false,
envVar: "CODEQL_ACTION_DISABLE_COMBINE_SARIF_FILES",
minimumVersion: undefined,
},
[Feature.DisableCsharpBuildless]: {
defaultValue: false,
envVar: "CODEQL_ACTION_DISABLE_CSHARP_BUILDLESS",
@@ -126,6 +154,111 @@ exports.featureConfig = {
envVar: "CODEQL_ACTION_EXTRACT_TOOLCACHE",
minimumVersion: undefined,
},
[Feature.OverlayAnalysis]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS",
minimumVersion: overlay_database_utils_1.CODEQL_OVERLAY_MINIMUM_VERSION,
},
[Feature.OverlayAnalysisActions]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_ACTIONS",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningActions]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_ACTIONS",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningCpp]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_CPP",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningCsharp]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_CSHARP",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningGo]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_GO",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningJava]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_JAVA",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningJavascript]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_JAVASCRIPT",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningPython]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_PYTHON",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningRuby]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_RUBY",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningRust]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_RUST",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningSwift]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_SWIFT",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCpp]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CPP",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCsharp]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CSHARP",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisGo]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_GO",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisJava]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_JAVA",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisJavascript]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_JAVASCRIPT",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisPython]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_PYTHON",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisRuby]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_RUBY",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisRust]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_RUST",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisSwift]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_SWIFT",
minimumVersion: undefined,
},
[Feature.PythonDefaultIsToNotExtractStdlib]: {
defaultValue: false,
envVar: "CODEQL_ACTION_DISABLE_PYTHON_STANDARD_LIBRARY_EXTRACTION",
@@ -357,14 +490,22 @@ class GitHubFeatureFlags {
try {
const featuresToRequest = Object.entries(exports.featureConfig)
.filter(([, config]) => !config.legacyApi)
.map(([f]) => f)
.join(",");
const response = await (0, api_client_1.getApiClient)().request("GET /repos/:owner/:repo/code-scanning/codeql-action/features", {
owner: this.repositoryNwo.owner,
repo: this.repositoryNwo.repo,
features: featuresToRequest,
});
const remoteFlags = response.data;
.map(([f]) => f);
const FEATURES_PER_REQUEST = 25;
const featureChunks = [];
while (featuresToRequest.length > 0) {
featureChunks.push(featuresToRequest.splice(0, FEATURES_PER_REQUEST));
}
let remoteFlags = {};
for (const chunk of featureChunks) {
const response = await (0, api_client_1.getApiClient)().request("GET /repos/:owner/:repo/code-scanning/codeql-action/features", {
owner: this.repositoryNwo.owner,
repo: this.repositoryNwo.repo,
features: chunk.join(","),
});
const chunkFlags = response.data;
remoteFlags = { ...remoteFlags, ...chunkFlags };
}
this.logger.debug("Loaded the following default values for the feature flags from the Code Scanning API:");
for (const [feature, value] of Object.entries(remoteFlags).sort(([nameA], [nameB]) => nameA.localeCompare(nameB))) {
this.logger.debug(` ${feature}: ${value}`);
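
The rewritten request loop above splits the feature names into batches so that a single request to the Code Scanning API never asks for more than 25 flags, then merges the responses. The batching itself is plain array chunking; a minimal generic sketch (the compiled code mutates the array with splice, this non-mutating version is equivalent for the purpose):

// Generic chunking as used above with FEATURES_PER_REQUEST = 25.
function chunk<T>(items: readonly T[], size: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Example: chunk(featuresToRequest, 25).map((batch) => batch.join(","))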

File diff suppressed because one or more lines are too long

View File

@@ -36,7 +36,6 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.initializeFeatures = initializeFeatures;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const ava_1 = __importDefault(require("ava"));
@@ -68,7 +67,7 @@ const testRepositoryNwo = (0, repository_1.parseRepositoryNwo)("github/example")
await (0, util_1.withTmpDir)(async (tmpDir) => {
const loggedMessages = [];
const features = setUpFeatureFlagTests(tmpDir, (0, testing_utils_1.getRecordingLogger)(loggedMessages), { type: util_1.GitHubVariant.GHE_DOTCOM });
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, initializeFeatures(true));
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, (0, testing_utils_1.initializeFeatures)(true));
for (const feature of Object.values(feature_flags_1.Feature)) {
// Ensure we have gotten a response value back from the Mock API
t.assert(await features.getValue(feature, includeCodeQlIfRequired(feature)));
@@ -103,6 +102,24 @@ const testRepositoryNwo = (0, repository_1.parseRepositoryNwo)("github/example")
assertAllFeaturesUndefinedInApi(t, loggedMessages);
});
});
(0, ava_1.default)("Include no more than 25 features in each API request", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
(0, testing_utils_1.stubFeatureFlagApiEndpoint)((request) => {
const requestedFeatures = request.features.split(",");
return {
status: requestedFeatures.length <= 25 ? 200 : 400,
messageIfError: "Can request a maximum of 25 features.",
data: {},
};
});
// We only need to call getValue once, and it does not matter which feature
// we ask for. Under the hood, the features library will request all features
// from the API.
const feature = Object.values(feature_flags_1.Feature)[0];
await t.notThrowsAsync(async () => features.getValue(feature, includeCodeQlIfRequired(feature)));
});
});
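
For reference, the handler passed to stubFeatureFlagApiEndpoint in this new test receives the requested feature list as a comma-separated string and returns a canned response. A sketch of that contract as the test uses it (the real types in ./testing-utils may differ):

// Shapes inferred from the test above; illustrative only.
type FeatureFlagApiRequest = { features: string };
type FeatureFlagApiResponse = {
  status: number;
  messageIfError?: string;
  data: Record<string, boolean>;
};

const handler = (request: FeatureFlagApiRequest): FeatureFlagApiResponse => ({
  status: request.features.split(",").length <= 25 ? 200 : 400,
  messageIfError: "Can request a maximum of 25 features.",
  data: {},
});
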
(0, ava_1.default)("Feature flags exception is propagated if the API request errors", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
@@ -135,7 +152,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)(`Only feature '${feature}' is enabled if the associated environment variable is true. Others disabled.`, async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(false);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(false);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
// feature should be disabled initially
t.assert(!(await features.getValue(feature, includeCodeQlIfRequired(feature))));
@@ -147,7 +164,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)(`Feature '${feature}' is disabled if the associated environment variable is false, even if enabled in API`, async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
// feature should be enabled initially
t.assert(await features.getValue(feature, includeCodeQlIfRequired(feature)));
@@ -161,7 +178,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)(`Getting feature '${feature} should throw if no codeql is provided`, async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
await t.throwsAsync(async () => features.getValue(feature), {
message: `Internal error: A ${feature_flags_1.featureConfig[feature].minimumVersion !== undefined
@@ -175,7 +192,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)(`Feature '${feature}' is disabled if the minimum CLI version is below ${feature_flags_1.featureConfig[feature].minimumVersion}`, async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
// feature should be disabled when an old CLI version is set
let codeql = (0, testing_utils_1.mockCodeQLVersion)("2.0.0");
@@ -199,7 +216,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)(`Feature '${feature}' is disabled if the required tools feature is not enabled`, async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
// feature should be disabled when the required tools feature is not enabled
let codeql = (0, testing_utils_1.mockCodeQLVersion)("2.0.0");
@@ -225,7 +242,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)("Feature flags are saved to disk", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
const cachedFeatureFlags = path.join(tmpDir, feature_flags_1.FEATURE_FLAGS_FILE_NAME);
t.false(fs.existsSync(cachedFeatureFlags), "Feature flag cached file should not exist before getting feature flags");
@@ -244,7 +261,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)("Environment variable can override feature flag cache", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
const cachedFeatureFlags = path.join(tmpDir, feature_flags_1.FEATURE_FLAGS_FILE_NAME);
t.true(await features.getValue(feature_flags_1.Feature.QaTelemetryEnabled, includeCodeQlIfRequired(feature_flags_1.Feature.QaTelemetryEnabled)), "Feature flag should be enabled initially");
@@ -266,7 +283,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)("selects CLI v2.20.1 on Dotcom when feature flags enable v2.20.0 and v2.20.1", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
expectedFeatureEnablement["default_codeql_version_2_20_0_enabled"] = true;
expectedFeatureEnablement["default_codeql_version_2_20_1_enabled"] = true;
expectedFeatureEnablement["default_codeql_version_2_20_2_enabled"] = false;
@@ -285,7 +302,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)("includes tag name", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
expectedFeatureEnablement["default_codeql_version_2_20_0_enabled"] = true;
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
const defaultCliVersion = await features.getDefaultCliVersion(util_1.GitHubVariant.DOTCOM);
@@ -299,7 +316,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)(`selects CLI from defaults.json on Dotcom when no default version feature flags are enabled`, async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
const defaultCliVersion = await features.getDefaultCliVersion(util_1.GitHubVariant.DOTCOM);
t.deepEqual(defaultCliVersion, {
@@ -313,7 +330,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const loggedMessages = [];
const features = setUpFeatureFlagTests(tmpDir, (0, testing_utils_1.getRecordingLogger)(loggedMessages));
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
expectedFeatureEnablement["default_codeql_version_2_20_0_enabled"] = true;
expectedFeatureEnablement["default_codeql_version_2_20_1_enabled"] = true;
expectedFeatureEnablement["default_codeql_version_2_20_invalid_enabled"] =
@@ -358,12 +375,6 @@ function assertAllFeaturesUndefinedInApi(t, loggedMessages) {
v.message.includes("undefined in API response")) !== undefined);
}
}
function initializeFeatures(initialValue) {
return Object.keys(feature_flags_1.featureConfig).reduce((features, key) => {
features[key] = initialValue;
return features;
}, {});
}
function setUpFeatureFlagTests(tmpDir, logger = (0, logging_1.getRunnerLogger)(true), gitHubVersion = { type: util_1.GitHubVariant.DOTCOM }) {
(0, testing_utils_1.setupActionsVars)(tmpDir, tmpDir);
return new feature_flags_1.Features(gitHubVersion, testRepositoryNwo, tmpDir, logger);

File diff suppressed because one or more lines are too long

View File

@@ -87,7 +87,7 @@ async function maybeUploadFailedSarif(config, repositoryNwo, features, logger) {
await codeql.databaseExportDiagnostics(databasePath, sarifFile, category);
}
logger.info(`Uploading failed SARIF file ${sarifFile}`);
const uploadResult = await uploadLib.uploadFiles(sarifFile, checkoutPath, category, features, logger);
const uploadResult = await uploadLib.uploadFiles(sarifFile, checkoutPath, category, features, logger, uploadLib.CodeScanningTarget);
await uploadLib.waitForProcessing(repositoryNwo, uploadResult.sarifID, logger, { isUnsuccessfulExecution: true });
return uploadResult
? { ...uploadResult.statusReport, sarifID: uploadResult.sarifID }

File diff suppressed because one or more lines are too long

44
lib/init-action.js generated
View File

@@ -57,7 +57,7 @@ const status_report_1 = require("./status-report");
const tools_features_1 = require("./tools-features");
const util_1 = require("./util");
const workflow_1 = require("./workflow");
async function sendCompletedStatusReport(startedAt, config, configFile, toolsDownloadStatusReport, toolsFeatureFlagsValid, toolsSource, toolsVersion, logger, error) {
async function sendCompletedStatusReport(startedAt, config, configFile, toolsDownloadStatusReport, toolsFeatureFlagsValid, toolsSource, toolsVersion, overlayBaseDatabaseStats, logger, error) {
const statusReportBase = await (0, status_report_1.createStatusReportBase)(status_report_1.ActionName.Init, (0, status_report_1.getActionsStatus)(error), startedAt, config, await (0, util_1.checkDiskUsage)(logger), logger, error?.message, error?.stack);
if (statusReportBase === undefined) {
return;
@@ -126,6 +126,8 @@ async function sendCompletedStatusReport(startedAt, config, configFile, toolsDow
trap_cache_languages: Object.keys(config.trapCaches).join(","),
trap_cache_download_size_bytes: Math.round(await (0, caching_utils_1.getTotalCacheSize)(Object.values(config.trapCaches), logger)),
trap_cache_download_duration_ms: Math.round(config.trapCacheDownloadTime),
overlay_base_database_download_size_bytes: overlayBaseDatabaseStats?.databaseSizeBytes,
overlay_base_database_download_duration_ms: overlayBaseDatabaseStats?.databaseDownloadDurationMs,
query_filters: JSON.stringify(config.originalUserInput["query-filters"] ?? []),
registries: JSON.stringify(configUtils.parseRegistriesWithoutCredentials((0, actions_util_1.getOptionalInput)("registries")) ?? []),
};
@@ -167,6 +169,10 @@ async function run() {
core.exportVariable(environment_1.EnvVar.JOB_RUN_UUID, jobRunUuid);
core.exportVariable(environment_1.EnvVar.INIT_ACTION_HAS_RUN, "true");
const configFile = (0, actions_util_1.getOptionalInput)("config-file");
// path.resolve() respects the intended semantics of source-root. If
// source-root is relative, it is relative to the GITHUB_WORKSPACE. If
// source-root is absolute, it is used as given.
const sourceRoot = path.resolve((0, util_1.getRequiredEnvParam)("GITHUB_WORKSPACE"), (0, actions_util_1.getOptionalInput)("source-root") || "");
try {
const statusReportBase = await (0, status_report_1.createStatusReportBase)(status_report_1.ActionName.Init, "starting", startedAt, config, await (0, util_1.checkDiskUsage)(logger), logger);
if (statusReportBase !== undefined) {
@@ -211,6 +217,7 @@ async function run() {
tempDir: (0, actions_util_1.getTemporaryDirectory)(),
codeql,
workspacePath: (0, util_1.getRequiredEnvParam)("GITHUB_WORKSPACE"),
sourceRoot,
githubVersion: gitHubVersion,
apiDetails,
features,
@@ -227,11 +234,32 @@ async function run() {
}
return;
}
let overlayBaseDatabaseStats;
try {
const sourceRoot = path.resolve((0, util_1.getRequiredEnvParam)("GITHUB_WORKSPACE"), (0, actions_util_1.getOptionalInput)("source-root") || "");
const overlayDatabaseMode = await (0, init_1.getOverlayDatabaseMode)((await codeql.getVersion()).version, config, sourceRoot, logger);
logger.info(`Using overlay database mode: ${overlayDatabaseMode}`);
if (overlayDatabaseMode !== overlay_database_utils_1.OverlayDatabaseMode.Overlay) {
if (config.augmentationProperties.overlayDatabaseMode ===
overlay_database_utils_1.OverlayDatabaseMode.Overlay &&
config.augmentationProperties.useOverlayDatabaseCaching) {
// OverlayDatabaseMode.Overlay comes in two flavors: with database
// caching, or without. The flavor with database caching is intended to be
// an "automatic control" mode, which is supposed to be fail-safe. If we
// cannot download an overlay-base database, we revert to
// OverlayDatabaseMode.None so that the workflow can continue to run.
//
// The flavor without database caching is intended to be a "manual
// control" mode, where the workflow is supposed to make all the
// necessary preparations. So, in that mode, we would assume that
// everything is in order and let the analysis fail if that turns out not
// to be the case.
overlayBaseDatabaseStats = await (0, overlay_database_utils_1.downloadOverlayBaseDatabaseFromCache)(codeql, config, logger);
if (!overlayBaseDatabaseStats) {
config.augmentationProperties.overlayDatabaseMode =
overlay_database_utils_1.OverlayDatabaseMode.None;
logger.info("No overlay-base database found in cache, " +
`reverting overlay database mode to ${overlay_database_utils_1.OverlayDatabaseMode.None}.`);
}
}
if (config.augmentationProperties.overlayDatabaseMode !==
overlay_database_utils_1.OverlayDatabaseMode.Overlay) {
(0, init_1.cleanupDatabaseClusterDirectory)(config, logger);
}
if (zstdAvailability) {
@@ -402,7 +430,7 @@ async function run() {
core.exportVariable("CODEQL_EXTRACTOR_PYTHON_EXTRACT_STDLIB", "true");
}
}
const tracerConfig = await (0, init_1.runInit)(codeql, config, sourceRoot, "Runner.Worker.exe", (0, actions_util_1.getOptionalInput)("registries"), apiDetails, overlayDatabaseMode, logger);
const tracerConfig = await (0, init_1.runInit)(codeql, config, sourceRoot, "Runner.Worker.exe", (0, actions_util_1.getOptionalInput)("registries"), apiDetails, logger);
if (tracerConfig !== undefined) {
for (const [key, value] of Object.entries(tracerConfig.env)) {
core.exportVariable(key, value);
@@ -418,13 +446,13 @@ async function run() {
const error = (0, util_1.wrapError)(unwrappedError);
core.setFailed(error.message);
await sendCompletedStatusReport(startedAt, config, undefined, // We only report config info on success.
toolsDownloadStatusReport, toolsFeatureFlagsValid, toolsSource, toolsVersion, logger, error);
toolsDownloadStatusReport, toolsFeatureFlagsValid, toolsSource, toolsVersion, overlayBaseDatabaseStats, logger, error);
return;
}
finally {
(0, diagnostics_1.logUnwrittenDiagnostics)();
}
await sendCompletedStatusReport(startedAt, config, configFile, toolsDownloadStatusReport, toolsFeatureFlagsValid, toolsSource, toolsVersion, logger);
await sendCompletedStatusReport(startedAt, config, configFile, toolsDownloadStatusReport, toolsFeatureFlagsValid, toolsSource, toolsVersion, overlayBaseDatabaseStats, logger);
}
function getTrapCachingEnabled() {
// If the workflow specified something always respect that
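
The overlay-handling block added earlier in this init-action.js diff implements the fail-safe its comment describes: when Overlay mode with database caching is requested but no overlay-base database can be restored from the Actions cache, the run degrades to a normal full analysis. Condensed into a standalone sketch (real exports from ./overlay-database-utils, simplified control flow and loose types for illustration):

import {
  downloadOverlayBaseDatabaseFromCache,
  OverlayDatabaseMode,
} from "./overlay-database-utils";

// Simplified: codeql/config/logger are typed loosely here for illustration.
async function restoreOverlayBaseOrFallBack(codeql: any, config: any, logger: any) {
  const props = config.augmentationProperties;
  if (
    props.overlayDatabaseMode !== OverlayDatabaseMode.Overlay ||
    !props.useOverlayDatabaseCaching
  ) {
    return undefined; // "manual control" flavor or no overlay: nothing to restore
  }
  const stats = await downloadOverlayBaseDatabaseFromCache(codeql, config, logger);
  if (!stats) {
    // Fail-safe: no usable overlay-base database, revert to a full analysis.
    props.overlayDatabaseMode = OverlayDatabaseMode.None;
    logger.info("No overlay-base database found in cache, reverting to a full analysis.");
  }
  return stats;
}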

File diff suppressed because one or more lines are too long

34
lib/init.js generated
View File

@@ -35,7 +35,6 @@ var __importStar = (this && this.__importStar) || (function () {
Object.defineProperty(exports, "__esModule", { value: true });
exports.initCodeQL = initCodeQL;
exports.initConfig = initConfig;
exports.getOverlayDatabaseMode = getOverlayDatabaseMode;
exports.runInit = runInit;
exports.checkInstallPython311 = checkInstallPython311;
exports.cleanupDatabaseClusterDirectory = cleanupDatabaseClusterDirectory;
@@ -43,14 +42,11 @@ const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const toolrunner = __importStar(require("@actions/exec/lib/toolrunner"));
const io = __importStar(require("@actions/io"));
const semver = __importStar(require("semver"));
const actions_util_1 = require("./actions-util");
const codeql_1 = require("./codeql");
const configUtils = __importStar(require("./config-utils"));
const git_utils_1 = require("./git-utils");
const languages_1 = require("./languages");
const logging_1 = require("./logging");
const overlay_database_utils_1 = require("./overlay-database-utils");
const tracer_config_1 = require("./tracer-config");
const util = __importStar(require("./util"));
async function initCodeQL(toolsInput, apiDetails, tempDir, variant, defaultCliVersion, features, logger) {
@@ -71,33 +67,7 @@ async function initConfig(inputs) {
return await configUtils.initConfig(inputs);
});
}
async function getOverlayDatabaseMode(codeqlVersion, config, sourceRoot, logger) {
const overlayDatabaseMode = process.env.CODEQL_OVERLAY_DATABASE_MODE;
if (overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.Overlay ||
overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.OverlayBase) {
if (config.buildMode !== util.BuildMode.None) {
logger.warning(`Cannot build an ${overlayDatabaseMode} database because ` +
`build-mode is set to "${config.buildMode}" instead of "none". ` +
"Falling back to creating a normal full database instead.");
return overlay_database_utils_1.OverlayDatabaseMode.None;
}
if (semver.lt(codeqlVersion, overlay_database_utils_1.CODEQL_OVERLAY_MINIMUM_VERSION)) {
logger.warning(`Cannot build an ${overlayDatabaseMode} database because ` +
`the CodeQL CLI is older than ${overlay_database_utils_1.CODEQL_OVERLAY_MINIMUM_VERSION}. ` +
"Falling back to creating a normal full database instead.");
return overlay_database_utils_1.OverlayDatabaseMode.None;
}
if ((await (0, git_utils_1.getGitRoot)(sourceRoot)) === undefined) {
logger.warning(`Cannot build an ${overlayDatabaseMode} database because ` +
`the source root "${sourceRoot}" is not inside a git repository. ` +
"Falling back to creating a normal full database instead.");
return overlay_database_utils_1.OverlayDatabaseMode.None;
}
return overlayDatabaseMode;
}
return overlay_database_utils_1.OverlayDatabaseMode.None;
}
async function runInit(codeql, config, sourceRoot, processName, registriesInput, apiDetails, overlayDatabaseMode, logger) {
async function runInit(codeql, config, sourceRoot, processName, registriesInput, apiDetails, logger) {
fs.mkdirSync(config.dbLocation, { recursive: true });
const { registriesAuthTokens, qlconfigFile } = await configUtils.generateRegistries(registriesInput, config.tempDir, logger);
await configUtils.wrapEnvironment({
@@ -105,7 +75,7 @@ async function runInit(codeql, config, sourceRoot, processName, registriesInput,
CODEQL_REGISTRIES_AUTH: registriesAuthTokens,
},
// Init a database cluster
async () => await codeql.databaseInitCluster(config, sourceRoot, processName, qlconfigFile, overlayDatabaseMode, logger));
async () => await codeql.databaseInitCluster(config, sourceRoot, processName, qlconfigFile, logger));
return await (0, tracer_config_1.getCombinedTracerConfig)(codeql, config);
}
/**

View File

File diff suppressed because one or more lines are too long

View File

@@ -36,10 +36,15 @@ Object.defineProperty(exports, "__esModule", { value: true });
exports.CODEQL_OVERLAY_MINIMUM_VERSION = exports.OverlayDatabaseMode = void 0;
exports.writeBaseDatabaseOidsFile = writeBaseDatabaseOidsFile;
exports.writeOverlayChangesFile = writeOverlayChangesFile;
exports.checkOverlayBaseDatabase = checkOverlayBaseDatabase;
exports.uploadOverlayBaseDatabaseToCache = uploadOverlayBaseDatabaseToCache;
exports.downloadOverlayBaseDatabaseFromCache = downloadOverlayBaseDatabaseFromCache;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const actionsCache = __importStar(require("@actions/cache"));
const actions_util_1 = require("./actions-util");
const git_utils_1 = require("./git-utils");
const util_1 = require("./util");
var OverlayDatabaseMode;
(function (OverlayDatabaseMode) {
OverlayDatabaseMode["Overlay"] = "overlay";
@@ -126,4 +131,165 @@ function computeChangedFiles(baseFileOids, overlayFileOids) {
}
return changes;
}
// Constants for database caching
const CACHE_VERSION = 1;
const CACHE_PREFIX = "codeql-overlay-base-database";
const MAX_CACHE_OPERATION_MS = 120_000; // Two minutes
/**
* Checks that the overlay-base database is valid by checking for the
* existence of the base database OIDs file.
*
* @param config The configuration object
* @param logger The logger instance
* @param warningPrefix Prefix for the check failure warning message
* @returns True if the verification succeeded, false otherwise
*/
function checkOverlayBaseDatabase(config, logger, warningPrefix) {
// An overlay-base database should contain the base database OIDs file.
const baseDatabaseOidsFilePath = getBaseDatabaseOidsFilePath(config);
if (!fs.existsSync(baseDatabaseOidsFilePath)) {
logger.warning(`${warningPrefix}: ${baseDatabaseOidsFilePath} does not exist`);
return false;
}
return true;
}
/**
* Uploads the overlay-base database to the GitHub Actions cache. If conditions
* for uploading are not met, the function does nothing and returns false.
*
* This function uses the `checkout_path` input to determine the repository path
* and works only when called from `analyze` or `upload-sarif`.
*
* @param codeql The CodeQL instance
* @param config The configuration object
* @param logger The logger instance
* @returns A promise that resolves to true if the upload was performed and
* successfully completed, or false otherwise
*/
async function uploadOverlayBaseDatabaseToCache(codeql, config, logger) {
const overlayDatabaseMode = config.augmentationProperties.overlayDatabaseMode;
if (overlayDatabaseMode !== OverlayDatabaseMode.OverlayBase) {
logger.debug(`Overlay database mode is ${overlayDatabaseMode}. ` +
"Skip uploading overlay-base database to cache.");
return false;
}
if (!config.augmentationProperties.useOverlayDatabaseCaching) {
logger.debug("Overlay database caching is disabled. " +
"Skip uploading overlay-base database to cache.");
return false;
}
if ((0, util_1.isInTestMode)()) {
logger.debug("In test mode. Skip uploading overlay-base database to cache.");
return false;
}
const databaseIsValid = checkOverlayBaseDatabase(config, logger, "Abort uploading overlay-base database to cache");
if (!databaseIsValid) {
return false;
}
const dbLocation = config.dbLocation;
const codeQlVersion = (await codeql.getVersion()).version;
const checkoutPath = (0, actions_util_1.getRequiredInput)("checkout_path");
const cacheKey = await generateCacheKey(config, codeQlVersion, checkoutPath);
logger.info(`Uploading overlay-base database to Actions cache with key ${cacheKey}`);
try {
const cacheId = await (0, util_1.withTimeout)(MAX_CACHE_OPERATION_MS, actionsCache.saveCache([dbLocation], cacheKey), () => { });
if (cacheId === undefined) {
logger.warning("Timed out while uploading overlay-base database");
return false;
}
}
catch (error) {
logger.warning("Failed to upload overlay-base database to cache: " +
`${error instanceof Error ? error.message : String(error)}`);
return false;
}
logger.info(`Successfully uploaded overlay-base database from ${dbLocation}`);
return true;
}
/**
* Downloads the overlay-base database from the GitHub Actions cache. If conditions
* for downloading are not met, the function does nothing and returns false.
*
* @param codeql The CodeQL instance
* @param config The configuration object
* @param logger The logger instance
* @returns A promise that resolves to download statistics if an overlay-base
* database was successfully downloaded, or undefined if the download was
* either not performed or failed.
*/
async function downloadOverlayBaseDatabaseFromCache(codeql, config, logger) {
const overlayDatabaseMode = config.augmentationProperties.overlayDatabaseMode;
if (overlayDatabaseMode !== OverlayDatabaseMode.Overlay) {
logger.debug(`Overlay database mode is ${overlayDatabaseMode}. ` +
"Skip downloading overlay-base database from cache.");
return undefined;
}
if (!config.augmentationProperties.useOverlayDatabaseCaching) {
logger.debug("Overlay database caching is disabled. " +
"Skip downloading overlay-base database from cache.");
return undefined;
}
if ((0, util_1.isInTestMode)()) {
logger.debug("In test mode. Skip downloading overlay-base database from cache.");
return undefined;
}
const dbLocation = config.dbLocation;
const codeQlVersion = (await codeql.getVersion()).version;
const restoreKey = getCacheRestoreKey(config, codeQlVersion);
logger.info(`Looking in Actions cache for overlay-base database with restore key ${restoreKey}`);
let databaseDownloadDurationMs = 0;
try {
const databaseDownloadStart = performance.now();
const foundKey = await (0, util_1.withTimeout)(MAX_CACHE_OPERATION_MS, actionsCache.restoreCache([dbLocation], restoreKey), () => {
logger.info("Timed out downloading overlay-base database from cache");
});
databaseDownloadDurationMs = Math.round(performance.now() - databaseDownloadStart);
if (foundKey === undefined) {
logger.info("No overlay-base database found in Actions cache");
return undefined;
}
logger.info(`Downloaded overlay-base database in cache with key ${foundKey}`);
}
catch (error) {
logger.warning("Failed to download overlay-base database from cache: " +
`${error instanceof Error ? error.message : String(error)}`);
return undefined;
}
const databaseIsValid = checkOverlayBaseDatabase(config, logger, "Downloaded overlay-base database is invalid");
if (!databaseIsValid) {
logger.warning("Downloaded overlay-base database failed validation");
return undefined;
}
const databaseSizeBytes = await (0, util_1.tryGetFolderBytes)(dbLocation, logger);
if (databaseSizeBytes === undefined) {
logger.info("Filesystem error while accessing downloaded overlay-base database");
// The problem that warrants reporting download failure is not that we are
// unable to determine the size of the database. Rather, it is that we
// encountered a filesystem error while accessing the database, which
// indicates that an overlay analysis will likely fail.
return undefined;
}
logger.info(`Successfully downloaded overlay-base database to ${dbLocation}`);
return {
databaseSizeBytes: Math.round(databaseSizeBytes),
databaseDownloadDurationMs,
};
}
async function generateCacheKey(config, codeQlVersion, checkoutPath) {
const sha = await (0, git_utils_1.getCommitOid)(checkoutPath);
return `${getCacheRestoreKey(config, codeQlVersion)}${sha}`;
}
function getCacheRestoreKey(config, codeQlVersion) {
// The restore key (prefix) specifies which cached overlay-base databases are
// compatible with the current analysis: the cached database must have the
// same cache version and the same CodeQL bundle version.
//
// Actions cache supports using multiple restore keys to indicate preference.
// Technically we prefer a cached overlay-base database with the same SHA as
// we are analyzing. However, since overlay-base databases are built from the
// default branch and used in PR analysis, it is exceedingly unlikely that
// the commit SHA will ever be the same, so we can just leave it out.
const languages = [...config.languages].sort().join("_");
return `${CACHE_PREFIX}-${CACHE_VERSION}-${languages}-${codeQlVersion}-`;
}
//# sourceMappingURL=overlay-database-utils.js.map
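The two helpers above define the cache key scheme: `getCacheRestoreKey` builds a prefix from the cache version, the sorted language list, and the CodeQL bundle version, and `generateCacheKey` appends the commit SHA of the checkout to form the full save key. A condensed sketch of that composition follows; the `CACHE_PREFIX` and `CACHE_VERSION` values are placeholders rather than the module's real constants.
const CACHE_PREFIX = "codeql-overlay-base-database"; // placeholder value
const CACHE_VERSION = 1; // placeholder value

function restoreKey(languages: string[], codeQlVersion: string): string {
  // Sorting makes the key independent of the order in which languages were configured.
  const langs = [...languages].sort().join("_");
  return `${CACHE_PREFIX}-${CACHE_VERSION}-${langs}-${codeQlVersion}-`;
}

function saveKey(languages: string[], codeQlVersion: string, commitSha: string): string {
  // The save key is the restore key plus the commit SHA; restores match on the prefix
  // alone, since a PR run will almost never share a SHA with the default branch.
  return `${restoreKey(languages, codeQlVersion)}${commitSha}`;
}

// saveKey(["javascript", "java"], "2.20.5", "abc123")
//   => "codeql-overlay-base-database-1-java_javascript-2.20.5-abc123"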

File diff suppressed because one or more lines are too long


@@ -38,6 +38,7 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
Object.defineProperty(exports, "__esModule", { value: true });
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const actionsCache = __importStar(require("@actions/cache"));
const ava_1 = __importDefault(require("ava"));
const sinon = __importStar(require("sinon"));
const actionsUtil = __importStar(require("./actions-util"));
@@ -45,6 +46,7 @@ const gitUtils = __importStar(require("./git-utils"));
const logging_1 = require("./logging");
const overlay_database_utils_1 = require("./overlay-database-utils");
const testing_utils_1 = require("./testing-utils");
const utils = __importStar(require("./util"));
const util_1 = require("./util");
(0, testing_utils_1.setupTests)(ava_1.default);
(0, ava_1.default)("writeOverlayChangesFile generates correct changes file", async (t) => {
@@ -91,4 +93,93 @@ const util_1 = require("./util");
t.deepEqual(parsedContent.changes.sort(), ["added.js", "deleted.js", "modified.js"], "Should identify added, deleted, and modified files");
});
});
const defaultDownloadTestCase = {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: true,
isInTestMode: false,
restoreCacheResult: "cache-key",
hasBaseDatabaseOidsFile: true,
tryGetFolderBytesSucceeds: true,
codeQLVersion: "2.20.5",
};
const testDownloadOverlayBaseDatabaseFromCache = ava_1.default.macro({
exec: async (t, _title, partialTestCase, expectDownloadSuccess) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const dbLocation = path.join(tmpDir, "db");
await fs.promises.mkdir(dbLocation, { recursive: true });
const logger = (0, logging_1.getRunnerLogger)(true);
const config = (0, testing_utils_1.createTestConfig)({ dbLocation });
const testCase = { ...defaultDownloadTestCase, ...partialTestCase };
config.augmentationProperties.overlayDatabaseMode =
testCase.overlayDatabaseMode;
config.augmentationProperties.useOverlayDatabaseCaching =
testCase.useOverlayDatabaseCaching;
if (testCase.hasBaseDatabaseOidsFile) {
const baseDatabaseOidsFile = path.join(dbLocation, "base-database-oids.json");
await fs.promises.writeFile(baseDatabaseOidsFile, JSON.stringify({}));
}
const stubs = [];
const isInTestModeStub = sinon
.stub(utils, "isInTestMode")
.returns(testCase.isInTestMode);
stubs.push(isInTestModeStub);
if (testCase.restoreCacheResult instanceof Error) {
const restoreCacheStub = sinon
.stub(actionsCache, "restoreCache")
.rejects(testCase.restoreCacheResult);
stubs.push(restoreCacheStub);
}
else {
const restoreCacheStub = sinon
.stub(actionsCache, "restoreCache")
.resolves(testCase.restoreCacheResult);
stubs.push(restoreCacheStub);
}
const tryGetFolderBytesStub = sinon
.stub(utils, "tryGetFolderBytes")
.resolves(testCase.tryGetFolderBytesSucceeds ? 1024 * 1024 : undefined);
stubs.push(tryGetFolderBytesStub);
try {
const result = await (0, overlay_database_utils_1.downloadOverlayBaseDatabaseFromCache)((0, testing_utils_1.mockCodeQLVersion)(testCase.codeQLVersion), config, logger);
if (expectDownloadSuccess) {
t.truthy(result);
}
else {
t.is(result, undefined);
}
}
finally {
for (const stub of stubs) {
stub.restore();
}
}
});
},
title: (_, title) => `downloadOverlayBaseDatabaseFromCache: ${title}`,
});
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns stats when successful", {}, true);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when mode is OverlayDatabaseMode.OverlayBase", {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.OverlayBase,
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when mode is OverlayDatabaseMode.None", {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when caching is disabled", {
useOverlayDatabaseCaching: false,
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined in test mode", {
isInTestMode: true,
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when cache miss", {
restoreCacheResult: undefined,
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when download fails", {
restoreCacheResult: new Error("Download failed"),
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when downloaded database is invalid", {
hasBaseDatabaseOidsFile: false,
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when filesystem error occurs", {
tryGetFolderBytesSucceeds: false,
}, false);
//# sourceMappingURL=overlay-database-utils.test.js.map
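These tests use AVA's object-form macro to drive a table of cases: each declaration supplies only the fields that differ from `defaultDownloadTestCase`, and the macro merges them over the defaults before stubbing the cache and filesystem helpers. A reduced sketch of the same pattern, using a hypothetical `double` function as the code under test:
import test from "ava";

interface TestCase {
  input: number;
  expected: number;
}

const defaults: TestCase = { input: 2, expected: 4 };

// Hypothetical function under test.
const double = (n: number): number => n * 2;

const macro = test.macro({
  exec: async (t, _title: string, partial: Partial<TestCase>) => {
    const testCase = { ...defaults, ...partial };
    t.is(double(testCase.input), testCase.expected);
  },
  title: (_, title) => `double: ${title}`,
});

test(macro, "defaults pass", {});
test(macro, "overrides are merged over defaults", { input: 5, expected: 10 });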

File diff suppressed because one or more lines are too long


@@ -40,14 +40,13 @@ const path = __importStar(require("path"));
const ava_1 = __importDefault(require("ava"));
const sinon = __importStar(require("sinon"));
const actionsUtil = __importStar(require("./actions-util"));
const feature_flags_test_1 = require("./feature-flags.test");
const logging_1 = require("./logging");
const setupCodeql = __importStar(require("./setup-codeql"));
const testing_utils_1 = require("./testing-utils");
const util_1 = require("./util");
(0, testing_utils_1.setupTests)(ava_1.default);
// TODO: Remove when we no longer need to pass in features (https://github.com/github/codeql-action/issues/2600)
const expectedFeatureEnablement = (0, feature_flags_test_1.initializeFeatures)(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
expectedFeatureEnablement.getValue = function (feature) {
// eslint-disable-next-line @typescript-eslint/no-unsafe-return
return expectedFeatureEnablement[feature];

File diff suppressed because one or more lines are too long

21
lib/start-proxy.js generated

@@ -50,6 +50,14 @@ const LANGUAGE_TO_REGISTRY_TYPE = {
cpp: "",
swift: "",
};
/**
* Checks that `value` is neither `undefined` nor `null`.
* @param value The value to test.
* @returns Narrows the type of `value` to exclude `undefined` and `null`.
*/
function isDefined(value) {
return value !== undefined && value !== null;
}
// getCredentials returns registry credentials from action inputs.
// It prefers `registries_credentials` over `registry_secrets`.
// If neither is set, it returns an empty array.
@@ -81,16 +89,23 @@ function getCredentials(logger, registrySecrets, registriesCredentials, language
logger.error("Failed to parse the credentials data.");
throw new util_1.ConfigurationError("Invalid credentials format.");
}
// Check that the parsed data is indeed an array.
if (!Array.isArray(parsed)) {
throw new util_1.ConfigurationError("Expected credentials data to be an array of configurations, but it is not.");
}
const out = [];
for (const e of parsed) {
if (e === null || typeof e !== "object") {
throw new util_1.ConfigurationError("Invalid credentials - must be an object");
}
// Mask credentials to reduce chance of accidental leakage in logs.
if (e.password !== undefined) {
if (isDefined(e.password)) {
core.setSecret(e.password);
}
if (e.token !== undefined) {
if (isDefined(e.token)) {
core.setSecret(e.token);
}
if (e.url === undefined && e.host === undefined) {
if (!isDefined(e.url) && !isDefined(e.host)) {
// The proxy needs one of these to work. If both are defined, the url has the precedence.
throw new util_1.ConfigurationError("Invalid credentials - must specify host or url");
}
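In source form, `isDefined` is a TypeScript type guard, so switching these checks from `!== undefined` to `isDefined(...)` also rejects explicit `null` values, which the updated tests further down exercise with credentials such as `{ host: null }`. A minimal sketch of the guard and the narrowing it provides, using a simplified `Credential` shape:
interface Credential {
  host?: string | null;
  url?: string | null;
  token?: string | null;
}

// Narrows `T | undefined | null` to `T` when it returns true.
function isDefined<T>(value: T | undefined | null): value is T {
  return value !== undefined && value !== null;
}

function requireHostOrUrl(credential: Credential): string {
  if (isDefined(credential.url)) {
    return credential.url; // narrowed to string
  }
  if (isDefined(credential.host)) {
    return credential.host; // narrowed to string
  }
  // Mirrors the ConfigurationError thrown above when both fields are missing or null.
  throw new Error("Invalid credentials - must specify host or url");
}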


@@ -1 +1 @@
{"version":3,"file":"start-proxy.js","sourceRoot":"","sources":["../src/start-proxy.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAgCA,wCAmFC;AAnHD,oDAAsC;AAEtC,2CAAsD;AAEtD,iCAA4C;AAW5C,MAAM,yBAAyB,GAA6B;IAC1D,IAAI,EAAE,kBAAkB;IACxB,MAAM,EAAE,YAAY;IACpB,UAAU,EAAE,cAAc;IAC1B,MAAM,EAAE,cAAc;IACtB,IAAI,EAAE,iBAAiB;IACvB,IAAI,EAAE,gBAAgB;IACtB,EAAE,EAAE,gBAAgB;IACpB,oFAAoF;IACpF,OAAO,EAAE,EAAE;IACX,GAAG,EAAE,EAAE;IACP,KAAK,EAAE,EAAE;CACD,CAAC;AAEX,kEAAkE;AAClE,+DAA+D;AAC/D,gDAAgD;AAChD,SAAgB,cAAc,CAC5B,MAAc,EACd,eAAmC,EACnC,qBAAyC,EACzC,cAAkC;IAElC,MAAM,QAAQ,GAAG,cAAc,CAAC,CAAC,CAAC,IAAA,yBAAa,EAAC,cAAc,CAAC,CAAC,CAAC,CAAC,SAAS,CAAC;IAC5E,MAAM,uBAAuB,GAAG,QAAQ;QACtC,CAAC,CAAC,yBAAyB,CAAC,QAAQ,CAAC;QACrC,CAAC,CAAC,SAAS,CAAC;IAEd,IAAI,cAAsB,CAAC;IAC3B,IAAI,qBAAqB,KAAK,SAAS,EAAE,CAAC;QACxC,MAAM,CAAC,IAAI,CAAC,qCAAqC,CAAC,CAAC;QACnD,cAAc,GAAG,MAAM,CAAC,IAAI,CAAC,qBAAqB,EAAE,QAAQ,CAAC,CAAC,QAAQ,EAAE,CAAC;IAC3E,CAAC;SAAM,IAAI,eAAe,KAAK,SAAS,EAAE,CAAC;QACzC,MAAM,CAAC,IAAI,CAAC,+BAA+B,CAAC,CAAC;QAC7C,cAAc,GAAG,eAAe,CAAC;IACnC,CAAC;SAAM,CAAC;QACN,MAAM,CAAC,IAAI,CAAC,yBAAyB,CAAC,CAAC;QACvC,OAAO,EAAE,CAAC;IACZ,CAAC;IAED,qCAAqC;IACrC,IAAI,MAAoB,CAAC;IACzB,IAAI,CAAC;QACH,MAAM,GAAG,IAAI,CAAC,KAAK,CAAC,cAAc,CAAiB,CAAC;IACtD,CAAC;IAAC,MAAM,CAAC;QACP,oEAAoE;QACpE,MAAM,CAAC,KAAK,CAAC,uCAAuC,CAAC,CAAC;QACtD,MAAM,IAAI,yBAAkB,CAAC,6BAA6B,CAAC,CAAC;IAC9D,CAAC;IAED,MAAM,GAAG,GAAiB,EAAE,CAAC;IAC7B,KAAK,MAAM,CAAC,IAAI,MAAM,EAAE,CAAC;QACvB,mEAAmE;QACnE,IAAI,CAAC,CAAC,QAAQ,KAAK,SAAS,EAAE,CAAC;YAC7B,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,QAAQ,CAAC,CAAC;QAC7B,CAAC;QACD,IAAI,CAAC,CAAC,KAAK,KAAK,SAAS,EAAE,CAAC;YAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC;QAC1B,CAAC;QAED,IAAI,CAAC,CAAC,GAAG,KAAK,SAAS,IAAI,CAAC,CAAC,IAAI,KAAK,SAAS,EAAE,CAAC;YAChD,yFAAyF;YACzF,MAAM,IAAI,yBAAkB,CAC1B,gDAAgD,CACjD,CAAC;QACJ,CAAC;QAED,kFAAkF;QAClF,iEAAiE;QACjE,IAAI,uBAAuB,IAAI,CAAC,CAAC,IAAI,KAAK,uBAAuB,EAAE,CAAC;YAClE,SAAS;QACX,CAAC;QAED,MAAM,WAAW,GAAG,CAAC,GAAuB,EAAW,EAAE;YACvD,OAAO,GAAG,CAAC,CAAC,CAAC,gBAAgB,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC;QACjD,CAAC,CAAC;QAEF,IACE,CAAC,WAAW,CAAC,CAAC,CAAC,IAAI,CAAC;YACpB,CAAC,WAAW,CAAC,CAAC,CAAC,IAAI,CAAC;YACpB,CAAC,WAAW,CAAC,CAAC,CAAC,GAAG,CAAC;YACnB,CAAC,WAAW,CAAC,CAAC,CAAC,QAAQ,CAAC;YACxB,CAAC,WAAW,CAAC,CAAC,CAAC,QAAQ,CAAC;YACxB,CAAC,WAAW,CAAC,CAAC,CAAC,KAAK,CAAC,EACrB,CAAC;YACD,MAAM,IAAI,yBAAkB,CAC1B,qEAAqE,CACtE,CAAC;QACJ,CAAC;QAED,GAAG,CAAC,IAAI,CAAC;YACP,IAAI,EAAE,CAAC,CAAC,IAAI;YACZ,IAAI,EAAE,CAAC,CAAC,IAAI;YACZ,GAAG,EAAE,CAAC,CAAC,GAAG;YACV,QAAQ,EAAE,CAAC,CAAC,QAAQ;YACpB,QAAQ,EAAE,CAAC,CAAC,QAAQ;YACpB,KAAK,EAAE,CAAC,CAAC,KAAK;SACf,CAAC,CAAC;IACL,CAAC;IACD,OAAO,GAAG,CAAC;AACb,CAAC"}
{"version":3,"file":"start-proxy.js","sourceRoot":"","sources":["../src/start-proxy.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAyCA,wCA8FC;AAvID,oDAAsC;AAEtC,2CAAsD;AAEtD,iCAA4C;AAW5C,MAAM,yBAAyB,GAA6B;IAC1D,IAAI,EAAE,kBAAkB;IACxB,MAAM,EAAE,YAAY;IACpB,UAAU,EAAE,cAAc;IAC1B,MAAM,EAAE,cAAc;IACtB,IAAI,EAAE,iBAAiB;IACvB,IAAI,EAAE,gBAAgB;IACtB,EAAE,EAAE,gBAAgB;IACpB,oFAAoF;IACpF,OAAO,EAAE,EAAE;IACX,GAAG,EAAE,EAAE;IACP,KAAK,EAAE,EAAE;CACD,CAAC;AAEX;;;;GAIG;AACH,SAAS,SAAS,CAAI,KAA2B;IAC/C,OAAO,KAAK,KAAK,SAAS,IAAI,KAAK,KAAK,IAAI,CAAC;AAC/C,CAAC;AAED,kEAAkE;AAClE,+DAA+D;AAC/D,gDAAgD;AAChD,SAAgB,cAAc,CAC5B,MAAc,EACd,eAAmC,EACnC,qBAAyC,EACzC,cAAkC;IAElC,MAAM,QAAQ,GAAG,cAAc,CAAC,CAAC,CAAC,IAAA,yBAAa,EAAC,cAAc,CAAC,CAAC,CAAC,CAAC,SAAS,CAAC;IAC5E,MAAM,uBAAuB,GAAG,QAAQ;QACtC,CAAC,CAAC,yBAAyB,CAAC,QAAQ,CAAC;QACrC,CAAC,CAAC,SAAS,CAAC;IAEd,IAAI,cAAsB,CAAC;IAC3B,IAAI,qBAAqB,KAAK,SAAS,EAAE,CAAC;QACxC,MAAM,CAAC,IAAI,CAAC,qCAAqC,CAAC,CAAC;QACnD,cAAc,GAAG,MAAM,CAAC,IAAI,CAAC,qBAAqB,EAAE,QAAQ,CAAC,CAAC,QAAQ,EAAE,CAAC;IAC3E,CAAC;SAAM,IAAI,eAAe,KAAK,SAAS,EAAE,CAAC;QACzC,MAAM,CAAC,IAAI,CAAC,+BAA+B,CAAC,CAAC;QAC7C,cAAc,GAAG,eAAe,CAAC;IACnC,CAAC;SAAM,CAAC;QACN,MAAM,CAAC,IAAI,CAAC,yBAAyB,CAAC,CAAC;QACvC,OAAO,EAAE,CAAC;IACZ,CAAC;IAED,qCAAqC;IACrC,IAAI,MAAoB,CAAC;IACzB,IAAI,CAAC;QACH,MAAM,GAAG,IAAI,CAAC,KAAK,CAAC,cAAc,CAAiB,CAAC;IACtD,CAAC;IAAC,MAAM,CAAC;QACP,oEAAoE;QACpE,MAAM,CAAC,KAAK,CAAC,uCAAuC,CAAC,CAAC;QACtD,MAAM,IAAI,yBAAkB,CAAC,6BAA6B,CAAC,CAAC;IAC9D,CAAC;IAED,iDAAiD;IACjD,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,EAAE,CAAC;QAC3B,MAAM,IAAI,yBAAkB,CAC1B,4EAA4E,CAC7E,CAAC;IACJ,CAAC;IAED,MAAM,GAAG,GAAiB,EAAE,CAAC;IAC7B,KAAK,MAAM,CAAC,IAAI,MAAM,EAAE,CAAC;QACvB,IAAI,CAAC,KAAK,IAAI,IAAI,OAAO,CAAC,KAAK,QAAQ,EAAE,CAAC;YACxC,MAAM,IAAI,yBAAkB,CAAC,yCAAyC,CAAC,CAAC;QAC1E,CAAC;QAED,mEAAmE;QACnE,IAAI,SAAS,CAAC,CAAC,CAAC,QAAQ,CAAC,EAAE,CAAC;YAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,QAAQ,CAAC,CAAC;QAC7B,CAAC;QACD,IAAI,SAAS,CAAC,CAAC,CAAC,KAAK,CAAC,EAAE,CAAC;YACvB,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC;QAC1B,CAAC;QAED,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,IAAI,CAAC,EAAE,CAAC;YAC5C,yFAAyF;YACzF,MAAM,IAAI,yBAAkB,CAC1B,gDAAgD,CACjD,CAAC;QACJ,CAAC;QAED,kFAAkF;QAClF,iEAAiE;QACjE,IAAI,uBAAuB,IAAI,CAAC,CAAC,IAAI,KAAK,uBAAuB,EAAE,CAAC;YAClE,SAAS;QACX,CAAC;QAED,MAAM,WAAW,GAAG,CAAC,GAAuB,EAAW,EAAE;YACvD,OAAO,GAAG,CAAC,CAAC,CAAC,gBAAgB,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC;QACjD,CAAC,CAAC;QAEF,IACE,CAAC,WAAW,CAAC,CAAC,CAAC,IAAI,CAAC;YACpB,CAAC,WAAW,CAAC,CAAC,CAAC,IAAI,CAAC;YACpB,CAAC,WAAW,CAAC,CAAC,CAAC,GAAG,CAAC;YACnB,CAAC,WAAW,CAAC,CAAC,CAAC,QAAQ,CAAC;YACxB,CAAC,WAAW,CAAC,CAAC,CAAC,QAAQ,CAAC;YACxB,CAAC,WAAW,CAAC,CAAC,CAAC,KAAK,CAAC,EACrB,CAAC;YACD,MAAM,IAAI,yBAAkB,CAC1B,qEAAqE,CACtE,CAAC;QACJ,CAAC;QAED,GAAG,CAAC,IAAI,CAAC;YACP,IAAI,EAAE,CAAC,CAAC,IAAI;YACZ,IAAI,EAAE,CAAC,CAAC,IAAI;YACZ,GAAG,EAAE,CAAC,CAAC,GAAG;YACV,QAAQ,EAAE,CAAC,CAAC,QAAQ;YACpB,QAAQ,EAAE,CAAC,CAAC,QAAQ;YACpB,KAAK,EAAE,CAAC,CAAC,KAAK;SACf,CAAC,CAAC;IACL,CAAC;IACD,OAAO,GAAG,CAAC;AACb,CAAC"}


@@ -41,6 +41,7 @@ const logging_1 = require("./logging");
const startProxyExports = __importStar(require("./start-proxy"));
const testing_utils_1 = require("./testing-utils");
(0, testing_utils_1.setupTests)(ava_1.default);
const toEncodedJSON = (data) => Buffer.from(JSON.stringify(data)).toString("base64");
(0, ava_1.default)("getCredentials prefers registriesCredentials over registrySecrets", async (t) => {
const registryCredentials = Buffer.from(JSON.stringify([
{ type: "npm_registry", host: "npm.pkg.github.com", token: "abc" },
@@ -52,19 +53,40 @@ const testing_utils_1 = require("./testing-utils");
t.is(credentials.length, 1);
t.is(credentials[0].host, "npm.pkg.github.com");
});
(0, ava_1.default)("getCredentials throws error when credential missing host and url", async (t) => {
const registryCredentials = Buffer.from(JSON.stringify([{ type: "npm_registry", token: "abc" }])).toString("base64");
(0, ava_1.default)("getCredentials throws an error when configurations are not an array", async (t) => {
const registryCredentials = Buffer.from(JSON.stringify({ type: "npm_registry", token: "abc" })).toString("base64");
t.throws(() => startProxyExports.getCredentials((0, logging_1.getRunnerLogger)(true), undefined, registryCredentials, undefined), {
message: "Invalid credentials - must specify host or url",
message: "Expected credentials data to be an array of configurations, but it is not.",
});
});
(0, ava_1.default)("getCredentials throws error when credential is not an object", async (t) => {
const testCredentials = [["foo"], [null]].map(toEncodedJSON);
for (const testCredential of testCredentials) {
t.throws(() => startProxyExports.getCredentials((0, logging_1.getRunnerLogger)(true), undefined, testCredential, undefined), {
message: "Invalid credentials - must be an object",
});
}
});
(0, ava_1.default)("getCredentials throws error when credential missing host and url", async (t) => {
const testCredentials = [
[{ type: "npm_registry", token: "abc" }],
[{ type: "npm_registry", token: "abc", host: null }],
[{ type: "npm_registry", token: "abc", url: null }],
].map(toEncodedJSON);
for (const testCredential of testCredentials) {
t.throws(() => startProxyExports.getCredentials((0, logging_1.getRunnerLogger)(true), undefined, testCredential, undefined), {
message: "Invalid credentials - must specify host or url",
});
}
});
(0, ava_1.default)("getCredentials filters by language when specified", async (t) => {
const mixedCredentials = [
{ type: "npm_registry", host: "npm.pkg.github.com", token: "abc" },
{ type: "maven_repository", host: "maven.pkg.github.com", token: "def" },
{ type: "nuget_feed", host: "nuget.pkg.github.com", token: "ghi" },
{ type: "goproxy_server", host: "goproxy.example.com", token: "jkl" },
];
const credentials = startProxyExports.getCredentials((0, logging_1.getRunnerLogger)(true), undefined, Buffer.from(JSON.stringify(mixedCredentials)).toString("base64"), "java");
const credentials = startProxyExports.getCredentials((0, logging_1.getRunnerLogger)(true), undefined, toEncodedJSON(mixedCredentials), "java");
t.is(credentials.length, 1);
t.is(credentials[0].type, "maven_repository");
});
@@ -73,10 +95,11 @@ const testing_utils_1 = require("./testing-utils");
{ type: "npm_registry", host: "npm.pkg.github.com", token: "abc" },
{ type: "maven_repository", host: "maven.pkg.github.com", token: "def" },
{ type: "nuget_feed", host: "nuget.pkg.github.com", token: "ghi" },
{ type: "goproxy_server", host: "goproxy.example.com", token: "jkl" },
];
const credentialsInput = Buffer.from(JSON.stringify(mixedCredentials)).toString("base64");
const credentialsInput = toEncodedJSON(mixedCredentials);
const credentials = startProxyExports.getCredentials((0, logging_1.getRunnerLogger)(true), undefined, credentialsInput, undefined);
t.is(credentials.length, 3);
t.is(credentials.length, mixedCredentials.length);
});
(0, ava_1.default)("getCredentials throws an error when non-printable characters are used", async (t) => {
const invalidCredentials = [


@@ -1 +1 @@
{"version":3,"file":"start-proxy.test.js","sourceRoot":"","sources":["../src/start-proxy.test.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,8CAAuB;AAEvB,uCAA4C;AAC5C,iEAAmD;AACnD,mDAA6C;AAE7C,IAAA,0BAAU,EAAC,aAAI,CAAC,CAAC;AAEjB,IAAA,aAAI,EAAC,mEAAmE,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACpF,MAAM,mBAAmB,GAAG,MAAM,CAAC,IAAI,CACrC,IAAI,CAAC,SAAS,CAAC;QACb,EAAE,IAAI,EAAE,cAAc,EAAE,IAAI,EAAE,oBAAoB,EAAE,KAAK,EAAE,KAAK,EAAE;KACnE,CAAC,CACH,CAAC,QAAQ,CAAC,QAAQ,CAAC,CAAC;IACrB,MAAM,eAAe,GAAG,IAAI,CAAC,SAAS,CAAC;QACrC,EAAE,IAAI,EAAE,cAAc,EAAE,IAAI,EAAE,oBAAoB,EAAE,KAAK,EAAE,KAAK,EAAE;KACnE,CAAC,CAAC;IAEH,MAAM,WAAW,GAAG,iBAAiB,CAAC,cAAc,CAClD,IAAA,yBAAe,EAAC,IAAI,CAAC,EACrB,eAAe,EACf,mBAAmB,EACnB,SAAS,CACV,CAAC;IACF,CAAC,CAAC,EAAE,CAAC,WAAW,CAAC,MAAM,EAAE,CAAC,CAAC,CAAC;IAC5B,CAAC,CAAC,EAAE,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,IAAI,EAAE,oBAAoB,CAAC,CAAC;AAClD,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,kEAAkE,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACnF,MAAM,mBAAmB,GAAG,MAAM,CAAC,IAAI,CACrC,IAAI,CAAC,SAAS,CAAC,CAAC,EAAE,IAAI,EAAE,cAAc,EAAE,KAAK,EAAE,KAAK,EAAE,CAAC,CAAC,CACzD,CAAC,QAAQ,CAAC,QAAQ,CAAC,CAAC;IAErB,CAAC,CAAC,MAAM,CACN,GAAG,EAAE,CACH,iBAAiB,CAAC,cAAc,CAC9B,IAAA,yBAAe,EAAC,IAAI,CAAC,EACrB,SAAS,EACT,mBAAmB,EACnB,SAAS,CACV,EACH;QACE,OAAO,EAAE,gDAAgD;KAC1D,CACF,CAAC;AACJ,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,mDAAmD,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACpE,MAAM,gBAAgB,GAAG;QACvB,EAAE,IAAI,EAAE,cAAc,EAAE,IAAI,EAAE,oBAAoB,EAAE,KAAK,EAAE,KAAK,EAAE;QAClE,EAAE,IAAI,EAAE,kBAAkB,EAAE,IAAI,EAAE,sBAAsB,EAAE,KAAK,EAAE,KAAK,EAAE;QACxE,EAAE,IAAI,EAAE,YAAY,EAAE,IAAI,EAAE,sBAAsB,EAAE,KAAK,EAAE,KAAK,EAAE;KACnE,CAAC;IAEF,MAAM,WAAW,GAAG,iBAAiB,CAAC,cAAc,CAClD,IAAA,yBAAe,EAAC,IAAI,CAAC,EACrB,SAAS,EACT,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,gBAAgB,CAAC,CAAC,CAAC,QAAQ,CAAC,QAAQ,CAAC,EAChE,MAAM,CACP,CAAC;IACF,CAAC,CAAC,EAAE,CAAC,WAAW,CAAC,MAAM,EAAE,CAAC,CAAC,CAAC;IAC5B,CAAC,CAAC,EAAE,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,IAAI,EAAE,kBAAkB,CAAC,CAAC;AAChD,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,mEAAmE,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACpF,MAAM,gBAAgB,GAAG;QACvB,EAAE,IAAI,EAAE,cAAc,EAAE,IAAI,EAAE,oBAAoB,EAAE,KAAK,EAAE,KAAK,EAAE;QAClE,EAAE,IAAI,EAAE,kBAAkB,EAAE,IAAI,EAAE,sBAAsB,EAAE,KAAK,EAAE,KAAK,EAAE;QACxE,EAAE,IAAI,EAAE,YAAY,EAAE,IAAI,EAAE,sBAAsB,EAAE,KAAK,EAAE,KAAK,EAAE;KACnE,CAAC;IACF,MAAM,gBAAgB,GAAG,MAAM,CAAC,IAAI,CAClC,IAAI,CAAC,SAAS,CAAC,gBAAgB,CAAC,CACjC,CAAC,QAAQ,CAAC,QAAQ,CAAC,CAAC;IAErB,MAAM,WAAW,GAAG,iBAAiB,CAAC,cAAc,CAClD,IAAA,yBAAe,EAAC,IAAI,CAAC,EACrB,SAAS,EACT,gBAAgB,EAChB,SAAS,CACV,CAAC;IACF,CAAC,CAAC,EAAE,CAAC,WAAW,CAAC,MAAM,EAAE,CAAC,CAAC,CAAC;AAC9B,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,uEAAuE,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACxF,MAAM,kBAAkB,GAAG;QACzB,EAAE,IAAI,EAAE,YAAY,EAAE,IAAI,EAAE,uBAAuB,EAAE,KAAK,EAAE,WAAW,EAAE,EAAE,mCAAmC;QAC9G,EAAE,IAAI,EAAE,YAAY,EAAE,IAAI,EAAE,6BAA6B,EAAE,EAAE,kCAAkC;QAC/F;YACE,IAAI,EAAE,YAAY;YAClB,IAAI,EAAE,uBAAuB;YAC7B,QAAQ,EAAE,WAAW;SACtB,EAAE,sCAAsC;QACzC,EAAE,IAAI,EAAE,YAAY,EAAE,IAAI,EAAE,uBAAuB,EAAE,QAAQ,EAAE,SAAS,EAAE,EAAE,sCAAsC;KACnH,CAAC;IAEF,KAAK,MAAM,iBAAiB,IAAI,kBAAkB,EAAE,CAAC;QACnD,MAAM,gBAAgB,GAAG,MAAM,CAAC,IAAI,CAClC,IAAI,CAAC,SAAS,CAAC,CAAC,iBAAiB,CAAC,CAAC,CACpC,CAAC,QAAQ,CAAC,QAAQ,CAAC,CAAC;QAErB,CAAC,CAAC,MAAM,CACN,GAAG,EAAE,CACH,iBAAiB,CAAC,cAAc,CAC9B,IAAA,yBAAe,EAAC,IAAI,CAAC,EACrB,SAAS,EACT,gBAAgB,EAChB,SAAS,CACV,EACH;YACE,OAAO,EACL,qEAAqE;SACxE,CACF,CAAC;IACJ,CAAC;AACH,CAAC,CAAC,CAAC"}
{"version":3,"file":"start-proxy.test.js","sourceRoot":"","sources":["../src/start-proxy.test.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,8CAAuB;AAEvB,uCAA4C;AAC5C,iEAAmD;AACnD,mDAA6C;AAE7C,IAAA,0BAAU,EAAC,aAAI,CAAC,CAAC;AAEjB,MAAM,aAAa,GAAG,CAAC,IAAS,EAAE,EAAE,CAClC,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,QAAQ,CAAC,QAAQ,CAAC,CAAC;AAEvD,IAAA,aAAI,EAAC,mEAAmE,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACpF,MAAM,mBAAmB,GAAG,MAAM,CAAC,IAAI,CACrC,IAAI,CAAC,SAAS,CAAC;QACb,EAAE,IAAI,EAAE,cAAc,EAAE,IAAI,EAAE,oBAAoB,EAAE,KAAK,EAAE,KAAK,EAAE;KACnE,CAAC,CACH,CAAC,QAAQ,CAAC,QAAQ,CAAC,CAAC;IACrB,MAAM,eAAe,GAAG,IAAI,CAAC,SAAS,CAAC;QACrC,EAAE,IAAI,EAAE,cAAc,EAAE,IAAI,EAAE,oBAAoB,EAAE,KAAK,EAAE,KAAK,EAAE;KACnE,CAAC,CAAC;IAEH,MAAM,WAAW,GAAG,iBAAiB,CAAC,cAAc,CAClD,IAAA,yBAAe,EAAC,IAAI,CAAC,EACrB,eAAe,EACf,mBAAmB,EACnB,SAAS,CACV,CAAC;IACF,CAAC,CAAC,EAAE,CAAC,WAAW,CAAC,MAAM,EAAE,CAAC,CAAC,CAAC;IAC5B,CAAC,CAAC,EAAE,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,IAAI,EAAE,oBAAoB,CAAC,CAAC;AAClD,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,qEAAqE,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACtF,MAAM,mBAAmB,GAAG,MAAM,CAAC,IAAI,CACrC,IAAI,CAAC,SAAS,CAAC,EAAE,IAAI,EAAE,cAAc,EAAE,KAAK,EAAE,KAAK,EAAE,CAAC,CACvD,CAAC,QAAQ,CAAC,QAAQ,CAAC,CAAC;IAErB,CAAC,CAAC,MAAM,CACN,GAAG,EAAE,CACH,iBAAiB,CAAC,cAAc,CAC9B,IAAA,yBAAe,EAAC,IAAI,CAAC,EACrB,SAAS,EACT,mBAAmB,EACnB,SAAS,CACV,EACH;QACE,OAAO,EACL,4EAA4E;KAC/E,CACF,CAAC;AACJ,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,8DAA8D,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IAC/E,MAAM,eAAe,GAAG,CAAC,CAAC,KAAK,CAAC,EAAE,CAAC,IAAI,CAAC,CAAC,CAAC,GAAG,CAAC,aAAa,CAAC,CAAC;IAE7D,KAAK,MAAM,cAAc,IAAI,eAAe,EAAE,CAAC;QAC7C,CAAC,CAAC,MAAM,CACN,GAAG,EAAE,CACH,iBAAiB,CAAC,cAAc,CAC9B,IAAA,yBAAe,EAAC,IAAI,CAAC,EACrB,SAAS,EACT,cAAc,EACd,SAAS,CACV,EACH;YACE,OAAO,EAAE,yCAAyC;SACnD,CACF,CAAC;IACJ,CAAC;AACH,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,kEAAkE,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACnF,MAAM,eAAe,GAAG;QACtB,CAAC,EAAE,IAAI,EAAE,cAAc,EAAE,KAAK,EAAE,KAAK,EAAE,CAAC;QACxC,CAAC,EAAE,IAAI,EAAE,cAAc,EAAE,KAAK,EAAE,KAAK,EAAE,IAAI,EAAE,IAAI,EAAE,CAAC;QACpD,CAAC,EAAE,IAAI,EAAE,cAAc,EAAE,KAAK,EAAE,KAAK,EAAE,GAAG,EAAE,IAAI,EAAE,CAAC;KACpD,CAAC,GAAG,CAAC,aAAa,CAAC,CAAC;IAErB,KAAK,MAAM,cAAc,IAAI,eAAe,EAAE,CAAC;QAC7C,CAAC,CAAC,MAAM,CACN,GAAG,EAAE,CACH,iBAAiB,CAAC,cAAc,CAC9B,IAAA,yBAAe,EAAC,IAAI,CAAC,EACrB,SAAS,EACT,cAAc,EACd,SAAS,CACV,EACH;YACE,OAAO,EAAE,gDAAgD;SAC1D,CACF,CAAC;IACJ,CAAC;AACH,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,mDAAmD,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACpE,MAAM,gBAAgB,GAAG;QACvB,EAAE,IAAI,EAAE,cAAc,EAAE,IAAI,EAAE,oBAAoB,EAAE,KAAK,EAAE,KAAK,EAAE;QAClE,EAAE,IAAI,EAAE,kBAAkB,EAAE,IAAI,EAAE,sBAAsB,EAAE,KAAK,EAAE,KAAK,EAAE;QACxE,EAAE,IAAI,EAAE,YAAY,EAAE,IAAI,EAAE,sBAAsB,EAAE,KAAK,EAAE,KAAK,EAAE;QAClE,EAAE,IAAI,EAAE,gBAAgB,EAAE,IAAI,EAAE,qBAAqB,EAAE,KAAK,EAAE,KAAK,EAAE;KACtE,CAAC;IAEF,MAAM,WAAW,GAAG,iBAAiB,CAAC,cAAc,CAClD,IAAA,yBAAe,EAAC,IAAI,CAAC,EACrB,SAAS,EACT,aAAa,CAAC,gBAAgB,CAAC,EAC/B,MAAM,CACP,CAAC;IACF,CAAC,CAAC,EAAE,CAAC,WAAW,CAAC,MAAM,EAAE,CAAC,CAAC,CAAC;IAC5B,CAAC,CAAC,EAAE,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,IAAI,EAAE,kBAAkB,CAAC,CAAC;AAChD,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,mEAAmE,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACpF,MAAM,gBAAgB,GAAG;QACvB,EAAE,IAAI,EAAE,cAAc,EAAE,IAAI,EAAE,oBAAoB,EAAE,KAAK,EAAE,KAAK,EAAE;QAClE,EAAE,IAAI,EAAE,kBAAkB,EAAE,IAAI,EAAE,sBAAsB,EAAE,KAAK,EAAE,KAAK,EAAE;QACxE,EAAE,IAAI,EAAE,YAAY,EAAE,IAAI,EAAE,sBAAsB,EAAE,KAAK,EAAE,KAAK,EAAE;QAClE,EAAE,IAAI,EAAE,gBAAgB,EAAE,IAAI,EAAE,qBAAqB,EAAE,KAAK,EAAE,KAAK,EAAE;KACtE,CAAC;IACF,MAAM,gBAAgB,GAAG,aAAa,CAAC,gBAAgB,CAAC,CAAC;IAEzD,MAAM,WAAW,GAAG,iBAAiB,CAAC,cAAc,CAClD,IAAA,yBAAe
,EAAC,IAAI,CAAC,EACrB,SAAS,EACT,gBAAgB,EAChB,SAAS,CACV,CAAC;IACF,CAAC,CAAC,EAAE,CAAC,WAAW,CAAC,MAAM,EAAE,gBAAgB,CAAC,MAAM,CAAC,CAAC;AACpD,CAAC,CAAC,CAAC;AAEH,IAAA,aAAI,EAAC,uEAAuE,EAAE,KAAK,EAAE,CAAC,EAAE,EAAE;IACxF,MAAM,kBAAkB,GAAG;QACzB,EAAE,IAAI,EAAE,YAAY,EAAE,IAAI,EAAE,uBAAuB,EAAE,KAAK,EAAE,WAAW,EAAE,EAAE,mCAAmC;QAC9G,EAAE,IAAI,EAAE,YAAY,EAAE,IAAI,EAAE,6BAA6B,EAAE,EAAE,kCAAkC;QAC/F;YACE,IAAI,EAAE,YAAY;YAClB,IAAI,EAAE,uBAAuB;YAC7B,QAAQ,EAAE,WAAW;SACtB,EAAE,sCAAsC;QACzC,EAAE,IAAI,EAAE,YAAY,EAAE,IAAI,EAAE,uBAAuB,EAAE,QAAQ,EAAE,SAAS,EAAE,EAAE,sCAAsC;KACnH,CAAC;IAEF,KAAK,MAAM,iBAAiB,IAAI,kBAAkB,EAAE,CAAC;QACnD,MAAM,gBAAgB,GAAG,MAAM,CAAC,IAAI,CAClC,IAAI,CAAC,SAAS,CAAC,CAAC,iBAAiB,CAAC,CAAC,CACpC,CAAC,QAAQ,CAAC,QAAQ,CAAC,CAAC;QAErB,CAAC,CAAC,MAAM,CACN,GAAG,EAAE,CACH,iBAAiB,CAAC,cAAc,CAC9B,IAAA,yBAAe,EAAC,IAAI,CAAC,EACrB,SAAS,EACT,gBAAgB,EAChB,SAAS,CACV,EACH;YACE,OAAO,EACL,qEAAqE;SACxE,CACF,CAAC;IACJ,CAAC;AACH,CAAC,CAAC,CAAC"}

43
lib/testing-utils.js generated

@@ -41,9 +41,11 @@ exports.setupTests = setupTests;
exports.setupActionsVars = setupActionsVars;
exports.getRecordingLogger = getRecordingLogger;
exports.mockFeatureFlagApiEndpoint = mockFeatureFlagApiEndpoint;
exports.stubFeatureFlagApiEndpoint = stubFeatureFlagApiEndpoint;
exports.mockLanguagesInRepo = mockLanguagesInRepo;
exports.mockCodeQLVersion = mockCodeQLVersion;
exports.createFeatures = createFeatures;
exports.initializeFeatures = initializeFeatures;
exports.mockBundleDownloadApi = mockBundleDownloadApi;
exports.createTestConfig = createTestConfig;
const node_util_1 = require("node:util");
@@ -54,6 +56,7 @@ const sinon = __importStar(require("sinon"));
const apiClient = __importStar(require("./api-client"));
const codeql = __importStar(require("./codeql"));
const defaults = __importStar(require("./defaults.json"));
const feature_flags_1 = require("./feature-flags");
const util_1 = require("./util");
exports.SAMPLE_DOTCOM_API_DETAILS = {
auth: "token",
@@ -172,21 +175,32 @@ function getRecordingLogger(messages) {
}
/** Mock the HTTP request to the feature flags enablement API endpoint. */
function mockFeatureFlagApiEndpoint(responseStatusCode, response) {
stubFeatureFlagApiEndpoint(() => ({
status: responseStatusCode,
messageIfError: "some error message",
data: response,
}));
}
/** Stub the HTTP request to the feature flags enablement API endpoint. */
function stubFeatureFlagApiEndpoint(responseFunction) {
// Passing an auth token is required, so we just use a dummy value
const client = github.getOctokit("123");
const requestSpy = sinon.stub(client, "request");
const optInSpy = requestSpy.withArgs("GET /repos/:owner/:repo/code-scanning/codeql-action/features");
if (responseStatusCode < 300) {
optInSpy.resolves({
status: responseStatusCode,
data: response,
headers: {},
url: "GET /repos/:owner/:repo/code-scanning/codeql-action/features",
});
}
else {
optInSpy.throws(new util_1.HTTPError("some error message", responseStatusCode));
}
optInSpy.callsFake((_route, params) => {
const response = responseFunction(params);
if (response.status < 300) {
return Promise.resolve({
status: response.status,
data: response.data,
headers: {},
url: "GET /repos/:owner/:repo/code-scanning/codeql-action/features",
});
}
else {
throw new util_1.HTTPError(response.messageIfError || "default stub error message", response.status);
}
});
sinon.stub(apiClient, "getApiClient").value(() => client);
}
function mockLanguagesInRepo(languages) {
@@ -240,6 +254,12 @@ function createFeatures(enabledFeatures) {
},
};
}
function initializeFeatures(initialValue) {
return Object.keys(feature_flags_1.featureConfig).reduce((features, key) => {
features[key] = initialValue;
return features;
}, {});
}
/**
* Mocks the API for downloading the bundle tagged `tagName`.
*
@@ -282,6 +302,7 @@ function createTestConfig(overrides) {
augmentationProperties: {
packsInputCombines: false,
queriesInputCombines: false,
extraQueryExclusions: [],
},
trapCaches: {},
trapCacheDownloadTime: 0,

File diff suppressed because one or more lines are too long
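The `lib/testing-utils.js` change above generalizes the feature-flag API test double: instead of a fixed status code and response body, `stubFeatureFlagApiEndpoint` now accepts a function that computes the response for each request, and `mockFeatureFlagApiEndpoint` becomes a thin wrapper over it. A condensed sketch of that shape, with simplified types:
interface StubbedResponse {
  status: number;
  data?: unknown;
  messageIfError?: string;
}

type ResponseFunction = (params: unknown) => StubbedResponse;

// Generalized stub: the caller decides the response per request.
function makeRequestStub(responseFunction: ResponseFunction) {
  return async (params: unknown) => {
    const response = responseFunction(params);
    if (response.status < 300) {
      return { status: response.status, data: response.data, headers: {} };
    }
    throw new Error(response.messageIfError ?? "default stub error message");
  };
}

// The old fixed-response mock is just a special case of the generalized stub.
const makeFixedRequestStub = (status: number, data: unknown) =>
  makeRequestStub(() => ({ status, data, messageIfError: "some error message" }));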

105
lib/upload-lib.js generated

@@ -36,14 +36,17 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.InvalidSarifUploadError = void 0;
exports.InvalidSarifUploadError = exports.CodeQualityTarget = exports.CodeScanningTarget = exports.SARIF_UPLOAD_ENDPOINT = void 0;
exports.shouldShowCombineSarifFilesDeprecationWarning = shouldShowCombineSarifFilesDeprecationWarning;
exports.throwIfCombineSarifFilesDisabled = throwIfCombineSarifFilesDisabled;
exports.populateRunAutomationDetails = populateRunAutomationDetails;
exports.findSarifFilesInDir = findSarifFilesInDir;
exports.getSarifFilePaths = getSarifFilePaths;
exports.readSarifFile = readSarifFile;
exports.validateSarifFileSchema = validateSarifFileSchema;
exports.buildPayload = buildPayload;
exports.uploadFiles = uploadFiles;
exports.uploadSpecifiedFiles = uploadSpecifiedFiles;
exports.waitForProcessing = waitForProcessing;
exports.shouldConsiderConfigurationError = shouldConsiderConfigurationError;
exports.shouldConsiderInvalidRequest = shouldConsiderInvalidRequest;
@@ -54,15 +57,14 @@ const zlib_1 = __importDefault(require("zlib"));
const core = __importStar(require("@actions/core"));
const file_url_1 = __importDefault(require("file-url"));
const jsonschema = __importStar(require("jsonschema"));
const semver = __importStar(require("semver"));
const actionsUtil = __importStar(require("./actions-util"));
const actions_util_1 = require("./actions-util");
const api = __importStar(require("./api-client"));
const api_client_1 = require("./api-client");
const codeql_1 = require("./codeql");
const config_utils_1 = require("./config-utils");
const diff_informed_analysis_utils_1 = require("./diff-informed-analysis-utils");
const environment_1 = require("./environment");
const feature_flags_1 = require("./feature-flags");
const fingerprints = __importStar(require("./fingerprints"));
const gitUtils = __importStar(require("./git-utils"));
const init_1 = require("./init");
@@ -136,7 +138,7 @@ function areAllRunsUnique(sarifObjects) {
async function shouldShowCombineSarifFilesDeprecationWarning(sarifObjects, githubVersion) {
// Do not show this warning on GHES versions before 3.14.0
if (githubVersion.type === util_1.GitHubVariant.GHES &&
semver.lt(githubVersion.version, "3.14.0")) {
(0, util_1.satisfiesGHESVersion)(githubVersion.version, "<3.14", true)) {
return false;
}
// Only give a deprecation warning when not all runs are unique and
@@ -144,23 +146,50 @@ async function shouldShowCombineSarifFilesDeprecationWarning(sarifObjects, githu
return (!areAllRunsUnique(sarifObjects) &&
!process.env.CODEQL_MERGE_SARIF_DEPRECATION_WARNING);
}
async function throwIfCombineSarifFilesDisabled(sarifObjects, features, githubVersion) {
if (!(await shouldDisableCombineSarifFiles(sarifObjects, features, githubVersion))) {
return;
}
const deprecationMoreInformationMessage = "For more information, see https://github.blog/changelog/2025-07-21-code-scanning-will-stop-combining-multiple-sarif-runs-uploaded-in-the-same-sarif-file/";
throw new util_1.ConfigurationError(`The CodeQL Action does not support uploading multiple SARIF runs with the same category. Please update your workflow to upload a single run per category. ${deprecationMoreInformationMessage}`);
}
// Checks whether combining SARIF files should be disabled.
async function shouldDisableCombineSarifFiles(sarifObjects, features, githubVersion) {
if (githubVersion.type === util_1.GitHubVariant.GHES) {
// Never block on GHES versions before 3.18.
if ((0, util_1.satisfiesGHESVersion)(githubVersion.version, "<3.18", true)) {
return false;
}
}
else {
// Never block when the feature flag is disabled.
if (!(await features.getValue(feature_flags_1.Feature.DisableCombineSarifFiles))) {
return false;
}
}
if (areAllRunsUnique(sarifObjects)) {
// If all runs are unique, we can safely combine them.
return false;
}
// Combining SARIF files is not supported and Code Scanning will return an
// error if multiple runs with the same category are uploaded.
return true;
}
// Takes a list of paths to sarif files and combines them together using the
// CLI `github merge-results` command when all SARIF files are produced by
// CodeQL. Otherwise, it will fall back to combining the files in the action.
// Returns the contents of the combined sarif file.
async function combineSarifFilesUsingCLI(sarifFiles, gitHubVersion, features, logger) {
logger.info("Combining SARIF files using the CodeQL CLI");
if (sarifFiles.length === 1) {
return JSON.parse(fs.readFileSync(sarifFiles[0], "utf8"));
}
const sarifObjects = sarifFiles.map((sarifFile) => {
return JSON.parse(fs.readFileSync(sarifFile, "utf8"));
});
const deprecationWarningMessage = gitHubVersion.type === util_1.GitHubVariant.GHES
? "and will be removed in GitHub Enterprise Server 3.18"
: "and will be removed on June 4, 2025";
: "and will be removed in July 2025";
const deprecationMoreInformationMessage = "For more information, see https://github.blog/changelog/2024-05-06-code-scanning-will-stop-combining-runs-from-a-single-upload";
if (!areAllRunsProducedByCodeQL(sarifObjects)) {
await throwIfCombineSarifFilesDisabled(sarifObjects, features, gitHubVersion);
logger.debug("Not all SARIF files were produced by CodeQL. Merging files in the action.");
if (await shouldShowCombineSarifFilesDeprecationWarning(sarifObjects, gitHubVersion)) {
logger.warning(`Uploading multiple SARIF runs with the same category is deprecated ${deprecationWarningMessage}. Please update your workflow to upload a single run per category. ${deprecationMoreInformationMessage}`);
@@ -181,8 +210,8 @@ async function combineSarifFilesUsingCLI(sarifFiles, gitHubVersion, features, lo
else {
logger.info("Initializing CodeQL since the 'init' Action was not called before this step.");
const apiDetails = {
auth: (0, actions_util_1.getRequiredInput)("token"),
externalRepoAuth: (0, actions_util_1.getOptionalInput)("external-repository-token"),
auth: actionsUtil.getRequiredInput("token"),
externalRepoAuth: actionsUtil.getOptionalInput("external-repository-token"),
url: (0, util_1.getRequiredEnvParam)("GITHUB_SERVER_URL"),
apiURL: (0, util_1.getRequiredEnvParam)("GITHUB_API_URL"),
};
@@ -192,6 +221,7 @@ async function combineSarifFilesUsingCLI(sarifFiles, gitHubVersion, features, lo
codeQL = initCodeQLResult.codeql;
}
if (!(await codeQL.supportsFeature(tools_features_1.ToolsFeature.SarifMergeRunsFromEqualCategory))) {
await throwIfCombineSarifFilesDisabled(sarifObjects, features, gitHubVersion);
logger.warning("The CodeQL CLI does not support merging SARIF files. Merging files in the action.");
if (await shouldShowCombineSarifFilesDeprecationWarning(sarifObjects, gitHubVersion)) {
logger.warning(`Uploading multiple CodeQL runs with the same category is deprecated ${deprecationWarningMessage} for CodeQL CLI 2.16.6 and earlier. Please update your CodeQL CLI version or update your workflow to set a distinct category for each CodeQL run. ${deprecationMoreInformationMessage}`);
@@ -234,9 +264,15 @@ function getAutomationID(category, analysis_key, environment) {
}
return api.computeAutomationID(analysis_key, environment);
}
// Enumerates API endpoints that accept SARIF files.
var SARIF_UPLOAD_ENDPOINT;
(function (SARIF_UPLOAD_ENDPOINT) {
SARIF_UPLOAD_ENDPOINT["CODE_SCANNING"] = "PUT /repos/:owner/:repo/code-scanning/analysis";
SARIF_UPLOAD_ENDPOINT["CODE_QUALITY"] = "PUT /repos/:owner/:repo/code-quality/analysis";
})(SARIF_UPLOAD_ENDPOINT || (exports.SARIF_UPLOAD_ENDPOINT = SARIF_UPLOAD_ENDPOINT = {}));
// Upload the given payload.
// If the request fails then this will retry a small number of times.
async function uploadPayload(payload, repositoryNwo, logger) {
async function uploadPayload(payload, repositoryNwo, logger, target) {
logger.info("Uploading results");
// If in test mode we don't want to upload the results
if (util.isInTestMode()) {
@@ -248,7 +284,7 @@ async function uploadPayload(payload, repositoryNwo, logger) {
}
const client = api.getApiClient();
try {
const response = await client.request("PUT /repos/:owner/:repo/code-scanning/analysis", {
const response = await client.request(target, {
owner: repositoryNwo.owner,
repo: repositoryNwo.repo,
data: payload,
@@ -276,12 +312,12 @@ async function uploadPayload(payload, repositoryNwo, logger) {
}
// Recursively walks a directory and returns all SARIF files it finds.
// Does not follow symlinks.
function findSarifFilesInDir(sarifPath) {
function findSarifFilesInDir(sarifPath, isSarif) {
const sarifFiles = [];
const walkSarifFiles = (dir) => {
const entries = fs.readdirSync(dir, { withFileTypes: true });
for (const entry of entries) {
if (entry.isFile() && entry.name.endsWith(".sarif")) {
if (entry.isFile() && isSarif(entry.name)) {
sarifFiles.push(path.resolve(dir, entry.name));
}
else if (entry.isDirectory()) {
@@ -292,14 +328,14 @@ function findSarifFilesInDir(sarifPath) {
walkSarifFiles(sarifPath);
return sarifFiles;
}
function getSarifFilePaths(sarifPath) {
function getSarifFilePaths(sarifPath, isSarif) {
if (!fs.existsSync(sarifPath)) {
// This is always a configuration error, even for first-party runs.
throw new util_1.ConfigurationError(`Path does not exist: ${sarifPath}`);
}
let sarifFiles;
if (fs.lstatSync(sarifPath).isDirectory()) {
sarifFiles = findSarifFilesInDir(sarifPath);
sarifFiles = findSarifFilesInDir(sarifPath, isSarif);
if (sarifFiles.length === 0) {
// This is always a configuration error, even for first-party runs.
throw new util_1.ConfigurationError(`No SARIF files found to upload in "${sarifPath}".`);
@@ -409,13 +445,33 @@ function buildPayload(commitOid, ref, analysisKey, analysisName, zippedSarif, wo
}
return payloadObj;
}
// Represents the Code Scanning upload target.
exports.CodeScanningTarget = {
name: "code scanning",
target: SARIF_UPLOAD_ENDPOINT.CODE_SCANNING,
sarifPredicate: (name) => name.endsWith(".sarif") && !exports.CodeQualityTarget.sarifPredicate(name),
sentinelPrefix: "CODEQL_UPLOAD_SARIF_",
};
// Represents the Code Quality upload target.
exports.CodeQualityTarget = {
name: "code quality",
target: SARIF_UPLOAD_ENDPOINT.CODE_QUALITY,
sarifPredicate: (name) => name.endsWith(".quality.sarif"),
sentinelPrefix: "CODEQL_UPLOAD_QUALITY_SARIF_",
};
/**
* Uploads a single SARIF file or a directory of SARIF files depending on what `inputSarifPath` refers
* to.
*/
async function uploadFiles(inputSarifPath, checkoutPath, category, features, logger) {
const sarifPaths = getSarifFilePaths(inputSarifPath);
logger.startGroup("Uploading results");
async function uploadFiles(inputSarifPath, checkoutPath, category, features, logger, uploadTarget) {
const sarifPaths = getSarifFilePaths(inputSarifPath, uploadTarget.sarifPredicate);
return uploadSpecifiedFiles(sarifPaths, checkoutPath, category, features, logger, uploadTarget);
}
/**
* Uploads the given array of SARIF files.
*/
async function uploadSpecifiedFiles(sarifPaths, checkoutPath, category, features, logger, uploadTarget = exports.CodeScanningTarget) {
logger.startGroup(`Uploading ${uploadTarget.name} results`);
logger.info(`Processing sarif files: ${JSON.stringify(sarifPaths)}`);
const gitHubVersion = await (0, api_client_1.getGitHubVersion)();
let sarif;
@@ -431,6 +487,8 @@ async function uploadFiles(inputSarifPath, checkoutPath, category, features, log
const sarifPath = sarifPaths[0];
sarif = readSarifFile(sarifPath);
validateSarifFileSchema(sarif, sarifPath, logger);
// Validate that there are no runs for the same category
await throwIfCombineSarifFilesDisabled([sarif], features, gitHubVersion);
}
sarif = filterAlertsByDiffRange(logger, sarif);
sarif = await fingerprints.addFingerprints(sarif, checkoutPath, logger);
@@ -439,7 +497,7 @@ async function uploadFiles(inputSarifPath, checkoutPath, category, features, log
sarif = populateRunAutomationDetails(sarif, category, analysisKey, environment);
const toolNames = util.getToolNames(sarif);
logger.debug(`Validating that each SARIF run has a unique category`);
validateUniqueCategory(sarif);
validateUniqueCategory(sarif, uploadTarget.sentinelPrefix);
logger.debug(`Serializing SARIF for upload`);
const sarifPayload = JSON.stringify(sarif);
logger.debug(`Compressing serialized SARIF`);
@@ -454,7 +512,7 @@ async function uploadFiles(inputSarifPath, checkoutPath, category, features, log
const numResultInSarif = countResultsInSarif(sarifPayload);
logger.debug(`Number of results in upload: ${numResultInSarif}`);
// Make the upload
const sarifID = await uploadPayload(payload, (0, repository_1.getRepositoryNwo)(), logger);
const sarifID = await uploadPayload(payload, (0, repository_1.getRepositoryNwo)(), logger, uploadTarget.target);
logger.endGroup();
return {
statusReport: {
@@ -546,6 +604,7 @@ function shouldConsiderConfigurationError(processingErrors) {
const expectedConfigErrors = [
"CodeQL analyses from advanced configurations cannot be processed when the default setup is enabled",
"rejecting delivery as the repository has too many logical alerts",
"A delivery cannot contain multiple runs with the same category",
];
return (processingErrors.length === 1 &&
expectedConfigErrors.some((msg) => processingErrors[0].includes(msg)));
@@ -588,7 +647,7 @@ function handleProcessingResultForUnsuccessfulExecution(response, status, logger
util.assertNever(status);
}
}
function validateUniqueCategory(sarif) {
function validateUniqueCategory(sarif, sentinelPrefix = exports.CodeScanningTarget.sentinelPrefix) {
// duplicate categories are allowed in the same sarif file
// but not across multiple sarif files
const categories = {};
@@ -599,7 +658,7 @@ function validateUniqueCategory(sarif) {
categories[category] = { id, tool };
}
for (const [category, { id, tool }] of Object.entries(categories)) {
const sentinelEnvVar = `CODEQL_UPLOAD_SARIF_${category}`;
const sentinelEnvVar = `${sentinelPrefix}${category}`;
if (process.env[sentinelEnvVar]) {
// This is always a configuration error, even for first-party runs.
throw new util_1.ConfigurationError("Aborting upload: only one run of the codeql/analyze or codeql/upload-sarif actions is allowed per job per tool/category. " +

File diff suppressed because one or more lines are too long
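A consequence of the two upload targets defined above is that SARIF file discovery is now predicate-driven: `.quality.sarif` files belong to the code quality target, and the code scanning predicate explicitly excludes them so that no file is picked up by both targets. A small sketch of how the two predicates partition a directory listing:
// The same predicates as CodeScanningTarget.sarifPredicate and
// CodeQualityTarget.sarifPredicate above.
const isQualitySarif = (name: string): boolean => name.endsWith(".quality.sarif");
const isCodeScanningSarif = (name: string): boolean =>
  name.endsWith(".sarif") && !isQualitySarif(name);

const files = ["a.sarif", "a.quality.sarif", "dir1/b.quality.sarif", "notes.txt"];

console.log(files.filter(isCodeScanningSarif)); // ["a.sarif"]
console.log(files.filter(isQualitySarif)); // ["a.quality.sarif", "dir1/b.quality.sarif"]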

96
lib/upload-lib.test.js generated

@@ -39,6 +39,7 @@ Object.defineProperty(exports, "__esModule", { value: true });
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const ava_1 = __importDefault(require("ava"));
const feature_flags_1 = require("./feature-flags");
const logging_1 = require("./logging");
const testing_utils_1 = require("./testing-utils");
const uploadLib = __importStar(require("./upload-lib"));
@@ -91,13 +92,21 @@ ava_1.default.beforeEach(() => {
fs.mkdirSync(path.join(tmpDir, "dir3"));
fs.symlinkSync(tmpDir, path.join(tmpDir, "dir3", "symlink1"), "dir");
fs.symlinkSync(path.join(tmpDir, "a.sarif"), path.join(tmpDir, "dir3", "symlink2.sarif"), "file");
const sarifFiles = uploadLib.findSarifFilesInDir(tmpDir);
// add some `.quality.sarif` files that should be ignored, unless we look for them specifically
fs.writeFileSync(path.join(tmpDir, "a.quality.sarif"), "");
fs.writeFileSync(path.join(tmpDir, "dir1", "b.quality.sarif"), "");
const sarifFiles = uploadLib.findSarifFilesInDir(tmpDir, uploadLib.CodeScanningTarget.sarifPredicate);
t.deepEqual(sarifFiles, [
path.join(tmpDir, "a.sarif"),
path.join(tmpDir, "b.sarif"),
path.join(tmpDir, "dir1", "d.sarif"),
path.join(tmpDir, "dir1", "dir2", "e.sarif"),
]);
const qualitySarifFiles = uploadLib.findSarifFilesInDir(tmpDir, uploadLib.CodeQualityTarget.sarifPredicate);
t.deepEqual(qualitySarifFiles, [
path.join(tmpDir, "a.quality.sarif"),
path.join(tmpDir, "dir1", "b.quality.sarif"),
]);
});
});
(0, ava_1.default)("populateRunAutomationDetails", (t) => {
@@ -194,6 +203,10 @@ ava_1.default.beforeEach(() => {
t.throws(() => uploadLib.validateUniqueCategory(sarif1));
t.throws(() => uploadLib.validateUniqueCategory(sarif2));
});
(0, ava_1.default)("validateUniqueCategory with different prefixes", (t) => {
t.notThrows(() => uploadLib.validateUniqueCategory(createMockSarif()));
t.notThrows(() => uploadLib.validateUniqueCategory(createMockSarif(), uploadLib.CodeQualityTarget.sentinelPrefix));
});
(0, ava_1.default)("accept results with invalid artifactLocation.uri value", (t) => {
const loggedMessages = [];
const mockLogger = {
@@ -223,6 +236,12 @@ ava_1.default.beforeEach(() => {
version: "3.14.0",
}));
});
(0, ava_1.default)("shouldShowCombineSarifFilesDeprecationWarning when on GHES 3.16 pre", async (t) => {
t.true(await uploadLib.shouldShowCombineSarifFilesDeprecationWarning([createMockSarif("abc", "def"), createMockSarif("abc", "def")], {
type: util_1.GitHubVariant.GHES,
version: "3.16.0.pre1",
}));
});
(0, ava_1.default)("shouldShowCombineSarifFilesDeprecationWarning with only 1 run", async (t) => {
t.false(await uploadLib.shouldShowCombineSarifFilesDeprecationWarning([createMockSarif("abc", "def")], {
type: util_1.GitHubVariant.DOTCOM,
@@ -244,6 +263,81 @@ ava_1.default.beforeEach(() => {
type: util_1.GitHubVariant.DOTCOM,
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on dotcom with feature flag", async (t) => {
await t.throwsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.DisableCombineSarifFiles]), {
type: util_1.GitHubVariant.DOTCOM,
}), {
message: /The CodeQL Action does not support uploading multiple SARIF runs with the same category/,
});
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on dotcom without feature flag", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.DOTCOM,
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.13", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.13.2",
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.14", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.14.0",
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.17", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.17.0",
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.18 pre", async (t) => {
await t.throwsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.18.0.pre1",
}), {
message: /The CodeQL Action does not support uploading multiple SARIF runs with the same category/,
});
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.18 alpha", async (t) => {
await t.throwsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.18.0-alpha.1",
}), {
message: /The CodeQL Action does not support uploading multiple SARIF runs with the same category/,
});
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.18", async (t) => {
await t.throwsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.18.0",
}), {
message: /The CodeQL Action does not support uploading multiple SARIF runs with the same category/,
});
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled with an invalid GHES version", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "foobar",
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled with only 1 run", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.DisableCombineSarifFiles]), {
type: util_1.GitHubVariant.DOTCOM,
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled with distinct categories", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("def", "def")], (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.DisableCombineSarifFiles]), {
type: util_1.GitHubVariant.DOTCOM,
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled with distinct tools", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "abc"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.DisableCombineSarifFiles]), {
type: util_1.GitHubVariant.DOTCOM,
}));
});
(0, ava_1.default)("shouldConsiderConfigurationError correctly detects configuration errors", (t) => {
const error1 = [
"CodeQL analyses from advanced configurations cannot be processed when the default setup is enabled",

File diff suppressed because one or more lines are too long


@@ -33,6 +33,7 @@ var __importStar = (this && this.__importStar) || (function () {
};
})();
Object.defineProperty(exports, "__esModule", { value: true });
const fs = __importStar(require("fs"));
const core = __importStar(require("@actions/core"));
const actionsUtil = __importStar(require("./actions-util"));
const actions_util_1 = require("./actions-util");
@@ -68,14 +69,28 @@ async function run() {
await (0, status_report_1.sendStatusReport)(startingStatusReportBase);
}
try {
const uploadResult = await upload_lib.uploadFiles(actionsUtil.getRequiredInput("sarif_file"), actionsUtil.getRequiredInput("checkout_path"), actionsUtil.getOptionalInput("category"), features, logger);
const sarifPath = actionsUtil.getRequiredInput("sarif_file");
const checkoutPath = actionsUtil.getRequiredInput("checkout_path");
const category = actionsUtil.getOptionalInput("category");
const uploadResult = await upload_lib.uploadFiles(sarifPath, checkoutPath, category, features, logger, upload_lib.CodeScanningTarget);
core.setOutput("sarif-id", uploadResult.sarifID);
// If there are `.quality.sarif` files in `sarifPath`, then upload those to the code quality service.
// Code quality can currently only be enabled on top of security, so we'd currently always expect to
// have a directory for the results here.
if (fs.lstatSync(sarifPath).isDirectory()) {
const qualitySarifFiles = upload_lib.findSarifFilesInDir(sarifPath, upload_lib.CodeQualityTarget.sarifPredicate);
if (qualitySarifFiles.length !== 0) {
await upload_lib.uploadSpecifiedFiles(qualitySarifFiles, checkoutPath, category, features, logger, upload_lib.CodeQualityTarget);
}
}
// We don't upload results in test mode, so don't wait for processing
if ((0, util_1.isInTestMode)()) {
core.debug("In test mode. Waiting for processing is disabled.");
}
else if (actionsUtil.getRequiredInput("wait-for-processing") === "true") {
await upload_lib.waitForProcessing((0, repository_1.getRepositoryNwo)(), uploadResult.sarifID, logger);
// The code quality service does not currently have an endpoint to wait for SARIF processing,
// so we can't wait for that here.
}
await sendSuccessStatusReport(startedAt, uploadResult.statusReport, logger);
}


@@ -1 +1 @@
{"version":3,"file":"upload-sarif-action.js","sourceRoot":"","sources":["../src/upload-sarif-action.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,oDAAsC;AAEtC,4DAA8C;AAC9C,iDAAyE;AACzE,6CAAgD;AAChD,mDAA2C;AAC3C,uCAAqD;AACrD,6CAAgD;AAChD,mDAOyB;AACzB,yDAA2C;AAC3C,iCAQgB;AAMhB,KAAK,UAAU,uBAAuB,CACpC,SAAe,EACf,WAA0C,EAC1C,MAAc;IAEd,MAAM,gBAAgB,GAAG,MAAM,IAAA,sCAAsB,EACnD,0BAAU,CAAC,WAAW,EACtB,SAAS,EACT,SAAS,EACT,SAAS,EACT,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,CACP,CAAC;IACF,IAAI,gBAAgB,KAAK,SAAS,EAAE,CAAC;QACnC,MAAM,YAAY,GAA4B;YAC5C,GAAG,gBAAgB;YACnB,GAAG,WAAW;SACf,CAAC;QACF,MAAM,IAAA,gCAAgB,EAAC,YAAY,CAAC,CAAC;IACvC,CAAC;AACH,CAAC;AAED,KAAK,UAAU,GAAG;IAChB,MAAM,SAAS,GAAG,IAAI,IAAI,EAAE,CAAC;IAC7B,MAAM,MAAM,GAAG,IAAA,0BAAgB,GAAE,CAAC;IAClC,IAAA,4BAAqB,EAAC,IAAA,+BAAgB,GAAE,CAAC,CAAC;IAE1C,MAAM,aAAa,GAAG,MAAM,IAAA,6BAAgB,GAAE,CAAC;IAC/C,IAAA,yBAAkB,EAAC,IAAA,+BAAgB,GAAE,EAAE,aAAa,CAAC,CAAC;IAEtD,6CAA6C;IAC7C,WAAW,CAAC,aAAa,EAAE,CAAC;IAE5B,MAAM,aAAa,GAAG,IAAA,6BAAgB,GAAE,CAAC;IACzC,MAAM,QAAQ,GAAG,IAAI,wBAAQ,CAC3B,aAAa,EACb,aAAa,EACb,IAAA,oCAAqB,GAAE,EACvB,MAAM,CACP,CAAC;IAEF,MAAM,wBAAwB,GAAG,MAAM,IAAA,sCAAsB,EAC3D,0BAAU,CAAC,WAAW,EACtB,UAAU,EACV,SAAS,EACT,SAAS,EACT,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,CACP,CAAC;IACF,IAAI,wBAAwB,KAAK,SAAS,EAAE,CAAC;QAC3C,MAAM,IAAA,gCAAgB,EAAC,wBAAwB,CAAC,CAAC;IACnD,CAAC;IAED,IAAI,CAAC;QACH,MAAM,YAAY,GAAG,MAAM,UAAU,CAAC,WAAW,CAC/C,WAAW,CAAC,gBAAgB,CAAC,YAAY,CAAC,EAC1C,WAAW,CAAC,gBAAgB,CAAC,eAAe,CAAC,EAC7C,WAAW,CAAC,gBAAgB,CAAC,UAAU,CAAC,EACxC,QAAQ,EACR,MAAM,CACP,CAAC;QACF,IAAI,CAAC,SAAS,CAAC,UAAU,EAAE,YAAY,CAAC,OAAO,CAAC,CAAC;QAEjD,qEAAqE;QACrE,IAAI,IAAA,mBAAY,GAAE,EAAE,CAAC;YACnB,IAAI,CAAC,KAAK,CAAC,mDAAmD,CAAC,CAAC;QAClE,CAAC;aAAM,IAAI,WAAW,CAAC,gBAAgB,CAAC,qBAAqB,CAAC,KAAK,MAAM,EAAE,CAAC;YAC1E,MAAM,UAAU,CAAC,iBAAiB,CAChC,IAAA,6BAAgB,GAAE,EAClB,YAAY,CAAC,OAAO,EACpB,MAAM,CACP,CAAC;QACJ,CAAC;QACD,MAAM,uBAAuB,CAAC,SAAS,EAAE,YAAY,CAAC,YAAY,EAAE,MAAM,CAAC,CAAC;IAC9E,CAAC;IAAC,OAAO,cAAc,EAAE,CAAC;QACxB,MAAM,KAAK,GACT,IAAA,oCAAoB,EAAC,0BAAU,CAAC,WAAW,CAAC;YAC5C,cAAc,YAAY,UAAU,CAAC,uBAAuB;YAC1D,CAAC,CAAC,IAAI,yBAAkB,CAAC,cAAc,CAAC,OAAO,CAAC;YAChD,CAAC,CAAC,IAAA,gBAAS,EAAC,cAAc,CAAC,CAAC;QAChC,MAAM,OAAO,GAAG,KAAK,CAAC,OAAO,CAAC;QAC9B,IAAI,CAAC,SAAS,CAAC,OAAO,CAAC,CAAC;QAExB,MAAM,qBAAqB,GAAG,MAAM,IAAA,sCAAsB,EACxD,0BAAU,CAAC,WAAW,EACtB,IAAA,gCAAgB,EAAC,KAAK,CAAC,EACvB,SAAS,EACT,SAAS,EACT,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,EACN,OAAO,EACP,KAAK,CAAC,KAAK,CACZ,CAAC;QACF,IAAI,qBAAqB,KAAK,SAAS,EAAE,CAAC;YACxC,MAAM,IAAA,gCAAgB,EAAC,qBAAqB,CAAC,CAAC;QAChD,CAAC;QACD,OAAO;IACT,CAAC;AACH,CAAC;AAED,KAAK,UAAU,UAAU;IACvB,IAAI,CAAC;QACH,MAAM,GAAG,EAAE,CAAC;IACd,CAAC;IAAC,OAAO,KAAK,EAAE,CAAC;QACf,IAAI,CAAC,SAAS,CACZ,sCAAsC,IAAA,sBAAe,EAAC,KAAK,CAAC,EAAE,CAC/D,CAAC;IACJ,CAAC;AACH,CAAC;AAED,KAAK,UAAU,EAAE,CAAC"}
{"version":3,"file":"upload-sarif-action.js","sourceRoot":"","sources":["../src/upload-sarif-action.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,uCAAyB;AAEzB,oDAAsC;AAEtC,4DAA8C;AAC9C,iDAAyE;AACzE,6CAAgD;AAChD,mDAA2C;AAC3C,uCAAqD;AACrD,6CAAgD;AAChD,mDAOyB;AACzB,yDAA2C;AAC3C,iCAQgB;AAMhB,KAAK,UAAU,uBAAuB,CACpC,SAAe,EACf,WAA0C,EAC1C,MAAc;IAEd,MAAM,gBAAgB,GAAG,MAAM,IAAA,sCAAsB,EACnD,0BAAU,CAAC,WAAW,EACtB,SAAS,EACT,SAAS,EACT,SAAS,EACT,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,CACP,CAAC;IACF,IAAI,gBAAgB,KAAK,SAAS,EAAE,CAAC;QACnC,MAAM,YAAY,GAA4B;YAC5C,GAAG,gBAAgB;YACnB,GAAG,WAAW;SACf,CAAC;QACF,MAAM,IAAA,gCAAgB,EAAC,YAAY,CAAC,CAAC;IACvC,CAAC;AACH,CAAC;AAED,KAAK,UAAU,GAAG;IAChB,MAAM,SAAS,GAAG,IAAI,IAAI,EAAE,CAAC;IAC7B,MAAM,MAAM,GAAG,IAAA,0BAAgB,GAAE,CAAC;IAClC,IAAA,4BAAqB,EAAC,IAAA,+BAAgB,GAAE,CAAC,CAAC;IAE1C,MAAM,aAAa,GAAG,MAAM,IAAA,6BAAgB,GAAE,CAAC;IAC/C,IAAA,yBAAkB,EAAC,IAAA,+BAAgB,GAAE,EAAE,aAAa,CAAC,CAAC;IAEtD,6CAA6C;IAC7C,WAAW,CAAC,aAAa,EAAE,CAAC;IAE5B,MAAM,aAAa,GAAG,IAAA,6BAAgB,GAAE,CAAC;IACzC,MAAM,QAAQ,GAAG,IAAI,wBAAQ,CAC3B,aAAa,EACb,aAAa,EACb,IAAA,oCAAqB,GAAE,EACvB,MAAM,CACP,CAAC;IAEF,MAAM,wBAAwB,GAAG,MAAM,IAAA,sCAAsB,EAC3D,0BAAU,CAAC,WAAW,EACtB,UAAU,EACV,SAAS,EACT,SAAS,EACT,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,CACP,CAAC;IACF,IAAI,wBAAwB,KAAK,SAAS,EAAE,CAAC;QAC3C,MAAM,IAAA,gCAAgB,EAAC,wBAAwB,CAAC,CAAC;IACnD,CAAC;IAED,IAAI,CAAC;QACH,MAAM,SAAS,GAAG,WAAW,CAAC,gBAAgB,CAAC,YAAY,CAAC,CAAC;QAC7D,MAAM,YAAY,GAAG,WAAW,CAAC,gBAAgB,CAAC,eAAe,CAAC,CAAC;QACnE,MAAM,QAAQ,GAAG,WAAW,CAAC,gBAAgB,CAAC,UAAU,CAAC,CAAC;QAE1D,MAAM,YAAY,GAAG,MAAM,UAAU,CAAC,WAAW,CAC/C,SAAS,EACT,YAAY,EACZ,QAAQ,EACR,QAAQ,EACR,MAAM,EACN,UAAU,CAAC,kBAAkB,CAC9B,CAAC;QACF,IAAI,CAAC,SAAS,CAAC,UAAU,EAAE,YAAY,CAAC,OAAO,CAAC,CAAC;QAEjD,qGAAqG;QACrG,oGAAoG;QACpG,yCAAyC;QACzC,IAAI,EAAE,CAAC,SAAS,CAAC,SAAS,CAAC,CAAC,WAAW,EAAE,EAAE,CAAC;YAC1C,MAAM,iBAAiB,GAAG,UAAU,CAAC,mBAAmB,CACtD,SAAS,EACT,UAAU,CAAC,iBAAiB,CAAC,cAAc,CAC5C,CAAC;YAEF,IAAI,iBAAiB,CAAC,MAAM,KAAK,CAAC,EAAE,CAAC;gBACnC,MAAM,UAAU,CAAC,oBAAoB,CACnC,iBAAiB,EACjB,YAAY,EACZ,QAAQ,EACR,QAAQ,EACR,MAAM,EACN,UAAU,CAAC,iBAAiB,CAC7B,CAAC;YACJ,CAAC;QACH,CAAC;QAED,qEAAqE;QACrE,IAAI,IAAA,mBAAY,GAAE,EAAE,CAAC;YACnB,IAAI,CAAC,KAAK,CAAC,mDAAmD,CAAC,CAAC;QAClE,CAAC;aAAM,IAAI,WAAW,CAAC,gBAAgB,CAAC,qBAAqB,CAAC,KAAK,MAAM,EAAE,CAAC;YAC1E,MAAM,UAAU,CAAC,iBAAiB,CAChC,IAAA,6BAAgB,GAAE,EAClB,YAAY,CAAC,OAAO,EACpB,MAAM,CACP,CAAC;YACF,6FAA6F;YAC7F,kCAAkC;QACpC,CAAC;QACD,MAAM,uBAAuB,CAAC,SAAS,EAAE,YAAY,CAAC,YAAY,EAAE,MAAM,CAAC,CAAC;IAC9E,CAAC;IAAC,OAAO,cAAc,EAAE,CAAC;QACxB,MAAM,KAAK,GACT,IAAA,oCAAoB,EAAC,0BAAU,CAAC,WAAW,CAAC;YAC5C,cAAc,YAAY,UAAU,CAAC,uBAAuB;YAC1D,CAAC,CAAC,IAAI,yBAAkB,CAAC,cAAc,CAAC,OAAO,CAAC;YAChD,CAAC,CAAC,IAAA,gBAAS,EAAC,cAAc,CAAC,CAAC;QAChC,MAAM,OAAO,GAAG,KAAK,CAAC,OAAO,CAAC;QAC9B,IAAI,CAAC,SAAS,CAAC,OAAO,CAAC,CAAC;QAExB,MAAM,qBAAqB,GAAG,MAAM,IAAA,sCAAsB,EACxD,0BAAU,CAAC,WAAW,EACtB,IAAA,gCAAgB,EAAC,KAAK,CAAC,EACvB,SAAS,EACT,SAAS,EACT,MAAM,IAAA,qBAAc,EAAC,MAAM,CAAC,EAC5B,MAAM,EACN,OAAO,EACP,KAAK,CAAC,KAAK,CACZ,CAAC;QACF,IAAI,qBAAqB,KAAK,SAAS,EAAE,CAAC;YACxC,MAAM,IAAA,gCAAgB,EAAC,qBAAqB,CAAC,CAAC;QAChD,CAAC;QACD,OAAO;IACT,CAAC;AACH,CAAC;AAED,KAAK,UAAU,UAAU;IACvB,IAAI,CAAC;QACH,MAAM,GAAG,EAAE,CAAC;IACd,CAAC;IAAC,OAAO,KAAK,EAAE,CAAC;QACf,IAAI,CAAC,SAAS,CACZ,sCAAsC,IAAA,sBAAe,EAAC,KAAK,CAAC,EAAE,CAC/D,CAAC;IACJ,CAAC;AACH,CAAC;AAED,KAAK,UAAU,EAAE,CAAC"}

19
lib/util.js generated

@@ -77,6 +77,7 @@ exports.getErrorMessage = getErrorMessage;
exports.prettyPrintPack = prettyPrintPack;
exports.checkDiskUsage = checkDiskUsage;
exports.checkActionVersion = checkActionVersion;
exports.satisfiesGHESVersion = satisfiesGHESVersion;
exports.cloneObject = cloneObject;
exports.checkSipEnablement = checkSipEnablement;
exports.cleanUpGlob = cleanUpGlob;
@@ -887,6 +888,24 @@ function checkActionVersion(version, githubVersion) {
}
}
}
/**
* This will check whether the given GitHub version satisfies the given range,
* taking into account that a range like >=3.18 will also match the GHES 3.18
* pre-release/RC versions.
*
* When the given `githubVersion` is not a GHES version, or if the version
* is invalid, this will return `defaultIfInvalid`.
*/
function satisfiesGHESVersion(ghesVersion, range, defaultIfInvalid) {
const semverVersion = semver.coerce(ghesVersion);
if (semverVersion === null) {
return defaultIfInvalid;
}
// We always drop the pre-release part of the version, since anything that
// applies to GHES 3.18.0 should also apply to GHES 3.18.0.pre1.
semverVersion.prerelease = [];
return semver.satisfies(semverVersion, range);
}
/**
* Supported build modes.
*
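As a rough illustration of the `satisfiesGHESVersion` helper added in the hunk above, here is a standalone TypeScript sketch of the same logic plus a few example calls. The inputs and expected results are assumptions based on standard `semver` `coerce`/`satisfies` behaviour, not output taken from this repository.

```ts
import * as semver from "semver";

// Mirrors the compiled helper above (illustrative sketch, not the shipped code).
function satisfiesGHESVersion(
  ghesVersion: string,
  range: string,
  defaultIfInvalid: boolean,
): boolean {
  const semverVersion = semver.coerce(ghesVersion);
  if (semverVersion === null) {
    return defaultIfInvalid;
  }
  // Drop the pre-release part so that e.g. GHES 3.18.0.pre1 behaves like 3.18.0.
  semverVersion.prerelease = [];
  return semver.satisfies(semverVersion, range);
}

// Assumed example inputs and results:
console.log(satisfiesGHESVersion("3.18.0.pre1", ">=3.18", false)); // true
console.log(satisfiesGHESVersion("3.17.4", ">=3.18", false)); // false
console.log(satisfiesGHESVersion("not-a-version", ">=3.18", true)); // true (falls back to default)
```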

File diff suppressed because one or more lines are too long

3
lib/util.test.js generated

@@ -240,9 +240,10 @@ const shortTime = 10;
(0, ava_1.default)("withTimeout on long task", async (t) => {
let longTaskTimedOut = false;
const longTask = new Promise((resolve) => {
setTimeout(() => {
const timer = setTimeout(() => {
resolve(42);
}, longTime);
t.teardown(() => clearTimeout(timer));
});
const result = await util.withTimeout(shortTime, longTask, () => {
longTaskTimedOut = true;
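The fix in this hunk keeps a handle to the timer and clears it in AVA's `t.teardown`, so the pending `longTime` timer cannot keep the test process alive after the test finishes. Below is a minimal self-contained sketch of the same pattern; the test name and durations are made up for illustration.

```ts
import test from "ava";

test("a short timeout beats a slow task", async (t) => {
  const slowTask = new Promise<number>((resolve) => {
    const timer = setTimeout(() => resolve(42), 60_000);
    // Clear the long timer when the test ends so it cannot hold the event loop open.
    t.teardown(() => clearTimeout(timer));
  });

  const winner = await Promise.race([
    slowTask,
    new Promise<string>((resolve) => setTimeout(() => resolve("timed out"), 10)),
  ]);

  t.is(winner, "timed out");
});
```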

File diff suppressed because one or more lines are too long

1
node_modules/.bin/nft generated vendored Symbolic link

@@ -0,0 +1 @@
../@vercel/nft/out/cli.js

1
node_modules/.bin/node-gyp-build generated vendored Symbolic link

@@ -0,0 +1 @@
../node-gyp-build/bin.js

1
node_modules/.bin/node-gyp-build-optional generated vendored Symbolic link

@@ -0,0 +1 @@
../node-gyp-build/optional.js

1
node_modules/.bin/node-gyp-build-test generated vendored Symbolic link

@@ -0,0 +1 @@
../node-gyp-build/build-test.js

1
node_modules/.bin/node-pre-gyp generated vendored Symbolic link

@@ -0,0 +1 @@
../@mapbox/node-pre-gyp/bin/node-pre-gyp

1
node_modules/.bin/nopt generated vendored Symbolic link

@@ -0,0 +1 @@
../nopt/bin/nopt.js

1570
node_modules/.package-lock.json generated vendored

File diff suppressed because it is too large

253
node_modules/@ava/typescript/index.js generated vendored

@@ -4,8 +4,8 @@ import {pathToFileURL} from 'node:url';
import escapeStringRegexp from 'escape-string-regexp';
import {execa} from 'execa';
const pkg = JSON.parse(fs.readFileSync(new URL('package.json', import.meta.url)));
const help = `See https://github.com/avajs/typescript/blob/v${pkg.version}/README.md`;
const package_ = JSON.parse(fs.readFileSync(new URL('package.json', import.meta.url)));
const help = `See https://github.com/avajs/typescript/blob/v${package_.version}/README.md`;
function isPlainObject(x) {
return x !== null && typeof x === 'object' && Reflect.getPrototypeOf(x) === Object.prototype;
@@ -36,8 +36,8 @@ function validate(target, properties) {
}
}
async function compileTypeScript(projectDir) {
return execa('tsc', ['--incremental'], {preferLocal: true, cwd: projectDir});
async function compileTypeScript(projectDirectory) {
return execa({preferLocal: true, cwd: projectDirectory})`tsc --incremental`;
}
const configProperties = {
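For readers unfamiliar with the new call style in `compileTypeScript`, the two invocations below are intended to be equivalent ways of running `tsc --incremental`. This is a sketch based on execa's array and template-literal call forms, with a placeholder project path.

```ts
import { execa } from "execa";

const projectDirectory = "/path/to/project"; // placeholder

// Array form (used before this update):
await execa("tsc", ["--incremental"], { preferLocal: true, cwd: projectDirectory });

// Template-literal form (execa 9, as used above): bind the options first,
// then tag a template containing the command line.
await execa({ preferLocal: true, cwd: projectDirectory })`tsc --incremental`;
```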
@@ -62,7 +62,7 @@ const configProperties = {
isValid(extensions) {
return Array.isArray(extensions)
&& extensions.length > 0
&& extensions.every(ext => typeof ext === 'string' && ext !== '')
&& extensions.every(extension => typeof extension === 'string' && extension !== '')
&& new Set(extensions).size === extensions.length;
},
},
@@ -75,7 +75,7 @@ const changeInterpretations = Object.freeze(Object.assign(Object.create(null), {
}));
export default function typescriptProvider({negotiateProtocol}) {
const protocol = negotiateProtocol(['ava-6', 'ava-3.2'], {version: pkg.version});
const protocol = negotiateProtocol(['ava-6'], {version: package_.version});
if (protocol === null) {
return;
}
@@ -98,143 +98,112 @@ export default function typescriptProvider({negotiateProtocol}) {
path.join(protocol.projectDir, from),
path.join(protocol.projectDir, to),
]);
const testFileExtension = new RegExp(`\\.(${extensions.map(ext => escapeStringRegexp(ext)).join('|')})$`);
const testFileExtension = new RegExp(`\\.(${extensions.map(extension => escapeStringRegexp(extension)).join('|')})$`);
const watchMode = protocol.identifier === 'ava-3.2'
? {
ignoreChange(filePath) {
if (!testFileExtension.test(filePath)) {
return false;
}
return rewritePaths.some(([from]) => filePath.startsWith(from));
},
resolveTestFile(testfile) { // Used under AVA 3.2 protocol by legacy watcher implementation.
if (!testFileExtension.test(testfile)) {
return testfile;
}
const rewrite = rewritePaths.find(([from]) => testfile.startsWith(from));
if (rewrite === undefined) {
return testfile;
}
const [from, to] = rewrite;
let newExtension = '.js';
if (testfile.endsWith('.cts')) {
newExtension = '.cjs';
} else if (testfile.endsWith('.mts')) {
newExtension = '.mjs';
}
return `${to}${testfile.slice(from.length)}`.replace(testFileExtension, newExtension);
},
}
: {
changeInterpretations,
interpretChange(filePath) {
if (config.compile === false) {
for (const [from] of rewritePaths) {
if (testFileExtension.test(filePath) && filePath.startsWith(from)) {
return changeInterpretations.waitForOutOfBandCompilation;
}
const watchMode = {
changeInterpretations,
interpretChange(filePath) {
if (config.compile === false) {
for (const [from] of rewritePaths) {
if (testFileExtension.test(filePath) && filePath.startsWith(from)) {
return changeInterpretations.waitForOutOfBandCompilation;
}
}
}
if (config.compile === 'tsc') {
for (const [, to] of rewritePaths) {
if (filePath.startsWith(to)) {
return changeInterpretations.ignoreCompiled;
}
if (config.compile === 'tsc') {
for (const [, to] of rewritePaths) {
if (filePath.startsWith(to)) {
return changeInterpretations.ignoreCompiled;
}
}
}
return changeInterpretations.unspecified;
},
resolvePossibleOutOfBandCompilationSources(filePath) {
if (config.compile !== false) {
return null;
}
// Only recognize .cjs, .mjs and .js files.
if (!/\.(c|m)?js$/.test(filePath)) {
return null;
}
for (const [from, to] of rewritePaths) {
if (!filePath.startsWith(to)) {
continue;
}
const rewritten = `${from}${filePath.slice(to.length)}`;
const possibleExtensions = [];
if (filePath.endsWith('.cjs')) {
if (extensions.includes('cjs')) {
possibleExtensions.push({replace: /\.cjs$/, extension: 'cjs'});
}
if (extensions.includes('cts')) {
possibleExtensions.push({replace: /\.cjs$/, extension: 'cts'});
}
if (possibleExtensions.length === 0) {
return null;
}
}
if (filePath.endsWith('.mjs')) {
if (extensions.includes('mjs')) {
possibleExtensions.push({replace: /\.mjs$/, extension: 'mjs'});
}
if (extensions.includes('mts')) {
possibleExtensions.push({replace: /\.mjs$/, extension: 'mts'});
}
if (possibleExtensions.length === 0) {
return null;
}
}
if (filePath.endsWith('.js')) {
if (extensions.includes('js')) {
possibleExtensions.push({replace: /\.js$/, extension: 'js'});
}
if (extensions.includes('ts')) {
possibleExtensions.push({replace: /\.js$/, extension: 'ts'});
}
if (extensions.includes('tsx')) {
possibleExtensions.push({replace: /\.js$/, extension: 'tsx'});
}
if (possibleExtensions.length === 0) {
return null;
}
}
const possibleDeletedFiles = [];
for (const {replace, extension} of possibleExtensions) {
const possibleFilePath = rewritten.replace(replace, `.${extension}`);
// Pick the first file path that exists.
if (fs.existsSync(possibleFilePath)) {
return [possibleFilePath];
}
possibleDeletedFiles.push(possibleFilePath);
}
return possibleDeletedFiles;
}
return changeInterpretations.unspecified;
},
resolvePossibleOutOfBandCompilationSources(filePath) {
if (config.compile !== false) {
return null;
},
};
}
// Only recognize .cjs, .mjs and .js files.
if (!/\.(c|m)?js$/.test(filePath)) {
return null;
}
for (const [from, to] of rewritePaths) {
if (!filePath.startsWith(to)) {
continue;
}
const rewritten = `${from}${filePath.slice(to.length)}`;
const possibleExtensions = [];
if (filePath.endsWith('.cjs')) {
if (extensions.includes('cjs')) {
possibleExtensions.push({replace: /\.cjs$/, extension: 'cjs'});
}
if (extensions.includes('cts')) {
possibleExtensions.push({replace: /\.cjs$/, extension: 'cts'});
}
if (possibleExtensions.length === 0) {
return null;
}
}
if (filePath.endsWith('.mjs')) {
if (extensions.includes('mjs')) {
possibleExtensions.push({replace: /\.mjs$/, extension: 'mjs'});
}
if (extensions.includes('mts')) {
possibleExtensions.push({replace: /\.mjs$/, extension: 'mts'});
}
if (possibleExtensions.length === 0) {
return null;
}
}
if (filePath.endsWith('.js')) {
if (extensions.includes('js')) {
possibleExtensions.push({replace: /\.js$/, extension: 'js'});
}
if (extensions.includes('ts')) {
possibleExtensions.push({replace: /\.js$/, extension: 'ts'});
}
if (extensions.includes('tsx')) {
possibleExtensions.push({replace: /\.js$/, extension: 'tsx'});
}
if (possibleExtensions.length === 0) {
return null;
}
}
const possibleDeletedFiles = [];
for (const {replace, extension} of possibleExtensions) {
const possibleFilePath = rewritten.replace(replace, `.${extension}`);
// Pick the first file path that exists.
if (fs.existsSync(possibleFilePath)) {
return [possibleFilePath];
}
possibleDeletedFiles.push(possibleFilePath);
}
return possibleDeletedFiles;
}
return null;
},
};
return {
...watchMode,
@@ -276,21 +245,21 @@ export default function typescriptProvider({negotiateProtocol}) {
worker({extensionsToLoadAsModules, state: {extensions, rewritePaths}}) {
const importJs = extensionsToLoadAsModules.includes('js');
const testFileExtension = new RegExp(`\\.(${extensions.map(ext => escapeStringRegexp(ext)).join('|')})$`);
const testFileExtension = new RegExp(`\\.(${extensions.map(extension => escapeStringRegexp(extension)).join('|')})$`);
return {
canLoad(ref) {
return testFileExtension.test(ref) && rewritePaths.some(([from]) => ref.startsWith(from));
canLoad(reference) {
return testFileExtension.test(reference) && rewritePaths.some(([from]) => reference.startsWith(from));
},
async load(ref, {requireFn}) {
const [from, to] = rewritePaths.find(([from]) => ref.startsWith(from));
let rewritten = `${to}${ref.slice(from.length)}`;
async load(reference, {requireFn}) {
const [from, to] = rewritePaths.find(([from]) => reference.startsWith(from));
let rewritten = `${to}${reference.slice(from.length)}`;
let useImport = true;
if (ref.endsWith('.cts')) {
if (reference.endsWith('.cts')) {
rewritten = rewritten.replace(/\.cts$/, '.cjs');
useImport = false;
} else if (ref.endsWith('.mts')) {
} else if (reference.endsWith('.mts')) {
rewritten = rewritten.replace(/\.mts$/, '.mjs');
} else {
rewritten = rewritten.replace(testFileExtension, '.js');


@@ -1,9 +1,9 @@
{
"name": "@ava/typescript",
"version": "4.1.0",
"version": "6.0.0",
"description": "TypeScript provider for AVA",
"engines": {
"node": "^14.19 || ^16.15 || ^18 || ^20"
"node": "^20.8 || ^22 || >=24"
},
"files": [
"index.js"
@@ -24,36 +24,17 @@
},
"dependencies": {
"escape-string-regexp": "^5.0.0",
"execa": "^7.1.1"
"execa": "^9.6.0"
},
"devDependencies": {
"ava": "^5.3.1",
"c8": "^8.0.0",
"del": "^7.0.0",
"typescript": "^5.1.3",
"xo": "^0.54.2"
"ava": "^6.4.0",
"c8": "^10.1.3",
"del": "^8.0.0",
"typescript": "^5.8.3",
"xo": "^1.1.0"
},
"c8": {
"reporter": [
"html",
"lcov",
"text"
]
},
"ava": {
"files": [
"!test/broken-fixtures/**"
],
"ignoredByWatcher": [
"test/fixtures/**",
"test/broken-fixtures/**"
],
"timeout": "60s"
},
"xo": {
"ignores": [
"test/broken-fixtures",
"test/fixtures/**/compiled/**"
]
"volta": {
"node": "22.16.0",
"npm": "11.4.2"
}
}


@@ -17,7 +17,7 @@ yarn add @eslint/compat -D
# or
pnpm install @eslint/compat -D
# or
bun install @eslint/compat -D
bun add @eslint/compat -D
```
For Deno:
@@ -30,10 +30,10 @@ deno add @eslint/compat
This package exports the following functions in both ESM and CommonJS format:
- `fixupRule(rule)` - wraps the given rule in a compatibility layer and returns the result
- `fixupPluginRules(plugin)` - wraps each rule in the given plugin using `fixupRule()` and returns a new object that represents the plugin with the fixed-up rules
- `fixupConfigRules(configs)` - wraps all plugins found in an array of config objects using `fixupPluginRules()`
- `includeIgnoreFile(path)` - reads an ignore file (like `.gitignore`) and converts the patterns into the correct format for the config file
- `fixupRule(rule)` - wraps the given rule in a compatibility layer and returns the result
- `fixupPluginRules(plugin)` - wraps each rule in the given plugin using `fixupRule()` and returns a new object that represents the plugin with the fixed-up rules
- `fixupConfigRules(configs)` - wraps all plugins found in an array of config objects using `fixupPluginRules()`
- `includeIgnoreFile(path)` - reads an ignore file (like `.gitignore`) and converts the patterns into the correct format for the config file
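As a quick illustration of the fixup helpers listed above, the flat-config sketch below wraps a minimal legacy-style plugin with `fixupPluginRules()`. The plugin and rule names are invented for the example and do not come from this diff.

```ts
// eslint.config.ts (illustrative only)
import { fixupPluginRules } from "@eslint/compat";
import type { ESLint, Rule } from "eslint";

// A tiny stand-in for a legacy (eslintrc-era) plugin.
const legacyRule: Rule.RuleModule = {
  meta: { type: "problem", schema: [] },
  create(context) {
    return {
      DebuggerStatement(node) {
        context.report({ node, message: "Remove debugger statements." });
      },
    };
  },
};

const legacyPlugin: ESLint.Plugin = {
  rules: { "no-debugger-example": legacyRule },
};

export default [
  {
    plugins: {
      // Wrap the legacy plugin so its rules run under ESLint 9 flat config.
      legacy: fixupPluginRules(legacyPlugin),
    },
    rules: { "legacy/no-debugger-example": "error" },
  },
];
```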
### Fixing Rules
@@ -145,7 +145,9 @@ module.exports = [
### Including Ignore Files
If you were using an alternate ignore file in ESLint v8.x, such as using `--ignore-path .gitignore` on the command line, you can include those patterns programmatically in your config file using the `includeIgnoreFile()` function. For example:
If you were using an alternate ignore file in ESLint v8.x, such as using `--ignore-path .gitignore` on the command line, you can include those patterns programmatically in your config file using the `includeIgnoreFile()` function.
The `includeIgnoreFile()` function also accepts a second optional `name` parameter that allows you to set a custom name for this configuration object. If not specified, it defaults to `"Imported .gitignore patterns"`. For example:
```js
// eslint.config.js - ESM example
@@ -158,7 +160,7 @@ const __dirname = path.dirname(__filename);
const gitignorePath = path.resolve(__dirname, ".gitignore");
export default [
includeIgnoreFile(gitignorePath),
includeIgnoreFile(gitignorePath, "Imported .gitignore patterns"), // second argument is optional.
{
// your overrides
},
@@ -174,7 +176,7 @@ const path = require("node:path");
const gitignorePath = path.resolve(__dirname, ".gitignore");
module.exports = [
includeIgnoreFile(gitignorePath),
includeIgnoreFile(gitignorePath, "Imported .gitignore patterns"), // second argument is optional.
{
// your overrides
},
@@ -187,21 +189,21 @@ module.exports = [
Apache 2.0
## Sponsors
The following companies, organizations, and individuals support ESLint's ongoing maintenance and development. [Become a Sponsor](https://eslint.org/donate) to get your logo on our README and website.
<!-- NOTE: This section is autogenerated. Do not manually edit.-->
<!--sponsorsstart-->
<h3>Platinum Sponsors</h3>
<p><a href="https://automattic.com"><img src="https://images.opencollective.com/automattic/d0ef3e1/logo.png" alt="Automattic" height="128"></a> <a href="https://www.airbnb.com/"><img src="https://images.opencollective.com/airbnb/d327d66/logo.png" alt="Airbnb" height="128"></a></p><h3>Gold Sponsors</h3>
<p><a href="#"><img src="https://images.opencollective.com/guest-bf377e88/avatar.png" alt="Eli Schleifer" height="96"></a> <a href="https://engineering.salesforce.com"><img src="https://images.opencollective.com/salesforce/ca8f997/logo.png" alt="Salesforce" height="96"></a></p><h3>Silver Sponsors</h3>
<p><a href="https://www.jetbrains.com/"><img src="https://images.opencollective.com/jetbrains/fe76f99/logo.png" alt="JetBrains" height="64"></a> <a href="https://liftoff.io/"><img src="https://images.opencollective.com/liftoff/5c4fa84/logo.png" alt="Liftoff" height="64"></a> <a href="https://americanexpress.io"><img src="https://avatars.githubusercontent.com/u/3853301?v=4" alt="American Express" height="64"></a> <a href="https://www.workleap.com"><img src="https://avatars.githubusercontent.com/u/53535748?u=d1e55d7661d724bf2281c1bfd33cb8f99fe2465f&v=4" alt="Workleap" height="64"></a></p><h3>Bronze Sponsors</h3>
<p><a href="https://www.notion.so"><img src="https://images.opencollective.com/notion/bf3b117/logo.png" alt="notion" height="32"></a> <a href="https://www.crosswordsolver.org/anagram-solver/"><img src="https://images.opencollective.com/anagram-solver/2666271/logo.png" alt="Anagram Solver" height="32"></a> <a href="https://icons8.com/"><img src="https://images.opencollective.com/icons8/7fa1641/logo.png" alt="Icons8" height="32"></a> <a href="https://discord.com"><img src="https://images.opencollective.com/discordapp/f9645d9/logo.png" alt="Discord" height="32"></a> <a href="https://www.ignitionapp.com"><img src="https://avatars.githubusercontent.com/u/5753491?v=4" alt="Ignition" height="32"></a> <a href="https://nx.dev"><img src="https://avatars.githubusercontent.com/u/23692104?v=4" alt="Nx" height="32"></a> <a href="https://herocoders.com"><img src="https://avatars.githubusercontent.com/u/37549774?v=4" alt="HeroCoders" height="32"></a> <a href="https://usenextbase.com"><img src="https://avatars.githubusercontent.com/u/145838380?v=4" alt="Nextbase Starter Kit" height="32"></a></p>
<!--sponsorsend-->
<!--techsponsorsstart-->
<h2>Technology Sponsors</h2>
<p><a href="https://netlify.com"><img src="https://raw.githubusercontent.com/eslint/eslint.org/main/src/assets/images/techsponsors/netlify-icon.svg" alt="Netlify" height="32"></a> <a href="https://algolia.com"><img src="https://raw.githubusercontent.com/eslint/eslint.org/main/src/assets/images/techsponsors/algolia-icon.svg" alt="Algolia" height="32"></a> <a href="https://1password.com"><img src="https://raw.githubusercontent.com/eslint/eslint.org/main/src/assets/images/techsponsors/1password-icon.svg" alt="1Password" height="32"></a>
</p>
<!--techsponsorsend-->
## Sponsors
The following companies, organizations, and individuals support ESLint's ongoing maintenance and development. [Become a Sponsor](https://eslint.org/donate)
to get your logo on our READMEs and [website](https://eslint.org/sponsors).
<h3>Diamond Sponsors</h3>
<p><a href="https://www.ag-grid.com/"><img src="https://images.opencollective.com/ag-grid/bec0580/logo.png" alt="AG Grid" height="128"></a></p><h3>Platinum Sponsors</h3>
<p><a href="https://automattic.com"><img src="https://images.opencollective.com/automattic/d0ef3e1/logo.png" alt="Automattic" height="128"></a> <a href="https://www.airbnb.com/"><img src="https://images.opencollective.com/airbnb/d327d66/logo.png" alt="Airbnb" height="128"></a></p><h3>Gold Sponsors</h3>
<p><a href="https://qlty.sh/"><img src="https://images.opencollective.com/qltysh/33d157d/logo.png" alt="Qlty Software" height="96"></a> <a href="https://trunk.io/"><img src="https://images.opencollective.com/trunkio/fb92d60/avatar.png" alt="trunk.io" height="96"></a> <a href="https://shopify.engineering/"><img src="https://avatars.githubusercontent.com/u/8085" alt="Shopify" height="96"></a></p><h3>Silver Sponsors</h3>
<p><a href="https://vite.dev/"><img src="https://images.opencollective.com/vite/e6d15e1/logo.png" alt="Vite" height="64"></a> <a href="https://liftoff.io/"><img src="https://images.opencollective.com/liftoff/5c4fa84/logo.png" alt="Liftoff" height="64"></a> <a href="https://americanexpress.io"><img src="https://avatars.githubusercontent.com/u/3853301" alt="American Express" height="64"></a> <a href="https://stackblitz.com"><img src="https://avatars.githubusercontent.com/u/28635252" alt="StackBlitz" height="64"></a></p><h3>Bronze Sponsors</h3>
<p><a href="https://sentry.io"><img src="https://github.com/getsentry.png" alt="Sentry" height="32"></a> <a href="https://syntax.fm"><img src="https://github.com/syntaxfm.png" alt="Syntax" height="32"></a> <a href="https://cybozu.co.jp/"><img src="https://images.opencollective.com/cybozu/933e46d/logo.png" alt="Cybozu" height="32"></a> <a href="https://www.crosswordsolver.org/anagram-solver/"><img src="https://images.opencollective.com/anagram-solver/2666271/logo.png" alt="Anagram Solver" height="32"></a> <a href="https://icons8.com/"><img src="https://images.opencollective.com/icons8/7fa1641/logo.png" alt="Icons8" height="32"></a> <a href="https://discord.com"><img src="https://images.opencollective.com/discordapp/f9645d9/logo.png" alt="Discord" height="32"></a> <a href="https://www.gitbook.com"><img src="https://avatars.githubusercontent.com/u/7111340" alt="GitBook" height="32"></a> <a href="https://nolebase.ayaka.io"><img src="https://avatars.githubusercontent.com/u/11081491" alt="Neko" height="32"></a> <a href="https://nx.dev"><img src="https://avatars.githubusercontent.com/u/23692104" alt="Nx" height="32"></a> <a href="https://opensource.mercedes-benz.com/"><img src="https://avatars.githubusercontent.com/u/34240465" alt="Mercedes-Benz Group" height="32"></a> <a href="https://herocoders.com"><img src="https://avatars.githubusercontent.com/u/37549774" alt="HeroCoders" height="32"></a> <a href="https://www.lambdatest.com"><img src="https://avatars.githubusercontent.com/u/171592363" alt="LambdaTest" height="32"></a></p>
<h3>Technology Sponsors</h3>
Technology sponsors allow us to use their products and services for free as part of a contribution to the open source ecosystem and our work.
<p><a href="https://netlify.com"><img src="https://raw.githubusercontent.com/eslint/eslint.org/main/src/assets/images/techsponsors/netlify-icon.svg" alt="Netlify" height="32"></a> <a href="https://algolia.com"><img src="https://raw.githubusercontent.com/eslint/eslint.org/main/src/assets/images/techsponsors/algolia-icon.svg" alt="Algolia" height="32"></a> <a href="https://1password.com"><img src="https://raw.githubusercontent.com/eslint/eslint.org/main/src/assets/images/techsponsors/1password-icon.svg" alt="1Password" height="32"></a></p>
<!--sponsorsend-->


@@ -14,8 +14,8 @@ var path = require('node:path');
/** @typedef {import("eslint").ESLint.Plugin} FixupPluginDefinition */
/** @typedef {import("eslint").Rule.RuleModule} FixupRuleDefinition */
/** @typedef {import("eslint").Rule.OldStyleRule} FixupLegacyRuleDefinition */
/** @typedef {import("eslint").Linter.FlatConfig} FixupConfig */
/** @typedef {FixupRuleDefinition["create"]} FixupLegacyRuleDefinition */
/** @typedef {import("eslint").Linter.Config} FixupConfig */
/** @typedef {Array<FixupConfig>} FixupConfigArray */
//-----------------------------------------------------------------------------
@@ -188,6 +188,21 @@ function fixupRule(ruleDefinition) {
create: ruleCreate,
};
// copy `schema` property of function-style rule or top-level `schema` property of object-style rule into `meta` object
// @ts-ignore -- top-level `schema` property was not officially supported for object-style rules so it doesn't exist in types
const { schema } = ruleDefinition;
if (schema) {
if (!newRuleDefinition.meta) {
newRuleDefinition.meta = { schema };
} else {
newRuleDefinition.meta = {
...newRuleDefinition.meta,
// top-level `schema` had precedence over `meta.schema` so it's okay to overwrite `meta.schema` if it exists
schema,
};
}
}
// cache the fixed up rule
fixedUpRuleReplacements.set(ruleDefinition, newRuleDefinition);
fixedUpRules.add(newRuleDefinition);
@@ -270,7 +285,7 @@ function fixupConfigRules(config) {
// Types
//-----------------------------------------------------------------------------
/** @typedef {import("eslint").Linter.FlatConfig} FlatConfig */
/** @typedef {import("eslint").Linter.Config} FlatConfig */
//-----------------------------------------------------------------------------
// Exports
@@ -324,10 +339,11 @@ function convertIgnorePatternToMinimatch(pattern) {
/**
* Reads an ignore file and returns an object with the ignore patterns.
* @param {string} ignoreFilePath The absolute path to the ignore file.
* @param {string} [name] The name of the ignore file config.
* @returns {FlatConfig} An object with an `ignores` property that is an array of ignore patterns.
* @throws {Error} If the ignore file path is not an absolute path.
*/
function includeIgnoreFile(ignoreFilePath) {
function includeIgnoreFile(ignoreFilePath, name) {
if (!path.isAbsolute(ignoreFilePath)) {
throw new Error("The ignore file location must be an absolute path.");
}
@@ -336,7 +352,7 @@ function includeIgnoreFile(ignoreFilePath) {
const lines = ignoreFile.split(/\r?\n/u);
return {
name: "Imported .gitignore patterns",
name: name || "Imported .gitignore patterns",
ignores: lines
.map(line => line.trim())
.filter(line => line && !line.startsWith("#"))


@@ -1,14 +1,14 @@
export type FlatConfig = import("eslint").Linter.FlatConfig;
export type FlatConfig = import("eslint").Linter.Config;
export type FixupPluginDefinition = import("eslint").ESLint.Plugin;
export type FixupRuleDefinition = import("eslint").Rule.RuleModule;
export type FixupLegacyRuleDefinition = import("eslint").Rule.OldStyleRule;
export type FixupConfig = import("eslint").Linter.FlatConfig;
export type FixupLegacyRuleDefinition = FixupRuleDefinition["create"];
export type FixupConfig = import("eslint").Linter.Config;
export type FixupConfigArray = Array<FixupConfig>;
/**
* @fileoverview Ignore file utilities for the compat package.
* @author Nicholas C. Zakas
*/
/** @typedef {import("eslint").Linter.FlatConfig} FlatConfig */
/** @typedef {import("eslint").Linter.Config} FlatConfig */
/**
* Converts an ESLint ignore pattern to a minimatch pattern.
* @param {string} pattern The .eslintignore or .gitignore pattern to convert.
@@ -39,7 +39,8 @@ export function fixupRule(ruleDefinition: FixupRuleDefinition | FixupLegacyRuleD
/**
* Reads an ignore file and returns an object with the ignore patterns.
* @param {string} ignoreFilePath The absolute path to the ignore file.
* @param {string} [name] The name of the ignore file config.
* @returns {FlatConfig} An object with an `ignores` property that is an array of ignore patterns.
* @throws {Error} If the ignore file path is not an absolute path.
*/
export function includeIgnoreFile(ignoreFilePath: string): FlatConfig;
export function includeIgnoreFile(ignoreFilePath: string, name?: string): FlatConfig;


@@ -1,14 +1,14 @@
export type FlatConfig = import("eslint").Linter.FlatConfig;
export type FlatConfig = import("eslint").Linter.Config;
export type FixupPluginDefinition = import("eslint").ESLint.Plugin;
export type FixupRuleDefinition = import("eslint").Rule.RuleModule;
export type FixupLegacyRuleDefinition = import("eslint").Rule.OldStyleRule;
export type FixupConfig = import("eslint").Linter.FlatConfig;
export type FixupLegacyRuleDefinition = FixupRuleDefinition["create"];
export type FixupConfig = import("eslint").Linter.Config;
export type FixupConfigArray = Array<FixupConfig>;
/**
* @fileoverview Ignore file utilities for the compat package.
* @author Nicholas C. Zakas
*/
/** @typedef {import("eslint").Linter.FlatConfig} FlatConfig */
/** @typedef {import("eslint").Linter.Config} FlatConfig */
/**
* Converts an ESLint ignore pattern to a minimatch pattern.
* @param {string} pattern The .eslintignore or .gitignore pattern to convert.
@@ -39,7 +39,8 @@ export function fixupRule(ruleDefinition: FixupRuleDefinition | FixupLegacyRuleD
/**
* Reads an ignore file and returns an object with the ignore patterns.
* @param {string} ignoreFilePath The absolute path to the ignore file.
* @param {string} [name] The name of the ignore file config.
* @returns {FlatConfig} An object with an `ignores` property that is an array of ignore patterns.
* @throws {Error} If the ignore file path is not an absolute path.
*/
export function includeIgnoreFile(ignoreFilePath: string): FlatConfig;
export function includeIgnoreFile(ignoreFilePath: string, name?: string): FlatConfig;


@@ -13,8 +13,8 @@ import path from 'node:path';
/** @typedef {import("eslint").ESLint.Plugin} FixupPluginDefinition */
/** @typedef {import("eslint").Rule.RuleModule} FixupRuleDefinition */
/** @typedef {import("eslint").Rule.OldStyleRule} FixupLegacyRuleDefinition */
/** @typedef {import("eslint").Linter.FlatConfig} FixupConfig */
/** @typedef {FixupRuleDefinition["create"]} FixupLegacyRuleDefinition */
/** @typedef {import("eslint").Linter.Config} FixupConfig */
/** @typedef {Array<FixupConfig>} FixupConfigArray */
//-----------------------------------------------------------------------------
@@ -187,6 +187,21 @@ function fixupRule(ruleDefinition) {
create: ruleCreate,
};
// copy `schema` property of function-style rule or top-level `schema` property of object-style rule into `meta` object
// @ts-ignore -- top-level `schema` property was not officially supported for object-style rules so it doesn't exist in types
const { schema } = ruleDefinition;
if (schema) {
if (!newRuleDefinition.meta) {
newRuleDefinition.meta = { schema };
} else {
newRuleDefinition.meta = {
...newRuleDefinition.meta,
// top-level `schema` had precedence over `meta.schema` so it's okay to overwrite `meta.schema` if it exists
schema,
};
}
}
// cache the fixed up rule
fixedUpRuleReplacements.set(ruleDefinition, newRuleDefinition);
fixedUpRules.add(newRuleDefinition);
@@ -269,7 +284,7 @@ function fixupConfigRules(config) {
// Types
//-----------------------------------------------------------------------------
/** @typedef {import("eslint").Linter.FlatConfig} FlatConfig */
/** @typedef {import("eslint").Linter.Config} FlatConfig */
//-----------------------------------------------------------------------------
// Exports
@@ -323,10 +338,11 @@ function convertIgnorePatternToMinimatch(pattern) {
/**
* Reads an ignore file and returns an object with the ignore patterns.
* @param {string} ignoreFilePath The absolute path to the ignore file.
* @param {string} [name] The name of the ignore file config.
* @returns {FlatConfig} An object with an `ignores` property that is an array of ignore patterns.
* @throws {Error} If the ignore file path is not an absolute path.
*/
function includeIgnoreFile(ignoreFilePath) {
function includeIgnoreFile(ignoreFilePath, name) {
if (!path.isAbsolute(ignoreFilePath)) {
throw new Error("The ignore file location must be an absolute path.");
}
@@ -335,7 +351,7 @@ function includeIgnoreFile(ignoreFilePath) {
const lines = ignoreFile.split(/\r?\n/u);
return {
name: "Imported .gitignore patterns",
name: name || "Imported .gitignore patterns",
ignores: lines
.map(line => line.trim())
.filter(line => line && !line.startsWith("#"))


@@ -1,6 +1,6 @@
{
"name": "@eslint/compat",
"version": "1.1.1",
"version": "1.3.1",
"description": "Compatibility utilities for ESLint",
"type": "module",
"main": "dist/esm/index.js",
@@ -25,7 +25,7 @@
"test": "tests"
},
"scripts": {
"build:cts": "node -e \"fs.copyFileSync('dist/esm/index.d.ts', 'dist/cjs/index.d.cts')\"",
"build:cts": "node ../../tools/build-cts.js dist/esm/index.d.ts dist/cjs/index.d.cts",
"build": "rollup -c && tsc -p tsconfig.esm.json && npm run build:cts",
"test:jsr": "npx jsr@latest publish --dry-run",
"test": "mocha tests/*.js",
@@ -33,7 +33,8 @@
},
"repository": {
"type": "git",
"url": "git+https://github.com/eslint/rewrite.git"
"url": "git+https://github.com/eslint/rewrite.git",
"directory": "packages/compat"
},
"keywords": [
"eslint",
@@ -46,14 +47,18 @@
"bugs": {
"url": "https://github.com/eslint/rewrite/issues"
},
"homepage": "https://github.com/eslint/rewrite#readme",
"homepage": "https://github.com/eslint/rewrite/tree/main/packages/compat#readme",
"devDependencies": {
"@types/eslint": "^8.56.10",
"c8": "^9.1.0",
"eslint": "^9.0.0",
"mocha": "^10.4.0",
"rollup": "^4.16.2",
"typescript": "^5.4.5"
"@eslint/core": "^0.15.1",
"eslint": "^9.27.0"
},
"peerDependencies": {
"eslint": "^8.40 || 9"
},
"peerDependenciesMeta": {
"eslint": {
"optional": true
}
},
"engines": {
"node": "^18.18.0 || ^20.9.0 || >=21.1.0"


@@ -1,6 +1,6 @@
{
"name": "@eslint/js",
"version": "9.28.0",
"version": "9.31.0",
"description": "ESLint JavaScript language implementation",
"funding": "https://eslint.org/donate",
"main": "./src/index.js",


@@ -1,6 +1,6 @@
The ISC License
Copyright (c) 2019 Elan Shanker, Paul Miller (https://paulmillr.com)
Copyright (c) Isaac Z. Schlueter and Contributors
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above

71
node_modules/@isaacs/fs-minipass/README.md generated vendored Normal file

@@ -0,0 +1,71 @@
# fs-minipass
Filesystem streams based on [minipass](http://npm.im/minipass).
4 classes are exported:
- ReadStream
- ReadStreamSync
- WriteStream
- WriteStreamSync
When using `ReadStreamSync`, all of the data is made available
immediately upon consuming the stream. Nothing is buffered in memory
when the stream is constructed. If the stream is piped to a writer,
then it will synchronously `read()` and emit data into the writer as
fast as the writer can consume it. (That is, it will respect
backpressure.) If you call `stream.read()` then it will read the
entire file and return the contents.
When using `WriteStreamSync`, every write is flushed to the file
synchronously. If your writes all come in a single tick, then it'll
write it all out in a single tick. It's as synchronous as you are.
The async versions work much like their node builtin counterparts,
with the exception of introducing significantly less Stream machinery
overhead.
## USAGE
It's just streams, you pipe them or read() them or write() to them.
```js
import { ReadStream, WriteStream } from 'fs-minipass'
// or: const { ReadStream, WriteStream } = require('fs-minipass')
const readStream = new ReadStream('file.txt')
const writeStream = new WriteStream('output.txt')
writeStream.write('some file header or whatever\n')
readStream.pipe(writeStream)
```
## ReadStream(path, options)
Path string is required, but somewhat irrelevant if an open file
descriptor is passed in as an option.
Options:
- `fd` Pass in a numeric file descriptor, if the file is already open.
- `readSize` The size of reads to do, defaults to 16MB
- `size` The size of the file, if known. Prevents zero-byte read()
call at the end.
- `autoClose` Set to `false` to prevent the file descriptor from being
closed when the file is done being read.
## WriteStream(path, options)
Path string is required, but somewhat irrelevant if an open file
descriptor is passed in as an option.
Options:
- `fd` Pass in a numeric file descriptor, if the file is already open.
- `mode` The mode to create the file with. Defaults to `0o666`.
- `start` The position in the file to start writing. If not
specified, then the file will start writing at position zero, and be
truncated by default.
- `autoClose` Set to `false` to prevent the file descriptor from being
closed when the stream is ended.
- `flags` Flags to use when opening the file. Irrelevant if `fd` is
passed in, since file won't be opened in that case. Defaults to
`'a'` if a `pos` is specified, or `'w'` otherwise.
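Bringing the two option lists together, here is a small sketch that constructs both stream types with a few of the options described above. File names are placeholders; note that in this vendored copy the package is published as `@isaacs/fs-minipass`, while the README's own example uses the older unscoped name.

```ts
import { ReadStream, WriteStream } from "@isaacs/fs-minipass";

// Read in 1 MiB chunks and leave the file descriptor open after the read finishes.
const source = new ReadStream("input.bin", {
  readSize: 1024 * 1024,
  autoClose: false,
});

// Create the output with explicit permissions; it is closed automatically when done.
const sink = new WriteStream("output.bin", { mode: 0o644 });

sink.write("some file header or whatever\n");
source.pipe(sink);
```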


@@ -0,0 +1,118 @@
/// <reference types="node" />
/// <reference types="node" />
/// <reference types="node" />
import EE from 'events';
import { Minipass } from 'minipass';
declare const _autoClose: unique symbol;
declare const _close: unique symbol;
declare const _ended: unique symbol;
declare const _fd: unique symbol;
declare const _finished: unique symbol;
declare const _flags: unique symbol;
declare const _flush: unique symbol;
declare const _handleChunk: unique symbol;
declare const _makeBuf: unique symbol;
declare const _mode: unique symbol;
declare const _needDrain: unique symbol;
declare const _onerror: unique symbol;
declare const _onopen: unique symbol;
declare const _onread: unique symbol;
declare const _onwrite: unique symbol;
declare const _open: unique symbol;
declare const _path: unique symbol;
declare const _pos: unique symbol;
declare const _queue: unique symbol;
declare const _read: unique symbol;
declare const _readSize: unique symbol;
declare const _reading: unique symbol;
declare const _remain: unique symbol;
declare const _size: unique symbol;
declare const _write: unique symbol;
declare const _writing: unique symbol;
declare const _defaultFlag: unique symbol;
declare const _errored: unique symbol;
export type ReadStreamOptions = Minipass.Options<Minipass.ContiguousData> & {
fd?: number;
readSize?: number;
size?: number;
autoClose?: boolean;
};
export type ReadStreamEvents = Minipass.Events<Minipass.ContiguousData> & {
open: [fd: number];
};
export declare class ReadStream extends Minipass<Minipass.ContiguousData, Buffer, ReadStreamEvents> {
[_errored]: boolean;
[_fd]?: number;
[_path]: string;
[_readSize]: number;
[_reading]: boolean;
[_size]: number;
[_remain]: number;
[_autoClose]: boolean;
constructor(path: string, opt: ReadStreamOptions);
get fd(): number | undefined;
get path(): string;
write(): void;
end(): void;
[_open](): void;
[_onopen](er?: NodeJS.ErrnoException | null, fd?: number): void;
[_makeBuf](): Buffer;
[_read](): void;
[_onread](er?: NodeJS.ErrnoException | null, br?: number, buf?: Buffer): void;
[_close](): void;
[_onerror](er: NodeJS.ErrnoException): void;
[_handleChunk](br: number, buf: Buffer): boolean;
emit<Event extends keyof ReadStreamEvents>(ev: Event, ...args: ReadStreamEvents[Event]): boolean;
}
export declare class ReadStreamSync extends ReadStream {
[_open](): void;
[_read](): void;
[_close](): void;
}
export type WriteStreamOptions = {
fd?: number;
autoClose?: boolean;
mode?: number;
captureRejections?: boolean;
start?: number;
flags?: string;
};
export declare class WriteStream extends EE {
readable: false;
writable: boolean;
[_errored]: boolean;
[_writing]: boolean;
[_ended]: boolean;
[_queue]: Buffer[];
[_needDrain]: boolean;
[_path]: string;
[_mode]: number;
[_autoClose]: boolean;
[_fd]?: number;
[_defaultFlag]: boolean;
[_flags]: string;
[_finished]: boolean;
[_pos]?: number;
constructor(path: string, opt: WriteStreamOptions);
emit(ev: string, ...args: any[]): boolean;
get fd(): number | undefined;
get path(): string;
[_onerror](er: NodeJS.ErrnoException): void;
[_open](): void;
[_onopen](er?: null | NodeJS.ErrnoException, fd?: number): void;
end(buf: string, enc?: BufferEncoding): this;
end(buf?: Buffer, enc?: undefined): this;
write(buf: string, enc?: BufferEncoding): boolean;
write(buf: Buffer, enc?: undefined): boolean;
[_write](buf: Buffer): void;
[_onwrite](er?: null | NodeJS.ErrnoException, bw?: number): void;
[_flush](): void;
[_close](): void;
}
export declare class WriteStreamSync extends WriteStream {
[_open](): void;
[_close](): void;
[_write](buf: Buffer): void;
}
export {};
//# sourceMappingURL=index.d.ts.map


@@ -0,0 +1 @@
{"version":3,"file":"index.d.ts","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":";;;AAAA,OAAO,EAAE,MAAM,QAAQ,CAAA;AAEvB,OAAO,EAAE,QAAQ,EAAE,MAAM,UAAU,CAAA;AAInC,QAAA,MAAM,UAAU,eAAuB,CAAA;AACvC,QAAA,MAAM,MAAM,eAAmB,CAAA;AAC/B,QAAA,MAAM,MAAM,eAAmB,CAAA;AAC/B,QAAA,MAAM,GAAG,eAAgB,CAAA;AACzB,QAAA,MAAM,SAAS,eAAsB,CAAA;AACrC,QAAA,MAAM,MAAM,eAAmB,CAAA;AAC/B,QAAA,MAAM,MAAM,eAAmB,CAAA;AAC/B,QAAA,MAAM,YAAY,eAAyB,CAAA;AAC3C,QAAA,MAAM,QAAQ,eAAqB,CAAA;AACnC,QAAA,MAAM,KAAK,eAAkB,CAAA;AAC7B,QAAA,MAAM,UAAU,eAAuB,CAAA;AACvC,QAAA,MAAM,QAAQ,eAAqB,CAAA;AACnC,QAAA,MAAM,OAAO,eAAoB,CAAA;AACjC,QAAA,MAAM,OAAO,eAAoB,CAAA;AACjC,QAAA,MAAM,QAAQ,eAAqB,CAAA;AACnC,QAAA,MAAM,KAAK,eAAkB,CAAA;AAC7B,QAAA,MAAM,KAAK,eAAkB,CAAA;AAC7B,QAAA,MAAM,IAAI,eAAiB,CAAA;AAC3B,QAAA,MAAM,MAAM,eAAmB,CAAA;AAC/B,QAAA,MAAM,KAAK,eAAkB,CAAA;AAC7B,QAAA,MAAM,SAAS,eAAsB,CAAA;AACrC,QAAA,MAAM,QAAQ,eAAqB,CAAA;AACnC,QAAA,MAAM,OAAO,eAAoB,CAAA;AACjC,QAAA,MAAM,KAAK,eAAkB,CAAA;AAC7B,QAAA,MAAM,MAAM,eAAmB,CAAA;AAC/B,QAAA,MAAM,QAAQ,eAAqB,CAAA;AACnC,QAAA,MAAM,YAAY,eAAyB,CAAA;AAC3C,QAAA,MAAM,QAAQ,eAAqB,CAAA;AAEnC,MAAM,MAAM,iBAAiB,GAC3B,QAAQ,CAAC,OAAO,CAAC,QAAQ,CAAC,cAAc,CAAC,GAAG;IAC1C,EAAE,CAAC,EAAE,MAAM,CAAA;IACX,QAAQ,CAAC,EAAE,MAAM,CAAA;IACjB,IAAI,CAAC,EAAE,MAAM,CAAA;IACb,SAAS,CAAC,EAAE,OAAO,CAAA;CACpB,CAAA;AAEH,MAAM,MAAM,gBAAgB,GAAG,QAAQ,CAAC,MAAM,CAAC,QAAQ,CAAC,cAAc,CAAC,GAAG;IACxE,IAAI,EAAE,CAAC,EAAE,EAAE,MAAM,CAAC,CAAA;CACnB,CAAA;AAED,qBAAa,UAAW,SAAQ,QAAQ,CACtC,QAAQ,CAAC,cAAc,EACvB,MAAM,EACN,gBAAgB,CACjB;IACC,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAS;IAC5B,CAAC,GAAG,CAAC,CAAC,EAAE,MAAM,CAAC;IACf,CAAC,KAAK,CAAC,EAAE,MAAM,CAAC;IAChB,CAAC,SAAS,CAAC,EAAE,MAAM,CAAC;IACpB,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAS;IAC5B,CAAC,KAAK,CAAC,EAAE,MAAM,CAAC;IAChB,CAAC,OAAO,CAAC,EAAE,MAAM,CAAC;IAClB,CAAC,UAAU,CAAC,EAAE,OAAO,CAAA;gBAET,IAAI,EAAE,MAAM,EAAE,GAAG,EAAE,iBAAiB;IA4BhD,IAAI,EAAE,uBAEL;IAED,IAAI,IAAI,WAEP;IAGD,KAAK;IAKL,GAAG;IAIH,CAAC,KAAK,CAAC;IAIP,CAAC,OAAO,CAAC,CAAC,EAAE,CAAC,EAAE,MAAM,CAAC,cAAc,GAAG,IAAI,EAAE,EAAE,CAAC,EAAE,MAAM;IAUxD,CAAC,QAAQ,CAAC;IAIV,CAAC,KAAK,CAAC;IAeP,CAAC,OAAO,CAAC,CAAC,EAAE,CAAC,EAAE,MAAM,CAAC,cAAc,GAAG,IAAI,EAAE,EAAE,CAAC,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,MAAM;IAStE,CAAC,MAAM,CAAC;IAUR,CAAC,QAAQ,CAAC,CAAC,EAAE,EAAE,MAAM,CAAC,cAAc;IAMpC,CAAC,YAAY,CAAC,CAAC,EAAE,EAAE,MAAM,EAAE,GAAG,EAAE,MAAM;IAiBtC,IAAI,CAAC,KAAK,SAAS,MAAM,gBAAgB,EACvC,EAAE,EAAE,KAAK,EACT,GAAG,IAAI,EAAE,gBAAgB,CAAC,KAAK,CAAC,GAC/B,OAAO;CAuBX;AAED,qBAAa,cAAe,SAAQ,UAAU;IAC5C,CAAC,KAAK,CAAC;IAYP,CAAC,KAAK,CAAC;IA2BP,CAAC,MAAM,CAAC;CAQT;AAED,MAAM,MAAM,kBAAkB,GAAG;IAC/B,EAAE,CAAC,EAAE,MAAM,CAAA;IACX,SAAS,CAAC,EAAE,OAAO,CAAA;IACnB,IAAI,CAAC,EAAE,MAAM,CAAA;IACb,iBAAiB,CAAC,EAAE,OAAO,CAAA;IAC3B,KAAK,CAAC,EAAE,MAAM,CAAA;IACd,KAAK,CAAC,EAAE,MAAM,CAAA;CACf,CAAA;AAED,qBAAa,WAAY,SAAQ,EAAE;IACjC,QAAQ,EAAE,KAAK,CAAQ;IACvB,QAAQ,EAAE,OAAO,CAAQ;IACzB,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAS;IAC5B,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAS;IAC5B,CAAC,MAAM,CAAC,EAAE,OAAO,CAAS;IAC1B,CAAC,MAAM,CAAC,EAAE,MAAM,EAAE,CAAM;IACxB,CAAC,UAAU,CAAC,EAAE,OAAO,CAAS;IAC9B,CAAC,KAAK,CAAC,EAAE,MAAM,CAAC;IAChB,CAAC,KAAK,CAAC,EAAE,MAAM,CAAC;IAChB,CAAC,UAAU,CAAC,EAAE,OAAO,CAAC;IACtB,CAAC,GAAG,CAAC,CAAC,EAAE,MAAM,CAAC;IACf,CAAC,YAAY,CAAC,EAAE,OAAO,CAAC;IACxB,CAAC,MAAM,CAAC,EAAE,MAAM,CAAC;IACjB,CAAC,SAAS,CAAC,EAAE,OAAO,CAAS;IAC7B,CAAC,IAAI,CAAC,CAAC,EAAE,MAAM,CAAA;gBAEH,IAAI,EAAE,MAAM,EAAE,GAAG,EAAE,kBAAkB;IAoBjD,IAAI,CAAC,EAAE,EAAE,MAAM,EAAE,GAAG,IAAI,EAAE,GAAG,EAAE;IAU/B,IAAI,EAAE,uBAEL;IAED,IAAI,IAAI,WAEP;IAED,CAAC,QAAQ,CAAC,CAAC,EAAE,EAAE,MAAM,CAAC,cAAc;IAMpC,CAAC,KAAK,CAAC;IAMP,CAAC,OAAO,CAAC,CAAC,EAAE,CAAC,EAAE,IAAI,GAAG,MAAM,CAAC,c
AAc,EAAE,EAAE,CAAC,EAAE,MAAM;IAoBxD,GAAG,CAAC,GAAG,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,cAAc,GAAG,IAAI;IAC5C,GAAG,CAAC,GAAG,CAAC,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,SAAS,GAAG,IAAI;IAoBxC,KAAK,CAAC,GAAG,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,cAAc,GAAG,OAAO;IACjD,KAAK,CAAC,GAAG,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,SAAS,GAAG,OAAO;IAsB5C,CAAC,MAAM,CAAC,CAAC,GAAG,EAAE,MAAM;IAWpB,CAAC,QAAQ,CAAC,CAAC,EAAE,CAAC,EAAE,IAAI,GAAG,MAAM,CAAC,cAAc,EAAE,EAAE,CAAC,EAAE,MAAM;IAwBzD,CAAC,MAAM,CAAC;IAgBR,CAAC,MAAM,CAAC;CAST;AAED,qBAAa,eAAgB,SAAQ,WAAW;IAC9C,CAAC,KAAK,CAAC,IAAI,IAAI;IAsBf,CAAC,MAAM,CAAC;IASR,CAAC,MAAM,CAAC,CAAC,GAAG,EAAE,MAAM;CAmBrB"}

430
node_modules/@isaacs/fs-minipass/dist/commonjs/index.js generated vendored Normal file

@@ -0,0 +1,430 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.WriteStreamSync = exports.WriteStream = exports.ReadStreamSync = exports.ReadStream = void 0;
const events_1 = __importDefault(require("events"));
const fs_1 = __importDefault(require("fs"));
const minipass_1 = require("minipass");
const writev = fs_1.default.writev;
const _autoClose = Symbol('_autoClose');
const _close = Symbol('_close');
const _ended = Symbol('_ended');
const _fd = Symbol('_fd');
const _finished = Symbol('_finished');
const _flags = Symbol('_flags');
const _flush = Symbol('_flush');
const _handleChunk = Symbol('_handleChunk');
const _makeBuf = Symbol('_makeBuf');
const _mode = Symbol('_mode');
const _needDrain = Symbol('_needDrain');
const _onerror = Symbol('_onerror');
const _onopen = Symbol('_onopen');
const _onread = Symbol('_onread');
const _onwrite = Symbol('_onwrite');
const _open = Symbol('_open');
const _path = Symbol('_path');
const _pos = Symbol('_pos');
const _queue = Symbol('_queue');
const _read = Symbol('_read');
const _readSize = Symbol('_readSize');
const _reading = Symbol('_reading');
const _remain = Symbol('_remain');
const _size = Symbol('_size');
const _write = Symbol('_write');
const _writing = Symbol('_writing');
const _defaultFlag = Symbol('_defaultFlag');
const _errored = Symbol('_errored');
class ReadStream extends minipass_1.Minipass {
[_errored] = false;
[_fd];
[_path];
[_readSize];
[_reading] = false;
[_size];
[_remain];
[_autoClose];
constructor(path, opt) {
opt = opt || {};
super(opt);
this.readable = true;
this.writable = false;
if (typeof path !== 'string') {
throw new TypeError('path must be a string');
}
this[_errored] = false;
this[_fd] = typeof opt.fd === 'number' ? opt.fd : undefined;
this[_path] = path;
this[_readSize] = opt.readSize || 16 * 1024 * 1024;
this[_reading] = false;
this[_size] = typeof opt.size === 'number' ? opt.size : Infinity;
this[_remain] = this[_size];
this[_autoClose] =
typeof opt.autoClose === 'boolean' ? opt.autoClose : true;
if (typeof this[_fd] === 'number') {
this[_read]();
}
else {
this[_open]();
}
}
get fd() {
return this[_fd];
}
get path() {
return this[_path];
}
//@ts-ignore
write() {
throw new TypeError('this is a readable stream');
}
//@ts-ignore
end() {
throw new TypeError('this is a readable stream');
}
[_open]() {
fs_1.default.open(this[_path], 'r', (er, fd) => this[_onopen](er, fd));
}
[_onopen](er, fd) {
if (er) {
this[_onerror](er);
}
else {
this[_fd] = fd;
this.emit('open', fd);
this[_read]();
}
}
[_makeBuf]() {
return Buffer.allocUnsafe(Math.min(this[_readSize], this[_remain]));
}
[_read]() {
if (!this[_reading]) {
this[_reading] = true;
const buf = this[_makeBuf]();
/* c8 ignore start */
if (buf.length === 0) {
return process.nextTick(() => this[_onread](null, 0, buf));
}
/* c8 ignore stop */
fs_1.default.read(this[_fd], buf, 0, buf.length, null, (er, br, b) => this[_onread](er, br, b));
}
}
[_onread](er, br, buf) {
this[_reading] = false;
if (er) {
this[_onerror](er);
}
else if (this[_handleChunk](br, buf)) {
this[_read]();
}
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs_1.default.close(fd, er => er ? this.emit('error', er) : this.emit('close'));
}
}
[_onerror](er) {
this[_reading] = true;
this[_close]();
this.emit('error', er);
}
[_handleChunk](br, buf) {
let ret = false;
// no effect if infinite
this[_remain] -= br;
if (br > 0) {
ret = super.write(br < buf.length ? buf.subarray(0, br) : buf);
}
if (br === 0 || this[_remain] <= 0) {
ret = false;
this[_close]();
super.end();
}
return ret;
}
emit(ev, ...args) {
switch (ev) {
case 'prefinish':
case 'finish':
return false;
case 'drain':
if (typeof this[_fd] === 'number') {
this[_read]();
}
return false;
case 'error':
if (this[_errored]) {
return false;
}
this[_errored] = true;
return super.emit(ev, ...args);
default:
return super.emit(ev, ...args);
}
}
}
exports.ReadStream = ReadStream;
class ReadStreamSync extends ReadStream {
[_open]() {
let threw = true;
try {
this[_onopen](null, fs_1.default.openSync(this[_path], 'r'));
threw = false;
}
finally {
if (threw) {
this[_close]();
}
}
}
[_read]() {
let threw = true;
try {
if (!this[_reading]) {
this[_reading] = true;
do {
const buf = this[_makeBuf]();
/* c8 ignore start */
const br = buf.length === 0
? 0
: fs_1.default.readSync(this[_fd], buf, 0, buf.length, null);
/* c8 ignore stop */
if (!this[_handleChunk](br, buf)) {
break;
}
} while (true);
this[_reading] = false;
}
threw = false;
}
finally {
if (threw) {
this[_close]();
}
}
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs_1.default.closeSync(fd);
this.emit('close');
}
}
}
exports.ReadStreamSync = ReadStreamSync;
class WriteStream extends events_1.default {
readable = false;
writable = true;
[_errored] = false;
[_writing] = false;
[_ended] = false;
[_queue] = [];
[_needDrain] = false;
[_path];
[_mode];
[_autoClose];
[_fd];
[_defaultFlag];
[_flags];
[_finished] = false;
[_pos];
constructor(path, opt) {
opt = opt || {};
super(opt);
this[_path] = path;
this[_fd] = typeof opt.fd === 'number' ? opt.fd : undefined;
this[_mode] = opt.mode === undefined ? 0o666 : opt.mode;
this[_pos] = typeof opt.start === 'number' ? opt.start : undefined;
this[_autoClose] =
typeof opt.autoClose === 'boolean' ? opt.autoClose : true;
// truncating makes no sense when writing into the middle
const defaultFlag = this[_pos] !== undefined ? 'r+' : 'w';
this[_defaultFlag] = opt.flags === undefined;
this[_flags] = opt.flags === undefined ? defaultFlag : opt.flags;
if (this[_fd] === undefined) {
this[_open]();
}
}
emit(ev, ...args) {
if (ev === 'error') {
if (this[_errored]) {
return false;
}
this[_errored] = true;
}
return super.emit(ev, ...args);
}
get fd() {
return this[_fd];
}
get path() {
return this[_path];
}
[_onerror](er) {
this[_close]();
this[_writing] = true;
this.emit('error', er);
}
[_open]() {
fs_1.default.open(this[_path], this[_flags], this[_mode], (er, fd) => this[_onopen](er, fd));
}
[_onopen](er, fd) {
if (this[_defaultFlag] &&
this[_flags] === 'r+' &&
er &&
er.code === 'ENOENT') {
this[_flags] = 'w';
this[_open]();
}
else if (er) {
this[_onerror](er);
}
else {
this[_fd] = fd;
this.emit('open', fd);
if (!this[_writing]) {
this[_flush]();
}
}
}
end(buf, enc) {
if (buf) {
//@ts-ignore
this.write(buf, enc);
}
this[_ended] = true;
// synthetic after-write logic, where drain/finish live
if (!this[_writing] &&
!this[_queue].length &&
typeof this[_fd] === 'number') {
this[_onwrite](null, 0);
}
return this;
}
write(buf, enc) {
if (typeof buf === 'string') {
buf = Buffer.from(buf, enc);
}
if (this[_ended]) {
this.emit('error', new Error('write() after end()'));
return false;
}
if (this[_fd] === undefined || this[_writing] || this[_queue].length) {
this[_queue].push(buf);
this[_needDrain] = true;
return false;
}
this[_writing] = true;
this[_write](buf);
return true;
}
[_write](buf) {
fs_1.default.write(this[_fd], buf, 0, buf.length, this[_pos], (er, bw) => this[_onwrite](er, bw));
}
[_onwrite](er, bw) {
if (er) {
this[_onerror](er);
}
else {
if (this[_pos] !== undefined && typeof bw === 'number') {
this[_pos] += bw;
}
if (this[_queue].length) {
this[_flush]();
}
else {
this[_writing] = false;
if (this[_ended] && !this[_finished]) {
this[_finished] = true;
this[_close]();
this.emit('finish');
}
else if (this[_needDrain]) {
this[_needDrain] = false;
this.emit('drain');
}
}
}
}
[_flush]() {
if (this[_queue].length === 0) {
if (this[_ended]) {
this[_onwrite](null, 0);
}
}
else if (this[_queue].length === 1) {
this[_write](this[_queue].pop());
}
else {
const iovec = this[_queue];
this[_queue] = [];
writev(this[_fd], iovec, this[_pos], (er, bw) => this[_onwrite](er, bw));
}
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs_1.default.close(fd, er => er ? this.emit('error', er) : this.emit('close'));
}
}
}
exports.WriteStream = WriteStream;
class WriteStreamSync extends WriteStream {
[_open]() {
let fd;
// only wrap in a try{} block if we know we'll retry, to avoid
// the rethrow obscuring the error's source frame in most cases.
if (this[_defaultFlag] && this[_flags] === 'r+') {
try {
fd = fs_1.default.openSync(this[_path], this[_flags], this[_mode]);
}
catch (er) {
if (er?.code === 'ENOENT') {
this[_flags] = 'w';
return this[_open]();
}
else {
throw er;
}
}
}
else {
fd = fs_1.default.openSync(this[_path], this[_flags], this[_mode]);
}
this[_onopen](null, fd);
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs_1.default.closeSync(fd);
this.emit('close');
}
}
[_write](buf) {
// throw the original, but try to close if it fails
let threw = true;
try {
this[_onwrite](null, fs_1.default.writeSync(this[_fd], buf, 0, buf.length, this[_pos]));
threw = false;
}
finally {
if (threw) {
try {
this[_close]();
}
catch {
// ok error
}
}
}
}
}
exports.WriteStreamSync = WriteStreamSync;
//# sourceMappingURL=index.js.map

File diff suppressed because one or more lines are too long

118
node_modules/@isaacs/fs-minipass/dist/esm/index.d.ts generated vendored Normal file

@@ -0,0 +1,118 @@
/// <reference types="node" resolution-mode="require"/>
/// <reference types="node" resolution-mode="require"/>
/// <reference types="node" resolution-mode="require"/>
import EE from 'events';
import { Minipass } from 'minipass';
declare const _autoClose: unique symbol;
declare const _close: unique symbol;
declare const _ended: unique symbol;
declare const _fd: unique symbol;
declare const _finished: unique symbol;
declare const _flags: unique symbol;
declare const _flush: unique symbol;
declare const _handleChunk: unique symbol;
declare const _makeBuf: unique symbol;
declare const _mode: unique symbol;
declare const _needDrain: unique symbol;
declare const _onerror: unique symbol;
declare const _onopen: unique symbol;
declare const _onread: unique symbol;
declare const _onwrite: unique symbol;
declare const _open: unique symbol;
declare const _path: unique symbol;
declare const _pos: unique symbol;
declare const _queue: unique symbol;
declare const _read: unique symbol;
declare const _readSize: unique symbol;
declare const _reading: unique symbol;
declare const _remain: unique symbol;
declare const _size: unique symbol;
declare const _write: unique symbol;
declare const _writing: unique symbol;
declare const _defaultFlag: unique symbol;
declare const _errored: unique symbol;
export type ReadStreamOptions = Minipass.Options<Minipass.ContiguousData> & {
fd?: number;
readSize?: number;
size?: number;
autoClose?: boolean;
};
export type ReadStreamEvents = Minipass.Events<Minipass.ContiguousData> & {
open: [fd: number];
};
export declare class ReadStream extends Minipass<Minipass.ContiguousData, Buffer, ReadStreamEvents> {
[_errored]: boolean;
[_fd]?: number;
[_path]: string;
[_readSize]: number;
[_reading]: boolean;
[_size]: number;
[_remain]: number;
[_autoClose]: boolean;
constructor(path: string, opt: ReadStreamOptions);
get fd(): number | undefined;
get path(): string;
write(): void;
end(): void;
[_open](): void;
[_onopen](er?: NodeJS.ErrnoException | null, fd?: number): void;
[_makeBuf](): Buffer;
[_read](): void;
[_onread](er?: NodeJS.ErrnoException | null, br?: number, buf?: Buffer): void;
[_close](): void;
[_onerror](er: NodeJS.ErrnoException): void;
[_handleChunk](br: number, buf: Buffer): boolean;
emit<Event extends keyof ReadStreamEvents>(ev: Event, ...args: ReadStreamEvents[Event]): boolean;
}
export declare class ReadStreamSync extends ReadStream {
[_open](): void;
[_read](): void;
[_close](): void;
}
export type WriteStreamOptions = {
fd?: number;
autoClose?: boolean;
mode?: number;
captureRejections?: boolean;
start?: number;
flags?: string;
};
export declare class WriteStream extends EE {
readable: false;
writable: boolean;
[_errored]: boolean;
[_writing]: boolean;
[_ended]: boolean;
[_queue]: Buffer[];
[_needDrain]: boolean;
[_path]: string;
[_mode]: number;
[_autoClose]: boolean;
[_fd]?: number;
[_defaultFlag]: boolean;
[_flags]: string;
[_finished]: boolean;
[_pos]?: number;
constructor(path: string, opt: WriteStreamOptions);
emit(ev: string, ...args: any[]): boolean;
get fd(): number | undefined;
get path(): string;
[_onerror](er: NodeJS.ErrnoException): void;
[_open](): void;
[_onopen](er?: null | NodeJS.ErrnoException, fd?: number): void;
end(buf: string, enc?: BufferEncoding): this;
end(buf?: Buffer, enc?: undefined): this;
write(buf: string, enc?: BufferEncoding): boolean;
write(buf: Buffer, enc?: undefined): boolean;
[_write](buf: Buffer): void;
[_onwrite](er?: null | NodeJS.ErrnoException, bw?: number): void;
[_flush](): void;
[_close](): void;
}
export declare class WriteStreamSync extends WriteStream {
[_open](): void;
[_close](): void;
[_write](buf: Buffer): void;
}
export {};
//# sourceMappingURL=index.d.ts.map
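
The declarations above spell out the options each stream accepts. A small usage sketch of the typed API, assuming an illustrative file path and sizes:

import { ReadStream, type ReadStreamOptions } from '@isaacs/fs-minipass'

// Illustrative path and sizes, not values from this repository.
const opts: ReadStreamOptions = {
  readSize: 64 * 1024, // read in 64 KiB slices
  size: 1024,          // stop after the first 1 KiB of the file
  autoClose: true,
}

const rs = new ReadStream('/tmp/example.bin', opts)
rs.on('open', (fd: number) => console.log('opened fd', fd))

const parts: Buffer[] = []
rs.on('data', (chunk: Buffer) => parts.push(chunk))
rs.on('end', () => console.log('read', Buffer.concat(parts).length, 'bytes'))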


File diff suppressed because one or more lines are too long

420
node_modules/@isaacs/fs-minipass/dist/esm/index.js generated vendored Normal file

@@ -0,0 +1,420 @@
import EE from 'events';
import fs from 'fs';
import { Minipass } from 'minipass';
const writev = fs.writev;
const _autoClose = Symbol('_autoClose');
const _close = Symbol('_close');
const _ended = Symbol('_ended');
const _fd = Symbol('_fd');
const _finished = Symbol('_finished');
const _flags = Symbol('_flags');
const _flush = Symbol('_flush');
const _handleChunk = Symbol('_handleChunk');
const _makeBuf = Symbol('_makeBuf');
const _mode = Symbol('_mode');
const _needDrain = Symbol('_needDrain');
const _onerror = Symbol('_onerror');
const _onopen = Symbol('_onopen');
const _onread = Symbol('_onread');
const _onwrite = Symbol('_onwrite');
const _open = Symbol('_open');
const _path = Symbol('_path');
const _pos = Symbol('_pos');
const _queue = Symbol('_queue');
const _read = Symbol('_read');
const _readSize = Symbol('_readSize');
const _reading = Symbol('_reading');
const _remain = Symbol('_remain');
const _size = Symbol('_size');
const _write = Symbol('_write');
const _writing = Symbol('_writing');
const _defaultFlag = Symbol('_defaultFlag');
const _errored = Symbol('_errored');
export class ReadStream extends Minipass {
[_errored] = false;
[_fd];
[_path];
[_readSize];
[_reading] = false;
[_size];
[_remain];
[_autoClose];
constructor(path, opt) {
opt = opt || {};
super(opt);
this.readable = true;
this.writable = false;
if (typeof path !== 'string') {
throw new TypeError('path must be a string');
}
this[_errored] = false;
this[_fd] = typeof opt.fd === 'number' ? opt.fd : undefined;
this[_path] = path;
this[_readSize] = opt.readSize || 16 * 1024 * 1024;
this[_reading] = false;
this[_size] = typeof opt.size === 'number' ? opt.size : Infinity;
this[_remain] = this[_size];
this[_autoClose] =
typeof opt.autoClose === 'boolean' ? opt.autoClose : true;
if (typeof this[_fd] === 'number') {
this[_read]();
}
else {
this[_open]();
}
}
get fd() {
return this[_fd];
}
get path() {
return this[_path];
}
//@ts-ignore
write() {
throw new TypeError('this is a readable stream');
}
//@ts-ignore
end() {
throw new TypeError('this is a readable stream');
}
[_open]() {
fs.open(this[_path], 'r', (er, fd) => this[_onopen](er, fd));
}
[_onopen](er, fd) {
if (er) {
this[_onerror](er);
}
else {
this[_fd] = fd;
this.emit('open', fd);
this[_read]();
}
}
[_makeBuf]() {
return Buffer.allocUnsafe(Math.min(this[_readSize], this[_remain]));
}
[_read]() {
if (!this[_reading]) {
this[_reading] = true;
const buf = this[_makeBuf]();
/* c8 ignore start */
if (buf.length === 0) {
return process.nextTick(() => this[_onread](null, 0, buf));
}
/* c8 ignore stop */
fs.read(this[_fd], buf, 0, buf.length, null, (er, br, b) => this[_onread](er, br, b));
}
}
[_onread](er, br, buf) {
this[_reading] = false;
if (er) {
this[_onerror](er);
}
else if (this[_handleChunk](br, buf)) {
this[_read]();
}
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs.close(fd, er => er ? this.emit('error', er) : this.emit('close'));
}
}
[_onerror](er) {
this[_reading] = true;
this[_close]();
this.emit('error', er);
}
[_handleChunk](br, buf) {
let ret = false;
// no effect if infinite
this[_remain] -= br;
if (br > 0) {
ret = super.write(br < buf.length ? buf.subarray(0, br) : buf);
}
if (br === 0 || this[_remain] <= 0) {
ret = false;
this[_close]();
super.end();
}
return ret;
}
emit(ev, ...args) {
switch (ev) {
case 'prefinish':
case 'finish':
return false;
case 'drain':
if (typeof this[_fd] === 'number') {
this[_read]();
}
return false;
case 'error':
if (this[_errored]) {
return false;
}
this[_errored] = true;
return super.emit(ev, ...args);
default:
return super.emit(ev, ...args);
}
}
}
export class ReadStreamSync extends ReadStream {
[_open]() {
let threw = true;
try {
this[_onopen](null, fs.openSync(this[_path], 'r'));
threw = false;
}
finally {
if (threw) {
this[_close]();
}
}
}
[_read]() {
let threw = true;
try {
if (!this[_reading]) {
this[_reading] = true;
do {
const buf = this[_makeBuf]();
/* c8 ignore start */
const br = buf.length === 0
? 0
: fs.readSync(this[_fd], buf, 0, buf.length, null);
/* c8 ignore stop */
if (!this[_handleChunk](br, buf)) {
break;
}
} while (true);
this[_reading] = false;
}
threw = false;
}
finally {
if (threw) {
this[_close]();
}
}
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs.closeSync(fd);
this.emit('close');
}
}
}
export class WriteStream extends EE {
readable = false;
writable = true;
[_errored] = false;
[_writing] = false;
[_ended] = false;
[_queue] = [];
[_needDrain] = false;
[_path];
[_mode];
[_autoClose];
[_fd];
[_defaultFlag];
[_flags];
[_finished] = false;
[_pos];
constructor(path, opt) {
opt = opt || {};
super(opt);
this[_path] = path;
this[_fd] = typeof opt.fd === 'number' ? opt.fd : undefined;
this[_mode] = opt.mode === undefined ? 0o666 : opt.mode;
this[_pos] = typeof opt.start === 'number' ? opt.start : undefined;
this[_autoClose] =
typeof opt.autoClose === 'boolean' ? opt.autoClose : true;
// truncating makes no sense when writing into the middle
const defaultFlag = this[_pos] !== undefined ? 'r+' : 'w';
this[_defaultFlag] = opt.flags === undefined;
this[_flags] = opt.flags === undefined ? defaultFlag : opt.flags;
if (this[_fd] === undefined) {
this[_open]();
}
}
emit(ev, ...args) {
if (ev === 'error') {
if (this[_errored]) {
return false;
}
this[_errored] = true;
}
return super.emit(ev, ...args);
}
get fd() {
return this[_fd];
}
get path() {
return this[_path];
}
[_onerror](er) {
this[_close]();
this[_writing] = true;
this.emit('error', er);
}
[_open]() {
fs.open(this[_path], this[_flags], this[_mode], (er, fd) => this[_onopen](er, fd));
}
[_onopen](er, fd) {
if (this[_defaultFlag] &&
this[_flags] === 'r+' &&
er &&
er.code === 'ENOENT') {
this[_flags] = 'w';
this[_open]();
}
else if (er) {
this[_onerror](er);
}
else {
this[_fd] = fd;
this.emit('open', fd);
if (!this[_writing]) {
this[_flush]();
}
}
}
end(buf, enc) {
if (buf) {
//@ts-ignore
this.write(buf, enc);
}
this[_ended] = true;
// synthetic after-write logic, where drain/finish live
if (!this[_writing] &&
!this[_queue].length &&
typeof this[_fd] === 'number') {
this[_onwrite](null, 0);
}
return this;
}
write(buf, enc) {
if (typeof buf === 'string') {
buf = Buffer.from(buf, enc);
}
if (this[_ended]) {
this.emit('error', new Error('write() after end()'));
return false;
}
if (this[_fd] === undefined || this[_writing] || this[_queue].length) {
this[_queue].push(buf);
this[_needDrain] = true;
return false;
}
this[_writing] = true;
this[_write](buf);
return true;
}
[_write](buf) {
fs.write(this[_fd], buf, 0, buf.length, this[_pos], (er, bw) => this[_onwrite](er, bw));
}
[_onwrite](er, bw) {
if (er) {
this[_onerror](er);
}
else {
if (this[_pos] !== undefined && typeof bw === 'number') {
this[_pos] += bw;
}
if (this[_queue].length) {
this[_flush]();
}
else {
this[_writing] = false;
if (this[_ended] && !this[_finished]) {
this[_finished] = true;
this[_close]();
this.emit('finish');
}
else if (this[_needDrain]) {
this[_needDrain] = false;
this.emit('drain');
}
}
}
}
[_flush]() {
if (this[_queue].length === 0) {
if (this[_ended]) {
this[_onwrite](null, 0);
}
}
else if (this[_queue].length === 1) {
this[_write](this[_queue].pop());
}
else {
const iovec = this[_queue];
this[_queue] = [];
writev(this[_fd], iovec, this[_pos], (er, bw) => this[_onwrite](er, bw));
}
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs.close(fd, er => er ? this.emit('error', er) : this.emit('close'));
}
}
}
export class WriteStreamSync extends WriteStream {
[_open]() {
let fd;
// only wrap in a try{} block if we know we'll retry, to avoid
// the rethrow obscuring the error's source frame in most cases.
if (this[_defaultFlag] && this[_flags] === 'r+') {
try {
fd = fs.openSync(this[_path], this[_flags], this[_mode]);
}
catch (er) {
if (er?.code === 'ENOENT') {
this[_flags] = 'w';
return this[_open]();
}
else {
throw er;
}
}
}
else {
fd = fs.openSync(this[_path], this[_flags], this[_mode]);
}
this[_onopen](null, fd);
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs.closeSync(fd);
this.emit('close');
}
}
[_write](buf) {
// throw the original, but try to close if it fails
let threw = true;
try {
this[_onwrite](null, fs.writeSync(this[_fd], buf, 0, buf.length, this[_pos]));
threw = false;
}
finally {
if (threw) {
try {
this[_close]();
}
catch {
// ok error
}
}
}
}
}
//# sourceMappingURL=index.js.map
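
One behavior worth noting in the code above: passing `start` makes the default open flag 'r+' so existing content is not truncated, and an ENOENT on open falls back to 'w'. A minimal sketch, assuming an illustrative path and offset:

import { WriteStreamSync } from '@isaacs/fs-minipass'

// Overwrite bytes at offset 5 without truncating the rest of the file; if
// the file does not exist yet, the ENOENT fallback above creates it instead.
const ws = new WriteStreamSync('/tmp/patched.txt', { start: 5 })
ws.end('PATCH')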

File diff suppressed because one or more lines are too long

72
node_modules/@isaacs/fs-minipass/package.json generated vendored Normal file

@@ -0,0 +1,72 @@
{
"name": "@isaacs/fs-minipass",
"version": "4.0.1",
"main": "./dist/commonjs/index.js",
"scripts": {
"prepare": "tshy",
"pretest": "npm run prepare",
"test": "tap",
"preversion": "npm test",
"postversion": "npm publish",
"prepublishOnly": "git push origin --follow-tags",
"format": "prettier --write . --loglevel warn",
"typedoc": "typedoc --tsconfig .tshy/esm.json ./src/*.ts"
},
"keywords": [],
"author": "Isaac Z. Schlueter",
"license": "ISC",
"repository": {
"type": "git",
"url": "https://github.com/npm/fs-minipass.git"
},
"description": "fs read and write streams based on minipass",
"dependencies": {
"minipass": "^7.0.4"
},
"devDependencies": {
"@types/node": "^20.11.30",
"mutate-fs": "^2.1.1",
"prettier": "^3.2.5",
"tap": "^18.7.1",
"tshy": "^1.12.0",
"typedoc": "^0.25.12"
},
"files": [
"dist"
],
"engines": {
"node": ">=18.0.0"
},
"tshy": {
"exports": {
"./package.json": "./package.json",
".": "./src/index.ts"
}
},
"exports": {
"./package.json": "./package.json",
".": {
"import": {
"types": "./dist/esm/index.d.ts",
"default": "./dist/esm/index.js"
},
"require": {
"types": "./dist/commonjs/index.d.ts",
"default": "./dist/commonjs/index.js"
}
}
},
"types": "./dist/commonjs/index.d.ts",
"type": "module",
"prettier": {
"semi": false,
"printWidth": 75,
"tabWidth": 2,
"useTabs": false,
"singleQuote": true,
"jsxSingleQuote": false,
"bracketSameLine": true,
"arrowParens": "avoid",
"endOfLine": "lf"
}
}
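
The conditional `exports` map above routes `import` to the ESM build and `require` to the CommonJS build. A sketch of both resolutions from a single ESM module; the consuming file itself is hypothetical:

import { createRequire } from 'module'
import { ReadStream } from '@isaacs/fs-minipass' // "import" condition -> ./dist/esm/index.js

const require = createRequire(import.meta.url)
const cjs = require('@isaacs/fs-minipass') // "require" condition -> ./dist/commonjs/index.js

console.log(typeof ReadStream, typeof cjs.ReadStream) // both log "function"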


@@ -0,0 +1,20 @@
# For all possible configuration options see:
# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates
# Note that limitations on version updates applied in this file don't apply to security updates,
# which are separately managed by dependabot
version: 2
updates:
- package-ecosystem: "npm"
directory: "/"
schedule:
interval: "daily"
open-pull-requests-limit: 20
ignore:
- dependency-name: "*"
update-types: ["version-update:semver-minor", "version-update:semver-patch"]
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "daily"


@@ -0,0 +1,29 @@
# https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-nodejs
name: ci
on:
pull_request:
push:
branches:
- master
workflow_dispatch:
jobs:
ci:
strategy:
fail-fast: false
max-parallel: 9
matrix:
node-version: ['18.x', '20.x', '22.x']
os: [macos-latest, ubuntu-latest, windows-latest]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: ${{ matrix.node-version }}
- run: npm ci
- run: npm audit
- run: npm run lint
# - run: npm run update-crosswalk # To support newer versions of Node.js
- run: npm run build --if-present
- run: npm test


@@ -0,0 +1,74 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL"
on:
push:
branches: [ "master" ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ "master" ]
schedule:
- cron: '24 5 * * 4'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: [ 'javascript' ]
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@v4
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# Details on CodeQL's query packs refer to : https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, or Java).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v3
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
# If the Autobuild fails above, remove it and uncomment the following three lines.
# modify them (or add more) to build your code if your project, please refer to the EXAMPLE below for guidance.
# - run: |
# echo "Run, Build Application using script"
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v3
with:
category: "/language:${{matrix.language}}"


@@ -0,0 +1,105 @@
name: release
on:
push:
branches: [master]
workflow_dispatch:
jobs:
release-check:
name: Check if version is published
runs-on: ubuntu-latest
defaults:
run:
shell: bash
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 22
- name: Check if version is published
id: check
run: |
currentVersion="$( node -e "console.log(require('./package.json').version)" )"
isPublished="$( npm view @mapbox/node-pre-gyp versions --json | jq -c --arg cv "$currentVersion" 'any(. == $cv)' )"
echo "version=$currentVersion" >> "$GITHUB_OUTPUT"
echo "published=$isPublished" >> "$GITHUB_OUTPUT"
echo "currentVersion: $currentVersion"
echo "isPublished: $isPublished"
outputs:
published: ${{ steps.check.outputs.published }}
version: ${{ steps.check.outputs.version }}
publish:
needs: release-check
if: ${{ needs.release-check.outputs.published == 'false' }}
runs-on: ubuntu-latest
permissions:
contents: write
defaults:
run:
shell: bash
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 22
- run: npm ci
- run: npm audit
- run: npm run lint
- run: npm run update-crosswalk # To support newer versions of Node.js
- run: npm run build --if-present
- run: npm test
- name: Prepare release changelog
id: prepare_release
run: |
RELEASE_TYPE="$(node -e "console.log(require('semver').prerelease('${{ needs.release-check.outputs.version }}') ? 'prerelease' : 'regular')")"
if [[ $RELEASE_TYPE == 'regular' ]]; then
echo "prerelease=false" >> "$GITHUB_OUTPUT"
else
echo "prerelease=true" >> "$GITHUB_OUTPUT"
fi
- name: Extract changelog for version
run: |
awk '/^##/ { p = 0 }; p == 1 { print }; $0 == "## ${{ needs.release-check.outputs.version }}" { p = 1 };' CHANGELOG.md > changelog_for_version.md
cat changelog_for_version.md
- name: Publish to Github
uses: ncipollo/release-action@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tag: v${{ needs.release-check.outputs.version }}
name: v${{ needs.release-check.outputs.version }}
bodyFile: changelog_for_version.md
allowUpdates: true
draft: false
prerelease: ${{ steps.prepare_release.outputs.prerelease }}
- name: Publish to NPM (release)
if: ${{ steps.prepare_release.outputs.prerelease == 'false' }}
run: |
npm config set //registry.npmjs.org/:_authToken "${NPM_TOKEN}"
npm publish --access public
env:
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish to NPM (prerelease)
if: ${{ steps.prepare_release.outputs.prerelease == 'true' }}
run: |
npm config set //registry.npmjs.org/:_authToken "${NPM_TOKEN}"
npm publish --tag next --access public
env:
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
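
The "Extract changelog for version" step above uses awk to copy the lines between the matching `## <version>` heading and the next `##` heading. An equivalent sketch in TypeScript; the version string and file path are illustrative assumptions:

import { readFileSync } from 'fs'

// Same selection the awk program performs: any heading line ends the current
// section; the matching "## <version>" heading starts it.
function changelogForVersion(changelog: string, version: string): string {
  const out: string[] = []
  let printing = false
  for (const line of changelog.split('\n')) {
    if (line.startsWith('##')) printing = false
    if (printing) out.push(line)
    if (line === `## ${version}`) printing = true
  }
  return out.join('\n')
}

console.log(changelogForVersion(readFileSync('CHANGELOG.md', 'utf8'), '2.0.0'))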

517
node_modules/@mapbox/node-pre-gyp/CHANGELOG.md generated vendored Normal file

@@ -0,0 +1,517 @@
# node-pre-gyp changelog
## 2.0.0
- Supported Node versions are now stable versions of Node 18+. We will attempt to track the [Node.js release schedule](https://github.com/nodejs/release#release-schedule) and will regularly retire support for versions that have reached EOL.
- Fixed use of `s3ForcePathStyle` for installation [#650](https://github.com/mapbox/node-pre-gyp/pull/650)
- Upgraded to https-proxy-agent 7.0.5, nopt 8.0.0, semver 7.5.3, and tar 7.4.0
- Replaced npmlog with consola
- Removed rimraf and make-dir as dependencies
## 1.0.11
- Fixes dependabot alert [CVE-2021-44906](https://nvd.nist.gov/vuln/detail/CVE-2021-44906)
## 1.0.10
- Upgraded minimist to 1.2.6 to address dependabot alert [CVE-2021-44906](https://nvd.nist.gov/vuln/detail/CVE-2021-44906)
## 1.0.9
- Upgraded node-fetch to 2.6.7 to address [CVE-2022-0235](https://www.cve.org/CVERecord?id=CVE-2022-0235)
- Upgraded detect-libc to 2.0.0 to use non-blocking NodeJS(>=12) Report API
## 1.0.8
- Downgraded npmlog to maintain node v10 and v8 support (https://github.com/mapbox/node-pre-gyp/pull/624)
## 1.0.7
- Upgraded nyc and npmlog to address https://github.com/advisories/GHSA-93q8-gq69-wqmw
## 1.0.6
- Added node v17 to the internal node releases listing
- Upgraded various dependencies declared in package.json to latest major versions (node-fetch from 2.6.1 to 2.6.5, npmlog from 4.1.2 to 5.01, semver from 7.3.4 to 7.3.5, and tar from 6.1.0 to 6.1.11)
- Fixed bug in `staging_host` parameter (https://github.com/mapbox/node-pre-gyp/pull/590)
## 1.0.5
- Fix circular reference warning with node >= v14
## 1.0.4
- Added node v16 to the internal node releases listing
## 1.0.3
- Improved support configuring s3 uploads (solves https://github.com/mapbox/node-pre-gyp/issues/571)
- New options added in https://github.com/mapbox/node-pre-gyp/pull/576: 'bucket', 'region', and `s3ForcePathStyle`
## 1.0.2
- Fixed regression in proxy support (https://github.com/mapbox/node-pre-gyp/issues/572)
## 1.0.1
- Switched from mkdirp@1.0.4 to make-dir@3.1.0 to avoid this bug: https://github.com/isaacs/node-mkdirp/issues/31
## 1.0.0
- Module is now name-spaced at `@mapbox/node-pre-gyp` and the original `node-pre-gyp` is deprecated.
- New: support for staging and production s3 targets (see README.md)
- BREAKING: no longer supporting `node_pre_gyp_accessKeyId` & `node_pre_gyp_secretAccessKey`, use `AWS_ACCESS_KEY_ID` & `AWS_SECRET_ACCESS_KEY` instead to authenticate against s3 for `info`, `publish`, and `unpublish` commands.
- Dropped node v6 support, added node v14 support
- Switched tests to use mapbox-owned bucket for testing
- Added coverage tracking and linting with eslint
- Added back support for symlinks inside the tarball
- Upgraded all test apps to N-API/node-addon-api
- New: support for staging and production s3 targets (see README.md)
- Added `node_pre_gyp_s3_host` env var which has priority over the `--s3_host` option or default.
- Replaced needle with node-fetch
- Added proxy support for node-fetch
- Upgraded to mkdirp@1.x
## 0.17.0
- Got travis + appveyor green again
- Added support for more node versions
## 0.16.0
- Added Node 15 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/520)
## 0.15.0
- Bump dependency on `mkdirp` from `^0.5.1` to `^0.5.3` (https://github.com/mapbox/node-pre-gyp/pull/492)
- Bump dependency on `needle` from `^2.2.1` to `^2.5.0` (https://github.com/mapbox/node-pre-gyp/pull/502)
- Added Node 14 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/501)
## 0.14.0
- Defer modules requires in napi.js (https://github.com/mapbox/node-pre-gyp/pull/434)
- Bump dependency on `tar` from `^4` to `^4.4.2` (https://github.com/mapbox/node-pre-gyp/pull/454)
- Support extracting compiled binary from local offline mirror (https://github.com/mapbox/node-pre-gyp/pull/459)
- Added Node 13 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/483)
## 0.13.0
- Added Node 12 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/449)
## 0.12.0
- Fixed double-build problem with node v10 (https://github.com/mapbox/node-pre-gyp/pull/428)
- Added node 11 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/422)
## 0.11.0
- Fixed double-install problem with node v10
- Significant N-API improvements (https://github.com/mapbox/node-pre-gyp/pull/405)
## 0.10.3
- Now will use `request` over `needle` if request is installed. By default `needle` is used for `https`. This should unbreak proxy support that regressed in v0.9.0
## 0.10.2
- Fixed rc/deep-extent security vulnerability
- Fixed broken reinstall script due to incorrectly named get_best_napi_version
## 0.10.1
- Fix needle error event (@medns)
## 0.10.0
- Allow for a single-level module path when packing @allenluce (https://github.com/mapbox/node-pre-gyp/pull/371)
- Log warnings instead of errors when falling back @xzyfer (https://github.com/mapbox/node-pre-gyp/pull/366)
- Add Node.js v10 support to tests (https://github.com/mapbox/node-pre-gyp/pull/372)
- Remove retire.js from CI (https://github.com/mapbox/node-pre-gyp/pull/372)
- Remove support for Node.js v4 due to [EOL on April 30th, 2018](https://github.com/nodejs/Release/blob/7dd52354049cae99eed0e9fe01345b0722a86fde/schedule.json#L14)
- Update appveyor tests to install default NPM version instead of NPM v2.x for all Windows builds (https://github.com/mapbox/node-pre-gyp/pull/375)
## 0.9.1
- Fixed regression (in v0.9.0) with support for http redirects @allenluce (https://github.com/mapbox/node-pre-gyp/pull/361)
## 0.9.0
- Switched from using `request` to `needle` to reduce size of module deps (https://github.com/mapbox/node-pre-gyp/pull/350)
## 0.8.0
- N-API support (@inspiredware)
## 0.7.1
- Upgraded to tar v4.x
## 0.7.0
- Updated request and hawk (#347)
- Dropped node v0.10.x support
## 0.6.40
- Improved error reporting if an install fails
## 0.6.39
- Support for node v9
- Support for versioning on `{libc}` to allow binaries to work on non-glibc linux systems like alpine linux
## 0.6.38
- Maintaining compatibility (for v0.6.x series) with node v0.10.x
## 0.6.37
- Solved one part of #276: now deduce the node ABI from the major version for node >= 2 even when not stored in the abi_crosswalk.json
- Fixed docs to avoid mentioning the deprecated and dangerous `prepublish` in package.json (#291)
- Add new node versions to crosswalk
- Ported tests to use tape instead of mocha
- Got appveyor tests passing by downgrading npm and node-gyp
## 0.6.36
- Removed the running of `testbinary` during install. Because this was regressed for so long, it is too dangerous to re-enable by default. Developers needing validation can call `node-pre-gyp testbinary` directly.
- Fixed regression in v0.6.35 for electron installs (now skipping binary validation which is not yet supported for electron)
## 0.6.35
- No longer recommending `npm ls` in `prepublish` (#291)
- Fixed testbinary command (#283) @szdavid92
## 0.6.34
- Added new node versions to crosswalk, including v8
- Upgraded deps to latest versions, started using `^` instead of `~` for all deps.
## 0.6.33
- Improved support for yarn
## 0.6.32
- Honor npm configuration for CA bundles (@heikkipora)
- Add node-pre-gyp and npm versions to user agent (@addaleax)
- Updated various deps
- Add known node version for v7.x
## 0.6.31
- Updated various deps
## 0.6.30
- Update to npmlog@4.x and semver@5.3.x
- Add known node version for v6.5.0
## 0.6.29
- Add known node versions for v0.10.45, v0.12.14, v4.4.4, v5.11.1, and v6.1.0
## 0.6.28
- Now more verbose when remote binaries are not available. This is needed since npm is increasingly quiet by default
and users need to know why builds are falling back to source compiles that might then error out.
## 0.6.27
- Add known node version for node v6
- Stopped bundling dependencies
- Documented method for module authors to avoid bundling node-pre-gyp
- See https://github.com/mapbox/node-pre-gyp/tree/master#configuring for details
## 0.6.26
- Skip validation for nw runtime (https://github.com/mapbox/node-pre-gyp/pull/181) via @fleg
## 0.6.25
- Improved support for auto-detection of electron runtime in `node-pre-gyp.find()`
- Pull request from @enlight - https://github.com/mapbox/node-pre-gyp/pull/187
- Add known node version for 4.4.1 and 5.9.1
## 0.6.24
- Add known node version for 5.8.0, 5.9.0, and 4.4.0.
## 0.6.23
- Add known node version for 0.10.43, 0.12.11, 4.3.2, and 5.7.1.
## 0.6.22
- Add known node version for 4.3.1, and 5.7.0.
## 0.6.21
- Add known node version for 0.10.42, 0.12.10, 4.3.0, and 5.6.0.
## 0.6.20
- Add known node version for 4.2.5, 4.2.6, 5.4.0, 5.4.1,and 5.5.0.
## 0.6.19
- Add known node version for 4.2.4
## 0.6.18
- Add new known node versions for 0.10.x, 0.12.x, 4.x, and 5.x
## 0.6.17
- Re-tagged to fix packaging problem of `Error: Cannot find module 'isarray'`
## 0.6.16
- Added known version in crosswalk for 5.1.0.
## 0.6.15
- Upgraded tar-pack (https://github.com/mapbox/node-pre-gyp/issues/182)
- Support custom binary hosting mirror (https://github.com/mapbox/node-pre-gyp/pull/170)
- Added known version in crosswalk for 4.2.2.
## 0.6.14
- Added node 5.x version
## 0.6.13
- Added more known node 4.x versions
## 0.6.12
- Added support for [Electron](http://electron.atom.io/). Just pass the `--runtime=electron` flag when building/installing. Thanks @zcbenz
## 0.6.11
- Added known node and io.js versions including more 3.x and 4.x versions
## 0.6.10
- Added known node and io.js versions including 3.x and 4.x versions
- Upgraded `tar` dep
## 0.6.9
- Upgraded `rc` dep
- Updated known io.js version: v2.4.0
## 0.6.8
- Upgraded `semver` and `rimraf` deps
- Updated known node and io.js versions
## 0.6.7
- Fixed `node_abi` versions for io.js 1.1.x -> 1.8.x (should be 43, but was stored as 42) (refs https://github.com/iojs/build/issues/94)
## 0.6.6
- Updated with known io.js 2.0.0 version
## 0.6.5
- Now respecting `npm_config_node_gyp` (https://github.com/npm/npm/pull/4887)
- Updated to semver@4.3.2
- Updated known node v0.12.x versions and io.js 1.x versions.
## 0.6.4
- Improved support for `io.js` (@fengmk2)
- Test coverage improvements (@mikemorris)
- Fixed support for `--dist-url` that regressed in 0.6.3
## 0.6.3
- Added support for passing raw options to node-gyp using `--` separator. Flags passed after
the `--` to `node-pre-gyp configure` will be passed directly to gyp, while flags passed
after the `--` to `node-pre-gyp build` will be passed directly to make/visual studio.
- Added `node-pre-gyp configure` command to be able to call `node-gyp configure` directly
- Fix issue with require validation not working on windows 7 (@edgarsilva)
## 0.6.2
- Support for io.js >= v1.0.2
- Deferred require of `request` and `tar` to help speed up command line usage of `node-pre-gyp`.
## 0.6.1
- Fixed bundled `tar` version
## 0.6.0
- BREAKING: node odd releases like v0.11.x now use `major.minor.patch` for `{node_abi}` instead of `NODE_MODULE_VERSION` (#124)
- Added support for `toolset` option in versioning. By default is an empty string but `--toolset` can be passed to publish or install to select alternative binaries that target a custom toolset like C++11. For example to target Visual Studio 2014 modules like node-sqlite3 use `--toolset=v140`.
- Added support for `--no-rollback` option to request that a failed binary test does not remove the binary module but leaves it in place.
- Added support for `--update-binary` option to request an existing binary be re-installed and the check for a valid local module be skipped.
- Added support for passing build options from `npm` through `node-pre-gyp` to `node-gyp`: `--nodedir`, `--disturl`, `--python`, and `--msvs_version`
## 0.5.31
- Added support for deducing node_abi for node.js runtime from previous release if the series is even
- Added support for --target=0.10.33
## 0.5.30
- Repackaged with latest bundled deps
## 0.5.29
- Added support for semver `build`.
- Fixed support for downloading from urls that include `+`.
## 0.5.28
- Now reporting unix style paths only in reveal command
## 0.5.27
- Fixed support for auto-detecting s3 bucket name when it contains `.` - @taavo
- Fixed support for installing when path contains a `'` - @halfdan
- Ported tests to mocha
## 0.5.26
- Fix node-webkit support when `--target` option is not provided
## 0.5.25
- Fix bundling of deps
## 0.5.24
- Updated ABI crosswalk to include node v0.10.30 and v0.10.31
## 0.5.23
- Added `reveal` command. Pass no options to get all versioning data as json. Pass a second arg to grab a single versioned property value
- Added support for `--silent` (shortcut for `--loglevel=silent`)
## 0.5.22
- Fixed node-webkit versioning name (NOTE: node-webkit support still experimental)
## 0.5.21
- New package to fix `shasum check failed` error with v0.5.20
## 0.5.20
- Now versioning node-webkit binaries based on major.minor.patch - assuming no compatible ABI across versions (#90)
## 0.5.19
- Updated to know about more node-webkit releases
## 0.5.18
- Updated to know about more node-webkit releases
## 0.5.17
- Updated to know about node v0.10.29 release
## 0.5.16
- Now supporting all aws-sdk configuration parameters (http://docs.aws.amazon.com/AWSJavaScriptSDK/guide/node-configuring.html) (#86)
## 0.5.15
- Fixed installation of windows packages sub directories on unix systems (#84)
## 0.5.14
- Finished support for cross building using `--target_platform` option (#82)
- Now skipping binary validation on install if target arch/platform do not match the host.
- Removed multi-arch validating for macOS since it required a FAT node.js binary
## 0.5.13
- Fix problem in 0.5.12 whereby the wrong versions of mkdirp and semver were bundled.
## 0.5.12
- Improved support for node-webkit (@Mithgol)
## 0.5.11
- Updated target versions listing
## 0.5.10
- Fixed handling of `-debug` flag passed directly to node-pre-gyp (#72)
- Added optional second arg to `node_pre_gyp.find` to customize the default versioning options used to locate the runtime binary
- Failed install due to `testbinary` check failure no longer leaves behind binary (#70)
## 0.5.9
- Fixed regression in `testbinary` command causing installs to fail on windows with 0.5.7 (#60)
## 0.5.8
- Started bundling deps
## 0.5.7
- Fixed the `testbinary` check, which is used to determine whether to re-download or source compile, to work even in complex dependency situations (#63)
- Exposed the internal `testbinary` command in node-pre-gyp command line tool
- Fixed minor bug so that `fallback_to_build` option is always respected
## 0.5.6
- Added support for versioning on the `name` value in `package.json` (#57).
- Moved to using streams for reading tarball when publishing (#52)
## 0.5.5
- Improved binary validation that also now works with node-webkit (@Mithgol)
- Upgraded test apps to work with node v0.11.x
- Improved test coverage
## 0.5.4
- No longer depends on external install of node-gyp for compiling builds.
## 0.5.3
- Reverted fix for debian/nodejs since it broke windows (#45)
## 0.5.2
- Support for debian systems where the node binary is named `nodejs` (#45)
- Added `bin/node-pre-gyp.cmd` to be able to run command on windows locally (npm creates an .npm automatically when globally installed)
- Updated abi-crosswalk with node v0.10.26 entry.
## 0.5.1
- Various minor bug fixes, several improving windows support for publishing.
## 0.5.0
- Changed property names in `binary` object: now required are `module_name`, `module_path`, and `host`.
- Now `module_path` supports versioning, which allows developers to opt-in to using a versioned install path (#18).
- Added `remote_path` which also supports versioning.
- Changed `remote_uri` to `host`.
## 0.4.2
- Added support for `--target` flag to request cross-compile against a specific node/node-webkit version.
- Added preliminary support for node-webkit
- Fixed support for `--target_arch` option being respected in all cases.
## 0.4.1
- Fixed exception when only stderr is available in binary test (@bendi / #31)
## 0.4.0
- Enforce only `https:` based remote publishing access.
- Added `node-pre-gyp info` command to display listing of published binaries
- Added support for changing the directory node-pre-gyp should build in with the `-C/--directory` option.
- Added support for S3 prefixes.
## 0.3.1
- Added `unpublish` command.
- Fixed module path construction in tests.
- Added ability to disable falling back to build behavior via `npm install --fallback-to-build=false` which overrides setting in a dependencies package.json `install` target.
## 0.3.0
- Support for packaging all files in `module_path` directory - see `app4` for example
- Added `testpackage` command.
- Changed `clean` command to only delete `.node` not entire `build` directory since node-gyp will handle that.
- `.node` modules must be in a folder of their own since tar-pack will remove everything when it unpacks.

1
node_modules/@mapbox/node-pre-gyp/CODEOWNERS generated vendored Normal file

@@ -0,0 +1 @@
* @mapbox/maps-api

27
node_modules/@mapbox/node-pre-gyp/LICENSE generated vendored Normal file

@@ -0,0 +1,27 @@
Copyright (c), Mapbox
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
* Neither the name of node-pre-gyp nor the names of its contributors
may be used to endorse or promote products derived from this software
without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

Some files were not shown because too many files have changed in this diff.