Compare commits


62 Commits

Author SHA1 Message Date
Michael B. Gale
4e828ff8d4 Merge pull request #2989 from github/update-v3.29.4-37264dc0b
Merge main into releases/v3
2025-07-23 14:15:56 +01:00
github-actions[bot]
b3114b8965 Update changelog for v3.29.4 2025-07-23 13:00:50 +00:00
Koen Vlaswinkel
37264dc0b3 Merge pull request #2988 from github/koesie10/disable-combine-single-file
Disable combining runs within a single file
2025-07-23 14:17:59 +02:00
Koen Vlaswinkel
5a29823d01 Merge remote-tracking branch 'origin/main' into koesie10/disable-combine-single-file 2025-07-23 14:03:16 +02:00
Michael B. Gale
5a2327a6fd Merge pull request #2987 from github/mbg/combine-sarif-error
Treat processing error for multiple runs with the same category as configuration error
2025-07-23 13:02:32 +01:00
Koen Vlaswinkel
287d421cf3 Disable combining runs within a single file 2025-07-23 13:51:13 +02:00
Michael B. Gale
43afe6ec0b Treat processing error for multiple runs with the same category as configuration error
As a result, the error is reported as a user error rather than as an Action failure.
2025-07-23 12:48:44 +01:00
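
The mechanism this commit relies on is the Action's ConfigurationError type, which marks an error as caused by user configuration rather than by the Action itself (the type appears in the lib/config-utils.js diff further down this page). A minimal sketch of the idea, with a hypothetical processSarif helper and message check, since the commit body does not show the code:

import { ConfigurationError } from "./util";

// Hypothetical stand-in for the real SARIF processing step.
declare function combineSarifRuns(sarifFiles: string[]): void;

function processSarif(sarifFiles: string[]): void {
  try {
    combineSarifRuns(sarifFiles);
  } catch (e) {
    // Multiple runs sharing a category point at the user's configuration,
    // so rethrow as a ConfigurationError instead of failing the Action.
    if (e instanceof Error && e.message.includes("same category")) {
      throw new ConfigurationError(e.message);
    }
    throw e; // everything else remains an unexpected failure
  }
}
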
Michael B. Gale
8f2e63676d Merge pull request #2981 from github/dependabot/npm_and_yarn/npm-fe13dfda46
Bump the npm group with 5 updates
2025-07-23 09:29:24 +01:00
Michael B. Gale
76bf77db0b Merge pull request #2980 from github/dependabot/github_actions/actions-504b6cee34
Bump ruby/setup-ruby from 1.245.0 to 1.247.0 in the actions group
2025-07-22 18:24:17 +01:00
Michael B. Gale
9e7d13dd99 Merge pull request #2983 from github/koesie10/update-changelog-link
Update combining SARIF runs changelog post URL
2025-07-22 18:09:52 +01:00
Michael B. Gale
2b952be91d Update workflow template 2025-07-22 13:31:35 +01:00
Koen Vlaswinkel
48ce740f61 Update combining SARIF runs changelog post URL 2025-07-22 11:51:12 +02:00
github-actions[bot]
4749491b98 Update checked-in dependencies 2025-07-21 19:50:38 +00:00
dependabot[bot]
b7a5452764 Bump the npm group with 5 updates
Bumps the npm group with 5 updates:

| Package | From | To |
| --- | --- | --- |
| [@types/node-forge](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node-forge) | `1.3.12` | `1.3.13` |
| [@eslint/js](https://github.com/eslint/eslint/tree/HEAD/packages/js) | `9.30.1` | `9.31.0` |
| [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin) | `8.35.1` | `8.38.0` |
| [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser) | `8.35.1` | `8.38.0` |
| [nock](https://github.com/nock/nock) | `14.0.5` | `14.0.6` |


Updates `@types/node-forge` from 1.3.12 to 1.3.13
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node-forge)

Updates `@eslint/js` from 9.30.1 to 9.31.0
- [Release notes](https://github.com/eslint/eslint/releases)
- [Changelog](https://github.com/eslint/eslint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslint/commits/v9.31.0/packages/js)

Updates `@typescript-eslint/eslint-plugin` from 8.35.1 to 8.38.0
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.38.0/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.35.1 to 8.38.0
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.38.0/packages/parser)

Updates `nock` from 14.0.5 to 14.0.6
- [Release notes](https://github.com/nock/nock/releases)
- [Changelog](https://github.com/nock/nock/blob/main/CHANGELOG.md)
- [Commits](https://github.com/nock/nock/compare/v14.0.5...v14.0.6)

---
updated-dependencies:
- dependency-name: "@types/node-forge"
  dependency-version: 1.3.13
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@eslint/js"
  dependency-version: 9.31.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-version: 8.38.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-version: 8.38.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: nock
  dependency-version: 14.0.6
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-21 19:49:59 +00:00
dependabot[bot]
20477a3fe1 Bump ruby/setup-ruby from 1.245.0 to 1.247.0 in the actions group
Bumps the actions group with 1 update: [ruby/setup-ruby](https://github.com/ruby/setup-ruby).


Updates `ruby/setup-ruby` from 1.245.0 to 1.247.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](a4effe49ee...4727905401)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-version: 1.247.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-21 18:34:27 +00:00
Chuan-kai Lin
eefe1b5db9 Merge pull request #2975 from github/cklin/overlay-telemetry
Overlay: report telemetry
2025-07-21 06:23:15 -07:00
Koen Vlaswinkel
b6332872af Merge pull request #2979 from github/koesie10/v3.28.20-changelog
Add changelog entry for v3.28.20 backport
2025-07-21 14:56:14 +02:00
Koen Vlaswinkel
8e442bc480 Merge pull request #2978 from github/mergeback/v3.29.3-to-main-d6bbdef4
Mergeback v3.29.3 refs/heads/releases/v3 into main
2025-07-21 13:49:06 +02:00
Koen Vlaswinkel
a7cb1b8b39 Add changelog entry for v3.28.20 backport 2025-07-21 13:38:40 +02:00
github-actions[bot]
b195e1bfc6 Update checked-in dependencies 2025-07-21 11:35:49 +00:00
github-actions[bot]
df82387698 Update changelog and version after v3.29.3 2025-07-21 11:33:16 +00:00
Koen Vlaswinkel
d6bbdef45e Merge pull request #2977 from github/update-v3.29.3-7710ed11e
Merge main into releases/v3
2025-07-21 13:32:49 +02:00
github-actions[bot]
210cc9bfa2 Update changelog for v3.29.3 2025-07-21 09:29:13 +00:00
Chuan-kai Lin
39b0524b50 build: refresh js files 2025-07-18 07:45:45 -07:00
Chuan-kai Lin
c3bbcab41b Add downloadOverlayBaseDatabaseFromCache tests 2025-07-18 07:44:43 -07:00
Chuan-kai Lin
e37b293334 Overlay: report overlay-base database stats 2025-07-18 07:44:22 -07:00
Chuan-kai Lin
19075c4376 Overlay: report overlay analysis mode 2025-07-18 07:18:38 -07:00
Chuan-kai Lin
7710ed11e3 Merge pull request #2970 from github/cklin/diff-informed-feature-enable
Enable Feature.DiffInformedQueries
2025-07-17 08:21:08 -07:00
Chuan-kai Lin
6a49a8cbce build: refresh js files 2025-07-17 06:17:30 -07:00
Chuan-kai Lin
3aef4108d1 Add diff-informed-analysis-utils.test.ts 2025-07-17 06:14:37 -07:00
Chuan-kai Lin
614b64c6ec Diff-informed analysis: disable for GHES below 3.19 2025-07-17 06:10:14 -07:00
Chuan-kai Lin
aefb854fe5 Feature.DiffInformedQueries: default to true 2025-07-17 06:03:52 -07:00
Chuan-kai Lin
03a2a17e75 Merge pull request #2967 from github/cklin/overlay-feature-flags
Overlay: additional feature flags
2025-07-17 05:54:21 -07:00
Koen Vlaswinkel
07455ed3c3 Merge pull request #2972 from github/koesie10/ghes-satisfies
Ignore pre-release parts when comparing GHES versions
2025-07-17 10:35:33 +02:00
Chuan-kai Lin
3fb562ddcc build: refresh js files 2025-07-16 07:10:40 -07:00
Chuan-kai Lin
709cf22a66 Limit Code Scanning API to 25 features per request 2025-07-16 07:07:44 -07:00
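
Only the title of this commit survives here, so the sketch below is a guess at its shape: split the requested feature-flag names into batches of at most 25 and issue one API request per batch. All names in the sketch are hypothetical.

const MAX_FEATURES_PER_REQUEST = 25;

function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

async function fetchAllFeatures(
  names: string[],
  fetchBatch: (batch: string[]) => Promise<Record<string, boolean>>,
): Promise<Record<string, boolean>> {
  let features: Record<string, boolean> = {};
  // One request per batch keeps each call under the API's limit.
  for (const batch of chunk(names, MAX_FEATURES_PER_REQUEST)) {
    features = { ...features, ...(await fetchBatch(batch)) };
  }
  return features;
}
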
Chuan-kai Lin
3eaefb4deb Replicate "too many feature flags" error in test 2025-07-16 07:06:52 -07:00
Koen Vlaswinkel
e30db30685 Ignore pre-release parts when comparing GHES versions 2025-07-16 11:51:53 +02:00
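
Together with the "Fix parsing of GHES pre-release versions" commit further down, the behavior being fixed is a quirk of semver ordering: a pre-release sorts before its release, so a GHES pre-release of 3.18 would fail a ">= 3.18" check. A sketch of the idea using the semver package, not the Action's exact code (the "3.18.0.pre1" version format is assumed):

import * as semver from "semver";

// Plain semver ordering puts 3.18.0-pre1 before 3.18.0, so the
// pre-release fails a minimum-version check:
semver.satisfies("3.18.0-pre1", ">=3.18.0"); // false

// Ignoring the pre-release parts, either when matching or by coercing
// the version down to major.minor.patch, restores the intended result:
semver.satisfies("3.18.0-pre1", ">=3.18.0", { includePrerelease: true }); // true
semver.coerce("3.18.0.pre1")?.version; // "3.18.0"
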
Arthur Baars
0d17ea4843 Merge pull request #2963 from github/dependabot/npm_and_yarn/npm-d16eacb461
Bump the npm group across 1 directory with 7 updates
2025-07-15 14:45:25 +02:00
Arthur Baars
38fdaed818 npm run build 2025-07-15 07:33:26 +00:00
github-actions[bot]
37e3c3113a Update checked-in dependencies 2025-07-15 07:33:26 +00:00
Arthur Baars
15605b194f Make eslint happy 2025-07-15 07:31:22 +00:00
Arthur Baars
0b8d278f47 Run: npx update-browserslist-db@latest 2025-07-15 07:30:36 +00:00
Arthur Baars
ca53360d04 Fix tests 2025-07-15 07:25:49 +00:00
Arthur Baars
bbf184bd4c Update ava 2025-07-15 07:25:49 +00:00
dependabot[bot]
0c2ac60444 Bump the npm group across 1 directory with 7 updates
Bumps the npm group with 6 updates in the / directory:

| Package | From | To |
| --- | --- | --- |
| [@types/node-forge](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node-forge) | `1.3.11` | `1.3.12` |
| [@ava/typescript](https://github.com/avajs/typescript) | `4.1.0` | `6.0.0` |
| [@eslint/compat](https://github.com/eslint/rewrite/tree/HEAD/packages/compat) | `1.1.1` | `1.3.1` |
| [@eslint/js](https://github.com/eslint/eslint/tree/HEAD/packages/js) | `9.28.0` | `9.30.1` |
| [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin) | `8.33.1` | `8.35.1` |
| [sinon](https://github.com/sinonjs/sinon) | `20.0.0` | `21.0.0` |



Updates `@types/node-forge` from 1.3.11 to 1.3.12
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node-forge)

Updates `@ava/typescript` from 4.1.0 to 6.0.0
- [Release notes](https://github.com/avajs/typescript/releases)
- [Commits](https://github.com/avajs/typescript/compare/v4.1.0...v6.0.0)

Updates `@eslint/compat` from 1.1.1 to 1.3.1
- [Release notes](https://github.com/eslint/rewrite/releases)
- [Changelog](https://github.com/eslint/rewrite/blob/main/packages/compat/CHANGELOG.md)
- [Commits](https://github.com/eslint/rewrite/commits/compat-v1.3.1/packages/compat)

Updates `@eslint/js` from 9.28.0 to 9.30.1
- [Release notes](https://github.com/eslint/eslint/releases)
- [Changelog](https://github.com/eslint/eslint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslint/commits/v9.30.1/packages/js)

Updates `@typescript-eslint/eslint-plugin` from 8.33.1 to 8.35.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.35.1/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.33.1 to 8.35.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.35.1/packages/parser)

Updates `sinon` from 20.0.0 to 21.0.0
- [Release notes](https://github.com/sinonjs/sinon/releases)
- [Changelog](https://github.com/sinonjs/sinon/blob/main/docs/changelog.md)
- [Commits](https://github.com/sinonjs/sinon/commits)

---
updated-dependencies:
- dependency-name: "@types/node-forge"
  dependency-version: 1.3.12
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@ava/typescript"
  dependency-version: 6.0.0
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: npm
- dependency-name: "@eslint/compat"
  dependency-version: 1.3.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@eslint/js"
  dependency-version: 9.30.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-version: 8.35.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-version: 8.35.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: sinon
  dependency-version: 21.0.0
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-14 20:52:48 +00:00
Koen Vlaswinkel
6f936b5c2d Merge pull request #2969 from github/koesie10/fix-ghes-version-parsing
Fix parsing of GHES pre-release versions
2025-07-14 13:42:48 +02:00
Koen Vlaswinkel
c6a6c1490f Move comment to JSDoc 2025-07-14 13:18:38 +02:00
Michael B. Gale
4e20239e7b Merge pull request #2951 from github/update-supported-enterprise-server-versions
Update supported GitHub Enterprise Server versions
2025-07-14 10:39:53 +01:00
Koen Vlaswinkel
59d67fc4bf Fix parsing of GHES pre-release versions 2025-07-14 11:25:20 +02:00
Chuan-kai Lin
b37e7e2c5d Move initializeFeatures() to testing-utils
This change eliminates the need for setup-codeql.test to import from
feature-flags.test. That import caused setup-codeql.test to also run all of
the tests defined in feature-flags.test.
2025-07-11 09:54:40 -07:00
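
The pitfall this commit removes is a side effect of how ava works: test() registers tests at module load time, so importing anything from a *.test file pulls in all of that file's tests. Roughly, with names taken from the commit message and a purely illustrative layout:

// feature-flags.test.ts, before the change:
import test from "ava";

export function initializeFeatures(defaultValue: boolean) {
  /* ...build a mock feature-flag map... */
}

test("some feature-flags test", (t) => {
  // Registered on import, so it also runs when setup-codeql.test imports
  // initializeFeatures from this file.
  t.pass();
});

Moving initializeFeatures() into testing-utils, a module with no test() calls of its own, removes the accidental re-registration.
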
Chuan-kai Lin
90d7727554 Overlay: check code-scanning features 2025-07-10 14:16:19 -07:00
Chuan-kai Lin
fb771764cb Extract generateCodeScanningConfig() 2025-07-10 14:14:46 -07:00
Chuan-kai Lin
d799ff5e6a Overlay: check per-language features 2025-07-10 14:14:14 -07:00
Chuan-kai Lin
9f70a5fc86 Overlay: define language-specific features 2025-07-10 11:09:28 -07:00
Chuan-kai Lin
55cb6b8b94 Extract isOverlayAnalysisFeatureEnabled() 2025-07-10 10:48:43 -07:00
Chuan-kai Lin
4bdb7fe04f Overlay database mode tests: list features
Before we introduce additional features for controlling overlay analysis
enablement, change the unit tests to specify features directly instead
of through an isFeatureEnabled boolean field.
2025-07-10 10:46:32 -07:00
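
The before/after of this change is visible in the lib/config-utils.test.js diff near the end of this page; condensed:

import { createFeatures } from "./testing-utils";
import { Feature } from "./feature-flags";

declare const setup: { isFeatureEnabled: boolean; features: Feature[] };

// Before: one boolean toggling one hard-coded flag.
const before = createFeatures(
  setup.isFeatureEnabled ? [Feature.OverlayAnalysis] : [],
);

// After: each test lists exactly the flags it needs, which scales to the
// per-language overlay features introduced elsewhere in this range.
const after = createFeatures(setup.features);
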
Chuan-kai Lin
64fce5856f Use exclude-from-incremental also for overlay analysis 2025-07-09 14:32:05 -07:00
Chuan-kai Lin
fe7205c739 Move getOverlayDatabaseMode() call into initConfig()
In an upcoming change, getOverlayDatabaseMode() will depend on the
contents of Config. As a result, getOverlayDatabaseMode() needs to be
called after the rest of Config has already been populated.

This commit performs that refactoring, moving the getOverlayDatabaseMode()
call into initConfig() after the rest of Config has been populated.
2025-07-09 14:32:05 -07:00
Chuan-kai Lin
4cd7a721f7 Remove loadConfig()
The loadConfig() function is mostly the same as getDefaultConfig(),
except that it calls loadUserConfig() and stores the results in
originalUserInput.

This refactoring commit replaces the loadConfig() call with
getDefaultConfig() and loadUserConfig(), which allows deleting a large
amount of duplicated code.
2025-07-09 14:32:05 -07:00
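
Condensed, the replacement amounts to composing the two smaller functions (parameter lists abbreviated; the real code is in the lib/config-utils.js diff below):

declare function getDefaultConfig(inputs: any): Promise<any>;
declare function loadUserConfig(
  configFile: string, workspacePath: string, apiDetails: any, tempDir: string,
): Promise<any>;

async function initConfigSketch(inputs: any): Promise<any> {
  // Without a config file the user config is simply empty; both paths now
  // share getDefaultConfig() instead of duplicating its body in loadConfig().
  const userConfig = inputs.configFile
    ? await loadUserConfig(inputs.configFile, inputs.workspacePath, inputs.apiDetails, inputs.tempDir)
    : {};
  const config = await getDefaultConfig(inputs);
  config.originalUserInput = userConfig;
  return config;
}
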
Chuan-kai Lin
f4358b38d1 Extract loadUserConfig() 2025-07-09 14:32:05 -07:00
github-actions[bot]
83de9b082b Update supported GitHub Enterprise Server versions 2025-06-25 00:17:41 +00:00
2326 changed files with 82817 additions and 22711 deletions


@@ -46,7 +46,7 @@ jobs:
use-all-platform-bundle: 'false'
setup-kotlin: 'true'
- name: Set up Ruby
uses: ruby/setup-ruby@a4effe49ee8ee5b8b5091268c473a4628afb5651 # v1.245.0
uses: ruby/setup-ruby@472790540115ce5bd69d399a020189a8c87d641f # v1.247.0
with:
ruby-version: 2.6
- name: Install Code Scanning integration


@@ -2,7 +2,11 @@
See the [releases page](https://github.com/github/codeql-action/releases) for the relevant changes to the CodeQL CLI and language packs.
## [UNRELEASED]
## 3.29.4 - 23 Jul 2025
No user facing changes.
## 3.29.3 - 21 Jul 2025
No user facing changes.
@@ -20,6 +24,10 @@ No user facing changes.
- Update default CodeQL bundle version to 2.22.0. [#2925](https://github.com/github/codeql-action/pull/2925)
- Bump minimum CodeQL bundle version to 2.16.6. [#2912](https://github.com/github/codeql-action/pull/2912)
## 3.28.20 - 21 July 2025
- Remove support for combining SARIF files from a single upload for GHES 3.18, see [the changelog post](https://github.blog/changelog/2024-05-06-code-scanning-will-stop-combining-runs-from-a-single-upload/). [#2959](https://github.com/github/codeql-action/pull/2959)
## 3.28.19 - 03 Jun 2025
- The CodeQL Action no longer includes its own copy of the extractor for the `actions` language, which is currently in public preview.


@@ -76,6 +76,7 @@ const util = __importStar(require("./util"));
const requiredInputStub = sinon.stub(actionsUtil, "getRequiredInput");
requiredInputStub.withArgs("token").returns("fake-token");
requiredInputStub.withArgs("upload-database").returns("false");
requiredInputStub.withArgs("output").returns("out");
const optionalInputStub = sinon.stub(actionsUtil, "getOptionalInput");
optionalInputStub.withArgs("cleanup-level").returns("none");
optionalInputStub.withArgs("expect-error").returns("false");


analyze-action-env.test.js.map: file diff suppressed because one or more lines are too long


@@ -75,6 +75,7 @@ const util = __importStar(require("./util"));
const requiredInputStub = sinon.stub(actionsUtil, "getRequiredInput");
requiredInputStub.withArgs("token").returns("fake-token");
requiredInputStub.withArgs("upload-database").returns("false");
requiredInputStub.withArgs("output").returns("out");
const optionalInputStub = sinon.stub(actionsUtil, "getOptionalInput");
optionalInputStub.withArgs("cleanup-level").returns("none");
optionalInputStub.withArgs("expect-error").returns("false");


analyze-action-input.test.js.map: file diff suppressed because one or more lines are too long

lib/analyze.js (generated, 6 lines changed)

@@ -423,6 +423,12 @@ async function runQueries(sarifFolder, memoryFlag, addSnippetsFlag, threadsFlag,
queryFlags.push("--extension-packs=codeql-action/pr-diff-range");
incrementalMode.push("diff-informed");
}
statusReport.analysis_is_overlay =
config.augmentationProperties.overlayDatabaseMode ===
overlay_database_utils_1.OverlayDatabaseMode.Overlay;
statusReport.analysis_builds_overlay_base_database =
config.augmentationProperties.overlayDatabaseMode ===
overlay_database_utils_1.OverlayDatabaseMode.OverlayBase;
if (config.augmentationProperties.overlayDatabaseMode ===
overlay_database_utils_1.OverlayDatabaseMode.Overlay) {
incrementalMode.push("overlay");

File diff suppressed because one or more lines are too long

lib/analyze.test.js (generated, 2 lines changed)

@@ -116,7 +116,9 @@ const util = __importStar(require("./util"));
});
const statusReport = await (0, analyze_1.runQueries)(tmpDir, memoryFlag, addSnippetsFlag, threadsFlag, "brutal", undefined, undefined, config, (0, logging_1.getRunnerLogger)(true), (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.QaTelemetryEnabled]));
t.deepEqual(Object.keys(statusReport).sort(), [
"analysis_builds_overlay_base_database",
"analysis_is_diff_informed",
"analysis_is_overlay",
`analyze_builtin_queries_${language}_duration_ms`,
"event_reports",
`interpret_results_${language}_duration_ms`,

File diff suppressed because one or more lines are too long


@@ -1 +1 @@
{ "maximumVersion": "3.18", "minimumVersion": "3.14" }
{ "maximumVersion": "3.18", "minimumVersion": "3.13" }

lib/codeql.js (generated, 55 lines changed)

@@ -50,6 +50,7 @@ const toolrunner = __importStar(require("@actions/exec/lib/toolrunner"));
const yaml = __importStar(require("js-yaml"));
const actions_util_1 = require("./actions-util");
const cli_errors_1 = require("./cli-errors");
const config_utils_1 = require("./config-utils");
const doc_url_1 = require("./doc-url");
const environment_1 = require("./environment");
const feature_flags_1 = require("./feature-flags");
@@ -261,7 +262,7 @@ async function getCodeQLForCmd(cmd, checkVersion) {
extraArgs.push(...(await getTrapCachingExtractorConfigArgs(config)));
extraArgs.push(`--trace-process-name=${processName}`);
}
const codeScanningConfigFile = await generateCodeScanningConfig(config, logger);
const codeScanningConfigFile = await writeCodeScanningConfigFile(config, logger);
const externalRepositoryToken = (0, actions_util_1.getOptionalInput)("external-repository-token");
extraArgs.push(`--codescanning-config=${codeScanningConfigFile}`);
if (externalRepositoryToken) {
@@ -756,57 +757,9 @@ async function runCli(cmd, args = [], opts = {}) {
* @param config The configuration to use.
* @returns the path to the generated user configuration file.
*/
async function generateCodeScanningConfig(config, logger) {
async function writeCodeScanningConfigFile(config, logger) {
const codeScanningConfigFile = getGeneratedCodeScanningConfigPath(config);
// make a copy so we can modify it
const augmentedConfig = (0, util_1.cloneObject)(config.originalUserInput);
// Inject the queries from the input
if (config.augmentationProperties.queriesInput) {
if (config.augmentationProperties.queriesInputCombines) {
augmentedConfig.queries = (augmentedConfig.queries || []).concat(config.augmentationProperties.queriesInput);
}
else {
augmentedConfig.queries = config.augmentationProperties.queriesInput;
}
}
if (augmentedConfig.queries?.length === 0) {
delete augmentedConfig.queries;
}
// Inject the packs from the input
if (config.augmentationProperties.packsInput) {
if (config.augmentationProperties.packsInputCombines) {
// At this point, we already know that this is a single-language analysis
if (Array.isArray(augmentedConfig.packs)) {
augmentedConfig.packs = (augmentedConfig.packs || []).concat(config.augmentationProperties.packsInput);
}
else if (!augmentedConfig.packs) {
augmentedConfig.packs = config.augmentationProperties.packsInput;
}
else {
// At this point, we know there is only one language.
// If there were more than one language, an error would already have been thrown.
const language = Object.keys(augmentedConfig.packs)[0];
augmentedConfig.packs[language] = augmentedConfig.packs[language].concat(config.augmentationProperties.packsInput);
}
}
else {
augmentedConfig.packs = config.augmentationProperties.packsInput;
}
}
if (Array.isArray(augmentedConfig.packs) && !augmentedConfig.packs.length) {
delete augmentedConfig.packs;
}
augmentedConfig["query-filters"] = [
// Ordering matters. If the first filter is an inclusion, it implicitly
// excludes all queries that are not included. If it is an exclusion,
// it implicitly includes all queries that are not excluded. So user
// filters (if any) should always be first to preserve intent.
...(augmentedConfig["query-filters"] || []),
...(config.augmentationProperties.extraQueryExclusions || []),
];
if (augmentedConfig["query-filters"]?.length === 0) {
delete augmentedConfig["query-filters"];
}
const augmentedConfig = (0, config_utils_1.generateCodeScanningConfig)(config.originalUserInput, config.augmentationProperties);
logger.info(`Writing augmented user configuration file to ${codeScanningConfigFile}`);
logger.startGroup("Augmented user configuration file contents");
logger.info(yaml.dump(augmentedConfig));

File diff suppressed because one or more lines are too long

lib/config-utils.js (generated, 204 lines changed)

@@ -58,6 +58,7 @@ exports.getConfig = getConfig;
exports.generateRegistries = generateRegistries;
exports.wrapEnvironment = wrapEnvironment;
exports.parseBuildModeInput = parseBuildModeInput;
exports.generateCodeScanningConfig = generateCodeScanningConfig;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const perf_hooks_1 = require("perf_hooks");
@@ -231,12 +232,12 @@ async function getRawLanguages(languagesInput, repository, logger) {
return { rawLanguages, autodetected };
}
/**
* Get the default config for when the user has not supplied one.
* Get the default config, populated without user configuration file.
*/
async function getDefaultConfig({ languagesInput, queriesInput, qualityQueriesInput, packsInput, buildModeInput, dbLocation, trapCachingEnabled, dependencyCachingEnabled, debugMode, debugArtifactName, debugDatabaseName, repository, tempDir, codeql, sourceRoot, githubVersion, features, logger, }) {
async function getDefaultConfig({ languagesInput, queriesInput, qualityQueriesInput, packsInput, buildModeInput, dbLocation, trapCachingEnabled, dependencyCachingEnabled, debugMode, debugArtifactName, debugDatabaseName, repository, tempDir, codeql, githubVersion, features, logger, }) {
const languages = await getLanguages(codeql, languagesInput, repository, logger);
const buildMode = await parseBuildModeInput(buildModeInput, languages, features, logger);
const augmentationProperties = await calculateAugmentation(codeql, repository, features, packsInput, queriesInput, qualityQueriesInput, languages, sourceRoot, buildMode, logger);
const augmentationProperties = await calculateAugmentation(packsInput, queriesInput, qualityQueriesInput, languages);
const { trapCaches, trapCacheDownloadTime } = await downloadCacheWithTime(trapCachingEnabled, codeql, languages, logger);
return {
languages,
@@ -265,11 +266,7 @@ async function downloadCacheWithTime(trapCachingEnabled, codeQL, languages, logg
}
return { trapCaches, trapCacheDownloadTime };
}
/**
* Load the config from the given file.
*/
async function loadConfig({ languagesInput, queriesInput, qualityQueriesInput, packsInput, buildModeInput, configFile, dbLocation, trapCachingEnabled, dependencyCachingEnabled, debugMode, debugArtifactName, debugDatabaseName, repository, tempDir, codeql, workspacePath, sourceRoot, githubVersion, apiDetails, features, logger, }) {
let parsedYAML;
async function loadUserConfig(configFile, workspacePath, apiDetails, tempDir) {
if (isLocal(configFile)) {
if (configFile !== userConfigFromActionPath(tempDir)) {
// If the config file is not generated by the Action, it should be relative to the workspace.
@@ -279,31 +276,11 @@ async function loadConfig({ languagesInput, queriesInput, qualityQueriesInput, p
throw new util_1.ConfigurationError(getConfigFileOutsideWorkspaceErrorMessage(configFile));
}
}
parsedYAML = getLocalConfig(configFile);
return getLocalConfig(configFile);
}
else {
parsedYAML = await getRemoteConfig(configFile, apiDetails);
return await getRemoteConfig(configFile, apiDetails);
}
const languages = await getLanguages(codeql, languagesInput, repository, logger);
const buildMode = await parseBuildModeInput(buildModeInput, languages, features, logger);
const augmentationProperties = await calculateAugmentation(codeql, repository, features, packsInput, queriesInput, qualityQueriesInput, languages, sourceRoot, buildMode, logger);
const { trapCaches, trapCacheDownloadTime } = await downloadCacheWithTime(trapCachingEnabled, codeql, languages, logger);
return {
languages,
buildMode,
originalUserInput: parsedYAML,
tempDir,
codeQLCmd: codeql.getPath(),
gitHubVersion: githubVersion,
dbLocation: dbLocationOrDefault(dbLocation, tempDir),
debugMode,
debugArtifactName,
debugDatabaseName,
augmentationProperties,
trapCaches,
trapCacheDownloadTime,
dependencyCachingEnabled: (0, caching_utils_1.getCachingKind)(dependencyCachingEnabled),
};
}
/**
* Calculates how the codeql config file needs to be augmented before passing
@@ -312,17 +289,11 @@ async function loadConfig({ languagesInput, queriesInput, qualityQueriesInput, p
* and the CLI does not know about these inputs so we need to inject them into
* the config file sent to the CLI.
*
* @param codeql The CodeQL object.
* @param repository The repository to analyze.
* @param features The feature enablement object.
* @param rawPacksInput The packs input from the action configuration.
* @param rawQueriesInput The queries input from the action configuration.
* @param languages The languages that the config file is for. If the packs input
* is non-empty, then there must be exactly one language. Otherwise, an
* error is thrown.
* @param sourceRoot The source root of the repository.
* @param buildMode The build mode to use.
* @param logger The logger to use for logging.
*
* @returns The properties that need to be augmented in the config file.
*
@@ -330,30 +301,21 @@ async function loadConfig({ languagesInput, queriesInput, qualityQueriesInput, p
* not have exactly one language.
*/
// exported for testing.
async function calculateAugmentation(codeql, repository, features, rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages, sourceRoot, buildMode, logger) {
async function calculateAugmentation(rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages) {
const packsInputCombines = shouldCombine(rawPacksInput);
const packsInput = parsePacksFromInput(rawPacksInput, languages, packsInputCombines);
const queriesInputCombines = shouldCombine(rawQueriesInput);
const queriesInput = parseQueriesFromInput(rawQueriesInput, queriesInputCombines);
const { overlayDatabaseMode, useOverlayDatabaseCaching } = await getOverlayDatabaseMode(codeql, repository, features, languages, sourceRoot, buildMode, logger);
logger.info(`Using overlay database mode: ${overlayDatabaseMode} ` +
`${useOverlayDatabaseCaching ? "with" : "without"} caching.`);
const qualityQueriesInput = parseQueriesFromInput(rawQualityQueriesInput, false);
const extraQueryExclusions = [];
if (await (0, diff_informed_analysis_utils_1.shouldPerformDiffInformedAnalysis)(codeql, features, logger)) {
extraQueryExclusions.push({
exclude: { tags: "exclude-from-incremental" },
});
}
return {
packsInputCombines,
packsInput: packsInput?.[languages[0]],
queriesInput,
queriesInputCombines,
qualityQueriesInput,
extraQueryExclusions,
overlayDatabaseMode,
useOverlayDatabaseCaching,
extraQueryExclusions: [],
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
};
}
function parseQueriesFromInput(rawQueriesInput, queriesInputCombines) {
@@ -368,6 +330,64 @@ function parseQueriesFromInput(rawQueriesInput, queriesInputCombines) {
}
return trimmedInput.split(",").map((query) => ({ uses: query.trim() }));
}
const OVERLAY_ANALYSIS_FEATURES = {
actions: feature_flags_1.Feature.OverlayAnalysisActions,
cpp: feature_flags_1.Feature.OverlayAnalysisCpp,
csharp: feature_flags_1.Feature.OverlayAnalysisCsharp,
go: feature_flags_1.Feature.OverlayAnalysisGo,
java: feature_flags_1.Feature.OverlayAnalysisJava,
javascript: feature_flags_1.Feature.OverlayAnalysisJavascript,
python: feature_flags_1.Feature.OverlayAnalysisPython,
ruby: feature_flags_1.Feature.OverlayAnalysisRuby,
rust: feature_flags_1.Feature.OverlayAnalysisRust,
swift: feature_flags_1.Feature.OverlayAnalysisSwift,
};
const OVERLAY_ANALYSIS_CODE_SCANNING_FEATURES = {
actions: feature_flags_1.Feature.OverlayAnalysisCodeScanningActions,
cpp: feature_flags_1.Feature.OverlayAnalysisCodeScanningCpp,
csharp: feature_flags_1.Feature.OverlayAnalysisCodeScanningCsharp,
go: feature_flags_1.Feature.OverlayAnalysisCodeScanningGo,
java: feature_flags_1.Feature.OverlayAnalysisCodeScanningJava,
javascript: feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
python: feature_flags_1.Feature.OverlayAnalysisCodeScanningPython,
ruby: feature_flags_1.Feature.OverlayAnalysisCodeScanningRuby,
rust: feature_flags_1.Feature.OverlayAnalysisCodeScanningRust,
swift: feature_flags_1.Feature.OverlayAnalysisCodeScanningSwift,
};
async function isOverlayAnalysisFeatureEnabled(repository, features, codeql, languages, codeScanningConfig) {
// TODO: Remove the repository owner check once support for overlay analysis
// stabilizes, and no more backward-incompatible changes are expected.
if (!["github", "dsp-testing"].includes(repository.owner)) {
return false;
}
if (!(await features.getValue(feature_flags_1.Feature.OverlayAnalysis, codeql))) {
return false;
}
let enableForCodeScanningOnly = false;
for (const language of languages) {
const feature = OVERLAY_ANALYSIS_FEATURES[language];
if (feature && (await features.getValue(feature, codeql))) {
continue;
}
const codeScanningFeature = OVERLAY_ANALYSIS_CODE_SCANNING_FEATURES[language];
if (codeScanningFeature &&
(await features.getValue(codeScanningFeature, codeql))) {
enableForCodeScanningOnly = true;
continue;
}
return false;
}
if (enableForCodeScanningOnly) {
// A code-scanning configuration runs only the (default) code-scanning suite
// if the default queries are not disabled, and no packs, queries, or
// query-filters are specified.
return (codeScanningConfig["disable-default-queries"] !== true &&
codeScanningConfig.packs === undefined &&
codeScanningConfig.queries === undefined &&
codeScanningConfig["query-filters"] === undefined);
}
return true;
}
/**
* Calculate and validate the overlay database mode and caching to use.
*
@@ -389,7 +409,7 @@ function parseQueriesFromInput(rawQueriesInput, queriesInputCombines) {
* @returns An object containing the overlay database mode and whether the
* action should perform overlay-base database caching.
*/
async function getOverlayDatabaseMode(codeql, repository, features, languages, sourceRoot, buildMode, logger) {
async function getOverlayDatabaseMode(codeql, repository, features, languages, sourceRoot, buildMode, codeScanningConfig, logger) {
let overlayDatabaseMode = overlay_database_utils_1.OverlayDatabaseMode.None;
let useOverlayDatabaseCaching = false;
const modeEnv = process.env.CODEQL_OVERLAY_DATABASE_MODE;
@@ -402,11 +422,7 @@ async function getOverlayDatabaseMode(codeql, repository, features, languages, s
logger.info(`Setting overlay database mode to ${overlayDatabaseMode} ` +
"from the CODEQL_OVERLAY_DATABASE_MODE environment variable.");
}
else if (
// TODO: Remove the repository owner check once support for overlay analysis
// stabilizes, and no more backward-incompatible changes are expected.
["github", "dsp-testing"].includes(repository.owner) &&
(await features.getValue(feature_flags_1.Feature.OverlayAnalysis, codeql))) {
else if (await isOverlayAnalysisFeatureEnabled(repository, features, codeql, languages, codeScanningConfig)) {
if ((0, actions_util_1.isAnalyzingPullRequest)()) {
overlayDatabaseMode = overlay_database_utils_1.OverlayDatabaseMode.Overlay;
useOverlayDatabaseCaching = true;
@@ -586,7 +602,6 @@ function userConfigFromActionPath(tempDir) {
* a default config. The parsed config is then stored to a known location.
*/
async function initConfig(inputs) {
let config;
const { logger, tempDir } = inputs;
// if configInput is set, it takes precedence over configFile
if (inputs.configInput) {
@@ -597,14 +612,31 @@ async function initConfig(inputs) {
fs.writeFileSync(inputs.configFile, inputs.configInput);
logger.debug(`Using config from action input: ${inputs.configFile}`);
}
// If no config file was provided create an empty one
let userConfig = {};
if (!inputs.configFile) {
logger.debug("No configuration file was provided");
config = await getDefaultConfig(inputs);
}
else {
// Convince the type checker that inputs.configFile is defined.
config = await loadConfig({ ...inputs, configFile: inputs.configFile });
logger.debug(`Using configuration file: ${inputs.configFile}`);
userConfig = await loadUserConfig(inputs.configFile, inputs.workspacePath, inputs.apiDetails, tempDir);
}
const config = await getDefaultConfig(inputs);
const augmentationProperties = config.augmentationProperties;
config.originalUserInput = userConfig;
// The choice of overlay database mode depends on the selection of languages
// and queries, which in turn depends on the user config and the augmentation
// properties. So we need to calculate the overlay database mode after the
// rest of the config has been populated.
const { overlayDatabaseMode, useOverlayDatabaseCaching } = await getOverlayDatabaseMode(inputs.codeql, inputs.repository, inputs.features, config.languages, inputs.sourceRoot, config.buildMode, generateCodeScanningConfig(userConfig, augmentationProperties), logger);
logger.info(`Using overlay database mode: ${overlayDatabaseMode} ` +
`${useOverlayDatabaseCaching ? "with" : "without"} caching.`);
augmentationProperties.overlayDatabaseMode = overlayDatabaseMode;
augmentationProperties.useOverlayDatabaseCaching = useOverlayDatabaseCaching;
if (overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.Overlay ||
(await (0, diff_informed_analysis_utils_1.shouldPerformDiffInformedAnalysis)(inputs.codeql, inputs.features, logger))) {
augmentationProperties.extraQueryExclusions.push({
exclude: { tags: "exclude-from-incremental" },
});
}
// Save the config so we can easily access it again in the future
await saveConfig(config, logger);
@@ -807,4 +839,56 @@ async function parseBuildModeInput(input, languages, features, logger) {
}
return input;
}
function generateCodeScanningConfig(originalUserInput, augmentationProperties) {
// make a copy so we can modify it
const augmentedConfig = (0, util_1.cloneObject)(originalUserInput);
// Inject the queries from the input
if (augmentationProperties.queriesInput) {
if (augmentationProperties.queriesInputCombines) {
augmentedConfig.queries = (augmentedConfig.queries || []).concat(augmentationProperties.queriesInput);
}
else {
augmentedConfig.queries = augmentationProperties.queriesInput;
}
}
if (augmentedConfig.queries?.length === 0) {
delete augmentedConfig.queries;
}
// Inject the packs from the input
if (augmentationProperties.packsInput) {
if (augmentationProperties.packsInputCombines) {
// At this point, we already know that this is a single-language analysis
if (Array.isArray(augmentedConfig.packs)) {
augmentedConfig.packs = (augmentedConfig.packs || []).concat(augmentationProperties.packsInput);
}
else if (!augmentedConfig.packs) {
augmentedConfig.packs = augmentationProperties.packsInput;
}
else {
// At this point, we know there is only one language.
// If there were more than one language, an error would already have been thrown.
const language = Object.keys(augmentedConfig.packs)[0];
augmentedConfig.packs[language] = augmentedConfig.packs[language].concat(augmentationProperties.packsInput);
}
}
else {
augmentedConfig.packs = augmentationProperties.packsInput;
}
}
if (Array.isArray(augmentedConfig.packs) && !augmentedConfig.packs.length) {
delete augmentedConfig.packs;
}
augmentedConfig["query-filters"] = [
// Ordering matters. If the first filter is an inclusion, it implicitly
// excludes all queries that are not included. If it is an exclusion,
// it implicitly includes all queries that are not excluded. So user
// filters (if any) should always be first to preserve intent.
...(augmentedConfig["query-filters"] || []),
...augmentationProperties.extraQueryExclusions,
];
if (augmentedConfig["query-filters"]?.length === 0) {
delete augmentedConfig["query-filters"];
}
return augmentedConfig;
}
//# sourceMappingURL=config-utils.js.map

File diff suppressed because one or more lines are too long

lib/config-utils.test.js (generated, 237 lines changed)

@@ -629,9 +629,7 @@ const packSpecPrettyPrintingMacro = ava_1.default.macro({
const mockLogger = (0, logging_1.getRunnerLogger)(true);
const calculateAugmentationMacro = ava_1.default.macro({
exec: async (t, _title, rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages, expectedAugmentationProperties) => {
const actualAugmentationProperties = await configUtils.calculateAugmentation((0, codeql_1.getCachedCodeQL)(), { owner: "github", repo: "repo" }, (0, testing_utils_1.createFeatures)([]), rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages, "", // sourceRoot
undefined, // buildMode
mockLogger);
const actualAugmentationProperties = await configUtils.calculateAugmentation(rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages);
t.deepEqual(actualAugmentationProperties, expectedAugmentationProperties);
},
title: (_, title) => `Calculate Augmentation: ${title}`,
@@ -678,9 +676,7 @@ const calculateAugmentationMacro = ava_1.default.macro({
});
const calculateAugmentationErrorMacro = ava_1.default.macro({
exec: async (t, _title, rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages, expectedError) => {
await t.throwsAsync(() => configUtils.calculateAugmentation((0, codeql_1.getCachedCodeQL)(), { owner: "github", repo: "repo" }, (0, testing_utils_1.createFeatures)([]), rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages, "", // sourceRoot
undefined, // buildMode
mockLogger), { message: expectedError });
await t.throwsAsync(() => configUtils.calculateAugmentation(rawPacksInput, rawQueriesInput, rawQualityQueriesInput, languages), { message: expectedError });
},
title: (_, title) => `Calculate Augmentation Error: ${title}`,
});
@@ -834,7 +830,7 @@ for (const { displayName, language, feature } of [
}
const defaultOverlayDatabaseModeTestSetup = {
overlayDatabaseEnvVar: undefined,
isFeatureEnabled: false,
features: [],
isPullRequest: false,
isDefaultBranch: false,
repositoryOwner: "github",
@@ -842,6 +838,7 @@ const defaultOverlayDatabaseModeTestSetup = {
languages: [languages_1.Language.javascript],
codeqlVersion: "2.21.0",
gitRoot: "/some/git/root",
codeScanningConfig: {},
};
const getOverlayDatabaseModeMacro = ava_1.default.macro({
exec: async (t, _title, setupOverrides, expected) => {
@@ -862,7 +859,7 @@ const getOverlayDatabaseModeMacro = ava_1.default.macro({
setup.overlayDatabaseEnvVar;
}
// Mock feature flags
const features = (0, testing_utils_1.createFeatures)(setup.isFeatureEnabled ? [feature_flags_1.Feature.OverlayAnalysis] : []);
const features = (0, testing_utils_1.createFeatures)(setup.features);
// Mock isAnalyzingPullRequest function
sinon
.stub(actionsUtil, "isAnalyzingPullRequest")
@@ -883,7 +880,7 @@ const getOverlayDatabaseModeMacro = ava_1.default.macro({
.stub(gitUtils, "isAnalyzingDefaultBranch")
.resolves(setup.isDefaultBranch);
const result = await configUtils.getOverlayDatabaseMode(codeql, repository, features, setup.languages, tempDir, // sourceRoot
setup.buildMode, logger);
setup.buildMode, setup.codeScanningConfig, logger);
t.deepEqual(result, expected);
}
finally {
@@ -919,32 +916,227 @@ const getOverlayDatabaseModeMacro = ava_1.default.macro({
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Ignore feature flag when analyzing non-default branch", {
isFeatureEnabled: true,
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay-base database on default branch when feature enabled", {
isFeatureEnabled: true,
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.OverlayBase,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when feature disabled", {
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay-base database on default branch when feature enabled with custom analysis", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
codeScanningConfig: {
packs: ["some-custom-pack@1.0.0"],
},
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.OverlayBase,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay-base database on default branch when code-scanning feature enabled", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.OverlayBase,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when code-scanning feature enabled with disable-default-queries", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
"disable-default-queries": true,
},
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when code-scanning feature enabled with packs", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
packs: ["some-custom-pack@1.0.0"],
},
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when code-scanning feature enabled with queries", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
queries: [{ uses: "some-query.ql" }],
},
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when code-scanning feature enabled with query-filters", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
"query-filters": [{ include: { "security-severity": "high" } }],
},
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when only language-specific feature enabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysisJavascript],
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when only code-scanning feature enabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript],
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay-base database on default branch when language-specific feature disabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis],
isDefaultBranch: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay analysis on PR when feature enabled", {
isFeatureEnabled: true,
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when feature disabled", {
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay analysis on PR when feature enabled with custom analysis", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
codeScanningConfig: {
packs: ["some-custom-pack@1.0.0"],
},
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay analysis on PR when code-scanning feature enabled", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when code-scanning feature enabled with disable-default-queries", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
"disable-default-queries": true,
},
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when code-scanning feature enabled with packs", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
packs: ["some-custom-pack@1.0.0"],
},
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when code-scanning feature enabled with queries", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
queries: [{ uses: "some-query.ql" }],
},
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when code-scanning feature enabled with query-filters", {
languages: [languages_1.Language.javascript],
features: [
feature_flags_1.Feature.OverlayAnalysis,
feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript,
],
codeScanningConfig: {
"query-filters": [{ include: { "security-severity": "high" } }],
},
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when only language-specific feature enabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysisJavascript],
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when only code-scanning feature enabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysisCodeScanningJavascript],
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay analysis on PR when language-specific feature disabled", {
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis],
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
@@ -965,7 +1157,8 @@ const getOverlayDatabaseModeMacro = ava_1.default.macro({
useOverlayDatabaseCaching: false,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "Overlay PR analysis by feature flag for dsp-testing", {
isFeatureEnabled: true,
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
isPullRequest: true,
repositoryOwner: "dsp-testing",
}, {
@@ -973,7 +1166,8 @@ const getOverlayDatabaseModeMacro = ava_1.default.macro({
useOverlayDatabaseCaching: true,
});
(0, ava_1.default)(getOverlayDatabaseModeMacro, "No overlay PR analysis by feature flag for other-org", {
isFeatureEnabled: true,
languages: [languages_1.Language.javascript],
features: [feature_flags_1.Feature.OverlayAnalysis, feature_flags_1.Feature.OverlayAnalysisJavascript],
isPullRequest: true,
repositoryOwner: "other-org",
}, {
@@ -1010,4 +1204,15 @@ const getOverlayDatabaseModeMacro = ava_1.default.macro({
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
// Exercise language-specific overlay analysis features code paths
for (const language in languages_1.Language) {
(0, ava_1.default)(getOverlayDatabaseModeMacro, `Check default overlay analysis feature for ${language}`, {
languages: [language],
features: [feature_flags_1.Feature.OverlayAnalysis],
isPullRequest: true,
}, {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
useOverlayDatabaseCaching: false,
});
}
//# sourceMappingURL=config-utils.test.js.map
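
The long run of registrations above is table-driven: every case funnels through getOverlayDatabaseModeMacro and overrides only the fields it cares about. A minimal sketch of the same AVA-macro pattern, with a toy decision function standing in for the real logic (all names below are illustrative, not the action's actual helpers):

const test = require("ava");

// Toy stand-in for the function under test; the real logic also considers
// pull-request state, repository owner, and the code-scanning config.
function decideMode({ features, isDefaultBranch }) {
    return features.includes("overlay_analysis") && isDefaultBranch
        ? { overlayDatabaseMode: "overlay-base", useOverlayDatabaseCaching: true }
        : { overlayDatabaseMode: "none", useOverlayDatabaseCaching: false };
}

// Shared defaults; each registration overrides only what it needs to.
const defaults = { languages: ["javascript"], features: [], isDefaultBranch: false };

const macro = test.macro({
    exec: (t, _title, partialCase, expected) => {
        t.deepEqual(decideMode({ ...defaults, ...partialCase }), expected);
    },
    title: (_, title) => `getOverlayDatabaseMode: ${title}`,
});

test(macro, "feature enabled on default branch", { features: ["overlay_analysis"], isDefaultBranch: true }, { overlayDatabaseMode: "overlay-base", useOverlayDatabaseCaching: true });
test(macro, "feature disabled", {}, { overlayDatabaseMode: "none", useOverlayDatabaseCaching: false });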

File diff suppressed because one or more lines are too long

View File

@@ -40,7 +40,9 @@ exports.readDiffRangesJsonFile = readDiffRangesJsonFile;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const actionsUtil = __importStar(require("./actions-util"));
const api_client_1 = require("./api-client");
const feature_flags_1 = require("./feature-flags");
const util_1 = require("./util");
/**
* Check if the action should perform diff-informed analysis.
*/
@@ -59,6 +61,11 @@ async function getDiffInformedAnalysisBranches(codeql, features, logger) {
if (!(await features.getValue(feature_flags_1.Feature.DiffInformedQueries, codeql))) {
return undefined;
}
const gitHubVersion = await (0, api_client_1.getGitHubVersion)();
if (gitHubVersion.type === util_1.GitHubVariant.GHES &&
(0, util_1.satisfiesGHESVersion)(gitHubVersion.version, "<3.19", true)) {
return undefined;
}
const branches = actionsUtil.getPullRequestBranches();
if (!branches) {
logger.info("Not performing diff-informed analysis " +

File diff suppressed because one or more lines are too long

130
lib/diff-informed-analysis-utils.test.js generated Normal file
View File

@@ -0,0 +1,130 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
var ownKeys = function(o) {
ownKeys = Object.getOwnPropertyNames || function (o) {
var ar = [];
for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
return ar;
};
return ownKeys(o);
};
return function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
__setModuleDefault(result, mod);
return result;
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
const ava_1 = __importDefault(require("ava"));
const sinon = __importStar(require("sinon"));
const actionsUtil = __importStar(require("./actions-util"));
const apiClient = __importStar(require("./api-client"));
const diff_informed_analysis_utils_1 = require("./diff-informed-analysis-utils");
const feature_flags_1 = require("./feature-flags");
const logging_1 = require("./logging");
const repository_1 = require("./repository");
const testing_utils_1 = require("./testing-utils");
const util_1 = require("./util");
(0, testing_utils_1.setupTests)(ava_1.default);
const defaultTestCase = {
featureEnabled: true,
gitHubVersion: {
type: util_1.GitHubVariant.DOTCOM,
},
pullRequestBranches: {
base: "main",
head: "feature-branch",
},
codeQLVersion: "2.21.0",
};
const testShouldPerformDiffInformedAnalysis = ava_1.default.macro({
exec: async (t, _title, partialTestCase, expectedResult) => {
return await (0, util_1.withTmpDir)(async (tmpDir) => {
(0, testing_utils_1.setupActionsVars)(tmpDir, tmpDir);
const testCase = { ...defaultTestCase, ...partialTestCase };
const logger = (0, logging_1.getRunnerLogger)(true);
const codeql = (0, testing_utils_1.mockCodeQLVersion)(testCase.codeQLVersion);
if (testCase.diffInformedQueriesEnvVar !== undefined) {
process.env.CODEQL_ACTION_DIFF_INFORMED_QUERIES =
testCase.diffInformedQueriesEnvVar.toString();
}
else {
delete process.env.CODEQL_ACTION_DIFF_INFORMED_QUERIES;
}
const features = new feature_flags_1.Features(testCase.gitHubVersion, (0, repository_1.parseRepositoryNwo)("github/example"), tmpDir, logger);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, {
[feature_flags_1.Feature.DiffInformedQueries]: testCase.featureEnabled,
});
const getGitHubVersionStub = sinon
.stub(apiClient, "getGitHubVersion")
.resolves(testCase.gitHubVersion);
const getPullRequestBranchesStub = sinon
.stub(actionsUtil, "getPullRequestBranches")
.returns(testCase.pullRequestBranches);
const result = await (0, diff_informed_analysis_utils_1.shouldPerformDiffInformedAnalysis)(codeql, features, logger);
t.is(result, expectedResult);
delete process.env.CODEQL_ACTION_DIFF_INFORMED_QUERIES;
getGitHubVersionStub.restore();
getPullRequestBranchesStub.restore();
});
},
title: (_, title) => `shouldPerformDiffInformedAnalysis: ${title}`,
});
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns true in the default test case", {}, true);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns false when feature flag is disabled from the API", {
featureEnabled: false,
}, false);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns false when CODEQL_ACTION_DIFF_INFORMED_QUERIES is set to false", {
featureEnabled: true,
diffInformedQueriesEnvVar: false,
}, false);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns true when CODEQL_ACTION_DIFF_INFORMED_QUERIES is set to true", {
featureEnabled: false,
diffInformedQueriesEnvVar: true,
}, true);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns false for CodeQL version 2.20.0", {
codeQLVersion: "2.20.0",
}, false);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns false for invalid GHES version", {
gitHubVersion: {
type: util_1.GitHubVariant.GHES,
version: "invalid-version",
},
}, false);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns false for GHES version 3.18.5", {
gitHubVersion: {
type: util_1.GitHubVariant.GHES,
version: "3.18.5",
},
}, false);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns true for GHES version 3.19.0", {
gitHubVersion: {
type: util_1.GitHubVariant.GHES,
version: "3.19.0",
},
}, true);
(0, ava_1.default)(testShouldPerformDiffInformedAnalysis, "returns false when not a pull request", {
pullRequestBranches: undefined,
}, false);
//# sourceMappingURL=diff-informed-analysis-utils.test.js.map
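
The registrations above pin down a precedence rule: when the CODEQL_ACTION_DIFF_INFORMED_QUERIES environment variable is set, it wins over the feature flag from the API in either direction. A self-contained sketch of that rule (the helper name is hypothetical; the real decision lives inside the Features class):

// Hypothetical condensation of the precedence the tests assert:
// an explicit env var beats the remote feature-flag value.
function diffInformedEnabled(apiFlagValue, env = process.env) {
    const override = env.CODEQL_ACTION_DIFF_INFORMED_QUERIES;
    if (override !== undefined) {
        return override === "true";
    }
    return apiFlagValue;
}

console.log(diffInformedEnabled(true, { CODEQL_ACTION_DIFF_INFORMED_QUERIES: "false" })); // false
console.log(diffInformedEnabled(false, { CODEQL_ACTION_DIFF_INFORMED_QUERIES: "true" })); // true
console.log(diffInformedEnabled(true, {})); // true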

File diff suppressed because one or more lines are too long

146
lib/feature-flags.js generated
View File

@@ -69,6 +69,26 @@ var Feature;
Feature["ExportDiagnosticsEnabled"] = "export_diagnostics_enabled";
Feature["ExtractToToolcache"] = "extract_to_toolcache";
Feature["OverlayAnalysis"] = "overlay_analysis";
Feature["OverlayAnalysisActions"] = "overlay_analysis_actions";
Feature["OverlayAnalysisCodeScanningActions"] = "overlay_analysis_code_scanning_actions";
Feature["OverlayAnalysisCodeScanningCpp"] = "overlay_analysis_code_scanning_cpp";
Feature["OverlayAnalysisCodeScanningCsharp"] = "overlay_analysis_code_scanning_csharp";
Feature["OverlayAnalysisCodeScanningGo"] = "overlay_analysis_code_scanning_go";
Feature["OverlayAnalysisCodeScanningJava"] = "overlay_analysis_code_scanning_java";
Feature["OverlayAnalysisCodeScanningJavascript"] = "overlay_analysis_code_scanning_javascript";
Feature["OverlayAnalysisCodeScanningPython"] = "overlay_analysis_code_scanning_python";
Feature["OverlayAnalysisCodeScanningRuby"] = "overlay_analysis_code_scanning_ruby";
Feature["OverlayAnalysisCodeScanningRust"] = "overlay_analysis_code_scanning_rust";
Feature["OverlayAnalysisCodeScanningSwift"] = "overlay_analysis_code_scanning_swift";
Feature["OverlayAnalysisCpp"] = "overlay_analysis_cpp";
Feature["OverlayAnalysisCsharp"] = "overlay_analysis_csharp";
Feature["OverlayAnalysisGo"] = "overlay_analysis_go";
Feature["OverlayAnalysisJava"] = "overlay_analysis_java";
Feature["OverlayAnalysisJavascript"] = "overlay_analysis_javascript";
Feature["OverlayAnalysisPython"] = "overlay_analysis_python";
Feature["OverlayAnalysisRuby"] = "overlay_analysis_ruby";
Feature["OverlayAnalysisRust"] = "overlay_analysis_rust";
Feature["OverlayAnalysisSwift"] = "overlay_analysis_swift";
Feature["PythonDefaultIsToNotExtractStdlib"] = "python_default_is_to_not_extract_stdlib";
Feature["QaTelemetryEnabled"] = "qa_telemetry_enabled";
Feature["RustAnalysis"] = "rust_analysis";
@@ -97,7 +117,7 @@ exports.featureConfig = {
minimumVersion: "2.15.0",
},
[Feature.DiffInformedQueries]: {
defaultValue: false,
defaultValue: true,
envVar: "CODEQL_ACTION_DIFF_INFORMED_QUERIES",
minimumVersion: "2.21.0",
},
@@ -139,6 +159,106 @@ exports.featureConfig = {
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS",
minimumVersion: overlay_database_utils_1.CODEQL_OVERLAY_MINIMUM_VERSION,
},
[Feature.OverlayAnalysisActions]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_ACTIONS",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningActions]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_ACTIONS",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningCpp]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_CPP",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningCsharp]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_CSHARP",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningGo]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_GO",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningJava]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_JAVA",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningJavascript]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_JAVASCRIPT",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningPython]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_PYTHON",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningRuby]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_RUBY",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningRust]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_RUST",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCodeScanningSwift]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CODE_SCANNING_SWIFT",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCpp]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CPP",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisCsharp]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_CSHARP",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisGo]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_GO",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisJava]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_JAVA",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisJavascript]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_JAVASCRIPT",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisPython]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_PYTHON",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisRuby]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_RUBY",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisRust]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_RUST",
minimumVersion: undefined,
},
[Feature.OverlayAnalysisSwift]: {
defaultValue: false,
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_SWIFT",
minimumVersion: undefined,
},
[Feature.PythonDefaultIsToNotExtractStdlib]: {
defaultValue: false,
envVar: "CODEQL_ACTION_DISABLE_PYTHON_STANDARD_LIBRARY_EXTRACTION",
@@ -370,14 +490,22 @@ class GitHubFeatureFlags {
try {
const featuresToRequest = Object.entries(exports.featureConfig)
.filter(([, config]) => !config.legacyApi)
.map(([f]) => f)
.join(",");
const response = await (0, api_client_1.getApiClient)().request("GET /repos/:owner/:repo/code-scanning/codeql-action/features", {
owner: this.repositoryNwo.owner,
repo: this.repositoryNwo.repo,
features: featuresToRequest,
});
const remoteFlags = response.data;
.map(([f]) => f);
const FEATURES_PER_REQUEST = 25;
const featureChunks = [];
while (featuresToRequest.length > 0) {
featureChunks.push(featuresToRequest.splice(0, FEATURES_PER_REQUEST));
}
let remoteFlags = {};
for (const chunk of featureChunks) {
const response = await (0, api_client_1.getApiClient)().request("GET /repos/:owner/:repo/code-scanning/codeql-action/features", {
owner: this.repositoryNwo.owner,
repo: this.repositoryNwo.repo,
features: chunk.join(","),
});
const chunkFlags = response.data;
remoteFlags = { ...remoteFlags, ...chunkFlags };
}
this.logger.debug("Loaded the following default values for the feature flags from the Code Scanning API:");
for (const [feature, value] of Object.entries(remoteFlags).sort(([nameA], [nameB]) => nameA.localeCompare(nameB))) {
this.logger.debug(` ${feature}: ${value}`);
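
The rewritten loader above batches feature names into requests of at most FEATURES_PER_REQUEST = 25 and merges the responses, keeping each request within the API's limit now that the per-language overlay flags exist. The same batching in isolation, with a stubbed request in place of the real API client:

const FEATURES_PER_REQUEST = 25;

// Split items into chunks of at most `size`. Unlike the splice-based loop in
// the diff, this version leaves the input array intact.
function chunk(items, size) {
    const chunks = [];
    for (let i = 0; i < items.length; i += size) {
        chunks.push(items.slice(i, i + size));
    }
    return chunks;
}

// Stand-in for the request to
// "GET /repos/:owner/:repo/code-scanning/codeql-action/features".
async function requestFlags(features) {
    return Object.fromEntries(features.map((f) => [f, false]));
}

async function loadAllFlags(allFeatures) {
    let remoteFlags = {};
    for (const batch of chunk(allFeatures, FEATURES_PER_REQUEST)) {
        remoteFlags = { ...remoteFlags, ...(await requestFlags(batch)) };
    }
    return remoteFlags;
}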

File diff suppressed because one or more lines are too long

View File

@@ -36,7 +36,6 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.initializeFeatures = initializeFeatures;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const ava_1 = __importDefault(require("ava"));
@@ -68,7 +67,7 @@ const testRepositoryNwo = (0, repository_1.parseRepositoryNwo)("github/example")
await (0, util_1.withTmpDir)(async (tmpDir) => {
const loggedMessages = [];
const features = setUpFeatureFlagTests(tmpDir, (0, testing_utils_1.getRecordingLogger)(loggedMessages), { type: util_1.GitHubVariant.GHE_DOTCOM });
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, initializeFeatures(true));
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, (0, testing_utils_1.initializeFeatures)(true));
for (const feature of Object.values(feature_flags_1.Feature)) {
// Ensure we have gotten a response value back from the Mock API
t.assert(await features.getValue(feature, includeCodeQlIfRequired(feature)));
@@ -103,6 +102,24 @@ const testRepositoryNwo = (0, repository_1.parseRepositoryNwo)("github/example")
assertAllFeaturesUndefinedInApi(t, loggedMessages);
});
});
(0, ava_1.default)("Include no more than 25 features in each API request", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
(0, testing_utils_1.stubFeatureFlagApiEndpoint)((request) => {
const requestedFeatures = request.features.split(",");
return {
status: requestedFeatures.length <= 25 ? 200 : 400,
messageIfError: "Can request a maximum of 25 features.",
data: {},
};
});
// We only need to call getValue once, and it does not matter which feature
// we ask for. Under the hood, the features library will request all features
// from the API.
const feature = Object.values(feature_flags_1.Feature)[0];
await t.notThrowsAsync(async () => features.getValue(feature, includeCodeQlIfRequired(feature)));
});
});
(0, ava_1.default)("Feature flags exception is propagated if the API request errors", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
@@ -135,7 +152,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)(`Only feature '${feature}' is enabled if the associated environment variable is true. Others disabled.`, async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(false);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(false);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
// feature should be disabled initially
t.assert(!(await features.getValue(feature, includeCodeQlIfRequired(feature))));
@@ -147,7 +164,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)(`Feature '${feature}' is disabled if the associated environment variable is false, even if enabled in API`, async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
// feature should be enabled initially
t.assert(await features.getValue(feature, includeCodeQlIfRequired(feature)));
@@ -161,7 +178,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)(`Getting feature '${feature}' should throw if no codeql is provided`, async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
await t.throwsAsync(async () => features.getValue(feature), {
message: `Internal error: A ${feature_flags_1.featureConfig[feature].minimumVersion !== undefined
@@ -175,7 +192,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)(`Feature '${feature}' is disabled if the minimum CLI version is below ${feature_flags_1.featureConfig[feature].minimumVersion}`, async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
// feature should be disabled when an old CLI version is set
let codeql = (0, testing_utils_1.mockCodeQLVersion)("2.0.0");
@@ -199,7 +216,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)(`Feature '${feature}' is disabled if the required tools feature is not enabled`, async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
// feature should be disabled when the required tools feature is not enabled
let codeql = (0, testing_utils_1.mockCodeQLVersion)("2.0.0");
@@ -225,7 +242,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)("Feature flags are saved to disk", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
const cachedFeatureFlags = path.join(tmpDir, feature_flags_1.FEATURE_FLAGS_FILE_NAME);
t.false(fs.existsSync(cachedFeatureFlags), "Feature flag cached file should not exist before getting feature flags");
@@ -244,7 +261,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)("Environment variable can override feature flag cache", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
const cachedFeatureFlags = path.join(tmpDir, feature_flags_1.FEATURE_FLAGS_FILE_NAME);
t.true(await features.getValue(feature_flags_1.Feature.QaTelemetryEnabled, includeCodeQlIfRequired(feature_flags_1.Feature.QaTelemetryEnabled)), "Feature flag should be enabled initially");
@@ -266,7 +283,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)("selects CLI v2.20.1 on Dotcom when feature flags enable v2.20.0 and v2.20.1", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
expectedFeatureEnablement["default_codeql_version_2_20_0_enabled"] = true;
expectedFeatureEnablement["default_codeql_version_2_20_1_enabled"] = true;
expectedFeatureEnablement["default_codeql_version_2_20_2_enabled"] = false;
@@ -285,7 +302,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)("includes tag name", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
expectedFeatureEnablement["default_codeql_version_2_20_0_enabled"] = true;
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
const defaultCliVersion = await features.getDefaultCliVersion(util_1.GitHubVariant.DOTCOM);
@@ -299,7 +316,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
(0, ava_1.default)(`selects CLI from defaults.json on Dotcom when no default version feature flags are enabled`, async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const features = setUpFeatureFlagTests(tmpDir);
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
(0, testing_utils_1.mockFeatureFlagApiEndpoint)(200, expectedFeatureEnablement);
const defaultCliVersion = await features.getDefaultCliVersion(util_1.GitHubVariant.DOTCOM);
t.deepEqual(defaultCliVersion, {
@@ -313,7 +330,7 @@ for (const feature of Object.keys(feature_flags_1.featureConfig)) {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const loggedMessages = [];
const features = setUpFeatureFlagTests(tmpDir, (0, testing_utils_1.getRecordingLogger)(loggedMessages));
const expectedFeatureEnablement = initializeFeatures(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
expectedFeatureEnablement["default_codeql_version_2_20_0_enabled"] = true;
expectedFeatureEnablement["default_codeql_version_2_20_1_enabled"] = true;
expectedFeatureEnablement["default_codeql_version_2_20_invalid_enabled"] =
@@ -358,12 +375,6 @@ function assertAllFeaturesUndefinedInApi(t, loggedMessages) {
v.message.includes("undefined in API response")) !== undefined);
}
}
function initializeFeatures(initialValue) {
return Object.keys(feature_flags_1.featureConfig).reduce((features, key) => {
features[key] = initialValue;
return features;
}, {});
}
function setUpFeatureFlagTests(tmpDir, logger = (0, logging_1.getRunnerLogger)(true), gitHubVersion = { type: util_1.GitHubVariant.DOTCOM }) {
(0, testing_utils_1.setupActionsVars)(tmpDir, tmpDir);
return new feature_flags_1.Features(gitHubVersion, testRepositoryNwo, tmpDir, logger);

File diff suppressed because one or more lines are too long

13
lib/init-action.js generated
View File

@@ -57,7 +57,7 @@ const status_report_1 = require("./status-report");
const tools_features_1 = require("./tools-features");
const util_1 = require("./util");
const workflow_1 = require("./workflow");
async function sendCompletedStatusReport(startedAt, config, configFile, toolsDownloadStatusReport, toolsFeatureFlagsValid, toolsSource, toolsVersion, logger, error) {
async function sendCompletedStatusReport(startedAt, config, configFile, toolsDownloadStatusReport, toolsFeatureFlagsValid, toolsSource, toolsVersion, overlayBaseDatabaseStats, logger, error) {
const statusReportBase = await (0, status_report_1.createStatusReportBase)(status_report_1.ActionName.Init, (0, status_report_1.getActionsStatus)(error), startedAt, config, await (0, util_1.checkDiskUsage)(logger), logger, error?.message, error?.stack);
if (statusReportBase === undefined) {
return;
@@ -126,6 +126,8 @@ async function sendCompletedStatusReport(startedAt, config, configFile, toolsDow
trap_cache_languages: Object.keys(config.trapCaches).join(","),
trap_cache_download_size_bytes: Math.round(await (0, caching_utils_1.getTotalCacheSize)(Object.values(config.trapCaches), logger)),
trap_cache_download_duration_ms: Math.round(config.trapCacheDownloadTime),
overlay_base_database_download_size_bytes: overlayBaseDatabaseStats?.databaseSizeBytes,
overlay_base_database_download_duration_ms: overlayBaseDatabaseStats?.databaseDownloadDurationMs,
query_filters: JSON.stringify(config.originalUserInput["query-filters"] ?? []),
registries: JSON.stringify(configUtils.parseRegistriesWithoutCredentials((0, actions_util_1.getOptionalInput)("registries")) ?? []),
};
@@ -232,6 +234,7 @@ async function run() {
}
return;
}
let overlayBaseDatabaseStats;
try {
if (config.augmentationProperties.overlayDatabaseMode ===
overlay_database_utils_1.OverlayDatabaseMode.Overlay &&
@@ -247,8 +250,8 @@ async function run() {
// necessary preparations. So, in that mode, we would assume that
// everything is in order and let the analysis fail if that turns out not
// to be the case.
const overlayDatabaseDownloaded = await (0, overlay_database_utils_1.downloadOverlayBaseDatabaseFromCache)(codeql, config, logger);
if (!overlayDatabaseDownloaded) {
overlayBaseDatabaseStats = await (0, overlay_database_utils_1.downloadOverlayBaseDatabaseFromCache)(codeql, config, logger);
if (!overlayBaseDatabaseStats) {
config.augmentationProperties.overlayDatabaseMode =
overlay_database_utils_1.OverlayDatabaseMode.None;
logger.info("No overlay-base database found in cache, " +
@@ -443,13 +446,13 @@ async function run() {
const error = (0, util_1.wrapError)(unwrappedError);
core.setFailed(error.message);
await sendCompletedStatusReport(startedAt, config, undefined, // We only report config info on success.
toolsDownloadStatusReport, toolsFeatureFlagsValid, toolsSource, toolsVersion, logger, error);
toolsDownloadStatusReport, toolsFeatureFlagsValid, toolsSource, toolsVersion, overlayBaseDatabaseStats, logger, error);
return;
}
finally {
(0, diagnostics_1.logUnwrittenDiagnostics)();
}
await sendCompletedStatusReport(startedAt, config, configFile, toolsDownloadStatusReport, toolsFeatureFlagsValid, toolsSource, toolsVersion, logger);
await sendCompletedStatusReport(startedAt, config, configFile, toolsDownloadStatusReport, toolsFeatureFlagsValid, toolsSource, toolsVersion, overlayBaseDatabaseStats, logger);
}
function getTrapCachingEnabled() {
// If the workflow specified something always respect that

File diff suppressed because one or more lines are too long

View File

@@ -213,51 +213,67 @@ async function uploadOverlayBaseDatabaseToCache(codeql, config, logger) {
* @param codeql The CodeQL instance
* @param config The configuration object
* @param logger The logger instance
* @returns A promise that resolves to true if the download was performed and
* successfully completed, or false otherwise
* @returns A promise that resolves to download statistics if an overlay-base
* database was successfully downloaded, or undefined if the download was
* either not performed or failed.
*/
async function downloadOverlayBaseDatabaseFromCache(codeql, config, logger) {
const overlayDatabaseMode = config.augmentationProperties.overlayDatabaseMode;
if (overlayDatabaseMode !== OverlayDatabaseMode.Overlay) {
logger.debug(`Overlay database mode is ${overlayDatabaseMode}. ` +
"Skip downloading overlay-base database from cache.");
return false;
return undefined;
}
if (!config.augmentationProperties.useOverlayDatabaseCaching) {
logger.debug("Overlay database caching is disabled. " +
"Skip downloading overlay-base database from cache.");
return false;
return undefined;
}
if ((0, util_1.isInTestMode)()) {
logger.debug("In test mode. Skip downloading overlay-base database from cache.");
return false;
return undefined;
}
const dbLocation = config.dbLocation;
const codeQlVersion = (await codeql.getVersion()).version;
const restoreKey = getCacheRestoreKey(config, codeQlVersion);
logger.info(`Looking in Actions cache for overlay-base database with restore key ${restoreKey}`);
let databaseDownloadDurationMs = 0;
try {
const databaseDownloadStart = performance.now();
const foundKey = await (0, util_1.withTimeout)(MAX_CACHE_OPERATION_MS, actionsCache.restoreCache([dbLocation], restoreKey), () => {
logger.info("Timed out downloading overlay-base database from cache");
});
databaseDownloadDurationMs = Math.round(performance.now() - databaseDownloadStart);
if (foundKey === undefined) {
logger.info("No overlay-base database found in Actions cache");
return false;
return undefined;
}
logger.info(`Downloaded overlay-base database in cache with key ${foundKey}`);
}
catch (error) {
logger.warning("Failed to download overlay-base database from cache: " +
`${error instanceof Error ? error.message : String(error)}`);
return false;
return undefined;
}
const databaseIsValid = checkOverlayBaseDatabase(config, logger, "Downloaded overlay-base database is invalid");
if (!databaseIsValid) {
logger.warning("Downloaded overlay-base database failed validation");
return false;
return undefined;
}
const databaseSizeBytes = await (0, util_1.tryGetFolderBytes)(dbLocation, logger);
if (databaseSizeBytes === undefined) {
logger.info("Filesystem error while accessing downloaded overlay-base database");
// The problem that warrants reporting download failure is not that we are
// unable to determine the size of the database. Rather, it is that we
// encountered a filesystem error while accessing the database, which
// indicates that an overlay analysis will likely fail.
return undefined;
}
logger.info(`Successfully downloaded overlay-base database to ${dbLocation}`);
return true;
return {
databaseSizeBytes: Math.round(databaseSizeBytes),
databaseDownloadDurationMs,
};
}
async function generateCacheKey(config, codeQlVersion, checkoutPath) {
const sha = await (0, git_utils_1.getCommitOid)(checkoutPath);
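
With the boolean return value replaced by { databaseSizeBytes, databaseDownloadDurationMs } | undefined, callers get failure detection and telemetry from one value; the init-action diff above uses exactly this shape. A sketch of the caller side (the "None" literal and the log message stand in for the real OverlayDatabaseMode handling):

// Illustrative caller: undefined still means "revert to a full, non-overlay
// analysis", while a stats object doubles as the status-report payload.
async function prepareOverlayDatabase(codeql, config, logger) {
    const stats = await downloadOverlayBaseDatabaseFromCache(codeql, config, logger);
    if (!stats) {
        config.augmentationProperties.overlayDatabaseMode = "None"; // i.e. OverlayDatabaseMode.None
        // Log message paraphrased; the real one is truncated in this diff.
        logger.info("No overlay-base database found in cache, " +
            "continuing without overlay analysis.");
        return undefined;
    }
    return {
        overlay_base_database_download_size_bytes: stats.databaseSizeBytes,
        overlay_base_database_download_duration_ms: stats.databaseDownloadDurationMs,
    };
}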

File diff suppressed because one or more lines are too long

View File

@@ -38,6 +38,7 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
Object.defineProperty(exports, "__esModule", { value: true });
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const actionsCache = __importStar(require("@actions/cache"));
const ava_1 = __importDefault(require("ava"));
const sinon = __importStar(require("sinon"));
const actionsUtil = __importStar(require("./actions-util"));
@@ -45,6 +46,7 @@ const gitUtils = __importStar(require("./git-utils"));
const logging_1 = require("./logging");
const overlay_database_utils_1 = require("./overlay-database-utils");
const testing_utils_1 = require("./testing-utils");
const utils = __importStar(require("./util"));
const util_1 = require("./util");
(0, testing_utils_1.setupTests)(ava_1.default);
(0, ava_1.default)("writeOverlayChangesFile generates correct changes file", async (t) => {
@@ -91,4 +93,93 @@ const util_1 = require("./util");
t.deepEqual(parsedContent.changes.sort(), ["added.js", "deleted.js", "modified.js"], "Should identify added, deleted, and modified files");
});
});
const defaultDownloadTestCase = {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: true,
isInTestMode: false,
restoreCacheResult: "cache-key",
hasBaseDatabaseOidsFile: true,
tryGetFolderBytesSucceeds: true,
codeQLVersion: "2.20.5",
};
const testDownloadOverlayBaseDatabaseFromCache = ava_1.default.macro({
exec: async (t, _title, partialTestCase, expectDownloadSuccess) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
const dbLocation = path.join(tmpDir, "db");
await fs.promises.mkdir(dbLocation, { recursive: true });
const logger = (0, logging_1.getRunnerLogger)(true);
const config = (0, testing_utils_1.createTestConfig)({ dbLocation });
const testCase = { ...defaultDownloadTestCase, ...partialTestCase };
config.augmentationProperties.overlayDatabaseMode =
testCase.overlayDatabaseMode;
config.augmentationProperties.useOverlayDatabaseCaching =
testCase.useOverlayDatabaseCaching;
if (testCase.hasBaseDatabaseOidsFile) {
const baseDatabaseOidsFile = path.join(dbLocation, "base-database-oids.json");
await fs.promises.writeFile(baseDatabaseOidsFile, JSON.stringify({}));
}
const stubs = [];
const isInTestModeStub = sinon
.stub(utils, "isInTestMode")
.returns(testCase.isInTestMode);
stubs.push(isInTestModeStub);
if (testCase.restoreCacheResult instanceof Error) {
const restoreCacheStub = sinon
.stub(actionsCache, "restoreCache")
.rejects(testCase.restoreCacheResult);
stubs.push(restoreCacheStub);
}
else {
const restoreCacheStub = sinon
.stub(actionsCache, "restoreCache")
.resolves(testCase.restoreCacheResult);
stubs.push(restoreCacheStub);
}
const tryGetFolderBytesStub = sinon
.stub(utils, "tryGetFolderBytes")
.resolves(testCase.tryGetFolderBytesSucceeds ? 1024 * 1024 : undefined);
stubs.push(tryGetFolderBytesStub);
try {
const result = await (0, overlay_database_utils_1.downloadOverlayBaseDatabaseFromCache)((0, testing_utils_1.mockCodeQLVersion)(testCase.codeQLVersion), config, logger);
if (expectDownloadSuccess) {
t.truthy(result);
}
else {
t.is(result, undefined);
}
}
finally {
for (const stub of stubs) {
stub.restore();
}
}
});
},
title: (_, title) => `downloadOverlayBaseDatabaseFromCache: ${title}`,
});
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns stats when successful", {}, true);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when mode is OverlayDatabaseMode.OverlayBase", {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.OverlayBase,
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when mode is OverlayDatabaseMode.None", {
overlayDatabaseMode: overlay_database_utils_1.OverlayDatabaseMode.None,
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when caching is disabled", {
useOverlayDatabaseCaching: false,
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined in test mode", {
isInTestMode: true,
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when cache miss", {
restoreCacheResult: undefined,
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when download fails", {
restoreCacheResult: new Error("Download failed"),
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when downloaded database is invalid", {
hasBaseDatabaseOidsFile: false,
}, false);
(0, ava_1.default)(testDownloadOverlayBaseDatabaseFromCache, "returns undefined when filesystem error occurs", {
tryGetFolderBytesSucceeds: false,
}, false);
//# sourceMappingURL=overlay-database-utils.test.js.map
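
The download tests above follow a tidy stub-bookkeeping pattern: every sinon stub is pushed onto an array, and a finally block restores them all so a failing assertion cannot leak stubs into later tests. The pattern in isolation (withStubs is a hypothetical helper, not part of testing-utils):

const sinon = require("sinon");

// Run `body` with whatever stubs `setup` registers, restoring them even on failure.
async function withStubs(setup, body) {
    const stubs = [];
    try {
        setup(stubs); // setup pushes each stub it creates onto `stubs`
        await body();
    }
    finally {
        for (const stub of stubs) {
            stub.restore();
        }
    }
}

// Usage sketch against a hypothetical `utils` module:
// await withStubs(
//     (stubs) => stubs.push(sinon.stub(utils, "isInTestMode").returns(true)),
//     async () => { /* assertions */ },
// );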

File diff suppressed because one or more lines are too long

View File

@@ -40,14 +40,13 @@ const path = __importStar(require("path"));
const ava_1 = __importDefault(require("ava"));
const sinon = __importStar(require("sinon"));
const actionsUtil = __importStar(require("./actions-util"));
const feature_flags_test_1 = require("./feature-flags.test");
const logging_1 = require("./logging");
const setupCodeql = __importStar(require("./setup-codeql"));
const testing_utils_1 = require("./testing-utils");
const util_1 = require("./util");
(0, testing_utils_1.setupTests)(ava_1.default);
// TODO: Remove when we no longer need to pass in features (https://github.com/github/codeql-action/issues/2600)
const expectedFeatureEnablement = (0, feature_flags_test_1.initializeFeatures)(true);
const expectedFeatureEnablement = (0, testing_utils_1.initializeFeatures)(true);
expectedFeatureEnablement.getValue = function (feature) {
// eslint-disable-next-line @typescript-eslint/no-unsafe-return
return expectedFeatureEnablement[feature];

File diff suppressed because one or more lines are too long

43
lib/testing-utils.js generated
View File

@@ -41,9 +41,11 @@ exports.setupTests = setupTests;
exports.setupActionsVars = setupActionsVars;
exports.getRecordingLogger = getRecordingLogger;
exports.mockFeatureFlagApiEndpoint = mockFeatureFlagApiEndpoint;
exports.stubFeatureFlagApiEndpoint = stubFeatureFlagApiEndpoint;
exports.mockLanguagesInRepo = mockLanguagesInRepo;
exports.mockCodeQLVersion = mockCodeQLVersion;
exports.createFeatures = createFeatures;
exports.initializeFeatures = initializeFeatures;
exports.mockBundleDownloadApi = mockBundleDownloadApi;
exports.createTestConfig = createTestConfig;
const node_util_1 = require("node:util");
@@ -54,6 +56,7 @@ const sinon = __importStar(require("sinon"));
const apiClient = __importStar(require("./api-client"));
const codeql = __importStar(require("./codeql"));
const defaults = __importStar(require("./defaults.json"));
const feature_flags_1 = require("./feature-flags");
const util_1 = require("./util");
exports.SAMPLE_DOTCOM_API_DETAILS = {
auth: "token",
@@ -172,21 +175,32 @@ function getRecordingLogger(messages) {
}
/** Mock the HTTP request to the feature flags enablement API endpoint. */
function mockFeatureFlagApiEndpoint(responseStatusCode, response) {
stubFeatureFlagApiEndpoint(() => ({
status: responseStatusCode,
messageIfError: "some error message",
data: response,
}));
}
/** Stub the HTTP request to the feature flags enablement API endpoint. */
function stubFeatureFlagApiEndpoint(responseFunction) {
// Passing an auth token is required, so we just use a dummy value
const client = github.getOctokit("123");
const requestSpy = sinon.stub(client, "request");
const optInSpy = requestSpy.withArgs("GET /repos/:owner/:repo/code-scanning/codeql-action/features");
if (responseStatusCode < 300) {
optInSpy.resolves({
status: responseStatusCode,
data: response,
headers: {},
url: "GET /repos/:owner/:repo/code-scanning/codeql-action/features",
});
}
else {
optInSpy.throws(new util_1.HTTPError("some error message", responseStatusCode));
}
optInSpy.callsFake((_route, params) => {
const response = responseFunction(params);
if (response.status < 300) {
return Promise.resolve({
status: response.status,
data: response.data,
headers: {},
url: "GET /repos/:owner/:repo/code-scanning/codeql-action/features",
});
}
else {
throw new util_1.HTTPError(response.messageIfError || "default stub error message", response.status);
}
});
sinon.stub(apiClient, "getApiClient").value(() => client);
}
function mockLanguagesInRepo(languages) {
@@ -240,6 +254,12 @@ function createFeatures(enabledFeatures) {
},
};
}
function initializeFeatures(initialValue) {
return Object.keys(feature_flags_1.featureConfig).reduce((features, key) => {
features[key] = initialValue;
return features;
}, {});
}
/**
* Mocks the API for downloading the bundle tagged `tagName`.
*
@@ -282,6 +302,7 @@ function createTestConfig(overrides) {
augmentationProperties: {
packsInputCombines: false,
queriesInputCombines: false,
extraQueryExclusions: [],
},
trapCaches: {},
trapCacheDownloadTime: 0,
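
stubFeatureFlagApiEndpoint generalizes the old status-code mock: callers supply a function from the request parameters to { status, messageIfError, data }, and mockFeatureFlagApiEndpoint is now a thin wrapper over it. Condensed from the 25-feature test earlier in this diff:

const { stubFeatureFlagApiEndpoint } = require("./testing-utils");

// Reject any request that asks for more than 25 features; otherwise return
// an empty flag set, as in the chunking test above.
stubFeatureFlagApiEndpoint((request) => {
    const requestedFeatures = request.features.split(",");
    return {
        status: requestedFeatures.length <= 25 ? 200 : 400,
        messageIfError: "Can request a maximum of 25 features.",
        data: {},
    };
});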

File diff suppressed because one or more lines are too long

19
lib/upload-lib.js generated
View File

@@ -57,9 +57,7 @@ const zlib_1 = __importDefault(require("zlib"));
const core = __importStar(require("@actions/core"));
const file_url_1 = __importDefault(require("file-url"));
const jsonschema = __importStar(require("jsonschema"));
const semver = __importStar(require("semver"));
const actionsUtil = __importStar(require("./actions-util"));
const actions_util_1 = require("./actions-util");
const api = __importStar(require("./api-client"));
const api_client_1 = require("./api-client");
const codeql_1 = require("./codeql");
@@ -140,7 +138,7 @@ function areAllRunsUnique(sarifObjects) {
async function shouldShowCombineSarifFilesDeprecationWarning(sarifObjects, githubVersion) {
// Do not show this warning on GHES versions before 3.14.0
if (githubVersion.type === util_1.GitHubVariant.GHES &&
semver.lt(githubVersion.version, "3.14.0")) {
(0, util_1.satisfiesGHESVersion)(githubVersion.version, "<3.14", true)) {
return false;
}
// Only give a deprecation warning when not all runs are unique and
@@ -152,15 +150,14 @@ async function throwIfCombineSarifFilesDisabled(sarifObjects, features, githubVe
if (!(await shouldDisableCombineSarifFiles(sarifObjects, features, githubVersion))) {
return;
}
// TODO: Update this changelog URL to the correct one when it's published.
const deprecationMoreInformationMessage = "For more information, see https://github.blog/changelog/2024-05-06-code-scanning-will-stop-combining-runs-from-a-single-upload";
const deprecationMoreInformationMessage = "For more information, see https://github.blog/changelog/2025-07-21-code-scanning-will-stop-combining-multiple-sarif-runs-uploaded-in-the-same-sarif-file/";
throw new util_1.ConfigurationError(`The CodeQL Action does not support uploading multiple SARIF runs with the same category. Please update your workflow to upload a single run per category. ${deprecationMoreInformationMessage}`);
}
// Checks whether combining SARIF files should be disabled.
async function shouldDisableCombineSarifFiles(sarifObjects, features, githubVersion) {
if (githubVersion.type === util_1.GitHubVariant.GHES) {
// Never block on GHES versions before 3.18.
if (semver.lt(githubVersion.version, "3.18.0-0")) {
if ((0, util_1.satisfiesGHESVersion)(githubVersion.version, "<3.18", true)) {
return false;
}
}
@@ -184,9 +181,6 @@ async function shouldDisableCombineSarifFiles(sarifObjects, features, githubVers
// Returns the contents of the combined sarif file.
async function combineSarifFilesUsingCLI(sarifFiles, gitHubVersion, features, logger) {
logger.info("Combining SARIF files using the CodeQL CLI");
if (sarifFiles.length === 1) {
return JSON.parse(fs.readFileSync(sarifFiles[0], "utf8"));
}
const sarifObjects = sarifFiles.map((sarifFile) => {
return JSON.parse(fs.readFileSync(sarifFile, "utf8"));
});
@@ -216,8 +210,8 @@ async function combineSarifFilesUsingCLI(sarifFiles, gitHubVersion, features, lo
else {
logger.info("Initializing CodeQL since the 'init' Action was not called before this step.");
const apiDetails = {
auth: (0, actions_util_1.getRequiredInput)("token"),
externalRepoAuth: (0, actions_util_1.getOptionalInput)("external-repository-token"),
auth: actionsUtil.getRequiredInput("token"),
externalRepoAuth: actionsUtil.getOptionalInput("external-repository-token"),
url: (0, util_1.getRequiredEnvParam)("GITHUB_SERVER_URL"),
apiURL: (0, util_1.getRequiredEnvParam)("GITHUB_API_URL"),
};
@@ -493,6 +487,8 @@ async function uploadSpecifiedFiles(sarifPaths, checkoutPath, category, features
const sarifPath = sarifPaths[0];
sarif = readSarifFile(sarifPath);
validateSarifFileSchema(sarif, sarifPath, logger);
// Validate that there are no runs for the same category
await throwIfCombineSarifFilesDisabled([sarif], features, gitHubVersion);
}
sarif = filterAlertsByDiffRange(logger, sarif);
sarif = await fingerprints.addFingerprints(sarif, checkoutPath, logger);
@@ -608,6 +604,7 @@ function shouldConsiderConfigurationError(processingErrors) {
const expectedConfigErrors = [
"CodeQL analyses from advanced configurations cannot be processed when the default setup is enabled",
"rejecting delivery as the repository has too many logical alerts",
"A delivery cannot contain multiple runs with the same category",
];
return (processingErrors.length === 1 &&
expectedConfigErrors.some((msg) => processingErrors[0].includes(msg)));
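// Illustrative examples (added for clarity; the argument values are hypothetical):
//   shouldConsiderConfigurationError(["A delivery cannot contain multiple runs with the same category"]) // true: one error, matches the list
//   shouldConsiderConfigurationError(["first error", "second error"]) // false: more than one processing error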

File diff suppressed because one or more lines are too long

26
lib/upload-lib.test.js generated

@@ -236,6 +236,12 @@ ava_1.default.beforeEach(() => {
version: "3.14.0",
}));
});
(0, ava_1.default)("shouldShowCombineSarifFilesDeprecationWarning when on GHES 3.16 pre", async (t) => {
t.true(await uploadLib.shouldShowCombineSarifFilesDeprecationWarning([createMockSarif("abc", "def"), createMockSarif("abc", "def")], {
type: util_1.GitHubVariant.GHES,
version: "3.16.0.pre1",
}));
});
(0, ava_1.default)("shouldShowCombineSarifFilesDeprecationWarning with only 1 run", async (t) => {
t.false(await uploadLib.shouldShowCombineSarifFilesDeprecationWarning([createMockSarif("abc", "def")], {
type: util_1.GitHubVariant.DOTCOM,
@@ -260,7 +266,9 @@ ava_1.default.beforeEach(() => {
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on dotcom with feature flag", async (t) => {
await t.throwsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.DisableCombineSarifFiles]), {
type: util_1.GitHubVariant.DOTCOM,
}));
}), {
message: /The CodeQL Action does not support uploading multiple SARIF runs with the same category/,
});
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on dotcom without feature flag", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
@@ -289,18 +297,30 @@ ava_1.default.beforeEach(() => {
await t.throwsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.18.0.pre1",
}));
}), {
message: /The CodeQL Action does not support uploading multiple SARIF runs with the same category/,
});
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.18 alpha", async (t) => {
await t.throwsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.18.0-alpha.1",
}));
}), {
message: /The CodeQL Action does not support uploading multiple SARIF runs with the same category/,
});
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled when on GHES 3.18", async (t) => {
await t.throwsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "3.18.0",
}), {
message: /The CodeQL Action does not support uploading multiple SARIF runs with the same category/,
});
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled with an invalid GHES version", async (t) => {
await t.notThrowsAsync(uploadLib.throwIfCombineSarifFilesDisabled([createMockSarif("abc", "def"), createMockSarif("abc", "def")], (0, testing_utils_1.createFeatures)([]), {
type: util_1.GitHubVariant.GHES,
version: "foobar",
}));
});
(0, ava_1.default)("throwIfCombineSarifFilesDisabled with only 1 run", async (t) => {

File diff suppressed because one or more lines are too long

19
lib/util.js generated

@@ -77,6 +77,7 @@ exports.getErrorMessage = getErrorMessage;
exports.prettyPrintPack = prettyPrintPack;
exports.checkDiskUsage = checkDiskUsage;
exports.checkActionVersion = checkActionVersion;
exports.satisfiesGHESVersion = satisfiesGHESVersion;
exports.cloneObject = cloneObject;
exports.checkSipEnablement = checkSipEnablement;
exports.cleanUpGlob = cleanUpGlob;
@@ -887,6 +888,24 @@ function checkActionVersion(version, githubVersion) {
}
}
}
/**
* This will check whether the given GitHub version satisfies the given range,
* taking into account that a range like >=3.18 will also match the GHES 3.18
* pre-release/RC versions.
*
* When the given `githubVersion` is not a GHES version, or if the version
* is invalid, this will return `defaultIfInvalid`.
*/
function satisfiesGHESVersion(ghesVersion, range, defaultIfInvalid) {
const semverVersion = semver.coerce(ghesVersion);
if (semverVersion === null) {
return defaultIfInvalid;
}
// We always drop the pre-release part of the version, since anything that
// applies to GHES 3.18.0 should also apply to GHES 3.18.0.pre1.
semverVersion.prerelease = [];
return semver.satisfies(semverVersion, range);
}
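// Illustrative examples (added for clarity, not part of the diff):
//   satisfiesGHESVersion("3.18.0.pre1", ">=3.18", false) // true: "3.18.0.pre1" coerces to 3.18.0
//   satisfiesGHESVersion("3.17.5", "<3.18", false)       // true
//   satisfiesGHESVersion("foobar", "<3.18", true)        // true: unparseable, so defaultIfInvalid is returned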
/**
* Supported build modes.
*

File diff suppressed because one or more lines are too long

3
lib/util.test.js generated

@@ -240,9 +240,10 @@ const shortTime = 10;
(0, ava_1.default)("withTimeout on long task", async (t) => {
let longTaskTimedOut = false;
const longTask = new Promise((resolve) => {
setTimeout(() => {
const timer = setTimeout(() => {
resolve(42);
}, longTime);
t.teardown(() => clearTimeout(timer));
});
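// Keeping a handle on the timer and clearing it in the test teardown stops the
// pending long-running setTimeout from firing after the test has completed.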
const result = await util.withTimeout(shortTime, longTask, () => {
longTaskTimedOut = true;

File diff suppressed because one or more lines are too long

1
node_modules/.bin/nft generated vendored Symbolic link

@@ -0,0 +1 @@
../@vercel/nft/out/cli.js

1
node_modules/.bin/node-gyp-build generated vendored Symbolic link

@@ -0,0 +1 @@
../node-gyp-build/bin.js

1
node_modules/.bin/node-gyp-build-optional generated vendored Symbolic link

@@ -0,0 +1 @@
../node-gyp-build/optional.js

1
node_modules/.bin/node-gyp-build-test generated vendored Symbolic link

@@ -0,0 +1 @@
../node-gyp-build/build-test.js

1
node_modules/.bin/node-pre-gyp generated vendored Symbolic link

@@ -0,0 +1 @@
../@mapbox/node-pre-gyp/bin/node-pre-gyp

1
node_modules/.bin/nopt generated vendored Symbolic link

@@ -0,0 +1 @@
../nopt/bin/nopt.js

1570
node_modules/.package-lock.json generated vendored

File diff suppressed because it is too large

253
node_modules/@ava/typescript/index.js generated vendored

@@ -4,8 +4,8 @@ import {pathToFileURL} from 'node:url';
import escapeStringRegexp from 'escape-string-regexp';
import {execa} from 'execa';
const pkg = JSON.parse(fs.readFileSync(new URL('package.json', import.meta.url)));
const help = `See https://github.com/avajs/typescript/blob/v${pkg.version}/README.md`;
const package_ = JSON.parse(fs.readFileSync(new URL('package.json', import.meta.url)));
const help = `See https://github.com/avajs/typescript/blob/v${package_.version}/README.md`;
function isPlainObject(x) {
return x !== null && typeof x === 'object' && Reflect.getPrototypeOf(x) === Object.prototype;
@@ -36,8 +36,8 @@ function validate(target, properties) {
}
}
async function compileTypeScript(projectDir) {
return execa('tsc', ['--incremental'], {preferLocal: true, cwd: projectDir});
async function compileTypeScript(projectDirectory) {
return execa({preferLocal: true, cwd: projectDirectory})`tsc --incremental`;
}
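// Note: the tagged-template call above is the execa v9 form; it is equivalent to
// the removed execa('tsc', ['--incremental'], {preferLocal: true, cwd: projectDirectory}) call.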
const configProperties = {
@@ -62,7 +62,7 @@ const configProperties = {
isValid(extensions) {
return Array.isArray(extensions)
&& extensions.length > 0
&& extensions.every(ext => typeof ext === 'string' && ext !== '')
&& extensions.every(extension => typeof extension === 'string' && extension !== '')
&& new Set(extensions).size === extensions.length;
},
},
@@ -75,7 +75,7 @@ const changeInterpretations = Object.freeze(Object.assign(Object.create(null), {
}));
export default function typescriptProvider({negotiateProtocol}) {
const protocol = negotiateProtocol(['ava-6', 'ava-3.2'], {version: pkg.version});
const protocol = negotiateProtocol(['ava-6'], {version: package_.version});
if (protocol === null) {
return;
}
@@ -98,143 +98,112 @@ export default function typescriptProvider({negotiateProtocol}) {
path.join(protocol.projectDir, from),
path.join(protocol.projectDir, to),
]);
const testFileExtension = new RegExp(`\\.(${extensions.map(ext => escapeStringRegexp(ext)).join('|')})$`);
const testFileExtension = new RegExp(`\\.(${extensions.map(extension => escapeStringRegexp(extension)).join('|')})$`);
const watchMode = protocol.identifier === 'ava-3.2'
? {
ignoreChange(filePath) {
if (!testFileExtension.test(filePath)) {
return false;
}
return rewritePaths.some(([from]) => filePath.startsWith(from));
},
resolveTestFile(testfile) { // Used under AVA 3.2 protocol by legacy watcher implementation.
if (!testFileExtension.test(testfile)) {
return testfile;
}
const rewrite = rewritePaths.find(([from]) => testfile.startsWith(from));
if (rewrite === undefined) {
return testfile;
}
const [from, to] = rewrite;
let newExtension = '.js';
if (testfile.endsWith('.cts')) {
newExtension = '.cjs';
} else if (testfile.endsWith('.mts')) {
newExtension = '.mjs';
}
return `${to}${testfile.slice(from.length)}`.replace(testFileExtension, newExtension);
},
}
: {
changeInterpretations,
interpretChange(filePath) {
if (config.compile === false) {
for (const [from] of rewritePaths) {
if (testFileExtension.test(filePath) && filePath.startsWith(from)) {
return changeInterpretations.waitForOutOfBandCompilation;
}
const watchMode = {
changeInterpretations,
interpretChange(filePath) {
if (config.compile === false) {
for (const [from] of rewritePaths) {
if (testFileExtension.test(filePath) && filePath.startsWith(from)) {
return changeInterpretations.waitForOutOfBandCompilation;
}
}
}
if (config.compile === 'tsc') {
for (const [, to] of rewritePaths) {
if (filePath.startsWith(to)) {
return changeInterpretations.ignoreCompiled;
}
if (config.compile === 'tsc') {
for (const [, to] of rewritePaths) {
if (filePath.startsWith(to)) {
return changeInterpretations.ignoreCompiled;
}
}
}
return changeInterpretations.unspecified;
},
resolvePossibleOutOfBandCompilationSources(filePath) {
if (config.compile !== false) {
return null;
}
// Only recognize .cjs, .mjs and .js files.
if (!/\.(c|m)?js$/.test(filePath)) {
return null;
}
for (const [from, to] of rewritePaths) {
if (!filePath.startsWith(to)) {
continue;
}
const rewritten = `${from}${filePath.slice(to.length)}`;
const possibleExtensions = [];
if (filePath.endsWith('.cjs')) {
if (extensions.includes('cjs')) {
possibleExtensions.push({replace: /\.cjs$/, extension: 'cjs'});
}
if (extensions.includes('cts')) {
possibleExtensions.push({replace: /\.cjs$/, extension: 'cts'});
}
if (possibleExtensions.length === 0) {
return null;
}
}
if (filePath.endsWith('.mjs')) {
if (extensions.includes('mjs')) {
possibleExtensions.push({replace: /\.mjs$/, extension: 'mjs'});
}
if (extensions.includes('mts')) {
possibleExtensions.push({replace: /\.mjs$/, extension: 'mts'});
}
if (possibleExtensions.length === 0) {
return null;
}
}
if (filePath.endsWith('.js')) {
if (extensions.includes('js')) {
possibleExtensions.push({replace: /\.js$/, extension: 'js'});
}
if (extensions.includes('ts')) {
possibleExtensions.push({replace: /\.js$/, extension: 'ts'});
}
if (extensions.includes('tsx')) {
possibleExtensions.push({replace: /\.js$/, extension: 'tsx'});
}
if (possibleExtensions.length === 0) {
return null;
}
}
const possibleDeletedFiles = [];
for (const {replace, extension} of possibleExtensions) {
const possibleFilePath = rewritten.replace(replace, `.${extension}`);
// Pick the first file path that exists.
if (fs.existsSync(possibleFilePath)) {
return [possibleFilePath];
}
possibleDeletedFiles.push(possibleFilePath);
}
return possibleDeletedFiles;
}
return changeInterpretations.unspecified;
},
resolvePossibleOutOfBandCompilationSources(filePath) {
if (config.compile !== false) {
return null;
},
};
}
// Only recognize .cjs, .mjs and .js files.
if (!/\.(c|m)?js$/.test(filePath)) {
return null;
}
for (const [from, to] of rewritePaths) {
if (!filePath.startsWith(to)) {
continue;
}
const rewritten = `${from}${filePath.slice(to.length)}`;
const possibleExtensions = [];
if (filePath.endsWith('.cjs')) {
if (extensions.includes('cjs')) {
possibleExtensions.push({replace: /\.cjs$/, extension: 'cjs'});
}
if (extensions.includes('cts')) {
possibleExtensions.push({replace: /\.cjs$/, extension: 'cts'});
}
if (possibleExtensions.length === 0) {
return null;
}
}
if (filePath.endsWith('.mjs')) {
if (extensions.includes('mjs')) {
possibleExtensions.push({replace: /\.mjs$/, extension: 'mjs'});
}
if (extensions.includes('mts')) {
possibleExtensions.push({replace: /\.mjs$/, extension: 'mts'});
}
if (possibleExtensions.length === 0) {
return null;
}
}
if (filePath.endsWith('.js')) {
if (extensions.includes('js')) {
possibleExtensions.push({replace: /\.js$/, extension: 'js'});
}
if (extensions.includes('ts')) {
possibleExtensions.push({replace: /\.js$/, extension: 'ts'});
}
if (extensions.includes('tsx')) {
possibleExtensions.push({replace: /\.js$/, extension: 'tsx'});
}
if (possibleExtensions.length === 0) {
return null;
}
}
const possibleDeletedFiles = [];
for (const {replace, extension} of possibleExtensions) {
const possibleFilePath = rewritten.replace(replace, `.${extension}`);
// Pick the first file path that exists.
if (fs.existsSync(possibleFilePath)) {
return [possibleFilePath];
}
possibleDeletedFiles.push(possibleFilePath);
}
return possibleDeletedFiles;
}
return null;
},
};
return {
...watchMode,
@@ -276,21 +245,21 @@ export default function typescriptProvider({negotiateProtocol}) {
worker({extensionsToLoadAsModules, state: {extensions, rewritePaths}}) {
const importJs = extensionsToLoadAsModules.includes('js');
const testFileExtension = new RegExp(`\\.(${extensions.map(ext => escapeStringRegexp(ext)).join('|')})$`);
const testFileExtension = new RegExp(`\\.(${extensions.map(extension => escapeStringRegexp(extension)).join('|')})$`);
return {
canLoad(ref) {
return testFileExtension.test(ref) && rewritePaths.some(([from]) => ref.startsWith(from));
canLoad(reference) {
return testFileExtension.test(reference) && rewritePaths.some(([from]) => reference.startsWith(from));
},
async load(ref, {requireFn}) {
const [from, to] = rewritePaths.find(([from]) => ref.startsWith(from));
let rewritten = `${to}${ref.slice(from.length)}`;
async load(reference, {requireFn}) {
const [from, to] = rewritePaths.find(([from]) => reference.startsWith(from));
let rewritten = `${to}${reference.slice(from.length)}`;
let useImport = true;
if (ref.endsWith('.cts')) {
if (reference.endsWith('.cts')) {
rewritten = rewritten.replace(/\.cts$/, '.cjs');
useImport = false;
} else if (ref.endsWith('.mts')) {
} else if (reference.endsWith('.mts')) {
rewritten = rewritten.replace(/\.mts$/, '.mjs');
} else {
rewritten = rewritten.replace(testFileExtension, '.js');


@@ -1,9 +1,9 @@
{
"name": "@ava/typescript",
"version": "4.1.0",
"version": "6.0.0",
"description": "TypeScript provider for AVA",
"engines": {
"node": "^14.19 || ^16.15 || ^18 || ^20"
"node": "^20.8 || ^22 || >=24"
},
"files": [
"index.js"
@@ -24,36 +24,17 @@
},
"dependencies": {
"escape-string-regexp": "^5.0.0",
"execa": "^7.1.1"
"execa": "^9.6.0"
},
"devDependencies": {
"ava": "^5.3.1",
"c8": "^8.0.0",
"del": "^7.0.0",
"typescript": "^5.1.3",
"xo": "^0.54.2"
"ava": "^6.4.0",
"c8": "^10.1.3",
"del": "^8.0.0",
"typescript": "^5.8.3",
"xo": "^1.1.0"
},
"c8": {
"reporter": [
"html",
"lcov",
"text"
]
},
"ava": {
"files": [
"!test/broken-fixtures/**"
],
"ignoredByWatcher": [
"test/fixtures/**",
"test/broken-fixtures/**"
],
"timeout": "60s"
},
"xo": {
"ignores": [
"test/broken-fixtures",
"test/fixtures/**/compiled/**"
]
"volta": {
"node": "22.16.0",
"npm": "11.4.2"
}
}


@@ -17,7 +17,7 @@ yarn add @eslint/compat -D
# or
pnpm install @eslint/compat -D
# or
bun install @eslint/compat -D
bun add @eslint/compat -D
```
For Deno:
@@ -30,10 +30,10 @@ deno add @eslint/compat
This package exports the following functions in both ESM and CommonJS format:
- `fixupRule(rule)` - wraps the given rule in a compatibility layer and returns the result
- `fixupPluginRules(plugin)` - wraps each rule in the given plugin using `fixupRule()` and returns a new object that represents the plugin with the fixed-up rules
- `fixupConfigRules(configs)` - wraps all plugins found in an array of config objects using `fixupPluginRules()`
- `includeIgnoreFile(path)` - reads an ignore file (like `.gitignore`) and converts the patterns into the correct format for the config file
### Fixing Rules
@@ -145,7 +145,9 @@ module.exports = [
### Including Ignore Files
If you were using an alternate ignore file in ESLint v8.x, such as using `--ignore-path .gitignore` on the command line, you can include those patterns programmatically in your config file using the `includeIgnoreFile()` function. For example:
If you were using an alternate ignore file in ESLint v8.x, such as using `--ignore-path .gitignore` on the command line, you can include those patterns programmatically in your config file using the `includeIgnoreFile()` function.
The `includeIgnoreFile()` function also accepts a second optional `name` parameter that allows you to set a custom name for this configuration object. If not specified, it defaults to `"Imported .gitignore patterns"`. For example:
```js
// eslint.config.js - ESM example
@@ -158,7 +160,7 @@ const __dirname = path.dirname(__filename);
const gitignorePath = path.resolve(__dirname, ".gitignore");
export default [
includeIgnoreFile(gitignorePath),
includeIgnoreFile(gitignorePath, "Imported .gitignore patterns"), // second argument is optional.
{
// your overrides
},
@@ -174,7 +176,7 @@ const path = require("node:path");
const gitignorePath = path.resolve(__dirname, ".gitignore");
module.exports = [
includeIgnoreFile(gitignorePath),
includeIgnoreFile(gitignorePath, "Imported .gitignore patterns"), // second argument is optional.
{
// your overrides
},
@@ -187,21 +189,21 @@ module.exports = [
Apache 2.0
## Sponsors
The following companies, organizations, and individuals support ESLint's ongoing maintenance and development. [Become a Sponsor](https://eslint.org/donate) to get your logo on our README and website.
<!-- NOTE: This section is autogenerated. Do not manually edit.-->
<!--sponsorsstart-->
<h3>Platinum Sponsors</h3>
<p><a href="https://automattic.com"><img src="https://images.opencollective.com/automattic/d0ef3e1/logo.png" alt="Automattic" height="128"></a> <a href="https://www.airbnb.com/"><img src="https://images.opencollective.com/airbnb/d327d66/logo.png" alt="Airbnb" height="128"></a></p><h3>Gold Sponsors</h3>
<p><a href="#"><img src="https://images.opencollective.com/guest-bf377e88/avatar.png" alt="Eli Schleifer" height="96"></a> <a href="https://engineering.salesforce.com"><img src="https://images.opencollective.com/salesforce/ca8f997/logo.png" alt="Salesforce" height="96"></a></p><h3>Silver Sponsors</h3>
<p><a href="https://www.jetbrains.com/"><img src="https://images.opencollective.com/jetbrains/fe76f99/logo.png" alt="JetBrains" height="64"></a> <a href="https://liftoff.io/"><img src="https://images.opencollective.com/liftoff/5c4fa84/logo.png" alt="Liftoff" height="64"></a> <a href="https://americanexpress.io"><img src="https://avatars.githubusercontent.com/u/3853301?v=4" alt="American Express" height="64"></a> <a href="https://www.workleap.com"><img src="https://avatars.githubusercontent.com/u/53535748?u=d1e55d7661d724bf2281c1bfd33cb8f99fe2465f&v=4" alt="Workleap" height="64"></a></p><h3>Bronze Sponsors</h3>
<p><a href="https://www.notion.so"><img src="https://images.opencollective.com/notion/bf3b117/logo.png" alt="notion" height="32"></a> <a href="https://www.crosswordsolver.org/anagram-solver/"><img src="https://images.opencollective.com/anagram-solver/2666271/logo.png" alt="Anagram Solver" height="32"></a> <a href="https://icons8.com/"><img src="https://images.opencollective.com/icons8/7fa1641/logo.png" alt="Icons8" height="32"></a> <a href="https://discord.com"><img src="https://images.opencollective.com/discordapp/f9645d9/logo.png" alt="Discord" height="32"></a> <a href="https://www.ignitionapp.com"><img src="https://avatars.githubusercontent.com/u/5753491?v=4" alt="Ignition" height="32"></a> <a href="https://nx.dev"><img src="https://avatars.githubusercontent.com/u/23692104?v=4" alt="Nx" height="32"></a> <a href="https://herocoders.com"><img src="https://avatars.githubusercontent.com/u/37549774?v=4" alt="HeroCoders" height="32"></a> <a href="https://usenextbase.com"><img src="https://avatars.githubusercontent.com/u/145838380?v=4" alt="Nextbase Starter Kit" height="32"></a></p>
<!--sponsorsend-->
<!--techsponsorsstart-->
<h2>Technology Sponsors</h2>
<p><a href="https://netlify.com"><img src="https://raw.githubusercontent.com/eslint/eslint.org/main/src/assets/images/techsponsors/netlify-icon.svg" alt="Netlify" height="32"></a> <a href="https://algolia.com"><img src="https://raw.githubusercontent.com/eslint/eslint.org/main/src/assets/images/techsponsors/algolia-icon.svg" alt="Algolia" height="32"></a> <a href="https://1password.com"><img src="https://raw.githubusercontent.com/eslint/eslint.org/main/src/assets/images/techsponsors/1password-icon.svg" alt="1Password" height="32"></a>
</p>
<!--techsponsorsend-->
## Sponsors
The following companies, organizations, and individuals support ESLint's ongoing maintenance and development. [Become a Sponsor](https://eslint.org/donate)
to get your logo on our READMEs and [website](https://eslint.org/sponsors).
<h3>Diamond Sponsors</h3>
<p><a href="https://www.ag-grid.com/"><img src="https://images.opencollective.com/ag-grid/bec0580/logo.png" alt="AG Grid" height="128"></a></p><h3>Platinum Sponsors</h3>
<p><a href="https://automattic.com"><img src="https://images.opencollective.com/automattic/d0ef3e1/logo.png" alt="Automattic" height="128"></a> <a href="https://www.airbnb.com/"><img src="https://images.opencollective.com/airbnb/d327d66/logo.png" alt="Airbnb" height="128"></a></p><h3>Gold Sponsors</h3>
<p><a href="https://qlty.sh/"><img src="https://images.opencollective.com/qltysh/33d157d/logo.png" alt="Qlty Software" height="96"></a> <a href="https://trunk.io/"><img src="https://images.opencollective.com/trunkio/fb92d60/avatar.png" alt="trunk.io" height="96"></a> <a href="https://shopify.engineering/"><img src="https://avatars.githubusercontent.com/u/8085" alt="Shopify" height="96"></a></p><h3>Silver Sponsors</h3>
<p><a href="https://vite.dev/"><img src="https://images.opencollective.com/vite/e6d15e1/logo.png" alt="Vite" height="64"></a> <a href="https://liftoff.io/"><img src="https://images.opencollective.com/liftoff/5c4fa84/logo.png" alt="Liftoff" height="64"></a> <a href="https://americanexpress.io"><img src="https://avatars.githubusercontent.com/u/3853301" alt="American Express" height="64"></a> <a href="https://stackblitz.com"><img src="https://avatars.githubusercontent.com/u/28635252" alt="StackBlitz" height="64"></a></p><h3>Bronze Sponsors</h3>
<p><a href="https://sentry.io"><img src="https://github.com/getsentry.png" alt="Sentry" height="32"></a> <a href="https://syntax.fm"><img src="https://github.com/syntaxfm.png" alt="Syntax" height="32"></a> <a href="https://cybozu.co.jp/"><img src="https://images.opencollective.com/cybozu/933e46d/logo.png" alt="Cybozu" height="32"></a> <a href="https://www.crosswordsolver.org/anagram-solver/"><img src="https://images.opencollective.com/anagram-solver/2666271/logo.png" alt="Anagram Solver" height="32"></a> <a href="https://icons8.com/"><img src="https://images.opencollective.com/icons8/7fa1641/logo.png" alt="Icons8" height="32"></a> <a href="https://discord.com"><img src="https://images.opencollective.com/discordapp/f9645d9/logo.png" alt="Discord" height="32"></a> <a href="https://www.gitbook.com"><img src="https://avatars.githubusercontent.com/u/7111340" alt="GitBook" height="32"></a> <a href="https://nolebase.ayaka.io"><img src="https://avatars.githubusercontent.com/u/11081491" alt="Neko" height="32"></a> <a href="https://nx.dev"><img src="https://avatars.githubusercontent.com/u/23692104" alt="Nx" height="32"></a> <a href="https://opensource.mercedes-benz.com/"><img src="https://avatars.githubusercontent.com/u/34240465" alt="Mercedes-Benz Group" height="32"></a> <a href="https://herocoders.com"><img src="https://avatars.githubusercontent.com/u/37549774" alt="HeroCoders" height="32"></a> <a href="https://www.lambdatest.com"><img src="https://avatars.githubusercontent.com/u/171592363" alt="LambdaTest" height="32"></a></p>
<h3>Technology Sponsors</h3>
Technology sponsors allow us to use their products and services for free as part of a contribution to the open source ecosystem and our work.
<p><a href="https://netlify.com"><img src="https://raw.githubusercontent.com/eslint/eslint.org/main/src/assets/images/techsponsors/netlify-icon.svg" alt="Netlify" height="32"></a> <a href="https://algolia.com"><img src="https://raw.githubusercontent.com/eslint/eslint.org/main/src/assets/images/techsponsors/algolia-icon.svg" alt="Algolia" height="32"></a> <a href="https://1password.com"><img src="https://raw.githubusercontent.com/eslint/eslint.org/main/src/assets/images/techsponsors/1password-icon.svg" alt="1Password" height="32"></a></p>
<!--sponsorsend-->


@@ -14,8 +14,8 @@ var path = require('node:path');
/** @typedef {import("eslint").ESLint.Plugin} FixupPluginDefinition */
/** @typedef {import("eslint").Rule.RuleModule} FixupRuleDefinition */
/** @typedef {import("eslint").Rule.OldStyleRule} FixupLegacyRuleDefinition */
/** @typedef {import("eslint").Linter.FlatConfig} FixupConfig */
/** @typedef {FixupRuleDefinition["create"]} FixupLegacyRuleDefinition */
/** @typedef {import("eslint").Linter.Config} FixupConfig */
/** @typedef {Array<FixupConfig>} FixupConfigArray */
//-----------------------------------------------------------------------------
@@ -188,6 +188,21 @@ function fixupRule(ruleDefinition) {
create: ruleCreate,
};
// copy `schema` property of function-style rule or top-level `schema` property of object-style rule into `meta` object
// @ts-ignore -- top-level `schema` property was not officially supported for object-style rules so it doesn't exist in types
const { schema } = ruleDefinition;
if (schema) {
if (!newRuleDefinition.meta) {
newRuleDefinition.meta = { schema };
} else {
newRuleDefinition.meta = {
...newRuleDefinition.meta,
// top-level `schema` had precedence over `meta.schema` so it's okay to overwrite `meta.schema` if it exists
schema,
};
}
}
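// Illustrative example (hypothetical rule, added for clarity): a function-style
// rule with a top-level schema keeps its schema after fixing up.
//   const legacyRule = (context) => ({});
//   legacyRule.schema = [{ type: "string" }];
//   fixupRule(legacyRule).meta.schema; // [{ type: "string" }]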
// cache the fixed up rule
fixedUpRuleReplacements.set(ruleDefinition, newRuleDefinition);
fixedUpRules.add(newRuleDefinition);
@@ -270,7 +285,7 @@ function fixupConfigRules(config) {
// Types
//-----------------------------------------------------------------------------
/** @typedef {import("eslint").Linter.FlatConfig} FlatConfig */
/** @typedef {import("eslint").Linter.Config} FlatConfig */
//-----------------------------------------------------------------------------
// Exports
@@ -324,10 +339,11 @@ function convertIgnorePatternToMinimatch(pattern) {
/**
* Reads an ignore file and returns an object with the ignore patterns.
* @param {string} ignoreFilePath The absolute path to the ignore file.
* @param {string} [name] The name of the ignore file config.
* @returns {FlatConfig} An object with an `ignores` property that is an array of ignore patterns.
* @throws {Error} If the ignore file path is not an absolute path.
*/
function includeIgnoreFile(ignoreFilePath) {
function includeIgnoreFile(ignoreFilePath, name) {
if (!path.isAbsolute(ignoreFilePath)) {
throw new Error("The ignore file location must be an absolute path.");
}
@@ -336,7 +352,7 @@ function includeIgnoreFile(ignoreFilePath) {
const lines = ignoreFile.split(/\r?\n/u);
return {
name: "Imported .gitignore patterns",
name: name || "Imported .gitignore patterns",
ignores: lines
.map(line => line.trim())
.filter(line => line && !line.startsWith("#"))


@@ -1,14 +1,14 @@
export type FlatConfig = import("eslint").Linter.FlatConfig;
export type FlatConfig = import("eslint").Linter.Config;
export type FixupPluginDefinition = import("eslint").ESLint.Plugin;
export type FixupRuleDefinition = import("eslint").Rule.RuleModule;
export type FixupLegacyRuleDefinition = import("eslint").Rule.OldStyleRule;
export type FixupConfig = import("eslint").Linter.FlatConfig;
export type FixupLegacyRuleDefinition = FixupRuleDefinition["create"];
export type FixupConfig = import("eslint").Linter.Config;
export type FixupConfigArray = Array<FixupConfig>;
/**
* @fileoverview Ignore file utilities for the compat package.
* @author Nicholas C. Zakas
*/
/** @typedef {import("eslint").Linter.FlatConfig} FlatConfig */
/** @typedef {import("eslint").Linter.Config} FlatConfig */
/**
* Converts an ESLint ignore pattern to a minimatch pattern.
* @param {string} pattern The .eslintignore or .gitignore pattern to convert.
@@ -39,7 +39,8 @@ export function fixupRule(ruleDefinition: FixupRuleDefinition | FixupLegacyRuleD
/**
* Reads an ignore file and returns an object with the ignore patterns.
* @param {string} ignoreFilePath The absolute path to the ignore file.
* @param {string} [name] The name of the ignore file config.
* @returns {FlatConfig} An object with an `ignores` property that is an array of ignore patterns.
* @throws {Error} If the ignore file path is not an absolute path.
*/
export function includeIgnoreFile(ignoreFilePath: string): FlatConfig;
export function includeIgnoreFile(ignoreFilePath: string, name?: string): FlatConfig;


@@ -1,14 +1,14 @@
export type FlatConfig = import("eslint").Linter.FlatConfig;
export type FlatConfig = import("eslint").Linter.Config;
export type FixupPluginDefinition = import("eslint").ESLint.Plugin;
export type FixupRuleDefinition = import("eslint").Rule.RuleModule;
export type FixupLegacyRuleDefinition = import("eslint").Rule.OldStyleRule;
export type FixupConfig = import("eslint").Linter.FlatConfig;
export type FixupLegacyRuleDefinition = FixupRuleDefinition["create"];
export type FixupConfig = import("eslint").Linter.Config;
export type FixupConfigArray = Array<FixupConfig>;
/**
* @fileoverview Ignore file utilities for the compat package.
* @author Nicholas C. Zakas
*/
/** @typedef {import("eslint").Linter.FlatConfig} FlatConfig */
/** @typedef {import("eslint").Linter.Config} FlatConfig */
/**
* Converts an ESLint ignore pattern to a minimatch pattern.
* @param {string} pattern The .eslintignore or .gitignore pattern to convert.
@@ -39,7 +39,8 @@ export function fixupRule(ruleDefinition: FixupRuleDefinition | FixupLegacyRuleD
/**
* Reads an ignore file and returns an object with the ignore patterns.
* @param {string} ignoreFilePath The absolute path to the ignore file.
* @param {string} [name] The name of the ignore file config.
* @returns {FlatConfig} An object with an `ignores` property that is an array of ignore patterns.
* @throws {Error} If the ignore file path is not an absolute path.
*/
export function includeIgnoreFile(ignoreFilePath: string): FlatConfig;
export function includeIgnoreFile(ignoreFilePath: string, name?: string): FlatConfig;


@@ -13,8 +13,8 @@ import path from 'node:path';
/** @typedef {import("eslint").ESLint.Plugin} FixupPluginDefinition */
/** @typedef {import("eslint").Rule.RuleModule} FixupRuleDefinition */
/** @typedef {import("eslint").Rule.OldStyleRule} FixupLegacyRuleDefinition */
/** @typedef {import("eslint").Linter.FlatConfig} FixupConfig */
/** @typedef {FixupRuleDefinition["create"]} FixupLegacyRuleDefinition */
/** @typedef {import("eslint").Linter.Config} FixupConfig */
/** @typedef {Array<FixupConfig>} FixupConfigArray */
//-----------------------------------------------------------------------------
@@ -187,6 +187,21 @@ function fixupRule(ruleDefinition) {
create: ruleCreate,
};
// copy `schema` property of function-style rule or top-level `schema` property of object-style rule into `meta` object
// @ts-ignore -- top-level `schema` property was not officially supported for object-style rules so it doesn't exist in types
const { schema } = ruleDefinition;
if (schema) {
if (!newRuleDefinition.meta) {
newRuleDefinition.meta = { schema };
} else {
newRuleDefinition.meta = {
...newRuleDefinition.meta,
// top-level `schema` had precedence over `meta.schema` so it's okay to overwrite `meta.schema` if it exists
schema,
};
}
}
// cache the fixed up rule
fixedUpRuleReplacements.set(ruleDefinition, newRuleDefinition);
fixedUpRules.add(newRuleDefinition);
@@ -269,7 +284,7 @@ function fixupConfigRules(config) {
// Types
//-----------------------------------------------------------------------------
/** @typedef {import("eslint").Linter.FlatConfig} FlatConfig */
/** @typedef {import("eslint").Linter.Config} FlatConfig */
//-----------------------------------------------------------------------------
// Exports
@@ -323,10 +338,11 @@ function convertIgnorePatternToMinimatch(pattern) {
/**
* Reads an ignore file and returns an object with the ignore patterns.
* @param {string} ignoreFilePath The absolute path to the ignore file.
* @param {string} [name] The name of the ignore file config.
* @returns {FlatConfig} An object with an `ignores` property that is an array of ignore patterns.
* @throws {Error} If the ignore file path is not an absolute path.
*/
function includeIgnoreFile(ignoreFilePath) {
function includeIgnoreFile(ignoreFilePath, name) {
if (!path.isAbsolute(ignoreFilePath)) {
throw new Error("The ignore file location must be an absolute path.");
}
@@ -335,7 +351,7 @@ function includeIgnoreFile(ignoreFilePath) {
const lines = ignoreFile.split(/\r?\n/u);
return {
name: "Imported .gitignore patterns",
name: name || "Imported .gitignore patterns",
ignores: lines
.map(line => line.trim())
.filter(line => line && !line.startsWith("#"))


@@ -1,6 +1,6 @@
{
"name": "@eslint/compat",
"version": "1.1.1",
"version": "1.3.1",
"description": "Compatibility utilities for ESLint",
"type": "module",
"main": "dist/esm/index.js",
@@ -25,7 +25,7 @@
"test": "tests"
},
"scripts": {
"build:cts": "node -e \"fs.copyFileSync('dist/esm/index.d.ts', 'dist/cjs/index.d.cts')\"",
"build:cts": "node ../../tools/build-cts.js dist/esm/index.d.ts dist/cjs/index.d.cts",
"build": "rollup -c && tsc -p tsconfig.esm.json && npm run build:cts",
"test:jsr": "npx jsr@latest publish --dry-run",
"test": "mocha tests/*.js",
@@ -33,7 +33,8 @@
},
"repository": {
"type": "git",
"url": "git+https://github.com/eslint/rewrite.git"
"url": "git+https://github.com/eslint/rewrite.git",
"directory": "packages/compat"
},
"keywords": [
"eslint",
@@ -46,14 +47,18 @@
"bugs": {
"url": "https://github.com/eslint/rewrite/issues"
},
"homepage": "https://github.com/eslint/rewrite#readme",
"homepage": "https://github.com/eslint/rewrite/tree/main/packages/compat#readme",
"devDependencies": {
"@types/eslint": "^8.56.10",
"c8": "^9.1.0",
"eslint": "^9.0.0",
"mocha": "^10.4.0",
"rollup": "^4.16.2",
"typescript": "^5.4.5"
"@eslint/core": "^0.15.1",
"eslint": "^9.27.0"
},
"peerDependencies": {
"eslint": "^8.40 || 9"
},
"peerDependenciesMeta": {
"eslint": {
"optional": true
}
},
"engines": {
"node": "^18.18.0 || ^20.9.0 || >=21.1.0"


@@ -1,6 +1,6 @@
{
"name": "@eslint/js",
"version": "9.28.0",
"version": "9.31.0",
"description": "ESLint JavaScript language implementation",
"funding": "https://eslint.org/donate",
"main": "./src/index.js",


@@ -1,6 +1,6 @@
The ISC License
Copyright (c) 2019 Elan Shanker, Paul Miller (https://paulmillr.com)
Copyright (c) Isaac Z. Schlueter and Contributors
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above

71
node_modules/@isaacs/fs-minipass/README.md generated vendored Normal file

@@ -0,0 +1,71 @@
# fs-minipass
Filesystem streams based on [minipass](http://npm.im/minipass).
4 classes are exported:
- ReadStream
- ReadStreamSync
- WriteStream
- WriteStreamSync
When using `ReadStreamSync`, all of the data is made available
immediately upon consuming the stream. Nothing is buffered in memory
when the stream is constructed. If the stream is piped to a writer,
then it will synchronously `read()` and emit data into the writer as
fast as the writer can consume it. (That is, it will respect
backpressure.) If you call `stream.read()` then it will read the
entire file and return the contents.
When using `WriteStreamSync`, every write is flushed to the file
synchronously. If your writes all come in a single tick, then it'll
write it all out in a single tick. It's as synchronous as you are.
The async versions work much like their node builtin counterparts,
with the exception of introducing significantly less Stream machinery
overhead.
## USAGE
It's just streams, you pipe them or read() them or write() to them.
```js
import { ReadStream, WriteStream } from 'fs-minipass'
// or: const { ReadStream, WriteStream } = require('fs-minipass')
const readStream = new ReadStream('file.txt')
const writeStream = new WriteStream('output.txt')
writeStream.write('some file header or whatever\n')
readStream.pipe(writeStream)
```
## ReadStream(path, options)
Path string is required, but somewhat irrelevant if an open file
descriptor is passed in as an option.
Options:
- `fd` Pass in a numeric file descriptor, if the file is already open.
- `readSize` The size of reads to do, defaults to 16MB
- `size` The size of the file, if known. Prevents zero-byte read()
call at the end.
- `autoClose` Set to `false` to prevent the file descriptor from being
closed when the file is done being read.
## WriteStream(path, options)
Path string is required, but somewhat irrelevant if an open file
descriptor is passed in as an option.
Options:
- `fd` Pass in a numeric file descriptor, if the file is already open.
- `mode` The mode to create the file with. Defaults to `0o666`.
- `start` The position in the file to start writing. If not
specified, then the file will start writing at position zero, and be
truncated by default.
- `autoClose` Set to `false` to prevent the file descriptor from being
closed when the stream is ended.
- `flags` Flags to use when opening the file. Irrelevant if `fd` is
passed in, since the file won't be opened in that case. Defaults to
`'r+'` if a `start` is specified, or `'w'` otherwise.
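A minimal sketch of the options above (illustrative values, reusing the import from the usage example):
```js
import { ReadStream, WriteStream } from 'fs-minipass'
// read only the first 1024 bytes, and keep the file descriptor open afterwards
const readStream = new ReadStream('file.txt', { size: 1024, autoClose: false })
// start writing at byte 100; because `start` is set, the file is not truncated
const writeStream = new WriteStream('file.txt', { start: 100, mode: 0o600 })
```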


@@ -0,0 +1,118 @@
/// <reference types="node" />
/// <reference types="node" />
/// <reference types="node" />
import EE from 'events';
import { Minipass } from 'minipass';
declare const _autoClose: unique symbol;
declare const _close: unique symbol;
declare const _ended: unique symbol;
declare const _fd: unique symbol;
declare const _finished: unique symbol;
declare const _flags: unique symbol;
declare const _flush: unique symbol;
declare const _handleChunk: unique symbol;
declare const _makeBuf: unique symbol;
declare const _mode: unique symbol;
declare const _needDrain: unique symbol;
declare const _onerror: unique symbol;
declare const _onopen: unique symbol;
declare const _onread: unique symbol;
declare const _onwrite: unique symbol;
declare const _open: unique symbol;
declare const _path: unique symbol;
declare const _pos: unique symbol;
declare const _queue: unique symbol;
declare const _read: unique symbol;
declare const _readSize: unique symbol;
declare const _reading: unique symbol;
declare const _remain: unique symbol;
declare const _size: unique symbol;
declare const _write: unique symbol;
declare const _writing: unique symbol;
declare const _defaultFlag: unique symbol;
declare const _errored: unique symbol;
export type ReadStreamOptions = Minipass.Options<Minipass.ContiguousData> & {
fd?: number;
readSize?: number;
size?: number;
autoClose?: boolean;
};
export type ReadStreamEvents = Minipass.Events<Minipass.ContiguousData> & {
open: [fd: number];
};
export declare class ReadStream extends Minipass<Minipass.ContiguousData, Buffer, ReadStreamEvents> {
[_errored]: boolean;
[_fd]?: number;
[_path]: string;
[_readSize]: number;
[_reading]: boolean;
[_size]: number;
[_remain]: number;
[_autoClose]: boolean;
constructor(path: string, opt: ReadStreamOptions);
get fd(): number | undefined;
get path(): string;
write(): void;
end(): void;
[_open](): void;
[_onopen](er?: NodeJS.ErrnoException | null, fd?: number): void;
[_makeBuf](): Buffer;
[_read](): void;
[_onread](er?: NodeJS.ErrnoException | null, br?: number, buf?: Buffer): void;
[_close](): void;
[_onerror](er: NodeJS.ErrnoException): void;
[_handleChunk](br: number, buf: Buffer): boolean;
emit<Event extends keyof ReadStreamEvents>(ev: Event, ...args: ReadStreamEvents[Event]): boolean;
}
export declare class ReadStreamSync extends ReadStream {
[_open](): void;
[_read](): void;
[_close](): void;
}
export type WriteStreamOptions = {
fd?: number;
autoClose?: boolean;
mode?: number;
captureRejections?: boolean;
start?: number;
flags?: string;
};
export declare class WriteStream extends EE {
readable: false;
writable: boolean;
[_errored]: boolean;
[_writing]: boolean;
[_ended]: boolean;
[_queue]: Buffer[];
[_needDrain]: boolean;
[_path]: string;
[_mode]: number;
[_autoClose]: boolean;
[_fd]?: number;
[_defaultFlag]: boolean;
[_flags]: string;
[_finished]: boolean;
[_pos]?: number;
constructor(path: string, opt: WriteStreamOptions);
emit(ev: string, ...args: any[]): boolean;
get fd(): number | undefined;
get path(): string;
[_onerror](er: NodeJS.ErrnoException): void;
[_open](): void;
[_onopen](er?: null | NodeJS.ErrnoException, fd?: number): void;
end(buf: string, enc?: BufferEncoding): this;
end(buf?: Buffer, enc?: undefined): this;
write(buf: string, enc?: BufferEncoding): boolean;
write(buf: Buffer, enc?: undefined): boolean;
[_write](buf: Buffer): void;
[_onwrite](er?: null | NodeJS.ErrnoException, bw?: number): void;
[_flush](): void;
[_close](): void;
}
export declare class WriteStreamSync extends WriteStream {
[_open](): void;
[_close](): void;
[_write](buf: Buffer): void;
}
export {};
//# sourceMappingURL=index.d.ts.map


@@ -0,0 +1 @@
{"version":3,"file":"index.d.ts","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":";;;AAAA,OAAO,EAAE,MAAM,QAAQ,CAAA;AAEvB,OAAO,EAAE,QAAQ,EAAE,MAAM,UAAU,CAAA;AAInC,QAAA,MAAM,UAAU,eAAuB,CAAA;AACvC,QAAA,MAAM,MAAM,eAAmB,CAAA;AAC/B,QAAA,MAAM,MAAM,eAAmB,CAAA;AAC/B,QAAA,MAAM,GAAG,eAAgB,CAAA;AACzB,QAAA,MAAM,SAAS,eAAsB,CAAA;AACrC,QAAA,MAAM,MAAM,eAAmB,CAAA;AAC/B,QAAA,MAAM,MAAM,eAAmB,CAAA;AAC/B,QAAA,MAAM,YAAY,eAAyB,CAAA;AAC3C,QAAA,MAAM,QAAQ,eAAqB,CAAA;AACnC,QAAA,MAAM,KAAK,eAAkB,CAAA;AAC7B,QAAA,MAAM,UAAU,eAAuB,CAAA;AACvC,QAAA,MAAM,QAAQ,eAAqB,CAAA;AACnC,QAAA,MAAM,OAAO,eAAoB,CAAA;AACjC,QAAA,MAAM,OAAO,eAAoB,CAAA;AACjC,QAAA,MAAM,QAAQ,eAAqB,CAAA;AACnC,QAAA,MAAM,KAAK,eAAkB,CAAA;AAC7B,QAAA,MAAM,KAAK,eAAkB,CAAA;AAC7B,QAAA,MAAM,IAAI,eAAiB,CAAA;AAC3B,QAAA,MAAM,MAAM,eAAmB,CAAA;AAC/B,QAAA,MAAM,KAAK,eAAkB,CAAA;AAC7B,QAAA,MAAM,SAAS,eAAsB,CAAA;AACrC,QAAA,MAAM,QAAQ,eAAqB,CAAA;AACnC,QAAA,MAAM,OAAO,eAAoB,CAAA;AACjC,QAAA,MAAM,KAAK,eAAkB,CAAA;AAC7B,QAAA,MAAM,MAAM,eAAmB,CAAA;AAC/B,QAAA,MAAM,QAAQ,eAAqB,CAAA;AACnC,QAAA,MAAM,YAAY,eAAyB,CAAA;AAC3C,QAAA,MAAM,QAAQ,eAAqB,CAAA;AAEnC,MAAM,MAAM,iBAAiB,GAC3B,QAAQ,CAAC,OAAO,CAAC,QAAQ,CAAC,cAAc,CAAC,GAAG;IAC1C,EAAE,CAAC,EAAE,MAAM,CAAA;IACX,QAAQ,CAAC,EAAE,MAAM,CAAA;IACjB,IAAI,CAAC,EAAE,MAAM,CAAA;IACb,SAAS,CAAC,EAAE,OAAO,CAAA;CACpB,CAAA;AAEH,MAAM,MAAM,gBAAgB,GAAG,QAAQ,CAAC,MAAM,CAAC,QAAQ,CAAC,cAAc,CAAC,GAAG;IACxE,IAAI,EAAE,CAAC,EAAE,EAAE,MAAM,CAAC,CAAA;CACnB,CAAA;AAED,qBAAa,UAAW,SAAQ,QAAQ,CACtC,QAAQ,CAAC,cAAc,EACvB,MAAM,EACN,gBAAgB,CACjB;IACC,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAS;IAC5B,CAAC,GAAG,CAAC,CAAC,EAAE,MAAM,CAAC;IACf,CAAC,KAAK,CAAC,EAAE,MAAM,CAAC;IAChB,CAAC,SAAS,CAAC,EAAE,MAAM,CAAC;IACpB,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAS;IAC5B,CAAC,KAAK,CAAC,EAAE,MAAM,CAAC;IAChB,CAAC,OAAO,CAAC,EAAE,MAAM,CAAC;IAClB,CAAC,UAAU,CAAC,EAAE,OAAO,CAAA;gBAET,IAAI,EAAE,MAAM,EAAE,GAAG,EAAE,iBAAiB;IA4BhD,IAAI,EAAE,uBAEL;IAED,IAAI,IAAI,WAEP;IAGD,KAAK;IAKL,GAAG;IAIH,CAAC,KAAK,CAAC;IAIP,CAAC,OAAO,CAAC,CAAC,EAAE,CAAC,EAAE,MAAM,CAAC,cAAc,GAAG,IAAI,EAAE,EAAE,CAAC,EAAE,MAAM;IAUxD,CAAC,QAAQ,CAAC;IAIV,CAAC,KAAK,CAAC;IAeP,CAAC,OAAO,CAAC,CAAC,EAAE,CAAC,EAAE,MAAM,CAAC,cAAc,GAAG,IAAI,EAAE,EAAE,CAAC,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,MAAM;IAStE,CAAC,MAAM,CAAC;IAUR,CAAC,QAAQ,CAAC,CAAC,EAAE,EAAE,MAAM,CAAC,cAAc;IAMpC,CAAC,YAAY,CAAC,CAAC,EAAE,EAAE,MAAM,EAAE,GAAG,EAAE,MAAM;IAiBtC,IAAI,CAAC,KAAK,SAAS,MAAM,gBAAgB,EACvC,EAAE,EAAE,KAAK,EACT,GAAG,IAAI,EAAE,gBAAgB,CAAC,KAAK,CAAC,GAC/B,OAAO;CAuBX;AAED,qBAAa,cAAe,SAAQ,UAAU;IAC5C,CAAC,KAAK,CAAC;IAYP,CAAC,KAAK,CAAC;IA2BP,CAAC,MAAM,CAAC;CAQT;AAED,MAAM,MAAM,kBAAkB,GAAG;IAC/B,EAAE,CAAC,EAAE,MAAM,CAAA;IACX,SAAS,CAAC,EAAE,OAAO,CAAA;IACnB,IAAI,CAAC,EAAE,MAAM,CAAA;IACb,iBAAiB,CAAC,EAAE,OAAO,CAAA;IAC3B,KAAK,CAAC,EAAE,MAAM,CAAA;IACd,KAAK,CAAC,EAAE,MAAM,CAAA;CACf,CAAA;AAED,qBAAa,WAAY,SAAQ,EAAE;IACjC,QAAQ,EAAE,KAAK,CAAQ;IACvB,QAAQ,EAAE,OAAO,CAAQ;IACzB,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAS;IAC5B,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAS;IAC5B,CAAC,MAAM,CAAC,EAAE,OAAO,CAAS;IAC1B,CAAC,MAAM,CAAC,EAAE,MAAM,EAAE,CAAM;IACxB,CAAC,UAAU,CAAC,EAAE,OAAO,CAAS;IAC9B,CAAC,KAAK,CAAC,EAAE,MAAM,CAAC;IAChB,CAAC,KAAK,CAAC,EAAE,MAAM,CAAC;IAChB,CAAC,UAAU,CAAC,EAAE,OAAO,CAAC;IACtB,CAAC,GAAG,CAAC,CAAC,EAAE,MAAM,CAAC;IACf,CAAC,YAAY,CAAC,EAAE,OAAO,CAAC;IACxB,CAAC,MAAM,CAAC,EAAE,MAAM,CAAC;IACjB,CAAC,SAAS,CAAC,EAAE,OAAO,CAAS;IAC7B,CAAC,IAAI,CAAC,CAAC,EAAE,MAAM,CAAA;gBAEH,IAAI,EAAE,MAAM,EAAE,GAAG,EAAE,kBAAkB;IAoBjD,IAAI,CAAC,EAAE,EAAE,MAAM,EAAE,GAAG,IAAI,EAAE,GAAG,EAAE;IAU/B,IAAI,EAAE,uBAEL;IAED,IAAI,IAAI,WAEP;IAED,CAAC,QAAQ,CAAC,CAAC,EAAE,EAAE,MAAM,CAAC,cAAc;IAMpC,CAAC,KAAK,CAAC;IAMP,CAAC,OAAO,CAAC,CAAC,EAAE,CAAC,EAAE,IAAI,GAAG,MAAM,CAAC,c
AAc,EAAE,EAAE,CAAC,EAAE,MAAM;IAoBxD,GAAG,CAAC,GAAG,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,cAAc,GAAG,IAAI;IAC5C,GAAG,CAAC,GAAG,CAAC,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,SAAS,GAAG,IAAI;IAoBxC,KAAK,CAAC,GAAG,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,cAAc,GAAG,OAAO;IACjD,KAAK,CAAC,GAAG,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,SAAS,GAAG,OAAO;IAsB5C,CAAC,MAAM,CAAC,CAAC,GAAG,EAAE,MAAM;IAWpB,CAAC,QAAQ,CAAC,CAAC,EAAE,CAAC,EAAE,IAAI,GAAG,MAAM,CAAC,cAAc,EAAE,EAAE,CAAC,EAAE,MAAM;IAwBzD,CAAC,MAAM,CAAC;IAgBR,CAAC,MAAM,CAAC;CAST;AAED,qBAAa,eAAgB,SAAQ,WAAW;IAC9C,CAAC,KAAK,CAAC,IAAI,IAAI;IAsBf,CAAC,MAAM,CAAC;IASR,CAAC,MAAM,CAAC,CAAC,GAAG,EAAE,MAAM;CAmBrB"}

430
node_modules/@isaacs/fs-minipass/dist/commonjs/index.js generated vendored Normal file

@@ -0,0 +1,430 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.WriteStreamSync = exports.WriteStream = exports.ReadStreamSync = exports.ReadStream = void 0;
const events_1 = __importDefault(require("events"));
const fs_1 = __importDefault(require("fs"));
const minipass_1 = require("minipass");
const writev = fs_1.default.writev;
const _autoClose = Symbol('_autoClose');
const _close = Symbol('_close');
const _ended = Symbol('_ended');
const _fd = Symbol('_fd');
const _finished = Symbol('_finished');
const _flags = Symbol('_flags');
const _flush = Symbol('_flush');
const _handleChunk = Symbol('_handleChunk');
const _makeBuf = Symbol('_makeBuf');
const _mode = Symbol('_mode');
const _needDrain = Symbol('_needDrain');
const _onerror = Symbol('_onerror');
const _onopen = Symbol('_onopen');
const _onread = Symbol('_onread');
const _onwrite = Symbol('_onwrite');
const _open = Symbol('_open');
const _path = Symbol('_path');
const _pos = Symbol('_pos');
const _queue = Symbol('_queue');
const _read = Symbol('_read');
const _readSize = Symbol('_readSize');
const _reading = Symbol('_reading');
const _remain = Symbol('_remain');
const _size = Symbol('_size');
const _write = Symbol('_write');
const _writing = Symbol('_writing');
const _defaultFlag = Symbol('_defaultFlag');
const _errored = Symbol('_errored');
class ReadStream extends minipass_1.Minipass {
[_errored] = false;
[_fd];
[_path];
[_readSize];
[_reading] = false;
[_size];
[_remain];
[_autoClose];
constructor(path, opt) {
opt = opt || {};
super(opt);
this.readable = true;
this.writable = false;
if (typeof path !== 'string') {
throw new TypeError('path must be a string');
}
this[_errored] = false;
this[_fd] = typeof opt.fd === 'number' ? opt.fd : undefined;
this[_path] = path;
this[_readSize] = opt.readSize || 16 * 1024 * 1024;
this[_reading] = false;
this[_size] = typeof opt.size === 'number' ? opt.size : Infinity;
this[_remain] = this[_size];
this[_autoClose] =
typeof opt.autoClose === 'boolean' ? opt.autoClose : true;
if (typeof this[_fd] === 'number') {
this[_read]();
}
else {
this[_open]();
}
}
get fd() {
return this[_fd];
}
get path() {
return this[_path];
}
//@ts-ignore
write() {
throw new TypeError('this is a readable stream');
}
//@ts-ignore
end() {
throw new TypeError('this is a readable stream');
}
[_open]() {
fs_1.default.open(this[_path], 'r', (er, fd) => this[_onopen](er, fd));
}
[_onopen](er, fd) {
if (er) {
this[_onerror](er);
}
else {
this[_fd] = fd;
this.emit('open', fd);
this[_read]();
}
}
[_makeBuf]() {
return Buffer.allocUnsafe(Math.min(this[_readSize], this[_remain]));
}
[_read]() {
if (!this[_reading]) {
this[_reading] = true;
const buf = this[_makeBuf]();
/* c8 ignore start */
if (buf.length === 0) {
return process.nextTick(() => this[_onread](null, 0, buf));
}
/* c8 ignore stop */
fs_1.default.read(this[_fd], buf, 0, buf.length, null, (er, br, b) => this[_onread](er, br, b));
}
}
[_onread](er, br, buf) {
this[_reading] = false;
if (er) {
this[_onerror](er);
}
else if (this[_handleChunk](br, buf)) {
this[_read]();
}
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs_1.default.close(fd, er => er ? this.emit('error', er) : this.emit('close'));
}
}
[_onerror](er) {
this[_reading] = true;
this[_close]();
this.emit('error', er);
}
[_handleChunk](br, buf) {
let ret = false;
// no effect if infinite
this[_remain] -= br;
if (br > 0) {
ret = super.write(br < buf.length ? buf.subarray(0, br) : buf);
}
if (br === 0 || this[_remain] <= 0) {
ret = false;
this[_close]();
super.end();
}
return ret;
}
emit(ev, ...args) {
switch (ev) {
case 'prefinish':
case 'finish':
return false;
case 'drain':
if (typeof this[_fd] === 'number') {
this[_read]();
}
return false;
case 'error':
if (this[_errored]) {
return false;
}
this[_errored] = true;
return super.emit(ev, ...args);
default:
return super.emit(ev, ...args);
}
}
}
exports.ReadStream = ReadStream;
class ReadStreamSync extends ReadStream {
[_open]() {
let threw = true;
try {
this[_onopen](null, fs_1.default.openSync(this[_path], 'r'));
threw = false;
}
finally {
if (threw) {
this[_close]();
}
}
}
[_read]() {
let threw = true;
try {
if (!this[_reading]) {
this[_reading] = true;
do {
const buf = this[_makeBuf]();
/* c8 ignore start */
const br = buf.length === 0
? 0
: fs_1.default.readSync(this[_fd], buf, 0, buf.length, null);
/* c8 ignore stop */
if (!this[_handleChunk](br, buf)) {
break;
}
} while (true);
this[_reading] = false;
}
threw = false;
}
finally {
if (threw) {
this[_close]();
}
}
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs_1.default.closeSync(fd);
this.emit('close');
}
}
}
exports.ReadStreamSync = ReadStreamSync;
class WriteStream extends events_1.default {
readable = false;
writable = true;
[_errored] = false;
[_writing] = false;
[_ended] = false;
[_queue] = [];
[_needDrain] = false;
[_path];
[_mode];
[_autoClose];
[_fd];
[_defaultFlag];
[_flags];
[_finished] = false;
[_pos];
constructor(path, opt) {
opt = opt || {};
super(opt);
this[_path] = path;
this[_fd] = typeof opt.fd === 'number' ? opt.fd : undefined;
this[_mode] = opt.mode === undefined ? 0o666 : opt.mode;
this[_pos] = typeof opt.start === 'number' ? opt.start : undefined;
this[_autoClose] =
typeof opt.autoClose === 'boolean' ? opt.autoClose : true;
// truncating makes no sense when writing into the middle
const defaultFlag = this[_pos] !== undefined ? 'r+' : 'w';
this[_defaultFlag] = opt.flags === undefined;
this[_flags] = opt.flags === undefined ? defaultFlag : opt.flags;
if (this[_fd] === undefined) {
this[_open]();
}
}
emit(ev, ...args) {
if (ev === 'error') {
if (this[_errored]) {
return false;
}
this[_errored] = true;
}
return super.emit(ev, ...args);
}
get fd() {
return this[_fd];
}
get path() {
return this[_path];
}
[_onerror](er) {
this[_close]();
this[_writing] = true;
this.emit('error', er);
}
[_open]() {
fs_1.default.open(this[_path], this[_flags], this[_mode], (er, fd) => this[_onopen](er, fd));
}
[_onopen](er, fd) {
if (this[_defaultFlag] &&
this[_flags] === 'r+' &&
er &&
er.code === 'ENOENT') {
this[_flags] = 'w';
this[_open]();
}
else if (er) {
this[_onerror](er);
}
else {
this[_fd] = fd;
this.emit('open', fd);
if (!this[_writing]) {
this[_flush]();
}
}
}
end(buf, enc) {
if (buf) {
//@ts-ignore
this.write(buf, enc);
}
this[_ended] = true;
// synthetic after-write logic, where drain/finish live
if (!this[_writing] &&
!this[_queue].length &&
typeof this[_fd] === 'number') {
this[_onwrite](null, 0);
}
return this;
}
write(buf, enc) {
if (typeof buf === 'string') {
buf = Buffer.from(buf, enc);
}
if (this[_ended]) {
this.emit('error', new Error('write() after end()'));
return false;
}
if (this[_fd] === undefined || this[_writing] || this[_queue].length) {
this[_queue].push(buf);
this[_needDrain] = true;
return false;
}
this[_writing] = true;
this[_write](buf);
return true;
}
[_write](buf) {
fs_1.default.write(this[_fd], buf, 0, buf.length, this[_pos], (er, bw) => this[_onwrite](er, bw));
}
[_onwrite](er, bw) {
if (er) {
this[_onerror](er);
}
else {
if (this[_pos] !== undefined && typeof bw === 'number') {
this[_pos] += bw;
}
if (this[_queue].length) {
this[_flush]();
}
else {
this[_writing] = false;
if (this[_ended] && !this[_finished]) {
this[_finished] = true;
this[_close]();
this.emit('finish');
}
else if (this[_needDrain]) {
this[_needDrain] = false;
this.emit('drain');
}
}
}
}
[_flush]() {
if (this[_queue].length === 0) {
if (this[_ended]) {
this[_onwrite](null, 0);
}
}
else if (this[_queue].length === 1) {
this[_write](this[_queue].pop());
}
else {
const iovec = this[_queue];
this[_queue] = [];
writev(this[_fd], iovec, this[_pos], (er, bw) => this[_onwrite](er, bw));
}
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs_1.default.close(fd, er => er ? this.emit('error', er) : this.emit('close'));
}
}
}
exports.WriteStream = WriteStream;
class WriteStreamSync extends WriteStream {
[_open]() {
let fd;
// only wrap in a try{} block if we know we'll retry, to avoid
// the rethrow obscuring the error's source frame in most cases.
if (this[_defaultFlag] && this[_flags] === 'r+') {
try {
fd = fs_1.default.openSync(this[_path], this[_flags], this[_mode]);
}
catch (er) {
if (er?.code === 'ENOENT') {
this[_flags] = 'w';
return this[_open]();
}
else {
throw er;
}
}
}
else {
fd = fs_1.default.openSync(this[_path], this[_flags], this[_mode]);
}
this[_onopen](null, fd);
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs_1.default.closeSync(fd);
this.emit('close');
}
}
[_write](buf) {
// throw the original, but try to close if it fails
let threw = true;
try {
this[_onwrite](null, fs_1.default.writeSync(this[_fd], buf, 0, buf.length, this[_pos]));
threw = false;
}
finally {
if (threw) {
try {
this[_close]();
}
catch {
// ok error
}
}
}
}
}
exports.WriteStreamSync = WriteStreamSync;
//# sourceMappingURL=index.js.map

File diff suppressed because one or more lines are too long

118
node_modules/@isaacs/fs-minipass/dist/esm/index.d.ts generated vendored Normal file

@@ -0,0 +1,118 @@
/// <reference types="node" resolution-mode="require"/>
/// <reference types="node" resolution-mode="require"/>
/// <reference types="node" resolution-mode="require"/>
import EE from 'events';
import { Minipass } from 'minipass';
declare const _autoClose: unique symbol;
declare const _close: unique symbol;
declare const _ended: unique symbol;
declare const _fd: unique symbol;
declare const _finished: unique symbol;
declare const _flags: unique symbol;
declare const _flush: unique symbol;
declare const _handleChunk: unique symbol;
declare const _makeBuf: unique symbol;
declare const _mode: unique symbol;
declare const _needDrain: unique symbol;
declare const _onerror: unique symbol;
declare const _onopen: unique symbol;
declare const _onread: unique symbol;
declare const _onwrite: unique symbol;
declare const _open: unique symbol;
declare const _path: unique symbol;
declare const _pos: unique symbol;
declare const _queue: unique symbol;
declare const _read: unique symbol;
declare const _readSize: unique symbol;
declare const _reading: unique symbol;
declare const _remain: unique symbol;
declare const _size: unique symbol;
declare const _write: unique symbol;
declare const _writing: unique symbol;
declare const _defaultFlag: unique symbol;
declare const _errored: unique symbol;
export type ReadStreamOptions = Minipass.Options<Minipass.ContiguousData> & {
fd?: number;
readSize?: number;
size?: number;
autoClose?: boolean;
};
export type ReadStreamEvents = Minipass.Events<Minipass.ContiguousData> & {
open: [fd: number];
};
export declare class ReadStream extends Minipass<Minipass.ContiguousData, Buffer, ReadStreamEvents> {
[_errored]: boolean;
[_fd]?: number;
[_path]: string;
[_readSize]: number;
[_reading]: boolean;
[_size]: number;
[_remain]: number;
[_autoClose]: boolean;
constructor(path: string, opt: ReadStreamOptions);
get fd(): number | undefined;
get path(): string;
write(): void;
end(): void;
[_open](): void;
[_onopen](er?: NodeJS.ErrnoException | null, fd?: number): void;
[_makeBuf](): Buffer;
[_read](): void;
[_onread](er?: NodeJS.ErrnoException | null, br?: number, buf?: Buffer): void;
[_close](): void;
[_onerror](er: NodeJS.ErrnoException): void;
[_handleChunk](br: number, buf: Buffer): boolean;
emit<Event extends keyof ReadStreamEvents>(ev: Event, ...args: ReadStreamEvents[Event]): boolean;
}
export declare class ReadStreamSync extends ReadStream {
[_open](): void;
[_read](): void;
[_close](): void;
}
export type WriteStreamOptions = {
fd?: number;
autoClose?: boolean;
mode?: number;
captureRejections?: boolean;
start?: number;
flags?: string;
};
export declare class WriteStream extends EE {
readable: false;
writable: boolean;
[_errored]: boolean;
[_writing]: boolean;
[_ended]: boolean;
[_queue]: Buffer[];
[_needDrain]: boolean;
[_path]: string;
[_mode]: number;
[_autoClose]: boolean;
[_fd]?: number;
[_defaultFlag]: boolean;
[_flags]: string;
[_finished]: boolean;
[_pos]?: number;
constructor(path: string, opt: WriteStreamOptions);
emit(ev: string, ...args: any[]): boolean;
get fd(): number | undefined;
get path(): string;
[_onerror](er: NodeJS.ErrnoException): void;
[_open](): void;
[_onopen](er?: null | NodeJS.ErrnoException, fd?: number): void;
end(buf: string, enc?: BufferEncoding): this;
end(buf?: Buffer, enc?: undefined): this;
write(buf: string, enc?: BufferEncoding): boolean;
write(buf: Buffer, enc?: undefined): boolean;
[_write](buf: Buffer): void;
[_onwrite](er?: null | NodeJS.ErrnoException, bw?: number): void;
[_flush](): void;
[_close](): void;
}
export declare class WriteStreamSync extends WriteStream {
[_open](): void;
[_close](): void;
[_write](buf: Buffer): void;
}
export {};
//# sourceMappingURL=index.d.ts.map
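(Editorial aside: a short sketch of the `ReadStreamOptions` surface declared above, not part of the vendored file; the path and sizes are illustrative.)

```js
// Minimal sketch, assuming @isaacs/fs-minipass is installed.
const { ReadStream } = require('@isaacs/fs-minipass')

const rs = new ReadStream('/tmp/example.bin', { // hypothetical path
  readSize: 64 * 1024, // per-read buffer size (the default is 16 MiB)
  size: 1024,          // stop after the first 1024 bytes of the file
  autoClose: true,     // close the fd once done (the default)
})
// ReadStream is a Minipass stream, so the usual data/end events apply
rs.on('data', chunk => process.stdout.write(chunk))
rs.on('end', () => console.log('done'))
```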

File diff suppressed because one or more lines are too long

420
node_modules/@isaacs/fs-minipass/dist/esm/index.js generated vendored Normal file
View File

@@ -0,0 +1,420 @@
import EE from 'events';
import fs from 'fs';
import { Minipass } from 'minipass';
const writev = fs.writev;
const _autoClose = Symbol('_autoClose');
const _close = Symbol('_close');
const _ended = Symbol('_ended');
const _fd = Symbol('_fd');
const _finished = Symbol('_finished');
const _flags = Symbol('_flags');
const _flush = Symbol('_flush');
const _handleChunk = Symbol('_handleChunk');
const _makeBuf = Symbol('_makeBuf');
const _mode = Symbol('_mode');
const _needDrain = Symbol('_needDrain');
const _onerror = Symbol('_onerror');
const _onopen = Symbol('_onopen');
const _onread = Symbol('_onread');
const _onwrite = Symbol('_onwrite');
const _open = Symbol('_open');
const _path = Symbol('_path');
const _pos = Symbol('_pos');
const _queue = Symbol('_queue');
const _read = Symbol('_read');
const _readSize = Symbol('_readSize');
const _reading = Symbol('_reading');
const _remain = Symbol('_remain');
const _size = Symbol('_size');
const _write = Symbol('_write');
const _writing = Symbol('_writing');
const _defaultFlag = Symbol('_defaultFlag');
const _errored = Symbol('_errored');
export class ReadStream extends Minipass {
[_errored] = false;
[_fd];
[_path];
[_readSize];
[_reading] = false;
[_size];
[_remain];
[_autoClose];
constructor(path, opt) {
opt = opt || {};
super(opt);
this.readable = true;
this.writable = false;
if (typeof path !== 'string') {
throw new TypeError('path must be a string');
}
this[_errored] = false;
this[_fd] = typeof opt.fd === 'number' ? opt.fd : undefined;
this[_path] = path;
this[_readSize] = opt.readSize || 16 * 1024 * 1024;
this[_reading] = false;
this[_size] = typeof opt.size === 'number' ? opt.size : Infinity;
this[_remain] = this[_size];
this[_autoClose] =
typeof opt.autoClose === 'boolean' ? opt.autoClose : true;
if (typeof this[_fd] === 'number') {
this[_read]();
}
else {
this[_open]();
}
}
get fd() {
return this[_fd];
}
get path() {
return this[_path];
}
//@ts-ignore
write() {
throw new TypeError('this is a readable stream');
}
//@ts-ignore
end() {
throw new TypeError('this is a readable stream');
}
[_open]() {
fs.open(this[_path], 'r', (er, fd) => this[_onopen](er, fd));
}
[_onopen](er, fd) {
if (er) {
this[_onerror](er);
}
else {
this[_fd] = fd;
this.emit('open', fd);
this[_read]();
}
}
[_makeBuf]() {
return Buffer.allocUnsafe(Math.min(this[_readSize], this[_remain]));
}
[_read]() {
if (!this[_reading]) {
this[_reading] = true;
const buf = this[_makeBuf]();
/* c8 ignore start */
if (buf.length === 0) {
return process.nextTick(() => this[_onread](null, 0, buf));
}
/* c8 ignore stop */
fs.read(this[_fd], buf, 0, buf.length, null, (er, br, b) => this[_onread](er, br, b));
}
}
[_onread](er, br, buf) {
this[_reading] = false;
if (er) {
this[_onerror](er);
}
else if (this[_handleChunk](br, buf)) {
this[_read]();
}
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs.close(fd, er => er ? this.emit('error', er) : this.emit('close'));
}
}
[_onerror](er) {
this[_reading] = true;
this[_close]();
this.emit('error', er);
}
[_handleChunk](br, buf) {
let ret = false;
// no effect if infinite
this[_remain] -= br;
if (br > 0) {
ret = super.write(br < buf.length ? buf.subarray(0, br) : buf);
}
if (br === 0 || this[_remain] <= 0) {
ret = false;
this[_close]();
super.end();
}
return ret;
}
emit(ev, ...args) {
switch (ev) {
case 'prefinish':
case 'finish':
return false;
case 'drain':
if (typeof this[_fd] === 'number') {
this[_read]();
}
return false;
case 'error':
if (this[_errored]) {
return false;
}
this[_errored] = true;
return super.emit(ev, ...args);
default:
return super.emit(ev, ...args);
}
}
}
export class ReadStreamSync extends ReadStream {
[_open]() {
let threw = true;
try {
this[_onopen](null, fs.openSync(this[_path], 'r'));
threw = false;
}
finally {
if (threw) {
this[_close]();
}
}
}
[_read]() {
let threw = true;
try {
if (!this[_reading]) {
this[_reading] = true;
do {
const buf = this[_makeBuf]();
/* c8 ignore start */
const br = buf.length === 0
? 0
: fs.readSync(this[_fd], buf, 0, buf.length, null);
/* c8 ignore stop */
if (!this[_handleChunk](br, buf)) {
break;
}
} while (true);
this[_reading] = false;
}
threw = false;
}
finally {
if (threw) {
this[_close]();
}
}
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs.closeSync(fd);
this.emit('close');
}
}
}
export class WriteStream extends EE {
readable = false;
writable = true;
[_errored] = false;
[_writing] = false;
[_ended] = false;
[_queue] = [];
[_needDrain] = false;
[_path];
[_mode];
[_autoClose];
[_fd];
[_defaultFlag];
[_flags];
[_finished] = false;
[_pos];
constructor(path, opt) {
opt = opt || {};
super(opt);
this[_path] = path;
this[_fd] = typeof opt.fd === 'number' ? opt.fd : undefined;
this[_mode] = opt.mode === undefined ? 0o666 : opt.mode;
this[_pos] = typeof opt.start === 'number' ? opt.start : undefined;
this[_autoClose] =
typeof opt.autoClose === 'boolean' ? opt.autoClose : true;
// truncating makes no sense when writing into the middle
const defaultFlag = this[_pos] !== undefined ? 'r+' : 'w';
this[_defaultFlag] = opt.flags === undefined;
this[_flags] = opt.flags === undefined ? defaultFlag : opt.flags;
if (this[_fd] === undefined) {
this[_open]();
}
}
emit(ev, ...args) {
if (ev === 'error') {
if (this[_errored]) {
return false;
}
this[_errored] = true;
}
return super.emit(ev, ...args);
}
get fd() {
return this[_fd];
}
get path() {
return this[_path];
}
[_onerror](er) {
this[_close]();
this[_writing] = true;
this.emit('error', er);
}
[_open]() {
fs.open(this[_path], this[_flags], this[_mode], (er, fd) => this[_onopen](er, fd));
}
[_onopen](er, fd) {
if (this[_defaultFlag] &&
this[_flags] === 'r+' &&
er &&
er.code === 'ENOENT') {
this[_flags] = 'w';
this[_open]();
}
else if (er) {
this[_onerror](er);
}
else {
this[_fd] = fd;
this.emit('open', fd);
if (!this[_writing]) {
this[_flush]();
}
}
}
end(buf, enc) {
if (buf) {
//@ts-ignore
this.write(buf, enc);
}
this[_ended] = true;
// synthetic after-write logic, where drain/finish live
if (!this[_writing] &&
!this[_queue].length &&
typeof this[_fd] === 'number') {
this[_onwrite](null, 0);
}
return this;
}
write(buf, enc) {
if (typeof buf === 'string') {
buf = Buffer.from(buf, enc);
}
if (this[_ended]) {
this.emit('error', new Error('write() after end()'));
return false;
}
if (this[_fd] === undefined || this[_writing] || this[_queue].length) {
this[_queue].push(buf);
this[_needDrain] = true;
return false;
}
this[_writing] = true;
this[_write](buf);
return true;
}
[_write](buf) {
fs.write(this[_fd], buf, 0, buf.length, this[_pos], (er, bw) => this[_onwrite](er, bw));
}
[_onwrite](er, bw) {
if (er) {
this[_onerror](er);
}
else {
if (this[_pos] !== undefined && typeof bw === 'number') {
this[_pos] += bw;
}
if (this[_queue].length) {
this[_flush]();
}
else {
this[_writing] = false;
if (this[_ended] && !this[_finished]) {
this[_finished] = true;
this[_close]();
this.emit('finish');
}
else if (this[_needDrain]) {
this[_needDrain] = false;
this.emit('drain');
}
}
}
}
[_flush]() {
if (this[_queue].length === 0) {
if (this[_ended]) {
this[_onwrite](null, 0);
}
}
else if (this[_queue].length === 1) {
this[_write](this[_queue].pop());
}
else {
const iovec = this[_queue];
this[_queue] = [];
writev(this[_fd], iovec, this[_pos], (er, bw) => this[_onwrite](er, bw));
}
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs.close(fd, er => er ? this.emit('error', er) : this.emit('close'));
}
}
}
export class WriteStreamSync extends WriteStream {
[_open]() {
let fd;
// only wrap in a try{} block if we know we'll retry, to avoid
// the rethrow obscuring the error's source frame in most cases.
if (this[_defaultFlag] && this[_flags] === 'r+') {
try {
fd = fs.openSync(this[_path], this[_flags], this[_mode]);
}
catch (er) {
if (er?.code === 'ENOENT') {
this[_flags] = 'w';
return this[_open]();
}
else {
throw er;
}
}
}
else {
fd = fs.openSync(this[_path], this[_flags], this[_mode]);
}
this[_onopen](null, fd);
}
[_close]() {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd];
this[_fd] = undefined;
fs.closeSync(fd);
this.emit('close');
}
}
[_write](buf) {
// throw the original, but try to close if it fails
let threw = true;
try {
this[_onwrite](null, fs.writeSync(this[_fd], buf, 0, buf.length, this[_pos]));
threw = false;
}
finally {
if (threw) {
try {
this[_close]();
}
catch {
// ok error
}
}
}
}
}
//# sourceMappingURL=index.js.map

File diff suppressed because one or more lines are too long

72
node_modules/@isaacs/fs-minipass/package.json generated vendored Normal file
View File

@@ -0,0 +1,72 @@
{
"name": "@isaacs/fs-minipass",
"version": "4.0.1",
"main": "./dist/commonjs/index.js",
"scripts": {
"prepare": "tshy",
"pretest": "npm run prepare",
"test": "tap",
"preversion": "npm test",
"postversion": "npm publish",
"prepublishOnly": "git push origin --follow-tags",
"format": "prettier --write . --loglevel warn",
"typedoc": "typedoc --tsconfig .tshy/esm.json ./src/*.ts"
},
"keywords": [],
"author": "Isaac Z. Schlueter",
"license": "ISC",
"repository": {
"type": "git",
"url": "https://github.com/npm/fs-minipass.git"
},
"description": "fs read and write streams based on minipass",
"dependencies": {
"minipass": "^7.0.4"
},
"devDependencies": {
"@types/node": "^20.11.30",
"mutate-fs": "^2.1.1",
"prettier": "^3.2.5",
"tap": "^18.7.1",
"tshy": "^1.12.0",
"typedoc": "^0.25.12"
},
"files": [
"dist"
],
"engines": {
"node": ">=18.0.0"
},
"tshy": {
"exports": {
"./package.json": "./package.json",
".": "./src/index.ts"
}
},
"exports": {
"./package.json": "./package.json",
".": {
"import": {
"types": "./dist/esm/index.d.ts",
"default": "./dist/esm/index.js"
},
"require": {
"types": "./dist/commonjs/index.d.ts",
"default": "./dist/commonjs/index.js"
}
}
},
"types": "./dist/commonjs/index.d.ts",
"type": "module",
"prettier": {
"semi": false,
"printWidth": 75,
"tabWidth": 2,
"useTabs": false,
"singleQuote": true,
"jsxSingleQuote": false,
"bracketSameLine": true,
"arrowParens": "avoid",
"endOfLine": "lf"
}
}

View File

@@ -0,0 +1,20 @@
# For all possible configuration options see:
# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates
# Note that limitations on version updates applied in this file don't apply to security updates,
# which are separately managed by dependabot
version: 2
updates:
- package-ecosystem: "npm"
directory: "/"
schedule:
interval: "daily"
open-pull-requests-limit: 20
ignore:
- dependency-name: "*"
update-types: ["version-update:semver-minor", "version-update:semver-patch"]
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "daily"

View File

@@ -0,0 +1,29 @@
# https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-nodejs
name: ci
on:
pull_request:
push:
branches:
- master
workflow_dispatch:
jobs:
ci:
strategy:
fail-fast: false
max-parallel: 9
matrix:
node-version: ['18.x', '20.x', '22.x']
os: [macos-latest, ubuntu-latest, windows-latest]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: ${{ matrix.node-version }}
- run: npm ci
- run: npm audit
- run: npm run lint
# - run: npm run update-crosswalk # To support newer versions of Node.js
- run: npm run build --if-present
- run: npm test

View File

@@ -0,0 +1,74 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL"
on:
push:
branches: [ "master" ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ "master" ]
schedule:
- cron: '24 5 * * 4'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: [ 'javascript' ]
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@v4
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# For details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, or Java).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v3
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
# If the Autobuild fails above, remove it and uncomment the following three lines.
# and modify them (or add more) to build your code; if your project requires a custom build, please refer to the EXAMPLE below for guidance.
# - run: |
# echo "Run, Build Application using script"
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v3
with:
category: "/language:${{matrix.language}}"

View File

@@ -0,0 +1,105 @@
name: release
on:
push:
branches: [master]
workflow_dispatch:
jobs:
release-check:
name: Check if version is published
runs-on: ubuntu-latest
defaults:
run:
shell: bash
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 22
- name: Check if version is published
id: check
run: |
currentVersion="$( node -e "console.log(require('./package.json').version)" )"
isPublished="$( npm view @mapbox/node-pre-gyp versions --json | jq -c --arg cv "$currentVersion" 'any(. == $cv)' )"
echo "version=$currentVersion" >> "$GITHUB_OUTPUT"
echo "published=$isPublished" >> "$GITHUB_OUTPUT"
echo "currentVersion: $currentVersion"
echo "isPublished: $isPublished"
outputs:
published: ${{ steps.check.outputs.published }}
version: ${{ steps.check.outputs.version }}
publish:
needs: release-check
if: ${{ needs.release-check.outputs.published == 'false' }}
runs-on: ubuntu-latest
permissions:
contents: write
defaults:
run:
shell: bash
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 22
- run: npm ci
- run: npm audit
- run: npm run lint
- run: npm run update-crosswalk # To support newer versions of Node.js
- run: npm run build --if-present
- run: npm test
- name: Prepare release changelog
id: prepare_release
run: |
RELEASE_TYPE="$(node -e "console.log(require('semver').prerelease('${{ needs.release-check.outputs.version }}') ? 'prerelease' : 'regular')")"
if [[ $RELEASE_TYPE == 'regular' ]]; then
echo "prerelease=false" >> "$GITHUB_OUTPUT"
else
echo "prerelease=true" >> "$GITHUB_OUTPUT"
fi
- name: Extract changelog for version
run: |
awk '/^##/ { p = 0 }; p == 1 { print }; $0 == "## ${{ needs.release-check.outputs.version }}" { p = 1 };' CHANGELOG.md > changelog_for_version.md
cat changelog_for_version.md
- name: Publish to Github
uses: ncipollo/release-action@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tag: v${{ needs.release-check.outputs.version }}
name: v${{ needs.release-check.outputs.version }}
bodyFile: changelog_for_version.md
allowUpdates: true
draft: false
prerelease: ${{ steps.prepare_release.outputs.prerelease }}
- name: Publish to NPM (release)
if: ${{ steps.prepare_release.outputs.prerelease == 'false' }}
run: |
npm config set //registry.npmjs.org/:_authToken "${NPM_TOKEN}"
npm publish --access public
env:
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish to NPM (prerelease)
if: ${{ steps.prepare_release.outputs.prerelease == 'true' }}
run: |
npm config set //registry.npmjs.org/:_authToken "${NPM_TOKEN}"
npm publish --tag next --access public
env:
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}

517
node_modules/@mapbox/node-pre-gyp/CHANGELOG.md generated vendored Normal file
View File

@@ -0,0 +1,517 @@
# node-pre-gyp changelog
## 2.0.0
- Supported Node versions are now stable versions of Node 18+. We will attempt to track the [Node.js release schedule](https://github.com/nodejs/release#release-schedule) and will regularly retire support for versions that have reached EOL.
- Fixed use of `s3ForcePathStyle` for installation [#650](https://github.com/mapbox/node-pre-gyp/pull/650)
- Upgraded to https-proxy-agent 7.0.5, nopt 8.0.0, semver 7.5.3, and tar 7.4.0
- Replaced npmlog with consola
- Removed rimraf and make-dir as dependencies
## 1.0.11
- Fixes dependabot alert [CVE-2021-44906](https://nvd.nist.gov/vuln/detail/CVE-2021-44906)
## 1.0.10
- Upgraded minimist to 1.2.6 to address dependabot alert [CVE-2021-44906](https://nvd.nist.gov/vuln/detail/CVE-2021-44906)
## 1.0.9
- Upgraded node-fetch to 2.6.7 to address [CVE-2022-0235](https://www.cve.org/CVERecord?id=CVE-2022-0235)
- Upgraded detect-libc to 2.0.0 to use non-blocking NodeJS(>=12) Report API
## 1.0.8
- Downgraded npmlog to maintain node v10 and v8 support (https://github.com/mapbox/node-pre-gyp/pull/624)
## 1.0.7
- Upgraded nyc and npmlog to address https://github.com/advisories/GHSA-93q8-gq69-wqmw
## 1.0.6
- Added node v17 to the internal node releases listing
- Upgraded various dependencies declared in package.json to latest major versions (node-fetch from 2.6.1 to 2.6.5, npmlog from 4.1.2 to 5.0.1, semver from 7.3.4 to 7.3.5, and tar from 6.1.0 to 6.1.11)
- Fixed bug in `staging_host` parameter (https://github.com/mapbox/node-pre-gyp/pull/590)
## 1.0.5
- Fix circular reference warning with node >= v14
## 1.0.4
- Added node v16 to the internal node releases listing
## 1.0.3
- Improved support configuring s3 uploads (solves https://github.com/mapbox/node-pre-gyp/issues/571)
- New options added in https://github.com/mapbox/node-pre-gyp/pull/576: 'bucket', 'region', and `s3ForcePathStyle`
## 1.0.2
- Fixed regression in proxy support (https://github.com/mapbox/node-pre-gyp/issues/572)
## 1.0.1
- Switched from mkdirp@1.0.4 to make-dir@3.1.0 to avoid this bug: https://github.com/isaacs/node-mkdirp/issues/31
## 1.0.0
- Module is now name-spaced at `@mapbox/node-pre-gyp` and the original `node-pre-gyp` is deprecated.
- New: support for staging and production s3 targets (see README.md)
- BREAKING: no longer supporting `node_pre_gyp_accessKeyId` & `node_pre_gyp_secretAccessKey`, use `AWS_ACCESS_KEY_ID` & `AWS_SECRET_ACCESS_KEY` instead to authenticate against s3 for `info`, `publish`, and `unpublish` commands.
- Dropped node v6 support, added node v14 support
- Switched tests to use mapbox-owned bucket for testing
- Added coverage tracking and linting with eslint
- Added back support for symlinks inside the tarball
- Upgraded all test apps to N-API/node-addon-api
- Added `node_pre_gyp_s3_host` env var which has priority over the `--s3_host` option or default.
- Replaced needle with node-fetch
- Added proxy support for node-fetch
- Upgraded to mkdirp@1.x
## 0.17.0
- Got travis + appveyor green again
- Added support for more node versions
## 0.16.0
- Added Node 15 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/520)
## 0.15.0
- Bump dependency on `mkdirp` from `^0.5.1` to `^0.5.3` (https://github.com/mapbox/node-pre-gyp/pull/492)
- Bump dependency on `needle` from `^2.2.1` to `^2.5.0` (https://github.com/mapbox/node-pre-gyp/pull/502)
- Added Node 14 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/501)
## 0.14.0
- Defer modules requires in napi.js (https://github.com/mapbox/node-pre-gyp/pull/434)
- Bump dependency on `tar` from `^4` to `^4.4.2` (https://github.com/mapbox/node-pre-gyp/pull/454)
- Support extracting compiled binary from local offline mirror (https://github.com/mapbox/node-pre-gyp/pull/459)
- Added Node 13 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/483)
## 0.13.0
- Added Node 12 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/449)
## 0.12.0
- Fixed double-build problem with node v10 (https://github.com/mapbox/node-pre-gyp/pull/428)
- Added node 11 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/422)
## 0.11.0
- Fixed double-install problem with node v10
- Significant N-API improvements (https://github.com/mapbox/node-pre-gyp/pull/405)
## 0.10.3
- Now will use `request` over `needle` if request is installed. By default `needle` is used for `https`. This should unbreak proxy support that regressed in v0.9.0
## 0.10.2
- Fixed rc/deep-extent security vulnerability
- Fixed broken reinstall script due to incorrectly named get_best_napi_version
## 0.10.1
- Fix needle error event (@medns)
## 0.10.0
- Allow for a single-level module path when packing @allenluce (https://github.com/mapbox/node-pre-gyp/pull/371)
- Log warnings instead of errors when falling back @xzyfer (https://github.com/mapbox/node-pre-gyp/pull/366)
- Add Node.js v10 support to tests (https://github.com/mapbox/node-pre-gyp/pull/372)
- Remove retire.js from CI (https://github.com/mapbox/node-pre-gyp/pull/372)
- Remove support for Node.js v4 due to [EOL on April 30th, 2018](https://github.com/nodejs/Release/blob/7dd52354049cae99eed0e9fe01345b0722a86fde/schedule.json#L14)
- Update appveyor tests to install default NPM version instead of NPM v2.x for all Windows builds (https://github.com/mapbox/node-pre-gyp/pull/375)
## 0.9.1
- Fixed regression (in v0.9.0) with support for http redirects @allenluce (https://github.com/mapbox/node-pre-gyp/pull/361)
## 0.9.0
- Switched from using `request` to `needle` to reduce size of module deps (https://github.com/mapbox/node-pre-gyp/pull/350)
## 0.8.0
- N-API support (@inspiredware)
## 0.7.1
- Upgraded to tar v4.x
## 0.7.0
- Updated request and hawk (#347)
- Dropped node v0.10.x support
## 0.6.40
- Improved error reporting if an install fails
## 0.6.39
- Support for node v9
- Support for versioning on `{libc}` to allow binaries to work on non-glibc linux systems like alpine linux
## 0.6.38
- Maintaining compatibility (for v0.6.x series) with node v0.10.x
## 0.6.37
- Solved one part of #276: we now deduce the node ABI from the major version for node >= 2 even when not stored in the abi_crosswalk.json
- Fixed docs to avoid mentioning the deprecated and dangerous `prepublish` in package.json (#291)
- Add new node versions to crosswalk
- Ported tests to use tape instead of mocha
- Got appveyor tests passing by downgrading npm and node-gyp
## 0.6.36
- Removed the running of `testbinary` during install. Because this was regressed for so long, it is too dangerous to re-enable by default. Developers needing validation can call `node-pre-gyp testbinary` directly.
- Fixed regression in v0.6.35 for electron installs (now skipping binary validation which is not yet supported for electron)
## 0.6.35
- No longer recommending `npm ls` in `prepublish` (#291)
- Fixed testbinary command (#283) @szdavid92
## 0.6.34
- Added new node versions to crosswalk, including v8
- Upgraded deps to latest versions, started using `^` instead of `~` for all deps.
## 0.6.33
- Improved support for yarn
## 0.6.32
- Honor npm configuration for CA bundles (@heikkipora)
- Add node-pre-gyp and npm versions to user agent (@addaleax)
- Updated various deps
- Add known node version for v7.x
## 0.6.31
- Updated various deps
## 0.6.30
- Update to npmlog@4.x and semver@5.3.x
- Add known node version for v6.5.0
## 0.6.29
- Add known node versions for v0.10.45, v0.12.14, v4.4.4, v5.11.1, and v6.1.0
## 0.6.28
- Now more verbose when remote binaries are not available. This is needed since npm is increasingly quiet by default
and users need to know why builds are falling back to source compiles that might then error out.
## 0.6.27
- Add known node version for node v6
- Stopped bundling dependencies
- Documented method for module authors to avoid bundling node-pre-gyp
- See https://github.com/mapbox/node-pre-gyp/tree/master#configuring for details
## 0.6.26
- Skip validation for nw runtime (https://github.com/mapbox/node-pre-gyp/pull/181) via @fleg
## 0.6.25
- Improved support for auto-detection of electron runtime in `node-pre-gyp.find()`
- Pull request from @enlight - https://github.com/mapbox/node-pre-gyp/pull/187
- Add known node version for 4.4.1 and 5.9.1
## 0.6.24
- Add known node version for 5.8.0, 5.9.0, and 4.4.0.
## 0.6.23
- Add known node version for 0.10.43, 0.12.11, 4.3.2, and 5.7.1.
## 0.6.22
- Add known node version for 4.3.1, and 5.7.0.
## 0.6.21
- Add known node version for 0.10.42, 0.12.10, 4.3.0, and 5.6.0.
## 0.6.20
- Add known node version for 4.2.5, 4.2.6, 5.4.0, 5.4.1,and 5.5.0.
## 0.6.19
- Add known node version for 4.2.4
## 0.6.18
- Add new known node versions for 0.10.x, 0.12.x, 4.x, and 5.x
## 0.6.17
- Re-tagged to fix packaging problem of `Error: Cannot find module 'isarray'`
## 0.6.16
- Added known version in crosswalk for 5.1.0.
## 0.6.15
- Upgraded tar-pack (https://github.com/mapbox/node-pre-gyp/issues/182)
- Support custom binary hosting mirror (https://github.com/mapbox/node-pre-gyp/pull/170)
- Added known version in crosswalk for 4.2.2.
## 0.6.14
- Added node 5.x version
## 0.6.13
- Added more known node 4.x versions
## 0.6.12
- Added support for [Electron](http://electron.atom.io/). Just pass the `--runtime=electron` flag when building/installing. Thanks @zcbenz
## 0.6.11
- Added known node and io.js versions including more 3.x and 4.x versions
## 0.6.10
- Added known node and io.js versions including 3.x and 4.x versions
- Upgraded `tar` dep
## 0.6.9
- Upgraded `rc` dep
- Updated known io.js version: v2.4.0
## 0.6.8
- Upgraded `semver` and `rimraf` deps
- Updated known node and io.js versions
## 0.6.7
- Fixed `node_abi` versions for io.js 1.1.x -> 1.8.x (should be 43, but was stored as 42) (refs https://github.com/iojs/build/issues/94)
## 0.6.6
- Updated with known io.js 2.0.0 version
## 0.6.5
- Now respecting `npm_config_node_gyp` (https://github.com/npm/npm/pull/4887)
- Updated to semver@4.3.2
- Updated known node v0.12.x versions and io.js 1.x versions.
## 0.6.4
- Improved support for `io.js` (@fengmk2)
- Test coverage improvements (@mikemorris)
- Fixed support for `--dist-url` that regressed in 0.6.3
## 0.6.3
- Added support for passing raw options to node-gyp using `--` separator. Flags passed after
the `--` to `node-pre-gyp configure` will be passed directly to gyp while flags passed
after the `--` will be passed directly to make/visual studio.
- Added `node-pre-gyp configure` command to be able to call `node-gyp configure` directly
- Fix issue with require validation not working on windows 7 (@edgarsilva)
## 0.6.2
- Support for io.js >= v1.0.2
- Deferred require of `request` and `tar` to help speed up command line usage of `node-pre-gyp`.
## 0.6.1
- Fixed bundled `tar` version
## 0.6.0
- BREAKING: node odd releases like v0.11.x now use `major.minor.patch` for `{node_abi}` instead of `NODE_MODULE_VERSION` (#124)
- Added support for `toolset` option in versioning. By default is an empty string but `--toolset` can be passed to publish or install to select alternative binaries that target a custom toolset like C++11. For example to target Visual Studio 2014 modules like node-sqlite3 use `--toolset=v140`.
- Added support for `--no-rollback` option to request that a failed binary test does not remove the binary module but leaves it in place.
- Added support for `--update-binary` option to request an existing binary be re-installed and the check for a valid local module be skipped.
- Added support for passing build options from `npm` through `node-pre-gyp` to `node-gyp`: `--nodedir`, `--disturl`, `--python`, and `--msvs_version`
## 0.5.31
- Added support for deducing node_abi for node.js runtime from previous release if the series is even
- Added support for --target=0.10.33
## 0.5.30
- Repackaged with latest bundled deps
## 0.5.29
- Added support for semver `build`.
- Fixed support for downloading from urls that include `+`.
## 0.5.28
- Now reporting unix style paths only in reveal command
## 0.5.27
- Fixed support for auto-detecting s3 bucket name when it contains `.` - @taavo
- Fixed support for installing when path contains a `'` - @halfdan
- Ported tests to mocha
## 0.5.26
- Fix node-webkit support when `--target` option is not provided
## 0.5.25
- Fix bundling of deps
## 0.5.24
- Updated ABI crosswalk to include node v0.10.30 and v0.10.31
## 0.5.23
- Added `reveal` command. Pass no options to get all versioning data as json. Pass a second arg to grab a single versioned property value
- Added support for `--silent` (shortcut for `--loglevel=silent`)
## 0.5.22
- Fixed node-webkit versioning name (NOTE: node-webkit support still experimental)
## 0.5.21
- New package to fix `shasum check failed` error with v0.5.20
## 0.5.20
- Now versioning node-webkit binaries based on major.minor.patch - assuming no compatible ABI across versions (#90)
## 0.5.19
- Updated to know about more node-webkit releases
## 0.5.18
- Updated to know about more node-webkit releases
## 0.5.17
- Updated to know about node v0.10.29 release
## 0.5.16
- Now supporting all aws-sdk configuration parameters (http://docs.aws.amazon.com/AWSJavaScriptSDK/guide/node-configuring.html) (#86)
## 0.5.15
- Fixed installation of windows packages sub directories on unix systems (#84)
## 0.5.14
- Finished support for cross building using `--target_platform` option (#82)
- Now skipping binary validation on install if target arch/platform do not match the host.
- Removed multi-arch validating for macOS since it required a FAT node.js binary
## 0.5.13
- Fix problem in 0.5.12 whereby the wrong versions of mkdirp and semver were bundled.
## 0.5.12
- Improved support for node-webkit (@Mithgol)
## 0.5.11
- Updated target versions listing
## 0.5.10
- Fixed handling of `-debug` flag passed directly to node-pre-gyp (#72)
- Added optional second arg to `node_pre_gyp.find` to customize the default versioning options used to locate the runtime binary
- Failed install due to `testbinary` check failure no longer leaves behind binary (#70)
## 0.5.9
- Fixed regression in `testbinary` command causing installs to fail on windows with 0.5.7 (#60)
## 0.5.8
- Started bundling deps
## 0.5.7
- Fixed the `testbinary` check, which is used to determine whether to re-download or source compile, to work even in complex dependency situations (#63)
- Exposed the internal `testbinary` command in node-pre-gyp command line tool
- Fixed minor bug so that `fallback_to_build` option is always respected
## 0.5.6
- Added support for versioning on the `name` value in `package.json` (#57).
- Moved to using streams for reading tarball when publishing (#52)
## 0.5.5
- Improved binary validation that also now works with node-webkit (@Mithgol)
- Upgraded test apps to work with node v0.11.x
- Improved test coverage
## 0.5.4
- No longer depends on external install of node-gyp for compiling builds.
## 0.5.3
- Reverted fix for debian/nodejs since it broke windows (#45)
## 0.5.2
- Support for debian systems where the node binary is named `nodejs` (#45)
- Added `bin/node-pre-gyp.cmd` to be able to run the command on windows locally (npm creates a `.cmd` shim automatically when globally installed)
- Updated abi-crosswalk with node v0.10.26 entry.
## 0.5.1
- Various minor bug fixes, several improving windows support for publishing.
## 0.5.0
- Changed property names in `binary` object: now required are `module_name`, `module_path`, and `host`.
- Now `module_path` supports versioning, which allows developers to opt-in to using a versioned install path (#18).
- Added `remote_path` which also supports versioning.
- Changed `remote_uri` to `host`.
## 0.4.2
- Added support for `--target` flag to request cross-compile against a specific node/node-webkit version.
- Added preliminary support for node-webkit
- Fixed support for `--target_arch` option being respected in all cases.
## 0.4.1
- Fixed exception when only stderr is available in binary test (@bendi / #31)
## 0.4.0
- Enforce only `https:` based remote publishing access.
- Added `node-pre-gyp info` command to display listing of published binaries
- Added support for changing the directory node-pre-gyp should build in with the `-C/--directory` option.
- Added support for S3 prefixes.
## 0.3.1
- Added `unpublish` command.
- Fixed module path construction in tests.
- Added ability to disable falling back to build behavior via `npm install --fallback-to-build=false` which overrides setting in a dependencies package.json `install` target.
## 0.3.0
- Support for packaging all files in `module_path` directory - see `app4` for example
- Added `testpackage` command.
- Changed `clean` command to only delete `.node` not entire `build` directory since node-gyp will handle that.
- `.node` modules must be in a folder of their own since tar-pack will remove everything when it unpacks.

1
node_modules/@mapbox/node-pre-gyp/CODEOWNERS generated vendored Normal file
View File

@@ -0,0 +1 @@
* @mapbox/maps-api

27
node_modules/@mapbox/node-pre-gyp/LICENSE generated vendored Normal file
View File

@@ -0,0 +1,27 @@
Copyright (c), Mapbox
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
* Neither the name of node-pre-gyp nor the names of its contributors
may be used to endorse or promote products derived from this software
without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

744
node_modules/@mapbox/node-pre-gyp/README.md generated vendored Normal file
View File

@@ -0,0 +1,744 @@
# @mapbox/node-pre-gyp
#### @mapbox/node-pre-gyp makes it easy to publish and install Node.js C++ addons from binaries
[![Build status](https://ci.appveyor.com/api/projects/status/3nxewb425y83c0gv)](https://ci.appveyor.com/project/Mapbox/node-pre-gyp)
`@mapbox/node-pre-gyp` stands between [npm](https://github.com/npm/npm) and [node-gyp](https://github.com/Tootallnate/node-gyp) and offers a cross-platform method of binary deployment.
### Special note on previous package
On Feb 9th, 2021 `@mapbox/node-pre-gyp@1.0.0` was [released](./CHANGELOG.md). Older, unscoped versions that are not part of the `@mapbox` org are deprecated and only `@mapbox/node-pre-gyp` will see updates going forward. To upgrade to the new package do:
```
npm uninstall node-pre-gyp --save
npm install @mapbox/node-pre-gyp --save
```
### Features
- A command line tool called `node-pre-gyp` that can install your package's C++ module from a binary.
- A variety of developer targeted commands for packaging, testing, and publishing binaries.
- A JavaScript module that can dynamically require your installed binary: `require('@mapbox/node-pre-gyp').find`
For a hello world example of a module packaged with `node-pre-gyp` see <https://github.com/springmeyer/node-addon-example> and [the wiki](https://github.com/mapbox/node-pre-gyp/wiki/Modules-using-node-pre-gyp) for real world examples.
## Credits
- The module is modeled after [node-gyp](https://github.com/Tootallnate/node-gyp) by [@Tootallnate](https://github.com/Tootallnate)
- Motivation for initial development came from [@ErisDS](https://github.com/ErisDS) and the [Ghost Project](https://github.com/TryGhost/Ghost).
- Development is sponsored by [Mapbox](https://www.mapbox.com/)
## FAQ
See the [Frequently Asked Questions](https://github.com/mapbox/node-pre-gyp/wiki/FAQ).
## Depends
We will attempt to track the [Node.js release schedule](https://github.com/nodejs/release#release-schedule) and will regularly retire support for versions that have reached EOL.
- v2: Node.js >= 18.x (unreleased)
- v1: Node.js >= 8.x
## Install
`node-pre-gyp` is designed to be installed as a local dependency of your Node.js C++ addon and accessed like:
./node_modules/.bin/node-pre-gyp --help
But you can also install it globally:
npm install @mapbox/node-pre-gyp -g
## Usage
### Commands
View all possible commands:
node-pre-gyp --help
- clean - Remove the entire folder containing the compiled .node module
- install - Install pre-built binary for module
- reinstall - Run "clean" and "install" at once
- build - Compile the module by dispatching to node-gyp or nw-gyp
- rebuild - Run "clean" and "build" at once
- package - Pack binary into tarball
- testpackage - Test that the staged package is valid
- publish - Publish pre-built binary
- unpublish - Unpublish pre-built binary
- info - Fetch info on published binaries
You can also chain commands:
node-pre-gyp clean build unpublish publish info
### Options
Options include:
- `-C/--directory`: run the command in this directory
- `--build-from-source`: build from source instead of using pre-built binary
- `--update-binary`: reinstall by replacing previously installed local binary with remote binary
- `--runtime=node-webkit`: customize the runtime: `node`, `electron` and `node-webkit` are the valid options
- `--fallback-to-build`: fallback to building from source if pre-built binary is not available
- `--target=0.4.0`: Pass the target node or node-webkit version to compile against
- `--target_arch=ia32`: Pass the target arch and override the host `arch`. Any value that is [supported by Node.js](https://nodejs.org/api/os.html#osarch) is valid.
- `--target_platform=win32`: Pass the target platform and override the host `platform`. Valid values are `linux`, `darwin`, `win32`, `sunos`, `freebsd`, `openbsd`, and `aix`.
Both `--build-from-source` and `--fallback-to-build` can be passed alone or they can provide values. You can pass `--fallback-to-build=false` to override the option as declared in package.json. In addition to being able to pass `--build-from-source` you can also pass `--build-from-source=myapp` where `myapp` is the name of your module.
For example: `npm install --build-from-source=myapp`. This is useful if:
- `myapp` is referenced in the package.json of a larger app and therefore `myapp` is being installed as a dependency with `npm install`.
- The larger app also depends on other modules installed with `node-pre-gyp`
- You only want to trigger a source compile for `myapp` and the other modules.
### Configuring
This is a guide to configuring your module to use node-pre-gyp.
#### 1) Add new entries to your `package.json`
- Add `@mapbox/node-pre-gyp` to `dependencies`
- Add `aws-sdk` as a `devDependency`
- Add a custom `install` script
- Declare a `binary` object
This looks like:
```js
"dependencies" : {
"@mapbox/node-pre-gyp": "1.x"
},
"devDependencies": {
"aws-sdk": "2.x"
},
"scripts": {
"install": "node-pre-gyp install --fallback-to-build"
},
"binary": {
"module_name": "your_module",
"module_path": "./lib/binding/",
"host": "https://your_module.s3-us-west-1.amazonaws.com"
}
```
For a full example see [node-addon-examples's package.json](https://github.com/springmeyer/node-addon-example/blob/master/package.json).
Let's break this down:
- Dependencies need to list `node-pre-gyp`
- Your devDependencies should list `aws-sdk` so that you can run `node-pre-gyp publish` locally or on a CI system. We recommend using `devDependencies` only since `aws-sdk` is large and not needed for `node-pre-gyp install`, which only uses http to fetch binaries
- Your `scripts` section should override the `install` target with `"install": "node-pre-gyp install --fallback-to-build"`. This allows node-pre-gyp to be used instead of the default npm behavior of always source compiling with `node-gyp` directly.
- Your package.json should contain a `binary` section describing key properties you provide to allow node-pre-gyp to package optimally. They are detailed below.
Note: in the past we recommended putting `@mapbox/node-pre-gyp` in the `bundledDependencies`, but we no longer recommend this. In the past there were npm bugs (with node versions 0.10.x) that could lead to node-pre-gyp not being available at the right time during install (unless we bundled). This should no longer be the case. Also, for a time we recommended using `"preinstall": "npm install @mapbox/node-pre-gyp"` as an alternative method to avoid needing to bundle. But this did not behave predictably across all npm versions - see https://github.com/mapbox/node-pre-gyp/issues/260 for the details. So we do not recommend using `preinstall` to install `@mapbox/node-pre-gyp`. More history on this at https://github.com/strongloop/fsevents/issues/157#issuecomment-265545908.
##### The `binary` object has three required properties
###### module_name
The name of your native node module. This value must:
- Match the name passed to [the NODE_MODULE macro](http://nodejs.org/api/addons.html#addons_hello_world)
- Be a valid C variable name (e.g. it cannot contain `-`)
- Not include the `.node` extension.
###### module_path
The location your native module is placed after a build. This should be an empty directory without other JavaScript files. This entire directory will be packaged in the binary tarball. When installing from a remote package this directory will be overwritten with the contents of the tarball.
Note: This property supports variables based on [Versioning](#versioning).
###### host
A url to the remote location where you've published tarball binaries (must be `https` not `http`).
It is highly recommended that you use Amazon S3. The reasons are:
- Various node-pre-gyp commands like `publish` and `info` only work with an S3 host.
- S3 is a very solid hosting platform for distributing large files.
- We provide detailed documentation for using [S3 hosting](#s3-hosting) with node-pre-gyp.
Why then not require S3? Because while some applications using node-pre-gyp need to distribute binaries as large as 20-30 MB, others might have very small binaries and might wish to store them in a GitHub repo. This is not recommended, but if an author really wants to host in a non-S3 location then it should be possible.
It should also be mentioned that there is an optional and entirely separate npm module called [node-pre-gyp-github](https://github.com/bchr02/node-pre-gyp-github) which is intended to complement node-pre-gyp and be installed along with it. It provides the ability to store and publish your binaries within your repository's GitHub Releases if you would rather not use S3 directly. Installation and usage instructions can be found [here](https://github.com/bchr02/node-pre-gyp-github), but the basic premise is that instead of using the ```node-pre-gyp publish``` command you would use ```node-pre-gyp-github publish```.
##### The `binary` object other optional S3 properties
If you are not using a standard s3 path like `bucket_name.s3(.-)region.amazonaws.com`, you might get an error on `publish` because node-pre-gyp extracts the region and bucket from the `host` url. For example, you may have an on-premises s3-compatible storage server, or may have configured a specific DNS name redirecting to an s3 endpoint. In these cases, you can explicitly set the `region` and `bucket` properties to tell node-pre-gyp to use these values instead of guessing from the `host` property. The following values can be used in the `binary` section:
###### host
The url to the remote server root location (must be `https` not `http`).
###### bucket
The bucket name where your tarball binaries should be located.
###### region
Your S3 server region.
###### s3ForcePathStyle
Set `s3ForcePathStyle` to true if the endpoint url should not be prefixed with the bucket name. If false (default), the server endpoint would be constructed as `bucket_name.your_server.com`.
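For illustration (all endpoint and bucket values here are hypothetical), a `binary` section for an s3-compatible server behind a custom DNS name might look like:
```js
"binary": {
    "module_name": "your_module",
    "module_path": "./lib/binding/",
    "host": "https://storage.internal.example.com",
    "bucket": "your-binaries",
    "region": "us-east-1",
    "s3ForcePathStyle": true
}
```
Here `bucket` and `region` are set explicitly because they cannot be parsed from the `host` url, and `s3ForcePathStyle` keeps the endpoint from being prefixed with the bucket name.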
##### The `binary` object has optional properties
###### remote_path
It **is recommended** that you customize this property. This is an extra path to use for publishing and finding remote tarballs. The default value for `remote_path` is `""` meaning that if you do not provide it then all packages will be published at the base of the `host`. It is recommended to provide a value like `./{name}/v{version}` to help organize remote packages in the case that you choose to publish multiple node addons to the same `host`.
Note: This property supports variables based on [Versioning](#versioning).
###### package_name
It is **not recommended** to override this property unless you are also overriding the `remote_path`. This is the versioned name of the remote tarball containing the binary `.node` module and any supporting files you've placed inside the `module_path` directory. Unless you specify `package_name` in your `package.json`, it defaults to `{module_name}-v{version}-{node_abi}-{platform}-{arch}.tar.gz` which allows your binary to work across node versions, platforms, and architectures. If you are using `remote_path` that is also versioned by `./{module_name}/v{version}` then you could remove these variables from the `package_name` and just use: `{node_abi}-{platform}-{arch}.tar.gz`. Then your remote tarball will be looked up at, for example, `https://example.com/your-module/v0.1.0/node-v11-linux-x64.tar.gz`.
Avoiding the version of your module in the `package_name` and instead only embedding in a directory name can be useful when you want to make a quick tag of your module that does not change any C++ code. In this case you can just copy binaries to the new version behind the scenes like:
```sh
aws s3 sync --acl public-read s3://mapbox-node-binary/sqlite3/v3.0.3/ s3://mapbox-node-binary/sqlite3/v3.0.4/
```
Note: This property supports variables based on [Versioning](#versioning).
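Putting `remote_path` and `package_name` together, a hypothetical configuration using the versioned layout described above would be:
```js
"binary": {
    "module_name": "your_module",
    "module_path": "./lib/binding/",
    "remote_path": "./{name}/v{version}",
    "package_name": "{node_abi}-{platform}-{arch}.tar.gz",
    "host": "https://your_module.s3-us-west-1.amazonaws.com"
}
```
With these values (and assuming the package is named `your_module`), a v0.1.0 tarball for node ABI 11 on linux x64 would be published to and fetched from `https://your_module.s3-us-west-1.amazonaws.com/your_module/v0.1.0/node-v11-linux-x64.tar.gz`.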
#### 2) Add a new target to binding.gyp
`node-pre-gyp` calls out to `node-gyp` to compile the module and passes variables along like [module_name](#module_name) and [module_path](#module_path).
A new target must be added to `binding.gyp` that moves the compiled `.node` module from `./build/Release/module_name.node` into the directory specified by `module_path`.
Add a target like this at the end of your `targets` list:
```js
{
"target_name": "action_after_build",
"type": "none",
"dependencies": [ "<(module_name)" ],
"copies": [
{
"files": [ "<(PRODUCT_DIR)/<(module_name).node" ],
"destination": "<(module_path)"
}
]
}
```
For a full example see [node-addon-example's binding.gyp](https://github.com/springmeyer/node-addon-example/blob/2ff60a8ded7f042864ad21db00c3a5a06cf47075/binding.gyp).
#### 3) Dynamically require your `.node`
Inside the main js file that requires your addon module you are likely currently doing:
```js
var binding = require('../build/Release/binding.node');
```
or:
```js
var bindings = require('./bindings')
```
Change those lines to:
```js
var binary = require('@mapbox/node-pre-gyp');
var path = require('path');
var binding_path = binary.find(path.resolve(path.join(__dirname,'./package.json')));
var binding = require(binding_path);
```
For a full example see [node-addon-example's index.js](https://github.com/springmeyer/node-addon-example/blob/2ff60a8ded7f042864ad21db00c3a5a06cf47075/index.js#L1-L4)
#### 4) Build and package your app
Now build your module from source:
npm install --build-from-source
The `--build-from-source` flag tells `node-pre-gyp` not to look for a remote package and instead dispatch to node-gyp to build.
`node-pre-gyp` should now also be installed as a local dependency, so the command line tool it offers can be found at `./node_modules/.bin/node-pre-gyp`.
#### 5) Test
Now `npm test` should work just as it did before.
#### 6) Publish the tarball
Then package your app:
./node_modules/.bin/node-pre-gyp package
Once packaged, you can publish:
./node_modules/.bin/node-pre-gyp publish
Currently the `publish` command pushes your binary to S3. This requires:
- You have installed `aws-sdk` with `npm install aws-sdk`
- You have created a bucket already.
- The `host` points to an S3 http or https endpoint.
- You have configured node-pre-gyp to read your S3 credentials (see [S3 hosting](#s3-hosting) for details).
You can also host your binaries elsewhere. To do this requires:
- You manually publish the binary created by the `package` command to an `https` endpoint
- Ensure that the `host` value points to your custom `https` endpoint.
#### 7) Automate builds
Now you need to publish builds for all the platforms and node versions you wish to support. This is best automated.
- See [Appveyor Automation](#appveyor-automation) for how to auto-publish builds on Windows.
- See [Travis Automation](#travis-automation) for how to auto-publish builds on OS X and Linux.
#### 8) You're done!
Now publish your module to the npm registry. Users will now be able to install your module from a binary.
What will happen is this:
1. `npm install <your package>` will pull from the npm registry
2. npm will run the `install` script which will call out to `node-pre-gyp`
3. `node-pre-gyp` will fetch the binary `.node` module and unpack in the right place
4. Assuming that all worked, you are done
If a binary was not available for a given platform and `--fallback-to-build` was used then `node-gyp rebuild` will be called to try to source compile the module.
#### 9) One more option
It may be that you want to work with two S3 buckets, one for staging and one for production; this
arrangement makes it less likely that you will accidentally overwrite a production binary. It also allows the production
environment to have more restrictive permissions than staging while still enabling publishing when
developing and testing.
The `binary.host` property can be set at execution time. In order for this to happen, all of the following conditions
must be true:
- `binary.host` is falsey or not present
- `binary.staging_host` is not empty
- `binary.production_host` is not empty
If any of these checks fail then `node-pre-gyp` will not determine the S3 target at execution time.
If the command being executed is either "publish" or "unpublish" then the default is set to `binary.staging_host`. In all other cases
the default is `binary.production_host`.
The command-line options `--s3_host=staging` or `--s3_host=production` override the default. If `s3_host`
is present but is neither `staging` nor `production`, an exception is thrown.
This allows installing from staging by specifying `--s3_host=staging`, and it requires specifying
`--s3_host=production` in order to publish to, or unpublish from, production, making accidental errors less likely.
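Here is a sketch of a `binary` stanza arranged this way (bucket URLs illustrative); note that `host` itself is deliberately omitted so the execution-time selection can happen:
```js
"binary": {
    "module_name": "your_module",
    "module_path": "./lib/binding/",
    "remote_path": "./{module_name}/v{version}/",
    "package_name": "{node_abi}-{platform}-{arch}.tar.gz",
    "staging_host": "https://your-staging-bucket.s3-us-west-1.amazonaws.com",
    "production_host": "https://your-production-bucket.s3-us-west-1.amazonaws.com"
}
```
With this in place, `node-pre-gyp publish` targets staging by default, while `node-pre-gyp publish --s3_host=production` is required to touch the production bucket.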
## Node-API Considerations
[Node-API](https://nodejs.org/api/n-api.html#n_api_node_api), which was previously known as N-API, is an ABI-stable alternative to previous technologies such as [nan](https://github.com/nodejs/nan) which are tied to a specific Node runtime engine. Node-API is Node runtime engine agnostic and guarantees modules created today will continue to run, without changes, into the future.
Using `node-pre-gyp` with Node-API projects requires a handful of additional configuration values and imposes some additional requirements.
The most significant difference is that a Node-API module can be coded to target multiple Node-API versions. Therefore, a Node-API module must declare in its `package.json` file which Node-API versions the module is designed to run against. In addition, since multiple builds may be required for a single module, path and file names must be specified in a way that avoids naming conflicts.
### The `napi_versions` array property
A Node-API module must declare in its `package.json` file the Node-API versions the module is intended to support. This is accomplished by including a `napi_versions` array property in the `binary` object. For example:
```js
"binary": {
"module_name": "your_module",
"module_path": "your_module_path",
"host": "https://your_bucket.s3-us-west-1.amazonaws.com",
"napi_versions": [1,3]
}
```
If the `napi_versions` array property is *not* present, `node-pre-gyp` operates as it always has. Including the `napi_versions` array property instructs `node-pre-gyp` that this is a Node-API module build.
When the `napi_versions` array property is present, `node-pre-gyp` fires off multiple operations, one for each of the Node-API versions in the array. In the example above, two operations are initiated, one for Node-API version 1 and a second for Node-API version 3. How this version number is communicated is described next.
### The `napi_build_version` value
For each of the Node-API module operations `node-pre-gyp` initiates, it ensures that the `napi_build_version` is set appropriately.
This value is of importance in two areas:
1. The C/C++ code which needs to know against which Node-API version it should compile.
2. `node-pre-gyp` itself which must assign appropriate path and file names to avoid collisions.
### Defining `NAPI_VERSION` for the C/C++ code
The `napi_build_version` value is communicated to the C/C++ code by adding this code to the `binding.gyp` file:
```
"defines": [
"NAPI_VERSION=<(napi_build_version)",
]
```
This ensures that `NAPI_VERSION`, an integer value, is declared appropriately to the C/C++ code for each build.
> Note that earlier versions of this document recommended defining the symbol `NAPI_BUILD_VERSION`. `NAPI_VERSION` is preferred because it is used by the Node-API C/C++ headers to configure the specific Node-API versions being requested.
### Path and file naming requirements in `package.json`
Since `node-pre-gyp` fires off multiple operations for each request, it is essential that path and file names be created in such a way as to avoid collisions. This is accomplished by imposing additional path and file naming requirements.
Specifically, when performing Node-API builds, the `{napi_build_version}` text configuration value *must* be present in the `module_path` property. In addition, the `{napi_build_version}` text configuration value *must* be present in either the `remote_path` or `package_name` property. (No problem if it's in both.)
Here's an example:
```js
"binary": {
"module_name": "your_module",
"module_path": "./lib/binding/napi-v{napi_build_version}",
"remote_path": "./{module_name}/v{version}/{configuration}/",
"package_name": "{platform}-{arch}-napi-v{napi_build_version}.tar.gz",
"host": "https://your_bucket.s3-us-west-1.amazonaws.com",
"napi_versions": [1,3]
}
```
## Supporting both Node-API and NAN builds
You may have a legacy native add-on that you wish to continue supporting for those versions of Node that do not support Node-API, as you add Node-API support for later Node versions. This can be accomplished by specifying the `node_napi_label` configuration value in the package.json `binary.package_name` property.
Placing the configuration value `node_napi_label` in the package.json `binary.package_name` property instructs `node-pre-gyp` to build all viable Node-API binaries supported by the current Node instance. If the current Node instance does not support Node-API, `node-pre-gyp` will request a traditional, non-Node-API build.
The configuration value `node_napi_label` is set by `node-pre-gyp` to the type of build created, `napi` or `node`, and the version number. For Node-API builds, the string contains the Node-API version and has values like `napi-v3`. For traditional, non-Node-API builds, the string contains the ABI version with values like `node-v46`.
Here's how the `binary` configuration above might be changed to support both Node-API and NAN builds:
```js
"binary": {
"module_name": "your_module",
"module_path": "./lib/binding/{node_napi_label}",
"remote_path": "./{module_name}/v{version}/{configuration}/",
"package_name": "{platform}-{arch}-{node_napi_label}.tar.gz",
"host": "https://your_bucket.s3-us-west-1.amazonaws.com",
"napi_versions": [1,3]
}
```
The C/C++ symbol `NAPI_VERSION` can be used to distinguish Node-API and non-Node-API builds. The value of `NAPI_VERSION` is set to the integer Node-API version for Node-API builds and is set to `0` for non-Node-API builds.
For example:
```C
#if NAPI_VERSION
// Node-API code goes here
#else
// NAN code goes here
#endif
```
### Two additional configuration values
The following two configuration values, which were implemented in previous versions of `node-pre-gyp`, continue to exist, but have been superseded by the `node_napi_label` configuration value described above.
1. `napi_version` If Node-API is supported by the currently executing Node instance, this value is the Node-API version number supported by Node. If Node-API is not supported, this value is an empty string.
2. `node_abi_napi` If the value returned for `napi_version` is non-empty, this value is `'napi'`. If the value returned for `napi_version` is empty, this value is the value returned for `node_abi`.
These values are present for use in the `binding.gyp` file and may be used as `{napi_version}` and `{node_abi_napi}` for text substitution in the `binary` properties of the `package.json` file.
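For example, a hypothetical `package_name` template using `{node_abi_napi}` could look like:
```js
"package_name": "{module_name}-v{version}-{node_abi_napi}-{platform}-{arch}.tar.gz"
```
On a Node instance that supports Node-API this substitutes `napi`; otherwise it substitutes the `node_abi` value, e.g. `node-v46`.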
## S3 Hosting
You can host wherever you choose but S3 is cheap, `node-pre-gyp publish` expects it, and S3 can be integrated well with [Travis.ci](http://travis-ci.org) to automate builds for OS X and Ubuntu, and with [Appveyor](http://appveyor.com) to automate builds for Windows. Here is an approach to do this:
First, get set up locally and test the workflow:
#### 1) Create an S3 bucket
And have your **key** and **secret key** ready for writing to the bucket.
It is recommended to create an IAM user with a policy that only gives permissions to the specific bucket you plan to publish to. This can be done in the [IAM console](https://console.aws.amazon.com/iam/) by: 1) adding a new user, 2) choosing `Attach User Policy`, 3) using the `Policy Generator`, 4) selecting `Amazon S3` for the service, 5) adding the actions: `DeleteObject`, `GetObject`, `GetObjectAcl`, `ListBucket`, `HeadBucket`, `PutObject`, `PutObjectAcl`, 6) adding an ARN of `arn:aws:s3:::bucket/*` (replacing `bucket` with your bucket name), and finally 7) clicking `Add Statement` and saving the policy. It should generate a policy like:
```js
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "objects",
"Effect": "Allow",
"Action": [
"s3:PutObject",
"s3:GetObjectAcl",
"s3:GetObject",
"s3:DeleteObject",
"s3:PutObjectAcl"
],
"Resource": "arn:aws:s3:::your-bucket-name/*"
},
{
"Sid": "bucket",
"Effect": "Allow",
"Action": "s3:ListBucket",
"Resource": "arn:aws:s3:::your-bucket-name"
},
{
"Sid": "buckets",
"Effect": "Allow",
"Action": "s3:HeadBucket",
"Resource": "*"
}
]
}
```
#### 2) Install node-pre-gyp
Either install it globally:
npm install node-pre-gyp -g
Or put the local version on your PATH:
export PATH=`pwd`/node_modules/.bin/:$PATH
#### 3) Configure AWS credentials
It is recommended to configure the AWS JS SDK v2 used internally by `node-pre-gyp` by setting these environment variables:
- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY
You can also use the `Shared Config File` mentioned [in the AWS JS SDK v2 docs](https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/configuring-the-jssdk.html).
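For example, in the shell you will run `node-pre-gyp` from (placeholder values):
```sh
export AWS_ACCESS_KEY_ID="<your-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-secret-access-key>"
```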
#### 4) Package and publish your build
Install the `aws-sdk`:
npm install aws-sdk
Then publish:
node-pre-gyp package publish
Note: if you hit an error like `Hostname/IP doesn't match certificate's altnames` it may mean that you need to provide the `region` option in your config.
## Appveyor Automation
[Appveyor](http://www.appveyor.com/) can build binaries and publish the results per commit and supports:
- Windows Visual Studio 2013 and related compilers
- Both 64 bit (x64) and 32 bit (x86) build configurations
- Multiple Node.js versions
For an example of doing this see [node-sqlite3's appveyor.yml](https://github.com/mapbox/node-sqlite3/blob/master/appveyor.yml).
Below is a guide to getting set up:
#### 1) Create a free Appveyor account
Go to https://ci.appveyor.com/signup/free and sign in with your GitHub account.
#### 2) Create a new project
Go to https://ci.appveyor.com/projects/new and select the GitHub repo for your module.
#### 3) Add appveyor.yml and push it
Once you have committed an `appveyor.yml` ([appveyor.yml reference](http://www.appveyor.com/docs/appveyor-yml)) to your GitHub repo and pushed it, AppVeyor should automatically start building your project.
#### 4) Create secure variables
Encrypt your S3 AWS keys by going to <https://ci.appveyor.com/tools/encrypt> and hitting the `encrypt` button.
Then paste the result into your `appveyor.yml`:
```yml
environment:
AWS_ACCESS_KEY_ID:
secure: Dn9HKdLNYvDgPdQOzRq/DqZ/MPhjknRHB1o+/lVU8MA=
AWS_SECRET_ACCESS_KEY:
secure: W1rwNoSnOku1r+28gnoufO8UA8iWADmL1LiiwH9IOkIVhDTNGdGPJqAlLjNqwLnL
```
NOTE: keys are per account but not per repo (this is different from Travis, where keys are per repo but not related to the account used to encrypt them).
#### 5) Hook up publishing
Just put `node-pre-gyp package publish` in your `appveyor.yml` after `npm install`.
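A minimal sketch of that section (the rest of the `appveyor.yml` is omitted):
```yml
install:
  - npm install --build-from-source
  - node-pre-gyp package publish
```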
#### 6) Publish when you want
You might wish to publish binaries only on a specific commit. To do this you could borrow from the [Travis CI idea of commit keywords](http://about.travis-ci.org/docs/user/how-to-skip-a-build/) and add special handling for commit messages with `[publish binary]`:
SET CM=%APPVEYOR_REPO_COMMIT_MESSAGE%
if not "%CM%" == "%CM:[publish binary]=%" node-pre-gyp --msvs_version=2013 publish
If your commit message contains special characters (e.g. `&`) this method might fail. An alternative is to use PowerShell, which gives you additional possibilities, like ignoring case by using `ToLower()`:
ps: if($env:APPVEYOR_REPO_COMMIT_MESSAGE.ToLower().Contains('[publish binary]')) { node-pre-gyp --msvs_version=2013 publish }
Remember this publishing is not the same as `npm publish`. We're just talking about the binary module here and not your entire npm package.
## Travis Automation
[Travis](https://travis-ci.org/) can push to S3 after a successful build and supports both:
- Ubuntu Precise and OS X (64 bit)
- Multiple Node.js versions
For an example of doing this see [node-addon-example's .travis.yml](https://github.com/springmeyer/node-addon-example/blob/2ff60a8ded7f042864ad21db00c3a5a06cf47075/.travis.yml).
Note: if you need 32 bit binaries, this can be done from a 64 bit Travis machine. See [the node-sqlite3 scripts for an example of doing this](https://github.com/mapbox/node-sqlite3/blob/bae122aa6a2b8a45f6b717fab24e207740e32b5d/scripts/build_against_node.sh#L54-L74).
Below is a guide to getting set up:
#### 1) Install the Travis gem
gem install travis
#### 2) Create secure variables
Make sure you run this command from within the directory of your module.
Use `travis-encrypt` like:
travis encrypt AWS_ACCESS_KEY_ID=${node_pre_gyp_accessKeyId}
travis encrypt AWS_SECRET_ACCESS_KEY=${node_pre_gyp_secretAccessKey}
Then put those values in your `.travis.yml` like:
```yaml
env:
global:
- secure: F+sEL/v56CzHqmCSSES4pEyC9NeQlkoR0Gs/ZuZxX1ytrj8SKtp3MKqBj7zhIclSdXBz4Ev966Da5ctmcTd410p0b240MV6BVOkLUtkjZJyErMBOkeb8n8yVfSoeMx8RiIhBmIvEn+rlQq+bSFis61/JkE9rxsjkGRZi14hHr4M=
- secure: o2nkUQIiABD139XS6L8pxq3XO5gch27hvm/gOdV+dzNKc/s2KomVPWcOyXNxtJGhtecAkABzaW8KHDDi5QL1kNEFx6BxFVMLO8rjFPsMVaBG9Ks6JiDQkkmrGNcnVdxI/6EKTLHTH5WLsz8+J7caDBzvKbEfTux5EamEhxIWgrI=
```
More details on Travis encryption at http://about.travis-ci.org/docs/user/encryption-keys/.
#### 3) Hook up publishing
Just put `node-pre-gyp package publish` in your `.travis.yml` after `npm install`.
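For example (a sketch; the other sections of `.travis.yml` are omitted):
```yaml
install:
  - npm install --build-from-source
  - node-pre-gyp package publish
```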
##### OS X publishing
If you want binaries for OS X in addition to Linux you can enable [multi-os for Travis](http://docs.travis-ci.com/user/multi-os/#Setting-.travis.yml).
Use a configuration like:
```yml
language: cpp
os:
- linux
- osx
env:
matrix:
- NODE_VERSION="4"
- NODE_VERSION="6"
before_install:
- rm -rf ~/.nvm/ && git clone --depth 1 https://github.com/creationix/nvm.git ~/.nvm
- source ~/.nvm/nvm.sh
- nvm install $NODE_VERSION
- nvm use $NODE_VERSION
```
See [Travis OS X Gotchas](#travis-os-x-gotchas) for why we replace `language: node_js` and `node_js:` sections with `language: cpp` and a custom matrix.
Also create platform-specific sections for any deps that need installing. For example, if you need libpng:
```yml
- if [ $(uname -s) == 'Linux' ]; then apt-get install libpng-dev; fi;
- if [ $(uname -s) == 'Darwin' ]; then brew install libpng; fi;
```
For detailed multi-OS examples see [node-mapnik](https://github.com/mapnik/node-mapnik/blob/master/.travis.yml) and [node-sqlite3](https://github.com/mapbox/node-sqlite3/blob/master/.travis.yml).
##### Travis OS X Gotchas
First, unlike the Travis Linux machines, the OS X machines do not put `node-pre-gyp` on the PATH by default. To add it you will need to:
```sh
export PATH=$(pwd)/node_modules/.bin:${PATH}
```
Second, the OS X machines do not support using a matrix for installing different Node.js versions. So you need to bootstrap the installation of Node.js in a cross-platform way, by doing:
```yml
env:
matrix:
- NODE_VERSION="4"
- NODE_VERSION="6"
before_install:
- rm -rf ~/.nvm/ && git clone --depth 1 https://github.com/creationix/nvm.git ~/.nvm
- source ~/.nvm/nvm.sh
- nvm install $NODE_VERSION
- nvm use $NODE_VERSION
```
This easily recreates the behavior you would otherwise get from a `node_js` matrix:
```yml
node_js:
- "4"
- "6"
```
#### 4) Publish when you want
You might wish to publish binaries only on a specific commit. To do this you could borrow from the [Travis CI idea of commit keywords](http://about.travis-ci.org/docs/user/how-to-skip-a-build/) and add special handling for commit messages with `[publish binary]`:
COMMIT_MESSAGE=$(git log --format=%B --no-merges -n 1 | tr -d '\n')
if [[ ${COMMIT_MESSAGE} =~ "[publish binary]" ]]; then node-pre-gyp publish; fi;
Then you can trigger new binaries to be built like:
git commit -a -m "[publish binary]"
Or, if you don't have any changes to make, simply run:
git commit --allow-empty -m "[publish binary]"
WARNING: if you are working in a pull request and publishing binaries from there then you will want to avoid double publishing when Travis CI builds both the `push` and `pr`. You only want to run the publish on the `push` commit. See https://github.com/Project-OSRM/node-osrm/blob/8eb837abe2e2e30e595093d16e5354bc5c573575/scripts/is_pr_merge.sh which is called from https://github.com/Project-OSRM/node-osrm/blob/8eb837abe2e2e30e595093d16e5354bc5c573575/scripts/publish.sh for an example of how to do this.
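A minimal guard along those lines (a sketch, relying on Travis's standard `TRAVIS_PULL_REQUEST` variable, which is `false` on push builds and holds the PR number on pull-request builds):
```sh
# only publish on push builds, not on pull-request builds
if [[ "${TRAVIS_PULL_REQUEST}" == "false" ]]; then
  node-pre-gyp package publish
fi
```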
Remember this publishing is not the same as `npm publish`. We're just talking about the binary module here and not your entire npm package. To automate the publishing of your entire package to npm on Travis see http://about.travis-ci.org/docs/user/deployment/npm/
# Versioning
The `binary` properties of `module_path`, `remote_path`, and `package_name` support variable substitution. The strings are evaluated by `node-pre-gyp` depending on your system and any custom build flags you passed.
- `node_abi`: The node C++ `ABI` number. This value is available in JavaScript as `process.versions.modules` as of [`>= v0.10.4 >= v0.11.7`](https://github.com/joyent/node/commit/ccabd4a6fa8a6eb79d29bc3bbe9fe2b6531c2d8e) and in C++ as the `NODE_MODULE_VERSION` define much earlier. For versions of Node before this was available we fall back to the V8 major and minor version.
- `platform` matches node's `process.platform` like `linux`, `darwin`, and `win32` unless the user passed the `--target_platform` option to override.
- `arch` matches node's `process.arch` like `x64` or `ia32` unless the user passes the `--target_arch` option to override.
- `libc` matches `require('detect-libc').family` like `glibc` or `musl` unless the user passes the `--target_libc` option to override.
- `configuration` - Either 'Release' or 'Debug' depending on whether `--debug` is passed during the build.
- `module_name` - the `binary.module_name` attribute from `package.json`.
- `version` - the semver `version` value for your module from `package.json` (NOTE: ignores the `semver.build` property).
- `major`, `minor`, `patch`, and `prerelease` match the individual semver values for your module's `version`
- `build` - the semver `build` value. For example it would be `this.that` if your package.json `version` was `v1.0.0+this.that`
- `prerelease` - the semver `prerelease` value. For example it would be `alpha.beta` if your package.json `version` was `v1.0.0-alpha.beta`
The options are visible in the code at <https://github.com/mapbox/node-pre-gyp/blob/612b7bca2604508d881e1187614870ba19a7f0c5/lib/util/versioning.js#L114-L127>
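As a worked example (all values illustrative): for a module named `your_module` at `version` `1.0.0-alpha.2`, built on 64-bit Linux under a Node whose ABI number is 108, the default template evaluates to:
```
{module_name}-v{version}-{node_abi}-{platform}-{arch}.tar.gz
=> your_module-v1.0.0-alpha.2-node-v108-linux-x64.tar.gz
```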
# Download binary files from a mirror
S3 is blocked in China, for well-known reasons.
Using the `npm` config argument `--{module_name}_binary_host_mirror`, you can download binary files through a mirror; any `-` in `module_name` is replaced with `_`.
e.g.: Install [v8-profiler](https://www.npmjs.com/package/v8-profiler) from `npm`.
```bash
$ npm install v8-profiler --profiler_binary_host_mirror=https://npm.taobao.org/mirrors/node-inspector/
```
e.g.: Install [canvas-prebuilt](https://www.npmjs.com/package/canvas-prebuilt) from `npm`.
```bash
$ npm install canvas-prebuilt --canvas_prebuilt_binary_host_mirror=https://npm.taobao.org/mirrors/canvas-prebuilt/
```

node_modules/@mapbox/node-pre-gyp/RELEASE.md generated vendored Normal file

@@ -0,0 +1,23 @@
# Instructions for making a release
1. Change the version number in `package.json`. Run the following command in the package root directory, replacing `<update_type>` with one of the semantic versioning release types (prerelease, prepatch, preminor, premajor, patch, minor, major):
```
npm version <update_type> --preid pre --no-git-tag-version
```
`--preid` specifies which suffix to use in the release such as `pre`, `next`, `beta`, `rc`, etc.
`prepatch`, `preminor`, and `premajor` start a new series of pre-releases while bumping the patch, minor, or major version. E.g. `premajor` with `--preid pre` would do a prerelease for a new major using the `-pre` suffix (i.e. it would be a new major with `-pre.0`)
You can use `prerelease` to bump the version for a new pre-release version. E.g. you could run `npm version prerelease --preid pre --no-git-tag-version` to go from `-pre.0` to `-pre.1`.
For regular versions, you can use `patch`, `minor`, or `major`. E.g. `npm version major --no-git-tag-version`.
2. Update the changelog, which can be found in `CHANGELOG.md`. The heading must match `## <VERSION>` exactly, or it will not be picked up. For example, for version 1.0.11:
```
## 1.0.11
```
3. Commit and push the changes. On push, the release workflow will automatically check if the release has been published on npm. If the release has not yet been published, the workflow will update the abi crosswalk file and publish a new npm release.

node_modules/@mapbox/node-pre-gyp/bin/node-pre-gyp generated vendored Executable file

@@ -0,0 +1,4 @@
#!/usr/bin/env node
'use strict';
require('../lib/main');


@@ -0,0 +1,2 @@
@echo off
node "%~dp0\node-pre-gyp" %*


@@ -0,0 +1,6 @@
- Supported Node versions are now stable versions of Node 18+. We will attempt to track the [Node.js release schedule](https://github.com/nodejs/release#release-schedule) and will regularly retire support for versions that have reached EOL.
- Fixed use of `s3ForcePathStyle` for installation [#650](https://github.com/mapbox/node-pre-gyp/pull/650)
- Upgraded to https-proxy-agent 7.0.5, nopt 8.0.0, semver 7.5.3, and tar 7.4.0
- Replaced npmlog with consola
- Removed rimraf and make-dir as dependencies

node_modules/@mapbox/node-pre-gyp/contributing.md generated vendored Normal file

@@ -0,0 +1,10 @@
# Contributing
### Releasing a new version:
- Ensure tests are passing on travis and appveyor
- Run `node scripts/abi_crosswalk.js` and commit any changes
- Update the changelog
- Tag a new release like: `git tag -a v0.6.34 -m "tagging v0.6.34" && git push --tags`
- Run `npm publish`

node_modules/@mapbox/node-pre-gyp/lib/build.js generated vendored Normal file

@@ -0,0 +1,51 @@
'use strict';
module.exports = exports = build;
exports.usage = 'Attempts to compile the module by dispatching to node-gyp or nw-gyp';
const napi = require('./util/napi.js');
const compile = require('./util/compile.js');
const handle_gyp_opts = require('./util/handle_gyp_opts.js');
const configure = require('./configure.js');
function do_build(gyp, argv, callback) {
handle_gyp_opts(gyp, argv, (err, result) => {
let final_args = ['build'].concat(result.gyp).concat(result.pre);
if (result.unparsed.length > 0) {
final_args = final_args.
concat(['--']).
concat(result.unparsed);
}
if (!err && result.opts.napi_build_version) {
napi.swap_build_dir_in(result.opts.napi_build_version);
}
compile.run_gyp(final_args, result.opts, (err2) => {
if (result.opts.napi_build_version) {
napi.swap_build_dir_out(result.opts.napi_build_version);
}
return callback(err2);
});
});
}
function build(gyp, argv, callback) {
// Form up commands to pass to node-gyp:
// We map `node-pre-gyp build` to `node-gyp configure build` so that we do not
// trigger a clean and therefore do not pay the penalty of a full recompile
if (argv.length && (argv.indexOf('rebuild') > -1)) {
argv.shift(); // remove `rebuild`
// here we map `node-pre-gyp rebuild` to `node-gyp rebuild` which internally means
// "clean + configure + build" and triggers a full recompile
compile.run_gyp(['clean'], {}, (err3) => {
if (err3) return callback(err3);
configure(gyp, argv, (err4) => {
if (err4) return callback(err4);
return do_build(gyp, argv, callback);
});
});
} else {
return do_build(gyp, argv, callback);
}
}

node_modules/@mapbox/node-pre-gyp/lib/clean.js generated vendored Normal file

@@ -0,0 +1,31 @@
'use strict';
module.exports = exports = clean;
exports.usage = 'Removes the entire folder containing the compiled .node module';
const fs = require('fs');
const exists = fs.exists || require('path').exists;
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
const path = require('path');
function clean(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
const to_delete = opts.module_path;
if (!to_delete) {
return callback(new Error('module_path is empty, refusing to delete'));
} else if (path.normalize(to_delete) === path.normalize(process.cwd())) {
return callback(new Error('module_path is not set, refusing to delete'));
} else {
exists(to_delete, (found) => {
if (found) {
if (!gyp.opts.silent_clean) console.log('[' + package_json.name + '] Removing "%s"', to_delete);
return fs.rm(to_delete, { recursive: true, force: true }, callback);
}
return callback();
});
}
}

node_modules/@mapbox/node-pre-gyp/lib/configure.js generated vendored Normal file

@@ -0,0 +1,52 @@
'use strict';
module.exports = exports = configure;
exports.usage = 'Attempts to configure node-gyp or nw-gyp build';
const napi = require('./util/napi.js');
const compile = require('./util/compile.js');
const handle_gyp_opts = require('./util/handle_gyp_opts.js');
function configure(gyp, argv, callback) {
handle_gyp_opts(gyp, argv, (err, result) => {
let final_args = result.gyp.concat(result.pre);
// pull select node-gyp configure options out of the npm environ
const known_gyp_args = ['dist-url', 'python', 'nodedir', 'msvs_version'];
known_gyp_args.forEach((key) => {
const val = gyp.opts[key] || gyp.opts[key.replace('-', '_')];
if (val) {
final_args.push('--' + key + '=' + val);
}
});
// --ensure=false tells node-gyp to re-install node development headers
// but it is only respected by node-gyp install, so we have to call install
// as a separate step if the user passes it
if (gyp.opts.ensure === false) {
const install_args = final_args.concat(['install', '--ensure=false']);
compile.run_gyp(install_args, result.opts, (err2) => {
if (err2) return callback(err2);
if (result.unparsed.length > 0) {
final_args = final_args.
concat(['--']).
concat(result.unparsed);
}
compile.run_gyp(['configure'].concat(final_args), result.opts, (err3) => {
return callback(err3);
});
});
} else {
if (result.unparsed.length > 0) {
final_args = final_args.
concat(['--']).
concat(result.unparsed);
}
compile.run_gyp(['configure'].concat(final_args), result.opts, (err4) => {
if (!err4 && result.opts.napi_build_version) {
napi.swap_build_dir_out(result.opts.napi_build_version);
}
return callback(err4);
});
}
});
}

node_modules/@mapbox/node-pre-gyp/lib/info.js generated vendored Normal file

@@ -0,0 +1,37 @@
'use strict';
module.exports = exports = info;
exports.usage = 'Lists all published binaries (requires aws-sdk)';
const log = require('./util/log.js');
const versioning = require('./util/versioning.js');
const s3_setup = require('./util/s3_setup.js');
function info(gyp, argv, callback) {
const package_json = gyp.package_json;
const opts = versioning.evaluate(package_json, gyp.opts);
const config = s3_setup.detect(opts);
const s3 = s3_setup.get_s3(config);
const s3_opts = {
Bucket: config.bucket,
Prefix: config.prefix
};
s3.listObjects(s3_opts, (err, meta) => {
if (err && err.code === 'NotFound') {
return callback(new Error('[' + package_json.name + '] Not found: https://' + opts.hosted_path));
} else if (err) {
return callback(err);
} else {
log.verbose(JSON.stringify(meta, null, 1));
if (meta && meta.Contents.length) {
meta.Contents.forEach((obj) => {
console.log(obj.Key);
});
} else {
console.error('[' + package_json.name + '] Not found: No objects at https://' + opts.hosted_path);
}
return callback();
}
});
}

node_modules/@mapbox/node-pre-gyp/lib/install.js generated vendored Normal file

@@ -0,0 +1,234 @@
'use strict';
module.exports = exports = install;
exports.usage = 'Attempts to install pre-built binary for module';
const fs = require('fs');
const path = require('path');
const log = require('./util/log.js');
const existsAsync = fs.exists || path.exists;
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
// for fetching binaries
const fetch = require('node-fetch');
const tar = require('tar');
let npgVersion = 'unknown';
try {
// Read own package.json to get the current node-pre-gyp version.
const ownPackageJSON = fs.readFileSync(path.join(__dirname, '..', 'package.json'), 'utf8');
npgVersion = JSON.parse(ownPackageJSON).version;
} catch (e) {
// do nothing
}
function place_binary(uri, targetDir, opts, callback) {
log.log('GET', uri);
// Try getting version info from the currently running npm.
const envVersionInfo = process.env.npm_config_user_agent ||
'node ' + process.version;
const sanitized = uri.replace('+', '%2B');
const requestOpts = {
uri: sanitized,
headers: {
'User-Agent': 'node-pre-gyp (v' + npgVersion + ', ' + envVersionInfo + ')'
},
follow_max: 10
};
if (opts.cafile) {
try {
requestOpts.ca = fs.readFileSync(opts.cafile);
} catch (e) {
return callback(e);
}
} else if (opts.ca) {
requestOpts.ca = opts.ca;
}
const proxyUrl = opts.proxy ||
process.env.http_proxy ||
process.env.HTTP_PROXY ||
process.env.npm_config_proxy;
let agent;
if (proxyUrl) {
const { HttpsProxyAgent } = require('https-proxy-agent');
agent = new HttpsProxyAgent(proxyUrl);
log.log('download', `proxy agent configured using: "${proxyUrl}"`);
}
fetch(sanitized, { agent })
.then((res) => {
if (!res.ok) {
throw new Error(`response status ${res.status} ${res.statusText} on ${sanitized}`);
}
const dataStream = res.body;
return new Promise((resolve, reject) => {
let extractions = 0;
const countExtractions = (entry) => {
extractions += 1;
log.info('install', `unpacking ${entry.path}`);
};
dataStream.pipe(extract(targetDir, countExtractions))
.on('error', (e) => {
reject(e);
});
dataStream.on('end', () => {
resolve(`extracted file count: ${extractions}`);
});
dataStream.on('error', (e) => {
reject(e);
});
});
})
.then((text) => {
log.info(text);
callback();
})
.catch((e) => {
log.error(`install ${e.message}`);
callback(e);
});
}
function extract(to, onentry) {
return tar.extract({
cwd: to,
strip: 1,
onentry
});
}
function extract_from_local(from, targetDir, callback) {
if (!fs.existsSync(from)) {
return callback(new Error('Cannot find file ' + from));
}
log.info('Found local file to extract from ' + from);
// extract helpers
let extractCount = 0;
function countExtractions(entry) {
extractCount += 1;
log.info('install', 'unpacking ' + entry.path);
}
function afterExtract(err) {
if (err) return callback(err);
if (extractCount === 0) {
return callback(new Error('There was a fatal problem while extracting the tarball'));
}
log.info('tarball', 'done parsing tarball');
callback();
}
fs.createReadStream(from).pipe(extract(targetDir, countExtractions))
.on('close', afterExtract)
.on('error', afterExtract);
}
function do_build(gyp, argv, callback) {
const args = ['rebuild'].concat(argv);
gyp.todo.push({ name: 'build', args: args });
process.nextTick(callback);
}
function print_fallback_error(err, opts, package_json) {
const fallback_message = ' (falling back to source compile with node-gyp)';
let full_message = '';
if (err.statusCode !== undefined) {
// If we got a network response but failed to download
// it means remote binaries are not available, so let's try to help
// the user/developer with the info to debug why
full_message = 'Pre-built binaries not found for ' + package_json.name + '@' + package_json.version;
full_message += ' and ' + opts.runtime + '@' + (opts.target || process.versions.node) + ' (' + opts.node_abi + ' ABI, ' + opts.libc + ')';
full_message += fallback_message;
log.warn('Tried to download(' + err.statusCode + '): ' + opts.hosted_tarball);
log.warn(full_message);
log.error(err.message);
} else {
// If we do not have a statusCode that means an unexpected error
// happened and prevented an http response, so we output the exact error
full_message = 'Pre-built binaries not installable for ' + package_json.name + '@' + package_json.version;
full_message += ' and ' + opts.runtime + '@' + (opts.target || process.versions.node) + ' (' + opts.node_abi + ' ABI, ' + opts.libc + ')';
full_message += fallback_message;
log.warn(full_message);
log.warn('Hit error ' + err.message);
}
}
//
// install
//
function install(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const source_build = gyp.opts['build-from-source'] || gyp.opts.build_from_source;
const update_binary = gyp.opts['update-binary'] || gyp.opts.update_binary;
const should_do_source_build = source_build === package_json.name || (source_build === true || source_build === 'true');
if (should_do_source_build) {
log.info('build', 'requesting source compile');
return do_build(gyp, argv, callback);
} else {
const fallback_to_build = gyp.opts['fallback-to-build'] || gyp.opts.fallback_to_build;
let should_do_fallback_build = fallback_to_build === package_json.name || (fallback_to_build === true || fallback_to_build === 'true');
// but allow override from npm
if (process.env.npm_config_argv) {
const cooked = JSON.parse(process.env.npm_config_argv).cooked;
const match = cooked.indexOf('--fallback-to-build');
if (match > -1 && cooked.length > match && cooked[match + 1] === 'false') {
should_do_fallback_build = false;
log.info('install', 'Build fallback disabled via npm flag: --fallback-to-build=false');
}
}
let opts;
try {
opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
} catch (err) {
return callback(err);
}
opts.ca = gyp.opts.ca;
opts.cafile = gyp.opts.cafile;
const from = opts.hosted_tarball;
const to = opts.module_path;
const binary_module = path.join(to, opts.module_name + '.node');
existsAsync(binary_module, (found) => {
if (!update_binary) {
if (found) {
console.log('[' + package_json.name + '] Success: "' + binary_module + '" already installed');
console.log('Pass --update-binary to reinstall or --build-from-source to recompile');
return callback();
}
log.info('check', 'checked for "' + binary_module + '" (not found)');
}
fs.promises.mkdir(to, { recursive: true }).then(() => {
const fileName = from.startsWith('file://') && from.slice('file://'.length);
if (fileName) {
extract_from_local(fileName, to, after_place);
} else {
place_binary(from, to, opts, after_place);
}
}).catch((err) => {
after_place(err);
});
function after_place(err) {
if (err && should_do_fallback_build) {
print_fallback_error(err, opts, package_json);
return do_build(gyp, argv, callback);
} else if (err) {
return callback(err);
} else {
console.log('[' + package_json.name + '] Success: "' + binary_module + '" is installed via remote');
return callback();
}
}
});
}
}

node_modules/@mapbox/node-pre-gyp/lib/main.js generated vendored Normal file

@@ -0,0 +1,125 @@
'use strict';
/**
* Set the title.
*/
process.title = 'node-pre-gyp';
const node_pre_gyp = require('../');
const log = require('./util/log.js');
/**
* Process and execute the selected commands.
*/
const prog = new node_pre_gyp.Run({ argv: process.argv });
let completed = false;
if (prog.todo.length === 0) {
if (~process.argv.indexOf('-v') || ~process.argv.indexOf('--version')) {
console.log('v%s', prog.version);
process.exit(0);
} else if (~process.argv.indexOf('-h') || ~process.argv.indexOf('--help')) {
console.log('%s', prog.usage());
process.exit(0);
}
console.log('%s', prog.usage());
process.exit(1);
}
// if --no-color is passed
if (prog.opts && Object.hasOwnProperty.call(prog, 'color') && !prog.opts.color) {
log.disableColor();
}
log.info('it worked if it ends with', 'ok');
log.verbose('cli', process.argv);
log.info(`using ${process.title}@${prog.version}`);
log.info(`using node@${process.versions.node} | ${process.platform} | ${process.arch} `);
/**
* Change dir if -C/--directory was passed.
*/
const dir = prog.opts.directory;
if (dir) {
const fs = require('fs');
try {
const stat = fs.statSync(dir);
if (stat.isDirectory()) {
log.info('chdir', dir);
process.chdir(dir);
} else {
log.warn('chdir', dir + ' is not a directory');
}
} catch (e) {
if (e.code === 'ENOENT') {
log.warn('chdir', dir + ' is not a directory');
} else {
log.warn('chdir', `error during chdir() "${e.message}"`);
}
}
}
function run() {
const command = prog.todo.shift();
if (!command) {
// done!
completed = true;
log.info('ok');
return;
}
// set binary.host when appropriate. host determines the s3 target bucket.
const target = prog.setBinaryHostProperty(command.name);
if (target && ['install', 'publish', 'unpublish', 'info'].indexOf(command.name) >= 0) {
log.info('using binary.host: ' + prog.package_json.binary.host);
}
prog.commands[command.name](command.args, function(err) {
if (err) {
log.error(command.name + ' error');
log.error('stack', err.stack);
errorMessage();
log.error('not ok');
console.log(err.message);
return process.exit(1);
}
const args_array = [].slice.call(arguments, 1);
if (args_array.length) {
console.log.apply(console, args_array);
}
// now run the next command in the queue
process.nextTick(run);
});
}
process.on('exit', (code) => {
if (!completed && !code) {
log.error('Completion callback never invoked!');
errorMessage();
process.exit(6);
}
});
process.on('uncaughtException', (err) => {
log.error('UNCAUGHT EXCEPTION');
log.error('stack', err.stack);
errorMessage();
process.exit(7);
});
function errorMessage() {
// copied from npm's lib/util/error-handler.js
const os = require('os');
log.error('System', os.type() + ' ' + os.release());
log.error('command', process.argv.map(JSON.stringify).join(' '));
log.error('cwd', process.cwd());
log.error('node -v', process.version);
log.error(process.title + ' -v', 'v' + prog.package.version);
}
// start running the given commands!
run();

node_modules/@mapbox/node-pre-gyp/lib/node-pre-gyp.js generated vendored Normal file

@@ -0,0 +1,305 @@
'use strict';
/**
* Module exports.
*/
module.exports = exports;
/**
* Module dependencies.
*/
// load mocking control function for accessing s3 via https. the function is a noop always returning
// false if not mocking.
exports.mockS3Http = require('./util/s3_setup').get_mockS3Http();
exports.mockS3Http('on');
const mocking = exports.mockS3Http('get');
const fs = require('fs');
const path = require('path');
const nopt = require('nopt');
const log = require('./util/log.js');
const napi = require('./util/napi.js');
const EE = require('events').EventEmitter;
const inherits = require('util').inherits;
const cli_commands = [
'clean',
'install',
'reinstall',
'build',
'rebuild',
'package',
'testpackage',
'publish',
'unpublish',
'info',
'testbinary',
'reveal',
'configure'
];
const aliases = {};
if (mocking) {
log.warn(`mocking s3 to ${process.env.node_pre_gyp_mock_s3}`);
}
// this is a getter to avoid circular reference warnings with node v14.
Object.defineProperty(exports, 'find', {
get: function() {
return require('./pre-binding').find;
},
enumerable: true
});
// in the following, "my_module" is using node-pre-gyp to
// prebuild and install pre-built binaries. "main_module"
// is using "my_module".
//
// "bin/node-pre-gyp" invokes Run() without a path. the
// expectation is that the working directory is the package
// root "my_module". this is true because in all cases npm is
// executing a script in the context of "my_module".
//
// "pre-binding.find()" is executed by "my_module" but in the
// context of "main_module". this is because "main_module" is
// executing and requires "my_module" which is then executing
// "pre-binding.find()" via "node-pre-gyp.find()", so the working
// directory is that of "main_module".
//
// that's why "find()" must pass the path to package.json.
//
function Run({ package_json_path = './package.json', argv }) {
this.package_json_path = package_json_path;
this.commands = {};
const self = this;
cli_commands.forEach((command) => {
self.commands[command] = function(argvx, callback) {
log.verbose('command', command, argvx);
return require('./' + command)(self, argvx, callback);
};
});
this.parseArgv(argv);
// this is set to true after the binary.host property was set to
// either staging_host or production_host.
this.binaryHostSet = false;
}
inherits(Run, EE);
exports.Run = Run;
const proto = Run.prototype;
/**
* Export the contents of the package.json.
*/
proto.package = require('../package.json');
/**
* nopt configuration definitions
*/
proto.configDefs = {
help: Boolean, // everywhere
arch: String, // 'configure'
debug: Boolean, // 'build'
directory: String, // bin
proxy: String, // 'install'
loglevel: String // everywhere
};
/**
* nopt shorthands
*/
proto.shorthands = {
release: '--no-debug',
C: '--directory',
debug: '--debug',
j: '--jobs',
silent: '--loglevel=silent',
silly: '--loglevel=silly',
verbose: '--loglevel=verbose'
};
/**
* expose the command aliases for the bin file to use.
*/
proto.aliases = aliases;
/**
* Parses the given argv array and sets the 'opts', 'argv',
* 'command', and 'package_json' properties.
*/
proto.parseArgv = function parseOpts(argv) {
this.opts = nopt(this.configDefs, this.shorthands, argv);
this.argv = this.opts.argv.remain.slice();
const commands = this.todo = [];
// create a copy of the argv array with aliases mapped
argv = this.argv.map((arg) => {
// is this an alias?
if (arg in this.aliases) {
arg = this.aliases[arg];
}
return arg;
});
// process the mapped args into "command" objects ("name" and "args" props)
argv.slice().forEach((arg) => {
if (arg in this.commands) {
const args = argv.splice(0, argv.indexOf(arg));
argv.shift();
if (commands.length > 0) {
commands[commands.length - 1].args = args;
}
commands.push({ name: arg, args: [] });
}
});
if (commands.length > 0) {
commands[commands.length - 1].args = argv.splice(0);
}
// if a directory was specified package.json is assumed to be relative
// to it.
let package_json_path = this.package_json_path;
if (this.opts.directory) {
package_json_path = path.join(this.opts.directory, package_json_path);
}
this.package_json = JSON.parse(fs.readFileSync(package_json_path));
// expand commands entries for multiple napi builds
this.todo = napi.expand_commands(this.package_json, this.opts, commands);
// support for inheriting config env variables from npm
const npm_config_prefix = 'npm_config_';
Object.keys(process.env).forEach((name) => {
if (name.indexOf(npm_config_prefix) !== 0) return;
const val = process.env[name];
if (name === npm_config_prefix + 'loglevel') {
log.level = val;
} else {
// add the user-defined options to the config
name = name.substring(npm_config_prefix.length);
// avoid npm argv clobber already present args
// which avoids problem of 'npm test' calling
// script that runs unique npm install commands
if (name === 'argv') {
if (this.opts.argv &&
this.opts.argv.remain &&
this.opts.argv.remain.length) {
// do nothing
} else {
this.opts[name] = val;
}
} else {
this.opts[name] = val;
}
}
});
if (this.opts.loglevel) {
log.level = this.opts.loglevel;
}
log.resume();
};
/**
* allow the binary.host property to be set at execution time.
*
* for this to take effect requires all the following to be true.
* - binary is a property in package.json
* - binary.host is falsey
* - binary.staging_host is not empty
* - binary.production_host is not empty
*
* if any of the previous checks fail then the function returns an empty string
* and makes no changes to package.json's binary property.
*
*
* if command is "publish" then the default is set to "binary.staging_host"
* if command is not "publish" the the default is set to "binary.production_host"
*
* if the command-line option '--s3_host' is set to "staging" or "production" then
* "binary.host" is set to the specified "staging_host" or "production_host". if
* '--s3_host' is any other value an exception is thrown.
*
* if '--s3_host' is not present then "binary.host" is set to the default as above.
*
* this strategy was chosen so that any command other than "publish" or "unpublish" uses "production"
* as the default without requiring any command-line options but that "publish" and "unpublish" require
* '--s3_host production_host' to be specified in order to *really* publish (or unpublish). publishing
* to staging can be done freely without worrying about disturbing any production releases.
*/
proto.setBinaryHostProperty = function(command) {
if (this.binaryHostSet) {
return this.package_json.binary.host;
}
const p = this.package_json;
// don't set anything if host is present. it must be left blank to trigger this.
if (!p || !p.binary || p.binary.host) {
return '';
}
// and both staging and production must be present. errors will be reported later.
if (!p.binary.staging_host || !p.binary.production_host) {
return '';
}
let target = 'production_host';
if (command === 'publish' || command === 'unpublish') {
target = 'staging_host';
}
// the environment variable has priority over the default or the command line. if
// either the env var or the command line option are invalid throw an error.
const npg_s3_host = process.env.node_pre_gyp_s3_host;
if (npg_s3_host === 'staging' || npg_s3_host === 'production') {
target = `${npg_s3_host}_host`;
} else if (this.opts['s3_host'] === 'staging' || this.opts['s3_host'] === 'production') {
target = `${this.opts['s3_host']}_host`;
} else if (this.opts['s3_host'] || npg_s3_host) {
throw new Error(`invalid s3_host ${this.opts['s3_host'] || npg_s3_host}`);
}
p.binary.host = p.binary[target];
this.binaryHostSet = true;
return p.binary.host;
};
/**
* Returns the usage instructions for node-pre-gyp.
*/
proto.usage = function usage() {
const str = [
'',
' Usage: node-pre-gyp <command> [options]',
'',
' where <command> is one of:',
cli_commands.map((c) => {
return ' - ' + c + ' - ' + require('./' + c).usage;
}).join('\n'),
'',
'node-pre-gyp@' + this.version + ' ' + path.resolve(__dirname, '..'),
'node@' + process.versions.node
].join('\n');
return str;
};
/**
* Version number getter.
*/
Object.defineProperty(proto, 'version', {
get: function() {
return this.package.version;
},
enumerable: true
});

node_modules/@mapbox/node-pre-gyp/lib/package.js generated vendored Normal file

@@ -0,0 +1,72 @@
'use strict';
module.exports = exports = _package;
exports.usage = 'Packs binary (and enclosing directory) into locally staged tarball';
const fs = require('fs');
const path = require('path');
const log = require('./util/log.js');
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
const existsAsync = fs.exists || path.exists;
const tar = require('tar');
function readdirSync(dir) {
let list = [];
const files = fs.readdirSync(dir);
files.forEach((file) => {
const stats = fs.lstatSync(path.join(dir, file));
if (stats.isDirectory()) {
list = list.concat(readdirSync(path.join(dir, file)));
} else {
list.push(path.join(dir, file));
}
});
return list;
}
function _package(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
const from = opts.module_path;
const binary_module = path.join(from, opts.module_name + '.node');
existsAsync(binary_module, (found) => {
if (!found) {
return callback(new Error('Cannot package because ' + binary_module + ' missing: run `node-pre-gyp rebuild` first'));
}
const tarball = opts.staged_tarball;
const filter_func = function(entry) {
const basename = path.basename(entry);
if (basename.length && basename[0] !== '.') {
console.log('packing ' + entry);
return true;
} else {
console.log('skipping ' + entry);
}
return false;
};
fs.promises.mkdir(path.dirname(tarball), { recursive: true }).then(() => {
let files = readdirSync(from);
const base = path.basename(from);
files = files.map((file) => {
return path.join(base, path.relative(from, file));
});
tar.create({
portable: false,
gzip: true,
filter: filter_func,
file: tarball,
cwd: path.dirname(from)
}, files, (err2) => {
if (err2) console.error('[' + package_json.name + '] ' + err2.message);
else log.info('package', 'Binary staged at "' + tarball + '"');
return callback(err2);
});
}).catch((err) => {
return callback(err);
});
});
}

node_modules/@mapbox/node-pre-gyp/lib/pre-binding.js generated vendored Normal file

@@ -0,0 +1,34 @@
'use strict';
const npg = require('..');
const versioning = require('../lib/util/versioning.js');
const napi = require('../lib/util/napi.js');
const existsSync = require('fs').existsSync || require('path').existsSync;
const path = require('path');
module.exports = exports;
exports.usage = 'Finds the require path for the node-pre-gyp installed module';
exports.validate = function(package_json, opts) {
versioning.validate_config(package_json, opts);
};
exports.find = function(package_json_path, opts) {
if (!existsSync(package_json_path)) {
throw new Error(package_json_path + ' does not exist');
}
const prog = new npg.Run({ package_json_path, argv: process.argv });
prog.setBinaryHostProperty();
const package_json = prog.package_json;
versioning.validate_config(package_json, opts);
let napi_build_version;
if (napi.get_napi_build_versions(package_json, opts)) {
napi_build_version = napi.get_best_napi_build_version(package_json, opts);
}
opts = opts || {};
if (!opts.module_root) opts.module_root = path.dirname(package_json_path);
const meta = versioning.evaluate(package_json, opts, napi_build_version);
return meta.module;
};

node_modules/@mapbox/node-pre-gyp/lib/publish.js generated vendored Normal file

@@ -0,0 +1,80 @@
'use strict';
module.exports = exports = publish;
exports.usage = 'Publishes pre-built binary (requires aws-sdk)';
const fs = require('fs');
const path = require('path');
const log = require('./util/log.js');
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
const s3_setup = require('./util/s3_setup.js');
const existsAsync = fs.exists || path.exists;
const url = require('url');
function publish(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
const tarball = opts.staged_tarball;
existsAsync(tarball, (found) => {
if (!found) {
return callback(new Error('Cannot publish because ' + tarball + ' missing: run `node-pre-gyp package` first'));
}
log.info('publish', 'Detecting s3 credentials');
const config = s3_setup.detect(opts);
const s3 = s3_setup.get_s3(config);
const key_name = url.resolve(config.prefix, opts.package_name);
const s3_opts = {
Bucket: config.bucket,
Key: key_name
};
log.info('publish', 'Authenticating with s3');
log.info('publish', config);
log.info('publish', 'Checking for existing binary at ' + opts.hosted_path);
s3.headObject(s3_opts, (err, meta) => {
if (meta) log.info('publish', JSON.stringify(meta));
if (err && err.code === 'NotFound') {
// we are safe to publish because
// the object does not already exist
log.info('publish', 'Preparing to put object');
const s3_put_opts = {
ACL: 'public-read',
Body: fs.createReadStream(tarball),
Key: key_name,
Bucket: config.bucket
};
log.info('publish', 'Putting object', s3_put_opts.ACL, s3_put_opts.Bucket, s3_put_opts.Key);
try {
s3.putObject(s3_put_opts, (err2, resp) => {
log.info('publish', 'returned from putting object');
if (err2) {
log.info('publish', 's3 putObject error: "' + err2 + '"');
return callback(err2);
}
if (resp) log.info('publish', 's3 putObject response: "' + JSON.stringify(resp) + '"');
log.info('publish', 'successfully put object');
console.log('[' + package_json.name + '] Success: published to ' + opts.hosted_path);
return callback();
});
} catch (err3) {
log.info('publish', 's3 putObject error: "' + err3 + '"');
return callback(err3);
}
} else if (err) {
log.info('publish', 's3 headObject error: "' + err + '"');
return callback(err);
} else {
log.error('publish', 'Cannot publish over existing version');
log.error('publish', "Update the 'version' field in package.json and try again");
log.error('publish', 'If the previous version was published in error see:');
log.error('publish', '\t node-pre-gyp unpublish');
return callback(new Error('Failed publishing to ' + opts.hosted_path));
}
});
});
}

node_modules/@mapbox/node-pre-gyp/lib/rebuild.js generated vendored Normal file

@@ -0,0 +1,20 @@
'use strict';
module.exports = exports = rebuild;
exports.usage = 'Runs "clean" and "build" at once';
const napi = require('./util/napi.js');
function rebuild(gyp, argv, callback) {
const package_json = gyp.package_json;
let commands = [
{ name: 'clean', args: [] },
{ name: 'build', args: ['rebuild'] }
];
commands = napi.expand_commands(package_json, gyp.opts, commands);
for (let i = commands.length; i !== 0; i--) {
gyp.todo.unshift(commands[i - 1]);
}
process.nextTick(callback);
}

node_modules/@mapbox/node-pre-gyp/lib/reinstall.js generated vendored Normal file

@@ -0,0 +1,19 @@
'use strict';
module.exports = exports = rebuild;
exports.usage = 'Runs "clean" and "install" at once';
const napi = require('./util/napi.js');
function rebuild(gyp, argv, callback) {
const package_json = gyp.package_json;
let installArgs = [];
const napi_build_version = napi.get_best_napi_build_version(package_json, gyp.opts);
if (napi_build_version != null) installArgs = [napi.get_command_arg(napi_build_version)];
gyp.todo.unshift(
{ name: 'clean', args: [] },
{ name: 'install', args: installArgs }
);
process.nextTick(callback);
}

node_modules/@mapbox/node-pre-gyp/lib/reveal.js generated vendored Normal file

@@ -0,0 +1,32 @@
'use strict';
module.exports = exports = reveal;
exports.usage = 'Reveals data on the versioned binary';
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
function unix_paths(key, val) {
return val && val.replace ? val.replace(/\\/g, '/') : val;
}
function reveal(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
let hit = false;
// if a second arg is passed look to see
// if it is a known option
// console.log(JSON.stringify(gyp.opts,null,1))
const remain = gyp.opts.argv.remain[gyp.opts.argv.remain.length - 1];
if (remain && Object.hasOwnProperty.call(opts, remain)) {
console.log(opts[remain].replace(/\\/g, '/'));
hit = true;
}
// otherwise return all options as json
if (!hit) {
console.log(JSON.stringify(opts, unix_paths, 2));
}
return callback();
}

node_modules/@mapbox/node-pre-gyp/lib/testbinary.js generated vendored Normal file

@@ -0,0 +1,79 @@
'use strict';
module.exports = exports = testbinary;
exports.usage = 'Tests that the binary.node can be required';
const path = require('path');
const log = require('./util/log.js');
const cp = require('child_process');
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
function testbinary(gyp, argv, callback) {
const args = [];
const options = {};
let shell_cmd = process.execPath;
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
// skip validation for runtimes we don't explicitly support (like electron)
if (opts.runtime &&
opts.runtime !== 'node-webkit' &&
opts.runtime !== 'node') {
return callback();
}
const nw = (opts.runtime && opts.runtime === 'node-webkit');
// ensure on windows that / are used for require path
const binary_module = opts.module.replace(/\\/g, '/');
if ((process.arch !== opts.target_arch) ||
(process.platform !== opts.target_platform)) {
let msg = 'skipping validation since host platform/arch (';
msg += process.platform + '/' + process.arch + ')';
msg += ' does not match target (';
msg += opts.target_platform + '/' + opts.target_arch + ')';
log.info('validate', msg);
return callback();
}
if (nw) {
options.timeout = 5000;
if (process.platform === 'darwin') {
shell_cmd = 'node-webkit';
} else if (process.platform === 'win32') {
shell_cmd = 'nw.exe';
} else {
shell_cmd = 'nw';
}
const modulePath = path.resolve(binary_module);
const appDir = path.join(__dirname, 'util', 'nw-pre-gyp');
args.push(appDir);
args.push(modulePath);
log.info('validate', "Running test command: '" + shell_cmd + ' ' + args.join(' ') + "'");
cp.execFile(shell_cmd, args, options, (err, stdout, stderr) => {
// check for normal timeout for node-webkit
if (err) {
if (err.killed === true && err.signal && err.signal.indexOf('SIG') > -1) {
return callback();
}
const stderrLog = stderr.toString();
log.info('stderr', stderrLog);
if (/^\s*Xlib:\s*extension\s*"RANDR"\s*missing\s*on\s*display\s*":\d+\.\d+"\.\s*$/.test(stderrLog)) {
log.info('RANDR', 'stderr contains only RANDR error, ignored');
return callback();
}
return callback(err);
}
return callback();
});
return;
}
args.push('--eval');
args.push("require('" + binary_module.replace(/'/g, '\'') + "')");
log.info('validate', "Running test command: '" + shell_cmd + ' ' + args.join(' ') + "'");
cp.execFile(shell_cmd, args, options, (err, stdout, stderr) => {
if (err) {
return callback(err, { stdout: stdout, stderr: stderr });
}
return callback();
});
}
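
For the plain node runtime the validation reduces to spawning the current executable with --eval require(...): if the addon loads without throwing, the binary is deemed usable. A minimal standalone version of that smoke test; the module path is hypothetical:

// Standalone smoke test: can the compiled addon be require()d?
const cp = require('child_process');
const binary_module = './lib/binding/binding.node'; // hypothetical path
cp.execFile(process.execPath, ['--eval', "require('" + binary_module + "')"], (err) => {
  if (err) {
    console.error('binary failed to load:', err.message);
  } else {
    console.log('binary loads cleanly');
  }
});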

node_modules/@mapbox/node-pre-gyp/lib/testpackage.js (generated, vendored, new file, +52 lines)

@@ -0,0 +1,52 @@
'use strict';
module.exports = exports = testpackage;
exports.usage = 'Tests that the staged package is valid';
const fs = require('fs');
const path = require('path');
const log = require('./util/log.js');
const existsAsync = fs.exists || path.exists;
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
const testbinary = require('./testbinary.js');
const tar = require('tar');
function testpackage(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
const tarball = opts.staged_tarball;
existsAsync(tarball, (found) => {
if (!found) {
return callback(new Error('Cannot test package because ' + tarball + ' missing: run `node-pre-gyp package` first'));
}
const to = opts.module_path;
function filter_func(entry) {
log.info('install', 'unpacking [' + entry.path + ']');
}
fs.promises.mkdir(to, { recursive: true }).then(() => {
tar.extract({
file: tarball,
cwd: to,
strip: 1,
onentry: filter_func
}).then(after_extract, callback);
}).catch((err) => {
return callback(err);
});
function after_extract() {
testbinary(gyp, argv, (err) => {
if (err) {
return callback(err);
} else {
console.log('[' + package_json.name + '] Package appears valid');
return callback();
}
});
}
});
}
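
The promise chain above leans on tar.extract returning a promise when given a file option, with onentry firing once per unpacked entry. A minimal sketch of the same unpack step under those assumptions; both paths are hypothetical:

// Minimal sketch of the unpack step: extract a staged tarball, logging each entry.
const fs = require('fs');
const tar = require('tar');
const tarball = './build/stage/binding-v1.0.0.tar.gz'; // hypothetical staged tarball
const to = './lib/binding';                            // hypothetical module_path
fs.promises.mkdir(to, { recursive: true })
  .then(() => tar.extract({
    file: tarball,
    cwd: to,
    strip: 1,
    onentry: (entry) => console.log('unpacking', entry.path)
  }))
  .then(() => console.log('extracted; ready for the testbinary check'))
  .catch((err) => console.error(err));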
