Compare commits

...

119 Commits

Author SHA1 Message Date
Angela P Wen
cdcdbb5797 PR checks: stop setting experimental Swift var for new CLI versions (#1718)
Now that `latest` and `cached` are both 2.13.3, which is the version in which we GA'ed Swift, we should stop setting this experimental variable when we test these CLI versions so we can test the case where the variable is unset.
2023-06-06 08:49:09 -07:00
Henry Mercer
8b0f2cf9da Merge pull request #1717 from github/henrymercer/fix-changelog
Fix changelog for 2.3.6
2023-06-05 19:44:53 +01:00
Henry Mercer
a35a881b65 Fix changelog for 2.3.6 2023-06-05 19:14:03 +01:00
Henry Mercer
d8667207b6 Merge pull request #1714 from github/mergeback/v2.3.6-to-main-83f0fe6c
Mergeback v2.3.6 refs/heads/releases/v2 into main
2023-06-05 19:12:50 +01:00
Henry Mercer
926a4898bc Merge pull request #1712 from github/henrymercer/remove-unused-env-var
Remove unused `CODESCANNING_EVENT_NAME` environment variable
2023-06-01 18:28:45 +01:00
github-actions[bot]
5c63cc5b1c Update checked-in dependencies 2023-06-01 15:34:00 +00:00
github-actions[bot]
30a3b9a904 Update changelog and version after v2.3.6 2023-06-01 15:27:36 +00:00
Alexander Eyers-Taylor
83f0fe6c49 Merge pull request #1713 from github/update-v2.3.6-96f284028
Merge main into releases/v2
2023-06-01 16:25:43 +01:00
github-actions[bot]
5c8f4be0e9 Update changelog for v2.3.6 2023-06-01 13:04:31 +00:00
Henry Mercer
96f2840282 Merge pull request #1711 from github/henrymercer/improve-supported-versions-update
Improve automation for updating supported versions of GHES
2023-05-31 18:26:51 +01:00
Henry Mercer
dfc31c9995 Convert actions-util docs to JSDoc 2023-05-31 17:49:42 +01:00
Henry Mercer
019a40b91a Inline checks for producing a better error message for Dependabot PRs 2023-05-31 17:42:45 +01:00
Henry Mercer
ae005db7f8 Merge branch 'main' into henrymercer/remove-unused-env-var 2023-05-31 17:41:04 +01:00
Henry Mercer
89c4c9e65c Merge pull request #1678 from github/henrymercer/default-setup-safeguarding
Flag up functionality that may not exist in default setup workflows
2023-05-31 17:33:30 +01:00
Henry Mercer
26f16a5e63 Rephrase the still supported calculation to make it clearer 2023-05-31 17:20:39 +01:00
Henry Mercer
955f8596ae Fix sign error 2023-05-31 16:49:34 +01:00
Henry Mercer
e7cff66ce1 Fix push 2023-05-31 16:35:12 +01:00
Henry Mercer
bf419682de Remove unused CODESCANNING_EVENT_NAME environment variable 2023-05-31 15:37:11 +01:00
Henry Mercer
afdba76326 Wait a week before dropping support for end of life GHES versions 2023-05-31 15:00:19 +01:00
Henry Mercer
07e43a2208 Open PR with gh CLI 2023-05-31 14:39:03 +01:00
Henry Mercer
9632771630 Address review comments 2023-05-31 14:23:43 +01:00
Alexander Eyers-Taylor
9d2dd7cfea Merge pull request #1698 from github/update-bundle/codeql-bundle-20230524
Update default bundle to 2.13.3
2023-05-31 12:29:26 +01:00
Henry Mercer
d427c89ed7 Ignore internal Actions 2023-05-30 20:31:56 +01:00
Henry Mercer
125ff5530c Fix deprecation warnings 2023-05-30 20:31:40 +01:00
Henry Mercer
86ead5e019 Only flag up the deepest properties 2023-05-30 19:50:56 +01:00
Henry Mercer
eb1c7a3887 Use getRefFromEnv() so ref is present on default setup 2023-05-30 19:39:53 +01:00
Henry Mercer
6bd8101752 Merge pull request #1709 from github/henrymercer/print-baseline-once
Only print lines of code information once
2023-05-26 21:03:22 +01:00
Henry Mercer
2408985f4e Only print lines of code information once
CodeQL already prints it, so we don't need to print it again.
2023-05-26 20:34:30 +01:00
Henry Mercer
f8b1cb6997 Merge pull request #1695 from github/henrymercer/update-requests
PR checks: Update requests to 2.31.0
2023-05-26 11:10:44 +01:00
Andrew Eisenberg
2d031a36d6 Merge pull request #1707 from github/mergeback/v2.3.5-to-main-0225834c
Mergeback v2.3.5 refs/heads/releases/v2 into main
2023-05-25 12:50:21 -07:00
github-actions[bot]
1ba7713018 Update checked-in dependencies 2023-05-25 19:23:44 +00:00
github-actions[bot]
339e0d5afb Update changelog and version after v2.3.5 2023-05-25 19:12:36 +00:00
Andrew Eisenberg
0225834cc5 Merge pull request #1706 from github/update-v2.3.5-d3314cca2
Merge main into releases/v2
2023-05-25 12:10:52 -07:00
Andrew Eisenberg
15f9b00614 Apply suggestions from code review 2023-05-25 11:42:54 -07:00
github-actions[bot]
ff82fd0736 Update changelog for v2.3.5 2023-05-25 18:22:27 +00:00
Andrew Eisenberg
d3314cca22 Merge pull request #1705 from github/aeisenberg/location-uri-schema-fix 2023-05-25 10:45:48 -07:00
Andrew Eisenberg
42add7b4d7 Update changelog 2023-05-25 10:21:47 -07:00
Andrew Eisenberg
9c5706e1a2 Avoid throwing validation error on invalid URIs
The recent update of jsonschema inadvertently caused extra validation of
`uri-reference` formatted properties. This change ensures that these
errors are converted to warnings.

Note that we cannot revert the change to jsonschema since the old
version does not handle `uniqueItems` correctly.
2023-05-25 10:18:12 -07:00
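A minimal sketch of the behavior this commit describes, using the `jsonschema` npm package (the warning helper and the exact predicate are illustrative, not the Action's verbatim code): format failures for `uri`/`uri-reference` properties are reported as warnings, while every other validation error still fails the upload.

    import { Schema, Validator, ValidationError } from "jsonschema";

    // Validate a SARIF payload, but report `uri`/`uri-reference` format failures as warnings
    // rather than hard errors, since some producers emit technically invalid URIs.
    function validateSarif(
      sarif: unknown,
      schema: Schema,
      warn: (message: string) => void
    ): void {
      const result = new Validator().validate(sarif, schema);
      const fatalErrors: ValidationError[] = [];
      for (const error of result.errors) {
        const isUriFormatError =
          error.name === "format" &&
          (error.argument === "uri" || error.argument === "uri-reference");
        if (isUriFormatError) {
          warn(`Invalid URI in SARIF file: ${error.stack}`); // downgraded to a warning
        } else {
          fatalErrors.push(error); // everything else still fails validation
        }
      }
      if (fatalErrors.length > 0) {
        throw new Error(
          `Unable to upload SARIF: ${fatalErrors.map((e) => e.stack).join("; ")}`
        );
      }
    }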
Henry Mercer
3912995667 Merge pull request #1704 from github/henrymercer/contributions-updates
Contributing documentation updates
2023-05-25 17:33:19 +01:00
Henry Mercer
8d7f61b8f2 Update npm version 2023-05-25 17:06:08 +01:00
Henry Mercer
50bc388cfc Update Node version 2023-05-25 17:04:40 +01:00
Henry Mercer
4a409ace8f Link to CONTRIBUTING doc from README 2023-05-25 17:03:48 +01:00
Henry Mercer
41499f5466 Merge pull request #1702 from github/henrymercer/update-github-actions-email
Fix GitHub Actions email
2023-05-25 16:19:18 +01:00
Henry Mercer
1023a086ae Merge pull request #1694 from jsoref/fixes
Fix running tests on forks, and handle invalid URIs when fingerprinting
2023-05-25 15:41:27 +01:00
Josh Soref
cc5f2fb439 Gracefully handle decodeURIComponent failure 2023-05-25 09:15:55 -04:00
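For context, `decodeURIComponent` throws a `URIError` on malformed percent-encoding; a tolerant wrapper along these lines (illustrative, not the Action's exact code) lets fingerprinting fall back to the raw value instead of failing:

    // decodeURIComponent() throws a URIError on malformed escapes such as "%E0%A4%A".
    function tryDecodeUriComponent(value: string): string {
      try {
        return decodeURIComponent(value);
      } catch {
        return value; // malformed escape sequence: use the undecoded string
      }
    }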
Josh Soref
789f65c9ee Improving handling of uploadFailedSarifResult -> [Object object] 2023-05-25 09:15:55 -04:00
Josh Soref
a5879b7b6e Tolerate forks of github/codeql-action 2023-05-25 09:14:30 -04:00
Henry Mercer
3da4cbfc79 Fix GitHub Actions email 2023-05-25 11:27:13 +01:00
Henry Mercer
5f061ca665 Merge pull request #1697 from github/fixInvalidNotifications-shortcut
Avoid parsing SARIF file when workaround for duplicate notification locations is disabled
2023-05-25 10:45:50 +01:00
Angela P Wen
11ea309db5 Merge pull request #1701 from github/mergeback/v2.3.4-to-main-f0e3dfb3
Mergeback v2.3.4 refs/heads/releases/v2 into main
2023-05-24 16:21:39 -07:00
github-actions[bot]
1319d54f85 Update checked-in dependencies 2023-05-24 22:19:26 +00:00
github-actions[bot]
59d27da76b Update changelog and version after v2.3.4 2023-05-24 22:16:43 +00:00
Angela P Wen
f0e3dfb303 Merge pull request #1700 from github/update-v2.3.4-570734c55
Merge main into releases/v2
2023-05-24 15:14:53 -07:00
Josh Soref
dba4f66682 Grant security-events: write permissions 2023-05-24 18:14:01 -04:00
Josh Soref
8f9b20ba50 Clarify how to update workflows 2023-05-24 18:14:01 -04:00
Angela P Wen
0d65621757 Update CHANGELOG.md 2023-05-24 14:49:16 -07:00
github-actions[bot]
c3ae9dcd15 Update changelog for v2.3.4 2023-05-24 21:41:27 +00:00
Angela P Wen
570734c55c Remove unnecessary conditional for Ruby autodetect (#1699)
We should check language autodetect for Ruby unconditionally. We can now move it into the step that checks all other languages.
2023-05-24 18:33:06 +00:00
Henry Mercer
65920dd33a Unconditionally set up Swift in debug artifacts PR check 2023-05-24 18:28:18 +01:00
Henry Mercer
60f5c59630 Merge branch 'main' into update-bundle/codeql-bundle-20230524 2023-05-24 18:04:09 +01:00
Henry Mercer
0962265901 Merge branch 'main' into fixInvalidNotifications-shortcut 2023-05-24 18:00:28 +01:00
Henry Mercer
143b5fb429 Merge branch 'main' into henrymercer/update-requests 2023-05-24 18:00:08 +01:00
Angela P Wen
8c923c00a3 Fix Swift PR Checks on nightly-latest CLI (#1696) 2023-05-24 17:59:40 +01:00
github-actions[bot]
34e8e09ae4 Add changelog note 2023-05-24 16:01:57 +00:00
github-actions[bot]
4f41ff7fc8 Update default bundle to codeql-bundle-20230524 2023-05-24 16:01:53 +00:00
Stephan Brandauer
636b9eab1d add rebuilt lib 2023-05-24 12:12:27 +00:00
Stephan Brandauer
153cab09da jsdoc for fixInvalidNotificationsInFile 2023-05-24 12:09:28 +00:00
Stephan Brandauer
dddabd0d26 add rebuilt lib 2023-05-24 11:51:57 +00:00
Stephan Brandauer
3100e1e354 move check to calling function
DISABLE_DUPLICATE_LOCATION_FIX: this avoids needless crashes on large SARIF files.
2023-05-24 11:46:19 +00:00
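A hedged sketch of the caller-level check this commit describes (the environment variable name is taken from the commit message; the function name and file handling are illustrative):

    import * as fs from "fs";

    // Hypothetical caller in the spirit of this change: decide whether to apply the
    // duplicate-location workaround *before* reading the (potentially huge) SARIF file,
    // so that disabling it skips the expensive parse entirely. The real helper is
    // fixInvalidNotificationsInFile(); everything else here is a sketch.
    function prepareSarifForUpload(inputPath: string, tempDir: string): string {
      if (process.env["DISABLE_DUPLICATE_LOCATION_FIX"] === "true") {
        return inputPath; // workaround disabled: upload the file as-is, never parsing it
      }
      const sarif = JSON.parse(fs.readFileSync(inputPath, "utf8"));
      // ... deduplicate notification locations in `sarif` here ...
      const fixedPath = `${tempDir}/fixed.sarif`;
      fs.writeFileSync(fixedPath, JSON.stringify(sarif));
      return fixedPath;
    }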
Henry Mercer
6e92b190d0 Bump requests to 2.31.0 2023-05-23 17:07:30 +01:00
Henry Mercer
292bb7c0b9 Parameterize check scripts over requests version 2023-05-23 17:07:30 +01:00
Henry Mercer
1245696032 Merge pull request #1687 from github/henrymercer/update-changelog-note
Push back semver CodeQL bundles
2023-05-22 17:11:03 +01:00
Henry Mercer
317cd34a7a Push back semver CodeQL bundles
Push back the first bundle released using a semantic version number to 2.13.4 now that we're skipping 2.13.2.
2023-05-22 11:00:25 +01:00
Henry Mercer
6cfb483131 Merge pull request #1682 from github/henrymercer/semver-bundles
Extract semantic CLI version from URL when requesting specific tools
2023-05-18 11:32:45 +01:00
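A minimal sketch of the idea (illustrative function name and regex, grounded in the `codeql-bundle-vx.y.z` tag format described in the changelog further down):

    // Illustrative only: extract a semantic CLI version from a tools URL whose tag uses the
    // new `codeql-bundle-vx.y.z` format; timestamp-style tags such as `codeql-bundle-20230524`
    // yield undefined.
    function getSemanticVersionFromToolsUrl(url: string): string | undefined {
      const match = url.match(/codeql-bundle-v(\d+\.\d+\.\d+)/);
      return match === null ? undefined : match[1];
    }

    // getSemanticVersionFromToolsUrl(
    //   ".../releases/download/codeql-bundle-v2.13.4/codeql-bundle-linux64.tar.gz"
    // ) === "2.13.4"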
Henry Mercer
a5f4123fb0 Improve changelog note 2023-05-17 18:23:54 +01:00
Henry Mercer
50931b43dd Add changelog note 2023-05-17 14:57:27 +01:00
Henry Mercer
f54f0731d1 Merge branch 'main' into henrymercer/semver-bundles 2023-05-17 14:45:33 +01:00
Henry Mercer
ca6b925548 Merge pull request #1681 from github/henrymercer/remove-redundant-flag
Remove redundant query help version flag
2023-05-16 19:50:47 +01:00
Henry Mercer
d439786b65 Merge branch 'main' into henrymercer/remove-redundant-flag 2023-05-16 19:21:22 +01:00
Henry Mercer
f5159143cd Merge pull request #1680 from github/henrymercer/handle-swift-promotion
Use `resolve extractor` when finding autobuild scripts
2023-05-16 19:19:11 +01:00
Henry Mercer
a1be09ed8a Remove redundant query help version flag 2023-05-16 18:20:13 +01:00
Henry Mercer
2bf10dc4b9 Extract semantic CLI version from URL when requesting specific tools 2023-05-16 14:47:32 +01:00
Henry Mercer
e422b64793 Use resolve extractor when finding autobuild scripts 2023-05-16 11:18:16 +01:00
Henry Mercer
eac5e24aee Downgrade query severity to warning 2023-05-16 11:06:13 +01:00
Rasmus Wriedt Larsen
5489416722 Merge pull request #1676 from github/rasmuswl/python-disable-dependency-installation
Feature flag to disable python dependency installation
2023-05-16 10:40:47 +02:00
Rasmus Wriedt Larsen
dc0f6da426 Update CHANGELOG.md
Co-authored-by: Henry Mercer <henry.mercer@me.com>
2023-05-15 12:38:10 +02:00
Rasmus Wriedt Larsen
e1cca2565c Generate JS 2023-05-15 12:05:50 +02:00
Rasmus Wriedt Larsen
cf58ef4480 Update wording for CODEQL_PYTHON warning
Co-authored-by: Henry Mercer <henry.mercer@me.com>
2023-05-15 12:05:03 +02:00
Henry Mercer
8065746a2a Add query to find context variables that may not work with default setup 2023-05-12 19:35:08 +01:00
Henry Mercer
abb267d186 Add query to identify env vars that may not work with default setup 2023-05-12 18:46:31 +01:00
Rasmus Wriedt Larsen
fce87bbc67 Generate JS 2023-05-12 10:00:31 +02:00
Rasmus Wriedt Larsen
cc641561b7 Improve python warning message
The trailing dot in `=3.11.` is slightly confusing, so single quotes were added around the environment variable assignments to make them unambiguous.
2023-05-12 09:59:20 +02:00
Rasmus Wriedt Larsen
c237da1a2f Fix linting errors 2023-05-12 09:58:30 +02:00
Rasmus Wriedt Larsen
cbc79bf64b Merge branch 'main' into rasmuswl/python-disable-dependency-installation 2023-05-12 09:55:57 +02:00
Rasmus Wriedt Larsen
b8f39fe0f5 Use features properly in setupPythonExtractor 2023-05-12 09:55:22 +02:00
Henry Mercer
9953504776 Use new packaging mechanism for internal queries 2023-05-11 18:43:36 +01:00
Henry Mercer
130884e4e1 Merge pull request #1675 from shaikhul/remove-consts
Remove MismatchedBranches check from code scanning workflow validation
2023-05-11 15:45:33 +01:00
Shaikhul Islam
a0755a79b6 Update CHANGELOG.md
Co-authored-by: Henry Mercer <henry.mercer@me.com>
2023-05-11 10:22:57 -04:00
Shaikhul Islam
903cb278c5 recompile src 2023-05-11 14:16:34 +00:00
Shaikhul Islam
e5fdcd4a8f Apply suggestions from code review
Co-authored-by: Henry Mercer <henry.mercer@me.com>
2023-05-11 09:29:25 -04:00
Rasmus Wriedt Larsen
eb8a70647a Update CHANGELOG.md 2023-05-11 12:18:29 +02:00
Rasmus Wriedt Larsen
0ccdbf8cd5 Feature flag to disable python dependency installation 2023-05-11 12:14:04 +02:00
Shaikhul Islam
c26fc558ba revert MissingPushHook checks changes 2023-05-10 20:37:56 +00:00
Shaikhul Islam
f8707c9939 update changelog 2023-05-10 15:01:33 +00:00
Shaikhul Islam
699855c048 fix linter issue 2023-05-09 15:05:36 +00:00
Shaikhul Islam
edb138ff88 remove consts MismatchedBranches and MissingPushHook 2023-05-09 14:39:49 +00:00
Andrew Eisenberg
95cfca769b Merge pull request #1673 from github/dependabot/github_actions/peter-evans/create-pull-request-5.0.1
Bump peter-evans/create-pull-request from 5.0.0 to 5.0.1
2023-05-08 12:25:32 -07:00
dependabot[bot]
9c51a58355 Bump peter-evans/create-pull-request from 5.0.0 to 5.0.1
Bumps [peter-evans/create-pull-request](https://github.com/peter-evans/create-pull-request) from 5.0.0 to 5.0.1.
- [Release notes](https://github.com/peter-evans/create-pull-request/releases)
- [Commits](5b4a9f6a9e...284f54f989)

---
updated-dependencies:
- dependency-name: peter-evans/create-pull-request
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-05-08 18:00:47 +00:00
Aditya Sharad
deb312c60b Merge pull request #1672 from github/aeisenberg/sarif-again
Fix broken regex
2023-05-05 12:53:12 -07:00
Andrew Eisenberg
9824588133 Fix broken regex
`($i)` is not valid for javascript regexes.
2023-05-05 12:02:19 -07:00
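For context: JavaScript regular expressions have no inline modifier groups, so case-insensitivity must be requested via the `i` flag; the actual pattern being fixed is not shown in this compare view, and the snippet below is only a general illustration.

    // Inline modifier groups (e.g. "(?i)" in some other regex dialects) are rejected by
    // JavaScript's RegExp and throw a SyntaxError. Pass flags as a separate argument instead:
    const pattern = new RegExp("codeql-bundle", "i");
    pattern.test("CodeQL-Bundle-20230524"); // true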
Andrew Eisenberg
11fba50273 Merge pull request #1668 from github/aeisenberg/update-sarif-schema 2023-05-05 09:14:24 -07:00
Andrew Eisenberg
684c4b5c77 Update CHANGELOG.md
Co-authored-by: Aditya Sharad <6874315+adityasharad@users.noreply.github.com>
2023-05-05 08:41:11 -07:00
Dave Bartolomeo
1e1aca8165 Merge pull request #1670 from github/mergeback/v2.3.3-to-main-29b1f65c
Mergeback v2.3.3 refs/heads/releases/v2 into main
2023-05-04 15:27:32 -04:00
github-actions[bot]
898fba281b Update checked-in dependencies 2023-05-04 19:02:16 +00:00
github-actions[bot]
913b8b11ad Update changelog and version after v2.3.3 2023-05-04 18:53:44 +00:00
Andrew Eisenberg
3df80238a3 Re-run sync.py with new ruamel.yaml 2023-05-02 15:19:05 -07:00
Andrew Eisenberg
ef88842204 Update jsonschema version
Fixes bug in `uniqueItems` property.
2023-05-02 14:26:17 -07:00
Andrew Eisenberg
ece3cbc8ec Update changelog 2023-05-02 13:52:28 -07:00
Andrew Eisenberg
febbadf751 Update the sarif schema file
The version we were using is quite old. Copied the latest from
123e95847b/Schemata/sarif-schema-2.1.0.json

I do not think the sarif spec will be changing any more without
an explicit version update, so this is fine for now.
2023-05-02 13:46:24 -07:00
133 changed files with 2022 additions and 1229 deletions


@@ -1,5 +1,5 @@
name: "Set up Swift"
description: Sets up an appropriate Swift version if Swift is enabled via CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT.
description: Sets up an appropriate Swift version if supported on this platform.
inputs:
codeql-path:
description: Path to the CodeQL CLI executable.
@@ -9,24 +9,29 @@ runs:
steps:
- name: Get Swift version
id: get_swift_version
if: env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true'
if: runner.os != 'Windows'
shell: bash
env:
CODEQL_PATH: ${{ inputs.codeql-path }}
run: |
if [ $RUNNER_OS = "macOS" ]; then
if [[ $RUNNER_OS = "macOS" ]]; then
PLATFORM="osx64"
else # We do not run this step on Windows.
PLATFORM="linux64"
fi
SWIFT_EXTRACTOR_DIR="$("$CODEQL_PATH" resolve languages --format json | jq -r '.swift[0]')"
VERSION="$("$SWIFT_EXTRACTOR_DIR/tools/$PLATFORM/extractor" --version | awk '/version/ { print $3 }')"
# Specify 5.7.0, otherwise setup Action will default to latest minor version.
if [ $VERSION = "5.7" ]; then
VERSION="5.7.0"
if [ $SWIFT_EXTRACTOR_DIR = "null" ]; then
VERSION="null"
else
VERSION="$("$SWIFT_EXTRACTOR_DIR/tools/$PLATFORM/extractor" --version | awk '/version/ { print $3 }')"
# Specify 5.7.0, otherwise setup Action will default to latest minor version.
if [ $VERSION = "5.7" ]; then
VERSION="5.7.0"
fi
fi
echo "version=$VERSION" | tee -a $GITHUB_OUTPUT
- uses: swift-actions/setup-swift@65540b95f51493d65f5e59e97dcef9629ddf11bf # Please update the corresponding SHA in the CLI's CodeQL Action Integration Test.
if: env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true'
if: runner.os != 'Windows' && steps.get_swift_version.outputs.version != 'null'
with:
swift-version: "${{ steps.get_swift_version.outputs.version }}"


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: "PR Check - Analyze: 'ref' and 'sha' from inputs"
@@ -68,6 +68,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: "Analyze: 'ref' and 'sha' from inputs"
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -82,10 +85,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - autobuild-action
@@ -32,6 +32,9 @@ jobs:
- os: windows-latest
version: latest
name: autobuild-action
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -46,10 +49,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Config export
@@ -38,6 +38,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: Config export
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -52,10 +55,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Diagnostic export
@@ -44,6 +44,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: Diagnostic export
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -58,10 +61,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Export file baseline information
@@ -32,6 +32,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: Export file baseline information
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -46,10 +49,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Extractor ram and threads options test
@@ -28,6 +28,9 @@ jobs:
- os: ubuntu-latest
version: latest
name: Extractor ram and threads options test
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -42,10 +45,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: 'PR Check - Go: Custom queries'
@@ -68,6 +68,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: 'Go: Custom queries'
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -82,10 +85,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: 'PR Check - Go: tracing with autobuilder step'
@@ -54,6 +54,9 @@ jobs:
- os: macos-latest
version: nightly-latest
name: 'Go: tracing with autobuilder step'
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -68,10 +71,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: 'PR Check - Go: tracing with custom build steps'
@@ -54,6 +54,9 @@ jobs:
- os: macos-latest
version: nightly-latest
name: 'Go: tracing with custom build steps'
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -68,10 +71,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: 'PR Check - Go: tracing with legacy workflow'
@@ -54,6 +54,9 @@ jobs:
- os: macos-latest
version: nightly-latest
name: 'Go: tracing with legacy workflow'
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -68,10 +71,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: 'PR Check - Packaging: Download using registries'
@@ -44,6 +44,10 @@ jobs:
- os: windows-latest
version: nightly-latest
name: 'Packaging: Download using registries'
permissions:
contents: read
packages: read
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -58,10 +62,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
@@ -128,9 +129,5 @@ jobs:
cat $QLCONFIG_PATH
exit 1
fi
permissions:
contents: read
packages: read
env:
CODEQL_ACTION_TEST_MODE: true


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Custom source root
@@ -32,6 +32,9 @@ jobs:
- os: ubuntu-latest
version: nightly-latest
name: Custom source root
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -46,10 +49,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - ML-powered queries
@@ -68,6 +68,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: ML-powered queries
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -82,10 +85,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Multi-language repository
@@ -54,6 +54,9 @@ jobs:
- os: macos-latest
version: nightly-latest
name: Multi-language repository
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -68,10 +71,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
@@ -94,7 +94,7 @@ jobs:
with:
upload-database: false
- name: Check language autodetect for all languages excluding Ruby, Swift
- name: Check language autodetect for all languages excluding Swift
shell: bash
run: |
CPP_DB=${{ fromJson(steps.analysis.outputs.db-locations).cpp }}
@@ -127,11 +127,6 @@ jobs:
echo "Did not create a database for Python, or created it in the wrong location."
exit 1
fi
- name: Check language autodetect for Ruby
if: env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true'
shell: bash
run: |
RUBY_DB=${{ fromJson(steps.analysis.outputs.db-locations).ruby }}
if [[ ! -d $RUBY_DB ]] || [[ ! $RUBY_DB == ${{ runner.temp }}/customDbLocation/* ]]; then
echo "Did not create a database for Ruby, or created it in the wrong location."
@@ -139,7 +134,9 @@ jobs:
fi
- name: Check language autodetect for Swift
if: env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true'
if: >-
env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true' ||
(runner.os != 'Windows' && matrix.version == 'nightly-latest')
shell: bash
run: |
SWIFT_DB=${{ fromJson(steps.analysis.outputs.db-locations).swift }}


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: 'PR Check - Packaging: Config and input passed to the CLI'
@@ -44,6 +44,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: 'Packaging: Config and input passed to the CLI'
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -58,10 +61,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: 'PR Check - Packaging: Config and input'
@@ -44,6 +44,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: 'Packaging: Config and input'
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -58,10 +61,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: 'PR Check - Packaging: Config file'
@@ -44,6 +44,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: 'Packaging: Config file'
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -58,10 +61,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: 'PR Check - Packaging: Action input'
@@ -44,6 +44,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: 'Packaging: Action input'
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -58,10 +61,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Remote config file
@@ -68,6 +68,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: Remote config file
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -82,10 +85,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - RuboCop multi-language
@@ -28,6 +28,9 @@ jobs:
- os: ubuntu-latest
version: cached
name: RuboCop multi-language
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -42,10 +45,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV

.github/workflows/__ruby.yml (generated, vendored): 10 changed lines

@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Ruby analysis
@@ -38,6 +38,9 @@ jobs:
- os: macos-latest
version: nightly-latest
name: Ruby analysis
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -52,10 +55,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Split workflow
@@ -38,6 +38,9 @@ jobs:
- os: macos-latest
version: nightly-latest
name: Split workflow
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -52,10 +55,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Submit SARIF after failure
@@ -32,6 +32,9 @@ jobs:
- os: ubuntu-latest
version: nightly-latest
name: Submit SARIF after failure
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -46,10 +49,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Swift analysis using a custom build command
@@ -38,6 +38,9 @@ jobs:
- os: macos-latest
version: nightly-latest
name: Swift analysis using a custom build command
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -52,10 +55,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Autobuild working directory
@@ -28,6 +28,9 @@ jobs:
- os: ubuntu-latest
version: latest
name: Autobuild working directory
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -42,10 +45,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Local CodeQL bundle
@@ -28,6 +28,9 @@ jobs:
- os: ubuntu-latest
version: nightly-latest
name: Local CodeQL bundle
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -42,10 +45,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV

.github/workflows/__test-proxy.yml (generated, vendored): 10 changed lines

@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Proxy test
@@ -28,6 +28,9 @@ jobs:
- os: ubuntu-latest
version: latest
name: Proxy test
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -42,10 +45,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Test unsetting environment variables
@@ -40,6 +40,9 @@ jobs:
- os: ubuntu-latest
version: nightly-latest
name: Test unsetting environment variables
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -54,10 +57,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: "PR Check - Upload-sarif: 'ref' and 'sha' from inputs"
@@ -68,6 +68,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: "Upload-sarif: 'ref' and 'sha' from inputs"
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -82,10 +85,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -1,6 +1,6 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
name: PR Check - Use a custom `checkout_path`
@@ -68,6 +68,9 @@ jobs:
- os: windows-latest
version: nightly-latest
name: Use a custom `checkout_path`
permissions:
contents: read
security-events: write
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
@@ -82,10 +85,7 @@ jobs:
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV


@@ -49,11 +49,15 @@ jobs:
with:
go-version: ^1.13.1
- uses: ./../action/init
id: init
with:
tools: ${{ steps.prepare-test.outputs.tools-url }}
debug: true
debug-artifact-name: my-debug-artifacts
debug-database-name: my-db
- uses: ./../action/.github/actions/setup-swift
with:
codeql-path: ${{ steps.init.outputs.codeql-path }}
- name: Build code
shell: bash
run: ./build.sh


@@ -40,7 +40,7 @@ jobs:
- name: Update git config
run: |
git config --global user.email "github-actions@github.com"
git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
git config --global user.name "github-actions[bot]"
- name: Get version and new branch


@@ -72,7 +72,7 @@ jobs:
- name: Verify packages installed
run: |
$GITHUB_WORKSPACE/python-setup/tests/check_requests_2_26_0.sh ${PYTHON_VERSION}
$GITHUB_WORKSPACE/python-setup/tests/check_requests.sh ${PYTHON_VERSION} 2.31.0
# This one shouldn't fail, but also won't install packages
test-setup-python-scripts-non-standard-location:
@@ -170,5 +170,5 @@ jobs:
- name: Verify packages installed
run: |
$cmd = $Env:GITHUB_WORKSPACE + "\\python-setup\\tests\\check_requests_2_26_0.ps1"
powershell -File $cmd $Env:PYTHON_VERSION
$cmd = $Env:GITHUB_WORKSPACE + "\\python-setup\\tests\\check_requests.ps1"
powershell -File $cmd $Env:PYTHON_VERSION 2.31.0


@@ -30,7 +30,7 @@ jobs:
- name: Update git config
run: |
git config --global user.email "github-actions@github.com"
git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
git config --global user.name "github-actions[bot]"
- name: Update bundle


@@ -29,7 +29,7 @@ jobs:
git checkout "origin/$BRANCH"
.github/workflows/script/update-node-modules.sh update
if [ ! -z "$(git status --porcelain)" ]; then
git config --global user.email "github-actions@github.com"
git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
git config --global user.name "github-actions[bot]"
git add node_modules
git commit -am "Update checked-in dependencies"


@@ -35,7 +35,7 @@ jobs:
- name: Update git config
run: |
git config --global user.email "github-actions@github.com"
git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
git config --global user.name "github-actions[bot]"
- name: Update release branch


@@ -35,14 +35,22 @@ jobs:
npm run build
env:
ENTERPRISE_RELEASES_PATH: ${{ github.workspace }}/enterprise-releases/
- name: Commit Changes
uses: peter-evans/create-pull-request@5b4a9f6a9e2af26e5f02351490b90d01eb8ec1e5 # v5.0.0
with:
commit-message: Update supported GitHub Enterprise Server versions.
title: Update supported GitHub Enterprise Server versions.
body: ""
author: GitHub <noreply@github.com>
branch: update-supported-enterprise-server-versions
draft: true
- name: Update git config
run: |
git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
git config --global user.name "github-actions[bot]"
- name: Commit changes and open PR
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
if [[ -z $(git status --porcelain) ]]; then
echo "No changes to commit"
else
git checkout -b update-supported-enterprise-server-versions
git add .
git commit --message "Update supported GitHub Enterprise Server versions"
git push origin update-supported-enterprise-server-versions
gh pr create --fill --draft
fi


@@ -35,7 +35,10 @@ def main():
if oldest_supported_release is None or release_version < oldest_supported_release:
end_of_life_date = datetime.date.fromisoformat(release_data["end"])
if end_of_life_date > datetime.date.today():
# The GHES version is not actually end of life until the end of the day specified by
# `end_of_life_date`. Wait an extra week to be safe.
is_end_of_life = datetime.date.today() > end_of_life_date + datetime.timedelta(weeks=1)
if not is_end_of_life:
oldest_supported_release = release_version
api_compatibility_data = {


@@ -1,5 +1,28 @@
# CodeQL Action Changelog
## [UNRELEASED]
No user facing changes.
## 2.3.6 - 01 Jun 2023
- Update default CodeQL bundle version to 2.13.3. [#1698](https://github.com/github/codeql-action/pull/1698)
## 2.3.5 - 25 May 2023
- Allow invalid URIs to be used as values to `artifactLocation.uri` properties. This reverses a change from [#1668](https://github.com/github/codeql-action/pull/1668) that inadvertently led to stricter validation of some URI values. [#1705](https://github.com/github/codeql-action/pull/1705)
- Gracefully handle invalid URIs when fingerprinting. [#1694](https://github.com/github/codeql-action/pull/1694)
## 2.3.4 - 24 May 2023
- Updated the SARIF 2.1.0 JSON schema file to the latest from [oasis-tcs/sarif-spec](https://github.com/oasis-tcs/sarif-spec/blob/123e95847b13fbdd4cbe2120fa5e33355d4a042b/Schemata/sarif-schema-2.1.0.json). [#1668](https://github.com/github/codeql-action/pull/1668)
- We are rolling out a feature in May 2023 that will disable Python dependency installation for new users of the CodeQL Action. This improves the speed of analysis while having only a very minor impact on results. [#1676](https://github.com/github/codeql-action/pull/1676)
- We are improving the way that [CodeQL bundles](https://github.com/github/codeql-action/releases) are tagged to make it possible to easily identify bundles by their CodeQL semantic version. [#1682](https://github.com/github/codeql-action/pull/1682)
- As of CodeQL CLI 2.13.4, CodeQL bundles will be tagged using semantic versions, for example `codeql-bundle-v2.13.4`, instead of timestamps, like `codeql-bundle-20230615`.
- This change does not affect the majority of workflows, and we will not be changing tags for existing bundle releases.
- Some workflows with custom logic that depends on the specific format of the CodeQL bundle tag may need to be updated. For example, if your workflow matches CodeQL bundle tag names against a `codeql-bundle-yyyymmdd` pattern, you should update it to also recognize `codeql-bundle-vx.y.z` tags.
- Remove the requirement for `on.push` and `on.pull_request` to trigger on the same branches. [#1675](https://github.com/github/codeql-action/pull/1675)
## 2.3.3 - 04 May 2023
- Update default CodeQL bundle version to 2.13.1. [#1664](https://github.com/github/codeql-action/pull/1664)


@@ -12,7 +12,7 @@ Please note that this project is released with a [Contributor Code of Conduct][c
## Development and Testing
Before you start, ensure that you have a recent version of node (14 or higher) installed, along with a recent version of npm (7 or higher). You can see which version of node is used by the action in `init/action.yml`.
Before you start, ensure that you have a recent version of node (16 or higher) installed, along with a recent version of npm (9.2 or higher). You can see which version of node is used by the action in `init/action.yml`.
### Common tasks


@@ -170,3 +170,7 @@ You can use Actions or environment variables to share configuration across multi
## Troubleshooting
Read about [troubleshooting code scanning](https://help.github.com/en/github/finding-security-vulnerabilities-and-errors-in-your-code/troubleshooting-code-scanning).
## Contributing
This project welcomes contributions. See [CONTRIBUTING.md](CONTRIBUTING.md) for details on how to build, install, and contribute.

lib/actions-util.js (generated): 71 changed lines

@@ -23,7 +23,7 @@ var __importStar = (this && this.__importStar) || function (mod) {
return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.getUploadValue = exports.printDebugLogs = exports.isAnalyzingDefaultBranch = exports.getRelativeScriptPath = exports.isRunningLocalAction = exports.workflowEventName = exports.sendStatusReport = exports.createStatusReportBase = exports.getActionVersion = exports.getActionsStatus = exports.getRef = exports.computeAutomationID = exports.getAutomationID = exports.getAnalysisKey = exports.determineMergeBaseCommitOid = exports.getCommitOid = exports.getTemporaryDirectory = exports.getOptionalInput = exports.getRequiredInput = void 0;
exports.getUploadValue = exports.printDebugLogs = exports.isAnalyzingDefaultBranch = exports.getRelativeScriptPath = exports.isRunningLocalAction = exports.getWorkflowEventName = exports.sendStatusReport = exports.createStatusReportBase = exports.getActionVersion = exports.getActionsStatus = exports.getRef = exports.computeAutomationID = exports.getAutomationID = exports.getAnalysisKey = exports.determineMergeBaseCommitOid = exports.getCommitOid = exports.getTemporaryDirectory = exports.getOptionalInput = exports.getRequiredInput = void 0;
const fs = __importStar(require("fs"));
const os = __importStar(require("os"));
const path = __importStar(require("path"));
@@ -104,7 +104,7 @@ exports.getCommitOid = getCommitOid;
* Returns undefined if run by other triggers or the merge base cannot be determined.
*/
const determineMergeBaseCommitOid = async function () {
if (workflowEventName() !== "pull_request") {
if (getWorkflowEventName() !== "pull_request") {
return undefined;
}
const mergeSha = (0, util_1.getRequiredEnvParam)("GITHUB_SHA");
@@ -155,7 +155,7 @@ exports.determineMergeBaseCommitOid = determineMergeBaseCommitOid;
*
* This will combine the workflow path and current job name.
* Computing this the first time requires making requests to
* the github API, but after that the result will be cached.
* the GitHub API, but after that the result will be cached.
*/
async function getAnalysisKey() {
const analysisKeyEnvVar = "CODEQL_ACTION_ANALYSIS_KEY";
@@ -395,7 +395,8 @@ async function sendStatusReport(statusReport) {
if ((0, util_1.isHTTPError)(e)) {
switch (e.status) {
case 403:
if (workflowIsTriggeredByPushEvent() && isDependabotActor()) {
if (getWorkflowEventName() === "push" &&
process.env["GITHUB_ACTOR"] === "dependabot[bot]") {
core.setFailed('Workflows triggered by Dependabot on the "push" event run with read-only access. ' +
"Uploading Code Scanning results requires write access. " +
'To use Code Scanning with Dependabot, please ensure you are using the "pull_request" event for this workflow and avoid triggering on the "push" event for Dependabot branches. ' +
@@ -428,42 +429,36 @@ async function sendStatusReport(statusReport) {
}
}
exports.sendStatusReport = sendStatusReport;
function workflowEventName() {
// If the original event is dynamic CODESCANNING_EVENT_NAME will contain the right info (push/pull_request)
if (process.env["GITHUB_EVENT_NAME"] === "dynamic") {
const value = process.env["CODESCANNING_EVENT_NAME"];
if (value === undefined || value.length === 0) {
return process.env["GITHUB_EVENT_NAME"];
}
return value;
}
return process.env["GITHUB_EVENT_NAME"];
/**
* Returns the name of the event that triggered this workflow.
*
* This will be "dynamic" for default setup workflow runs.
*/
function getWorkflowEventName() {
return (0, util_1.getRequiredEnvParam)("GITHUB_EVENT_NAME");
}
exports.workflowEventName = workflowEventName;
// Was the workflow run triggered by a `push` event, for example as opposed to a `pull_request` event.
function workflowIsTriggeredByPushEvent() {
return workflowEventName() === "push";
}
// Is dependabot the actor that triggered the current workflow run.
function isDependabotActor() {
return process.env["GITHUB_ACTOR"] === "dependabot[bot]";
}
// Is the current action executing a local copy (i.e. we're running a workflow on the codeql-action repo itself)
// as opposed to running a remote action (i.e. when another repo references us)
exports.getWorkflowEventName = getWorkflowEventName;
/**
* Returns whether the current workflow is executing a local copy of the Action, e.g. we're running
* a workflow on the codeql-action repo itself.
*/
function isRunningLocalAction() {
const relativeScriptPath = getRelativeScriptPath();
return (relativeScriptPath.startsWith("..") || path.isAbsolute(relativeScriptPath));
}
exports.isRunningLocalAction = isRunningLocalAction;
// Get the location where the action is running from.
// This can be used to get the actions name or tell if we're running a local action.
/**
* Get the location where the Action is running from.
*
* This can be used to get the Action's name or tell if we're running a local Action.
*/
function getRelativeScriptPath() {
const runnerTemp = (0, util_1.getRequiredEnvParam)("RUNNER_TEMP");
const actionsDirectory = path.join(path.dirname(runnerTemp), "_actions");
return path.relative(actionsDirectory, __filename);
}
exports.getRelativeScriptPath = getRelativeScriptPath;
// Reads the contents of GITHUB_EVENT_PATH as a JSON object
/** Returns the contents of `GITHUB_EVENT_PATH` as a JSON object. */
function getWorkflowEvent() {
const eventJsonFile = (0, util_1.getRequiredEnvParam)("GITHUB_EVENT_PATH");
try {
@@ -476,10 +471,13 @@ function getWorkflowEvent() {
function removeRefsHeadsPrefix(ref) {
return ref.startsWith("refs/heads/") ? ref.slice("refs/heads/".length) : ref;
}
// Returns whether we are analyzing the default branch for the repository.
// For cases where the repository information might not be available (e.g.,
// dynamic workflows), this can be forced by the environment variable
// CODE_SCANNING_IS_ANALYZING_DEFAULT_BRANCH.
/**
* Returns whether we are analyzing the default branch for the repository.
*
* This first checks the environment variable `CODE_SCANNING_IS_ANALYZING_DEFAULT_BRANCH`. This
* environment variable can be set in cases where repository information might not be available, for
* example dynamic workflows.
*/
async function isAnalyzingDefaultBranch() {
if (process.env.CODE_SCANNING_IS_ANALYZING_DEFAULT_BRANCH === "true") {
return true;
@@ -489,8 +487,8 @@ async function isAnalyzingDefaultBranch() {
currentRef = removeRefsHeadsPrefix(currentRef);
const event = getWorkflowEvent();
let defaultBranch = event?.repository?.default_branch;
if (process.env.GITHUB_EVENT_NAME === "schedule") {
defaultBranch = removeRefsHeadsPrefix((0, util_1.getRequiredEnvParam)("GITHUB_REF"));
if (getWorkflowEventName() === "schedule") {
defaultBranch = removeRefsHeadsPrefix(getRefFromEnv());
}
return currentRef === defaultBranch;
}
@@ -524,7 +522,10 @@ async function printDebugLogs(config) {
}
}
exports.printDebugLogs = printDebugLogs;
// Parses the `upload` input into an `UploadKind`, converting unspecified and deprecated upload inputs appropriately.
/**
* Parses the `upload` input into an `UploadKind`, converting unspecified and deprecated upload
* inputs appropriately.
*/
function getUploadValue(input) {
switch (input) {
case undefined:

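For context on the actions-util change above: the event name now comes straight from `GITHUB_EVENT_NAME` (so default setup runs simply report "dynamic"), and the Dependabot check is inlined where the 403 upload error is handled. A minimal standalone sketch of the same guard, using plain `process.env` reads instead of the action's `getRequiredEnvParam` helper, assuming only the standard Actions environment variables:

// Sketch of the simplified event-name helper and the Dependabot push guard.
function getWorkflowEventName(): string {
  const name = process.env["GITHUB_EVENT_NAME"];
  if (!name) {
    throw new Error("GITHUB_EVENT_NAME is not set");
  }
  // "push", "pull_request", or "dynamic" for default setup workflows.
  return name;
}

function isDependabotPush(): boolean {
  return (
    getWorkflowEventName() === "push" &&
    process.env["GITHUB_ACTOR"] === "dependabot[bot]"
  );
}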
File diff suppressed because one or more lines are too long

lib/actions-util.test.js generated

@@ -172,6 +172,7 @@ const util_1 = require("./util");
t.deepEqual(process.env.CODEQL_ACTION_VERSION, "1.2.3");
});
(0, ava_1.default)("isAnalyzingDefaultBranch()", async (t) => {
process.env["GITHUB_EVENT_NAME"] = "push";
process.env["CODE_SCANNING_IS_ANALYZING_DEFAULT_BRANCH"] = "true";
t.deepEqual(await actionsutil.isAnalyzingDefaultBranch(), true);
process.env["CODE_SCANNING_IS_ANALYZING_DEFAULT_BRANCH"] = "false";
@@ -210,12 +211,4 @@ const util_1 = require("./util");
getAdditionalInputStub.restore();
});
});
(0, ava_1.default)("workflowEventName()", async (t) => {
process.env["GITHUB_EVENT_NAME"] = "push";
t.deepEqual(actionsutil.workflowEventName(), "push");
process.env["GITHUB_EVENT_NAME"] = "dynamic";
t.deepEqual(actionsutil.workflowEventName(), "dynamic");
process.env["CODESCANNING_EVENT_NAME"] = "push";
t.deepEqual(actionsutil.workflowEventName(), "push");
});
//# sourceMappingURL=actions-util.test.js.map

File diff suppressed because one or more lines are too long

lib/analyze-action.js generated

@@ -163,7 +163,7 @@ async function run() {
const gitHubVersion = await (0, api_client_1.getGitHubVersion)();
const features = new feature_flags_1.Features(gitHubVersion, repositoryNwo, actionsUtil.getTemporaryDirectory(), logger);
await runAutobuildIfLegacyGoWorkflow(config, logger);
dbCreationTimings = await (0, analyze_1.runFinalize)(outputDir, threads, memory, config, logger);
dbCreationTimings = await (0, analyze_1.runFinalize)(outputDir, threads, memory, config, logger, features);
if (actionsUtil.getRequiredInput("skip-queries") !== "true") {
runStats = await (0, analyze_1.runQueries)(outputDir, memory, util.getAddSnippetsFlag(actionsUtil.getRequiredInput("add-snippets")), threads, actionsUtil.getOptionalInput("category"), config, logger, features);
}

File diff suppressed because one or more lines are too long

lib/analyze.js generated

@@ -36,6 +36,7 @@ const yaml = __importStar(require("js-yaml"));
const analysisPaths = __importStar(require("./analysis-paths"));
const codeql_1 = require("./codeql");
const configUtils = __importStar(require("./config-utils"));
const feature_flags_1 = require("./feature-flags");
const languages_1 = require("./languages");
const tracer_config_1 = require("./tracer-config");
const util = __importStar(require("./util"));
@@ -47,12 +48,17 @@ class CodeQLAnalysisError extends Error {
}
}
exports.CodeQLAnalysisError = CodeQLAnalysisError;
async function setupPythonExtractor(logger) {
async function setupPythonExtractor(logger, features, codeql) {
const codeqlPython = process.env["CODEQL_PYTHON"];
if (codeqlPython === undefined || codeqlPython.length === 0) {
// If CODEQL_PYTHON is not set, no dependencies were installed, so we don't need to do anything
return;
}
if (await features.getValue(feature_flags_1.Feature.DisablePythonDependencyInstallation, codeql)) {
logger.warning("We recommend that you remove the CODEQL_PYTHON environment variable from your workflow. This environment variable was originally used to specify a Python executable that included the dependencies of your Python code, however Python analysis no longer uses these dependencies." +
"\nIf you used CODEQL_PYTHON to force the version of Python to analyze as, please use CODEQL_EXTRACTOR_PYTHON_ANALYSIS_VERSION instead, such as 'CODEQL_EXTRACTOR_PYTHON_ANALYSIS_VERSION=2.7' or 'CODEQL_EXTRACTOR_PYTHON_ANALYSIS_VERSION=3.11'.");
return;
}
const scriptsFolder = path.resolve(__dirname, "../python-setup");
let output = "";
const options = {
@@ -70,7 +76,7 @@ async function setupPythonExtractor(logger) {
logger.info(`Setting LGTM_PYTHON_SETUP_VERSION=${output}`);
process.env["LGTM_PYTHON_SETUP_VERSION"] = output;
}
async function createdDBForScannedLanguages(codeql, config, logger) {
async function createdDBForScannedLanguages(codeql, config, logger, features) {
// Insert the LGTM_INDEX_X env vars at this point so they are set when
// we extract any scanned languages.
analysisPaths.includeAndExcludeAnalysisPaths(config);
@@ -79,7 +85,7 @@ async function createdDBForScannedLanguages(codeql, config, logger) {
!dbIsFinalized(config, language, logger)) {
logger.startGroup(`Extracting ${language}`);
if (language === languages_1.Language.python) {
await setupPythonExtractor(logger);
await setupPythonExtractor(logger, features, codeql);
}
await codeql.extractScannedLanguage(config, language);
logger.endGroup();
@@ -99,10 +105,10 @@ function dbIsFinalized(config, language, logger) {
}
}
exports.dbIsFinalized = dbIsFinalized;
async function finalizeDatabaseCreation(config, threadsFlag, memoryFlag, logger) {
async function finalizeDatabaseCreation(config, threadsFlag, memoryFlag, logger, features) {
const codeql = await (0, codeql_1.getCodeQL)(config.codeQLCmd);
const extractionStart = perf_hooks_1.performance.now();
await createdDBForScannedLanguages(codeql, config, logger);
await createdDBForScannedLanguages(codeql, config, logger, features);
const extractionTime = perf_hooks_1.performance.now() - extractionStart;
const trapImportStart = perf_hooks_1.performance.now();
for (const language of config.languages) {
@@ -203,7 +209,7 @@ async function runQueries(sarifFolder, memoryFlag, addSnippetsFlag, threadsFlag,
logger.endGroup();
logger.info(analysisSummary);
}
logger.info(await runPrintLinesOfCode(language));
await runPrintLinesOfCode(language);
}
catch (e) {
logger.info(String(e));
@@ -271,7 +277,7 @@ function createQuerySuiteContents(queries, queryFilters) {
return yaml.dump(queries.map((q) => ({ query: q })).concat(queryFilters));
}
exports.createQuerySuiteContents = createQuerySuiteContents;
async function runFinalize(outputDir, threadsFlag, memoryFlag, config, logger) {
async function runFinalize(outputDir, threadsFlag, memoryFlag, config, logger, features) {
try {
await (0, del_1.default)(outputDir, { force: true });
}
@@ -281,7 +287,7 @@ async function runFinalize(outputDir, threadsFlag, memoryFlag, config, logger) {
}
}
await fs.promises.mkdir(outputDir, { recursive: true });
const timings = await finalizeDatabaseCreation(config, threadsFlag, memoryFlag, logger);
const timings = await finalizeDatabaseCreation(config, threadsFlag, memoryFlag, logger, features);
// WARNING: This does not _really_ end tracing, as the tracer will restore its
// critical environment variables and it'll still be active for all processes
// launched from this build step.

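The thread running through this file's diff is a new `features` parameter passed from `runFinalize` through `finalizeDatabaseCreation` and `createdDBForScannedLanguages` down to `setupPythonExtractor`, so that Python dependency handling can be skipped when the `DisablePythonDependencyInstallation` feature is on. A rough sketch of that gating pattern, with `FeatureEnablement` as a hypothetical stand-in for the action's features object:

// Hypothetical minimal interface; the real features object also takes the CodeQL
// instance so it can enforce a minimum CLI version.
interface FeatureEnablement {
  getValue(feature: string): Promise<boolean>;
}

async function setupPythonDependencies(features: FeatureEnablement): Promise<void> {
  if (!process.env["CODEQL_PYTHON"]) {
    // Nothing was installed by the init step, so there is nothing to wire up.
    return;
  }
  if (await features.getValue("disable_python_dependency_installation")) {
    console.warn("CODEQL_PYTHON is set but dependency installation is disabled; skipping.");
    return;
  }
  // ...otherwise run the python-setup scripts, as before...
}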
File diff suppressed because one or more lines are too long

lib/codeql.js generated

@@ -66,7 +66,6 @@ const CODEQL_MINIMUM_VERSION = "2.8.5";
* For convenience, please keep these in descending order. Once a version
* flag is older than the oldest supported version above, it may be removed.
*/
const CODEQL_VERSION_CUSTOM_QUERY_HELP = "2.7.1";
const CODEQL_VERSION_LUA_TRACER_CONFIG = "2.10.0";
const CODEQL_VERSION_LUA_TRACING_GO_WINDOWS_FIXED = "2.10.4";
exports.CODEQL_VERSION_GHES_PACK_DOWNLOAD = "2.10.4";
@@ -174,6 +173,7 @@ function setCodeQL(partialCodeql) {
databasePrintBaseline: resolveFunction(partialCodeql, "databasePrintBaseline"),
databaseExportDiagnostics: resolveFunction(partialCodeql, "databaseExportDiagnostics"),
diagnosticsExport: resolveFunction(partialCodeql, "diagnosticsExport"),
resolveExtractor: resolveFunction(partialCodeql, "resolveExtractor"),
};
return cachedCodeQL;
}
@@ -269,10 +269,7 @@ async function getCodeQLForCmd(cmd, checkVersion) {
], { stdin: externalRepositoryToken });
},
async runAutobuild(language) {
const cmdName = process.platform === "win32" ? "autobuild.cmd" : "autobuild.sh";
// The autobuilder for Swift is located in the experimental/ directory.
const possibleExperimentalDir = language === languages_1.Language.swift ? "experimental" : "";
const autobuildCmd = path.join(path.dirname(cmd), possibleExperimentalDir, language, "tools", cmdName);
const autobuildCmd = path.join(await this.resolveExtractor(language), "tools", process.platform === "win32" ? "autobuild.cmd" : "autobuild.sh");
// Update JAVA_TOOL_OPTIONS to contain '-Dhttp.keepAlive=false'
// This is because of an issue with Azure pipelines timing out connections after 4 minutes
// and Maven not properly handling closed connections
@@ -301,31 +298,9 @@ async function getCodeQLForCmd(cmd, checkVersion) {
},
async extractScannedLanguage(config, language) {
const databasePath = util.getCodeQLDatabasePath(config, language);
// Get extractor location
//
// Request it using `format=json` so we don't need to strip the trailing new line generated by
// the CLI.
let extractorPath = "";
await new toolrunner.ToolRunner(cmd, [
"resolve",
"extractor",
"--format=json",
`--language=${language}`,
...getExtraOptionsFromEnv(["resolve", "extractor"]),
], {
silent: true,
listeners: {
stdout: (data) => {
extractorPath += data.toString();
},
stderr: (data) => {
process.stderr.write(data);
},
},
}).exec();
// Set trace command
const ext = process.platform === "win32" ? ".cmd" : ".sh";
const traceCommand = path.resolve(JSON.parse(extractorPath), "tools", `autobuild${ext}`);
const traceCommand = path.resolve(await this.resolveExtractor(language), "tools", `autobuild${ext}`);
// Run trace command
await (0, toolrunner_error_catcher_1.toolrunnerErrorCatcher)(cmd, [
"database",
@@ -438,12 +413,11 @@ async function getCodeQLForCmd(cmd, checkVersion) {
addSnippetsFlag,
"--print-diagnostics-summary",
"--print-metrics-summary",
"--sarif-add-query-help",
"--sarif-group-rules-by-pack",
...(await getCodeScanningConfigExportArguments(config, this, features)),
...getExtraOptionsFromEnv(["database", "interpret-results"]),
];
if (await util.codeQlVersionAbove(this, CODEQL_VERSION_CUSTOM_QUERY_HELP))
codeqlArgs.push("--sarif-add-query-help");
if (automationDetailsId !== undefined) {
codeqlArgs.push("--sarif-category", automationDetailsId);
}
@@ -581,6 +555,29 @@ async function getCodeQLForCmd(cmd, checkVersion) {
}
await new toolrunner.ToolRunner(cmd, args).exec();
},
async resolveExtractor(language) {
// Request it using `format=json` so we don't need to strip the trailing new line generated by
// the CLI.
let extractorPath = "";
await new toolrunner.ToolRunner(cmd, [
"resolve",
"extractor",
"--format=json",
`--language=${language}`,
...getExtraOptionsFromEnv(["resolve", "extractor"]),
], {
silent: true,
listeners: {
stdout: (data) => {
extractorPath += data.toString();
},
stderr: (data) => {
process.stderr.write(data);
},
},
}).exec();
return JSON.parse(extractorPath);
},
};
// To ensure that status reports include the CodeQL CLI version wherever
// possible, we want to call getVersion(), which populates the version value

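The main refactor in this file is the new `resolveExtractor` method, which wraps `codeql resolve extractor --format=json --language=<lang>` and is now shared by `runAutobuild` and `extractScannedLanguage` when they build the autobuild script path. A standalone sketch of the same call using `@actions/exec` (the action itself goes through its own ToolRunner wrapper):

import * as exec from "@actions/exec";
import * as path from "path";

// Resolve the extractor root for a language, then build the autobuild script path
// from it, mirroring resolveExtractor()/runAutobuild() in the diff above.
async function resolveExtractor(codeqlCmd: string, language: string): Promise<string> {
  let stdout = "";
  await exec.exec(
    codeqlCmd,
    ["resolve", "extractor", "--format=json", `--language=${language}`],
    {
      silent: true,
      listeners: {
        stdout: (data) => {
          stdout += data.toString();
        },
      },
    }
  );
  // With --format=json the CLI prints a JSON-encoded string, so parse it rather
  // than trimming trailing newlines by hand.
  return JSON.parse(stdout) as string;
}

async function getAutobuildScript(codeqlCmd: string, language: string): Promise<string> {
  const scriptName = process.platform === "win32" ? "autobuild.cmd" : "autobuild.sh";
  return path.join(await resolveExtractor(codeqlCmd, language), "tools", scriptName);
}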
File diff suppressed because one or more lines are too long

lib/codeql.test.js generated

@@ -49,20 +49,11 @@ const testing_utils_1 = require("./testing-utils");
const util = __importStar(require("./util"));
const util_1 = require("./util");
(0, testing_utils_1.setupTests)(ava_1.default);
const sampleApiDetails = {
auth: "token",
url: "https://github.com",
apiURL: "https://api.github.com",
};
const sampleGHAEApiDetails = {
auth: "token",
url: "https://example.githubenterprise.com",
apiURL: "https://example.githubenterprise.com/api/v3",
};
const SAMPLE_DEFAULT_CLI_VERSION = {
cliVersion: "2.0.0",
variant: util.GitHubVariant.DOTCOM,
};
let stubConfig;
ava_1.default.beforeEach(() => {
(0, util_1.initializeEnvironment)("1.2.3");
@@ -91,34 +82,13 @@ ava_1.default.beforeEach(() => {
trapCacheDownloadTime: 0,
};
});
/**
* Mocks the API for downloading the bundle tagged `tagName`.
*
* @returns the download URL for the bundle. This can be passed to the tools parameter of
* `codeql.setupCodeQL`.
*/
function mockDownloadApi({ apiDetails = sampleApiDetails, isPinned, repo = "github/codeql-action", platformSpecific = true, tagName, }) {
const platform = process.platform === "win32"
? "win64"
: process.platform === "linux"
? "linux64"
: "osx64";
const baseUrl = apiDetails?.url ?? "https://example.com";
const relativeUrl = apiDetails
? `/${repo}/releases/download/${tagName}/codeql-bundle${platformSpecific ? `-${platform}` : ""}.tar.gz`
: `/download/${tagName}/codeql-bundle.tar.gz`;
(0, nock_1.default)(baseUrl)
.get(relativeUrl)
.replyWithFile(200, path_1.default.join(__dirname, `/../src/testdata/codeql-bundle${isPinned ? "-pinned" : ""}.tar.gz`));
return `${baseUrl}${relativeUrl}`;
}
async function installIntoToolcache({ apiDetails = sampleApiDetails, cliVersion, isPinned, tagName, tmpDir, }) {
const url = mockDownloadApi({ apiDetails, isPinned, tagName });
async function installIntoToolcache({ apiDetails = testing_utils_1.SAMPLE_DOTCOM_API_DETAILS, cliVersion, isPinned, tagName, tmpDir, }) {
const url = (0, testing_utils_1.mockBundleDownloadApi)({ apiDetails, isPinned, tagName });
await codeql.setupCodeQL(cliVersion !== undefined ? undefined : url, apiDetails, tmpDir, util.GitHubVariant.GHES, cliVersion !== undefined
? { cliVersion, tagName, variant: util.GitHubVariant.GHES }
: SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
: testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
}
function mockReleaseApi({ apiDetails = sampleApiDetails, assetNames, tagName, }) {
function mockReleaseApi({ apiDetails = testing_utils_1.SAMPLE_DOTCOM_API_DETAILS, assetNames, tagName, }) {
return (0, nock_1.default)(apiDetails.apiURL)
.get(`/repos/github/codeql-action/releases/tags/${tagName}`)
.reply(200, {
@@ -149,11 +119,11 @@ function mockApiDetails(apiDetails) {
const versions = ["20200601", "20200610"];
for (let i = 0; i < versions.length; i++) {
const version = versions[i];
const url = mockDownloadApi({
const url = (0, testing_utils_1.mockBundleDownloadApi)({
tagName: `codeql-bundle-${version}`,
isPinned: false,
});
const result = await codeql.setupCodeQL(url, sampleApiDetails, tmpDir, util.GitHubVariant.DOTCOM, SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
const result = await codeql.setupCodeQL(url, testing_utils_1.SAMPLE_DOTCOM_API_DETAILS, tmpDir, util.GitHubVariant.DOTCOM, testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
t.assert(toolcache.find("CodeQL", `0.0.0-${version}`));
t.is(result.toolsVersion, `0.0.0-${version}`);
t.is(result.toolsSource, init_1.ToolsSource.Download);
@@ -170,10 +140,10 @@ function mockApiDetails(apiDetails) {
isPinned: true,
tmpDir,
});
const url = mockDownloadApi({
const url = (0, testing_utils_1.mockBundleDownloadApi)({
tagName: "codeql-bundle-20200610",
});
const result = await codeql.setupCodeQL(url, sampleApiDetails, tmpDir, util.GitHubVariant.DOTCOM, SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
const result = await codeql.setupCodeQL(url, testing_utils_1.SAMPLE_DOTCOM_API_DETAILS, tmpDir, util.GitHubVariant.DOTCOM, testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
t.assert(toolcache.find("CodeQL", "0.0.0-20200610"));
t.deepEqual(result.toolsVersion, "0.0.0-20200610");
t.is(result.toolsSource, init_1.ToolsSource.Download);
@@ -198,16 +168,16 @@ for (const { cliVersion, expectedToolcacheVersion, } of EXPLICITLY_REQUESTED_BUN
(0, ava_1.default)(`caches an explicitly requested bundle containing CLI ${cliVersion} as ${expectedToolcacheVersion}`, async (t) => {
await util.withTmpDir(async (tmpDir) => {
(0, testing_utils_1.setupActionsVars)(tmpDir, tmpDir);
mockApiDetails(sampleApiDetails);
mockApiDetails(testing_utils_1.SAMPLE_DOTCOM_API_DETAILS);
sinon.stub(actionsUtil, "isRunningLocalAction").returns(true);
const releaseApiMock = mockReleaseApi({
assetNames: [`cli-version-${cliVersion}.txt`],
tagName: "codeql-bundle-20200610",
});
const url = mockDownloadApi({
const url = (0, testing_utils_1.mockBundleDownloadApi)({
tagName: "codeql-bundle-20200610",
});
const result = await codeql.setupCodeQL(url, sampleApiDetails, tmpDir, util.GitHubVariant.DOTCOM, SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
const result = await codeql.setupCodeQL(url, testing_utils_1.SAMPLE_DOTCOM_API_DETAILS, tmpDir, util.GitHubVariant.DOTCOM, testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
t.assert(releaseApiMock.isDone(), "Releases API should have been called");
t.assert(toolcache.find("CodeQL", expectedToolcacheVersion));
t.deepEqual(result.toolsVersion, cliVersion);
@@ -220,19 +190,19 @@ for (const { githubReleases, toolcacheVersion } of [
// Test that we use the tools from the toolcache when `SAMPLE_DEFAULT_CLI_VERSION` is requested
// and `SAMPLE_DEFAULT_CLI_VERSION-` is in the toolcache.
{
toolcacheVersion: SAMPLE_DEFAULT_CLI_VERSION.cliVersion,
toolcacheVersion: testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION.cliVersion,
},
{
githubReleases: {
"codeql-bundle-20230101": `cli-version-${SAMPLE_DEFAULT_CLI_VERSION.cliVersion}.txt`,
"codeql-bundle-20230101": `cli-version-${testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION.cliVersion}.txt`,
},
toolcacheVersion: "0.0.0-20230101",
},
{
toolcacheVersion: `${SAMPLE_DEFAULT_CLI_VERSION.cliVersion}-20230101`,
toolcacheVersion: `${testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION.cliVersion}-20230101`,
},
]) {
(0, ava_1.default)(`uses tools from toolcache when ${SAMPLE_DEFAULT_CLI_VERSION.cliVersion} is requested and ` +
(0, ava_1.default)(`uses tools from toolcache when ${testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION.cliVersion} is requested and ` +
`${toolcacheVersion} is installed`, async (t) => {
await util.withTmpDir(async (tmpDir) => {
(0, testing_utils_1.setupActionsVars)(tmpDir, tmpDir);
@@ -256,8 +226,8 @@ for (const { githubReleases, toolcacheVersion } of [
}))),
}));
}
const result = await codeql.setupCodeQL(undefined, sampleApiDetails, tmpDir, util.GitHubVariant.DOTCOM, SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
t.is(result.toolsVersion, SAMPLE_DEFAULT_CLI_VERSION.cliVersion);
const result = await codeql.setupCodeQL(undefined, testing_utils_1.SAMPLE_DOTCOM_API_DETAILS, tmpDir, util.GitHubVariant.DOTCOM, testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
t.is(result.toolsVersion, testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION.cliVersion);
t.is(result.toolsSource, init_1.ToolsSource.Toolcache);
t.is(result.toolsDownloadDurationMs, undefined);
});
@@ -272,7 +242,7 @@ for (const variant of [util.GitHubVariant.GHAE, util.GitHubVariant.GHES]) {
isPinned: true,
tmpDir,
});
const result = await codeql.setupCodeQL(undefined, sampleApiDetails, tmpDir, variant, {
const result = await codeql.setupCodeQL(undefined, testing_utils_1.SAMPLE_DOTCOM_API_DETAILS, tmpDir, variant, {
cliVersion: defaults.cliVersion,
tagName: defaults.bundleVersion,
variant,
@@ -292,10 +262,10 @@ for (const variant of [util.GitHubVariant.GHAE, util.GitHubVariant.GHES]) {
isPinned: false,
tmpDir,
});
mockDownloadApi({
(0, testing_utils_1.mockBundleDownloadApi)({
tagName: defaults.bundleVersion,
});
const result = await codeql.setupCodeQL(undefined, sampleApiDetails, tmpDir, variant, {
const result = await codeql.setupCodeQL(undefined, testing_utils_1.SAMPLE_DOTCOM_API_DETAILS, tmpDir, variant, {
cliVersion: defaults.cliVersion,
tagName: defaults.bundleVersion,
variant,
@@ -316,10 +286,10 @@ for (const variant of [util.GitHubVariant.GHAE, util.GitHubVariant.GHES]) {
isPinned: true,
tmpDir,
});
mockDownloadApi({
(0, testing_utils_1.mockBundleDownloadApi)({
tagName: defaults.bundleVersion,
});
const result = await codeql.setupCodeQL("latest", sampleApiDetails, tmpDir, util.GitHubVariant.DOTCOM, SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
const result = await codeql.setupCodeQL("latest", testing_utils_1.SAMPLE_DOTCOM_API_DETAILS, tmpDir, util.GitHubVariant.DOTCOM, testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
t.deepEqual(result.toolsVersion, defaults.cliVersion);
t.is(result.toolsSource, init_1.ToolsSource.Download);
t.assert(Number.isInteger(result.toolsDownloadDurationMs));
@@ -375,18 +345,18 @@ for (const isBundleVersionInUrl of [true, false]) {
(0, ava_1.default)("bundle URL from another repo is cached as 0.0.0-bundleVersion", async (t) => {
await util.withTmpDir(async (tmpDir) => {
(0, testing_utils_1.setupActionsVars)(tmpDir, tmpDir);
mockApiDetails(sampleApiDetails);
mockApiDetails(testing_utils_1.SAMPLE_DOTCOM_API_DETAILS);
sinon.stub(actionsUtil, "isRunningLocalAction").returns(true);
const releasesApiMock = mockReleaseApi({
assetNames: ["cli-version-2.12.2.txt"],
tagName: "codeql-bundle-20230203",
});
mockDownloadApi({
(0, testing_utils_1.mockBundleDownloadApi)({
repo: "codeql-testing/codeql-cli-nightlies",
platformSpecific: false,
tagName: "codeql-bundle-20230203",
});
const result = await codeql.setupCodeQL("https://github.com/codeql-testing/codeql-cli-nightlies/releases/download/codeql-bundle-20230203/codeql-bundle.tar.gz", sampleApiDetails, tmpDir, util.GitHubVariant.DOTCOM, SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
const result = await codeql.setupCodeQL("https://github.com/codeql-testing/codeql-cli-nightlies/releases/download/codeql-bundle-20230203/codeql-bundle.tar.gz", testing_utils_1.SAMPLE_DOTCOM_API_DETAILS, tmpDir, util.GitHubVariant.DOTCOM, testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION, (0, logging_1.getRunnerLogger)(true), false);
t.is(result.toolsVersion, "0.0.0-20230203");
t.is(result.toolsSource, init_1.ToolsSource.Download);
t.true(Number.isInteger(result.toolsDownloadDurationMs));
@@ -418,24 +388,6 @@ for (const isBundleVersionInUrl of [true, false]) {
t.throws(() => codeql.getExtraOptions({ foo: 87 }, ["foo"], []));
t.throws(() => codeql.getExtraOptions({ "*": [42], foo: { "*": 87, bar: [99] } }, ["foo", "bar"], []));
});
(0, ava_1.default)("databaseInterpretResults() does not set --sarif-add-query-help for 2.7.0", async (t) => {
const runnerConstructorStub = stubToolRunnerConstructor();
const codeqlObject = await codeql.getCodeQLForTesting();
sinon.stub(codeqlObject, "getVersion").resolves("2.7.0");
// safeWhich throws because of the test CodeQL object.
sinon.stub(safeWhich, "safeWhich").resolves("");
await codeqlObject.databaseInterpretResults("", [], "", "", "", "-v", "", stubConfig, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
t.false(runnerConstructorStub.firstCall.args[1].includes("--sarif-add-query-help"), "--sarif-add-query-help should be absent, but it is present");
});
(0, ava_1.default)("databaseInterpretResults() sets --sarif-add-query-help for 2.7.1", async (t) => {
const runnerConstructorStub = stubToolRunnerConstructor();
const codeqlObject = await codeql.getCodeQLForTesting();
sinon.stub(codeqlObject, "getVersion").resolves("2.7.1");
// safeWhich throws because of the test CodeQL object.
sinon.stub(safeWhich, "safeWhich").resolves("");
await codeqlObject.databaseInterpretResults("", [], "", "", "", "-v", "", stubConfig, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
t.true(runnerConstructorStub.firstCall.args[1].includes("--sarif-add-query-help"), "--sarif-add-query-help should be present, but it is absent");
});
(0, ava_1.default)("databaseInitCluster() without injected codescanning config", async (t) => {
await util.withTmpDir(async (tempDir) => {
const runnerConstructorStub = stubToolRunnerConstructor();

File diff suppressed because one or more lines are too long

lib/defaults.json

@@ -1,6 +1,6 @@
{
"bundleVersion": "codeql-bundle-20230428",
"cliVersion": "2.13.1",
"priorBundleVersion": "codeql-bundle-20230414",
"priorCliVersion": "2.13.0"
"bundleVersion": "codeql-bundle-20230524",
"cliVersion": "2.13.3",
"priorBundleVersion": "codeql-bundle-20230428",
"priorCliVersion": "2.13.1"
}

lib/feature-flags.js generated

@@ -40,6 +40,7 @@ var Feature;
Feature["ExportDiagnosticsEnabled"] = "export_diagnostics_enabled";
Feature["MlPoweredQueriesEnabled"] = "ml_powered_queries_enabled";
Feature["UploadFailedSarifEnabled"] = "upload_failed_sarif_enabled";
Feature["DisablePythonDependencyInstallation"] = "disable_python_dependency_installation";
})(Feature = exports.Feature || (exports.Feature = {}));
exports.featureConfig = {
[Feature.DisableKotlinAnalysisEnabled]: {
@@ -72,6 +73,16 @@ exports.featureConfig = {
minimumVersion: "2.11.3",
defaultValue: true,
},
[Feature.DisablePythonDependencyInstallation]: {
envVar: "CODEQL_ACTION_DISABLE_PYTHON_DEPENDENCY_INSTALLATION",
// Although the python extractor only started supporting not extracting installed
// dependencies in 2.13.1, the init-action can still benefit from not installing
// dependencies no matter what codeql version we are using, so therefore the
// minimumVersion is set to 'undefined'. This means that with an old CodeQL version,
// packages available with current python3 installation might get extracted.
minimumVersion: undefined,
defaultValue: false,
},
};
exports.FEATURE_FLAGS_FILE_NAME = "cached-feature-flags.json";
/**

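The new `DisablePythonDependencyInstallation` entry follows the same shape as the other features: an opt-in environment variable, an optional minimum CLI version (deliberately `undefined` here, per the comment above), and a default. A hedged sketch of how such an entry is typically consulted; the exact precedence lives in `feature-flags.ts`, so treat this as an approximation rather than the action's implementation:

import * as semver from "semver";

interface FeatureConfig {
  envVar: string;
  minimumVersion: string | undefined;
  defaultValue: boolean;
}

// Approximate evaluation order: an explicit env var override wins, then the
// minimum-version gate, then the remotely supplied value or the default.
function isFeatureEnabled(
  cfg: FeatureConfig,
  cliVersion: string,
  remoteValue?: boolean
): boolean {
  const override = process.env[cfg.envVar];
  if (override === "true") return true;
  if (override === "false") return false;
  if (cfg.minimumVersion !== undefined && semver.lt(cliVersion, cfg.minimumVersion)) {
    return false;
  }
  return remoteValue ?? cfg.defaultValue;
}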
File diff suppressed because one or more lines are too long

lib/fingerprints.js generated

@@ -194,7 +194,14 @@ function resolveUriToFile(location, artifacts, sourceRoot, logger) {
logger.debug(`Ignoring location as URI "${location.uri}" is invalid`);
return undefined;
}
let uri = decodeURIComponent(location.uri);
let uri;
try {
uri = decodeURIComponent(location.uri);
}
catch (e) {
logger.debug(`Ignoring location as URI "${location.uri}" is invalid`);
return undefined;
}
// Remove a file scheme, and abort if the scheme is anything else
const fileUriPrefix = "file://";
if (uri.startsWith(fileUriPrefix)) {

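The try/catch added here matters because `decodeURIComponent` throws a `URIError` on malformed percent-escapes instead of returning its input, so an invalid URI previously surfaced as an unhandled exception rather than the location simply being skipped. A minimal illustration of the failure mode and the guard:

// decodeURIComponent() throws URIError on malformed escape sequences, which is
// why resolveUriToFile() above now wraps the call and ignores such locations.
function tryDecodeUri(uri: string): string | undefined {
  try {
    return decodeURIComponent(uri);
  } catch (e) {
    if (e instanceof URIError) {
      return undefined;
    }
    throw e;
  }
}

console.log(tryDecodeUri("src/My%20File.ts")); // "src/My File.ts"
console.log(tryDecodeUri("src/100%.ts"));      // undefined: "%" starts an invalid escape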
File diff suppressed because one or more lines are too long

lib/init-action-post-helper.js generated

@@ -110,8 +110,9 @@ async function run(uploadDatabaseBundleDebugArtifact, uploadLogsDebugArtifact, p
// but we didn't upload anything.
if (process.env["CODEQL_ACTION_EXPECT_UPLOAD_FAILED_SARIF"] === "true" &&
!uploadFailedSarifResult.raw_upload_size_bytes) {
const error = JSON.stringify(uploadFailedSarifResult);
throw new Error("Expected to upload a failed SARIF file for this CodeQL code scanning run, " +
`but the result was instead ${uploadFailedSarifResult}.`);
`but the result was instead ${error}.`);
}
// Upload appropriate Actions artifacts for debugging
if (config.debugMode) {

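The only change in this hunk is serializing the result object before interpolating it into the error message; a template literal would otherwise render it as "[object Object]" and lose the detail. For illustration, with a made-up result object:

// Default object-to-string conversion drops the payload; JSON.stringify keeps it.
const uploadFailedSarifResult = { raw_upload_size_bytes: undefined, reason: "timeout" };

console.log(`but the result was instead ${uploadFailedSarifResult}.`);
// => "but the result was instead [object Object]."

console.log(`but the result was instead ${JSON.stringify(uploadFailedSarifResult)}.`);
// => 'but the result was instead {"reason":"timeout"}.'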
lib/init-action-post-helper.js.map generated

(source map regenerated to match the change above; the mappings-only diff is omitted)

lib/init-action.js generated

@@ -136,12 +136,17 @@ async function run() {
(0, actions_util_1.getOptionalInput)("debug") === "true" || core.isDebug(), (0, actions_util_1.getOptionalInput)("debug-artifact-name") || util_1.DEFAULT_DEBUG_ARTIFACT_NAME, (0, actions_util_1.getOptionalInput)("debug-database-name") || util_1.DEFAULT_DEBUG_DATABASE_NAME, repositoryNwo, (0, actions_util_1.getTemporaryDirectory)(), codeql, (0, util_1.getRequiredEnvParam)("GITHUB_WORKSPACE"), gitHubVersion, apiDetails, features, logger);
if (config.languages.includes(languages_1.Language.python) &&
(0, actions_util_1.getRequiredInput)("setup-python-dependencies") === "true") {
try {
await (0, init_1.installPythonDeps)(codeql, logger);
if (await features.getValue(feature_flags_1.Feature.DisablePythonDependencyInstallation, codeql)) {
logger.info("Skipping python dependency installation");
}
catch (unwrappedError) {
const error = (0, util_1.wrapError)(unwrappedError);
logger.warning(`${error.message} You can call this action with 'setup-python-dependencies: false' to disable this process`);
else {
try {
await (0, init_1.installPythonDeps)(codeql, logger);
}
catch (unwrappedError) {
const error = (0, util_1.wrapError)(unwrappedError);
logger.warning(`${error.message} You can call this action with 'setup-python-dependencies: false' to disable this process`);
}
}
}
}
@@ -170,6 +175,10 @@ async function run() {
if (await features.getValue(feature_flags_1.Feature.DisableKotlinAnalysisEnabled)) {
core.exportVariable("CODEQL_EXTRACTOR_JAVA_AGENT_DISABLE_KOTLIN", "true");
}
// Disable Python dependency extraction if feature flag set
if (await features.getValue(feature_flags_1.Feature.DisablePythonDependencyInstallation, codeql)) {
core.exportVariable("CODEQL_EXTRACTOR_PYTHON_DISABLE_LIBRARY_EXTRACTION", "true");
}
const sourceRoot = path.resolve((0, util_1.getRequiredEnvParam)("GITHUB_WORKSPACE"), (0, actions_util_1.getOptionalInput)("source-root") || "");
const tracerConfig = await (0, init_1.runInit)(codeql, config, sourceRoot, "Runner.Worker.exe", registriesInput, features, apiDetails, logger);
if (tracerConfig !== undefined) {

File diff suppressed because one or more lines are too long

lib/setup-codeql.js generated

@@ -315,6 +315,13 @@ async function getCodeQLSource(toolsInput, defaultCliVersion, apiDetails, varian
// If a tools URL was provided, then use that.
tagName = tryGetTagNameFromUrl(toolsInput, logger);
url = toolsInput;
if (tagName) {
const bundleVersion = tryGetBundleVersionFromTagName(tagName, logger);
// If the bundle version is a semantic version, it is a CLI version number.
if (bundleVersion && semver.valid(bundleVersion)) {
cliVersion = convertToSemVer(bundleVersion, logger);
}
}
}
else {
// Otherwise, use the default CLI version passed in.

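This new branch means a tools URL that points at a semantically versioned bundle tag (for example `codeql-bundle-v1.2.3`, as in the new test further down) now also records the CLI version it contains. A standalone approximation of that inference using node-semver directly; `tryGetTagNameFromUrl` and `convertToSemVer` are the action's own helpers and are stood in for here with a regex and `semver.clean`:

import * as semver from "semver";

// If a tools URL ends in a "codeql-bundle-<version>" tag, try to read <version>
// as the semver of the CLI inside the bundle; date-style tags yield undefined.
function cliVersionFromToolsUrl(url: string): string | undefined {
  const match = url.match(/\/(codeql-bundle-[^/]+)\/[^/]+$/);
  if (match === null) {
    return undefined;
  }
  const bundleVersion = match[1].substring("codeql-bundle-".length); // "v1.2.3" or "20230524"
  return semver.clean(bundleVersion) ?? undefined; // strips a leading "v"; null if not semver
}

cliVersionFromToolsUrl(
  "https://github.com/github/codeql-action/releases/download/codeql-bundle-v1.2.3/codeql-bundle-linux64.tar.gz"
); // => "1.2.3"
cliVersionFromToolsUrl(
  "https://github.com/github/codeql-action/releases/download/codeql-bundle-20230524/codeql-bundle-linux64.tar.gz"
); // => undefined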
File diff suppressed because one or more lines are too long

lib/setup-codeql.test.js generated

@@ -117,4 +117,14 @@ ava_1.default.beforeEach(() => {
message: "Failed to find a release of the CodeQL tools that contains CodeQL CLI 2.12.1.",
});
});
(0, ava_1.default)("getCodeQLSource sets CLI version for a semver tagged bundle", async (t) => {
await (0, util_1.withTmpDir)(async (tmpDir) => {
(0, testing_utils_1.setupActionsVars)(tmpDir, tmpDir);
const tagName = "codeql-bundle-v1.2.3";
(0, testing_utils_1.mockBundleDownloadApi)({ tagName });
const source = await setupCodeql.getCodeQLSource(`https://github.com/github/codeql-action/releases/download/${tagName}/codeql-bundle-linux64.tar.gz`, testing_utils_1.SAMPLE_DEFAULT_CLI_VERSION, testing_utils_1.SAMPLE_DOTCOM_API_DETAILS, util_1.GitHubVariant.DOTCOM, (0, logging_1.getRunnerLogger)(true));
t.is(source.sourceType, "download");
t.is(source["cliVersion"], "1.2.3");
});
});
//# sourceMappingURL=setup-codeql.test.js.map

lib/setup-codeql.test.js.map generated

(source map regenerated to match the change above; the mappings-only diff is omitted)

lib/testing-utils.js generated

@@ -22,15 +22,28 @@ var __importStar = (this && this.__importStar) || function (mod) {
__setModuleDefault(result, mod);
return result;
};
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.createFeatures = exports.mockCodeQLVersion = exports.mockLanguagesInRepo = exports.mockFeatureFlagApiEndpoint = exports.getRecordingLogger = exports.setupActionsVars = exports.setupTests = void 0;
exports.mockBundleDownloadApi = exports.createFeatures = exports.mockCodeQLVersion = exports.mockLanguagesInRepo = exports.mockFeatureFlagApiEndpoint = exports.getRecordingLogger = exports.setupActionsVars = exports.setupTests = exports.SAMPLE_DEFAULT_CLI_VERSION = exports.SAMPLE_DOTCOM_API_DETAILS = void 0;
const node_util_1 = require("node:util");
const path_1 = __importDefault(require("path"));
const github = __importStar(require("@actions/github"));
const nock = __importStar(require("nock"));
const nock_1 = __importDefault(require("nock"));
const sinon = __importStar(require("sinon"));
const apiClient = __importStar(require("./api-client"));
const CodeQL = __importStar(require("./codeql"));
const util_1 = require("./util");
exports.SAMPLE_DOTCOM_API_DETAILS = {
auth: "token",
url: "https://github.com",
apiURL: "https://api.github.com",
};
exports.SAMPLE_DEFAULT_CLI_VERSION = {
cliVersion: "2.0.0",
variant: util_1.GitHubVariant.DOTCOM,
};
function wrapOutput(context) {
// Function signature taken from Socket.write.
// Note there are two overloads:
@@ -92,7 +105,7 @@ function setupTests(test) {
process.stdout.write(t.context.testOutput);
}
// Undo any modifications made by nock
nock.cleanAll();
nock_1.default.cleanAll();
// Undo any modifications made by sinon
sinon.restore();
// Undo any modifications to the env
@@ -196,4 +209,26 @@ function createFeatures(enabledFeatures) {
};
}
exports.createFeatures = createFeatures;
/**
* Mocks the API for downloading the bundle tagged `tagName`.
*
* @returns the download URL for the bundle. This can be passed to the tools parameter of
* `codeql.setupCodeQL`.
*/
function mockBundleDownloadApi({ apiDetails = exports.SAMPLE_DOTCOM_API_DETAILS, isPinned, repo = "github/codeql-action", platformSpecific = true, tagName, }) {
const platform = process.platform === "win32"
? "win64"
: process.platform === "linux"
? "linux64"
: "osx64";
const baseUrl = apiDetails?.url ?? "https://example.com";
const relativeUrl = apiDetails
? `/${repo}/releases/download/${tagName}/codeql-bundle${platformSpecific ? `-${platform}` : ""}.tar.gz`
: `/download/${tagName}/codeql-bundle.tar.gz`;
(0, nock_1.default)(baseUrl)
.get(relativeUrl)
.replyWithFile(200, path_1.default.join(__dirname, `/../src/testdata/codeql-bundle${isPinned ? "-pinned" : ""}.tar.gz`));
return `${baseUrl}${relativeUrl}`;
}
exports.mockBundleDownloadApi = mockBundleDownloadApi;
//# sourceMappingURL=testing-utils.js.map

File diff suppressed because one or more lines are too long

lib/trap-caching.js generated

@@ -91,7 +91,7 @@ async function downloadTrapCaches(codeql, languages, logger) {
}
let baseSha = "unknown";
const eventPath = process.env.GITHUB_EVENT_PATH;
if (actionsUtil.workflowEventName() === "pull_request" &&
if (actionsUtil.getWorkflowEventName() === "pull_request" &&
eventPath !== undefined) {
const event = JSON.parse(fs.readFileSync(path.resolve(eventPath), "utf-8"));
baseSha = event.pull_request?.base?.sha || baseSha;

File diff suppressed because one or more lines are too long

lib/upload-lib.js generated

@@ -179,18 +179,25 @@ exports.countResultsInSarif = countResultsInSarif;
// Throws an error if the file is invalid.
function validateSarifFileSchema(sarifFilePath, logger) {
const sarif = JSON.parse(fs.readFileSync(sarifFilePath, "utf8"));
const schema = require("../src/sarif_v2.1.0_schema.json");
const schema = require("../src/sarif-schema-2.1.0.json");
const result = new jsonschema.Validator().validate(sarif, schema);
if (!result.valid) {
// Filter errors related to invalid URIs in the artifactLocation field as this
// is a breaking change. See https://github.com/github/codeql-action/issues/1703
const errors = (result.errors || []).filter((err) => err.argument !== "uri-reference");
const warnings = (result.errors || []).filter((err) => err.argument === "uri-reference");
for (const warning of warnings) {
logger.info(`Warning: '${warning.instance}' is not a valid URI in '${warning.property}'.`);
}
if (errors.length) {
// Output the more verbose error messages in groups as these may be very large.
for (const error of result.errors) {
for (const error of errors) {
logger.startGroup(`Error details: ${error.stack}`);
logger.info(JSON.stringify(error, null, 2));
logger.endGroup();
}
// Set the main error message to the stacks of all the errors.
// This should be of a manageable size and may even give enough to fix the error.
const sarifErrors = result.errors.map((e) => `- ${e.stack}`);
const sarifErrors = errors.map((e) => `- ${e.stack}`);
throw new Error(`Unable to upload "${sarifFilePath}" as it is not valid SARIF:\n${sarifErrors.join("\n")}`);
}
}
@@ -213,7 +220,7 @@ function buildPayload(commitOid, ref, analysisKey, analysisName, zippedSarif, wo
base_ref: undefined,
base_sha: undefined,
};
if (actionsUtil.workflowEventName() === "pull_request") {
if (actionsUtil.getWorkflowEventName() === "pull_request") {
if (commitOid === util.getRequiredEnvParam("GITHUB_SHA") &&
mergeBaseCommitOid) {
// We're uploading results for the merge commit

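Two behavioural changes sit in this file's diff: schema violations whose failing keyword is `uri-reference` are now reported as warnings rather than failing the upload (see issue #1703), and the event-name helper rename carries through to `buildPayload`. A compact sketch of the warning/error partition, with the error shape taken from the fields the code above reads:

// Split jsonschema validation errors the way validateSarifFileSchema() now does:
// invalid "uri-reference" values become log-only warnings, everything else still fails.
interface SchemaError {
  argument: string;  // failing schema keyword argument, e.g. "uri-reference"
  stack: string;     // human-readable summary
  property: string;  // JSON path of the offending value
  instance: unknown; // the offending value itself
}

function partitionSarifSchemaErrors(all: SchemaError[]): {
  errors: SchemaError[];
  warnings: SchemaError[];
} {
  return {
    errors: all.filter((e) => e.argument !== "uri-reference"),
    warnings: all.filter((e) => e.argument === "uri-reference"),
  };
}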
File diff suppressed because one or more lines are too long

lib/upload-lib.test.js generated

@@ -233,6 +233,18 @@ ava_1.default.beforeEach(() => {
t.deepEqual(loggedMessages.length, 1);
t.assert(loggedMessages[0].includes("Pruned 2 results"));
});
(0, ava_1.default)("accept results with invalid artifactLocation.uri value", (t) => {
const loggedMessages = [];
const mockLogger = {
info: (message) => {
loggedMessages.push(message);
},
};
const sarifFile = `${__dirname}/../src/testdata/with-invalid-uri.sarif`;
uploadLib.validateSarifFileSchema(sarifFile, mockLogger);
t.deepEqual(loggedMessages.length, 1);
t.deepEqual(loggedMessages[0], "Warning: 'not a valid URI' is not a valid URI in 'instance.runs[0].results[0].locations[0].physicalLocation.artifactLocation.uri'.");
});
const affectedCodeQLVersion = {
driver: {
name: "CodeQL",

File diff suppressed because one or more lines are too long

lib/util.js generated

@@ -673,11 +673,6 @@ function removeDuplicateLocations(locations) {
});
}
function fixInvalidNotifications(sarif, logger) {
if (process.env[shared_environment_1.CODEQL_ACTION_DISABLE_DUPLICATE_LOCATION_FIX] === "true") {
logger.info("SARIF notification object duplicate location fix disabled by the " +
`${shared_environment_1.CODEQL_ACTION_DISABLE_DUPLICATE_LOCATION_FIX} environment variable.`);
return sarif;
}
if (!Array.isArray(sarif.runs)) {
return sarif;
}
@@ -727,10 +722,27 @@ function fixInvalidNotifications(sarif, logger) {
return newSarif;
}
exports.fixInvalidNotifications = fixInvalidNotifications;
/**
* Removes duplicates from the sarif file.
*
* When `CODEQL_ACTION_DISABLE_DUPLICATE_LOCATION_FIX` is set to true, this will
* simply rename the input file to the output file. Otherwise, it will parse the
* input file as JSON, remove duplicate locations from the SARIF notification
* objects, and write the result to the output file.
*
* For context, see documentation of:
* `CODEQL_ACTION_DISABLE_DUPLICATE_LOCATION_FIX`. */
function fixInvalidNotificationsInFile(inputPath, outputPath, logger) {
let sarif = JSON.parse(fs.readFileSync(inputPath, "utf8"));
sarif = fixInvalidNotifications(sarif, logger);
fs.writeFileSync(outputPath, JSON.stringify(sarif));
if (process.env[shared_environment_1.CODEQL_ACTION_DISABLE_DUPLICATE_LOCATION_FIX] === "true") {
logger.info("SARIF notification object duplicate location fix disabled by the " +
`${shared_environment_1.CODEQL_ACTION_DISABLE_DUPLICATE_LOCATION_FIX} environment variable.`);
fs.renameSync(inputPath, outputPath);
}
else {
let sarif = JSON.parse(fs.readFileSync(inputPath, "utf8"));
sarif = fixInvalidNotifications(sarif, logger);
fs.writeFileSync(outputPath, JSON.stringify(sarif));
}
}
exports.fixInvalidNotificationsInFile = fixInvalidNotificationsInFile;
function wrapError(error) {

File diff suppressed because one or more lines are too long

lib/workflow.js generated

@@ -65,18 +65,6 @@ function patternIsSuperset(patternA, patternB) {
return patternToRegExp(patternA).test(patternB);
}
exports.patternIsSuperset = patternIsSuperset;
function branchesToArray(branches) {
if (typeof branches === "string") {
return [branches];
}
if (Array.isArray(branches)) {
if (branches.length === 0) {
return "**";
}
return branches;
}
return "**";
}
function toCodedErrors(errors) {
return Object.entries(errors).reduce((acc, [code, message]) => {
acc[code] = { message, code };
@@ -86,8 +74,7 @@ function toCodedErrors(errors) {
// code to send back via status report
// message to add as a warning annotation to the run
exports.WorkflowErrors = toCodedErrors({
MismatchedBranches: `Please make sure that every branch in on.pull_request is also in on.push so that Code Scanning can compare pull requests against the state of the base branch.`,
MissingPushHook: `Please specify an on.push hook so that Code Scanning can compare pull requests against the state of the base branch.`,
MissingPushHook: `Please specify an on.push hook to analyze and see code scanning alerts from the default branch on the Security tab.`,
CheckoutWrongHead: `git checkout HEAD^2 is no longer necessary. Please remove this step as Code Scanning recommends analyzing the merge commit for best results.`,
});
function getWorkflowErrors(doc) {
@@ -132,28 +119,6 @@ function getWorkflowErrors(doc) {
if (!hasPush && hasPullRequest) {
missingPush = true;
}
// if doc.on.pull_request is null that means 'all branches'
// if doc.on.pull_request is undefined that means 'off'
// we only want to check for mismatched branches if pull_request is on.
if (doc.on.pull_request !== undefined) {
const push = branchesToArray(doc.on.push?.branches);
if (push !== "**") {
const pull_request = branchesToArray(doc.on.pull_request?.branches);
if (pull_request !== "**") {
const difference = pull_request.filter((value) => !push.some((o) => patternIsSuperset(o, value)));
if (difference.length > 0) {
// there are branches in pull_request that may not have a baseline
// because we are not building them on push
errors.push(exports.WorkflowErrors.MismatchedBranches);
}
}
else if (push.length > 0) {
// push is set up to run on a subset of branches
// and you could open a PR against a branch with no baseline
errors.push(exports.WorkflowErrors.MismatchedBranches);
}
}
}
}
if (missingPush) {
errors.push(exports.WorkflowErrors.MissingPushHook);
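With the MismatchedBranches checks removed above, a workflow that only triggers on pull_request should now surface just the missing on.push hook. A minimal ava-style sketch of that behaviour (the require paths are assumptions, mirroring the compiled lib/workflow.test.js):

```javascript
// Hypothetical test sketch; "./lib/workflow" is an assumed path to the compiled module.
const test = require("ava");
const { getWorkflowErrors, WorkflowErrors } = require("./lib/workflow");

test("pull_request-only workflow reports only MissingPushHook", (t) => {
  const errors = getWorkflowErrors({ on: { pull_request: { branches: ["main"] } } });
  // MismatchedBranches no longer exists, so only the missing on.push trigger is flagged.
  t.deepEqual(errors.map((e) => e.code), [WorkflowErrors.MissingPushHook.code]);
});
```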
@@ -333,7 +298,9 @@ function getInputOrThrow(workflow, jobName, actionName, inputName, matrixVars) {
* This allows us to test workflow parsing functionality as a CodeQL Action PR check.
*/
function getAnalyzeActionName() {
if ((0, util_1.getRequiredEnvParam)("GITHUB_REPOSITORY") === "github/codeql-action") {
if ((0, util_1.isInTestMode)() ||
process.env["CODEQL_ACTION_TESTING_ENVIRONMENT"] ===
"codeql-action-pr-checks") {
return "./analyze";
}
else {

File diff suppressed because one or more lines are too long

lib/workflow.test.js generated

@@ -64,12 +64,6 @@ function errorCodes(actual, expected) {
});
t.deepEqual(...errorCodes(errors, []));
});
(0, ava_1.default)("getWorkflowErrors() when on.pull_requests is a string", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: { push: { branches: ["main"] }, pull_request: { branches: "*" } },
});
t.deepEqual(...errorCodes(errors, [workflow_1.WorkflowErrors.MismatchedBranches]));
});
(0, ava_1.default)("getWorkflowErrors() when on.pull_requests is a string and correct", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: { push: { branches: "*" }, pull_request: { branches: "*" } },
@@ -84,15 +78,6 @@ function errorCodes(actual, expected) {
`));
t.deepEqual(...errorCodes(errors, []));
});
(0, ava_1.default)("getWorkflowErrors() when on.push is mismatched", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: {
push: { branches: ["main"] },
pull_request: { branches: ["feature"] },
},
});
t.deepEqual(...errorCodes(errors, [workflow_1.WorkflowErrors.MismatchedBranches]));
});
(0, ava_1.default)("getWorkflowErrors() when on.push is not mismatched", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: {
@@ -102,15 +87,6 @@ function errorCodes(actual, expected) {
});
t.deepEqual(...errorCodes(errors, []));
});
(0, ava_1.default)("getWorkflowErrors() when on.push is mismatched for pull_request", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: {
push: { branches: ["main"] },
pull_request: { branches: ["main", "feature"] },
},
});
t.deepEqual(...errorCodes(errors, [workflow_1.WorkflowErrors.MismatchedBranches]));
});
(0, ava_1.default)("getWorkflowErrors() for a range of malformed workflows", (t) => {
t.deepEqual(...errorCodes((0, workflow_1.getWorkflowErrors)({
on: {
@@ -175,16 +151,6 @@ function errorCodes(actual, expected) {
},
}), []));
});
(0, ava_1.default)("getWorkflowErrors() when on.pull_request for every branch but push specifies branches", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)(yaml.load(`
name: "CodeQL"
on:
push:
branches: ["main"]
pull_request:
`));
t.deepEqual(...errorCodes(errors, [workflow_1.WorkflowErrors.MismatchedBranches]));
});
(0, ava_1.default)("getWorkflowErrors() when on.pull_request for wildcard branches", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: {
@@ -194,15 +160,6 @@ function errorCodes(actual, expected) {
});
t.deepEqual(...errorCodes(errors, []));
});
(0, ava_1.default)("getWorkflowErrors() when on.pull_request for mismatched wildcard branches", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: {
push: { branches: ["feature/moose"] },
pull_request: { branches: "feature/*" },
},
});
t.deepEqual(...errorCodes(errors, [workflow_1.WorkflowErrors.MismatchedBranches]));
});
(0, ava_1.default)("getWorkflowErrors() when HEAD^2 is checked out", (t) => {
process.env.GITHUB_JOB = "test";
const errors = (0, workflow_1.getWorkflowErrors)({
@@ -218,7 +175,7 @@ function errorCodes(actual, expected) {
(0, ava_1.default)("formatWorkflowErrors() when there are multiple errors", (t) => {
const message = (0, workflow_1.formatWorkflowErrors)([
workflow_1.WorkflowErrors.CheckoutWrongHead,
workflow_1.WorkflowErrors.MismatchedBranches,
workflow_1.WorkflowErrors.MissingPushHook,
]);
t.true(message.startsWith("2 issues were detected with this workflow:"));
});
@@ -229,9 +186,9 @@ function errorCodes(actual, expected) {
(0, ava_1.default)("formatWorkflowCause()", (t) => {
const message = (0, workflow_1.formatWorkflowCause)([
workflow_1.WorkflowErrors.CheckoutWrongHead,
workflow_1.WorkflowErrors.MismatchedBranches,
workflow_1.WorkflowErrors.MissingPushHook,
]);
t.deepEqual(message, "CheckoutWrongHead,MismatchedBranches");
t.deepEqual(message, "CheckoutWrongHead,MissingPushHook");
t.deepEqual((0, workflow_1.formatWorkflowCause)([]), undefined);
});
(0, ava_1.default)("patternIsSuperset()", (t) => {

File diff suppressed because one or more lines are too long

node_modules/.package-lock.json generated vendored

@@ -1,6 +1,6 @@
{
"name": "codeql",
"version": "2.3.3",
"version": "2.3.7",
"lockfileVersion": 3,
"requires": true,
"packages": {
@@ -4146,8 +4146,9 @@
}
},
"node_modules/jsonschema": {
"version": "1.2.6",
"integrity": "sha512-SqhURKZG07JyKKeo/ir24QnS4/BV7a6gQy93bUSe4lUdNp0QNpIz2c9elWJQ9dpc5cQYY6cvCzgRwy0MQCLyqA==",
"version": "1.4.1",
"resolved": "https://registry.npmjs.org/jsonschema/-/jsonschema-1.4.1.tgz",
"integrity": "sha512-S6cATIPVv1z0IlxdN+zUk5EPjkGCdnhN4wVSBlvoUO1tOLJootbo9CquNJmbIh4yikWHiUedhRYrNPn1arpEmQ==",
"engines": {
"node": "*"
}

node_modules/jsonschema/README.md generated vendored

@@ -1,18 +1,21 @@
[![Build Status](https://secure.travis-ci.org/tdegrunt/jsonschema.svg)](http://travis-ci.org/tdegrunt/jsonschema)
# jsonschema
[JSON schema](http://json-schema.org/) validator, which is designed to be fast and simple to use.
The latest IETF published draft is v6, this library is mostly v4 compatible.
[JSON schema](http://json-schema.org/) validator, which is designed to be fast and simple to use. JSON Schema versions through draft-07 are fully supported.
## Contributing & bugs
Please fork the repository, make the changes in your fork and include tests. Once you're done making changes, send in a pull request.
### Bug reports
Please include a test which shows why the code fails.
## Usage
### Simple
Simple object validation using JSON schemas.
```javascript
@@ -78,6 +81,7 @@ v.addSchema(addressSchema, '/SimpleAddress');
console.log(v.validate(p, schema));
```
### Example for Array schema
```json
var arraySchema = {
"type": "array",
@@ -95,21 +99,42 @@ For a comprehensive, annotated example illustrating all possible validation opti
## Features
### Definitions
All schema definitions are supported, $schema is ignored.
### Types
All types are supported
### Handling `undefined`
`undefined` is not a value known to JSON, and by default the validator treats it as if it were not invalid; i.e., it will return valid.
```javascript
var res = validate(undefined, {type: 'string'});
res.valid // true
```
This behavior may be changed with the "required" option:
```javascript
var res = validate(undefined, {type: 'string'}, {required: true});
res.valid // false
```
### Formats
#### Disabling the format keyword.
You may disable format validation by providing `disableFormat: true` to the validator
options.
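For instance, a sketch of the option in use (assuming format violations are otherwise reported as errors):

```javascript
// With disableFormat: true the "email" format is not checked, so this validates.
var Validator = require('jsonschema').Validator;
var v = new Validator();
var res = v.validate('not-an-email', { type: 'string', format: 'email' }, { disableFormat: true });
res.valid; // true
```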
#### String Formats
All formats are supported, phone numbers are expected to follow the [E.123](http://en.wikipedia.org/wiki/E.123) standard.
#### Custom Formats
You may add your own custom format functions. Format functions accept the input
being validated and return a boolean value. If the returned value is `true`, then
validation succeeds. If the returned value is `false`, then validation fails.
@@ -133,27 +158,86 @@ validator.validate('foo', {type: 'string', format: 'myFormat'}).valid; // false
```
### Results
The first error found will be thrown as an `Error` object if `options.throwError` is `true`. Otherwise all results will be appended to the `result.errors` array which also contains the success flag `result.valid`.
When `oneOf` or `anyOf` validations fail, errors that caused any of the sub-schemas referenced therein to fail are not reported, unless `options.nestedErrors` is truthy. This option may be useful when troubleshooting validation errors in complex schemas.
By default, results will be returned in a `ValidatorResult` object with the following properties:
### Custom properties
Specify your own JSON Schema properties with the validator.attributes property:
* `instance`: any.
* `schema`: Schema.
* `errors`: ValidationError[].
* `valid`: boolean.
Each item in `errors` is a `ValidationError` with the following properties:
* path: array. An array of property keys or array offsets, indicating where inside objects or arrays the instance was found.
* property: string. Describes the property path. Starts with `instance`, and is delimited with a dot (`.`).
* message: string. A human-readable message for debugging use. Provided in English and subject to change.
* schema: object. The schema containing the keyword that failed
* instance: any. The instance that failed
* name: string. The keyword within the schema that failed.
* argument: any. Provides information about the keyword that failed.
The validator can be configured to throw in the event of a validation error:
* If the `throwFirst` option is set, the validator will terminate validation at the first encountered error and throw a `ValidatorResultError` object.
* If the `throwAll` option is set, the validator will throw a `ValidatorResultError` object after the entire instance has been validated.
* If the `throwError` option is set, it will throw at the first encountered validation error (like `throwFirst`), but the `ValidationError` object itself will be thrown. Note that, despite the name, this does not inherit from Error like `ValidatorResultError` does.
The `ValidatorResultError` object has the same properties as `ValidatorResult` and additionally inherits from Error.
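A brief sketch of both modes, under the assumption that the default export works as in the earlier examples:

```javascript
var validate = require('jsonschema').validate;

// Collecting errors (default behavior)
var result = validate(42, { type: 'string' });
result.valid;              // false
result.errors[0].name;     // 'type'
result.errors[0].property; // 'instance'

// Throwing on the first error instead
try {
  validate(42, { type: 'string' }, { throwFirst: true });
} catch (e) {
  // e is a ValidatorResultError; the collected ValidationError objects are on e.errors
  e.errors.length; // 1
}
```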
#### "nestedErrors" option
When `oneOf` or `anyOf` validations fail, errors that caused any of the sub-schemas referenced therein to fail are normally suppressed, because it is not necessary to fix all of them. And in the case of `oneOf`, it would itself be an error to fix all of the listed errors.
This behavior may be configured with `options.nestedErrors`. If truthy, it will emit all the errors from the subschemas. This option may be useful when troubleshooting validation errors in complex schemas:
```javascript
var schema = {
oneOf: [
{ type: 'string', minLength: 32, maxLength: 32 },
{ type: 'string', maxLength: 16 },
{ type: 'number' },
]
};
var validator = new Validator();
var result = validator.validate('This string is 28 chars long', schema, {nestedErrors: true});
// result.toString() reads out:
// 0: instance does not meet minimum length of 32
// 1: instance does not meet maximum length of 16
// 2: instance is not of a type(s) number
// 3: instance is not exactly one from [subschema 0],[subschema 1],[subschema 2]
```
#### Localizing Error Messages
To provide localized, human-readable errors, use the `name` string as a translation key. Feel free to open an issue for support relating to localizing error messages. For example:
```
var localized = result.errors.map(function(err){
return localeService.translate(err.name);
});
```
### Custom keywords
Specify your own JSON Schema keywords with the validator.attributes property:
```javascript
validator.attributes.contains = function validateContains(instance, schema, options, ctx) {
if(typeof instance!='string') return;
if(typeof schema.contains!='string') throw new jsonschema.SchemaError('"contains" expects a string', schema);
if(typeof instance !== 'string') return;
if(typeof schema.contains !== 'string') throw new jsonschema.SchemaError('"contains" expects a string', schema);
if(instance.indexOf(schema.contains)<0){
return 'does not contain the string ' + JSON.stringify(schema.contains);
}
}
var result = validator.validate("i am an instance", { type:"string", contains: "i am" });
var result = validator.validate("I am an instance", { type:"string", contains: "I am" });
// result.valid === true;
```
The instance passes validation if the function returns nothing. A single validation error is produced
if the fuction returns a string. Any number of errors (maybe none at all) may be returned by passing a
if the function returns a string. Any number of errors (maybe none at all) may be returned by passing a
`ValidatorResult` object, which may be used like so:
```javascript
@@ -165,6 +249,7 @@ if the fuction returns a string. Any number of errors (maybe none at all) may be
```
### Dereferencing schemas
Sometimes you may want to download schemas from remote sources, like a database, or over HTTP. When importing a schema,
unknown references are inserted into the `validator.unresolvedRefs` Array. Asynchronously shift elements off this array and import
them:
@@ -184,44 +269,119 @@ function importNextSchema(){
importNextSchema();
```
### Default base URI
Schemas should typically have an `id` with an absolute, full URI. However if the schema you are using contains only relative URI references, the `base` option will be used to resolve these.
This following example would throw a `SchemaError` if the `base` option were unset:
```javascript
var result = validate(["Name"], {
id: "/schema.json",
type: "array",
items: { $ref: "http://example.com/schema.json#/definitions/item" },
definitions: {
item: { type: "string" },
},
}, { base: 'http://example.com/' });
```
### Rewrite Hook
The `rewrite` option lets you change the value of an instance after it has successfully been validated. This will mutate the `instance` passed to the validate function. This can be useful for unmarshalling data and parsing it into native instances, such as changing a string to a `Date` instance.
The `rewrite` option accepts a function with the following arguments:
* instance: any
* schema: object
* options: object
* ctx: object
* return value: any new value for the instance
The value may be removed by returning `undefined`.
If you don't want to change the value, call `return instance`.
Here is an example that can convert a property expecting a date into a Date instance:
```javascript
const schema = {
properties: {
date: {id: 'http://example.com/date', type: 'string'},
},
};
const value = {
date: '2020-09-30T23:39:27.060Z',
};
function unmarshall(instance, schema){
if(schema.id === 'http://example.com/date'){
return new Date(instance);
}
return instance;
}
const v = new Validator();
const res = v.validate(value, schema, {rewrite: unmarshall});
assert(res.instance.date instanceof Date);
```
### Pre-Property Validation Hook
If some processing of properties is required prior to validation a function may be passed via the options parameter of the validate function. For example, say you needed to perform type coercion for some properties:
```javascript
const coercionHook = function (instance, property, schema, options, ctx) {
var value = instance[property];
// See examples/coercion.js
function preValidateProperty(object, key, schema, options, ctx) {
var value = object[key];
if (typeof value === 'undefined') return;
// Skip nulls and undefineds
if (value === null || typeof value == 'undefined') {
return;
}
// If the schema declares a type and the property fails type validation.
if (schema.type && this.attributes.type.call(this, instance, schema, options, ctx.makeChild(schema, property))) {
var types = Array.isArray(schema.type) ? schema.type : [schema.type];
var coerced = undefined;
// Go through the declared types until we find something that we can
// coerce the value into.
for (var i = 0; typeof coerced == 'undefined' && i < types.length; i++) {
// If we support coercion to this type
if (lib.coercions[types[i]]) {
// ...attempt it.
coerced = lib.coercions[types[i]](value);
}
// Test if the schema declares a type, but the type keyword fails validation
if (schema.type && validator.attributes.type.call(validator, value, schema, options, ctx.makeChild(schema, key))) {
// If the type is "number" but the instance is not a number, cast it
if(schema.type==='number' && typeof value!=='number'){
object[key] = parseFloat(value);
return;
}
// If we got a successful coercion we modify the property of the instance.
if (typeof coerced != 'undefined') {
instance[property] = coerced;
// If the type is "string" but the instance is not a string, cast it
if(schema.type==='string' && typeof value!=='string'){
object[key] = String(value).toString();
return;
}
}
}.bind(validator)
};
// And now, to actually perform validation with the coercion hook!
v.validate(instance, schema, { preValidateProperty: coercionHook });
v.validate(instance, schema, { preValidateProperty });
```
### Skip validation of certain keywords
Use the "skipAttributes" option to skip validation of certain keywords. Provide an array of keywords to ignore.
For skipping the "format" keyword, see the disableFormat option.
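A minimal sketch, assuming a skipped keyword simply contributes no errors:

```javascript
var validate = require('jsonschema').validate;
// "minLength" is ignored entirely, so the short string validates.
var res = validate('ab', { type: 'string', minLength: 5 }, { skipAttributes: ['minLength'] });
res.valid; // true
```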
### Fail on unknown keywords
By default, JSON Schema is supposed to ignore unknown schema keywords.
You can change this behavior to require that all keywords used in a schema have a defined behavior, by setting the "allowUnknownAttributes" option to false.
This example will throw a `SchemaError`:
```javascript
var schema = {
type: "string",
format: "email",
example: "foo",
};
var result = validate("Name", schema, { allowUnknownAttributes: false });
```
## Tests
Uses [JSON Schema Test Suite](https://github.com/json-schema/JSON-Schema-Test-Suite) as well as our own tests.
You'll need to update and init the git submodules:


@@ -16,13 +16,13 @@ attribute.ignoreProperties = {
'description': true,
'title': true,
// arguments to other properties
'exclusiveMinimum': true,
'exclusiveMaximum': true,
'additionalItems': true,
'then': true,
'else': true,
// special-handled properties
'$schema': true,
'$ref': true,
'extends': true
'extends': true,
};
/**
@@ -47,7 +47,9 @@ validators.type = function validateType (instance, schema, options, ctx) {
var types = Array.isArray(schema.type) ? schema.type : [schema.type];
if (!types.some(this.testType.bind(this, instance, schema, options, ctx))) {
var list = types.map(function (v) {
return v.id && ('<' + v.id + '>') || (v+'');
if(!v) return;
var id = v.$id || v.id;
return id ? ('<' + id + '>') : (v+'');
});
result.addError({
name: 'type',
@@ -60,9 +62,12 @@ validators.type = function validateType (instance, schema, options, ctx) {
function testSchemaNoThrow(instance, options, ctx, callback, schema){
var throwError = options.throwError;
var throwAll = options.throwAll;
options.throwError = false;
options.throwAll = false;
var res = this.validateSchema(instance, schema, options, ctx);
options.throwError = throwError;
options.throwAll = throwAll;
if (!res.valid && callback instanceof Function) {
callback(res);
@@ -91,9 +96,11 @@ validators.anyOf = function validateAnyOf (instance, schema, options, ctx) {
if (!schema.anyOf.some(
testSchemaNoThrow.bind(
this, instance, options, ctx, function(res){inner.importErrors(res);}
))) {
))) {
var list = schema.anyOf.map(function (v, i) {
return (v.id && ('<' + v.id + '>')) || (v.title && JSON.stringify(v.title)) || (v['$ref'] && ('<' + v['$ref'] + '>')) || '[subschema '+i+']';
var id = v.$id || v.id;
if(id) return '<' + id + '>';
return(v.title && JSON.stringify(v.title)) || (v['$ref'] && ('<' + v['$ref'] + '>')) || '[subschema '+i+']';
});
if (options.nestedErrors) {
result.importErrors(inner);
@@ -128,7 +135,8 @@ validators.allOf = function validateAllOf (instance, schema, options, ctx) {
schema.allOf.forEach(function(v, i){
var valid = self.validateSchema(instance, v, options, ctx);
if(!valid.valid){
var msg = (v.id && ('<' + v.id + '>')) || (v.title && JSON.stringify(v.title)) || (v['$ref'] && ('<' + v['$ref'] + '>')) || '[subschema '+i+']';
var id = v.$id || v.id;
var msg = id || (v.title && JSON.stringify(v.title)) || (v['$ref'] && ('<' + v['$ref'] + '>')) || '[subschema '+i+']';
result.addError({
name: 'allOf',
argument: { id: msg, length: valid.errors.length, valid: valid },
@@ -161,9 +169,10 @@ validators.oneOf = function validateOneOf (instance, schema, options, ctx) {
var count = schema.oneOf.filter(
testSchemaNoThrow.bind(
this, instance, options, ctx, function(res) {inner.importErrors(res);}
) ).length;
) ).length;
var list = schema.oneOf.map(function (v, i) {
return (v.id && ('<' + v.id + '>')) || (v.title && JSON.stringify(v.title)) || (v['$ref'] && ('<' + v['$ref'] + '>')) || '[subschema '+i+']';
var id = v.$id || v.id;
return id || (v.title && JSON.stringify(v.title)) || (v['$ref'] && ('<' + v['$ref'] + '>')) || '[subschema '+i+']';
});
if (count!==1) {
if (options.nestedErrors) {
@@ -178,6 +187,70 @@ validators.oneOf = function validateOneOf (instance, schema, options, ctx) {
return result;
};
/**
* Validates "then" or "else" depending on the result of validating "if"
* @param instance
* @param schema
* @param options
* @param ctx
* @return {String|null}
*/
validators.if = function validateIf (instance, schema, options, ctx) {
// Ignore undefined instances
if (instance === undefined) return null;
if (!helpers.isSchema(schema.if)) throw new Error('Expected "if" keyword to be a schema');
var ifValid = testSchemaNoThrow.call(this, instance, options, ctx, null, schema.if);
var result = new ValidatorResult(instance, schema, options, ctx);
var res;
if(ifValid){
if (schema.then === undefined) return;
if (!helpers.isSchema(schema.then)) throw new Error('Expected "then" keyword to be a schema');
res = this.validateSchema(instance, schema.then, options, ctx.makeChild(schema.then));
result.importErrors(res);
}else{
if (schema.else === undefined) return;
if (!helpers.isSchema(schema.else)) throw new Error('Expected "else" keyword to be a schema');
res = this.validateSchema(instance, schema.else, options, ctx.makeChild(schema.else));
result.importErrors(res);
}
return result;
};
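A usage sketch of the new `if`/`then`/`else` support added here (draft-07 keywords; assumes `const` is also available, as the updated typings suggest):

```javascript
var validate = require('jsonschema').validate;
var schema = {
  if:   { properties: { country: { const: 'US' } } },
  then: { required: ['zip'] },
  else: { required: ['postcode'] },
};
validate({ country: 'US', zip: '10001' }, schema).valid; // true
validate({ country: 'GB' }, schema).valid;               // false, "postcode" is required
```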
function getEnumerableProperty(object, key){
// Determine if `key` shows up in `for(var key in object)`
// First test Object.hasOwnProperty.call as an optimization: that guarantees it does
if(Object.hasOwnProperty.call(object, key)) return object[key];
// Test `key in object` as an optimization; false means it won't
if(!(key in object)) return;
while( (object = Object.getPrototypeOf(object)) ){
if(Object.propertyIsEnumerable.call(object, key)) return object[key];
}
}
/**
* Validates propertyNames
* @param instance
* @param schema
* @param options
* @param ctx
* @return {String|null|ValidatorResult}
*/
validators.propertyNames = function validatePropertyNames (instance, schema, options, ctx) {
if(!this.types.object(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
var subschema = schema.propertyNames!==undefined ? schema.propertyNames : {};
if(!helpers.isSchema(subschema)) throw new SchemaError('Expected "propertyNames" to be a schema (object or boolean)');
for (var property in instance) {
if(getEnumerableProperty(instance, property) !== undefined){
var res = this.validateSchema(property, subschema, options, ctx.makeChild(subschema));
result.importErrors(res);
}
}
return result;
};
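And a sketch of `propertyNames`, which constrains the keys of an object rather than their values:

```javascript
var validate = require('jsonschema').validate;
var schema = { type: 'object', propertyNames: { pattern: '^[a-z]+$' } };
validate({ foo: 1 }, schema).valid; // true
validate({ Foo: 1 }, schema).valid; // false, the key fails the pattern
```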
/**
* Validates properties
* @param instance
@@ -191,12 +264,17 @@ validators.properties = function validateProperties (instance, schema, options,
var result = new ValidatorResult(instance, schema, options, ctx);
var properties = schema.properties || {};
for (var property in properties) {
if (typeof options.preValidateProperty == 'function') {
options.preValidateProperty(instance, property, properties[property], options, ctx);
var subschema = properties[property];
if(subschema===undefined){
continue;
}else if(subschema===null){
throw new SchemaError('Unexpected null, expected schema in "properties"');
}
var prop = Object.hasOwnProperty.call(instance, property) ? instance[property] : undefined;
var res = this.validateSchema(prop, properties[property], options, ctx.makeChild(properties[property], property));
if (typeof options.preValidateProperty == 'function') {
options.preValidateProperty(instance, property, subschema, options, ctx);
}
var prop = getEnumerableProperty(instance, property);
var res = this.validateSchema(prop, subschema, options, ctx.makeChild(subschema, property));
if(res.instance !== result.instance[property]) result.instance[property] = res.instance;
result.importErrors(res);
}
@@ -206,7 +284,7 @@ validators.properties = function validateProperties (instance, schema, options,
/**
* Test a specific property within in instance against the additionalProperties schema attribute
* This ignores properties with definitions in the properties schema attribute, but no other attributes.
* If too many more types of property-existance tests pop up they may need their own class of tests (like `type` has)
* If too many more types of property-existence tests pop up they may need their own class of tests (like `type` has)
* @private
* @return {boolean}
*/
@@ -219,7 +297,7 @@ function testAdditionalProperty (instance, schema, options, ctx, property, resul
result.addError({
name: 'additionalProperties',
argument: property,
message: "additionalProperty " + JSON.stringify(property) + " exists in instance when not allowed",
message: "is not allowed to have the additional property " + JSON.stringify(property),
});
} else {
var additionalProperties = schema.additionalProperties || {};
@@ -250,17 +328,29 @@ validators.patternProperties = function validatePatternProperties (instance, sch
for (var property in instance) {
var test = true;
for (var pattern in patternProperties) {
var expr = new RegExp(pattern);
if (!expr.test(property)) {
var subschema = patternProperties[pattern];
if(subschema===undefined){
continue;
}else if(subschema===null){
throw new SchemaError('Unexpected null, expected schema in "patternProperties"');
}
try {
var regexp = new RegExp(pattern, 'u');
} catch(_e) {
// In the event the stricter handling causes an error, fall back on the forgiving handling
// DEPRECATED
regexp = new RegExp(pattern);
}
if (!regexp.test(property)) {
continue;
}
test = false;
if (typeof options.preValidateProperty == 'function') {
options.preValidateProperty(instance, property, patternProperties[pattern], options, ctx);
options.preValidateProperty(instance, property, subschema, options, ctx);
}
var res = this.validateSchema(instance[property], patternProperties[pattern], options, ctx.makeChild(patternProperties[pattern], property));
var res = this.validateSchema(instance[property], subschema, options, ctx.makeChild(subschema, property));
if(res.instance !== result.instance[property]) result.instance[property] = res.instance;
result.importErrors(res);
}
@@ -308,7 +398,7 @@ validators.minProperties = function validateMinProperties (instance, schema, opt
name: 'minProperties',
argument: schema.minProperties,
message: "does not meet minimum property length of " + schema.minProperties,
})
});
}
return result;
};
@@ -344,10 +434,14 @@ validators.maxProperties = function validateMaxProperties (instance, schema, opt
validators.items = function validateItems (instance, schema, options, ctx) {
var self = this;
if (!this.types.array(instance)) return;
if (!schema.items) return;
if (schema.items===undefined) return;
var result = new ValidatorResult(instance, schema, options, ctx);
instance.every(function (value, i) {
var items = Array.isArray(schema.items) ? (schema.items[i] || schema.additionalItems) : schema.items;
if(Array.isArray(schema.items)){
var items = schema.items[i]===undefined ? schema.additionalItems : schema.items[i];
}else{
var items = schema.items;
}
if (items === undefined) {
return true;
}
@@ -366,6 +460,34 @@ validators.items = function validateItems (instance, schema, options, ctx) {
return result;
};
/**
* Validates the "contains" keyword
* @param instance
* @param schema
* @param options
* @param ctx
* @return {String|null|ValidatorResult}
*/
validators.contains = function validateContains (instance, schema, options, ctx) {
var self = this;
if (!this.types.array(instance)) return;
if (schema.contains===undefined) return;
if (!helpers.isSchema(schema.contains)) throw new Error('Expected "contains" keyword to be a schema');
var result = new ValidatorResult(instance, schema, options, ctx);
var count = instance.some(function (value, i) {
var res = self.validateSchema(value, schema.contains, options, ctx.makeChild(schema.contains, i));
return res.errors.length===0;
});
if(count===false){
result.addError({
name: 'contains',
argument: schema.contains,
message: "must contain an item matching given schema",
});
}
return result;
};
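A sketch of `contains`, which passes when at least one array item matches the given schema:

```javascript
var validate = require('jsonschema').validate;
var schema = { type: 'array', contains: { type: 'number' } };
validate(['a', 'b', 3], schema).valid; // true
validate(['a', 'b'], schema).valid;    // false, no item matches { type: 'number' }
```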
/**
* Validates minimum and exclusiveMinimum when the type of the instance value is a number.
* @param instance
@@ -375,18 +497,22 @@ validators.items = function validateItems (instance, schema, options, ctx) {
validators.minimum = function validateMinimum (instance, schema, options, ctx) {
if (!this.types.number(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
var valid = true;
if (schema.exclusiveMinimum && schema.exclusiveMinimum === true) {
valid = instance > schema.minimum;
if(!(instance > schema.minimum)){
result.addError({
name: 'minimum',
argument: schema.minimum,
message: "must be greater than " + schema.minimum,
});
}
} else {
valid = instance >= schema.minimum;
}
if (!valid) {
result.addError({
name: 'minimum',
argument: schema.minimum,
message: "must have a minimum value of " + schema.minimum,
});
if(!(instance >= schema.minimum)){
result.addError({
name: 'minimum',
argument: schema.minimum,
message: "must be greater than or equal to " + schema.minimum,
});
}
}
return result;
};
@@ -400,17 +526,65 @@ validators.minimum = function validateMinimum (instance, schema, options, ctx) {
validators.maximum = function validateMaximum (instance, schema, options, ctx) {
if (!this.types.number(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
var valid;
if (schema.exclusiveMaximum && schema.exclusiveMaximum === true) {
valid = instance < schema.maximum;
if(!(instance < schema.maximum)){
result.addError({
name: 'maximum',
argument: schema.maximum,
message: "must be less than " + schema.maximum,
});
}
} else {
valid = instance <= schema.maximum;
if(!(instance <= schema.maximum)){
result.addError({
name: 'maximum',
argument: schema.maximum,
message: "must be less than or equal to " + schema.maximum,
});
}
}
return result;
};
/**
* Validates the number form of exclusiveMinimum when the type of the instance value is a number.
* @param instance
* @param schema
* @return {String|null}
*/
validators.exclusiveMinimum = function validateExclusiveMinimum (instance, schema, options, ctx) {
// Support the boolean form of exclusiveMinimum, which is handled by the "minimum" keyword.
if(typeof schema.exclusiveMinimum === 'boolean') return;
if (!this.types.number(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
var valid = instance > schema.exclusiveMinimum;
if (!valid) {
result.addError({
name: 'maximum',
argument: schema.maximum,
message: "must have a maximum value of " + schema.maximum,
name: 'exclusiveMinimum',
argument: schema.exclusiveMinimum,
message: "must be strictly greater than " + schema.exclusiveMinimum,
});
}
return result;
};
/**
* Validates the number form of exclusiveMaximum when the type of the instance value is a number.
* @param instance
* @param schema
* @return {String|null}
*/
validators.exclusiveMaximum = function validateExclusiveMaximum (instance, schema, options, ctx) {
// Support the boolean form of exclusiveMaximum, which is handled by the "maximum" keyword.
if(typeof schema.exclusiveMaximum === 'boolean') return;
if (!this.types.number(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
var valid = instance < schema.exclusiveMaximum;
if (!valid) {
result.addError({
name: 'exclusiveMaximum',
argument: schema.exclusiveMaximum,
message: "must be strictly less than " + schema.exclusiveMaximum,
});
}
return result;
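The numeric forms added above can be exercised directly; a small sketch:

```javascript
var validate = require('jsonschema').validate;
validate(3, { exclusiveMinimum: 3 }).valid;                      // false, must be strictly greater than 3
validate(4, { exclusiveMinimum: 3, exclusiveMaximum: 5 }).valid; // true
```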
@@ -444,7 +618,7 @@ var validateMultipleOfOrDivisbleBy = function validateMultipleOfOrDivisbleBy (in
result.addError({
name: validationType,
argument: validationArgument,
message: errorMessage + JSON.stringify(validationArgument)
message: errorMessage + JSON.stringify(validationArgument),
});
}
@@ -458,7 +632,7 @@ var validateMultipleOfOrDivisbleBy = function validateMultipleOfOrDivisbleBy (in
* @return {String|null}
*/
validators.multipleOf = function validateMultipleOf (instance, schema, options, ctx) {
return validateMultipleOfOrDivisbleBy.call(this, instance, schema, options, ctx, "multipleOf", "is not a multiple of (divisible by) ");
return validateMultipleOfOrDivisbleBy.call(this, instance, schema, options, ctx, "multipleOf", "is not a multiple of (divisible by) ");
};
/**
@@ -480,14 +654,14 @@ validators.divisibleBy = function validateDivisibleBy (instance, schema, options
validators.required = function validateRequired (instance, schema, options, ctx) {
var result = new ValidatorResult(instance, schema, options, ctx);
if (instance === undefined && schema.required === true) {
// A boolean form is implemented for reverse-compatability with schemas written against older drafts
// A boolean form is implemented for reverse-compatibility with schemas written against older drafts
result.addError({
name: 'required',
message: "is required"
message: "is required",
});
} else if (this.types.object(instance) && Array.isArray(schema.required)) {
schema.required.forEach(function(n){
if(instance[n]===undefined){
if(getEnumerableProperty(instance, n)===undefined){
result.addError({
name: 'required',
argument: n,
@@ -508,7 +682,15 @@ validators.required = function validateRequired (instance, schema, options, ctx)
validators.pattern = function validatePattern (instance, schema, options, ctx) {
if (!this.types.string(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
if (!instance.match(schema.pattern)) {
var pattern = schema.pattern;
try {
var regexp = new RegExp(pattern, 'u');
} catch(_e) {
// In the event the stricter handling causes an error, fall back on the forgiving handling
// DEPRECATED
regexp = new RegExp(pattern);
}
if (!instance.match(regexp)) {
result.addError({
name: 'pattern',
argument: schema.pattern,
@@ -633,32 +815,6 @@ validators.maxItems = function validateMaxItems (instance, schema, options, ctx)
return result;
};
/**
* Validates that every item in an instance array is unique, when instance is an array
* @param instance
* @param schema
* @param options
* @param ctx
* @return {String|null|ValidatorResult}
*/
validators.uniqueItems = function validateUniqueItems (instance, schema, options, ctx) {
if (!this.types.array(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
function testArrays (v, i, a) {
for (var j = i + 1; j < a.length; j++) if (helpers.deepCompareStrict(v, a[j])) {
return false;
}
return true;
}
if (!instance.every(testArrays)) {
result.addError({
name: 'uniqueItems',
message: "contains duplicate item",
});
}
return result;
};
/**
* Deep compares arrays for duplicates
* @param v
@@ -683,6 +839,7 @@ function testArrays (v, i, a) {
* @return {String|null}
*/
validators.uniqueItems = function validateUniqueItems (instance, schema, options, ctx) {
if (schema.uniqueItems!==true) return;
if (!this.types.array(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
if (!instance.every(testArrays)) {
@@ -806,7 +963,8 @@ validators.not = validators.disallow = function validateNot (instance, schema, o
if(!Array.isArray(notTypes)) notTypes=[notTypes];
notTypes.forEach(function (type) {
if (self.testType(instance, schema, options, ctx, type)) {
var schemaId = type && type.id && ('<' + type.id + '>') || type;
var id = type && (type.$id || type.id);
var schemaId = id || type;
result.addError({
name: 'not',
argument: schemaId,


@@ -2,21 +2,23 @@
var uri = require('url');
var ValidationError = exports.ValidationError = function ValidationError (message, instance, schema, propertyPath, name, argument) {
if (propertyPath) {
this.property = propertyPath;
var ValidationError = exports.ValidationError = function ValidationError (message, instance, schema, path, name, argument) {
if(Array.isArray(path)){
this.path = path;
this.property = path.reduce(function(sum, item){
return sum + makeSuffix(item);
}, 'instance');
}else if(path !== undefined){
this.property = path;
}
if (message) {
this.message = message;
}
if (schema) {
if (schema.id) {
this.schema = schema.id;
} else {
this.schema = schema;
}
var id = schema.$id || schema.id;
this.schema = id || schema;
}
if (instance) {
if (instance !== undefined) {
this.instance = instance;
}
this.name = name;
@@ -31,27 +33,33 @@ ValidationError.prototype.toString = function toString() {
var ValidatorResult = exports.ValidatorResult = function ValidatorResult(instance, schema, options, ctx) {
this.instance = instance;
this.schema = schema;
this.options = options;
this.path = ctx.path;
this.propertyPath = ctx.propertyPath;
this.errors = [];
this.throwError = options && options.throwError;
this.throwFirst = options && options.throwFirst;
this.throwAll = options && options.throwAll;
this.disableFormat = options && options.disableFormat === true;
};
ValidatorResult.prototype.addError = function addError(detail) {
var err;
if (typeof detail == 'string') {
err = new ValidationError(detail, this.instance, this.schema, this.propertyPath);
err = new ValidationError(detail, this.instance, this.schema, this.path);
} else {
if (!detail) throw new Error('Missing error detail');
if (!detail.message) throw new Error('Missing error message');
if (!detail.name) throw new Error('Missing validator type');
err = new ValidationError(detail.message, this.instance, this.schema, this.propertyPath, detail.name, detail.argument);
err = new ValidationError(detail.message, this.instance, this.schema, this.path, detail.name, detail.argument);
}
if (this.throwError) {
this.errors.push(err);
if (this.throwFirst) {
throw new ValidatorResultError(this);
}else if(this.throwError){
throw err;
}
this.errors.push(err);
return err;
};
@@ -59,7 +67,7 @@ ValidatorResult.prototype.importErrors = function importErrors(res) {
if (typeof res == 'string' || (res && res.validatorType)) {
this.addError(res);
} else if (res && res.errors) {
Array.prototype.push.apply(this.errors, res.errors);
this.errors = this.errors.concat(res.errors);
}
};
@@ -74,6 +82,20 @@ Object.defineProperty(ValidatorResult.prototype, "valid", { get: function() {
return !this.errors.length;
} });
module.exports.ValidatorResultError = ValidatorResultError;
function ValidatorResultError(result) {
if(Error.captureStackTrace){
Error.captureStackTrace(this, ValidatorResultError);
}
this.instance = result.instance;
this.schema = result.schema;
this.options = result.options;
this.errors = result.errors;
}
ValidatorResultError.prototype = new Error();
ValidatorResultError.prototype.constructor = ValidatorResultError;
ValidatorResultError.prototype.name = "Validation Error";
/**
* Describes a problem with a Schema which prevents validation of an instance
* @name SchemaError
@@ -86,14 +108,22 @@ var SchemaError = exports.SchemaError = function SchemaError (msg, schema) {
Error.captureStackTrace(this, SchemaError);
};
SchemaError.prototype = Object.create(Error.prototype,
{ constructor: {value: SchemaError, enumerable: false}
, name: {value: 'SchemaError', enumerable: false}
{
constructor: {value: SchemaError, enumerable: false},
name: {value: 'SchemaError', enumerable: false},
});
var SchemaContext = exports.SchemaContext = function SchemaContext (schema, options, propertyPath, base, schemas) {
var SchemaContext = exports.SchemaContext = function SchemaContext (schema, options, path, base, schemas) {
this.schema = schema;
this.options = options;
this.propertyPath = propertyPath;
if(Array.isArray(path)){
this.path = path;
this.propertyPath = path.reduce(function(sum, item){
return sum + makeSuffix(item);
}, 'instance');
}else{
this.propertyPath = path;
}
this.base = base;
this.schemas = schemas;
};
@@ -103,36 +133,60 @@ SchemaContext.prototype.resolve = function resolve (target) {
};
SchemaContext.prototype.makeChild = function makeChild(schema, propertyName){
var propertyPath = (propertyName===undefined) ? this.propertyPath : this.propertyPath+makeSuffix(propertyName);
var base = uri.resolve(this.base, schema.id||'');
var ctx = new SchemaContext(schema, this.options, propertyPath, base, Object.create(this.schemas));
if(schema.id && !ctx.schemas[base]){
var path = (propertyName===undefined) ? this.path : this.path.concat([propertyName]);
var id = schema.$id || schema.id;
var base = uri.resolve(this.base, id||'');
var ctx = new SchemaContext(schema, this.options, path, base, Object.create(this.schemas));
if(id && !ctx.schemas[base]){
ctx.schemas[base] = schema;
}
return ctx;
}
};
var FORMAT_REGEXPS = exports.FORMAT_REGEXPS = {
// 7.3.1. Dates, Times, and Duration
'date-time': /^\d{4}-(?:0[0-9]{1}|1[0-2]{1})-(3[01]|0[1-9]|[12][0-9])[tT ](2[0-4]|[01][0-9]):([0-5][0-9]):(60|[0-5][0-9])(\.\d+)?([zZ]|[+-]([0-5][0-9]):(60|[0-5][0-9]))$/,
'date': /^\d{4}-(?:0[0-9]{1}|1[0-2]{1})-(3[01]|0[1-9]|[12][0-9])$/,
'time': /^(2[0-4]|[01][0-9]):([0-5][0-9]):(60|[0-5][0-9])$/,
'duration': /P(T\d+(H(\d+M(\d+S)?)?|M(\d+S)?|S)|\d+(D|M(\d+D)?|Y(\d+M(\d+D)?)?)(T\d+(H(\d+M(\d+S)?)?|M(\d+S)?|S))?|\d+W)/i,
// 7.3.2. Email Addresses
// TODO: fix the email production
'email': /^(?:[\w\!\#\$\%\&\'\*\+\-\/\=\?\^\`\{\|\}\~]+\.)*[\w\!\#\$\%\&\'\*\+\-\/\=\?\^\`\{\|\}\~]+@(?:(?:(?:[a-zA-Z0-9](?:[a-zA-Z0-9\-](?!\.)){0,61}[a-zA-Z0-9]?\.)+[a-zA-Z0-9](?:[a-zA-Z0-9\-](?!$)){0,61}[a-zA-Z0-9]?)|(?:\[(?:(?:[01]?\d{1,2}|2[0-4]\d|25[0-5])\.){3}(?:[01]?\d{1,2}|2[0-4]\d|25[0-5])\]))$/,
'ip-address': /^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$/,
'ipv6': /^\s*((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:)))(%.+)?\s*$/,
'uri': /^[a-zA-Z][a-zA-Z0-9+-.]*:[^\s]*$/,
'idn-email': /^("(?:[!#-\[\]-\u{10FFFF}]|\\[\t -\u{10FFFF}])*"|[!#-'*+\-/-9=?A-Z\^-\u{10FFFF}](?:\.?[!#-'*+\-/-9=?A-Z\^-\u{10FFFF}])*)@([!#-'*+\-/-9=?A-Z\^-\u{10FFFF}](?:\.?[!#-'*+\-/-9=?A-Z\^-\u{10FFFF}])*|\[[!-Z\^-\u{10FFFF}]*\])$/u,
'color': /^(#?([0-9A-Fa-f]{3}){1,2}\b|aqua|black|blue|fuchsia|gray|green|lime|maroon|navy|olive|orange|purple|red|silver|teal|white|yellow|(rgb\(\s*\b([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\b\s*,\s*\b([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\b\s*,\s*\b([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\b\s*\))|(rgb\(\s*(\d?\d%|100%)+\s*,\s*(\d?\d%|100%)+\s*,\s*(\d?\d%|100%)+\s*\)))$/,
// 7.3.3. Hostnames
// 7.3.4. IP Addresses
'ip-address': /^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$/,
// FIXME whitespace is invalid
'ipv6': /^\s*((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:)))(%.+)?\s*$/,
// 7.3.5. Resource Identifiers
// TODO: A more accurate regular expression for "uri" goes:
// [A-Za-z][+\-.0-9A-Za-z]*:((/(/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?)?)?#(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*|(/(/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?[/?]|[!$&-.0-;=?-Z_a-z~])|/?%[0-9A-Fa-f]{2}|[!$&-.0-;=?-Z_a-z~])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*(#(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*)?|/(/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+(:\d*)?|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?:\d*|\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)?)?
'uri': /^[a-zA-Z][a-zA-Z0-9+.-]*:[^\s]*$/,
'uri-reference': /^(((([A-Za-z][+\-.0-9A-Za-z]*(:%[0-9A-Fa-f]{2}|:[!$&-.0-;=?-Z_a-z~]|[/?])|\?)(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*|([A-Za-z][+\-.0-9A-Za-z]*:?)?)|([A-Za-z][+\-.0-9A-Za-z]*:)?\/((%[0-9A-Fa-f]{2}|\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?[/?]|[!$&-.0-;=?-Z_a-z~])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*|(\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?)?))#(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*|(([A-Za-z][+\-.0-9A-Za-z]*)?%[0-9A-Fa-f]{2}|[!$&-.0-9;=@_~]|[A-Za-z][+\-.0-9A-Za-z]*[!$&-*,;=@_~])(%[0-9A-Fa-f]{2}|[!$&-.0-9;=@-Z_a-z~])*((([/?](%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*)?#|[/?])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*)?|([A-Za-z][+\-.0-9A-Za-z]*(:%[0-9A-Fa-f]{2}|:[!$&-.0-;=?-Z_a-z~]|[/?])|\?)(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*|([A-Za-z][+\-.0-9A-Za-z]*:)?\/((%[0-9A-Fa-f]{2}|\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?[/?]|[!$&-.0-;=?-Z_a-z~])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*|\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+(:\d*)?|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?:\d*|\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)?|[A-Za-z][+\-.0-9A-Za-z]*:?)?$/,
'iri': /^[a-zA-Z][a-zA-Z0-9+.-]*:[^\s]*$/,
'iri-reference': /^(((([A-Za-z][+\-.0-9A-Za-z]*(:%[0-9A-Fa-f]{2}|:[!$&-.0-;=?-Z_a-z~-\u{10FFFF}]|[/?])|\?)(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*|([A-Za-z][+\-.0-9A-Za-z]*:?)?)|([A-Za-z][+\-.0-9A-Za-z]*:)?\/((%[0-9A-Fa-f]{2}|\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~-\u{10FFFF}])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~-\u{10FFFF}]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?[/?]|[!$&-.0-;=?-Z_a-z~-\u{10FFFF}])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*|(\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~-\u{10FFFF}])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~-\u{10FFFF}]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?)?))#(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*|(([A-Za-z][+\-.0-9A-Za-z]*)?%[0-9A-Fa-f]{2}|[!$&-.0-9;=@_~-\u{10FFFF}]|[A-Za-z][+\-.0-9A-Za-z]*[!$&-*,;=@_~-\u{10FFFF}])(%[0-9A-Fa-f]{2}|[!$&-.0-9;=@-Z_a-z~-\u{10FFFF}])*((([/?](%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*)?#|[/?])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*)?|([A-Za-z][+\-.0-9A-Za-z]*(:%[0-9A-Fa-f]{2}|:[!$&-.0-;=?-Z_a-z~-\u{10FFFF}]|[/?])|\?)(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*|([A-Za-z][+\-.0-9A-Za-z]*:)?\/((%[0-9A-Fa-f]{2}|\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~-\u{10FFFF}])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~-\u{10FFFF}]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?[/?]|[!$&-.0-;=?-Z_a-z~-\u{10FFFF}])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*|\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~-\u{10FFFF}])+(:\d*)?|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~-\u{10FFFF}]+)?|[.0-:A-Fa-f]+)\])?:\d*|\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~-\u{10FFFF}]+)?|[.0-:A-Fa-f]+)\])?)?|[A-Za-z][+\-.0-9A-Za-z]*:?)?$/u,
'uuid': /^[0-9A-F]{8}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{12}$/i,
// 7.3.6. uri-template
'uri-template': /(%[0-9a-f]{2}|[!#$&(-;=?@\[\]_a-z~]|\{[!#&+,./;=?@|]?(%[0-9a-f]{2}|[0-9_a-z])(\.?(%[0-9a-f]{2}|[0-9_a-z]))*(:[1-9]\d{0,3}|\*)?(,(%[0-9a-f]{2}|[0-9_a-z])(\.?(%[0-9a-f]{2}|[0-9_a-z]))*(:[1-9]\d{0,3}|\*)?)*\})*/iu,
// 7.3.7. JSON Pointers
'json-pointer': /^(\/([\x00-\x2e0-@\[-}\x7f]|~[01])*)*$/iu,
'relative-json-pointer': /^\d+(#|(\/([\x00-\x2e0-@\[-}\x7f]|~[01])*)*)$/iu,
// hostname regex from: http://stackoverflow.com/a/1420225/5628
'hostname': /^(?=.{1,255}$)[0-9A-Za-z](?:(?:[0-9A-Za-z]|-){0,61}[0-9A-Za-z])?(?:\.[0-9A-Za-z](?:(?:[0-9A-Za-z]|-){0,61}[0-9A-Za-z])?)*\.?$/,
'host-name': /^(?=.{1,255}$)[0-9A-Za-z](?:(?:[0-9A-Za-z]|-){0,61}[0-9A-Za-z])?(?:\.[0-9A-Za-z](?:(?:[0-9A-Za-z]|-){0,61}[0-9A-Za-z])?)*\.?$/,
'alpha': /^[a-zA-Z]+$/,
'alphanumeric': /^[a-zA-Z0-9]+$/,
'utc-millisec': function (input) {
return (typeof input === 'string') && parseFloat(input) === parseInt(input, 10) && !isNaN(input);
},
// 7.3.8. regex
'regex': function (input) {
var result = true;
try {
@@ -142,8 +196,15 @@ var FORMAT_REGEXPS = exports.FORMAT_REGEXPS = {
}
return result;
},
'style': /\s*(.+?):\s*([^;]+);?/,
'phone': /^\+(?:[0-9] ?){6,14}[0-9]$/
// Other definitions
// "style" was removed from JSON Schema in draft-4 and is deprecated
'style': /[\r\n\t ]*[^\r\n\t ][^:]*:[\r\n\t ]*[^\r\n\t ;]*[\r\n\t ]*;?/,
// "color" was removed from JSON Schema in draft-4 and is deprecated
'color': /^(#?([0-9A-Fa-f]{3}){1,2}\b|aqua|black|blue|fuchsia|gray|green|lime|maroon|navy|olive|orange|purple|red|silver|teal|white|yellow|(rgb\(\s*\b([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\b\s*,\s*\b([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\b\s*,\s*\b([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\b\s*\))|(rgb\(\s*(\d?\d%|100%)+\s*,\s*(\d?\d%|100%)+\s*,\s*(\d?\d%|100%)+\s*\)))$/,
'phone': /^\+(?:[0-9] ?){6,14}[0-9]$/,
'alpha': /^[a-zA-Z]+$/,
'alphanumeric': /^[a-zA-Z0-9]+$/,
};
FORMAT_REGEXPS.regexp = FORMAT_REGEXPS.regex;
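A sketch exercising one of the newly added draft-07 formats (assuming format validation is enabled, which is the default):

```javascript
var validate = require('jsonschema').validate;
validate('123e4567-e89b-12d3-a456-426614174000', { type: 'string', format: 'uuid' }).valid; // true
validate('not-a-uuid', { type: 'string', format: 'uuid' }).valid;                           // false
```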
@@ -212,10 +273,10 @@ exports.deepCompareStrict = function deepCompareStrict (a, b) {
function deepMerger (target, dst, e, i) {
if (typeof e === 'object') {
dst[i] = deepMerge(target[i], e)
dst[i] = deepMerge(target[i], e);
} else {
if (target.indexOf(e) === -1) {
dst.push(e)
dst.push(e);
}
}
}
@@ -232,7 +293,7 @@ function copyistWithDeepMerge (target, src, dst, key) {
if (!target[key]) {
dst[key] = src[key];
} else {
dst[key] = deepMerge(target[key], src[key])
dst[key] = deepMerge(target[key], src[key]);
}
}
}
@@ -253,7 +314,7 @@ function deepMerge (target, src) {
}
return dst;
};
}
module.exports.deepMerge = deepMerge;
@@ -284,9 +345,9 @@ function pathEncoder (v) {
* @return {String}
*/
exports.encodePath = function encodePointer(a){
// ~ must be encoded explicitly because hacks
// the slash is encoded by encodeURIComponent
return a.map(pathEncoder).join('');
// ~ must be encoded explicitly because hacks
// the slash is encoded by encodeURIComponent
return a.map(pathEncoder).join('');
};
@@ -323,3 +384,7 @@ exports.getDecimalPlaces = function getDecimalPlaces(number) {
return decimalPlaces;
};
exports.isSchema = function isSchema(val){
return (typeof val === 'object' && val) || (typeof val === 'boolean');
};


@@ -29,6 +29,7 @@ export declare class ValidatorResult {
export declare class ValidationError {
constructor(message?: string, instance?: any, schema?: Schema, propertyPath?: any, name?: string, argument?: any);
path: (string|number)[];
property: string;
message: string;
schema: string|Schema;
@@ -48,6 +49,7 @@ export declare class SchemaError extends Error{
export declare function validate(instance: any, schema: any, options?: Options): ValidatorResult
export interface Schema {
$id?: string
id?: string
$schema?: string
$ref?: string
@@ -55,9 +57,9 @@ export interface Schema {
description?: string
multipleOf?: number
maximum?: number
exclusiveMaximum?: boolean
exclusiveMaximum?: number | boolean
minimum?: number
exclusiveMinimum?: boolean
exclusiveMinimum?: number | boolean
maxLength?: number
minLength?: number
pattern?: string | RegExp
@@ -82,6 +84,7 @@ export interface Schema {
dependencies?: {
[name: string]: Schema | string[]
}
const?: any
'enum'?: any[]
type?: string | string[]
format?: string
@@ -89,27 +92,39 @@ export interface Schema {
anyOf?: Schema[]
oneOf?: Schema[]
not?: Schema
if?: Schema
then?: Schema
else?: Schema
}
export interface Options {
skipAttributes?: string[];
allowUnknownAttributes?: boolean;
preValidateProperty?: PreValidatePropertyFunction;
rewrite?: RewriteFunction;
propertyName?: string;
base?: string;
throwError?: boolean;
required?: boolean;
throwFirst?: boolean;
throwAll?: boolean;
nestedErrors?: boolean;
}
export interface RewriteFunction {
(instance: any, schema: Schema, options: Options, ctx: SchemaContext): any;
}
export interface PreValidatePropertyFunction {
(instance: any, key: string, schema: Schema, options: Options, ctx: SchemaContext): any;
}
export interface SchemaContext {
schema: Schema;
options: Options;
propertyPath: string;
base: string;
schemas: {[base: string]: Schema};
makeChild: (schema: Schema, key: string) => SchemaContext;
}
export interface CustomFormat {


@@ -3,6 +3,7 @@
var Validator = module.exports.Validator = require('./validator');
module.exports.ValidatorResult = require('./helpers').ValidatorResult;
module.exports.ValidatorResultError = require('./helpers').ValidatorResultError;
module.exports.ValidationError = require('./helpers').ValidationError;
module.exports.SchemaError = require('./helpers').SchemaError;
module.exports.SchemaScanResult = require('./scan').SchemaScanResult;


@@ -1,3 +1,4 @@
"use strict";
var urilib = require('url');
var helpers = require('./helpers');
@@ -23,13 +24,14 @@ module.exports.scan = function scan(base, schema){
ref[resolvedUri] = ref[resolvedUri] ? ref[resolvedUri]+1 : 0;
return;
}
var ourBase = schema.id ? urilib.resolve(baseuri, schema.id) : baseuri;
var id = schema.$id || schema.id;
var ourBase = id ? urilib.resolve(baseuri, id) : baseuri;
if (ourBase) {
// If there's no fragment, append an empty one
if(ourBase.indexOf('#')<0) ourBase += '#';
if(found[ourBase]){
if(!helpers.deepCompareStrict(found[ourBase], schema)){
throw new Error('Schema <'+schema+'> already exists with different definition');
throw new Error('Schema <'+ourBase+'> already exists with different definition');
}
return found[ourBase];
}
@@ -68,7 +70,6 @@ module.exports.scan = function scan(base, schema){
var found = {};
var ref = {};
var schemaUri = base;
scanSchema(base, schema);
return new SchemaScanResult(found, ref);
}
};


@@ -6,6 +6,7 @@ var attribute = require('./attribute');
var helpers = require('./helpers');
var scanSchema = require('./scan').scan;
var ValidatorResult = helpers.ValidatorResult;
var ValidatorResultError = helpers.ValidatorResultError;
var SchemaError = helpers.SchemaError;
var SchemaContext = helpers.SchemaContext;
//var anonymousBase = 'vnd.jsonschema:///';
@@ -49,13 +50,15 @@ Validator.prototype.addSchema = function addSchema (schema, base) {
return null;
}
var scan = scanSchema(base||anonymousBase, schema);
var ourUri = base || schema.id;
var ourUri = base || schema.$id || schema.id;
for(var uri in scan.id){
this.schemas[uri] = scan.id[uri];
}
for(var uri in scan.ref){
// If this schema is already defined, it will be filtered out by the next step
this.unresolvedRefs.push(uri);
}
// Remove newly defined schemas from unresolvedRefs
this.unresolvedRefs = this.unresolvedRefs.filter(function(uri){
return typeof self.schemas[uri]==='undefined';
});
@@ -103,14 +106,18 @@ Validator.prototype.getSchema = function getSchema (urn) {
* @return {Array}
*/
Validator.prototype.validate = function validate (instance, schema, options, ctx) {
if((typeof schema !== 'boolean' && typeof schema !== 'object') || schema === null){
throw new SchemaError('Expected `schema` to be an object or boolean');
}
if (!options) {
options = {};
}
var propertyName = options.propertyName || 'instance';
// This section indexes subschemas in the provided schema, so they don't need to be added with Validator#addSchema
// This will work so long as the function at uri.resolve() will resolve a relative URI to a relative URI
var base = urilib.resolve(options.base||anonymousBase, schema.id||'');
var id = schema.$id || schema.id;
var base = urilib.resolve(options.base||anonymousBase, id||'');
if(!ctx){
ctx = new SchemaContext(schema, options, propertyName, base, Object.create(this.schemas));
ctx = new SchemaContext(schema, options, [], base, Object.create(this.schemas));
if (!ctx.schemas[base]) {
ctx.schemas[base] = schema;
}
@@ -120,14 +127,18 @@ Validator.prototype.validate = function validate (instance, schema, options, ctx
ctx.schemas[n] = sch;
}
}
if (schema) {
var result = this.validateSchema(instance, schema, options, ctx);
if (!result) {
throw new Error('Result undefined');
}
if(options.required && instance===undefined){
var result = new ValidatorResult(instance, schema, options, ctx);
result.addError('is required, but is undefined');
return result;
}
throw new SchemaError('no schema specified', schema);
var result = this.validateSchema(instance, schema, options, ctx);
if (!result) {
throw new Error('Result undefined');
}else if(options.throwAll && result.errors.length){
throw new ValidatorResultError(result);
}
return result;
};
/**
@@ -152,7 +163,7 @@ function shouldResolve(schema) {
Validator.prototype.validateSchema = function validateSchema (instance, schema, options, ctx) {
var result = new ValidatorResult(instance, schema, options, ctx);
// Support for the true/false schemas
if(typeof schema==='boolean') {
if(schema===true){
// `true` is always valid
@@ -180,10 +191,10 @@ Validator.prototype.validateSchema = function validateSchema (instance, schema,
}
// If passed a string argument, load that schema URI
var switchSchema;
if (switchSchema = shouldResolve(schema)) {
var switchSchema = shouldResolve(schema);
if (switchSchema) {
var resolved = this.resolve(schema, switchSchema, ctx);
var subctx = new SchemaContext(resolved.subschema, options, ctx.propertyPath, resolved.switchSchema, ctx.schemas);
var subctx = new SchemaContext(resolved.subschema, options, ctx.path, resolved.switchSchema, ctx.schemas);
return this.validateSchema(instance, resolved.subschema, options, subctx);
}
@@ -220,7 +231,7 @@ Validator.prototype.validateSchema = function validateSchema (instance, schema,
*/
Validator.prototype.schemaTraverser = function schemaTraverser (schemaobj, s) {
schemaobj.schema = helpers.deepMerge(schemaobj.schema, this.superResolve(s, schemaobj.ctx));
}
};
/**
* @private
@@ -229,12 +240,12 @@ Validator.prototype.schemaTraverser = function schemaTraverser (schemaobj, s) {
* @returns Object schema or resolved schema
*/
Validator.prototype.superResolve = function superResolve (schema, ctx) {
var ref;
if(ref = shouldResolve(schema)) {
var ref = shouldResolve(schema);
if(ref) {
return this.resolve(schema, ref, ctx).subschema;
}
return schema;
}
};
/**
* @private
@@ -275,6 +286,11 @@ Validator.prototype.resolve = function resolve (schema, switchSchema, ctx) {
* @return {boolean}
*/
Validator.prototype.testType = function validateType (instance, schema, options, ctx, type) {
if(type===undefined){
return;
}else if(type===null){
throw new SchemaError('Unexpected null in "type" keyword');
}
if (typeof this.types[type] == 'function') {
return this.types[type].call(this, instance);
}

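The reworked validate() above rejects a non-object, non-boolean schema argument up front, reports an undefined instance as a validation error when options.required is set, and throws the new ValidatorResultError when options.throwAll is set and errors were collected. A minimal sketch of those behaviours, assuming the published jsonschema npm package (the example schemas and values are made up):

const { Validator, ValidatorResultError } = require('jsonschema');
const v = new Validator();

// A missing schema argument is now rejected immediately with a SchemaError.
// v.validate({}, undefined); // throws: Expected `schema` to be an object or boolean

// With `required: true`, an undefined instance yields a validation error
// instead of passing silently.
const missing = v.validate(undefined, { type: 'string' }, { required: true });
console.log(missing.errors.length); // 1

// With `throwAll: true`, all collected errors are raised as a single
// ValidatorResultError.
try {
  v.validate(
    { port: 'abc' },
    { type: 'object', properties: { port: { type: 'number' } } },
    { throwAll: true }
  );
} catch (e) {
  console.log(e instanceof ValidatorResultError); // true
}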
node_modules/jsonschema/package.json

@@ -1,7 +1,7 @@
{
"author": "Tom de Grunt <tom@degrunt.nl>",
"name": "jsonschema",
"version": "1.2.6",
"version": "1.4.1",
"license": "MIT",
"dependencies": {},
"contributors": [
@@ -9,12 +9,15 @@
"name": "Austin Wright"
}
],
"main": "./lib",
"main": "./lib/index.js",
"typings": "./lib/index.d.ts",
"devDependencies": {
"@stryker-mutator/core": "^4.0.0",
"@stryker-mutator/mocha-runner": "^4.0.0",
"chai": "~4.2.0",
"eslint": "^7.7.0",
"json-metaschema": "^1.2.0",
"mocha": "~3",
"chai": "~1.5.0"
"mocha": "~8.1.1"
},
"optionalDependencies": {},
"engines": {
@@ -33,6 +36,7 @@
},
"description": "A fast and easy to use JSON Schema validator",
"scripts": {
"stryker": "stryker run",
"test": "./node_modules/.bin/mocha -R spec"
}
}

package-lock.json

@@ -1,12 +1,12 @@
{
"name": "codeql",
"version": "2.3.3",
"version": "2.3.7",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "codeql",
"version": "2.3.3",
"version": "2.3.7",
"license": "MIT",
"dependencies": {
"@actions/artifact": "^1.1.0",
@@ -29,7 +29,7 @@
"fs": "0.0.1-security",
"get-folder-size": "^2.0.1",
"js-yaml": "^4.1.0",
"jsonschema": "1.2.6",
"jsonschema": "1.4.1",
"long": "^5.2.0",
"path": "^0.12.7",
"semver": "^7.3.2",
@@ -4202,8 +4202,9 @@
}
},
"node_modules/jsonschema": {
"version": "1.2.6",
"integrity": "sha512-SqhURKZG07JyKKeo/ir24QnS4/BV7a6gQy93bUSe4lUdNp0QNpIz2c9elWJQ9dpc5cQYY6cvCzgRwy0MQCLyqA==",
"version": "1.4.1",
"resolved": "https://registry.npmjs.org/jsonschema/-/jsonschema-1.4.1.tgz",
"integrity": "sha512-S6cATIPVv1z0IlxdN+zUk5EPjkGCdnhN4wVSBlvoUO1tOLJootbo9CquNJmbIh4yikWHiUedhRYrNPn1arpEmQ==",
"engines": {
"node": "*"
}


@@ -1,6 +1,6 @@
{
"name": "codeql",
"version": "2.3.3",
"version": "2.3.7",
"private": true,
"description": "CodeQL action",
"scripts": {
@@ -41,7 +41,7 @@
"fs": "0.0.1-security",
"get-folder-size": "^2.0.1",
"js-yaml": "^4.1.0",
"jsonschema": "1.2.6",
"jsonschema": "1.4.1",
"long": "^5.2.0",
"path": "^0.12.7",
"semver": "^7.3.2",


@@ -21,7 +21,7 @@ steps:
with:
upload-database: false
- name: Check language autodetect for all languages excluding Ruby, Swift
- name: Check language autodetect for all languages excluding Swift
shell: bash
run: |
CPP_DB=${{ fromJson(steps.analysis.outputs.db-locations).cpp }}
@@ -54,11 +54,6 @@ steps:
echo "Did not create a database for Python, or created it in the wrong location."
exit 1
fi
- name: Check language autodetect for Ruby
if: env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true'
shell: bash
run: |
RUBY_DB=${{ fromJson(steps.analysis.outputs.db-locations).ruby }}
if [[ ! -d $RUBY_DB ]] || [[ ! $RUBY_DB == ${{ runner.temp }}/customDbLocation/* ]]; then
echo "Did not create a database for Ruby, or created it in the wrong location."
@@ -66,7 +61,9 @@ steps:
fi
- name: Check language autodetect for Swift
if: env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true'
if: >-
env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true' ||
(runner.os != 'Windows' && matrix.version == 'nightly-latest')
shell: bash
run: |
SWIFT_DB=${{ fromJson(steps.analysis.outputs.db-locations).swift }}


@@ -24,7 +24,7 @@ defaultTestVersions = [
header = """# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
# (cd pr-checks; pip install ruamel.yaml && python3 sync.py)
# to regenerate this file.
"""
@@ -81,10 +81,7 @@ for file in os.listdir('checks'):
'if': FoldedScalarString(textwrap.dedent('''
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
matrix.version == '20221211'
)
''').strip()),
'shell': 'bash',
@@ -101,6 +98,10 @@ for file in os.listdir('checks'):
}
},
'name': checkSpecification['name'],
'permissions': {
'contents': 'read',
'security-events': 'write'
},
'timeout-minutes': 45,
'runs-on': '${{ matrix.os }}',
'steps': steps,


@@ -0,0 +1,27 @@
#! /usr/bin/pwsh
$EXPECTED_PYTHON_VERSION=$args[0]
$EXPECTED_REQUESTS_VERSION=$args[1]
$FOUND_PYTHON_VERSION="$Env:LGTM_PYTHON_SETUP_VERSION"
$FOUND_PYTHONPATH="$Env:LGTM_INDEX_IMPORT_PATH"
write-host "FOUND_PYTHON_VERSION=$FOUND_PYTHON_VERSION FOUND_PYTHONPATH=$FOUND_PYTHONPATH "
if ($FOUND_PYTHON_VERSION -ne $EXPECTED_PYTHON_VERSION) {
write-host "Script told us to use Python $FOUND_PYTHON_VERSION, but expected $EXPECTED_PYTHON_VERSION"
exit 1
} else {
write-host "Script told us to use Python $FOUND_PYTHON_VERSION, which was expected"
}
$env:PYTHONPATH=$FOUND_PYTHONPATH
$INSTALLED_REQUESTS_VERSION = (py -3 -c "import requests; print(requests.__version__)")
if ($INSTALLED_REQUESTS_VERSION -ne $EXPECTED_REQUESTS_VERSION) {
write-host "Using $FOUND_PYTHONPATH as PYTHONPATH, we found version $INSTALLED_REQUESTS_VERSION of requests, but expected $EXPECTED_REQUESTS_VERSION"
exit 1
} else {
write-host "Using $FOUND_PYTHONPATH as PYTHONPATH, we found version $INSTALLED_REQUESTS_VERSION of requests, which was expected"
}


@@ -0,0 +1,31 @@
#!/bin/bash
set -e
SCRIPTDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
EXPECTED_PYTHON_VERSION=$1
EXPECTED_REQUESTS_VERSION=$2
FOUND_PYTHON_VERSION="$LGTM_PYTHON_SETUP_VERSION"
FOUND_PYTHONPATH="$LGTM_INDEX_IMPORT_PATH"
echo "FOUND_PYTHON_VERSION=${FOUND_PYTHON_VERSION} FOUND_PYTHONPATH=${FOUND_PYTHONPATH} "
if [[ $FOUND_PYTHON_VERSION != $EXPECTED_PYTHON_VERSION ]]; then
echo "Script told us to use Python ${FOUND_PYTHON_VERSION}, but expected ${EXPECTED_PYTHON_VERSION}"
exit 1
else
echo "Script told us to use Python ${FOUND_PYTHON_VERSION}, which was expected"
fi
PYTHON_EXE="python${EXPECTED_PYTHON_VERSION}"
INSTALLED_REQUESTS_VERSION=$(PYTHONPATH="${FOUND_PYTHONPATH}" "${PYTHON_EXE}" -c 'import requests; print(requests.__version__)')
if [[ "$INSTALLED_REQUESTS_VERSION" != "$EXPECTED_REQUESTS_VERSION" ]]; then
echo "Using ${FOUND_PYTHONPATH} as PYTHONPATH, we found version $INSTALLED_REQUESTS_VERSION of requests, but expected $EXPECTED_REQUESTS_VERSION"
exit 1
else
echo "Using ${FOUND_PYTHONPATH} as PYTHONPATH, we found version $INSTALLED_REQUESTS_VERSION of requests, which was expected"
fi


@@ -1,28 +0,0 @@
#! /usr/bin/pwsh
$EXPECTED_VERSION=$args[0]
$FOUND_VERSION="$Env:LGTM_PYTHON_SETUP_VERSION"
$FOUND_PYTHONPATH="$Env:LGTM_INDEX_IMPORT_PATH"
write-host "FOUND_VERSION=$FOUND_VERSION FOUND_PYTHONPATH=$FOUND_PYTHONPATH "
if ($FOUND_VERSION -ne $EXPECTED_VERSION) {
write-host "Script told us to use Python $FOUND_VERSION, but expected $EXPECTED_VERSION"
exit 1
} else {
write-host "Script told us to use Python $FOUND_VERSION, which was expected"
}
$env:PYTHONPATH=$FOUND_PYTHONPATH
$INSTALLED_REQUESTS_VERSION = (py -3 -c "import requests; print(requests.__version__)")
$EXPECTED_REQUESTS="2.26.0"
if ($INSTALLED_REQUESTS_VERSION -ne $EXPECTED_REQUESTS) {
write-host "Using $FOUND_PYTHONPATH as PYTHONPATH, we found version $INSTALLED_REQUESTS_VERSION of requests, but expected $EXPECTED_REQUESTS"
exit 1
} else {
write-host "Using $FOUND_PYTHONPATH as PYTHONPATH, we found version $INSTALLED_REQUESTS_VERSION of requests, which was expected"
}


@@ -1,32 +0,0 @@
#!/bin/bash
set -e
SCRIPTDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
EXPECTED_VERSION=$1
FOUND_VERSION="$LGTM_PYTHON_SETUP_VERSION"
FOUND_PYTHONPATH="$LGTM_INDEX_IMPORT_PATH"
echo "FOUND_VERSION=${FOUND_VERSION} FOUND_PYTHONPATH=${FOUND_PYTHONPATH} "
if [[ $FOUND_VERSION != $EXPECTED_VERSION ]]; then
echo "Script told us to use Python ${FOUND_VERSION}, but expected ${EXPECTED_VERSION}"
exit 1
else
echo "Script told us to use Python ${FOUND_VERSION}, which was expected"
fi
PYTHON_EXE="python${EXPECTED_VERSION}"
INSTALLED_REQUESTS_VERSION=$(PYTHONPATH="${FOUND_PYTHONPATH}" "${PYTHON_EXE}" -c 'import requests; print(requests.__version__)')
EXPECTED_REQUESTS="2.26.0"
if [[ "$INSTALLED_REQUESTS_VERSION" != "$EXPECTED_REQUESTS" ]]; then
echo "Using ${FOUND_PYTHONPATH} as PYTHONPATH, we found version $INSTALLED_REQUESTS_VERSION of requests, but expected $EXPECTED_REQUESTS"
exit 1
else
echo "Using ${FOUND_PYTHONPATH} as PYTHONPATH, we found version $INSTALLED_REQUESTS_VERSION of requests, which was expected"
fi


@@ -18,43 +18,116 @@
"default": {
"certifi": {
"hashes": [
"sha256:35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3",
"sha256:4ad3232f5e926d6718ec31cfc1fcadfde020920e278684144551c91769c7bc18"
"sha256:0f0d56dc5a6ad56fd4ba36484d6cc34451e1c6548c61daad8c320169f91eddc7",
"sha256:c6c2e98f5c7869efca1f8916fed228dd91539f9f1b444c314c06eef02980c716"
],
"index": "pypi",
"version": "==2022.12.7"
"markers": "python_version >= '3.6'",
"version": "==2023.5.7"
},
"charset-normalizer": {
"hashes": [
"sha256:2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597",
"sha256:6881edbebdb17b39b4eaaa821b438bf6eddffb4468cf344f09f89def34a8b1df"
"sha256:04afa6387e2b282cf78ff3dbce20f0cc071c12dc8f685bd40960cc68644cfea6",
"sha256:04eefcee095f58eaabe6dc3cc2262f3bcd776d2c67005880894f447b3f2cb9c1",
"sha256:0be65ccf618c1e7ac9b849c315cc2e8a8751d9cfdaa43027d4f6624bd587ab7e",
"sha256:0c95f12b74681e9ae127728f7e5409cbbef9cd914d5896ef238cc779b8152373",
"sha256:0ca564606d2caafb0abe6d1b5311c2649e8071eb241b2d64e75a0d0065107e62",
"sha256:10c93628d7497c81686e8e5e557aafa78f230cd9e77dd0c40032ef90c18f2230",
"sha256:11d117e6c63e8f495412d37e7dc2e2fff09c34b2d09dbe2bee3c6229577818be",
"sha256:11d3bcb7be35e7b1bba2c23beedac81ee893ac9871d0ba79effc7fc01167db6c",
"sha256:12a2b561af122e3d94cdb97fe6fb2bb2b82cef0cdca131646fdb940a1eda04f0",
"sha256:12d1a39aa6b8c6f6248bb54550efcc1c38ce0d8096a146638fd4738e42284448",
"sha256:1435ae15108b1cb6fffbcea2af3d468683b7afed0169ad718451f8db5d1aff6f",
"sha256:1c60b9c202d00052183c9be85e5eaf18a4ada0a47d188a83c8f5c5b23252f649",
"sha256:1e8fcdd8f672a1c4fc8d0bd3a2b576b152d2a349782d1eb0f6b8e52e9954731d",
"sha256:20064ead0717cf9a73a6d1e779b23d149b53daf971169289ed2ed43a71e8d3b0",
"sha256:21fa558996782fc226b529fdd2ed7866c2c6ec91cee82735c98a197fae39f706",
"sha256:22908891a380d50738e1f978667536f6c6b526a2064156203d418f4856d6e86a",
"sha256:3160a0fd9754aab7d47f95a6b63ab355388d890163eb03b2d2b87ab0a30cfa59",
"sha256:322102cdf1ab682ecc7d9b1c5eed4ec59657a65e1c146a0da342b78f4112db23",
"sha256:34e0a2f9c370eb95597aae63bf85eb5e96826d81e3dcf88b8886012906f509b5",
"sha256:3573d376454d956553c356df45bb824262c397c6e26ce43e8203c4c540ee0acb",
"sha256:3747443b6a904001473370d7810aa19c3a180ccd52a7157aacc264a5ac79265e",
"sha256:38e812a197bf8e71a59fe55b757a84c1f946d0ac114acafaafaf21667a7e169e",
"sha256:3a06f32c9634a8705f4ca9946d667609f52cf130d5548881401f1eb2c39b1e2c",
"sha256:3a5fc78f9e3f501a1614a98f7c54d3969f3ad9bba8ba3d9b438c3bc5d047dd28",
"sha256:3d9098b479e78c85080c98e1e35ff40b4a31d8953102bb0fd7d1b6f8a2111a3d",
"sha256:3dc5b6a8ecfdc5748a7e429782598e4f17ef378e3e272eeb1340ea57c9109f41",
"sha256:4155b51ae05ed47199dc5b2a4e62abccb274cee6b01da5b895099b61b1982974",
"sha256:49919f8400b5e49e961f320c735388ee686a62327e773fa5b3ce6721f7e785ce",
"sha256:53d0a3fa5f8af98a1e261de6a3943ca631c526635eb5817a87a59d9a57ebf48f",
"sha256:5f008525e02908b20e04707a4f704cd286d94718f48bb33edddc7d7b584dddc1",
"sha256:628c985afb2c7d27a4800bfb609e03985aaecb42f955049957814e0491d4006d",
"sha256:65ed923f84a6844de5fd29726b888e58c62820e0769b76565480e1fdc3d062f8",
"sha256:6734e606355834f13445b6adc38b53c0fd45f1a56a9ba06c2058f86893ae8017",
"sha256:6baf0baf0d5d265fa7944feb9f7451cc316bfe30e8df1a61b1bb08577c554f31",
"sha256:6f4f4668e1831850ebcc2fd0b1cd11721947b6dc7c00bf1c6bd3c929ae14f2c7",
"sha256:6f5c2e7bc8a4bf7c426599765b1bd33217ec84023033672c1e9a8b35eaeaaaf8",
"sha256:6f6c7a8a57e9405cad7485f4c9d3172ae486cfef1344b5ddd8e5239582d7355e",
"sha256:7381c66e0561c5757ffe616af869b916c8b4e42b367ab29fedc98481d1e74e14",
"sha256:73dc03a6a7e30b7edc5b01b601e53e7fc924b04e1835e8e407c12c037e81adbd",
"sha256:74db0052d985cf37fa111828d0dd230776ac99c740e1a758ad99094be4f1803d",
"sha256:75f2568b4189dda1c567339b48cba4ac7384accb9c2a7ed655cd86b04055c795",
"sha256:78cacd03e79d009d95635e7d6ff12c21eb89b894c354bd2b2ed0b4763373693b",
"sha256:80d1543d58bd3d6c271b66abf454d437a438dff01c3e62fdbcd68f2a11310d4b",
"sha256:830d2948a5ec37c386d3170c483063798d7879037492540f10a475e3fd6f244b",
"sha256:891cf9b48776b5c61c700b55a598621fdb7b1e301a550365571e9624f270c203",
"sha256:8f25e17ab3039b05f762b0a55ae0b3632b2e073d9c8fc88e89aca31a6198e88f",
"sha256:9a3267620866c9d17b959a84dd0bd2d45719b817245e49371ead79ed4f710d19",
"sha256:a04f86f41a8916fe45ac5024ec477f41f886b3c435da2d4e3d2709b22ab02af1",
"sha256:aaf53a6cebad0eae578f062c7d462155eada9c172bd8c4d250b8c1d8eb7f916a",
"sha256:abc1185d79f47c0a7aaf7e2412a0eb2c03b724581139193d2d82b3ad8cbb00ac",
"sha256:ac0aa6cd53ab9a31d397f8303f92c42f534693528fafbdb997c82bae6e477ad9",
"sha256:ac3775e3311661d4adace3697a52ac0bab17edd166087d493b52d4f4f553f9f0",
"sha256:b06f0d3bf045158d2fb8837c5785fe9ff9b8c93358be64461a1089f5da983137",
"sha256:b116502087ce8a6b7a5f1814568ccbd0e9f6cfd99948aa59b0e241dc57cf739f",
"sha256:b82fab78e0b1329e183a65260581de4375f619167478dddab510c6c6fb04d9b6",
"sha256:bd7163182133c0c7701b25e604cf1611c0d87712e56e88e7ee5d72deab3e76b5",
"sha256:c36bcbc0d5174a80d6cccf43a0ecaca44e81d25be4b7f90f0ed7bcfbb5a00909",
"sha256:c3af8e0f07399d3176b179f2e2634c3ce9c1301379a6b8c9c9aeecd481da494f",
"sha256:c84132a54c750fda57729d1e2599bb598f5fa0344085dbde5003ba429a4798c0",
"sha256:cb7b2ab0188829593b9de646545175547a70d9a6e2b63bf2cd87a0a391599324",
"sha256:cca4def576f47a09a943666b8f829606bcb17e2bc2d5911a46c8f8da45f56755",
"sha256:cf6511efa4801b9b38dc5546d7547d5b5c6ef4b081c60b23e4d941d0eba9cbeb",
"sha256:d16fd5252f883eb074ca55cb622bc0bee49b979ae4e8639fff6ca3ff44f9f854",
"sha256:d2686f91611f9e17f4548dbf050e75b079bbc2a82be565832bc8ea9047b61c8c",
"sha256:d7fc3fca01da18fbabe4625d64bb612b533533ed10045a2ac3dd194bfa656b60",
"sha256:dd5653e67b149503c68c4018bf07e42eeed6b4e956b24c00ccdf93ac79cdff84",
"sha256:de5695a6f1d8340b12a5d6d4484290ee74d61e467c39ff03b39e30df62cf83a0",
"sha256:e0ac8959c929593fee38da1c2b64ee9778733cdf03c482c9ff1d508b6b593b2b",
"sha256:e1b25e3ad6c909f398df8921780d6a3d120d8c09466720226fc621605b6f92b1",
"sha256:e633940f28c1e913615fd624fcdd72fdba807bf53ea6925d6a588e84e1151531",
"sha256:e89df2958e5159b811af9ff0f92614dabf4ff617c03a4c1c6ff53bf1c399e0e1",
"sha256:ea9f9c6034ea2d93d9147818f17c2a0860d41b71c38b9ce4d55f21b6f9165a11",
"sha256:f645caaf0008bacf349875a974220f1f1da349c5dbe7c4ec93048cdc785a3326",
"sha256:f8303414c7b03f794347ad062c0516cee0e15f7a612abd0ce1e25caf6ceb47df",
"sha256:fca62a8301b605b954ad2e9c3666f9d97f63872aa4efcae5492baca2056b74ab"
],
"markers": "python_version >= '3'",
"version": "==2.0.12"
"markers": "python_full_version >= '3.7.0'",
"version": "==3.1.0"
},
"idna": {
"hashes": [
"sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4",
"sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"
],
"markers": "python_version >= '3'",
"markers": "python_version >= '3.5'",
"version": "==3.4"
},
"requests": {
"hashes": [
"sha256:6c1246513ecd5ecd4528a0906f910e8f0f9c6b8ec72030dc9fd154dc1a6efd24",
"sha256:b8aa58f8cf793ffd8782d3d8cb19e66ef36f7aba4353eec859e74678b01b07a7"
"sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f",
"sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"
],
"index": "pypi",
"version": "==2.26.0"
"version": "==2.31.0"
},
"urllib3": {
"hashes": [
"sha256:47cc05d99aaa09c9e72ed5809b60e7ba354e64b59c9c173ac3018642d8bb41fc",
"sha256:c083dd0dce68dbfbe1129d5271cb90f9447dea7d52097c6e0126120c521ddea8"
"sha256:61717a1095d7e155cdb737ac7bb2f4324a858a1e2e6466f6d03ff630ca68d3cc",
"sha256:d055c2f9d38dc53c808f6fdc8eab7360b6fdbbde02340ed25cfbcd817c62469e"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'",
"version": "==1.26.13"
"markers": "python_version >= '3.7'",
"version": "==2.0.2"
}
},
"develop": {}


@@ -16,43 +16,116 @@
"default": {
"certifi": {
"hashes": [
"sha256:35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3",
"sha256:4ad3232f5e926d6718ec31cfc1fcadfde020920e278684144551c91769c7bc18"
"sha256:0f0d56dc5a6ad56fd4ba36484d6cc34451e1c6548c61daad8c320169f91eddc7",
"sha256:c6c2e98f5c7869efca1f8916fed228dd91539f9f1b444c314c06eef02980c716"
],
"index": "pypi",
"version": "==2022.12.7"
"markers": "python_version >= '3.6'",
"version": "==2023.5.7"
},
"charset-normalizer": {
"hashes": [
"sha256:2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597",
"sha256:6881edbebdb17b39b4eaaa821b438bf6eddffb4468cf344f09f89def34a8b1df"
"sha256:04afa6387e2b282cf78ff3dbce20f0cc071c12dc8f685bd40960cc68644cfea6",
"sha256:04eefcee095f58eaabe6dc3cc2262f3bcd776d2c67005880894f447b3f2cb9c1",
"sha256:0be65ccf618c1e7ac9b849c315cc2e8a8751d9cfdaa43027d4f6624bd587ab7e",
"sha256:0c95f12b74681e9ae127728f7e5409cbbef9cd914d5896ef238cc779b8152373",
"sha256:0ca564606d2caafb0abe6d1b5311c2649e8071eb241b2d64e75a0d0065107e62",
"sha256:10c93628d7497c81686e8e5e557aafa78f230cd9e77dd0c40032ef90c18f2230",
"sha256:11d117e6c63e8f495412d37e7dc2e2fff09c34b2d09dbe2bee3c6229577818be",
"sha256:11d3bcb7be35e7b1bba2c23beedac81ee893ac9871d0ba79effc7fc01167db6c",
"sha256:12a2b561af122e3d94cdb97fe6fb2bb2b82cef0cdca131646fdb940a1eda04f0",
"sha256:12d1a39aa6b8c6f6248bb54550efcc1c38ce0d8096a146638fd4738e42284448",
"sha256:1435ae15108b1cb6fffbcea2af3d468683b7afed0169ad718451f8db5d1aff6f",
"sha256:1c60b9c202d00052183c9be85e5eaf18a4ada0a47d188a83c8f5c5b23252f649",
"sha256:1e8fcdd8f672a1c4fc8d0bd3a2b576b152d2a349782d1eb0f6b8e52e9954731d",
"sha256:20064ead0717cf9a73a6d1e779b23d149b53daf971169289ed2ed43a71e8d3b0",
"sha256:21fa558996782fc226b529fdd2ed7866c2c6ec91cee82735c98a197fae39f706",
"sha256:22908891a380d50738e1f978667536f6c6b526a2064156203d418f4856d6e86a",
"sha256:3160a0fd9754aab7d47f95a6b63ab355388d890163eb03b2d2b87ab0a30cfa59",
"sha256:322102cdf1ab682ecc7d9b1c5eed4ec59657a65e1c146a0da342b78f4112db23",
"sha256:34e0a2f9c370eb95597aae63bf85eb5e96826d81e3dcf88b8886012906f509b5",
"sha256:3573d376454d956553c356df45bb824262c397c6e26ce43e8203c4c540ee0acb",
"sha256:3747443b6a904001473370d7810aa19c3a180ccd52a7157aacc264a5ac79265e",
"sha256:38e812a197bf8e71a59fe55b757a84c1f946d0ac114acafaafaf21667a7e169e",
"sha256:3a06f32c9634a8705f4ca9946d667609f52cf130d5548881401f1eb2c39b1e2c",
"sha256:3a5fc78f9e3f501a1614a98f7c54d3969f3ad9bba8ba3d9b438c3bc5d047dd28",
"sha256:3d9098b479e78c85080c98e1e35ff40b4a31d8953102bb0fd7d1b6f8a2111a3d",
"sha256:3dc5b6a8ecfdc5748a7e429782598e4f17ef378e3e272eeb1340ea57c9109f41",
"sha256:4155b51ae05ed47199dc5b2a4e62abccb274cee6b01da5b895099b61b1982974",
"sha256:49919f8400b5e49e961f320c735388ee686a62327e773fa5b3ce6721f7e785ce",
"sha256:53d0a3fa5f8af98a1e261de6a3943ca631c526635eb5817a87a59d9a57ebf48f",
"sha256:5f008525e02908b20e04707a4f704cd286d94718f48bb33edddc7d7b584dddc1",
"sha256:628c985afb2c7d27a4800bfb609e03985aaecb42f955049957814e0491d4006d",
"sha256:65ed923f84a6844de5fd29726b888e58c62820e0769b76565480e1fdc3d062f8",
"sha256:6734e606355834f13445b6adc38b53c0fd45f1a56a9ba06c2058f86893ae8017",
"sha256:6baf0baf0d5d265fa7944feb9f7451cc316bfe30e8df1a61b1bb08577c554f31",
"sha256:6f4f4668e1831850ebcc2fd0b1cd11721947b6dc7c00bf1c6bd3c929ae14f2c7",
"sha256:6f5c2e7bc8a4bf7c426599765b1bd33217ec84023033672c1e9a8b35eaeaaaf8",
"sha256:6f6c7a8a57e9405cad7485f4c9d3172ae486cfef1344b5ddd8e5239582d7355e",
"sha256:7381c66e0561c5757ffe616af869b916c8b4e42b367ab29fedc98481d1e74e14",
"sha256:73dc03a6a7e30b7edc5b01b601e53e7fc924b04e1835e8e407c12c037e81adbd",
"sha256:74db0052d985cf37fa111828d0dd230776ac99c740e1a758ad99094be4f1803d",
"sha256:75f2568b4189dda1c567339b48cba4ac7384accb9c2a7ed655cd86b04055c795",
"sha256:78cacd03e79d009d95635e7d6ff12c21eb89b894c354bd2b2ed0b4763373693b",
"sha256:80d1543d58bd3d6c271b66abf454d437a438dff01c3e62fdbcd68f2a11310d4b",
"sha256:830d2948a5ec37c386d3170c483063798d7879037492540f10a475e3fd6f244b",
"sha256:891cf9b48776b5c61c700b55a598621fdb7b1e301a550365571e9624f270c203",
"sha256:8f25e17ab3039b05f762b0a55ae0b3632b2e073d9c8fc88e89aca31a6198e88f",
"sha256:9a3267620866c9d17b959a84dd0bd2d45719b817245e49371ead79ed4f710d19",
"sha256:a04f86f41a8916fe45ac5024ec477f41f886b3c435da2d4e3d2709b22ab02af1",
"sha256:aaf53a6cebad0eae578f062c7d462155eada9c172bd8c4d250b8c1d8eb7f916a",
"sha256:abc1185d79f47c0a7aaf7e2412a0eb2c03b724581139193d2d82b3ad8cbb00ac",
"sha256:ac0aa6cd53ab9a31d397f8303f92c42f534693528fafbdb997c82bae6e477ad9",
"sha256:ac3775e3311661d4adace3697a52ac0bab17edd166087d493b52d4f4f553f9f0",
"sha256:b06f0d3bf045158d2fb8837c5785fe9ff9b8c93358be64461a1089f5da983137",
"sha256:b116502087ce8a6b7a5f1814568ccbd0e9f6cfd99948aa59b0e241dc57cf739f",
"sha256:b82fab78e0b1329e183a65260581de4375f619167478dddab510c6c6fb04d9b6",
"sha256:bd7163182133c0c7701b25e604cf1611c0d87712e56e88e7ee5d72deab3e76b5",
"sha256:c36bcbc0d5174a80d6cccf43a0ecaca44e81d25be4b7f90f0ed7bcfbb5a00909",
"sha256:c3af8e0f07399d3176b179f2e2634c3ce9c1301379a6b8c9c9aeecd481da494f",
"sha256:c84132a54c750fda57729d1e2599bb598f5fa0344085dbde5003ba429a4798c0",
"sha256:cb7b2ab0188829593b9de646545175547a70d9a6e2b63bf2cd87a0a391599324",
"sha256:cca4def576f47a09a943666b8f829606bcb17e2bc2d5911a46c8f8da45f56755",
"sha256:cf6511efa4801b9b38dc5546d7547d5b5c6ef4b081c60b23e4d941d0eba9cbeb",
"sha256:d16fd5252f883eb074ca55cb622bc0bee49b979ae4e8639fff6ca3ff44f9f854",
"sha256:d2686f91611f9e17f4548dbf050e75b079bbc2a82be565832bc8ea9047b61c8c",
"sha256:d7fc3fca01da18fbabe4625d64bb612b533533ed10045a2ac3dd194bfa656b60",
"sha256:dd5653e67b149503c68c4018bf07e42eeed6b4e956b24c00ccdf93ac79cdff84",
"sha256:de5695a6f1d8340b12a5d6d4484290ee74d61e467c39ff03b39e30df62cf83a0",
"sha256:e0ac8959c929593fee38da1c2b64ee9778733cdf03c482c9ff1d508b6b593b2b",
"sha256:e1b25e3ad6c909f398df8921780d6a3d120d8c09466720226fc621605b6f92b1",
"sha256:e633940f28c1e913615fd624fcdd72fdba807bf53ea6925d6a588e84e1151531",
"sha256:e89df2958e5159b811af9ff0f92614dabf4ff617c03a4c1c6ff53bf1c399e0e1",
"sha256:ea9f9c6034ea2d93d9147818f17c2a0860d41b71c38b9ce4d55f21b6f9165a11",
"sha256:f645caaf0008bacf349875a974220f1f1da349c5dbe7c4ec93048cdc785a3326",
"sha256:f8303414c7b03f794347ad062c0516cee0e15f7a612abd0ce1e25caf6ceb47df",
"sha256:fca62a8301b605b954ad2e9c3666f9d97f63872aa4efcae5492baca2056b74ab"
],
"markers": "python_version >= '3'",
"version": "==2.0.12"
"markers": "python_full_version >= '3.7.0'",
"version": "==3.1.0"
},
"idna": {
"hashes": [
"sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4",
"sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"
],
"markers": "python_version >= '3'",
"markers": "python_version >= '3.5'",
"version": "==3.4"
},
"requests": {
"hashes": [
"sha256:6c1246513ecd5ecd4528a0906f910e8f0f9c6b8ec72030dc9fd154dc1a6efd24",
"sha256:b8aa58f8cf793ffd8782d3d8cb19e66ef36f7aba4353eec859e74678b01b07a7"
"sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f",
"sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"
],
"index": "pypi",
"version": "==2.26.0"
"version": "==2.31.0"
},
"urllib3": {
"hashes": [
"sha256:47cc05d99aaa09c9e72ed5809b60e7ba354e64b59c9c173ac3018642d8bb41fc",
"sha256:c083dd0dce68dbfbe1129d5271cb90f9447dea7d52097c6e0126120c521ddea8"
"sha256:61717a1095d7e155cdb737ac7bb2f4324a858a1e2e6466f6d03ff630ca68d3cc",
"sha256:d055c2f9d38dc53c808f6fdc8eab7360b6fdbbde02340ed25cfbcd817c62469e"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'",
"version": "==1.26.13"
"markers": "python_version >= '3.7'",
"version": "==2.0.2"
}
},
"develop": {}

Some files were not shown because too many files have changed in this diff.