Compare commits


93 Commits

Author SHA1 Message Date
Henry Mercer
130884e4e1 Merge pull request #1675 from shaikhul/remove-consts
Remove MismatchedBranches check from code scanning workflow validation
2023-05-11 15:45:33 +01:00
Shaikhul Islam
a0755a79b6 Update CHANGELOG.md
Co-authored-by: Henry Mercer <henry.mercer@me.com>
2023-05-11 10:22:57 -04:00
Shaikhul Islam
903cb278c5 recompile src 2023-05-11 14:16:34 +00:00
Shaikhul Islam
e5fdcd4a8f Apply suggestions from code review
Co-authored-by: Henry Mercer <henry.mercer@me.com>
2023-05-11 09:29:25 -04:00
Shaikhul Islam
c26fc558ba revert MissingPushHook checks changes 2023-05-10 20:37:56 +00:00
Shaikhul Islam
f8707c9939 update changelog 2023-05-10 15:01:33 +00:00
Shaikhul Islam
699855c048 fix linter issue 2023-05-09 15:05:36 +00:00
Shaikhul Islam
edb138ff88 remove consts MismatchedBranches and MissingPushHook 2023-05-09 14:39:49 +00:00
Andrew Eisenberg
95cfca769b Merge pull request #1673 from github/dependabot/github_actions/peter-evans/create-pull-request-5.0.1
Bump peter-evans/create-pull-request from 5.0.0 to 5.0.1
2023-05-08 12:25:32 -07:00
dependabot[bot]
9c51a58355 Bump peter-evans/create-pull-request from 5.0.0 to 5.0.1
Bumps [peter-evans/create-pull-request](https://github.com/peter-evans/create-pull-request) from 5.0.0 to 5.0.1.
- [Release notes](https://github.com/peter-evans/create-pull-request/releases)
- [Commits](5b4a9f6a9e...284f54f989)

---
updated-dependencies:
- dependency-name: peter-evans/create-pull-request
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-05-08 18:00:47 +00:00
Aditya Sharad
deb312c60b Merge pull request #1672 from github/aeisenberg/sarif-again
Fix broken regex
2023-05-05 12:53:12 -07:00
Andrew Eisenberg
9824588133 Fix broken regex
`(?i)` is not valid in JavaScript regexes.
2023-05-05 12:02:19 -07:00
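For context on the fix above: JavaScript's `RegExp` does not support inline flag groups such as `(?i)`, which some other regex dialects use for case-insensitive matching. A standalone TypeScript sketch (the pattern below is made up for illustration, not the one from the PR):

```typescript
// Inline flag groups like (?i) raise a SyntaxError in JavaScript/TypeScript regexes.
try {
  new RegExp("(?i)codeql-bundle-.*"); // SyntaxError: Invalid group
} catch (e) {
  console.log(`Rejected as expected: ${(e as Error).message}`);
}

// The supported way to get case-insensitive matching is the "i" flag.
const pattern: RegExp = /codeql-bundle-.*/i;
console.log(pattern.test("CodeQL-Bundle-20230428")); // true
```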
Andrew Eisenberg
11fba50273 Merge pull request #1668 from github/aeisenberg/update-sarif-schema 2023-05-05 09:14:24 -07:00
Andrew Eisenberg
684c4b5c77 Update CHANGELOG.md
Co-authored-by: Aditya Sharad <6874315+adityasharad@users.noreply.github.com>
2023-05-05 08:41:11 -07:00
Dave Bartolomeo
1e1aca8165 Merge pull request #1670 from github/mergeback/v2.3.3-to-main-29b1f65c
Mergeback v2.3.3 refs/heads/releases/v2 into main
2023-05-04 15:27:32 -04:00
github-actions[bot]
898fba281b Update checked-in dependencies 2023-05-04 19:02:16 +00:00
github-actions[bot]
913b8b11ad Update changelog and version after v2.3.3 2023-05-04 18:53:44 +00:00
Dave Bartolomeo
29b1f65c5e Merge pull request #1669 from github/update-v2.3.3-318bcc7f8
Merge main into releases/v2
2023-05-04 14:52:14 -04:00
github-actions[bot]
140500d80a Update changelog for v2.3.3 2023-05-04 18:24:50 +00:00
Dave Bartolomeo
318bcc7f84 Merge pull request #1664 from github/update-bundle/codeql-bundle-20230428
Update default bundle to 2.13.1
2023-05-04 00:32:01 -04:00
Dave Bartolomeo
f72bf5dfb3 Fix workflow formatting 2023-05-03 21:43:47 -04:00
Dave Bartolomeo
33461954a5 Merge branch 'main' into update-bundle/codeql-bundle-20230428 2023-05-03 19:02:27 -04:00
Andrew Eisenberg
3df80238a3 Re-run sync.py with new ruamel.yaml 2023-05-02 15:19:05 -07:00
Andrew Eisenberg
ef88842204 Update jsonschema version
Fixes bug in `uniqueItems` property.
2023-05-02 14:26:17 -07:00
Andrew Eisenberg
ece3cbc8ec Update changelog 2023-05-02 13:52:28 -07:00
Andrew Eisenberg
febbadf751 Update the sarif schema file
The version we were using is quite old. Copied the latest from
123e95847b/Schemata/sarif-schema-2.1.0.json

I do not think the sarif spec will be changing any more without
an explicit version update, so this is fine for now.
2023-05-02 13:46:24 -07:00
Andrew Eisenberg
8ca5570701 Merge pull request #1666 from github/aeisenberg/readme-update
Add link to changenote for custom config
2023-05-01 15:07:45 -07:00
Andrew Eisenberg
b1b3d00b62 Add link to changenote for custom config
Also, use a better link in the readme.
2023-05-01 11:06:31 -07:00
Andrew Eisenberg
d2f6dfd52d Merge pull request #1665 from github/aeisenberg/config-param
Add new configuration Parameter
2023-05-01 10:26:31 -07:00
Andrew Eisenberg
cba5616040 Update CHANGELOG.md 2023-05-01 09:21:50 -07:00
github-actions[bot]
40c95932fe Add changelog note 2023-05-01 03:46:54 +00:00
github-actions[bot]
234badad23 Update default bundle to codeql-bundle-20230428 2023-05-01 03:46:50 +00:00
Andrew Eisenberg
824d18c689 Merge remote-tracking branch 'upstream/main' into issue-1589-config-param 2023-04-28 11:34:52 -07:00
Angela P Wen
f31a31c052 Merge pull request #1663 from github/mergeback/v2.3.2-to-main-f3feb00a
Mergeback v2.3.2 refs/heads/releases/v2 into main
2023-04-27 14:00:39 -07:00
github-actions[bot]
e3395de200 Update checked-in dependencies 2023-04-27 18:52:55 +00:00
github-actions[bot]
1cccbfcedc Update changelog and version after v2.3.2 2023-04-27 18:51:28 +00:00
Angela P Wen
f3feb00acb Merge pull request #1662 from github/update-v2.3.2-8b12d99ee
Merge main into releases/v2
2023-04-27 11:49:50 -07:00
github-actions[bot]
1c9e206df3 Update changelog for v2.3.2 2023-04-27 18:18:58 +00:00
Angela P Wen
8b12d99ee5 Fix bug where run attempt was reported as run ID (#1661) 2023-04-27 18:05:34 +00:00
Angela P Wen
dcf71cf79b Merge pull request #1660 from github/mergeback/v2.3.1-to-main-8662eabe
Mergeback v2.3.1 refs/heads/releases/v2 into main
2023-04-26 14:15:40 -07:00
github-actions[bot]
194450bdd6 Update checked-in dependencies 2023-04-26 20:48:31 +00:00
github-actions[bot]
e78ef455a8 Update changelog and version after v2.3.1 2023-04-26 20:44:18 +00:00
Angela P Wen
8662eabe0e Merge pull request #1659 from github/update-v2.3.1-da583b07a
* Update changelog and version after v2.3.0

* Update checked-in dependencies

* Throw full error for CLI bundle download (#1657)

* Add `workflow_run_attempt` to analysis upload (#1658)

* Refactor status report upload logic

Previously we had duplicated the logic to check `GITHUB_RUN_ID`. We now call the `getWorkflowRunID()` method when uploading the status report, and move the run attempt logic to `getWorkflowRunAttempt()`.

* Add `workflow_run_attempt` to analysis payload

* Stop allowing `undefined` run IDs and attempts

Because we already throw an error if the ID or attempt aren't numbers, we don't have to allow `undefined` values into the payload.

* Update changelog for v2.3.1

---------

Co-authored-by: github-actions[bot] <github-actions@github.com>
Co-authored-by: Chuan-kai Lin <cklin@github.com>
Co-authored-by: Angela P Wen <angelapwen@github.com>
2023-04-26 13:42:37 -07:00
github-actions[bot]
1f2f707d99 Update changelog for v2.3.1 2023-04-26 20:16:15 +00:00
Angela P Wen
da583b07a7 Add workflow_run_attempt to analysis upload (#1658)
* Refactor status report upload logic

Previously we had duplicated the logic to check `GITHUB_RUN_ID`. We now call the `getWorkflowRunID()` method when uploading the status report, and move the run attempt logic to `getWorkflowRunAttempt()`.

* Add `workflow_run_attempt` to analysis payload

* Stop allowing `undefined` run IDs and attempts

Because we already throw an error if the ID or attempt aren't numbers, we don't have to allow `undefined` values into the payload.
2023-04-26 02:13:27 +00:00
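A minimal TypeScript sketch of the behaviour described above. The helper names come from the commit message (and the compiled diff to `lib/actions-util.js` later in this comparison); the bodies and error messages here are illustrative, not the Action's actual implementation:

```typescript
// Sketch: parse a required numeric value from the Actions environment,
// throwing instead of falling back to a sentinel such as -1.
function parseRequiredRunNumber(name: "GITHUB_RUN_ID" | "GITHUB_RUN_ATTEMPT"): number {
  const raw = process.env[name];
  const value = raw ? parseInt(raw, 10) : NaN;
  if (Number.isNaN(value)) {
    throw new Error(`${name} must be a number, but was "${raw}".`);
  }
  return value;
}

export function getWorkflowRunID(): number {
  return parseRequiredRunNumber("GITHUB_RUN_ID");
}

export function getWorkflowRunAttempt(): number {
  return parseRequiredRunNumber("GITHUB_RUN_ATTEMPT");
}

// Because both helpers throw on bad input, the status report and analysis
// payloads can use plain `number` fields rather than `number | undefined`.
```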
Angela P Wen
a9648ea7c6 Throw full error for CLI bundle download (#1657) 2023-04-24 07:46:45 -07:00
Chuan-kai Lin
c5f3f016ae Merge pull request #1656 from github/mergeback/v2.3.0-to-main-b2c19fb9
Mergeback v2.3.0 refs/heads/releases/v2 into main
2023-04-21 12:43:38 -07:00
github-actions[bot]
90f053271e Update checked-in dependencies 2023-04-21 19:12:19 +00:00
github-actions[bot]
0f085f964c Update changelog and version after v2.3.0 2023-04-21 19:09:10 +00:00
Chuan-kai Lin
b2c19fb9a2 Merge pull request #1655 from github/update-v2.3.0-a8affb063
Merge main into releases/v2
2023-04-21 12:07:18 -07:00
github-actions[bot]
b203f98343 Update changelog for v2.3.0 2023-04-21 18:24:50 +00:00
Chuan-kai Lin
a8affb0639 Merge pull request #1649 from github/cklin/codeql-cli-2.13.0
Update default CodeQL bundle version to 2.13.0
2023-04-20 07:39:38 -07:00
Henry Mercer
b8cc643a23 Merge branch 'main' into cklin/codeql-cli-2.13.0 2023-04-20 11:23:25 +01:00
Henry Mercer
7019a9c6fd Merge pull request #1618 from github/henrymercer/remove-legacy-tracing
Remove legacy tracing
2023-04-20 11:22:32 +01:00
Henry Mercer
66f62df188 Merge branch 'main' into henrymercer/remove-legacy-tracing 2023-04-19 15:56:42 +01:00
Henry Mercer
afdf30f311 Merge pull request #1652 from github/henrymercer/fix-bundle-version
Fix the `bundleVersion` field set by the automated bundle update PR
2023-04-18 21:04:26 +01:00
Henry Mercer
55a2e70992 Autoformat index.ts 2023-04-18 18:59:36 +01:00
Henry Mercer
1c2f282107 Fix bundle version
It's the whole tag; we don't want to remove the `codeql-bundle-` prefix.
2023-04-18 18:59:09 +01:00
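In concrete terms, the fix keeps the full release tag rather than the stripped suffix. A small TypeScript illustration with a hand-written release object (the real change is visible in the `update-bundle` script diff later in this comparison):

```typescript
const CODEQL_BUNDLE_PREFIX = "codeql-bundle-";
const release = { tag_name: "codeql-bundle-20230428" }; // example tag, not API output

// Before the fix: the prefix was stripped, producing only the date suffix.
const oldBundleVersion = release.tag_name.substring(CODEQL_BUNDLE_PREFIX.length);
console.log(oldBundleVersion); // "20230428"

// After the fix: the whole tag is used as the bundle version.
const newBundleVersion = release.tag_name;
console.log(newBundleVersion); // "codeql-bundle-20230428"
```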
tgrall
47cec7ab01 add test with config file and input together 2023-04-18 06:01:33 +02:00
tgrall
7b876ae4f4 remove space from json string 2023-04-18 05:46:51 +02:00
tgrall
f398a65921 fix after review from @henrymercer 2023-04-18 05:43:21 +02:00
dependabot[bot]
9a866ed452 Bump swift-actions/setup-swift in /.github/actions/setup-swift (#1650)
Bumps [swift-actions/setup-swift](https://github.com/swift-actions/setup-swift) from 1.22.0 to 1.23.0.
- [Release notes](https://github.com/swift-actions/setup-swift/releases)
- [Commits](da0e3e04b5...65540b95f5)

---
updated-dependencies:
- dependency-name: swift-actions/setup-swift
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-04-17 19:16:10 +00:00
Chuan-kai Lin
7867d03591 Update default CodeQL bundle version to 2.13.0 2023-04-14 15:28:21 -07:00
Chuan-kai Lin
be2b53b5c7 Merge pull request #1648 from github/cklin/update-bundle-trigger
Fix pre-release trigger for update-bundle action
2023-04-14 15:11:42 -07:00
Chuan-kai Lin
ae24b75fca Fix pre-release trigger for update-bundle action
This PR switches the update-bundle release trigger from `prereleased` to `published` because the former has been documented not to work.

From https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#release:

> Note: The prereleased type will not trigger for pre-releases published from draft releases, but the published type will trigger. If you want a workflow to run when stable and pre-releases publish, subscribe to published instead of released and prereleased.
2023-04-14 14:50:37 -07:00
Henry Mercer
8a093aa1a5 Merge branch 'main' into henrymercer/remove-legacy-tracing 2023-04-11 12:25:45 +01:00
tgrall
fc374f5e9a remove the documentation about workflow parameters 2023-04-10 07:59:26 +02:00
tgrall
b4e6f81a72 resolve comments from @henrymercer 2023-04-10 07:56:09 +02:00
Tugdual Grall
0b75f471b1 Update CHANGELOG.md : during PR Review
Co-authored-by: Henry Mercer <henry.mercer@me.com>
2023-04-10 07:38:51 +02:00
Tugdual Grall
c9f360d9af Update README.md : during PR Review
Co-authored-by: Henry Mercer <henry.mercer@me.com>
2023-04-10 07:38:31 +02:00
Tugdual Grall
d2950c11f0 Update README.md : during PR Review
Co-authored-by: Henry Mercer <henry.mercer@me.com>
2023-04-10 07:37:52 +02:00
Tugdual Grall
a9fb7d923c Update init/action.yml : PR review
Co-authored-by: Henry Mercer <henry.mercer@me.com>
2023-04-10 07:37:20 +02:00
Tugdual Grall
696504dcab Accept change in PR
Co-authored-by: Henry Mercer <henry.mercer@me.com>
2023-04-10 07:35:57 +02:00
Tugdual Grall
18f13455eb Merge branch 'main' into issue-1589-config-param 2023-04-10 07:33:09 +02:00
Henry Mercer
2058418de9 Don't expect Swift baseline info on Windows 2023-04-05 20:41:23 +01:00
Henry Mercer
5da64f56c0 Set up Swift in unset environment workflow 2023-04-05 20:27:02 +01:00
Henry Mercer
322cea6439 Set up Swift in local bundle workflow 2023-04-05 19:31:20 +01:00
Henry Mercer
f7a67e4341 Merge branch 'main' into henrymercer/remove-legacy-tracing 2023-04-05 18:39:27 +01:00
tgrall
fe4a785361 rename new parameter from configuration to config 2023-04-01 07:13:01 +02:00
Henry Mercer
d838bacfbe Simplify matrix 2023-03-29 15:48:13 +01:00
Henry Mercer
72d018e267 Improve serialization of Swift environment variable if expression 2023-03-29 13:15:59 +01:00
Henry Mercer
9975b733f4 Fix bundle version comments 2023-03-29 13:03:45 +01:00
Henry Mercer
6cd5121600 Merge branch 'main' into henrymercer/remove-legacy-tracing 2023-03-29 13:03:14 +01:00
Henry Mercer
6ef37003ca Update CodeQL releases used in PR checks 2023-03-28 20:07:09 +01:00
Henry Mercer
d13d683355 Bump minor version number and add changelog note 2023-03-28 18:53:47 +01:00
Henry Mercer
d8fe76e161 Delete legacy tracing 2023-03-28 18:53:43 +01:00
Henry Mercer
4772c1d99f Bump minimum version to 2.8.5 2023-03-28 17:24:45 +01:00
Tugdual Grall
34231cfd52 fix CI failure - check js 2023-03-18 16:51:49 +00:00
Tugdual Grall
f1fb80a041 Update README.md 2023-03-18 16:17:44 +01:00
Tugdual Grall
f81f52702f Update README.md - typo 2023-03-18 16:17:16 +01:00
Tugdual Grall
2f141340f0 fix linter issues 2023-03-18 15:14:34 +00:00
Tugdual Grall
94786b354b update changelog 2023-03-18 13:48:45 +00:00
Tugdual Grall
ee44252240 - Add new configuration Parameter
- Write test to check it is read from configuration
- Update documentation
2023-03-18 13:40:54 +00:00
107 changed files with 1973 additions and 2349 deletions

View File

@@ -1,18 +1,18 @@
name: "Set up Swift"
description: Performs necessary steps to set up appropriate Swift version.
description: Sets up an appropriate Swift version if Swift is enabled via CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT.
inputs:
codeql-path:
description: Path to the CodeQL CLI executable.
required: true
runs:
using: "composite"
steps:
- name: Get Swift version
id: get_swift_version
# We don't support Swift on Windows or prior versions of CLI.
if: "(runner.os != 'Windows') && (matrix.version == 'cached' || matrix.version == 'latest' || matrix.version == 'nightly-latest')"
if: env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true'
shell: bash
env:
CODEQL_PATH: ${{inputs.codeql-path}}
CODEQL_PATH: ${{ inputs.codeql-path }}
run: |
if [ $RUNNER_OS = "macOS" ]; then
PLATFORM="osx64"
@@ -26,7 +26,7 @@ runs:
VERSION="5.7.0"
fi
echo "version=$VERSION" | tee -a $GITHUB_OUTPUT
- uses: swift-actions/setup-swift@da0e3e04b5e3e15dbc3861bd835ad9f0afe56296 # Please update the corresponding SHA in the CLI's CodeQL Action Integration Test.
if: "(runner.os != 'Windows') && (matrix.version == 'cached' || matrix.version == 'latest' || matrix.version == 'nightly-latest')"
- uses: swift-actions/setup-swift@65540b95f51493d65f5e59e97dcef9629ddf11bf # Please update the corresponding SHA in the CLI's CodeQL Action Integration Test.
if: env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true'
with:
swift-version: "${{steps.get_swift_version.outputs.version}}"
swift-version: "${{ steps.get_swift_version.outputs.version }}"

View File

@@ -13,57 +13,55 @@ interface Defaults {
priorCliVersion: string;
}
const CODEQL_BUNDLE_PREFIX = 'codeql-bundle-';
function getCodeQLCliVersionForRelease(release): string {
// We do not currently tag CodeQL bundles based on the CLI version they contain.
// Instead, we use a marker file `cli-version-<version>.txt` to record the CLI version.
// This marker file is uploaded as a release asset for all new CodeQL bundles.
const cliVersionsFromMarkerFiles = release.assets
.map((asset) => asset.name.match(/cli-version-(.*)\.txt/)?.[1])
.filter((v) => v)
.map((v) => v as string);
.map((asset) => asset.name.match(/cli-version-(.*)\.txt/)?.[1])
.filter((v) => v)
.map((v) => v as string);
if (cliVersionsFromMarkerFiles.length > 1) {
throw new Error(
`Release ${release.tag_name} has multiple CLI version marker files.`
);
} else if (cliVersionsFromMarkerFiles.length === 0) {
throw new Error(
`Failed to find the CodeQL CLI version for release ${release.tag_name}.`
);
}
return cliVersionsFromMarkerFiles[0];
}
);
} else if (cliVersionsFromMarkerFiles.length === 0) {
throw new Error(
`Failed to find the CodeQL CLI version for release ${release.tag_name}.`
);
}
return cliVersionsFromMarkerFiles[0];
}
async function getBundleInfoFromRelease(release): Promise<BundleInfo> {
return {
bundleVersion: release.tag_name.substring(CODEQL_BUNDLE_PREFIX.length),
cliVersion: getCodeQLCliVersionForRelease(release)
};
}
async function getBundleInfoFromRelease(release): Promise<BundleInfo> {
return {
bundleVersion: release.tag_name,
cliVersion: getCodeQLCliVersionForRelease(release)
};
}
async function getNewDefaults(currentDefaults: Defaults): Promise<Defaults> {
const release = github.context.payload.release;
console.log('Updating default bundle as a result of the following release: ' +
`${JSON.stringify(release)}.`)
async function getNewDefaults(currentDefaults: Defaults): Promise<Defaults> {
const release = github.context.payload.release;
console.log('Updating default bundle as a result of the following release: ' +
`${JSON.stringify(release)}.`)
const bundleInfo = await getBundleInfoFromRelease(release);
return {
bundleVersion: bundleInfo.bundleVersion,
cliVersion: bundleInfo.cliVersion,
priorBundleVersion: currentDefaults.bundleVersion,
priorCliVersion: currentDefaults.cliVersion
};
}
const bundleInfo = await getBundleInfoFromRelease(release);
return {
bundleVersion: bundleInfo.bundleVersion,
cliVersion: bundleInfo.cliVersion,
priorBundleVersion: currentDefaults.bundleVersion,
priorCliVersion: currentDefaults.cliVersion
};
}
async function main() {
const previousDefaults: Defaults = JSON.parse(fs.readFileSync('../../../src/defaults.json', 'utf8'));
const newDefaults = await getNewDefaults(previousDefaults);
// Update the source file in the repository. Calling workflows should subsequently rebuild
// the Action to update `lib/defaults.json`.
fs.writeFileSync('../../../src/defaults.json', JSON.stringify(newDefaults, null, 2) + "\n");
}
async function main() {
const previousDefaults: Defaults = JSON.parse(fs.readFileSync('../../../src/defaults.json', 'utf8'));
const newDefaults = await getNewDefaults(previousDefaults);
// Update the source file in the repository. Calling workflows should subsequently rebuild
// the Action to update `lib/defaults.json`.
fs.writeFileSync('../../../src/defaults.json', JSON.stringify(newDefaults, null, 2) + "\n");
}
// Ideally, we'd await main() here, but that doesn't work well with `ts-node`.
// So instead we rely on the fact that Node won't exit until the event loop is empty.
main();
// Ideally, we'd await main() here, but that doesn't work well with `ts-node`.
// So instead we rely on the fact that Node won't exit until the event loop is empty.
main();
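The marker-file lookup shown in the diff above is small enough to exercise on its own. A runnable TypeScript sketch, with a hand-written release object standing in for the GitHub API response (the filter is condensed into a type guard; otherwise the logic mirrors the diff):

```typescript
interface ReleaseAsset { name: string }
interface Release { tag_name: string; assets: ReleaseAsset[] }

// Each bundle release uploads a `cli-version-<version>.txt` marker asset; the
// CLI version is recovered by matching that file name.
function getCodeQLCliVersionForRelease(release: Release): string {
  const cliVersionsFromMarkerFiles = release.assets
    .map((asset) => asset.name.match(/cli-version-(.*)\.txt/)?.[1])
    .filter((v): v is string => Boolean(v));
  if (cliVersionsFromMarkerFiles.length > 1) {
    throw new Error(`Release ${release.tag_name} has multiple CLI version marker files.`);
  } else if (cliVersionsFromMarkerFiles.length === 0) {
    throw new Error(`Failed to find the CodeQL CLI version for release ${release.tag_name}.`);
  }
  return cliVersionsFromMarkerFiles[0];
}

console.log(getCodeQLCliVersionForRelease({
  tag_name: "codeql-bundle-20230428",
  assets: [{ name: "cli-version-2.13.1.txt" }, { name: "codeql-bundle-linux64.tar.gz" }],
})); // "2.13.1"
```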

View File

@@ -25,24 +25,30 @@ jobs:
strategy:
matrix:
include:
- os: ubuntu-20.04
version: stable-20211005
- os: macos-latest
version: stable-20211005
- os: windows-2019
version: stable-20211005
- os: ubuntu-20.04
version: stable-20220120
- os: macos-latest
version: stable-20220120
- os: windows-2019
version: stable-20220120
- os: ubuntu-latest
version: stable-20220401
- os: macos-latest
version: stable-20220401
- os: windows-latest
version: stable-20220401
- os: ubuntu-latest
version: stable-20220615
- os: macos-latest
version: stable-20220615
- os: windows-latest
version: stable-20220615
- os: ubuntu-latest
version: stable-20220908
- os: macos-latest
version: stable-20220908
- os: windows-latest
version: stable-20220908
- os: ubuntu-latest
version: stable-20221211
- os: macos-latest
version: stable-20221211
- os: windows-latest
version: stable-20221211
- os: ubuntu-latest
version: cached
- os: macos-latest
@@ -72,11 +78,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set up Go
if: matrix.os == 'ubuntu-20.04' || matrix.os == 'windows-2019'
uses: actions/setup-go@v4
with:
go-version: ^1.13.1
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
tools: ${{ steps.prepare-test.outputs.tools-url }}

View File

@@ -42,6 +42,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
languages: csharp

View File

@@ -48,6 +48,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
languages: javascript

View File

@@ -54,6 +54,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
id: init
with:

View File

@@ -42,6 +42,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
id: init
with:
@@ -70,7 +81,10 @@ jobs:
shell: bash
run: |
cd "$RUNNER_TEMP/results"
expected_baseline_languages="cpp cs go java js py rb swift"
expected_baseline_languages="cpp cs go java js py rb"
if [[ $RUNNER_OS != "Windows" ]]; then
expected_baseline_languages+=" swift"
fi
for lang in ${expected_baseline_languages}; do
rule_name="${lang}/baseline/expected-extracted-files"
@@ -84,5 +98,4 @@ jobs:
fi
done
env:
CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT: true # Remove when Swift is GA.
CODEQL_ACTION_TEST_MODE: true

View File

@@ -38,6 +38,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
languages: java

View File

@@ -25,24 +25,30 @@ jobs:
strategy:
matrix:
include:
- os: ubuntu-20.04
version: stable-20211005
- os: macos-latest
version: stable-20211005
- os: windows-2019
version: stable-20211005
- os: ubuntu-20.04
version: stable-20220120
- os: macos-latest
version: stable-20220120
- os: windows-2019
version: stable-20220120
- os: ubuntu-latest
version: stable-20220401
- os: macos-latest
version: stable-20220401
- os: windows-latest
version: stable-20220401
- os: ubuntu-latest
version: stable-20220615
- os: macos-latest
version: stable-20220615
- os: windows-latest
version: stable-20220615
- os: ubuntu-latest
version: stable-20220908
- os: macos-latest
version: stable-20220908
- os: windows-latest
version: stable-20220908
- os: ubuntu-latest
version: stable-20221211
- os: macos-latest
version: stable-20221211
- os: windows-latest
version: stable-20221211
- os: ubuntu-latest
version: cached
- os: macos-latest
@@ -72,11 +78,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set up Go
if: matrix.os == 'ubuntu-20.04' || matrix.os == 'windows-2019'
uses: actions/setup-go@v4
with:
go-version: ^1.13.1
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
languages: go

View File

@@ -25,18 +25,22 @@ jobs:
strategy:
matrix:
include:
- os: ubuntu-20.04
version: stable-20211005
- os: macos-latest
version: stable-20211005
- os: ubuntu-20.04
version: stable-20220120
- os: macos-latest
version: stable-20220120
- os: ubuntu-latest
version: stable-20220401
- os: macos-latest
version: stable-20220401
- os: ubuntu-latest
version: stable-20220615
- os: macos-latest
version: stable-20220615
- os: ubuntu-latest
version: stable-20220908
- os: macos-latest
version: stable-20220908
- os: ubuntu-latest
version: stable-20221211
- os: macos-latest
version: stable-20221211
- os: ubuntu-latest
version: cached
- os: macos-latest
@@ -60,11 +64,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set up Go
if: matrix.os == 'ubuntu-20.04' || matrix.os == 'windows-2019'
uses: actions/setup-go@v4
with:
go-version: ^1.13.1
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
languages: go

View File

@@ -25,18 +25,22 @@ jobs:
strategy:
matrix:
include:
- os: ubuntu-20.04
version: stable-20211005
- os: macos-latest
version: stable-20211005
- os: ubuntu-20.04
version: stable-20220120
- os: macos-latest
version: stable-20220120
- os: ubuntu-latest
version: stable-20220401
- os: macos-latest
version: stable-20220401
- os: ubuntu-latest
version: stable-20220615
- os: macos-latest
version: stable-20220615
- os: ubuntu-latest
version: stable-20220908
- os: macos-latest
version: stable-20220908
- os: ubuntu-latest
version: stable-20221211
- os: macos-latest
version: stable-20221211
- os: ubuntu-latest
version: cached
- os: macos-latest
@@ -60,11 +64,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set up Go
if: matrix.os == 'ubuntu-20.04' || matrix.os == 'windows-2019'
uses: actions/setup-go@v4
with:
go-version: ^1.13.1
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
languages: go

View File

@@ -25,18 +25,22 @@ jobs:
strategy:
matrix:
include:
- os: ubuntu-20.04
version: stable-20211005
- os: macos-latest
version: stable-20211005
- os: ubuntu-20.04
version: stable-20220120
- os: macos-latest
version: stable-20220120
- os: ubuntu-latest
version: stable-20220401
- os: macos-latest
version: stable-20220401
- os: ubuntu-latest
version: stable-20220615
- os: macos-latest
version: stable-20220615
- os: ubuntu-latest
version: stable-20220908
- os: macos-latest
version: stable-20220908
- os: ubuntu-latest
version: stable-20221211
- os: macos-latest
version: stable-20221211
- os: ubuntu-latest
version: cached
- os: macos-latest
@@ -60,11 +64,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set up Go
if: matrix.os == 'ubuntu-20.04' || matrix.os == 'windows-2019'
uses: actions/setup-go@v4
with:
go-version: ^1.13.1
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
languages: go

View File

@@ -54,6 +54,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- name: Init with registries
uses: ./../action/init
with:

View File

@@ -42,6 +42,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- name: Move codeql-action
shell: bash
run: |

View File

@@ -25,12 +25,30 @@ jobs:
strategy:
matrix:
include:
- os: ubuntu-20.04
version: stable-20220120
- os: ubuntu-latest
version: stable-20220401
- os: macos-latest
version: stable-20220120
- os: windows-2019
version: stable-20220120
version: stable-20220401
- os: windows-latest
version: stable-20220401
- os: ubuntu-latest
version: stable-20220615
- os: macos-latest
version: stable-20220615
- os: windows-latest
version: stable-20220615
- os: ubuntu-latest
version: stable-20220908
- os: macos-latest
version: stable-20220908
- os: windows-latest
version: stable-20220908
- os: ubuntu-latest
version: stable-20221211
- os: macos-latest
version: stable-20221211
- os: windows-latest
version: stable-20221211
- os: ubuntu-latest
version: cached
- os: macos-latest
@@ -60,11 +78,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set up Go
if: matrix.os == 'ubuntu-20.04' || matrix.os == 'windows-2019'
uses: actions/setup-go@v4
with:
go-version: ^1.13.1
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
languages: javascript
@@ -87,16 +111,17 @@ jobs:
- name: Check sarif
uses: ./../action/.github/actions/check-sarif
# Running on Windows requires CodeQL CLI 2.9.0+.
if: "!(matrix.version == 'stable-20220120' && runner.os == 'Windows')"
if: "!(matrix.version == 'stable-20220401' && runner.os == 'Windows')"
with:
sarif-file: ${{ runner.temp }}/results/javascript.sarif
queries-run: js/ml-powered/nosql-injection,js/ml-powered/path-injection,js/ml-powered/sql-injection,js/ml-powered/xss
queries-run:
js/ml-powered/nosql-injection,js/ml-powered/path-injection,js/ml-powered/sql-injection,js/ml-powered/xss
queries-not-run: foo,bar
- name: Check results
env:
# Running on Windows requires CodeQL CLI 2.9.0+.
SHOULD_RUN_ML_POWERED_QUERIES: ${{ !(matrix.version == 'stable-20220120' &&
SHOULD_RUN_ML_POWERED_QUERIES: ${{ !(matrix.version == 'stable-20220401' &&
runner.os == 'Windows') }}
shell: bash
run: |

View File

@@ -25,18 +25,22 @@ jobs:
strategy:
matrix:
include:
- os: ubuntu-20.04
version: stable-20211005
- os: macos-latest
version: stable-20211005
- os: ubuntu-20.04
version: stable-20220120
- os: macos-latest
version: stable-20220120
- os: ubuntu-latest
version: stable-20220401
- os: macos-latest
version: stable-20220401
- os: ubuntu-latest
version: stable-20220615
- os: macos-latest
version: stable-20220615
- os: ubuntu-latest
version: stable-20220908
- os: macos-latest
version: stable-20220908
- os: ubuntu-latest
version: stable-20221211
- os: macos-latest
version: stable-20221211
- os: ubuntu-latest
version: cached
- os: macos-latest
@@ -60,11 +64,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set up Go
if: matrix.os == 'ubuntu-20.04' || matrix.os == 'windows-2019'
uses: actions/setup-go@v4
with:
go-version: ^1.13.1
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
id: init
with:
@@ -73,7 +83,7 @@ jobs:
- uses: ./../action/.github/actions/setup-swift
with:
codeql-path: ${{steps.init.outputs.codeql-path}}
codeql-path: ${{ steps.init.outputs.codeql-path }}
- name: Build code
shell: bash
@@ -119,8 +129,7 @@ jobs:
fi
- name: Check language autodetect for Ruby
if: (matrix.version == 'cached' || matrix.version == 'latest' || matrix.version
== 'nightly-latest')
if: env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true'
shell: bash
run: |
RUBY_DB=${{ fromJson(steps.analysis.outputs.db-locations).ruby }}
@@ -130,8 +139,7 @@ jobs:
fi
- name: Check language autodetect for Swift
if: (matrix.version == 'cached' || matrix.version == 'latest' || matrix.version
== 'nightly-latest')
if: env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true'
shell: bash
run: |
SWIFT_DB=${{ fromJson(steps.analysis.outputs.db-locations).swift }}
@@ -140,5 +148,4 @@ jobs:
exit 1
fi
env:
CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT: 'true' # Remove when Swift is GA.
CODEQL_ACTION_TEST_MODE: true

View File

@@ -54,6 +54,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
config-file: .github/codeql/codeql-config-packaging3.yml
@@ -72,7 +83,8 @@ jobs:
uses: ./../action/.github/actions/check-sarif
with:
sarif-file: ${{ runner.temp }}/results/javascript.sarif
queries-run: javascript/example/empty-or-one-block,javascript/example/empty-or-one-block,javascript/example/other-query-block,javascript/example/two-block
queries-run:
javascript/example/empty-or-one-block,javascript/example/empty-or-one-block,javascript/example/other-query-block,javascript/example/two-block
queries-not-run: foo,bar
- name: Assert Results

View File

@@ -54,6 +54,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
config-file: .github/codeql/codeql-config-packaging3.yml
@@ -72,7 +83,8 @@ jobs:
uses: ./../action/.github/actions/check-sarif
with:
sarif-file: ${{ runner.temp }}/results/javascript.sarif
queries-run: javascript/example/empty-or-one-block,javascript/example/empty-or-one-block,javascript/example/other-query-block,javascript/example/two-block
queries-run:
javascript/example/empty-or-one-block,javascript/example/empty-or-one-block,javascript/example/other-query-block,javascript/example/two-block
queries-not-run: foo,bar
- name: Assert Results

View File

@@ -54,6 +54,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
config-file: .github/codeql/codeql-config-packaging.yml
@@ -71,7 +82,8 @@ jobs:
uses: ./../action/.github/actions/check-sarif
with:
sarif-file: ${{ runner.temp }}/results/javascript.sarif
queries-run: javascript/example/empty-or-one-block,javascript/example/empty-or-one-block,javascript/example/other-query-block,javascript/example/two-block
queries-run:
javascript/example/empty-or-one-block,javascript/example/empty-or-one-block,javascript/example/other-query-block,javascript/example/two-block
queries-not-run: foo,bar
- name: Assert Results

View File

@@ -54,6 +54,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
config-file: .github/codeql/codeql-config-packaging2.yml
@@ -71,7 +82,8 @@ jobs:
uses: ./../action/.github/actions/check-sarif
with:
sarif-file: ${{ runner.temp }}/results/javascript.sarif
queries-run: javascript/example/empty-or-one-block,javascript/example/empty-or-one-block,javascript/example/other-query-block,javascript/example/two-block
queries-run:
javascript/example/empty-or-one-block,javascript/example/empty-or-one-block,javascript/example/other-query-block,javascript/example/two-block
queries-not-run: foo,bar
- name: Assert Results

View File

@@ -25,24 +25,30 @@ jobs:
strategy:
matrix:
include:
- os: ubuntu-20.04
version: stable-20211005
- os: macos-latest
version: stable-20211005
- os: windows-2019
version: stable-20211005
- os: ubuntu-20.04
version: stable-20220120
- os: macos-latest
version: stable-20220120
- os: windows-2019
version: stable-20220120
- os: ubuntu-latest
version: stable-20220401
- os: macos-latest
version: stable-20220401
- os: windows-latest
version: stable-20220401
- os: ubuntu-latest
version: stable-20220615
- os: macos-latest
version: stable-20220615
- os: windows-latest
version: stable-20220615
- os: ubuntu-latest
version: stable-20220908
- os: macos-latest
version: stable-20220908
- os: windows-latest
version: stable-20220908
- os: ubuntu-latest
version: stable-20221211
- os: macos-latest
version: stable-20221211
- os: windows-latest
version: stable-20221211
- os: ubuntu-latest
version: cached
- os: macos-latest
@@ -72,11 +78,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set up Go
if: matrix.os == 'ubuntu-20.04' || matrix.os == 'windows-2019'
uses: actions/setup-go@v4
with:
go-version: ^1.13.1
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
tools: ${{ steps.prepare-test.outputs.tools-url }}

View File

@@ -38,6 +38,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- name: Set up Ruby
uses: ruby/setup-ruby@v1
with:

.github/workflows/__ruby.yml generated vendored
View File

@@ -48,6 +48,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
languages: ruby

View File

@@ -48,6 +48,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
config-file: .github/codeql/codeql-config-packaging3.yml

View File

@@ -42,6 +42,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: actions/checkout@v3
- uses: ./init
with:

View File

@@ -48,6 +48,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
id: init
with:
@@ -75,6 +86,5 @@ jobs:
exit 1
fi
env:
CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT: 'true' # Remove when Swift is GA.
DOTNET_GENERATE_ASPNET_CERTIFICATE: 'false'
CODEQL_ACTION_TEST_MODE: true

View File

@@ -38,6 +38,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- name: Test setup
shell: bash
run: |

View File

@@ -38,15 +38,30 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- name: Fetch a CodeQL bundle
shell: bash
env:
CODEQL_URL: ${{ steps.prepare-test.outputs.tools-url }}
run: |
wget "$CODEQL_URL"
- uses: ./../action/init
- id: init
uses: ./../action/init
with:
tools: ./codeql-bundle.tar.gz
- uses: ./../action/.github/actions/setup-swift
with:
codeql-path: ${{ steps.init.outputs.codeql-path }}
- name: Build code
shell: bash
run: ./build.sh

.github/workflows/__test-proxy.yml generated vendored
View File

@@ -38,6 +38,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
languages: javascript

View File

@@ -25,12 +25,14 @@ jobs:
strategy:
matrix:
include:
- os: ubuntu-20.04
version: stable-20211005
- os: ubuntu-20.04
version: stable-20220120
- os: ubuntu-latest
version: stable-20220401
- os: ubuntu-latest
version: stable-20220615
- os: ubuntu-latest
version: stable-20220908
- os: ubuntu-latest
version: stable-20221211
- os: ubuntu-latest
version: cached
- os: ubuntu-latest
@@ -48,15 +50,25 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set up Go
if: matrix.os == 'ubuntu-20.04' || matrix.os == 'windows-2019'
uses: actions/setup-go@v4
with:
go-version: ^1.13.1
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
id: init
with:
db-location: ${{ runner.temp }}/customDbLocation
tools: ${{ steps.prepare-test.outputs.tools-url }}
- uses: ./../action/.github/actions/setup-swift
with:
codeql-path: ${{ steps.init.outputs.codeql-path }}
- name: Build code
shell: bash
# Disable Kotlin analysis while it's incompatible with Kotlin 1.8, until we find a

View File

@@ -25,24 +25,30 @@ jobs:
strategy:
matrix:
include:
- os: ubuntu-20.04
version: stable-20211005
- os: macos-latest
version: stable-20211005
- os: windows-2019
version: stable-20211005
- os: ubuntu-20.04
version: stable-20220120
- os: macos-latest
version: stable-20220120
- os: windows-2019
version: stable-20220120
- os: ubuntu-latest
version: stable-20220401
- os: macos-latest
version: stable-20220401
- os: windows-latest
version: stable-20220401
- os: ubuntu-latest
version: stable-20220615
- os: macos-latest
version: stable-20220615
- os: windows-latest
version: stable-20220615
- os: ubuntu-latest
version: stable-20220908
- os: macos-latest
version: stable-20220908
- os: windows-latest
version: stable-20220908
- os: ubuntu-latest
version: stable-20221211
- os: macos-latest
version: stable-20221211
- os: windows-latest
version: stable-20221211
- os: ubuntu-latest
version: cached
- os: macos-latest
@@ -72,11 +78,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set up Go
if: matrix.os == 'ubuntu-20.04' || matrix.os == 'windows-2019'
uses: actions/setup-go@v4
with:
go-version: ^1.13.1
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: ./../action/init
with:
tools: ${{ steps.prepare-test.outputs.tools-url }}

View File

@@ -25,24 +25,30 @@ jobs:
strategy:
matrix:
include:
- os: ubuntu-20.04
version: stable-20211005
- os: macos-latest
version: stable-20211005
- os: windows-2019
version: stable-20211005
- os: ubuntu-20.04
version: stable-20220120
- os: macos-latest
version: stable-20220120
- os: windows-2019
version: stable-20220120
- os: ubuntu-latest
version: stable-20220401
- os: macos-latest
version: stable-20220401
- os: windows-latest
version: stable-20220401
- os: ubuntu-latest
version: stable-20220615
- os: macos-latest
version: stable-20220615
- os: windows-latest
version: stable-20220615
- os: ubuntu-latest
version: stable-20220908
- os: macos-latest
version: stable-20220908
- os: windows-latest
version: stable-20220908
- os: ubuntu-latest
version: stable-20221211
- os: macos-latest
version: stable-20221211
- os: windows-latest
version: stable-20221211
- os: ubuntu-latest
version: cached
- os: macos-latest
@@ -72,11 +78,17 @@ jobs:
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
- name: Set up Go
if: matrix.os == 'ubuntu-20.04' || matrix.os == 'windows-2019'
uses: actions/setup-go@v4
with:
go-version: ^1.13.1
- name: Set environment variable for Swift enablement
if: >-
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
shell: bash
run: echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV
- uses: actions/checkout@v3
with:
ref: 474bbf07f9247ffe1856c6a0f94aeeb10e7afee6

View File

@@ -21,31 +21,17 @@ jobs:
upload-artifacts:
strategy:
matrix:
include:
- os: ubuntu-20.04
version: stable-20211005
- os: macos-latest
version: stable-20211005
- os: ubuntu-20.04
version: stable-20220120
- os: macos-latest
version: stable-20220120
- os: ubuntu-latest
version: stable-20220401
- os: macos-latest
version: stable-20220401
- os: ubuntu-latest
version: cached
- os: macos-latest
version: cached
- os: ubuntu-latest
version: latest
- os: macos-latest
version: latest
- os: ubuntu-latest
version: nightly-latest
- os: macos-latest
version: nightly-latest
os:
- ubuntu-latest
- macos-latest
version:
- stable-20220401
- stable-20220615
- stable-20220908
- stable-20221211
- cached
- latest
- nightly-latest
name: Upload debug artifacts
env:
CODEQL_ACTION_TEST_MODE: true
@@ -84,17 +70,10 @@ jobs:
- name: Check expected artifacts exist
shell: bash
run: |
VERSIONS="stable-20211005 stable-20220120 stable-20220401 cached latest nightly-latest"
VERSIONS="stable-20220401 stable-20220615 stable-20220908 stable-20221211 cached latest nightly-latest"
LANGUAGES="cpp csharp go java javascript python"
for version in $VERSIONS; do
if [[ "$version" =~ stable-(20211005|20220120|20210809) ]]; then
# Note the absence of the period in "ubuntu-2004": this is present in the image name
# but not the artifact name
OPERATING_SYSTEMS="ubuntu-2004 macos-latest"
else
OPERATING_SYSTEMS="ubuntu-latest macos-latest"
fi
for os in $OPERATING_SYSTEMS; do
for os in ubuntu-latest macos-latest; do
pushd "./my-debug-artifacts-$os-$version"
echo "Artifacts from version $version on $os:"
for language in $LANGUAGES; do

View File

@@ -2,11 +2,20 @@ name: Update default CodeQL bundle
on:
release:
types: [prereleased]
# From https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#release
# Note: The prereleased type will not trigger for pre-releases published
# from draft releases, but the published type will trigger. If you want a
# workflow to run when stable and pre-releases publish, subscribe to
# published instead of released and prereleased.
#
# From https://github.com/orgs/community/discussions/26281
# As a work around, in published type workflow, you could add if condition
# to filter pre-release attribute.
types: [published]
jobs:
update-bundle:
if: startsWith(github.event.release.tag_name, 'codeql-bundle-')
if: github.event.release.prerelease && startsWith(github.event.release.tag_name, 'codeql-bundle-')
runs-on: ubuntu-latest
steps:
- name: Dump environment

View File

@@ -36,7 +36,7 @@ jobs:
env:
ENTERPRISE_RELEASES_PATH: ${{ github.workspace }}/enterprise-releases/
- name: Commit Changes
uses: peter-evans/create-pull-request@5b4a9f6a9e2af26e5f02351490b90d01eb8ec1e5 # v5.0.0
uses: peter-evans/create-pull-request@284f54f989303d2699d373481a0cfa13ad5a6666 # v5.0.1
with:
commit-message: Update supported GitHub Enterprise Server versions.
title: Update supported GitHub Enterprise Server versions.

View File

@@ -2,8 +2,27 @@
## [UNRELEASED]
- Remove the requirement for `on.push` and `on.pull_request` to trigger on the same branches. [#1675](https://github.com/github/codeql-action/pull/1675)
## 2.3.3 - 04 May 2023
- Update default CodeQL bundle version to 2.13.1. [#1664](https://github.com/github/codeql-action/pull/1664)
- You can now configure CodeQL within your code scanning workflow by passing a `config` input to the `init` Action. See [Using a custom configuration file](https://aka.ms/code-scanning-docs/config-file) for more information about configuring code scanning. [#1590](https://github.com/github/codeql-action/pull/1590)
- Updated the SARIF 2.1.0 JSON schema file to the latest from [oasis-tcs/sarif-spec](https://github.com/oasis-tcs/sarif-spec/blob/123e95847b13fbdd4cbe2120fa5e33355d4a042b/Schemata/sarif-schema-2.1.0.json). [#1668](https://github.com/github/codeql-action/pull/1668)
## 2.3.2 - 27 Apr 2023
No user facing changes.
## 2.3.1 - 26 Apr 2023
No user facing changes.
## 2.3.0 - 21 Apr 2023
- Update default CodeQL bundle version to 2.13.0. [#1649](https://github.com/github/codeql-action/pull/1649)
- Bump the minimum CodeQL bundle version to 2.8.5. [#1618](https://github.com/github/codeql-action/pull/1618)
## 2.2.12 - 13 Apr 2023
- Include the value of the `GITHUB_RUN_ATTEMPT` environment variable in the telemetry sent to GitHub. [#1640](https://github.com/github/codeql-action/pull/1640)

View File

@@ -135,7 +135,38 @@ By default, this will override any queries specified in a config file. If you wi
queries: +<local-or-remote-query>,<another-query>
```
### Configuration via `config` input
You can alternatively configure CodeQL using the `config` input to the `init` Action. The value of this input must be a YAML string that follows the configuration file format documented at "[Using a custom configuration file](https://aka.ms/code-scanning-docs/config-file)."
#### Example configuration
```yaml
- uses: github/codeql-action/init@v2
with:
languages: ${{ matrix.language }}
config: |
disable-default-queries: true
queries:
- uses: security-extended
- uses: security-and-quality
query-filters:
- include:
tags: /cwe-020/
```
#### Sharing configuration across multiple repositories
You can use Actions or environment variables to share configuration across multiple repositories and to modify configuration without needing to edit the workflow file. In the following example, `vars.CODEQL_CONF` is an [Actions configuration variable](https://docs.github.com/en/actions/learn-github-actions/variables#defining-configuration-variables-for-multiple-workflows):
```yaml
- uses: github/codeql-action/init@v2
with:
languages: ${{ matrix.language }}
config: ${{ vars.CODEQL_CONF }}
```
## Troubleshooting
Read about [troubleshooting code scanning](https://help.github.com/en/github/finding-security-vulnerabilities-and-errors-in-your-code/troubleshooting-code-scanning).

View File

@@ -44,6 +44,9 @@ inputs:
db-location:
description: Path where CodeQL databases should be created. If not specified, a temporary directory will be used.
required: false
config:
description: Configuration passed as a YAML string in the same format as the config-file input. This takes precedence over the config-file input.
required: false
queries:
description: Comma-separated list of additional queries to run. By default, this overrides the same setting in a configuration file; prefix with "+" to use both sets of queries.
required: false

lib/actions-util.js generated
View File

@@ -290,16 +290,8 @@ exports.getActionVersion = getActionVersion;
async function createStatusReportBase(actionName, status, actionStartedAt, cause, exception) {
const commitOid = (0, exports.getOptionalInput)("sha") || process.env["GITHUB_SHA"] || "";
const ref = await getRef();
const workflowRunIDStr = process.env["GITHUB_RUN_ID"];
let workflowRunID = -1;
if (workflowRunIDStr) {
workflowRunID = parseInt(workflowRunIDStr, 10);
}
const workflowRunAttemptStr = process.env["GITHUB_RUN_ATTEMPT"];
let workflowRunAttempt = -1;
if (workflowRunAttemptStr) {
workflowRunAttempt = parseInt(workflowRunAttemptStr, 10);
}
const workflowRunID = (0, workflow_1.getWorkflowRunID)();
const workflowRunAttempt = (0, workflow_1.getWorkflowRunAttempt)();
const workflowName = process.env["GITHUB_WORKFLOW"] || "";
const jobName = process.env["GITHUB_JOB"] || "";
const analysis_key = await getAnalysisKey();

File diff suppressed because one or more lines are too long

lib/analyze-action.js generated
View File

@@ -155,7 +155,6 @@ async function run() {
if (hasBadExpectErrorInput()) {
throw new Error("`expect-error` input parameter is for internal use only. It should only be set by codeql-action or a fork.");
}
await (0, codeql_1.enrichEnvironment)(await (0, codeql_1.getCodeQL)(config.codeQLCmd));
const apiDetails = (0, api_client_1.getApiDetails)();
const outputDir = actionsUtil.getRequiredInput("output");
const threads = util.getThreadsFlag(actionsUtil.getOptionalInput("threads") || process.env["CODEQL_THREADS"], logger);

lib/analyze.js generated

@@ -37,7 +37,6 @@ const analysisPaths = __importStar(require("./analysis-paths"));
const codeql_1 = require("./codeql");
const configUtils = __importStar(require("./config-utils"));
const languages_1 = require("./languages");
const sharedEnv = __importStar(require("./shared-environment"));
const tracer_config_1 = require("./tracer-config");
const util = __importStar(require("./util"));
class CodeQLAnalysisError extends Error {
@@ -283,20 +282,13 @@ async function runFinalize(outputDir, threadsFlag, memoryFlag, config, logger) {
}
await fs.promises.mkdir(outputDir, { recursive: true });
const timings = await finalizeDatabaseCreation(config, threadsFlag, memoryFlag, logger);
const codeql = await (0, codeql_1.getCodeQL)(config.codeQLCmd);
// WARNING: This does not _really_ end tracing, as the tracer will restore its
// critical environment variables and it'll still be active for all processes
// launched from this build step.
// However, it will stop tracing for all steps past the codeql-action/analyze
// step.
if (await util.codeQlVersionAbove(codeql, codeql_1.CODEQL_VERSION_NEW_TRACING)) {
// Delete variables as specified by the end-tracing script
await (0, tracer_config_1.endTracingForCluster)(config);
}
else {
// Delete the tracer config env var to avoid tracing ourselves
delete process.env[sharedEnv.ODASA_TRACER_CONFIGURATION];
}
// Delete variables as specified by the end-tracing script
await (0, tracer_config_1.endTracingForCluster)(config);
return timings;
}
exports.runFinalize = runFinalize;

lib/codeql.js generated

@@ -23,10 +23,9 @@ var __importStar = (this && this.__importStar) || function (mod) {
return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.enrichEnvironment = exports.getExtraOptions = exports.getCodeQLForCmd = exports.getCodeQLForTesting = exports.getCachedCodeQL = exports.setCodeQL = exports.getCodeQL = exports.setupCodeQL = exports.CODEQL_VERSION_INIT_WITH_QLCONFIG = exports.CODEQL_VERSION_SECURITY_EXPERIMENTAL_SUITE = exports.CODEQL_VERSION_BETTER_RESOLVE_LANGUAGES = exports.CODEQL_VERSION_ML_POWERED_QUERIES_WINDOWS = exports.CODEQL_VERSION_TRACING_GLIBC_2_34 = exports.CODEQL_VERSION_NEW_TRACING = exports.CODEQL_VERSION_GHES_PACK_DOWNLOAD = exports.CommandInvocationError = void 0;
exports.getExtraOptions = exports.getCodeQLForCmd = exports.getCodeQLForTesting = exports.getCachedCodeQL = exports.setCodeQL = exports.getCodeQL = exports.setupCodeQL = exports.CODEQL_VERSION_INIT_WITH_QLCONFIG = exports.CODEQL_VERSION_SECURITY_EXPERIMENTAL_SUITE = exports.CODEQL_VERSION_BETTER_RESOLVE_LANGUAGES = exports.CODEQL_VERSION_ML_POWERED_QUERIES_WINDOWS = exports.CODEQL_VERSION_GHES_PACK_DOWNLOAD = exports.CommandInvocationError = void 0;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const core = __importStar(require("@actions/core"));
const toolrunner = __importStar(require("@actions/exec/lib/toolrunner"));
const yaml = __importStar(require("js-yaml"));
const actions_util_1 = require("./actions-util");
@@ -35,7 +34,6 @@ const error_matcher_1 = require("./error-matcher");
const feature_flags_1 = require("./feature-flags");
const languages_1 = require("./languages");
const setupCodeql = __importStar(require("./setup-codeql"));
const shared_environment_1 = require("./shared-environment");
const toolrunner_error_catcher_1 = require("./toolrunner-error-catcher");
const trap_caching_1 = require("./trap-caching");
const util = __importStar(require("./util"));
@@ -62,7 +60,7 @@ let cachedCodeQL = undefined;
* The version flags below can be used to conditionally enable certain features
* on versions newer than this.
*/
const CODEQL_MINIMUM_VERSION = "2.6.3";
const CODEQL_MINIMUM_VERSION = "2.8.5";
/**
* Versions of CodeQL that version-flag certain functionality in the Action.
* For convenience, please keep these in descending order. Once a version
@@ -73,21 +71,6 @@ const CODEQL_VERSION_LUA_TRACER_CONFIG = "2.10.0";
const CODEQL_VERSION_LUA_TRACING_GO_WINDOWS_FIXED = "2.10.4";
exports.CODEQL_VERSION_GHES_PACK_DOWNLOAD = "2.10.4";
const CODEQL_VERSION_FILE_BASELINE_INFORMATION = "2.11.3";
/**
* This variable controls using the new style of tracing from the CodeQL
* CLI. In particular, with versions above this we will use both indirect
* tracing, and multi-language tracing together with database clusters.
*
* Note that there were bugs in both of these features that were fixed in
* release 2.7.0 of the CodeQL CLI, therefore this flag is only enabled for
* versions above that.
*/
exports.CODEQL_VERSION_NEW_TRACING = "2.7.0";
/**
* Versions 2.7.3+ of the CodeQL CLI support build tracing with glibc 2.34 on Linux. Versions before
* this cannot perform build tracing when running on the Actions `ubuntu-22.04` runner image.
*/
exports.CODEQL_VERSION_TRACING_GLIBC_2_34 = "2.7.3";
/**
* Versions 2.9.0+ of the CodeQL CLI run machine learning models from a temporary directory, which
* resolves an issue on Windows where TensorFlow models are not correctly loaded due to the path of
@@ -139,8 +122,7 @@ async function setupCodeQL(toolsInput, apiDetails, tempDir, variant, defaultCliV
};
}
catch (e) {
logger.error((0, util_1.wrapError)(e).message);
throw new Error("Unable to download and extract CodeQL CLI");
throw new Error(`Unable to download and extract CodeQL CLI: ${(0, util_1.wrapError)(e).message}`);
}
}
exports.setupCodeQL = setupCodeQL;
@@ -177,8 +159,6 @@ function setCodeQL(partialCodeql) {
getPath: resolveFunction(partialCodeql, "getPath", () => "/tmp/dummy-path"),
getVersion: resolveFunction(partialCodeql, "getVersion", () => new Promise((resolve) => resolve("1.0.0"))),
printVersion: resolveFunction(partialCodeql, "printVersion"),
getTracerEnv: resolveFunction(partialCodeql, "getTracerEnv"),
databaseInit: resolveFunction(partialCodeql, "databaseInit"),
databaseInitCluster: resolveFunction(partialCodeql, "databaseInitCluster"),
runAutobuild: resolveFunction(partialCodeql, "runAutobuild"),
extractScannedLanguage: resolveFunction(partialCodeql, "extractScannedLanguage"),
@@ -245,73 +225,6 @@ async function getCodeQLForCmd(cmd, checkVersion) {
async printVersion() {
await runTool(cmd, ["version", "--format=json"]);
},
async getTracerEnv(databasePath) {
// Write tracer-env.js to a temp location.
// BEWARE: The name and location of this file is recognized by `codeql database
// trace-command` in order to enable special support for concatenable tracer
// configurations. Consequently the name must not be changed.
// (This warning can be removed once a different way to recognize the
// action/runner has been implemented in `codeql database trace-command`
// _and_ is present in the latest supported CLI release.)
const tracerEnvJs = path.resolve(databasePath, "working", "tracer-env.js");
fs.mkdirSync(path.dirname(tracerEnvJs), { recursive: true });
fs.writeFileSync(tracerEnvJs, `
const fs = require('fs');
const env = {};
for (let entry of Object.entries(process.env)) {
const key = entry[0];
const value = entry[1];
if (typeof value !== 'undefined' && key !== '_' && !key.startsWith('JAVA_MAIN_CLASS_')) {
env[key] = value;
}
}
process.stdout.write(process.argv[2]);
fs.writeFileSync(process.argv[2], JSON.stringify(env), 'utf-8');`);
// BEWARE: The name and location of this file is recognized by `codeql database
// trace-command` in order to enable special support for concatenable tracer
// configurations. Consequently the name must not be changed.
// (This warning can be removed once a different way to recognize the
// action/runner has been implemented in `codeql database trace-command`
// _and_ is present in the latest supported CLI release.)
const envFile = path.resolve(databasePath, "working", "env.tmp");
try {
await runTool(cmd, [
"database",
"trace-command",
databasePath,
...getExtraOptionsFromEnv(["database", "trace-command"]),
process.execPath,
tracerEnvJs,
envFile,
]);
}
catch (e) {
if (e instanceof CommandInvocationError &&
e.output.includes("undefined symbol: __libc_dlopen_mode, version GLIBC_PRIVATE") &&
process.platform === "linux" &&
!(await util.codeQlVersionAbove(this, exports.CODEQL_VERSION_TRACING_GLIBC_2_34))) {
throw new util.UserError("The CodeQL CLI is incompatible with the version of glibc on your system. " +
`Please upgrade to CodeQL CLI version ${exports.CODEQL_VERSION_TRACING_GLIBC_2_34} or ` +
"later. If you cannot upgrade to a newer version of the CodeQL CLI, you can " +
`alternatively run your workflow on another runner image such as "ubuntu-20.04" ` +
"that has glibc 2.33 or earlier installed.");
}
else {
throw e;
}
}
return JSON.parse(fs.readFileSync(envFile, "utf-8"));
},
async databaseInit(databasePath, language, sourceRoot) {
await runTool(cmd, [
"database",
"init",
databasePath,
`--language=${language}`,
`--source-root=${sourceRoot}`,
...getExtraOptionsFromEnv(["database", "init"]),
]);
},
async databaseInitCluster(config, sourceRoot, processName, features, qlconfigFile, logger) {
const extraArgs = config.languages.map((language) => `--language=${language}`);
if (config.languages.filter((l) => (0, languages_1.isTracedLanguage)(l)).length > 0) {
@@ -853,19 +766,4 @@ async function getCodeScanningConfigExportArguments(config, codeql, features) {
}
return [];
}
/**
* Enrich the environment variables with further flags that we cannot
* know the value of until we know what version of CodeQL we're running.
*/
async function enrichEnvironment(codeql) {
if (await util.codeQlVersionAbove(codeql, exports.CODEQL_VERSION_NEW_TRACING)) {
core.exportVariable(shared_environment_1.EnvVar.FEATURE_MULTI_LANGUAGE, "false");
core.exportVariable(shared_environment_1.EnvVar.FEATURE_SANDWICH, "false");
}
else {
core.exportVariable(shared_environment_1.EnvVar.FEATURE_MULTI_LANGUAGE, "true");
core.exportVariable(shared_environment_1.EnvVar.FEATURE_SANDWICH, "true");
}
}
exports.enrichEnvironment = enrichEnvironment;
//# sourceMappingURL=codeql.js.map

lib/config-utils.js generated

@@ -932,8 +932,17 @@ function dbLocationOrDefault(dbLocation, tempDir) {
* This will parse the config from the user input if present, or generate
* a default config. The parsed config is then stored to a known location.
*/
async function initConfig(languagesInput, queriesInput, packsInput, registriesInput, configFile, dbLocation, trapCachingEnabled, debugMode, debugArtifactName, debugDatabaseName, repository, tempDir, codeQL, workspacePath, gitHubVersion, apiDetails, features, logger) {
async function initConfig(languagesInput, queriesInput, packsInput, registriesInput, configFile, dbLocation, configInput, trapCachingEnabled, debugMode, debugArtifactName, debugDatabaseName, repository, tempDir, codeQL, workspacePath, gitHubVersion, apiDetails, features, logger) {
let config;
// if configInput is set, it takes precedence over configFile
if (configInput) {
if (configFile) {
logger.warning(`Both a config file and config input were provided. Ignoring config file.`);
}
configFile = path.resolve(workspacePath, "user-config-from-action.yml");
fs.writeFileSync(configFile, configInput);
logger.debug(`Using config from action input: ${configFile}`);
}
// If no config file was provided create an empty one
if (!configFile) {
logger.debug("No configuration file was provided");

lib/config-utils.test.js generated

@@ -102,8 +102,8 @@ function mockListLanguages(languages) {
return { packs: [] };
},
});
const config = await configUtils.initConfig(languages, undefined, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), logger);
t.deepEqual(config, await configUtils.getDefaultConfig(languages, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), logger));
const config = await configUtils.initConfig(languages, undefined, undefined, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), logger);
t.deepEqual(config, await configUtils.getDefaultConfig(languages, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), logger));
});
});
(0, ava_1.default)("loading config saves config", async (t) => {
@@ -128,7 +128,7 @@ function mockListLanguages(languages) {
t.false(fs.existsSync(configUtils.getPathToParsedConfigFile(tmpDir)));
// Sanity check that getConfig returns undefined before we have called initConfig
t.deepEqual(await configUtils.getConfig(tmpDir, logger), undefined);
const config1 = await configUtils.initConfig("javascript,python", undefined, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), logger);
const config1 = await configUtils.initConfig("javascript,python", undefined, undefined, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), logger);
// The saved config file should now exist
t.true(fs.existsSync(configUtils.getPathToParsedConfigFile(tmpDir)));
// And that same newly-initialised config should now be returned by getConfig
@@ -144,7 +144,7 @@ function mockListLanguages(languages) {
(0, ava_1.default)("load input outside of workspace", async (t) => {
return await util.withTmpDir(async (tmpDir) => {
try {
await configUtils.initConfig(undefined, undefined, undefined, undefined, "../input", undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, (0, codeql_1.getCachedCodeQL)(), tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
await configUtils.initConfig(undefined, undefined, undefined, undefined, "../input", undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, (0, codeql_1.getCachedCodeQL)(), tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
throw new Error("initConfig did not throw error");
}
catch (err) {
@@ -157,7 +157,7 @@ function mockListLanguages(languages) {
// no filename given, just a repo
const configFile = "octo-org/codeql-config@main";
try {
await configUtils.initConfig(undefined, undefined, undefined, undefined, configFile, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, (0, codeql_1.getCachedCodeQL)(), tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
await configUtils.initConfig(undefined, undefined, undefined, undefined, configFile, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, (0, codeql_1.getCachedCodeQL)(), tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
throw new Error("initConfig did not throw error");
}
catch (err) {
@@ -171,7 +171,7 @@ function mockListLanguages(languages) {
const configFile = "input";
t.false(fs.existsSync(path.join(tmpDir, configFile)));
try {
await configUtils.initConfig(languages, undefined, undefined, undefined, configFile, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, (0, codeql_1.getCachedCodeQL)(), tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
await configUtils.initConfig(languages, undefined, undefined, undefined, configFile, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, (0, codeql_1.getCachedCodeQL)(), tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
throw new Error("initConfig did not throw error");
}
catch (err) {
@@ -247,7 +247,7 @@ function mockListLanguages(languages) {
};
const languages = "javascript";
const configFilePath = createConfigFile(inputFileContents, tmpDir);
const actualConfig = await configUtils.initConfig(languages, undefined, undefined, undefined, configFilePath, undefined, false, false, "my-artifact", "my-db", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
const actualConfig = await configUtils.initConfig(languages, undefined, undefined, undefined, configFilePath, undefined, undefined, false, false, "my-artifact", "my-db", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
// Should exactly equal the object we constructed earlier
t.deepEqual(actualConfig, expectedConfig);
});
@@ -286,7 +286,7 @@ function mockListLanguages(languages) {
fs.mkdirSync(path.join(tmpDir, "foo"));
const languages = "javascript";
const configFilePath = createConfigFile(inputFileContents, tmpDir);
await configUtils.initConfig(languages, undefined, undefined, undefined, configFilePath, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
await configUtils.initConfig(languages, undefined, undefined, undefined, configFilePath, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
// Check resolve queries was called correctly
t.deepEqual(resolveQueriesArgs.length, 1);
t.deepEqual(resolveQueriesArgs[0].queries, [
@@ -332,7 +332,7 @@ function queriesToResolvedQueryForm(queries) {
},
});
const languages = "javascript";
const config = await configUtils.initConfig(languages, undefined, undefined, undefined, configFilePath, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
const config = await configUtils.initConfig(languages, undefined, undefined, undefined, configFilePath, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
// Check resolveQueries was called correctly
// It'll be called once for the default queries
// and once for `./foo` from the config file.
@@ -368,7 +368,7 @@ function queriesToResolvedQueryForm(queries) {
},
});
const languages = "javascript";
const config = await configUtils.initConfig(languages, testQueries, undefined, undefined, configFilePath, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
const config = await configUtils.initConfig(languages, testQueries, undefined, undefined, configFilePath, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
// Check resolveQueries was called correctly
// It'll be called once for the default queries and once for `./override`,
// but won't be called for './foo' from the config file.
@@ -403,7 +403,7 @@ function queriesToResolvedQueryForm(queries) {
},
});
const languages = "javascript";
const config = await configUtils.initConfig(languages, testQueries, undefined, undefined, configFilePath, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
const config = await configUtils.initConfig(languages, testQueries, undefined, undefined, configFilePath, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
// Check resolveQueries was called correctly
// It'll be called once for `./workflow-query`,
// but won't be called for the default one since that was disabled
@@ -432,7 +432,7 @@ function queriesToResolvedQueryForm(queries) {
},
});
const languages = "javascript";
const config = await configUtils.initConfig(languages, testQueries, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
const config = await configUtils.initConfig(languages, testQueries, undefined, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
// Check resolveQueries was called correctly:
// It'll be called once for the default queries,
// and then once for each of the two queries from the workflow
@@ -474,7 +474,7 @@ function queriesToResolvedQueryForm(queries) {
},
});
const languages = "javascript";
const config = await configUtils.initConfig(languages, testQueries, undefined, undefined, configFilePath, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
const config = await configUtils.initConfig(languages, testQueries, undefined, undefined, configFilePath, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
// Check resolveQueries was called correctly
// It'll be called once for the default queries,
// once for each of additional1 and additional2,
@@ -495,6 +495,97 @@ function queriesToResolvedQueryForm(queries) {
t.true(config.queries["javascript"].custom[2].queries[0].endsWith(`${path.sep}foo`));
});
});
(0, ava_1.default)("Queries can be specified using config input", async (t) => {
return await util.withTmpDir(async (tmpDir) => {
const configInput = `
name: my config
queries:
- uses: ./foo
packs:
javascript:
- a/b@1.2.3
python:
- c/d@1.2.3
`;
fs.mkdirSync(path.join(tmpDir, "foo"));
const resolveQueriesArgs = [];
const codeQL = (0, codeql_1.setCodeQL)({
async resolveQueries(queries, extraSearchPath) {
resolveQueriesArgs.push({ queries, extraSearchPath });
return queriesToResolvedQueryForm(queries);
},
async packDownload() {
return { packs: [] };
},
});
// Only JS is analyzed; the python packs will be ignored
const languages = "javascript";
const config = await configUtils.initConfig(languages, undefined, undefined, undefined, undefined, undefined, configInput, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
// Check resolveQueries was called correctly
// It'll be called once for the default queries
// and once for `./foo` from the config file.
t.deepEqual(resolveQueriesArgs.length, 2);
t.deepEqual(resolveQueriesArgs[1].queries.length, 1);
t.true(resolveQueriesArgs[1].queries[0].endsWith(`${path.sep}foo`));
t.deepEqual(config.packs, {
[languages_1.Language.javascript]: ["a/b@1.2.3"],
});
// Now check that the end result contains the default queries and the query from config
t.deepEqual(config.queries["javascript"].builtin.length, 1);
t.deepEqual(config.queries["javascript"].custom.length, 1);
t.true(config.queries["javascript"].builtin[0].endsWith("javascript-code-scanning.qls"));
t.true(config.queries["javascript"].custom[0].queries[0].endsWith(`${path.sep}foo`));
});
});
(0, ava_1.default)("Using config input and file together, config input should be used.", async (t) => {
return await util.withTmpDir(async (tmpDir) => {
process.env["RUNNER_TEMP"] = tmpDir;
process.env["GITHUB_WORKSPACE"] = tmpDir;
const inputFileContents = `
name: my config
queries:
- uses: ./foo_file`;
const configFilePath = createConfigFile(inputFileContents, tmpDir);
const configInput = `
name: my config
queries:
- uses: ./foo
packs:
javascript:
- a/b@1.2.3
python:
- c/d@1.2.3
`;
fs.mkdirSync(path.join(tmpDir, "foo"));
const resolveQueriesArgs = [];
const codeQL = (0, codeql_1.setCodeQL)({
async resolveQueries(queries, extraSearchPath) {
resolveQueriesArgs.push({ queries, extraSearchPath });
return queriesToResolvedQueryForm(queries);
},
async packDownload() {
return { packs: [] };
},
});
// Only JS is analyzed; the python packs will be ignored
const languages = "javascript";
const config = await configUtils.initConfig(languages, undefined, undefined, undefined, undefined, configFilePath, configInput, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
// Check resolveQueries was called correctly
// It'll be called once for the default queries
// and once for `./foo` from the config file.
t.deepEqual(resolveQueriesArgs.length, 2);
t.deepEqual(resolveQueriesArgs[1].queries.length, 1);
t.true(resolveQueriesArgs[1].queries[0].endsWith(`${path.sep}foo`));
t.deepEqual(config.packs, {
[languages_1.Language.javascript]: ["a/b@1.2.3"],
});
// Now check that the end result contains the default queries and the query from config
t.deepEqual(config.queries["javascript"].builtin.length, 1);
t.deepEqual(config.queries["javascript"].custom.length, 1);
t.true(config.queries["javascript"].builtin[0].endsWith("javascript-code-scanning.qls"));
t.true(config.queries["javascript"].custom[0].queries[0].endsWith(`${path.sep}foo`));
});
});
(0, ava_1.default)("Invalid queries in workflow file handled correctly", async (t) => {
return await util.withTmpDir(async (tmpDir) => {
const queries = "foo/bar@v1@v3";
@@ -516,7 +607,7 @@ function queriesToResolvedQueryForm(queries) {
},
});
try {
await configUtils.initConfig(languages, queries, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
await configUtils.initConfig(languages, queries, undefined, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
t.fail("initConfig did not throw error");
}
catch (err) {
@@ -562,7 +653,7 @@ function queriesToResolvedQueryForm(queries) {
fs.mkdirSync(path.join(tmpDir, "foo/bar/dev"), { recursive: true });
const configFile = "octo-org/codeql-config/config.yaml@main";
const languages = "javascript";
await configUtils.initConfig(languages, undefined, undefined, undefined, configFile, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
await configUtils.initConfig(languages, undefined, undefined, undefined, configFile, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
t.assert(spyGetContents.called);
});
});
@@ -572,7 +663,7 @@ function queriesToResolvedQueryForm(queries) {
mockGetContents(dummyResponse);
const repoReference = "octo-org/codeql-config/config.yaml@main";
try {
await configUtils.initConfig(undefined, undefined, undefined, undefined, repoReference, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, (0, codeql_1.getCachedCodeQL)(), tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
await configUtils.initConfig(undefined, undefined, undefined, undefined, repoReference, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, (0, codeql_1.getCachedCodeQL)(), tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
throw new Error("initConfig did not throw error");
}
catch (err) {
@@ -588,7 +679,7 @@ function queriesToResolvedQueryForm(queries) {
mockGetContents(dummyResponse);
const repoReference = "octo-org/codeql-config/config.yaml@main";
try {
await configUtils.initConfig(undefined, undefined, undefined, undefined, repoReference, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, (0, codeql_1.getCachedCodeQL)(), tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
await configUtils.initConfig(undefined, undefined, undefined, undefined, repoReference, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, (0, codeql_1.getCachedCodeQL)(), tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
throw new Error("initConfig did not throw error");
}
catch (err) {
@@ -608,7 +699,7 @@ function queriesToResolvedQueryForm(queries) {
},
});
try {
await configUtils.initConfig(undefined, undefined, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
await configUtils.initConfig(undefined, undefined, undefined, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
throw new Error("initConfig did not throw error");
}
catch (err) {
@@ -620,7 +711,7 @@ function queriesToResolvedQueryForm(queries) {
return await util.withTmpDir(async (tmpDir) => {
const languages = "rubbish,english";
try {
await configUtils.initConfig(languages, undefined, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, (0, codeql_1.getCachedCodeQL)(), tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
await configUtils.initConfig(languages, undefined, undefined, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, (0, codeql_1.getCachedCodeQL)(), tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
throw new Error("initConfig did not throw error");
}
catch (err) {
@@ -651,7 +742,7 @@ function queriesToResolvedQueryForm(queries) {
const configFile = path.join(tmpDir, "codeql-config.yaml");
fs.writeFileSync(configFile, inputFileContents);
const languages = "javascript";
const { packs } = await configUtils.initConfig(languages, undefined, undefined, undefined, configFile, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
const { packs } = await configUtils.initConfig(languages, undefined, undefined, undefined, configFile, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
t.deepEqual(packs, {
[languages_1.Language.javascript]: ["a/b@1.2.3"],
});
@@ -688,7 +779,7 @@ function queriesToResolvedQueryForm(queries) {
fs.writeFileSync(configFile, inputFileContents);
fs.mkdirSync(path.join(tmpDir, "foo"));
const languages = "javascript,python,cpp";
const { packs, queries } = await configUtils.initConfig(languages, undefined, undefined, undefined, configFile, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
const { packs, queries } = await configUtils.initConfig(languages, undefined, undefined, undefined, configFile, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
t.deepEqual(packs, {
[languages_1.Language.javascript]: ["a/b@1.2.3"],
[languages_1.Language.python]: ["c/d@1.2.3"],
@@ -734,7 +825,7 @@ function doInvalidInputTest(testName, inputFileContents, expectedErrorMessageGen
const inputFile = path.join(tmpDir, configFile);
fs.writeFileSync(inputFile, inputFileContents, "utf8");
try {
await configUtils.initConfig(languages, undefined, undefined, undefined, configFile, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
await configUtils.initConfig(languages, undefined, undefined, undefined, configFile, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)([]), (0, logging_1.getRunnerLogger)(true));
throw new Error("initConfig did not throw error");
}
catch (err) {
@@ -991,7 +1082,7 @@ const mlPoweredQueriesMacro = ava_1.default.macro({
return { packs: [] };
},
});
const { packs } = await configUtils.initConfig("javascript", queriesInput, packsInput, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example " }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)(isMlPoweredQueriesEnabled ? [feature_flags_1.Feature.MlPoweredQueriesEnabled] : []), (0, logging_1.getRunnerLogger)(true));
const { packs } = await configUtils.initConfig("javascript", queriesInput, packsInput, undefined, undefined, undefined, undefined, false, false, "", "", { owner: "github", repo: "example" }, tmpDir, codeQL, tmpDir, gitHubVersion, sampleApiDetails, (0, testing_utils_1.createFeatures)(isMlPoweredQueriesEnabled ? [feature_flags_1.Feature.MlPoweredQueriesEnabled] : []), (0, logging_1.getRunnerLogger)(true));
if (expectedVersionString !== undefined) {
t.deepEqual(packs, {
[languages_1.Language.javascript]: [

lib/debug-artifacts.js generated

@@ -74,7 +74,6 @@ async function uploadSarifDebugArtifact(config, outputDir) {
}
exports.uploadSarifDebugArtifact = uploadSarifDebugArtifact;
async function uploadLogsDebugArtifact(config) {
const codeql = await (0, codeql_1.getCodeQL)(config.codeQLCmd);
let toUpload = [];
for (const language of config.languages) {
const databaseDirectory = (0, util_1.getCodeQLDatabasePath)(config, language);
@@ -83,21 +82,12 @@ async function uploadLogsDebugArtifact(config) {
toUpload = toUpload.concat((0, util_1.listFolder)(logsDirectory));
}
}
if (await (0, util_1.codeQlVersionAbove)(codeql, codeql_1.CODEQL_VERSION_NEW_TRACING)) {
// Multilanguage tracing: there are additional logs in the root of the cluster
const multiLanguageTracingLogsDirectory = path.resolve(config.dbLocation, "log");
if ((0, util_1.doesDirectoryExist)(multiLanguageTracingLogsDirectory)) {
toUpload = toUpload.concat((0, util_1.listFolder)(multiLanguageTracingLogsDirectory));
}
// Multilanguage tracing: there are additional logs in the root of the cluster
const multiLanguageTracingLogsDirectory = path.resolve(config.dbLocation, "log");
if ((0, util_1.doesDirectoryExist)(multiLanguageTracingLogsDirectory)) {
toUpload = toUpload.concat((0, util_1.listFolder)(multiLanguageTracingLogsDirectory));
}
await uploadDebugArtifacts(toUpload, config.dbLocation, config.debugArtifactName);
// Before multi-language tracing, we wrote a compound-build-tracer.log in the temp dir
if (!(await (0, util_1.codeQlVersionAbove)(codeql, codeql_1.CODEQL_VERSION_NEW_TRACING))) {
const compoundBuildTracerLogDirectory = path.resolve(config.tempDir, "compound-build-tracer.log");
if ((0, util_1.doesDirectoryExist)(compoundBuildTracerLogDirectory)) {
await uploadDebugArtifacts([compoundBuildTracerLogDirectory], config.tempDir, config.debugArtifactName);
}
}
}
exports.uploadLogsDebugArtifact = uploadLogsDebugArtifact;
/**

Source map for lib/debug-artifacts.js regenerated (contents omitted)


@@ -1,6 +1,6 @@
{
"bundleVersion": "codeql-bundle-20230403",
"cliVersion": "2.12.6",
"priorBundleVersion": "codeql-bundle-20230317",
"priorCliVersion": "2.12.5"
"bundleVersion": "codeql-bundle-20230428",
"cliVersion": "2.13.1",
"priorBundleVersion": "codeql-bundle-20230414",
"priorCliVersion": "2.13.0"
}

lib/init-action.js generated

@@ -27,7 +27,6 @@ const path = __importStar(require("path"));
const core = __importStar(require("@actions/core"));
const actions_util_1 = require("./actions-util");
const api_client_1 = require("./api-client");
const codeql_1 = require("./codeql");
const feature_flags_1 = require("./feature-flags");
const init_1 = require("./init");
const languages_1 = require("./languages");
@@ -129,8 +128,7 @@ async function run() {
toolsDownloadDurationMs = initCodeQLResult.toolsDownloadDurationMs;
toolsVersion = initCodeQLResult.toolsVersion;
toolsSource = initCodeQLResult.toolsSource;
await (0, codeql_1.enrichEnvironment)(codeql);
config = await (0, init_1.initConfig)((0, actions_util_1.getOptionalInput)("languages"), (0, actions_util_1.getOptionalInput)("queries"), (0, actions_util_1.getOptionalInput)("packs"), registriesInput, (0, actions_util_1.getOptionalInput)("config-file"), (0, actions_util_1.getOptionalInput)("db-location"), getTrapCachingEnabled(),
config = await (0, init_1.initConfig)((0, actions_util_1.getOptionalInput)("languages"), (0, actions_util_1.getOptionalInput)("queries"), (0, actions_util_1.getOptionalInput)("packs"), registriesInput, (0, actions_util_1.getOptionalInput)("config-file"), (0, actions_util_1.getOptionalInput)("db-location"), (0, actions_util_1.getOptionalInput)("config"), getTrapCachingEnabled(),
// Debug mode is enabled if:
// - The `init` Action is passed `debug: true`.
// - Actions step debugging is enabled (e.g. by [enabling debug logging for a rerun](https://docs.github.com/en/actions/managing-workflow-runs/re-running-workflows-and-jobs#re-running-all-the-jobs-in-a-workflow),
@@ -178,10 +176,6 @@ async function run() {
for (const [key, value] of Object.entries(tracerConfig.env)) {
core.exportVariable(key, value);
}
if (process.platform === "win32" &&
!(await (0, util_1.codeQlVersionAbove)(codeql, codeql_1.CODEQL_VERSION_NEW_TRACING))) {
await (0, init_1.injectWindowsTracer)("Runner.Worker.exe", undefined, config, codeql, tracerConfig);
}
}
core.setOutput("codeql-path", config.codeQLCmd);
}

lib/init.js generated

@@ -23,7 +23,7 @@ var __importStar = (this && this.__importStar) || function (mod) {
return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.installPythonDeps = exports.injectWindowsTracer = exports.runInit = exports.initConfig = exports.initCodeQL = exports.ToolsSource = void 0;
exports.installPythonDeps = exports.runInit = exports.initConfig = exports.initCodeQL = exports.ToolsSource = void 0;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const toolrunner = __importStar(require("@actions/exec/lib/toolrunner"));
@@ -33,7 +33,6 @@ const codeql_1 = require("./codeql");
const configUtils = __importStar(require("./config-utils"));
const tracer_config_1 = require("./tracer-config");
const util = __importStar(require("./util"));
const util_1 = require("./util");
var ToolsSource;
(function (ToolsSource) {
ToolsSource["Unknown"] = "UNKNOWN";
@@ -49,9 +48,9 @@ async function initCodeQL(toolsInput, apiDetails, tempDir, variant, defaultCliVe
return { codeql, toolsDownloadDurationMs, toolsSource, toolsVersion };
}
exports.initCodeQL = initCodeQL;
async function initConfig(languagesInput, queriesInput, packsInput, registriesInput, configFile, dbLocation, trapCachingEnabled, debugMode, debugArtifactName, debugDatabaseName, repository, tempDir, codeQL, workspacePath, gitHubVersion, apiDetails, features, logger) {
async function initConfig(languagesInput, queriesInput, packsInput, registriesInput, configFile, dbLocation, configInput, trapCachingEnabled, debugMode, debugArtifactName, debugDatabaseName, repository, tempDir, codeQL, workspacePath, gitHubVersion, apiDetails, features, logger) {
logger.startGroup("Load language configuration");
const config = await configUtils.initConfig(languagesInput, queriesInput, packsInput, registriesInput, configFile, dbLocation, trapCachingEnabled, debugMode, debugArtifactName, debugDatabaseName, repository, tempDir, codeQL, workspacePath, gitHubVersion, apiDetails, features, logger);
const config = await configUtils.initConfig(languagesInput, queriesInput, packsInput, registriesInput, configFile, dbLocation, configInput, trapCachingEnabled, debugMode, debugArtifactName, debugDatabaseName, repository, tempDir, codeQL, workspacePath, gitHubVersion, apiDetails, features, logger);
analysisPaths.printPathFiltersWarning(config, logger);
logger.endGroup();
return config;
@@ -60,35 +59,27 @@ exports.initConfig = initConfig;
async function runInit(codeql, config, sourceRoot, processName, registriesInput, features, apiDetails, logger) {
fs.mkdirSync(config.dbLocation, { recursive: true });
try {
if (await (0, util_1.codeQlVersionAbove)(codeql, codeql_1.CODEQL_VERSION_NEW_TRACING)) {
// When parsing the codeql config in the CLI, we have not yet created the qlconfig file.
// So, create it now.
// If we are parsing the config file in the Action, then the qlconfig file was already created
// before the `pack download` command was invoked. It is not required for the init command.
let registriesAuthTokens;
let qlconfigFile;
if (await util.useCodeScanningConfigInCli(codeql, features)) {
({ registriesAuthTokens, qlconfigFile } =
await configUtils.generateRegistries(registriesInput, codeql, config.tempDir, logger));
}
await configUtils.wrapEnvironment({
GITHUB_TOKEN: apiDetails.auth,
CODEQL_REGISTRIES_AUTH: registriesAuthTokens,
},
// Init a database cluster
async () => await codeql.databaseInitCluster(config, sourceRoot, processName, features, qlconfigFile, logger));
}
else {
for (const language of config.languages) {
// Init language database
await codeql.databaseInit(util.getCodeQLDatabasePath(config, language), language, sourceRoot);
}
// When parsing the codeql config in the CLI, we have not yet created the qlconfig file.
// So, create it now.
// If we are parsing the config file in the Action, then the qlconfig file was already created
// before the `pack download` command was invoked. It is not required for the init command.
let registriesAuthTokens;
let qlconfigFile;
if (await util.useCodeScanningConfigInCli(codeql, features)) {
({ registriesAuthTokens, qlconfigFile } =
await configUtils.generateRegistries(registriesInput, codeql, config.tempDir, logger));
}
await configUtils.wrapEnvironment({
GITHUB_TOKEN: apiDetails.auth,
CODEQL_REGISTRIES_AUTH: registriesAuthTokens,
},
// Init a database cluster
async () => await codeql.databaseInitCluster(config, sourceRoot, processName, features, qlconfigFile, logger));
}
catch (e) {
throw processError(e);
}
return await (0, tracer_config_1.getCombinedTracerConfig)(config, codeql);
return await (0, tracer_config_1.getCombinedTracerConfig)(config);
}
exports.runInit = runInit;
/**
@@ -119,89 +110,6 @@ function processError(e) {
}
return e;
}
// Runs a powershell script to inject the tracer into a parent process
// so it can trace future processes, hopefully including the build process.
// If processName is given then injects into the nearest parent process with
// this name, otherwise uses the processLevel-th parent if defined, otherwise
// defaults to the 3rd parent as a rough guess.
async function injectWindowsTracer(processName, processLevel, config, codeql, tracerConfig) {
let script;
if (processName !== undefined) {
script = `
Param(
[Parameter(Position=0)]
[String]
$tracer
)
$id = $PID
while ($true) {
$p = Get-CimInstance -Class Win32_Process -Filter "ProcessId = $id"
Write-Host "Found process: $p"
if ($p -eq $null) {
throw "Could not determine ${processName} process"
}
if ($p[0].Name -eq "${processName}") {
Break
} else {
$id = $p[0].ParentProcessId
}
}
Write-Host "Final process: $p"
Invoke-Expression "&$tracer --inject=$id"`;
}
else {
// If the level is not defined then guess at the 3rd parent process.
// This won't be correct in every setting but it should be enough in most settings,
// and overestimating is likely better in this situation so we definitely trace
// what we want, though this does run the risk of interfering with future CI jobs.
// Note that the default of 3 doesn't work on github actions, so we include a
// special case in the script that checks for Runner.Worker.exe so we can still work
// on actions if the runner is invoked there.
processLevel = processLevel || 3;
script = `
Param(
[Parameter(Position=0)]
[String]
$tracer
)
$id = $PID
for ($i = 0; $i -le ${processLevel}; $i++) {
$p = Get-CimInstance -Class Win32_Process -Filter "ProcessId = $id"
Write-Host "Parent process \${i}: $p"
if ($p -eq $null) {
throw "Process tree ended before reaching required level"
}
# Special case just in case the runner is used on actions
if ($p[0].Name -eq "Runner.Worker.exe") {
Write-Host "Found Runner.Worker.exe process which means we are running on GitHub Actions"
Write-Host "Aborting search early and using process: $p"
Break
} elseif ($p[0].Name -eq "Agent.Worker.exe") {
Write-Host "Found Agent.Worker.exe process which means we are running on Azure Pipelines"
Write-Host "Aborting search early and using process: $p"
Break
} else {
$id = $p[0].ParentProcessId
}
}
Write-Host "Final process: $p"
Invoke-Expression "&$tracer --inject=$id"`;
}
const injectTracerPath = path.join(config.tempDir, "inject-tracer.ps1");
fs.writeFileSync(injectTracerPath, script);
await new toolrunner.ToolRunner(await safeWhich.safeWhich("powershell"), [
"-ExecutionPolicy",
"Bypass",
"-file",
injectTracerPath,
path.resolve(path.dirname(codeql.getPath()), "tools", "win64", "tracer.exe"),
], { env: { ODASA_TRACER_CONFIGURATION: tracerConfig.spec } }).exec();
}
exports.injectWindowsTracer = injectWindowsTracer;
async function installPythonDeps(codeql, logger) {
logger.startGroup("Setup Python dependencies");
const scriptsFolder = path.resolve(__dirname, "../python-setup");

lib/tracer-config.js generated

@@ -23,20 +23,10 @@ var __importStar = (this && this.__importStar) || function (mod) {
return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.getCombinedTracerConfig = exports.concatTracerConfigs = exports.getTracerConfigForLanguage = exports.getTracerConfigForCluster = exports.endTracingForCluster = void 0;
exports.getCombinedTracerConfig = exports.getTracerConfigForCluster = exports.endTracingForCluster = void 0;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const codeql_1 = require("./codeql");
const languages_1 = require("./languages");
const util = __importStar(require("./util"));
const util_1 = require("./util");
const CRITICAL_TRACER_VARS = new Set([
"SEMMLE_PRELOAD_libtrace",
"SEMMLE_RUNNER",
"SEMMLE_COPY_EXECUTABLES_ROOT",
"SEMMLE_DEPTRACE_SOCKET",
"SEMMLE_JAVA_TOOL_OPTIONS",
]);
async function endTracingForCluster(config) {
// If there are no traced languages, we don't need to do anything.
if (!config.languages.some((l) => (0, languages_1.isTracedLanguage)(l)))
@@ -64,162 +54,17 @@ exports.endTracingForCluster = endTracingForCluster;
async function getTracerConfigForCluster(config) {
const tracingEnvVariables = JSON.parse(fs.readFileSync(path.resolve(config.dbLocation, "temp/tracingEnvironment/start-tracing.json"), "utf8"));
return {
spec: tracingEnvVariables["ODASA_TRACER_CONFIGURATION"],
env: tracingEnvVariables,
};
}
exports.getTracerConfigForCluster = getTracerConfigForCluster;
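For illustration, `getTracerConfigForCluster` simply parses the `start-tracing.json` file under the database's `temp/tracingEnvironment` directory and splits it into a spec plus an environment map. A minimal sketch of the shape it expects, mirroring the test further down this diff (all values are illustrative):

```javascript
// Hypothetical contents of <dbLocation>/temp/tracingEnvironment/start-tracing.json
const startTracingEnv = {
  ODASA_TRACER_CONFIGURATION: "/tmp/compound-spec", // becomes the returned `spec`
  CODEQL_DIST: "/opt/codeql-bundle",                // illustrative path
  CODEQL_PLATFORM: "linux64",
  foo: "bar",
};
// getTracerConfigForCluster(config) would then resolve to:
// { spec: "/tmp/compound-spec", env: startTracingEnv }
```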
async function getTracerConfigForLanguage(codeql, config, language) {
const env = await codeql.getTracerEnv(util.getCodeQLDatabasePath(config, language));
const spec = env["ODASA_TRACER_CONFIGURATION"];
const info = { spec, env: {} };
// Extract critical tracer variables from the environment
for (const entry of Object.entries(env)) {
const key = entry[0];
const value = entry[1];
// skip ODASA_TRACER_CONFIGURATION as it is handled separately
if (key === "ODASA_TRACER_CONFIGURATION") {
continue;
}
// skip undefined values
if (typeof value === "undefined") {
continue;
}
// Keep variables that do not exist in the current environment. In addition, always
// keep critical and CODEQL_ variables.
if (typeof process.env[key] === "undefined" ||
CRITICAL_TRACER_VARS.has(key) ||
key.startsWith("CODEQL_")) {
info.env[key] = value;
}
}
return info;
}
exports.getTracerConfigForLanguage = getTracerConfigForLanguage;
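To make the filtering rules above concrete: variables already set in the runner's environment are dropped unless they are listed in `CRITICAL_TRACER_VARS` or start with `CODEQL_`. A minimal sketch, mirroring the unit test later in this diff (values are illustrative):

```javascript
// Pre-existing environment on the runner
process.env["foo"] = "abc";
process.env["CODEQL_VAR"] = "abc";

// Hypothetical result of codeql.getTracerEnv(...) for one language
const tracerEnv = {
  ODASA_TRACER_CONFIGURATION: "abc", // pulled out separately as the spec
  foo: "bar",                        // dropped: already set and not critical
  baz: "qux",                        // kept: not present in the environment
  CODEQL_VAR: "CODEQL_VAR",          // kept: CODEQL_-prefixed vars are always kept
};

// getTracerConfigForLanguage(codeql, config, language) would resolve to:
// { spec: "abc", env: { baz: "qux", CODEQL_VAR: "CODEQL_VAR" } }
```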
function concatTracerConfigs(tracerConfigs, config, writeBothEnvironments = false) {
// A tracer config is a map containing additional environment variables and a tracer 'spec' file.
// A tracer 'spec' file has the following format [log_file, number_of_blocks, blocks_text]
// Merge the environments
const env = {};
let copyExecutables = false;
let envSize = 0;
for (const v of Object.values(tracerConfigs)) {
for (const e of Object.entries(v.env)) {
const name = e[0];
const value = e[1];
// skip SEMMLE_COPY_EXECUTABLES_ROOT as it is handled separately
if (name === "SEMMLE_COPY_EXECUTABLES_ROOT") {
copyExecutables = true;
}
else if (name in env) {
if (env[name] !== value) {
throw Error(`Incompatible values in environment parameter ${name}: ${env[name]} and ${value}`);
}
}
else {
env[name] = value;
envSize += 1;
}
}
}
// Concatenate spec files into a new spec file
const languages = Object.keys(tracerConfigs);
const cppIndex = languages.indexOf("cpp");
// Make sure cpp is the last language, if it's present, since it must be concatenated last
if (cppIndex !== -1) {
const lastLang = languages[languages.length - 1];
languages[languages.length - 1] = languages[cppIndex];
languages[cppIndex] = lastLang;
}
const totalLines = [];
let totalCount = 0;
for (const lang of languages) {
const lines = fs
.readFileSync(tracerConfigs[lang].spec, "utf8")
.split(/\r?\n/);
const count = parseInt(lines[1], 10);
totalCount += count;
totalLines.push(...lines.slice(2));
}
const newLogFilePath = path.resolve(config.tempDir, "compound-build-tracer.log");
const spec = path.resolve(config.tempDir, "compound-spec");
const compoundTempFolder = path.resolve(config.tempDir, "compound-temp");
const newSpecContent = [
newLogFilePath,
totalCount.toString(10),
...totalLines,
];
if (copyExecutables) {
env["SEMMLE_COPY_EXECUTABLES_ROOT"] = compoundTempFolder;
envSize += 1;
}
fs.writeFileSync(spec, newSpecContent.join("\n"));
if (writeBothEnvironments || process.platform !== "win32") {
// Prepare the content of the compound environment file on Unix
let buffer = Buffer.alloc(4);
buffer.writeInt32LE(envSize, 0);
for (const e of Object.entries(env)) {
const key = e[0];
const value = e[1];
const lineBuffer = Buffer.from(`${key}=${value}\0`, "utf8");
const sizeBuffer = Buffer.alloc(4);
sizeBuffer.writeInt32LE(lineBuffer.length, 0);
buffer = Buffer.concat([buffer, sizeBuffer, lineBuffer]);
}
// Write the compound environment for Unix
const envPath = `${spec}.environment`;
fs.writeFileSync(envPath, buffer);
}
if (writeBothEnvironments || process.platform === "win32") {
// Prepare the content of the compound environment file on Windows
let bufferWindows = Buffer.alloc(0);
let length = 0;
for (const e of Object.entries(env)) {
const key = e[0];
const value = e[1];
const string = `${key}=${value}\0`;
length += string.length;
const lineBuffer = Buffer.from(string, "utf16le");
bufferWindows = Buffer.concat([bufferWindows, lineBuffer]);
}
const sizeBuffer = Buffer.alloc(4);
sizeBuffer.writeInt32LE(length + 1, 0); // Add one for trailing null character marking end
const trailingNull = Buffer.from(`\0`, "utf16le");
bufferWindows = Buffer.concat([sizeBuffer, bufferWindows, trailingNull]);
// Write the compound environment for Windows
const envPathWindows = `${spec}.win32env`;
fs.writeFileSync(envPathWindows, bufferWindows);
}
return { env, spec };
}
exports.concatTracerConfigs = concatTracerConfigs;
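As a concrete example of the spec format mentioned in the comment above (`[log_file, number_of_blocks, blocks_text]`), two per-language specs and the compound spec they produce could look like this, mirroring the unit tests later in this diff (paths and block contents are illustrative):

```javascript
// Per-language spec files: first line is the log file, second the block count,
// and the remaining lines are the blocks themselves.
const jsSpec = "foo.log\n2\nabc\ndef";
const pySpec = "foo.log\n1\nghi";

// concatTracerConfigs({ javascript, python }, config) writes the combined spec to
// <tempDir>/compound-spec with the summed block count and concatenated blocks:
const compoundSpec = "<tempDir>/compound-build-tracer.log\n3\nabc\ndef\nghi";
```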
async function getCombinedTracerConfig(config, codeql) {
async function getCombinedTracerConfig(config) {
// Abort if there are no traced languages as there's nothing to do
const tracedLanguages = config.languages.filter((l) => (0, languages_1.isTracedLanguage)(l));
if (tracedLanguages.length === 0) {
return undefined;
}
let mainTracerConfig;
if (await (0, util_1.codeQlVersionAbove)(codeql, codeql_1.CODEQL_VERSION_NEW_TRACING)) {
mainTracerConfig = await getTracerConfigForCluster(config);
}
else {
// Get all the tracer configs and combine them together
const tracedLanguageConfigs = {};
for (const language of tracedLanguages) {
tracedLanguageConfigs[language] = await getTracerConfigForLanguage(codeql, config, language);
}
mainTracerConfig = concatTracerConfigs(tracedLanguageConfigs, config);
// Add a couple more variables
mainTracerConfig.env["ODASA_TRACER_CONFIGURATION"] = mainTracerConfig.spec;
const codeQLDir = path.dirname(codeql.getPath());
if (process.platform === "darwin") {
mainTracerConfig.env["DYLD_INSERT_LIBRARIES"] = path.join(codeQLDir, "tools", "osx64", "libtrace.dylib");
}
else if (process.platform !== "win32") {
mainTracerConfig.env["LD_PRELOAD"] = path.join(codeQLDir, "tools", "linux64", "${LIB}trace.so");
}
}
const mainTracerConfig = await getTracerConfigForCluster(config);
// On macOS it's necessary to prefix the build command with the runner executable
// in order to trace when System Integrity Protection is enabled.
// The executable also exists and works for other platforms, so we output this env

File diff suppressed because one or more lines are too long


@@ -29,7 +29,6 @@ Object.defineProperty(exports, "__esModule", { value: true });
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const ava_1 = __importDefault(require("ava"));
const codeql_1 = require("./codeql");
const configUtils = __importStar(require("./config-utils"));
const languages_1 = require("./languages");
const testing_utils_1 = require("./testing-utils");
@@ -56,267 +55,35 @@ function getTestConfig(tmpDir) {
trapCacheDownloadTime: 0,
};
}
// A very minimal setup
(0, ava_1.default)("getTracerConfigForLanguage - minimal setup", async (t) => {
await util.withTmpDir(async (tmpDir) => {
const config = getTestConfig(tmpDir);
const codeQL = (0, codeql_1.setCodeQL)({
async getTracerEnv() {
return {
ODASA_TRACER_CONFIGURATION: "abc",
foo: "bar",
};
},
});
const result = await (0, tracer_config_1.getTracerConfigForLanguage)(codeQL, config, languages_1.Language.javascript);
t.deepEqual(result, { spec: "abc", env: { foo: "bar" } });
});
});
// Existing vars should not be overwritten, unless they are critical or prefixed with CODEQL_
(0, ava_1.default)("getTracerConfigForLanguage - existing / critical vars", async (t) => {
await util.withTmpDir(async (tmpDir) => {
const config = getTestConfig(tmpDir);
// Set up some variables in the environment
process.env["foo"] = "abc";
process.env["SEMMLE_PRELOAD_libtrace"] = "abc";
process.env["SEMMLE_RUNNER"] = "abc";
process.env["SEMMLE_COPY_EXECUTABLES_ROOT"] = "abc";
process.env["SEMMLE_DEPTRACE_SOCKET"] = "abc";
process.env["SEMMLE_JAVA_TOOL_OPTIONS"] = "abc";
process.env["CODEQL_VAR"] = "abc";
// Now CodeQL returns all these variables, and one more, with different values
const codeQL = (0, codeql_1.setCodeQL)({
async getTracerEnv() {
return {
ODASA_TRACER_CONFIGURATION: "abc",
foo: "bar",
baz: "qux",
SEMMLE_PRELOAD_libtrace: "SEMMLE_PRELOAD_libtrace",
SEMMLE_RUNNER: "SEMMLE_RUNNER",
SEMMLE_COPY_EXECUTABLES_ROOT: "SEMMLE_COPY_EXECUTABLES_ROOT",
SEMMLE_DEPTRACE_SOCKET: "SEMMLE_DEPTRACE_SOCKET",
SEMMLE_JAVA_TOOL_OPTIONS: "SEMMLE_JAVA_TOOL_OPTIONS",
CODEQL_VAR: "CODEQL_VAR",
};
},
});
const result = await (0, tracer_config_1.getTracerConfigForLanguage)(codeQL, config, languages_1.Language.javascript);
t.deepEqual(result, {
spec: "abc",
env: {
// Should contain all variables except 'foo', because that already existed in the
// environment with a different value, and is not deemed a "critical" variable.
baz: "qux",
SEMMLE_PRELOAD_libtrace: "SEMMLE_PRELOAD_libtrace",
SEMMLE_RUNNER: "SEMMLE_RUNNER",
SEMMLE_COPY_EXECUTABLES_ROOT: "SEMMLE_COPY_EXECUTABLES_ROOT",
SEMMLE_DEPTRACE_SOCKET: "SEMMLE_DEPTRACE_SOCKET",
SEMMLE_JAVA_TOOL_OPTIONS: "SEMMLE_JAVA_TOOL_OPTIONS",
CODEQL_VAR: "CODEQL_VAR",
},
});
});
});
(0, ava_1.default)("concatTracerConfigs - minimal configs correctly combined", async (t) => {
await util.withTmpDir(async (tmpDir) => {
const config = getTestConfig(tmpDir);
const spec1 = path.join(tmpDir, "spec1");
fs.writeFileSync(spec1, "foo.log\n2\nabc\ndef");
const tc1 = {
spec: spec1,
env: {
a: "a",
b: "b",
},
};
const spec2 = path.join(tmpDir, "spec2");
fs.writeFileSync(spec2, "foo.log\n1\nghi");
const tc2 = {
spec: spec2,
env: {
c: "c",
},
};
const result = (0, tracer_config_1.concatTracerConfigs)({ javascript: tc1, python: tc2 }, config);
t.deepEqual(result, {
spec: path.join(tmpDir, "compound-spec"),
env: {
a: "a",
b: "b",
c: "c",
},
});
t.true(fs.existsSync(result.spec));
t.deepEqual(fs.readFileSync(result.spec, "utf8"), `${path.join(tmpDir, "compound-build-tracer.log")}\n3\nabc\ndef\nghi`);
});
});
(0, ava_1.default)("concatTracerConfigs - conflicting env vars", async (t) => {
await util.withTmpDir(async (tmpDir) => {
const config = getTestConfig(tmpDir);
const spec = path.join(tmpDir, "spec");
fs.writeFileSync(spec, "foo.log\n0");
// Ok if env vars have the same name and the same value
t.deepEqual((0, tracer_config_1.concatTracerConfigs)({
javascript: { spec, env: { a: "a", b: "b" } },
python: { spec, env: { b: "b", c: "c" } },
}, config).env, {
a: "a",
b: "b",
c: "c",
});
// Throws if env vars have same name but different values
const e = t.throws(() => (0, tracer_config_1.concatTracerConfigs)({
javascript: { spec, env: { a: "a", b: "b" } },
python: { spec, env: { b: "c" } },
}, config));
// If e is undefined, then the previous assertion will fail.
if (e !== undefined) {
t.deepEqual(e.message, "Incompatible values in environment parameter b: b and c");
}
});
});
(0, ava_1.default)("concatTracerConfigs - cpp spec lines come last if present", async (t) => {
await util.withTmpDir(async (tmpDir) => {
const config = getTestConfig(tmpDir);
const spec1 = path.join(tmpDir, "spec1");
fs.writeFileSync(spec1, "foo.log\n2\nabc\ndef");
const tc1 = {
spec: spec1,
env: {
a: "a",
b: "b",
},
};
const spec2 = path.join(tmpDir, "spec2");
fs.writeFileSync(spec2, "foo.log\n1\nghi");
const tc2 = {
spec: spec2,
env: {
c: "c",
},
};
const result = (0, tracer_config_1.concatTracerConfigs)({ cpp: tc1, python: tc2 }, config);
t.deepEqual(result, {
spec: path.join(tmpDir, "compound-spec"),
env: {
a: "a",
b: "b",
c: "c",
},
});
t.true(fs.existsSync(result.spec));
t.deepEqual(fs.readFileSync(result.spec, "utf8"), `${path.join(tmpDir, "compound-build-tracer.log")}\n3\nghi\nabc\ndef`);
});
});
(0, ava_1.default)("concatTracerConfigs - SEMMLE_COPY_EXECUTABLES_ROOT is updated to point to compound spec", async (t) => {
await util.withTmpDir(async (tmpDir) => {
const config = getTestConfig(tmpDir);
const spec = path.join(tmpDir, "spec");
fs.writeFileSync(spec, "foo.log\n0");
const result = (0, tracer_config_1.concatTracerConfigs)({
javascript: { spec, env: { a: "a", b: "b" } },
python: { spec, env: { SEMMLE_COPY_EXECUTABLES_ROOT: "foo" } },
}, config);
t.deepEqual(result.env, {
a: "a",
b: "b",
SEMMLE_COPY_EXECUTABLES_ROOT: path.join(tmpDir, "compound-temp"),
});
});
});
(0, ava_1.default)("concatTracerConfigs - compound environment file is created correctly", async (t) => {
await util.withTmpDir(async (tmpDir) => {
const config = getTestConfig(tmpDir);
const spec1 = path.join(tmpDir, "spec1");
fs.writeFileSync(spec1, "foo.log\n2\nabc\ndef");
const tc1 = {
spec: spec1,
env: {
a: "a",
},
};
const spec2 = path.join(tmpDir, "spec2");
fs.writeFileSync(spec2, "foo.log\n1\nghi");
const tc2 = {
spec: spec2,
env: {
foo: "bar_baz",
},
};
const result = (0, tracer_config_1.concatTracerConfigs)({ javascript: tc1, python: tc2 }, config, true);
// Check binary contents for the Unix file
const envPath = `${result.spec}.environment`;
t.true(fs.existsSync(envPath));
const buffer = fs.readFileSync(envPath);
t.deepEqual(buffer.length, 28);
t.deepEqual(buffer.readInt32LE(0), 2); // number of env vars
t.deepEqual(buffer.readInt32LE(4), 4); // length of env var definition
t.deepEqual(buffer.toString("utf8", 8, 12), "a=a\0"); // [key]=[value]\0
t.deepEqual(buffer.readInt32LE(12), 12); // length of env var definition
t.deepEqual(buffer.toString("utf8", 16, 28), "foo=bar_baz\0"); // [key]=[value]\0
// Check binary contents for the Windows file
const envPathWindows = `${result.spec}.win32env`;
t.true(fs.existsSync(envPathWindows));
const bufferWindows = fs.readFileSync(envPathWindows);
t.deepEqual(bufferWindows.length, 38);
t.deepEqual(bufferWindows.readInt32LE(0), 4 + 12 + 1); // number of tchars to represent the environment
t.deepEqual(bufferWindows.toString("utf16le", 4, 12), "a=a\0"); // [key]=[value]\0
t.deepEqual(bufferWindows.toString("utf16le", 12, 36), "foo=bar_baz\0"); // [key]=[value]\0
t.deepEqual(bufferWindows.toString("utf16le", 36, 38), "\0"); // trailing null character
});
});
(0, ava_1.default)("getCombinedTracerConfig - return undefined when no languages are traced languages", async (t) => {
await util.withTmpDir(async (tmpDir) => {
const config = getTestConfig(tmpDir);
// No traced languages
config.languages = [languages_1.Language.javascript, languages_1.Language.python];
const codeQL = (0, codeql_1.setCodeQL)({
async getTracerEnv() {
return {
ODASA_TRACER_CONFIGURATION: "abc",
CODEQL_DIST: "/",
foo: "bar",
};
},
});
t.deepEqual(await (0, tracer_config_1.getCombinedTracerConfig)(config, codeQL), undefined);
t.deepEqual(await (0, tracer_config_1.getCombinedTracerConfig)(config), undefined);
});
});
(0, ava_1.default)("getCombinedTracerConfig - valid spec file", async (t) => {
(0, ava_1.default)("getCombinedTracerConfig - with start-tracing.json environment file", async (t) => {
await util.withTmpDir(async (tmpDir) => {
const config = getTestConfig(tmpDir);
const spec = path.join(tmpDir, "spec");
fs.writeFileSync(spec, "foo.log\n2\nabc\ndef");
const bundlePath = path.join(tmpDir, "bundle");
const codeqlPlatform = process.platform === "win32"
? "win64"
: process.platform === "darwin"
? "osx64"
: "linux64";
const codeQL = (0, codeql_1.setCodeQL)({
async getTracerEnv() {
return {
ODASA_TRACER_CONFIGURATION: spec,
CODEQL_DIST: bundlePath,
CODEQL_PLATFORM: codeqlPlatform,
foo: "bar",
};
},
});
const result = await (0, tracer_config_1.getCombinedTracerConfig)(config, codeQL);
t.notDeepEqual(result, undefined);
const expectedEnv = {
const startTracingEnv = {
foo: "bar",
CODEQL_DIST: bundlePath,
CODEQL_PLATFORM: codeqlPlatform,
ODASA_TRACER_CONFIGURATION: result.spec,
};
if (process.platform === "darwin") {
expectedEnv["DYLD_INSERT_LIBRARIES"] = path.join(path.dirname(codeQL.getPath()), "tools", "osx64", "libtrace.dylib");
}
else if (process.platform !== "win32") {
expectedEnv["LD_PRELOAD"] = path.join(path.dirname(codeQL.getPath()), "tools", "linux64", "${LIB}trace.so");
}
const tracingEnvironmentDir = path.join(config.dbLocation, "temp", "tracingEnvironment");
fs.mkdirSync(tracingEnvironmentDir, { recursive: true });
const startTracingJson = path.join(tracingEnvironmentDir, "start-tracing.json");
fs.writeFileSync(startTracingJson, JSON.stringify(startTracingEnv));
const result = await (0, tracer_config_1.getCombinedTracerConfig)(config);
t.notDeepEqual(result, undefined);
const expectedEnv = startTracingEnv;
if (process.platform === "win32") {
expectedEnv["CODEQL_RUNNER"] = path.join(bundlePath, "tools/win64/runner.exe");
}
@@ -327,7 +94,6 @@ function getTestConfig(tmpDir) {
expectedEnv["CODEQL_RUNNER"] = path.join(bundlePath, "tools/linux64/runner");
}
t.deepEqual(result, {
spec: path.join(tmpDir, "compound-spec"),
env: expectedEnv,
});
});

File diff suppressed because one or more lines are too long

lib/upload-lib.js generated (11 changed lines)

@@ -134,7 +134,7 @@ exports.findSarifFilesInDir = findSarifFilesInDir;
// Uploads a single sarif file or a directory of sarif files
// depending on what the path happens to refer to.
async function uploadFromActions(sarifPath, checkoutPath, category, logger) {
return await uploadFiles(getSarifFilePaths(sarifPath), (0, repository_1.parseRepositoryNwo)(util.getRequiredEnvParam("GITHUB_REPOSITORY")), await actionsUtil.getCommitOid(checkoutPath), await actionsUtil.getRef(), await actionsUtil.getAnalysisKey(), category, util.getRequiredEnvParam("GITHUB_WORKFLOW"), workflow.getWorkflowRunID(), checkoutPath, actionsUtil.getRequiredInput("matrix"), logger);
return await uploadFiles(getSarifFilePaths(sarifPath), (0, repository_1.parseRepositoryNwo)(util.getRequiredEnvParam("GITHUB_REPOSITORY")), await actionsUtil.getCommitOid(checkoutPath), await actionsUtil.getRef(), await actionsUtil.getAnalysisKey(), category, util.getRequiredEnvParam("GITHUB_WORKFLOW"), workflow.getWorkflowRunID(), workflow.getWorkflowRunAttempt(), checkoutPath, actionsUtil.getRequiredInput("matrix"), logger);
}
exports.uploadFromActions = uploadFromActions;
function getSarifFilePaths(sarifPath) {
@@ -179,7 +179,7 @@ exports.countResultsInSarif = countResultsInSarif;
// Throws an error if the file is invalid.
function validateSarifFileSchema(sarifFilePath, logger) {
const sarif = JSON.parse(fs.readFileSync(sarifFilePath, "utf8"));
const schema = require("../src/sarif_v2.1.0_schema.json");
const schema = require("../src/sarif-schema-2.1.0.json");
const result = new jsonschema.Validator().validate(sarif, schema);
if (!result.valid) {
// Output the more verbose error messages in groups as these may be very large.
@@ -197,7 +197,7 @@ function validateSarifFileSchema(sarifFilePath, logger) {
exports.validateSarifFileSchema = validateSarifFileSchema;
// buildPayload constructs a map ready to be uploaded to the API from the given
// parameters, respecting the current mode and target GitHub instance version.
function buildPayload(commitOid, ref, analysisKey, analysisName, zippedSarif, workflowRunID, checkoutURI, environment, toolNames, mergeBaseCommitOid) {
function buildPayload(commitOid, ref, analysisKey, analysisName, zippedSarif, workflowRunID, workflowRunAttempt, checkoutURI, environment, toolNames, mergeBaseCommitOid) {
const payloadObj = {
commit_oid: commitOid,
ref,
@@ -205,6 +205,7 @@ function buildPayload(commitOid, ref, analysisKey, analysisName, zippedSarif, wo
analysis_name: analysisName,
sarif: zippedSarif,
workflow_run_id: workflowRunID,
workflow_run_attempt: workflowRunAttempt,
checkout_uri: checkoutURI,
environment,
started_at: process.env[shared_environment_1.CODEQL_WORKFLOW_STARTED_AT],
@@ -235,7 +236,7 @@ function buildPayload(commitOid, ref, analysisKey, analysisName, zippedSarif, wo
exports.buildPayload = buildPayload;
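The payload built here now carries the workflow run attempt alongside the run ID. A sketch of the fields visible in this diff, with illustrative values and the remaining fields elided:

```javascript
const payloadObj = {
  commit_oid: "commit",
  ref: "refs/heads/main",
  analysis_name: "CodeQL",
  sarif: "<gzipped, base64-encoded SARIF>",
  workflow_run_id: 1234,
  workflow_run_attempt: 1, // new in this change
  checkout_uri: "file:///opt/src",
  // environment, started_at and the other fields are elided here
};
```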
// Uploads the given set of sarif files.
// Returns true iff the upload occurred and succeeded
async function uploadFiles(sarifFiles, repositoryNwo, commitOid, ref, analysisKey, category, analysisName, workflowRunID, sourceRoot, environment, logger) {
async function uploadFiles(sarifFiles, repositoryNwo, commitOid, ref, analysisKey, category, analysisName, workflowRunID, workflowRunAttempt, sourceRoot, environment, logger) {
logger.startGroup("Uploading results");
logger.info(`Processing sarif files: ${JSON.stringify(sarifFiles)}`);
// Validate that the files we were asked to upload are all valid SARIF files
@@ -252,7 +253,7 @@ async function uploadFiles(sarifFiles, repositoryNwo, commitOid, ref, analysisKe
const sarifPayload = JSON.stringify(sarif);
const zippedSarif = zlib_1.default.gzipSync(sarifPayload).toString("base64");
const checkoutURI = (0, file_url_1.default)(sourceRoot);
const payload = buildPayload(commitOid, ref, analysisKey, analysisName, zippedSarif, workflowRunID, checkoutURI, environment, toolNames, await actionsUtil.determineMergeBaseCommitOid());
const payload = buildPayload(commitOid, ref, analysisKey, analysisName, zippedSarif, workflowRunID, workflowRunAttempt, checkoutURI, environment, toolNames, await actionsUtil.determineMergeBaseCommitOid());
// Log some useful debug info about the upload
const rawUploadSizeBytes = sarifPayload.length;
logger.debug(`Raw upload size: ${rawUploadSizeBytes} bytes`);

File diff suppressed because one or more lines are too long


@@ -48,7 +48,7 @@ ava_1.default.beforeEach(() => {
});
(0, ava_1.default)("validate correct payload used for push, PR merge commit, and PR head", async (t) => {
process.env["GITHUB_EVENT_NAME"] = "push";
const pushPayload = uploadLib.buildPayload("commit", "refs/heads/master", "key", undefined, "", undefined, "/opt/src", undefined, ["CodeQL", "eslint"], "mergeBaseCommit");
const pushPayload = uploadLib.buildPayload("commit", "refs/heads/master", "key", undefined, "", 1234, 1, "/opt/src", undefined, ["CodeQL", "eslint"], "mergeBaseCommit");
// Not triggered by a pull request
t.falsy(pushPayload.base_ref);
t.falsy(pushPayload.base_sha);
@@ -56,11 +56,11 @@ ava_1.default.beforeEach(() => {
process.env["GITHUB_SHA"] = "commit";
process.env["GITHUB_BASE_REF"] = "master";
process.env["GITHUB_EVENT_PATH"] = `${__dirname}/../src/testdata/pull_request.json`;
const prMergePayload = uploadLib.buildPayload("commit", "refs/pull/123/merge", "key", undefined, "", undefined, "/opt/src", undefined, ["CodeQL", "eslint"], "mergeBaseCommit");
const prMergePayload = uploadLib.buildPayload("commit", "refs/pull/123/merge", "key", undefined, "", 1234, 1, "/opt/src", undefined, ["CodeQL", "eslint"], "mergeBaseCommit");
// Uploads for a merge commit use the merge base
t.deepEqual(prMergePayload.base_ref, "refs/heads/master");
t.deepEqual(prMergePayload.base_sha, "mergeBaseCommit");
const prHeadPayload = uploadLib.buildPayload("headCommit", "refs/pull/123/head", "key", undefined, "", undefined, "/opt/src", undefined, ["CodeQL", "eslint"], "mergeBaseCommit");
const prHeadPayload = uploadLib.buildPayload("headCommit", "refs/pull/123/head", "key", undefined, "", 1234, 1, "/opt/src", undefined, ["CodeQL", "eslint"], "mergeBaseCommit");
// Uploads for the head use the PR base
t.deepEqual(prHeadPayload.base_ref, "refs/heads/master");
t.deepEqual(prHeadPayload.base_sha, "f95f852bd8fca8fcc58a9a2d6c842781e32a215e");

File diff suppressed because one or more lines are too long

lib/util.js generated (4 changed lines)

@@ -337,9 +337,11 @@ exports.assertNever = assertNever;
* knowing what version of CodeQL we're running.
*/
function initializeEnvironment(version) {
core.exportVariable(String(shared_environment_1.EnvVar.VERSION), version);
core.exportVariable(String(shared_environment_1.EnvVar.FEATURE_MULTI_LANGUAGE), "false");
core.exportVariable(String(shared_environment_1.EnvVar.FEATURE_SANDWICH), "false");
core.exportVariable(String(shared_environment_1.EnvVar.FEATURE_SARIF_COMBINE), "true");
core.exportVariable(String(shared_environment_1.EnvVar.FEATURE_WILL_UPLOAD), "true");
core.exportVariable(String(shared_environment_1.EnvVar.VERSION), version);
}
exports.initializeEnvironment = initializeEnvironment;
/**

File diff suppressed because one or more lines are too long

lib/workflow.js generated (62 changed lines)

@@ -26,7 +26,7 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.getCheckoutPathInputOrThrow = exports.getUploadInputOrThrow = exports.getCategoryInputOrThrow = exports.getWorkflowRunID = exports.getWorkflowRelativePath = exports.getWorkflow = exports.formatWorkflowCause = exports.formatWorkflowErrors = exports.validateWorkflow = exports.getWorkflowErrors = exports.WorkflowErrors = exports.patternIsSuperset = void 0;
exports.getCheckoutPathInputOrThrow = exports.getUploadInputOrThrow = exports.getCategoryInputOrThrow = exports.getWorkflowRunAttempt = exports.getWorkflowRunID = exports.getWorkflowRelativePath = exports.getWorkflow = exports.formatWorkflowCause = exports.formatWorkflowErrors = exports.validateWorkflow = exports.getWorkflowErrors = exports.WorkflowErrors = exports.patternIsSuperset = void 0;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const zlib_1 = __importDefault(require("zlib"));
@@ -65,18 +65,6 @@ function patternIsSuperset(patternA, patternB) {
return patternToRegExp(patternA).test(patternB);
}
exports.patternIsSuperset = patternIsSuperset;
function branchesToArray(branches) {
if (typeof branches === "string") {
return [branches];
}
if (Array.isArray(branches)) {
if (branches.length === 0) {
return "**";
}
return branches;
}
return "**";
}
function toCodedErrors(errors) {
return Object.entries(errors).reduce((acc, [code, message]) => {
acc[code] = { message, code };
@@ -86,8 +74,7 @@ function toCodedErrors(errors) {
// code to send back via status report
// message to add as a warning annotation to the run
exports.WorkflowErrors = toCodedErrors({
MismatchedBranches: `Please make sure that every branch in on.pull_request is also in on.push so that Code Scanning can compare pull requests against the state of the base branch.`,
MissingPushHook: `Please specify an on.push hook so that Code Scanning can compare pull requests against the state of the base branch.`,
MissingPushHook: `Please specify an on.push hook to analyze and see code scanning alerts from the default branch on the Security tab.`,
CheckoutWrongHead: `git checkout HEAD^2 is no longer necessary. Please remove this step as Code Scanning recommends analyzing the merge commit for best results.`,
});
function getWorkflowErrors(doc) {
@@ -132,28 +119,6 @@ function getWorkflowErrors(doc) {
if (!hasPush && hasPullRequest) {
missingPush = true;
}
// if doc.on.pull_request is null that means 'all branches'
// if doc.on.pull_request is undefined that means 'off'
// we only want to check for mismatched branches if pull_request is on.
if (doc.on.pull_request !== undefined) {
const push = branchesToArray(doc.on.push?.branches);
if (push !== "**") {
const pull_request = branchesToArray(doc.on.pull_request?.branches);
if (pull_request !== "**") {
const difference = pull_request.filter((value) => !push.some((o) => patternIsSuperset(o, value)));
if (difference.length > 0) {
// there are branches in pull_request that may not have a baseline
// because we are not building them on push
errors.push(exports.WorkflowErrors.MismatchedBranches);
}
}
else if (push.length > 0) {
// push is set up to run on a subset of branches
// and you could open a PR against a branch with no baseline
errors.push(exports.WorkflowErrors.MismatchedBranches);
}
}
}
}
if (missingPush) {
errors.push(exports.WorkflowErrors.MissingPushHook);
@@ -252,13 +217,32 @@ exports.getWorkflowRelativePath = getWorkflowRelativePath;
* Get the workflow run ID.
*/
function getWorkflowRunID() {
const workflowRunID = parseInt((0, util_1.getRequiredEnvParam)("GITHUB_RUN_ID"), 10);
const workflowRunIdString = (0, util_1.getRequiredEnvParam)("GITHUB_RUN_ID");
const workflowRunID = parseInt(workflowRunIdString, 10);
if (Number.isNaN(workflowRunID)) {
throw new Error("GITHUB_RUN_ID must define a non NaN workflow run ID");
throw new Error(`GITHUB_RUN_ID must define a non NaN workflow run ID. Current value is ${workflowRunIdString}`);
}
if (workflowRunID < 0) {
throw new Error(`GITHUB_RUN_ID must be a non-negative integer. Current value is ${workflowRunIdString}`);
}
return workflowRunID;
}
exports.getWorkflowRunID = getWorkflowRunID;
/**
* Get the workflow run attempt number.
*/
function getWorkflowRunAttempt() {
const workflowRunAttemptString = (0, util_1.getRequiredEnvParam)("GITHUB_RUN_ATTEMPT");
const workflowRunAttempt = parseInt(workflowRunAttemptString, 10);
if (Number.isNaN(workflowRunAttempt)) {
throw new Error(`GITHUB_RUN_ATTEMPT must define a non NaN workflow run attempt. Current value is ${workflowRunAttemptString}`);
}
if (workflowRunAttempt <= 0) {
throw new Error(`GITHUB_RUN_ATTEMPT must be a positive integer. Current value is ${workflowRunAttemptString}`);
}
return workflowRunAttempt;
}
exports.getWorkflowRunAttempt = getWorkflowRunAttempt;
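Both helpers read a numeric value from the Actions environment and reject anything that does not parse to a sensible number. A minimal usage sketch (values are illustrative):

```javascript
// Values normally provided by GitHub Actions
process.env["GITHUB_RUN_ID"] = "1234";
process.env["GITHUB_RUN_ATTEMPT"] = "1";

// getWorkflowRunID()      -> 1234
// getWorkflowRunAttempt() -> 1
// A non-numeric or negative run ID, or a non-positive run attempt, throws with
// an error message that includes the offending value.
```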
function getStepsCallingAction(job, actionName) {
if (job.uses) {
throw new Error(`Could not get steps calling ${actionName} since the job calls a reusable workflow.`);

File diff suppressed because one or more lines are too long

lib/workflow.test.js generated (49 changed lines)

@@ -64,12 +64,6 @@ function errorCodes(actual, expected) {
});
t.deepEqual(...errorCodes(errors, []));
});
(0, ava_1.default)("getWorkflowErrors() when on.pull_requests is a string", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: { push: { branches: ["main"] }, pull_request: { branches: "*" } },
});
t.deepEqual(...errorCodes(errors, [workflow_1.WorkflowErrors.MismatchedBranches]));
});
(0, ava_1.default)("getWorkflowErrors() when on.pull_requests is a string and correct", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: { push: { branches: "*" }, pull_request: { branches: "*" } },
@@ -84,15 +78,6 @@ function errorCodes(actual, expected) {
`));
t.deepEqual(...errorCodes(errors, []));
});
(0, ava_1.default)("getWorkflowErrors() when on.push is mismatched", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: {
push: { branches: ["main"] },
pull_request: { branches: ["feature"] },
},
});
t.deepEqual(...errorCodes(errors, [workflow_1.WorkflowErrors.MismatchedBranches]));
});
(0, ava_1.default)("getWorkflowErrors() when on.push is not mismatched", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: {
@@ -102,15 +87,6 @@ function errorCodes(actual, expected) {
});
t.deepEqual(...errorCodes(errors, []));
});
(0, ava_1.default)("getWorkflowErrors() when on.push is mismatched for pull_request", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: {
push: { branches: ["main"] },
pull_request: { branches: ["main", "feature"] },
},
});
t.deepEqual(...errorCodes(errors, [workflow_1.WorkflowErrors.MismatchedBranches]));
});
(0, ava_1.default)("getWorkflowErrors() for a range of malformed workflows", (t) => {
t.deepEqual(...errorCodes((0, workflow_1.getWorkflowErrors)({
on: {
@@ -175,16 +151,6 @@ function errorCodes(actual, expected) {
},
}), []));
});
(0, ava_1.default)("getWorkflowErrors() when on.pull_request for every branch but push specifies branches", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)(yaml.load(`
name: "CodeQL"
on:
push:
branches: ["main"]
pull_request:
`));
t.deepEqual(...errorCodes(errors, [workflow_1.WorkflowErrors.MismatchedBranches]));
});
(0, ava_1.default)("getWorkflowErrors() when on.pull_request for wildcard branches", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: {
@@ -194,15 +160,6 @@ function errorCodes(actual, expected) {
});
t.deepEqual(...errorCodes(errors, []));
});
(0, ava_1.default)("getWorkflowErrors() when on.pull_request for mismatched wildcard branches", (t) => {
const errors = (0, workflow_1.getWorkflowErrors)({
on: {
push: { branches: ["feature/moose"] },
pull_request: { branches: "feature/*" },
},
});
t.deepEqual(...errorCodes(errors, [workflow_1.WorkflowErrors.MismatchedBranches]));
});
(0, ava_1.default)("getWorkflowErrors() when HEAD^2 is checked out", (t) => {
process.env.GITHUB_JOB = "test";
const errors = (0, workflow_1.getWorkflowErrors)({
@@ -218,7 +175,7 @@ function errorCodes(actual, expected) {
(0, ava_1.default)("formatWorkflowErrors() when there are multiple errors", (t) => {
const message = (0, workflow_1.formatWorkflowErrors)([
workflow_1.WorkflowErrors.CheckoutWrongHead,
workflow_1.WorkflowErrors.MismatchedBranches,
workflow_1.WorkflowErrors.MissingPushHook,
]);
t.true(message.startsWith("2 issues were detected with this workflow:"));
});
@@ -229,9 +186,9 @@ function errorCodes(actual, expected) {
(0, ava_1.default)("formatWorkflowCause()", (t) => {
const message = (0, workflow_1.formatWorkflowCause)([
workflow_1.WorkflowErrors.CheckoutWrongHead,
workflow_1.WorkflowErrors.MismatchedBranches,
workflow_1.WorkflowErrors.MissingPushHook,
]);
t.deepEqual(message, "CheckoutWrongHead,MismatchedBranches");
t.deepEqual(message, "CheckoutWrongHead,MissingPushHook");
t.deepEqual((0, workflow_1.formatWorkflowCause)([]), undefined);
});
(0, ava_1.default)("patternIsSuperset()", (t) => {

File diff suppressed because one or more lines are too long

node_modules/.package-lock.json generated vendored (7 changed lines)

@@ -1,6 +1,6 @@
{
"name": "codeql",
"version": "2.2.13",
"version": "2.3.4",
"lockfileVersion": 3,
"requires": true,
"packages": {
@@ -4146,8 +4146,9 @@
}
},
"node_modules/jsonschema": {
"version": "1.2.6",
"integrity": "sha512-SqhURKZG07JyKKeo/ir24QnS4/BV7a6gQy93bUSe4lUdNp0QNpIz2c9elWJQ9dpc5cQYY6cvCzgRwy0MQCLyqA==",
"version": "1.4.1",
"resolved": "https://registry.npmjs.org/jsonschema/-/jsonschema-1.4.1.tgz",
"integrity": "sha512-S6cATIPVv1z0IlxdN+zUk5EPjkGCdnhN4wVSBlvoUO1tOLJootbo9CquNJmbIh4yikWHiUedhRYrNPn1arpEmQ==",
"engines": {
"node": "*"
}

node_modules/jsonschema/README.md generated vendored (230 changed lines)

@@ -1,18 +1,21 @@
[![Build Status](https://secure.travis-ci.org/tdegrunt/jsonschema.svg)](http://travis-ci.org/tdegrunt/jsonschema)
# jsonschema
[JSON schema](http://json-schema.org/) validator, which is designed to be fast and simple to use.
The latest IETF published draft is v6, this library is mostly v4 compatible.
[JSON schema](http://json-schema.org/) validator, which is designed to be fast and simple to use. JSON Schema versions through draft-07 are fully supported.
## Contributing & bugs
Please fork the repository, make the changes in your fork and include tests. Once you're done making changes, send in a pull request.
### Bug reports
Please include a test which shows why the code fails.
## Usage
### Simple
Simple object validation using JSON schemas.
```javascript
@@ -78,6 +81,7 @@ v.addSchema(addressSchema, '/SimpleAddress');
console.log(v.validate(p, schema));
```
### Example for Array schema
```javascript
var arraySchema = {
"type": "array",
@@ -95,21 +99,42 @@ For a comprehensive, annotated example illustrating all possible validation opti
## Features
### Definitions
All schema definitions are supported, $schema is ignored.
### Types
All types are supported
### Handling `undefined`
`undefined` is not a value known to JSON, and by default the validator treats it as valid; i.e., validation will succeed.
```javascript
var res = validate(undefined, {type: 'string'});
res.valid // true
```
This behavior may be changed with the "required" option:
```javascript
var res = validate(undefined, {type: 'string'}, {required: true});
res.valid // false
```
### Formats
#### Disabling the format keyword.
You may disable format validation by providing `disableFormat: true` to the validator
options.
#### String Formats
All formats are supported; phone numbers are expected to follow the [E.123](http://en.wikipedia.org/wiki/E.123) standard.
#### Custom Formats
You may add your own custom format functions. Format functions accept the input
being validated and return a boolean value. If the returned value is `true`, then
validation succeeds. If the returned value is `false`, then validation fails.
@@ -133,27 +158,86 @@ validator.validate('foo', {type: 'string', format: 'myFormat'}).valid; // false
```
### Results
The first error found will be thrown as an `Error` object if `options.throwError` is `true`. Otherwise all results will be appended to the `result.errors` array which also contains the success flag `result.valid`.
When `oneOf` or `anyOf` validations fail, errors that caused any of the sub-schemas referenced therein to fail are not reported, unless `options.nestedErrors` is truthy. This option may be useful when troubleshooting validation errors in complex schemas.
By default, results will be returned in a `ValidatorResult` object with the following properties:
### Custom properties
Specify your own JSON Schema properties with the validator.attributes property:
* `instance`: any.
* `schema`: Schema.
* `errors`: ValidationError[].
* `valid`: boolean.
Each item in `errors` is a `ValidationError` with the following properties:
* path: array. An array of property keys or array offsets, indicating where inside objects or arrays the instance was found.
* property: string. Describes the property path. Starts with `instance`, and is delimited with a dot (`.`).
* message: string. A human-readable message for debugging use. Provided in English and subject to change.
* schema: object. The schema containing the keyword that failed
* instance: any. The instance that failed
* name: string. The keyword within the schema that failed.
* argument: any. Provides information about the keyword that failed.
The validator can be configured to throw in the event of a validation error:
* If the `throwFirst` option is set, the validator will terminate validation at the first encountered error and throw a `ValidatorResultError` object.
* If the `throwAll` option is set, the validator will throw a `ValidatorResultError` object after the entire instance has been validated.
* If the `throwError` option is set, it will throw at the first encountered validation error (like `throwFirst`), but the `ValidationError` object itself will be thrown. Note that, despite the name, this does not inherit from Error like `ValidatorResultError` does.
The `ValidatorResultError` object has the same properties as `ValidatorResult` and additionally inherits from Error.
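A minimal sketch of the `throwFirst` behaviour described above, assuming a plain `Validator` instance:

```javascript
var Validator = require('jsonschema').Validator;
var v = new Validator();
try {
  v.validate(42, { type: 'string' }, { throwFirst: true });
} catch (e) {
  // e is a ValidatorResultError carrying the same properties as ValidatorResult
  console.log(e.valid);             // false
  console.log(e.errors[0].message); // "is not of a type(s) string"
}
```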
#### "nestedErrors" option
When `oneOf` or `anyOf` validations fail, errors that caused any of the sub-schemas referenced therein to fail are normally suppressed, because it is not necessary to fix all of them. And in the case of `oneOf`, it would itself be an error to fix all of the listed errors.
This behavior may be configured with `options.nestedErrors`. If truthy, it will emit all the errors from the subschemas. This option may be useful when troubleshooting validation errors in complex schemas:
```javascript
var schema = {
oneOf: [
{ type: 'string', minLength: 32, maxLength: 32 },
{ type: 'string', maxLength: 16 },
{ type: 'number' },
]
};
var validator = new Validator();
var result = validator.validate('This string is 28 chars long', schema, {nestedErrors: true});
// result.toString() reads out:
// 0: instance does not meet minimum length of 32
// 1: instance does not meet maximum length of 16
// 2: instance is not of a type(s) number
// 3: instance is not exactly one from [subschema 0],[subschema 1],[subschema 2]
```
#### Localizing Error Messages
To provide localized, human-readable errors, use the `name` string as a translation key. Feel free to open an issue for support relating to localizing error messages. For example:
```
var localized = result.errors.map(function(err){
return localeService.translate(err.name);
});
```
### Custom keywords
Specify your own JSON Schema keywords with the validator.attributes property:
```javascript
validator.attributes.contains = function validateContains(instance, schema, options, ctx) {
if(typeof instance!='string') return;
if(typeof schema.contains!='string') throw new jsonschema.SchemaError('"contains" expects a string', schema);
if(typeof instance !== 'string') return;
if(typeof schema.contains !== 'string') throw new jsonschema.SchemaError('"contains" expects a string', schema);
if(instance.indexOf(schema.contains)<0){
return 'does not contain the string ' + JSON.stringify(schema.contains);
}
}
var result = validator.validate("i am an instance", { type:"string", contains: "i am" });
var result = validator.validate("I am an instance", { type:"string", contains: "I am" });
// result.valid === true;
```
The instance passes validation if the function returns nothing. A single validation error is produced
if the fuction returns a string. Any number of errors (maybe none at all) may be returned by passing a
if the function returns a string. Any number of errors (maybe none at all) may be returned by passing a
`ValidatorResult` object, which may be used like so:
```javascript
@@ -165,6 +249,7 @@ if the fuction returns a string. Any number of errors (maybe none at all) may be
```
### Dereferencing schemas
Sometimes you may want to download schemas from remote sources, like a database, or over HTTP. When importing a schema,
unknown references are inserted into the `validator.unresolvedRefs` Array. Asynchronously shift elements off this array and import
them:
@@ -184,44 +269,119 @@ function importNextSchema(){
importNextSchema();
```
### Default base URI
Schemas should typically have an `id` with an absolute, full URI. However if the schema you are using contains only relative URI references, the `base` option will be used to resolve these.
This following example would throw a `SchemaError` if the `base` option were unset:
```javascript
var result = validate(["Name"], {
id: "/schema.json",
type: "array",
items: { $ref: "http://example.com/schema.json#/definitions/item" },
definitions: {
item: { type: "string" },
},
}, { base: 'http://example.com/' });
```
### Rewrite Hook
The `rewrite` option lets you change the value of an instance after it has successfully been validated. This will mutate the `instance` passed to the validate function. This can be useful for unmarshalling data and parsing it into native instances, such as changing a string to a `Date` instance.
The `rewrite` option accepts a function with the following arguments:
* instance: any
* schema: object
* options: object
* ctx: object
* return value: any new value for the instance
The value may be removed by returning `undefined`.
If you don't want to change the value, call `return instance`.
Here is an example that can convert a property expecting a date into a Date instance:
```javascript
const schema = {
properties: {
date: {id: 'http://example.com/date', type: 'string'},
},
};
const value = {
date: '2020-09-30T23:39:27.060Z',
};
function unmarshall(instance, schema){
if(schema.id === 'http://example.com/date'){
return new Date(instance);
}
return instance;
}
const v = new Validator();
const res = v.validate(value, schema, {rewrite: unmarshall});
assert(res.instance.date instanceof Date);
```
### Pre-Property Validation Hook
If some processing of properties is required prior to validation a function may be passed via the options parameter of the validate function. For example, say you needed to perform type coercion for some properties:
```javascript
const coercionHook = function (instance, property, schema, options, ctx) {
var value = instance[property];
// See examples/coercion.js
function preValidateProperty(object, key, schema, options, ctx) {
var value = object[key];
if (typeof value === 'undefined') return;
// Skip nulls and undefineds
if (value === null || typeof value == 'undefined') {
return;
}
// If the schema declares a type and the property fails type validation.
if (schema.type && this.attributes.type.call(this, instance, schema, options, ctx.makeChild(schema, property))) {
var types = Array.isArray(schema.type) ? schema.type : [schema.type];
var coerced = undefined;
// Go through the declared types until we find something that we can
// coerce the value into.
for (var i = 0; typeof coerced == 'undefined' && i < types.length; i++) {
// If we support coercion to this type
if (lib.coercions[types[i]]) {
// ...attempt it.
coerced = lib.coercions[types[i]](value);
}
// Test if the schema declares a type, but the type keyword fails validation
if (schema.type && validator.attributes.type.call(validator, value, schema, options, ctx.makeChild(schema, key))) {
// If the type is "number" but the instance is not a number, cast it
if(schema.type==='number' && typeof value!=='number'){
object[key] = parseFloat(value);
return;
}
// If we got a successful coercion we modify the property of the instance.
if (typeof coerced != 'undefined') {
instance[property] = coerced;
// If the type is "string" but the instance is not a string, cast it
if(schema.type==='string' && typeof value!=='string'){
object[key] = String(value).toString();
return;
}
}
}.bind(validator)
};
// And now, to actually perform validation with the coercion hook!
v.validate(instance, schema, { preValidateProperty: coercionHook });
v.validate(instance, schema, { preValidateProperty });
```
### Skip validation of certain keywords
Use the "skipAttributes" option to skip validation of certain keywords. Provide an array of keywords to ignore.
For skipping the "format" keyword, see the disableFormat option.
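A minimal sketch of the `skipAttributes` option described above (the keyword chosen here is illustrative):

```javascript
var validate = require('jsonschema').validate;

// "minLength" would normally fail for a three-character string, but it is skipped
var result = validate("abc", { type: "string", minLength: 10 }, { skipAttributes: ["minLength"] });
result.valid; // true
```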
### Fail on unknown keywords
By default, JSON Schema is supposed to ignore unknown schema keywords.
You can change this behavior to require that all keywords used in a schema have a defined behavior by setting the "allowUnknownAttributes" option to false.
This example will throw a `SchemaError`:
```javascript
var schema = {
type: "string",
format: "email",
example: "foo",
};
var result = validate("Name", schema, { allowUnknownAttributes: false });
```
## Tests
Uses [JSON Schema Test Suite](https://github.com/json-schema/JSON-Schema-Test-Suite) as well as our own tests.
You'll need to update and init the git submodules:


@@ -16,13 +16,13 @@ attribute.ignoreProperties = {
'description': true,
'title': true,
// arguments to other properties
'exclusiveMinimum': true,
'exclusiveMaximum': true,
'additionalItems': true,
'then': true,
'else': true,
// special-handled properties
'$schema': true,
'$ref': true,
'extends': true
'extends': true,
};
/**
@@ -47,7 +47,9 @@ validators.type = function validateType (instance, schema, options, ctx) {
var types = Array.isArray(schema.type) ? schema.type : [schema.type];
if (!types.some(this.testType.bind(this, instance, schema, options, ctx))) {
var list = types.map(function (v) {
return v.id && ('<' + v.id + '>') || (v+'');
if(!v) return;
var id = v.$id || v.id;
return id ? ('<' + id + '>') : (v+'');
});
result.addError({
name: 'type',
@@ -60,9 +62,12 @@ validators.type = function validateType (instance, schema, options, ctx) {
function testSchemaNoThrow(instance, options, ctx, callback, schema){
var throwError = options.throwError;
var throwAll = options.throwAll;
options.throwError = false;
options.throwAll = false;
var res = this.validateSchema(instance, schema, options, ctx);
options.throwError = throwError;
options.throwAll = throwAll;
if (!res.valid && callback instanceof Function) {
callback(res);
@@ -91,9 +96,11 @@ validators.anyOf = function validateAnyOf (instance, schema, options, ctx) {
if (!schema.anyOf.some(
testSchemaNoThrow.bind(
this, instance, options, ctx, function(res){inner.importErrors(res);}
))) {
))) {
var list = schema.anyOf.map(function (v, i) {
return (v.id && ('<' + v.id + '>')) || (v.title && JSON.stringify(v.title)) || (v['$ref'] && ('<' + v['$ref'] + '>')) || '[subschema '+i+']';
var id = v.$id || v.id;
if(id) return '<' + id + '>';
return(v.title && JSON.stringify(v.title)) || (v['$ref'] && ('<' + v['$ref'] + '>')) || '[subschema '+i+']';
});
if (options.nestedErrors) {
result.importErrors(inner);
@@ -128,7 +135,8 @@ validators.allOf = function validateAllOf (instance, schema, options, ctx) {
schema.allOf.forEach(function(v, i){
var valid = self.validateSchema(instance, v, options, ctx);
if(!valid.valid){
var msg = (v.id && ('<' + v.id + '>')) || (v.title && JSON.stringify(v.title)) || (v['$ref'] && ('<' + v['$ref'] + '>')) || '[subschema '+i+']';
var id = v.$id || v.id;
var msg = id || (v.title && JSON.stringify(v.title)) || (v['$ref'] && ('<' + v['$ref'] + '>')) || '[subschema '+i+']';
result.addError({
name: 'allOf',
argument: { id: msg, length: valid.errors.length, valid: valid },
@@ -161,9 +169,10 @@ validators.oneOf = function validateOneOf (instance, schema, options, ctx) {
var count = schema.oneOf.filter(
testSchemaNoThrow.bind(
this, instance, options, ctx, function(res) {inner.importErrors(res);}
) ).length;
) ).length;
var list = schema.oneOf.map(function (v, i) {
return (v.id && ('<' + v.id + '>')) || (v.title && JSON.stringify(v.title)) || (v['$ref'] && ('<' + v['$ref'] + '>')) || '[subschema '+i+']';
var id = v.$id || v.id;
return id || (v.title && JSON.stringify(v.title)) || (v['$ref'] && ('<' + v['$ref'] + '>')) || '[subschema '+i+']';
});
if (count!==1) {
if (options.nestedErrors) {
@@ -178,6 +187,70 @@ validators.oneOf = function validateOneOf (instance, schema, options, ctx) {
return result;
};
/**
* Validates "then" or "else" depending on the result of validating "if"
* @param instance
* @param schema
* @param options
* @param ctx
* @return {String|null}
*/
validators.if = function validateIf (instance, schema, options, ctx) {
// Ignore undefined instances
if (instance === undefined) return null;
if (!helpers.isSchema(schema.if)) throw new Error('Expected "if" keyword to be a schema');
var ifValid = testSchemaNoThrow.call(this, instance, options, ctx, null, schema.if);
var result = new ValidatorResult(instance, schema, options, ctx);
var res;
if(ifValid){
if (schema.then === undefined) return;
if (!helpers.isSchema(schema.then)) throw new Error('Expected "then" keyword to be a schema');
res = this.validateSchema(instance, schema.then, options, ctx.makeChild(schema.then));
result.importErrors(res);
}else{
if (schema.else === undefined) return;
if (!helpers.isSchema(schema.else)) throw new Error('Expected "else" keyword to be a schema');
res = this.validateSchema(instance, schema.else, options, ctx.makeChild(schema.else));
result.importErrors(res);
}
return result;
};
function getEnumerableProperty(object, key){
// Determine if `key` shows up in `for(var key in object)`
// First test Object.hasOwnProperty.call as an optimization: that guarantees it does
if(Object.hasOwnProperty.call(object, key)) return object[key];
// Test `key in object` as an optimization; false means it won't
if(!(key in object)) return;
while( (object = Object.getPrototypeOf(object)) ){
if(Object.propertyIsEnumerable.call(object, key)) return object[key];
}
}
/**
* Validates propertyNames
* @param instance
* @param schema
* @param options
* @param ctx
* @return {String|null|ValidatorResult}
*/
validators.propertyNames = function validatePropertyNames (instance, schema, options, ctx) {
if(!this.types.object(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
var subschema = schema.propertyNames!==undefined ? schema.propertyNames : {};
if(!helpers.isSchema(subschema)) throw new SchemaError('Expected "propertyNames" to be a schema (object or boolean)');
for (var property in instance) {
if(getEnumerableProperty(instance, property) !== undefined){
var res = this.validateSchema(property, subschema, options, ctx.makeChild(subschema));
result.importErrors(res);
}
}
return result;
};
/**
* Validates properties
* @param instance
@@ -191,12 +264,17 @@ validators.properties = function validateProperties (instance, schema, options,
var result = new ValidatorResult(instance, schema, options, ctx);
var properties = schema.properties || {};
for (var property in properties) {
if (typeof options.preValidateProperty == 'function') {
options.preValidateProperty(instance, property, properties[property], options, ctx);
var subschema = properties[property];
if(subschema===undefined){
continue;
}else if(subschema===null){
throw new SchemaError('Unexpected null, expected schema in "properties"');
}
var prop = Object.hasOwnProperty.call(instance, property) ? instance[property] : undefined;
var res = this.validateSchema(prop, properties[property], options, ctx.makeChild(properties[property], property));
if (typeof options.preValidateProperty == 'function') {
options.preValidateProperty(instance, property, subschema, options, ctx);
}
var prop = getEnumerableProperty(instance, property);
var res = this.validateSchema(prop, subschema, options, ctx.makeChild(subschema, property));
if(res.instance !== result.instance[property]) result.instance[property] = res.instance;
result.importErrors(res);
}
@@ -206,7 +284,7 @@ validators.properties = function validateProperties (instance, schema, options,
/**
* Test a specific property within an instance against the additionalProperties schema attribute
* This ignores properties with definitions in the properties schema attribute, but no other attributes.
* If too many more types of property-existance tests pop up they may need their own class of tests (like `type` has)
* If too many more types of property-existence tests pop up they may need their own class of tests (like `type` has)
* @private
* @return {boolean}
*/
@@ -219,7 +297,7 @@ function testAdditionalProperty (instance, schema, options, ctx, property, resul
result.addError({
name: 'additionalProperties',
argument: property,
message: "additionalProperty " + JSON.stringify(property) + " exists in instance when not allowed",
message: "is not allowed to have the additional property " + JSON.stringify(property),
});
} else {
var additionalProperties = schema.additionalProperties || {};
@@ -250,17 +328,29 @@ validators.patternProperties = function validatePatternProperties (instance, sch
for (var property in instance) {
var test = true;
for (var pattern in patternProperties) {
var expr = new RegExp(pattern);
if (!expr.test(property)) {
var subschema = patternProperties[pattern];
if(subschema===undefined){
continue;
}else if(subschema===null){
throw new SchemaError('Unexpected null, expected schema in "patternProperties"');
}
try {
var regexp = new RegExp(pattern, 'u');
} catch(_e) {
// In the event the stricter handling causes an error, fall back on the forgiving handling
// DEPRECATED
regexp = new RegExp(pattern);
}
if (!regexp.test(property)) {
continue;
}
test = false;
if (typeof options.preValidateProperty == 'function') {
options.preValidateProperty(instance, property, patternProperties[pattern], options, ctx);
options.preValidateProperty(instance, property, subschema, options, ctx);
}
var res = this.validateSchema(instance[property], patternProperties[pattern], options, ctx.makeChild(patternProperties[pattern], property));
var res = this.validateSchema(instance[property], subschema, options, ctx.makeChild(subschema, property));
if(res.instance !== result.instance[property]) result.instance[property] = res.instance;
result.importErrors(res);
}
@@ -308,7 +398,7 @@ validators.minProperties = function validateMinProperties (instance, schema, opt
name: 'minProperties',
argument: schema.minProperties,
message: "does not meet minimum property length of " + schema.minProperties,
})
});
}
return result;
};
@@ -344,10 +434,14 @@ validators.maxProperties = function validateMaxProperties (instance, schema, opt
validators.items = function validateItems (instance, schema, options, ctx) {
var self = this;
if (!this.types.array(instance)) return;
if (!schema.items) return;
if (schema.items===undefined) return;
var result = new ValidatorResult(instance, schema, options, ctx);
instance.every(function (value, i) {
var items = Array.isArray(schema.items) ? (schema.items[i] || schema.additionalItems) : schema.items;
if(Array.isArray(schema.items)){
var items = schema.items[i]===undefined ? schema.additionalItems : schema.items[i];
}else{
var items = schema.items;
}
if (items === undefined) {
return true;
}
@@ -366,6 +460,34 @@ validators.items = function validateItems (instance, schema, options, ctx) {
return result;
};
/**
* Validates the "contains" keyword
* @param instance
* @param schema
* @param options
* @param ctx
* @return {String|null|ValidatorResult}
*/
validators.contains = function validateContains (instance, schema, options, ctx) {
var self = this;
if (!this.types.array(instance)) return;
if (schema.contains===undefined) return;
if (!helpers.isSchema(schema.contains)) throw new Error('Expected "contains" keyword to be a schema');
var result = new ValidatorResult(instance, schema, options, ctx);
var count = instance.some(function (value, i) {
var res = self.validateSchema(value, schema.contains, options, ctx.makeChild(schema.contains, i));
return res.errors.length===0;
});
if(count===false){
result.addError({
name: 'contains',
argument: schema.contains,
message: "must contain an item matching given schema",
});
}
return result;
};
/**
* Validates minimum and exclusiveMinimum when the type of the instance value is a number.
* @param instance
@@ -375,18 +497,22 @@ validators.items = function validateItems (instance, schema, options, ctx) {
validators.minimum = function validateMinimum (instance, schema, options, ctx) {
if (!this.types.number(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
var valid = true;
if (schema.exclusiveMinimum && schema.exclusiveMinimum === true) {
valid = instance > schema.minimum;
if(!(instance > schema.minimum)){
result.addError({
name: 'minimum',
argument: schema.minimum,
message: "must be greater than " + schema.minimum,
});
}
} else {
valid = instance >= schema.minimum;
}
if (!valid) {
result.addError({
name: 'minimum',
argument: schema.minimum,
message: "must have a minimum value of " + schema.minimum,
});
if(!(instance >= schema.minimum)){
result.addError({
name: 'minimum',
argument: schema.minimum,
message: "must be greater than or equal to " + schema.minimum,
});
}
}
return result;
};
@@ -400,17 +526,65 @@ validators.minimum = function validateMinimum (instance, schema, options, ctx) {
validators.maximum = function validateMaximum (instance, schema, options, ctx) {
if (!this.types.number(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
var valid;
if (schema.exclusiveMaximum && schema.exclusiveMaximum === true) {
valid = instance < schema.maximum;
if(!(instance < schema.maximum)){
result.addError({
name: 'maximum',
argument: schema.maximum,
message: "must be less than " + schema.maximum,
});
}
} else {
valid = instance <= schema.maximum;
if(!(instance <= schema.maximum)){
result.addError({
name: 'maximum',
argument: schema.maximum,
message: "must be less than or equal to " + schema.maximum,
});
}
}
return result;
};
/**
* Validates the number form of exclusiveMinimum when the type of the instance value is a number.
* @param instance
* @param schema
* @return {String|null}
*/
validators.exclusiveMinimum = function validateExclusiveMinimum (instance, schema, options, ctx) {
// Support the boolean form of exclusiveMinimum, which is handled by the "minimum" keyword.
if(typeof schema.exclusiveMinimum === 'boolean') return;
if (!this.types.number(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
var valid = instance > schema.exclusiveMinimum;
if (!valid) {
result.addError({
name: 'maximum',
argument: schema.maximum,
message: "must have a maximum value of " + schema.maximum,
name: 'exclusiveMinimum',
argument: schema.exclusiveMinimum,
message: "must be strictly greater than " + schema.exclusiveMinimum,
});
}
return result;
};
/**
* Validates the number form of exclusiveMaximum when the type of the instance value is a number.
* @param instance
* @param schema
* @return {String|null}
*/
validators.exclusiveMaximum = function validateExclusiveMaximum (instance, schema, options, ctx) {
// Support the boolean form of exclusiveMaximum, which is handled by the "maximum" keyword.
if(typeof schema.exclusiveMaximum === 'boolean') return;
if (!this.types.number(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
var valid = instance < schema.exclusiveMaximum;
if (!valid) {
result.addError({
name: 'exclusiveMaximum',
argument: schema.exclusiveMaximum,
message: "must be strictly less than " + schema.exclusiveMaximum,
});
}
return result;
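A short sketch of the draft-06 numeric exclusiveMinimum/exclusiveMaximum forms handled by the new validators above, assuming the vendored jsonschema 1.4.1 API:

const { validate } = require('jsonschema');

// Numeric form: the bound itself is excluded.
console.log(validate(5, { exclusiveMinimum: 5 }).valid); // false: must be strictly greater than 5
console.log(validate(6, { exclusiveMinimum: 5 }).valid); // true

// Boolean form from older drafts: still handled by the minimum/maximum validators.
console.log(validate(5, { minimum: 5, exclusiveMinimum: true }).valid); // false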
@@ -444,7 +618,7 @@ var validateMultipleOfOrDivisbleBy = function validateMultipleOfOrDivisbleBy (in
result.addError({
name: validationType,
argument: validationArgument,
message: errorMessage + JSON.stringify(validationArgument)
message: errorMessage + JSON.stringify(validationArgument),
});
}
@@ -458,7 +632,7 @@ var validateMultipleOfOrDivisbleBy = function validateMultipleOfOrDivisbleBy (in
* @return {String|null}
*/
validators.multipleOf = function validateMultipleOf (instance, schema, options, ctx) {
return validateMultipleOfOrDivisbleBy.call(this, instance, schema, options, ctx, "multipleOf", "is not a multiple of (divisible by) ");
return validateMultipleOfOrDivisbleBy.call(this, instance, schema, options, ctx, "multipleOf", "is not a multiple of (divisible by) ");
};
/**
@@ -480,14 +654,14 @@ validators.divisibleBy = function validateDivisibleBy (instance, schema, options
validators.required = function validateRequired (instance, schema, options, ctx) {
var result = new ValidatorResult(instance, schema, options, ctx);
if (instance === undefined && schema.required === true) {
// A boolean form is implemented for reverse-compatability with schemas written against older drafts
// A boolean form is implemented for reverse-compatibility with schemas written against older drafts
result.addError({
name: 'required',
message: "is required"
message: "is required",
});
} else if (this.types.object(instance) && Array.isArray(schema.required)) {
schema.required.forEach(function(n){
if(instance[n]===undefined){
if(getEnumerableProperty(instance, n)===undefined){
result.addError({
name: 'required',
argument: n,
@@ -508,7 +682,15 @@ validators.required = function validateRequired (instance, schema, options, ctx)
validators.pattern = function validatePattern (instance, schema, options, ctx) {
if (!this.types.string(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
if (!instance.match(schema.pattern)) {
var pattern = schema.pattern;
try {
var regexp = new RegExp(pattern, 'u');
} catch(_e) {
// In the event the stricter handling causes an error, fall back on the forgiving handling
// DEPRECATED
regexp = new RegExp(pattern);
}
if (!instance.match(regexp)) {
result.addError({
name: 'pattern',
argument: schema.pattern,
@@ -633,32 +815,6 @@ validators.maxItems = function validateMaxItems (instance, schema, options, ctx)
return result;
};
/**
* Validates that every item in an instance array is unique, when instance is an array
* @param instance
* @param schema
* @param options
* @param ctx
* @return {String|null|ValidatorResult}
*/
validators.uniqueItems = function validateUniqueItems (instance, schema, options, ctx) {
if (!this.types.array(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
function testArrays (v, i, a) {
for (var j = i + 1; j < a.length; j++) if (helpers.deepCompareStrict(v, a[j])) {
return false;
}
return true;
}
if (!instance.every(testArrays)) {
result.addError({
name: 'uniqueItems',
message: "contains duplicate item",
});
}
return result;
};
/**
* Deep compares arrays for duplicates
* @param v
@@ -683,6 +839,7 @@ function testArrays (v, i, a) {
* @return {String|null}
*/
validators.uniqueItems = function validateUniqueItems (instance, schema, options, ctx) {
if (schema.uniqueItems!==true) return;
if (!this.types.array(instance)) return;
var result = new ValidatorResult(instance, schema, options, ctx);
if (!instance.every(testArrays)) {
@@ -806,7 +963,8 @@ validators.not = validators.disallow = function validateNot (instance, schema, o
if(!Array.isArray(notTypes)) notTypes=[notTypes];
notTypes.forEach(function (type) {
if (self.testType(instance, schema, options, ctx, type)) {
var schemaId = type && type.id && ('<' + type.id + '>') || type;
var id = type && (type.$id || type.id);
var schemaId = id || type;
result.addError({
name: 'not',
argument: schemaId,


@@ -2,21 +2,23 @@
var uri = require('url');
var ValidationError = exports.ValidationError = function ValidationError (message, instance, schema, propertyPath, name, argument) {
if (propertyPath) {
this.property = propertyPath;
var ValidationError = exports.ValidationError = function ValidationError (message, instance, schema, path, name, argument) {
if(Array.isArray(path)){
this.path = path;
this.property = path.reduce(function(sum, item){
return sum + makeSuffix(item);
}, 'instance');
}else if(path !== undefined){
this.property = path;
}
if (message) {
this.message = message;
}
if (schema) {
if (schema.id) {
this.schema = schema.id;
} else {
this.schema = schema;
}
var id = schema.$id || schema.id;
this.schema = id || schema;
}
if (instance) {
if (instance !== undefined) {
this.instance = instance;
}
this.name = name;
@@ -31,27 +33,33 @@ ValidationError.prototype.toString = function toString() {
var ValidatorResult = exports.ValidatorResult = function ValidatorResult(instance, schema, options, ctx) {
this.instance = instance;
this.schema = schema;
this.options = options;
this.path = ctx.path;
this.propertyPath = ctx.propertyPath;
this.errors = [];
this.throwError = options && options.throwError;
this.throwFirst = options && options.throwFirst;
this.throwAll = options && options.throwAll;
this.disableFormat = options && options.disableFormat === true;
};
ValidatorResult.prototype.addError = function addError(detail) {
var err;
if (typeof detail == 'string') {
err = new ValidationError(detail, this.instance, this.schema, this.propertyPath);
err = new ValidationError(detail, this.instance, this.schema, this.path);
} else {
if (!detail) throw new Error('Missing error detail');
if (!detail.message) throw new Error('Missing error message');
if (!detail.name) throw new Error('Missing validator type');
err = new ValidationError(detail.message, this.instance, this.schema, this.propertyPath, detail.name, detail.argument);
err = new ValidationError(detail.message, this.instance, this.schema, this.path, detail.name, detail.argument);
}
if (this.throwError) {
this.errors.push(err);
if (this.throwFirst) {
throw new ValidatorResultError(this);
}else if(this.throwError){
throw err;
}
this.errors.push(err);
return err;
};
@@ -59,7 +67,7 @@ ValidatorResult.prototype.importErrors = function importErrors(res) {
if (typeof res == 'string' || (res && res.validatorType)) {
this.addError(res);
} else if (res && res.errors) {
Array.prototype.push.apply(this.errors, res.errors);
this.errors = this.errors.concat(res.errors);
}
};
@@ -74,6 +82,20 @@ Object.defineProperty(ValidatorResult.prototype, "valid", { get: function() {
return !this.errors.length;
} });
module.exports.ValidatorResultError = ValidatorResultError;
function ValidatorResultError(result) {
if(Error.captureStackTrace){
Error.captureStackTrace(this, ValidatorResultError);
}
this.instance = result.instance;
this.schema = result.schema;
this.options = result.options;
this.errors = result.errors;
}
ValidatorResultError.prototype = new Error();
ValidatorResultError.prototype.constructor = ValidatorResultError;
ValidatorResultError.prototype.name = "Validation Error";
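The new throwFirst/throwAll options surface collected errors as a ValidatorResultError; a hedged usage sketch, assuming the vendored jsonschema 1.4.1 exports:

const { validate, ValidatorResultError } = require('jsonschema');

try {
  validate(
    { port: 'not-a-number' },
    { type: 'object', properties: { port: { type: 'integer' } } },
    { throwAll: true }
  );
} catch (e) {
  if (e instanceof ValidatorResultError) {
    // All collected errors are available on the thrown result.
    console.log(e.errors.map((err) => `${err.property} ${err.message}`));
  }
}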
/**
* Describes a problem with a Schema which prevents validation of an instance
* @name SchemaError
@@ -86,14 +108,22 @@ var SchemaError = exports.SchemaError = function SchemaError (msg, schema) {
Error.captureStackTrace(this, SchemaError);
};
SchemaError.prototype = Object.create(Error.prototype,
{ constructor: {value: SchemaError, enumerable: false}
, name: {value: 'SchemaError', enumerable: false}
{
constructor: {value: SchemaError, enumerable: false},
name: {value: 'SchemaError', enumerable: false},
});
var SchemaContext = exports.SchemaContext = function SchemaContext (schema, options, propertyPath, base, schemas) {
var SchemaContext = exports.SchemaContext = function SchemaContext (schema, options, path, base, schemas) {
this.schema = schema;
this.options = options;
this.propertyPath = propertyPath;
if(Array.isArray(path)){
this.path = path;
this.propertyPath = path.reduce(function(sum, item){
return sum + makeSuffix(item);
}, 'instance');
}else{
this.propertyPath = path;
}
this.base = base;
this.schemas = schemas;
};
@@ -103,36 +133,60 @@ SchemaContext.prototype.resolve = function resolve (target) {
};
SchemaContext.prototype.makeChild = function makeChild(schema, propertyName){
var propertyPath = (propertyName===undefined) ? this.propertyPath : this.propertyPath+makeSuffix(propertyName);
var base = uri.resolve(this.base, schema.id||'');
var ctx = new SchemaContext(schema, this.options, propertyPath, base, Object.create(this.schemas));
if(schema.id && !ctx.schemas[base]){
var path = (propertyName===undefined) ? this.path : this.path.concat([propertyName]);
var id = schema.$id || schema.id;
var base = uri.resolve(this.base, id||'');
var ctx = new SchemaContext(schema, this.options, path, base, Object.create(this.schemas));
if(id && !ctx.schemas[base]){
ctx.schemas[base] = schema;
}
return ctx;
}
};
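With $id now recognised alongside the legacy id, subschemas registered under their $id can be targeted by $ref; a sketch assuming the vendored jsonschema 1.4.1 Validator API:

const { Validator } = require('jsonschema');

const v = new Validator();
v.addSchema({ $id: '/Port', type: 'integer', minimum: 1, maximum: 65535 }, '/Port');

const result = v.validate(
  { port: 70000 },
  { type: 'object', properties: { port: { $ref: '/Port' } } }
);
console.log(result.valid); // false: 70000 exceeds the referenced maximum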
var FORMAT_REGEXPS = exports.FORMAT_REGEXPS = {
// 7.3.1. Dates, Times, and Duration
'date-time': /^\d{4}-(?:0[0-9]{1}|1[0-2]{1})-(3[01]|0[1-9]|[12][0-9])[tT ](2[0-4]|[01][0-9]):([0-5][0-9]):(60|[0-5][0-9])(\.\d+)?([zZ]|[+-]([0-5][0-9]):(60|[0-5][0-9]))$/,
'date': /^\d{4}-(?:0[0-9]{1}|1[0-2]{1})-(3[01]|0[1-9]|[12][0-9])$/,
'time': /^(2[0-4]|[01][0-9]):([0-5][0-9]):(60|[0-5][0-9])$/,
'duration': /P(T\d+(H(\d+M(\d+S)?)?|M(\d+S)?|S)|\d+(D|M(\d+D)?|Y(\d+M(\d+D)?)?)(T\d+(H(\d+M(\d+S)?)?|M(\d+S)?|S))?|\d+W)/i,
// 7.3.2. Email Addresses
// TODO: fix the email production
'email': /^(?:[\w\!\#\$\%\&\'\*\+\-\/\=\?\^\`\{\|\}\~]+\.)*[\w\!\#\$\%\&\'\*\+\-\/\=\?\^\`\{\|\}\~]+@(?:(?:(?:[a-zA-Z0-9](?:[a-zA-Z0-9\-](?!\.)){0,61}[a-zA-Z0-9]?\.)+[a-zA-Z0-9](?:[a-zA-Z0-9\-](?!$)){0,61}[a-zA-Z0-9]?)|(?:\[(?:(?:[01]?\d{1,2}|2[0-4]\d|25[0-5])\.){3}(?:[01]?\d{1,2}|2[0-4]\d|25[0-5])\]))$/,
'ip-address': /^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$/,
'ipv6': /^\s*((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:)))(%.+)?\s*$/,
'uri': /^[a-zA-Z][a-zA-Z0-9+-.]*:[^\s]*$/,
'idn-email': /^("(?:[!#-\[\]-\u{10FFFF}]|\\[\t -\u{10FFFF}])*"|[!#-'*+\-/-9=?A-Z\^-\u{10FFFF}](?:\.?[!#-'*+\-/-9=?A-Z\^-\u{10FFFF}])*)@([!#-'*+\-/-9=?A-Z\^-\u{10FFFF}](?:\.?[!#-'*+\-/-9=?A-Z\^-\u{10FFFF}])*|\[[!-Z\^-\u{10FFFF}]*\])$/u,
'color': /^(#?([0-9A-Fa-f]{3}){1,2}\b|aqua|black|blue|fuchsia|gray|green|lime|maroon|navy|olive|orange|purple|red|silver|teal|white|yellow|(rgb\(\s*\b([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\b\s*,\s*\b([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\b\s*,\s*\b([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\b\s*\))|(rgb\(\s*(\d?\d%|100%)+\s*,\s*(\d?\d%|100%)+\s*,\s*(\d?\d%|100%)+\s*\)))$/,
// 7.3.3. Hostnames
// 7.3.4. IP Addresses
'ip-address': /^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$/,
// FIXME whitespace is invalid
'ipv6': /^\s*((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:)))(%.+)?\s*$/,
// 7.3.5. Resource Identifiers
// TODO: A more accurate regular expression for "uri" goes:
// [A-Za-z][+\-.0-9A-Za-z]*:((/(/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?)?)?#(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*|(/(/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?[/?]|[!$&-.0-;=?-Z_a-z~])|/?%[0-9A-Fa-f]{2}|[!$&-.0-;=?-Z_a-z~])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*(#(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*)?|/(/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+(:\d*)?|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?:\d*|\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)?)?
'uri': /^[a-zA-Z][a-zA-Z0-9+.-]*:[^\s]*$/,
'uri-reference': /^(((([A-Za-z][+\-.0-9A-Za-z]*(:%[0-9A-Fa-f]{2}|:[!$&-.0-;=?-Z_a-z~]|[/?])|\?)(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*|([A-Za-z][+\-.0-9A-Za-z]*:?)?)|([A-Za-z][+\-.0-9A-Za-z]*:)?\/((%[0-9A-Fa-f]{2}|\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?[/?]|[!$&-.0-;=?-Z_a-z~])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*|(\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?)?))#(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*|(([A-Za-z][+\-.0-9A-Za-z]*)?%[0-9A-Fa-f]{2}|[!$&-.0-9;=@_~]|[A-Za-z][+\-.0-9A-Za-z]*[!$&-*,;=@_~])(%[0-9A-Fa-f]{2}|[!$&-.0-9;=@-Z_a-z~])*((([/?](%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*)?#|[/?])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*)?|([A-Za-z][+\-.0-9A-Za-z]*(:%[0-9A-Fa-f]{2}|:[!$&-.0-;=?-Z_a-z~]|[/?])|\?)(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*|([A-Za-z][+\-.0-9A-Za-z]*:)?\/((%[0-9A-Fa-f]{2}|\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?[/?]|[!$&-.0-;=?-Z_a-z~])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~])*|\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~])+(:\d*)?|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?:\d*|\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~]+)?|[.0-:A-Fa-f]+)\])?)?|[A-Za-z][+\-.0-9A-Za-z]*:?)?$/,
'iri': /^[a-zA-Z][a-zA-Z0-9+.-]*:[^\s]*$/,
'iri-reference': /^(((([A-Za-z][+\-.0-9A-Za-z]*(:%[0-9A-Fa-f]{2}|:[!$&-.0-;=?-Z_a-z~-\u{10FFFF}]|[/?])|\?)(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*|([A-Za-z][+\-.0-9A-Za-z]*:?)?)|([A-Za-z][+\-.0-9A-Za-z]*:)?\/((%[0-9A-Fa-f]{2}|\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~-\u{10FFFF}])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~-\u{10FFFF}]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?[/?]|[!$&-.0-;=?-Z_a-z~-\u{10FFFF}])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*|(\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~-\u{10FFFF}])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~-\u{10FFFF}]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?)?))#(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*|(([A-Za-z][+\-.0-9A-Za-z]*)?%[0-9A-Fa-f]{2}|[!$&-.0-9;=@_~-\u{10FFFF}]|[A-Za-z][+\-.0-9A-Za-z]*[!$&-*,;=@_~-\u{10FFFF}])(%[0-9A-Fa-f]{2}|[!$&-.0-9;=@-Z_a-z~-\u{10FFFF}])*((([/?](%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*)?#|[/?])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*)?|([A-Za-z][+\-.0-9A-Za-z]*(:%[0-9A-Fa-f]{2}|:[!$&-.0-;=?-Z_a-z~-\u{10FFFF}]|[/?])|\?)(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*|([A-Za-z][+\-.0-9A-Za-z]*:)?\/((%[0-9A-Fa-f]{2}|\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~-\u{10FFFF}])+|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~-\u{10FFFF}]+)?|[.0-:A-Fa-f]+)\])?)(:\d*)?[/?]|[!$&-.0-;=?-Z_a-z~-\u{10FFFF}])(%[0-9A-Fa-f]{2}|[!$&-;=?-Z_a-z~-\u{10FFFF}])*|\/((%[0-9A-Fa-f]{2}|[!$&-.0-9;=A-Z_a-z~-\u{10FFFF}])+(:\d*)?|(\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~-\u{10FFFF}]+)?|[.0-:A-Fa-f]+)\])?:\d*|\[(([Vv][0-9A-Fa-f]+\.[!$&-.0-;=A-Z_a-z~-\u{10FFFF}]+)?|[.0-:A-Fa-f]+)\])?)?|[A-Za-z][+\-.0-9A-Za-z]*:?)?$/u,
'uuid': /^[0-9A-F]{8}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{12}$/i,
// 7.3.6. uri-template
'uri-template': /(%[0-9a-f]{2}|[!#$&(-;=?@\[\]_a-z~]|\{[!#&+,./;=?@|]?(%[0-9a-f]{2}|[0-9_a-z])(\.?(%[0-9a-f]{2}|[0-9_a-z]))*(:[1-9]\d{0,3}|\*)?(,(%[0-9a-f]{2}|[0-9_a-z])(\.?(%[0-9a-f]{2}|[0-9_a-z]))*(:[1-9]\d{0,3}|\*)?)*\})*/iu,
// 7.3.7. JSON Pointers
'json-pointer': /^(\/([\x00-\x2e0-@\[-}\x7f]|~[01])*)*$/iu,
'relative-json-pointer': /^\d+(#|(\/([\x00-\x2e0-@\[-}\x7f]|~[01])*)*)$/iu,
// hostname regex from: http://stackoverflow.com/a/1420225/5628
'hostname': /^(?=.{1,255}$)[0-9A-Za-z](?:(?:[0-9A-Za-z]|-){0,61}[0-9A-Za-z])?(?:\.[0-9A-Za-z](?:(?:[0-9A-Za-z]|-){0,61}[0-9A-Za-z])?)*\.?$/,
'host-name': /^(?=.{1,255}$)[0-9A-Za-z](?:(?:[0-9A-Za-z]|-){0,61}[0-9A-Za-z])?(?:\.[0-9A-Za-z](?:(?:[0-9A-Za-z]|-){0,61}[0-9A-Za-z])?)*\.?$/,
'alpha': /^[a-zA-Z]+$/,
'alphanumeric': /^[a-zA-Z0-9]+$/,
'utc-millisec': function (input) {
return (typeof input === 'string') && parseFloat(input) === parseInt(input, 10) && !isNaN(input);
},
// 7.3.8. regex
'regex': function (input) {
var result = true;
try {
@@ -142,8 +196,15 @@ var FORMAT_REGEXPS = exports.FORMAT_REGEXPS = {
}
return result;
},
'style': /\s*(.+?):\s*([^;]+);?/,
'phone': /^\+(?:[0-9] ?){6,14}[0-9]$/
// Other definitions
// "style" was removed from JSON Schema in draft-4 and is deprecated
'style': /[\r\n\t ]*[^\r\n\t ][^:]*:[\r\n\t ]*[^\r\n\t ;]*[\r\n\t ]*;?/,
// "color" was removed from JSON Schema in draft-4 and is deprecated
'color': /^(#?([0-9A-Fa-f]{3}){1,2}\b|aqua|black|blue|fuchsia|gray|green|lime|maroon|navy|olive|orange|purple|red|silver|teal|white|yellow|(rgb\(\s*\b([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\b\s*,\s*\b([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\b\s*,\s*\b([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\b\s*\))|(rgb\(\s*(\d?\d%|100%)+\s*,\s*(\d?\d%|100%)+\s*,\s*(\d?\d%|100%)+\s*\)))$/,
'phone': /^\+(?:[0-9] ?){6,14}[0-9]$/,
'alpha': /^[a-zA-Z]+$/,
'alphanumeric': /^[a-zA-Z0-9]+$/,
};
FORMAT_REGEXPS.regexp = FORMAT_REGEXPS.regex;
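The format table above adds the draft-07 formats (uuid, duration, json-pointer, and others), which are checked by default through the "format" keyword unless the disableFormat option shown earlier is set; a small sketch assuming the vendored jsonschema 1.4.1 API:

const { validate } = require('jsonschema');

const schema = { type: 'string', format: 'uuid' };
console.log(validate('123e4567-e89b-12d3-a456-426614174000', schema).valid); // true
console.log(validate('not-a-uuid', schema).valid);                           // false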
@@ -212,10 +273,10 @@ exports.deepCompareStrict = function deepCompareStrict (a, b) {
function deepMerger (target, dst, e, i) {
if (typeof e === 'object') {
dst[i] = deepMerge(target[i], e)
dst[i] = deepMerge(target[i], e);
} else {
if (target.indexOf(e) === -1) {
dst.push(e)
dst.push(e);
}
}
}
@@ -232,7 +293,7 @@ function copyistWithDeepMerge (target, src, dst, key) {
if (!target[key]) {
dst[key] = src[key];
} else {
dst[key] = deepMerge(target[key], src[key])
dst[key] = deepMerge(target[key], src[key]);
}
}
}
@@ -253,7 +314,7 @@ function deepMerge (target, src) {
}
return dst;
};
}
module.exports.deepMerge = deepMerge;
@@ -284,9 +345,9 @@ function pathEncoder (v) {
* @return {String}
*/
exports.encodePath = function encodePointer(a){
// ~ must be encoded explicitly because hacks
// the slash is encoded by encodeURIComponent
return a.map(pathEncoder).join('');
// ~ must be encoded explicitly because hacks
// the slash is encoded by encodeURIComponent
return a.map(pathEncoder).join('');
};
@@ -323,3 +384,7 @@ exports.getDecimalPlaces = function getDecimalPlaces(number) {
return decimalPlaces;
};
exports.isSchema = function isSchema(val){
return (typeof val === 'object' && val) || (typeof val === 'boolean');
};


@@ -29,6 +29,7 @@ export declare class ValidatorResult {
export declare class ValidationError {
constructor(message?: string, instance?: any, schema?: Schema, propertyPath?: any, name?: string, argument?: any);
path: (string|number)[];
property: string;
message: string;
schema: string|Schema;
@@ -48,6 +49,7 @@ export declare class SchemaError extends Error{
export declare function validate(instance: any, schema: any, options?: Options): ValidatorResult
export interface Schema {
$id?: string
id?: string
$schema?: string
$ref?: string
@@ -55,9 +57,9 @@ export interface Schema {
description?: string
multipleOf?: number
maximum?: number
exclusiveMaximum?: boolean
exclusiveMaximum?: number | boolean
minimum?: number
exclusiveMinimum?: boolean
exclusiveMinimum?: number | boolean
maxLength?: number
minLength?: number
pattern?: string | RegExp
@@ -82,6 +84,7 @@ export interface Schema {
dependencies?: {
[name: string]: Schema | string[]
}
const?: any
'enum'?: any[]
type?: string | string[]
format?: string
@@ -89,27 +92,39 @@ export interface Schema {
anyOf?: Schema[]
oneOf?: Schema[]
not?: Schema
if?: Schema
then?: Schema
else?: Schema
}
export interface Options {
skipAttributes?: string[];
allowUnknownAttributes?: boolean;
preValidateProperty?: PreValidatePropertyFunction;
rewrite?: RewriteFunction;
propertyName?: string;
base?: string;
throwError?: boolean;
required?: boolean;
throwFirst?: boolean;
throwAll?: boolean;
nestedErrors?: boolean;
}
export interface RewriteFunction {
(instance: any, schema: Schema, options: Options, ctx: SchemaContext): any;
}
export interface PreValidatePropertyFunction {
(instance: any, key: string, schema: Schema, options: Options, ctx: SchemaContext): any;
}
export interface SchemaContext {
schema: Schema;
options: Options;
propertyPath: string;
base: string;
schemas: {[base: string]: Schema};
makeChild: (schema: Schema, key: string) => SchemaContext;
}
export interface CustomFormat {


@@ -3,6 +3,7 @@
var Validator = module.exports.Validator = require('./validator');
module.exports.ValidatorResult = require('./helpers').ValidatorResult;
module.exports.ValidatorResultError = require('./helpers').ValidatorResultError;
module.exports.ValidationError = require('./helpers').ValidationError;
module.exports.SchemaError = require('./helpers').SchemaError;
module.exports.SchemaScanResult = require('./scan').SchemaScanResult;


@@ -1,3 +1,4 @@
"use strict";
var urilib = require('url');
var helpers = require('./helpers');
@@ -23,13 +24,14 @@ module.exports.scan = function scan(base, schema){
ref[resolvedUri] = ref[resolvedUri] ? ref[resolvedUri]+1 : 0;
return;
}
var ourBase = schema.id ? urilib.resolve(baseuri, schema.id) : baseuri;
var id = schema.$id || schema.id;
var ourBase = id ? urilib.resolve(baseuri, id) : baseuri;
if (ourBase) {
// If there's no fragment, append an empty one
if(ourBase.indexOf('#')<0) ourBase += '#';
if(found[ourBase]){
if(!helpers.deepCompareStrict(found[ourBase], schema)){
throw new Error('Schema <'+schema+'> already exists with different definition');
throw new Error('Schema <'+ourBase+'> already exists with different definition');
}
return found[ourBase];
}
@@ -68,7 +70,6 @@ module.exports.scan = function scan(base, schema){
var found = {};
var ref = {};
var schemaUri = base;
scanSchema(base, schema);
return new SchemaScanResult(found, ref);
}
};


@@ -6,6 +6,7 @@ var attribute = require('./attribute');
var helpers = require('./helpers');
var scanSchema = require('./scan').scan;
var ValidatorResult = helpers.ValidatorResult;
var ValidatorResultError = helpers.ValidatorResultError;
var SchemaError = helpers.SchemaError;
var SchemaContext = helpers.SchemaContext;
//var anonymousBase = 'vnd.jsonschema:///';
@@ -49,13 +50,15 @@ Validator.prototype.addSchema = function addSchema (schema, base) {
return null;
}
var scan = scanSchema(base||anonymousBase, schema);
var ourUri = base || schema.id;
var ourUri = base || schema.$id || schema.id;
for(var uri in scan.id){
this.schemas[uri] = scan.id[uri];
}
for(var uri in scan.ref){
// If this schema is already defined, it will be filtered out by the next step
this.unresolvedRefs.push(uri);
}
// Remove newly defined schemas from unresolvedRefs
this.unresolvedRefs = this.unresolvedRefs.filter(function(uri){
return typeof self.schemas[uri]==='undefined';
});
@@ -103,14 +106,18 @@ Validator.prototype.getSchema = function getSchema (urn) {
* @return {Array}
*/
Validator.prototype.validate = function validate (instance, schema, options, ctx) {
if((typeof schema !== 'boolean' && typeof schema !== 'object') || schema === null){
throw new SchemaError('Expected `schema` to be an object or boolean');
}
if (!options) {
options = {};
}
var propertyName = options.propertyName || 'instance';
// This section indexes subschemas in the provided schema, so they don't need to be added with Validator#addSchema
// This will work so long as the function at uri.resolve() will resolve a relative URI to a relative URI
var base = urilib.resolve(options.base||anonymousBase, schema.id||'');
var id = schema.$id || schema.id;
var base = urilib.resolve(options.base||anonymousBase, id||'');
if(!ctx){
ctx = new SchemaContext(schema, options, propertyName, base, Object.create(this.schemas));
ctx = new SchemaContext(schema, options, [], base, Object.create(this.schemas));
if (!ctx.schemas[base]) {
ctx.schemas[base] = schema;
}
@@ -120,14 +127,18 @@ Validator.prototype.validate = function validate (instance, schema, options, ctx
ctx.schemas[n] = sch;
}
}
if (schema) {
var result = this.validateSchema(instance, schema, options, ctx);
if (!result) {
throw new Error('Result undefined');
}
if(options.required && instance===undefined){
var result = new ValidatorResult(instance, schema, options, ctx);
result.addError('is required, but is undefined');
return result;
}
throw new SchemaError('no schema specified', schema);
var result = this.validateSchema(instance, schema, options, ctx);
if (!result) {
throw new Error('Result undefined');
}else if(options.throwAll && result.errors.length){
throw new ValidatorResultError(result);
}
return result;
};
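validate now rejects non-object, non-boolean schemas up front, supports the boolean schema forms, and honours options.required for undefined instances; a hedged sketch assuming the vendored jsonschema 1.4.1 API:

const { validate } = require('jsonschema');

console.log(validate(42, true).valid);  // true:  the `true` schema accepts everything
console.log(validate(42, false).valid); // false: the `false` schema rejects everything

// With options.required, an undefined instance is reported instead of passing silently.
console.log(validate(undefined, { type: 'string' }, { required: true }).errors[0].message);
// 'is required, but is undefined'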
/**
@@ -152,7 +163,7 @@ function shouldResolve(schema) {
Validator.prototype.validateSchema = function validateSchema (instance, schema, options, ctx) {
var result = new ValidatorResult(instance, schema, options, ctx);
// Support for the true/false schemas
// Support for the true/false schemas
if(typeof schema==='boolean') {
if(schema===true){
// `true` is always valid
@@ -180,10 +191,10 @@ Validator.prototype.validateSchema = function validateSchema (instance, schema,
}
// If passed a string argument, load that schema URI
var switchSchema;
if (switchSchema = shouldResolve(schema)) {
var switchSchema = shouldResolve(schema);
if (switchSchema) {
var resolved = this.resolve(schema, switchSchema, ctx);
var subctx = new SchemaContext(resolved.subschema, options, ctx.propertyPath, resolved.switchSchema, ctx.schemas);
var subctx = new SchemaContext(resolved.subschema, options, ctx.path, resolved.switchSchema, ctx.schemas);
return this.validateSchema(instance, resolved.subschema, options, subctx);
}
@@ -220,7 +231,7 @@ Validator.prototype.validateSchema = function validateSchema (instance, schema,
*/
Validator.prototype.schemaTraverser = function schemaTraverser (schemaobj, s) {
schemaobj.schema = helpers.deepMerge(schemaobj.schema, this.superResolve(s, schemaobj.ctx));
}
};
/**
* @private
@@ -229,12 +240,12 @@ Validator.prototype.schemaTraverser = function schemaTraverser (schemaobj, s) {
* @returns Object schema or resolved schema
*/
Validator.prototype.superResolve = function superResolve (schema, ctx) {
var ref;
if(ref = shouldResolve(schema)) {
var ref = shouldResolve(schema);
if(ref) {
return this.resolve(schema, ref, ctx).subschema;
}
return schema;
}
};
/**
* @private
@@ -275,6 +286,11 @@ Validator.prototype.resolve = function resolve (schema, switchSchema, ctx) {
* @return {boolean}
*/
Validator.prototype.testType = function validateType (instance, schema, options, ctx, type) {
if(type===undefined){
return;
}else if(type===null){
throw new SchemaError('Unexpected null in "type" keyword');
}
if (typeof this.types[type] == 'function') {
return this.types[type].call(this, instance);
}

node_modules/jsonschema/package.json generated vendored

@@ -1,7 +1,7 @@
{
"author": "Tom de Grunt <tom@degrunt.nl>",
"name": "jsonschema",
"version": "1.2.6",
"version": "1.4.1",
"license": "MIT",
"dependencies": {},
"contributors": [
@@ -9,12 +9,15 @@
"name": "Austin Wright"
}
],
"main": "./lib",
"main": "./lib/index.js",
"typings": "./lib/index.d.ts",
"devDependencies": {
"@stryker-mutator/core": "^4.0.0",
"@stryker-mutator/mocha-runner": "^4.0.0",
"chai": "~4.2.0",
"eslint": "^7.7.0",
"json-metaschema": "^1.2.0",
"mocha": "~3",
"chai": "~1.5.0"
"mocha": "~8.1.1"
},
"optionalDependencies": {},
"engines": {
@@ -33,6 +36,7 @@
},
"description": "A fast and easy to use JSON Schema validator",
"scripts": {
"stryker": "stryker run",
"test": "./node_modules/.bin/mocha -R spec"
}
}

package-lock.json generated

@@ -1,12 +1,12 @@
{
"name": "codeql",
"version": "2.2.13",
"version": "2.3.4",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "codeql",
"version": "2.2.13",
"version": "2.3.4",
"license": "MIT",
"dependencies": {
"@actions/artifact": "^1.1.0",
@@ -29,7 +29,7 @@
"fs": "0.0.1-security",
"get-folder-size": "^2.0.1",
"js-yaml": "^4.1.0",
"jsonschema": "1.2.6",
"jsonschema": "1.4.1",
"long": "^5.2.0",
"path": "^0.12.7",
"semver": "^7.3.2",
@@ -4202,8 +4202,9 @@
}
},
"node_modules/jsonschema": {
"version": "1.2.6",
"integrity": "sha512-SqhURKZG07JyKKeo/ir24QnS4/BV7a6gQy93bUSe4lUdNp0QNpIz2c9elWJQ9dpc5cQYY6cvCzgRwy0MQCLyqA==",
"version": "1.4.1",
"resolved": "https://registry.npmjs.org/jsonschema/-/jsonschema-1.4.1.tgz",
"integrity": "sha512-S6cATIPVv1z0IlxdN+zUk5EPjkGCdnhN4wVSBlvoUO1tOLJootbo9CquNJmbIh4yikWHiUedhRYrNPn1arpEmQ==",
"engines": {
"node": "*"
}


@@ -1,6 +1,6 @@
{
"name": "codeql",
"version": "2.2.13",
"version": "2.3.4",
"private": true,
"description": "CodeQL action",
"scripts": {
@@ -41,7 +41,7 @@
"fs": "0.0.1-security",
"get-folder-size": "^2.0.1",
"js-yaml": "^4.1.0",
"jsonschema": "1.2.6",
"jsonschema": "1.4.1",
"long": "^5.2.0",
"path": "^0.12.7",
"semver": "^7.3.2",


@@ -1,8 +1,6 @@
name: "Export file baseline information"
description: "Tests that file baseline information is exported when the feature is enabled"
versions: ["nightly-latest"]
env:
CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT: true # Remove when Swift is GA.
steps:
- uses: ./../action/init
id: init
@@ -32,7 +30,10 @@ steps:
shell: bash
run: |
cd "$RUNNER_TEMP/results"
expected_baseline_languages="cpp cs go java js py rb swift"
expected_baseline_languages="cpp cs go java js py rb"
if [[ $RUNNER_OS != "Windows" ]]; then
expected_baseline_languages+=" swift"
fi
for lang in ${expected_baseline_languages}; do
rule_name="${lang}/baseline/expected-extracted-files"


@@ -1,12 +1,5 @@
name: "ML-powered queries"
description: "Tests that ML-powered queries are run with the security-extended suite and that they produce alerts on a test DB"
versions: [
# Latest release in 2.7.x series
"stable-20220120",
"cached",
"latest",
"nightly-latest",
]
steps:
- uses: ./../action/init
with:
@@ -30,7 +23,7 @@ steps:
- name: Check sarif
uses: ./../action/.github/actions/check-sarif
# Running on Windows requires CodeQL CLI 2.9.0+.
if: "!(matrix.version == 'stable-20220120' && runner.os == 'Windows')"
if: "!(matrix.version == 'stable-20220401' && runner.os == 'Windows')"
with:
sarif-file: ${{ runner.temp }}/results/javascript.sarif
queries-run: js/ml-powered/nosql-injection,js/ml-powered/path-injection,js/ml-powered/sql-injection,js/ml-powered/xss
@@ -39,7 +32,7 @@ steps:
- name: Check results
env:
# Running on Windows requires CodeQL CLI 2.9.0+.
SHOULD_RUN_ML_POWERED_QUERIES: ${{ !(matrix.version == 'stable-20220120' && runner.os == 'Windows') }}
SHOULD_RUN_ML_POWERED_QUERIES: ${{ !(matrix.version == 'stable-20220401' && runner.os == 'Windows') }}
shell: bash
run: |
echo "Expecting ML-powered queries to be run: ${SHOULD_RUN_ML_POWERED_QUERIES}"


@@ -1,8 +1,6 @@
name: "Multi-language repository"
description: "An end-to-end integration test of a multi-language repository using automatic language detection"
operatingSystems: ["ubuntu", "macos"]
env:
CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT: "true" # Remove when Swift is GA.
steps:
- uses: ./../action/init
id: init
@@ -12,7 +10,7 @@ steps:
- uses: ./../action/.github/actions/setup-swift
with:
codeql-path: ${{steps.init.outputs.codeql-path}}
codeql-path: ${{ steps.init.outputs.codeql-path }}
- name: Build code
shell: bash
@@ -58,7 +56,7 @@ steps:
fi
- name: Check language autodetect for Ruby
if: "(matrix.version == 'cached' || matrix.version == 'latest' || matrix.version == 'nightly-latest')"
if: env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true'
shell: bash
run: |
RUBY_DB=${{ fromJson(steps.analysis.outputs.db-locations).ruby }}
@@ -68,7 +66,7 @@ steps:
fi
- name: Check language autodetect for Swift
if: "(matrix.version == 'cached' || matrix.version == 'latest' || matrix.version == 'nightly-latest')"
if: env.CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT == 'true'
shell: bash
run: |
SWIFT_DB=${{ fromJson(steps.analysis.outputs.db-locations).swift }}


@@ -3,7 +3,6 @@ description: "Tests creation of a Swift database using custom build"
versions: ["latest", "cached", "nightly-latest"]
operatingSystems: ["ubuntu", "macos"]
env:
CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT: "true" # Remove when Swift is GA.
DOTNET_GENERATE_ASPNET_CERTIFICATE: "false"
steps:
- uses: ./../action/init


@@ -9,9 +9,13 @@ steps:
CODEQL_URL: ${{ steps.prepare-test.outputs.tools-url }}
run: |
wget "$CODEQL_URL"
- uses: ./../action/init
- id: init
uses: ./../action/init
with:
tools: ./codeql-bundle.tar.gz
- uses: ./../action/.github/actions/setup-swift
with:
codeql-path: ${{ steps.init.outputs.codeql-path }}
- name: Build code
shell: bash
run: ./build.sh


@@ -3,9 +3,13 @@ description: "An end-to-end integration test that unsets some environment variab
operatingSystems: ["ubuntu"]
steps:
- uses: ./../action/init
id: init
with:
db-location: ${{ runner.temp }}/customDbLocation
tools: ${{ steps.prepare-test.outputs.tools-url }}
- uses: ./../action/.github/actions/setup-swift
with:
codeql-path: ${{ steps.init.outputs.codeql-path }}
- name: Build code
shell: bash
# Disable Kotlin analysis while it's incompatible with Kotlin 1.8, until we find a


@@ -1,14 +1,18 @@
import ruamel.yaml
from ruamel.yaml.scalarstring import FoldedScalarString
import os
import textwrap
# The default set of CodeQL Bundle versions to use for the PR checks.
defaultTestVersions = [
# The oldest supported CodeQL version: 2.6.3. If bumping, update `CODEQL_MINIMUM_VERSION` in `codeql.ts`
"stable-20211005",
# The last CodeQL release in the 2.7 series: 2.7.6.
"stable-20220120",
# The last CodeQL release in the 2.8 series: 2.8.5.
# The oldest supported CodeQL version: 2.8.5. If bumping, update `CODEQL_MINIMUM_VERSION` in `codeql.ts`
"stable-20220401",
# The last CodeQL release in the 2.9 series: 2.9.4.
"stable-20220615",
# The last CodeQL release in the 2.10 series: 2.10.5.
"stable-20220908",
# The last CodeQL release in the 2.11 series: 2.11.6.
"stable-20221211",
# The version of CodeQL currently in the toolcache. Typically either the latest release or the one before.
"cached",
# The latest release of CodeQL.
@@ -18,22 +22,6 @@ defaultTestVersions = [
]
def isCompatibleWithLatestImages(version):
if version in ["cached", "latest", "nightly-latest"]:
return True
date = version.split("-")[1]
# The first version of the CodeQL CLI compatible with `ubuntu-22.04` and `windows-2022` is
# 2.8.2. This appears in CodeQL Bundle version codeql-bundle-20220224.
return date >= "20220224"
def operatingSystemsForVersion(version):
if isCompatibleWithLatestImages(version):
return ["ubuntu-latest", "macos-latest", "windows-latest"]
else:
return ["ubuntu-20.04", "macos-latest", "windows-2019"]
header = """# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pip install ruamel.yaml && python3 sync.py
@@ -53,6 +41,7 @@ def writeHeader(checkStream):
yaml = ruamel.yaml.YAML()
yaml.Representer = NonAliasingRTRepresenter
allJobs = {}
for file in os.listdir('checks'):
with open(f"checks/{file}", 'r') as checkStream:
@@ -60,7 +49,7 @@ for file in os.listdir('checks'):
matrix = []
for version in checkSpecification.get('versions', defaultTestVersions):
runnerImages = operatingSystemsForVersion(version)
runnerImages = ["ubuntu-latest", "macos-latest", "windows-latest"]
if checkSpecification.get('operatingSystems', None):
runnerImages = [image for image in runnerImages for operatingSystem in checkSpecification['operatingSystems']
if image.startswith(operatingSystem)]
@@ -83,19 +72,26 @@ for file in os.listdir('checks'):
'with': {
'version': '${{ matrix.version }}'
}
}
},
# We don't support Swift on Windows or prior versions of the CLI.
{
'name': 'Set environment variable for Swift enablement',
# Ensure that this is serialized as a folded (`>`) string to preserve the readability
# of the generated workflow.
'if': FoldedScalarString(textwrap.dedent('''
runner.os != 'Windows' && (
matrix.version == '20220908' ||
matrix.version == '20221211' ||
matrix.version == 'cached' ||
matrix.version == 'latest' ||
matrix.version == 'nightly-latest'
)
''').strip()),
'shell': 'bash',
'run': 'echo "CODEQL_ENABLE_EXPERIMENTAL_FEATURES_SWIFT=true" >> $GITHUB_ENV'
},
]
if any(not isCompatibleWithLatestImages(m['version']) for m in matrix):
steps.append({
'name': 'Set up Go',
'if': "matrix.os == 'ubuntu-20.04' || matrix.os == 'windows-2019'",
'uses': 'actions/setup-go@v4',
'with': {
'go-version': '^1.13.1'
}
})
steps.extend(checkSpecification['steps'])
checkJob = {


@@ -21,7 +21,11 @@ import {
parseMatrixInput,
UserError,
} from "./util";
import { getWorkflowRelativePath } from "./workflow";
import {
getWorkflowRunID,
getWorkflowRunAttempt,
getWorkflowRelativePath,
} from "./workflow";
// eslint-disable-next-line import/no-commonjs
const pkg = require("../package.json") as JSONSchemaForNPMPackageJsonFiles;
@@ -407,16 +411,8 @@ export async function createStatusReportBase(
): Promise<StatusReportBase> {
const commitOid = getOptionalInput("sha") || process.env["GITHUB_SHA"] || "";
const ref = await getRef();
const workflowRunIDStr = process.env["GITHUB_RUN_ID"];
let workflowRunID = -1;
if (workflowRunIDStr) {
workflowRunID = parseInt(workflowRunIDStr, 10);
}
const workflowRunAttemptStr = process.env["GITHUB_RUN_ATTEMPT"];
let workflowRunAttempt = -1;
if (workflowRunAttemptStr) {
workflowRunAttempt = parseInt(workflowRunAttemptStr, 10);
}
const workflowRunID = getWorkflowRunID();
const workflowRunAttempt = getWorkflowRunAttempt();
const workflowName = process.env["GITHUB_WORKFLOW"] || "";
const jobName = process.env["GITHUB_JOB"] || "";
const analysis_key = await getAnalysisKey();
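The run ID and attempt parsing moves from createStatusReportBase into getWorkflowRunID/getWorkflowRunAttempt in workflow.ts; a hedged sketch of what those helpers presumably do, mirroring the inline logic they replace (the real implementations may differ):

function getWorkflowRunID() {
  // Mirrors the removed inline parsing of GITHUB_RUN_ID above.
  const workflowRunIDStr = process.env["GITHUB_RUN_ID"];
  return workflowRunIDStr ? parseInt(workflowRunIDStr, 10) : -1;
}

function getWorkflowRunAttempt() {
  // Mirrors the removed inline parsing of GITHUB_RUN_ATTEMPT above.
  const workflowRunAttemptStr = process.env["GITHUB_RUN_ATTEMPT"];
  return workflowRunAttemptStr ? parseInt(workflowRunAttemptStr, 10) : -1;
}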


@@ -16,7 +16,7 @@ import {
} from "./analyze";
import { getApiDetails, getGitHubVersion } from "./api-client";
import { runAutobuild } from "./autobuild";
import { enrichEnvironment, getCodeQL } from "./codeql";
import { getCodeQL } from "./codeql";
import { Config, getConfig } from "./config-utils";
import { uploadDatabases } from "./database-upload";
import { Features } from "./feature-flags";
@@ -207,8 +207,6 @@ async function run() {
);
}
await enrichEnvironment(await getCodeQL(config.codeQLCmd));
const apiDetails = getApiDetails();
const outputDir = actionsUtil.getRequiredInput("output");
const threads = util.getThreadsFlag(


@@ -8,12 +8,11 @@ import * as yaml from "js-yaml";
import { DatabaseCreationTimings } from "./actions-util";
import * as analysisPaths from "./analysis-paths";
import { CodeQL, CODEQL_VERSION_NEW_TRACING, getCodeQL } from "./codeql";
import { CodeQL, getCodeQL } from "./codeql";
import * as configUtils from "./config-utils";
import { FeatureEnablement } from "./feature-flags";
import { isScannedLanguage, Language } from "./languages";
import { Logger } from "./logging";
import * as sharedEnv from "./shared-environment";
import { endTracingForCluster } from "./tracer-config";
import * as util from "./util";
@@ -493,19 +492,13 @@ export async function runFinalize(
logger
);
const codeql = await getCodeQL(config.codeQLCmd);
// WARNING: This does not _really_ end tracing, as the tracer will restore its
// critical environment variables and it'll still be active for all processes
// launched from this build step.
// However, it will stop tracing for all steps past the codeql-action/analyze
// step.
if (await util.codeQlVersionAbove(codeql, CODEQL_VERSION_NEW_TRACING)) {
// Delete variables as specified by the end-tracing script
await endTracingForCluster(config);
} else {
// Delete the tracer config env var to avoid tracing ourselves
delete process.env[sharedEnv.ODASA_TRACER_CONFIGURATION];
}
// Delete variables as specified by the end-tracing script
await endTracingForCluster(config);
return timings;
}


@@ -1,7 +1,6 @@
import * as fs from "fs";
import * as path from "path";
import * as core from "@actions/core";
import * as toolrunner from "@actions/exec/lib/toolrunner";
import * as yaml from "js-yaml";
@@ -18,7 +17,6 @@ import { ToolsSource } from "./init";
import { isTracedLanguage, Language } from "./languages";
import { Logger } from "./logging";
import * as setupCodeql from "./setup-codeql";
import { EnvVar } from "./shared-environment";
import { toolrunnerErrorCatcher } from "./toolrunner-error-catcher";
import {
getTrapCachingExtractorConfigArgs,
@@ -77,19 +75,6 @@ export interface CodeQL {
* Print version information about CodeQL.
*/
printVersion(): Promise<void>;
/**
* Run 'codeql database trace-command' on 'tracer-env.js' and parse
* the result to get environment variables set by CodeQL.
*/
getTracerEnv(databasePath: string): Promise<{ [key: string]: string }>;
/**
* Run 'codeql database init'.
*/
databaseInit(
databasePath: string,
language: Language,
sourceRoot: string
): Promise<void>;
/**
* Run 'codeql database init --db-cluster'.
*/
@@ -267,7 +252,7 @@ let cachedCodeQL: CodeQL | undefined = undefined;
* The version flags below can be used to conditionally enable certain features
* on versions newer than this.
*/
const CODEQL_MINIMUM_VERSION = "2.6.3";
const CODEQL_MINIMUM_VERSION = "2.8.5";
/**
* Versions of CodeQL that version-flag certain functionality in the Action.
@@ -280,23 +265,6 @@ const CODEQL_VERSION_LUA_TRACING_GO_WINDOWS_FIXED = "2.10.4";
export const CODEQL_VERSION_GHES_PACK_DOWNLOAD = "2.10.4";
const CODEQL_VERSION_FILE_BASELINE_INFORMATION = "2.11.3";
/**
* This variable controls using the new style of tracing from the CodeQL
* CLI. In particular, with versions above this we will use both indirect
* tracing, and multi-language tracing together with database clusters.
*
* Note that there were bugs in both of these features that were fixed in
* release 2.7.0 of the CodeQL CLI, therefore this flag is only enabled for
* versions above that.
*/
export const CODEQL_VERSION_NEW_TRACING = "2.7.0";
/**
* Versions 2.7.3+ of the CodeQL CLI support build tracing with glibc 2.34 on Linux. Versions before
* this cannot perform build tracing when running on the Actions `ubuntu-22.04` runner image.
*/
export const CODEQL_VERSION_TRACING_GLIBC_2_34 = "2.7.3";
/**
* Versions 2.9.0+ of the CodeQL CLI run machine learning models from a temporary directory, which
* resolves an issue on Windows where TensorFlow models are not correctly loaded due to the path of
@@ -372,8 +340,9 @@ export async function setupCodeQL(
toolsVersion,
};
} catch (e) {
logger.error(wrapError(e).message);
throw new Error("Unable to download and extract CodeQL CLI");
throw new Error(
`Unable to download and extract CodeQL CLI: ${wrapError(e).message}`
);
}
}
@@ -419,8 +388,6 @@ export function setCodeQL(partialCodeql: Partial<CodeQL>): CodeQL {
() => new Promise((resolve) => resolve("1.0.0"))
),
printVersion: resolveFunction(partialCodeql, "printVersion"),
getTracerEnv: resolveFunction(partialCodeql, "getTracerEnv"),
databaseInit: resolveFunction(partialCodeql, "databaseInit"),
databaseInitCluster: resolveFunction(partialCodeql, "databaseInitCluster"),
runAutobuild: resolveFunction(partialCodeql, "runAutobuild"),
extractScannedLanguage: resolveFunction(
@@ -507,94 +474,6 @@ export async function getCodeQLForCmd(
async printVersion() {
await runTool(cmd, ["version", "--format=json"]);
},
async getTracerEnv(databasePath: string) {
// Write tracer-env.js to a temp location.
// BEWARE: The name and location of this file is recognized by `codeql database
// trace-command` in order to enable special support for concatenable tracer
// configurations. Consequently the name must not be changed.
// (This warning can be removed once a different way to recognize the
// action/runner has been implemented in `codeql database trace-command`
// _and_ is present in the latest supported CLI release.)
const tracerEnvJs = path.resolve(
databasePath,
"working",
"tracer-env.js"
);
fs.mkdirSync(path.dirname(tracerEnvJs), { recursive: true });
fs.writeFileSync(
tracerEnvJs,
`
const fs = require('fs');
const env = {};
for (let entry of Object.entries(process.env)) {
const key = entry[0];
const value = entry[1];
if (typeof value !== 'undefined' && key !== '_' && !key.startsWith('JAVA_MAIN_CLASS_')) {
env[key] = value;
}
}
process.stdout.write(process.argv[2]);
fs.writeFileSync(process.argv[2], JSON.stringify(env), 'utf-8');`
);
// BEWARE: The name and location of this file is recognized by `codeql database
// trace-command` in order to enable special support for concatenable tracer
// configurations. Consequently the name must not be changed.
// (This warning can be removed once a different way to recognize the
// action/runner has been implemented in `codeql database trace-command`
// _and_ is present in the latest supported CLI release.)
const envFile = path.resolve(databasePath, "working", "env.tmp");
try {
await runTool(cmd, [
"database",
"trace-command",
databasePath,
...getExtraOptionsFromEnv(["database", "trace-command"]),
process.execPath,
tracerEnvJs,
envFile,
]);
} catch (e) {
if (
e instanceof CommandInvocationError &&
e.output.includes(
"undefined symbol: __libc_dlopen_mode, version GLIBC_PRIVATE"
) &&
process.platform === "linux" &&
!(await util.codeQlVersionAbove(
this,
CODEQL_VERSION_TRACING_GLIBC_2_34
))
) {
throw new util.UserError(
"The CodeQL CLI is incompatible with the version of glibc on your system. " +
`Please upgrade to CodeQL CLI version ${CODEQL_VERSION_TRACING_GLIBC_2_34} or ` +
"later. If you cannot upgrade to a newer version of the CodeQL CLI, you can " +
`alternatively run your workflow on another runner image such as "ubuntu-20.04" ` +
"that has glibc 2.33 or earlier installed."
);
} else {
throw e;
}
}
return JSON.parse(fs.readFileSync(envFile, "utf-8"));
},
async databaseInit(
databasePath: string,
language: Language,
sourceRoot: string
) {
await runTool(cmd, [
"database",
"init",
databasePath,
`--language=${language}`,
`--source-root=${sourceRoot}`,
...getExtraOptionsFromEnv(["database", "init"]),
]);
},
async databaseInitCluster(
config: Config,
sourceRoot: string,
@@ -1305,17 +1184,3 @@ async function getCodeScanningConfigExportArguments(
}
return [];
}
/**
* Enrich the environment variables with further flags that we cannot
* know the value of until we know what version of CodeQL we're running.
*/
export async function enrichEnvironment(codeql: CodeQL) {
if (await util.codeQlVersionAbove(codeql, CODEQL_VERSION_NEW_TRACING)) {
core.exportVariable(EnvVar.FEATURE_MULTI_LANGUAGE, "false");
core.exportVariable(EnvVar.FEATURE_SANDWICH, "false");
} else {
core.exportVariable(EnvVar.FEATURE_MULTI_LANGUAGE, "true");
core.exportVariable(EnvVar.FEATURE_SANDWICH, "true");
}
}


@@ -105,11 +105,12 @@ test("load empty config", async (t) => {
undefined,
undefined,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -130,7 +131,7 @@ test("load empty config", async (t) => {
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -176,11 +177,12 @@ test("loading config saves config", async (t) => {
undefined,
undefined,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -214,11 +216,12 @@ test("load input outside of workspace", async (t) => {
undefined,
"../input",
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
getCachedCodeQL(),
tmpDir,
@@ -254,11 +257,12 @@ test("load non-local input with invalid repo syntax", async (t) => {
undefined,
configFile,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
getCachedCodeQL(),
tmpDir,
@@ -295,11 +299,12 @@ test("load non-existent input", async (t) => {
undefined,
configFile,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
getCachedCodeQL(),
tmpDir,
@@ -402,11 +407,12 @@ test("load non-empty input", async (t) => {
undefined,
configFilePath,
undefined,
undefined,
false,
false,
"my-artifact",
"my-db",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -473,11 +479,12 @@ test("Default queries are used", async (t) => {
undefined,
configFilePath,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -552,11 +559,12 @@ test("Queries can be specified in config file", async (t) => {
undefined,
configFilePath,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -630,11 +638,12 @@ test("Queries from config file can be overridden in workflow file", async (t) =>
undefined,
configFilePath,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -706,11 +715,12 @@ test("Queries in workflow file can be used in tandem with the 'disable default q
undefined,
configFilePath,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -773,11 +783,12 @@ test("Multiple queries can be specified in workflow file, no config file require
undefined,
undefined,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -861,11 +872,12 @@ test("Queries in workflow file can be added to the set of queries without overri
undefined,
configFilePath,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -913,6 +925,181 @@ test("Queries in workflow file can be added to the set of queries without overri
});
});
test("Queries can be specified using config input", async (t) => {
return await util.withTmpDir(async (tmpDir) => {
const configInput = `
name: my config
queries:
- uses: ./foo
packs:
javascript:
- a/b@1.2.3
python:
- c/d@1.2.3
`;
fs.mkdirSync(path.join(tmpDir, "foo"));
const resolveQueriesArgs: Array<{
queries: string[];
extraSearchPath: string | undefined;
}> = [];
const codeQL = setCodeQL({
async resolveQueries(
queries: string[],
extraSearchPath: string | undefined
) {
resolveQueriesArgs.push({ queries, extraSearchPath });
return queriesToResolvedQueryForm(queries);
},
async packDownload(): Promise<PackDownloadOutput> {
return { packs: [] };
},
});
// Only JS, python packs will be ignored
const languages = "javascript";
const config = await configUtils.initConfig(
languages,
undefined,
undefined,
undefined,
undefined,
undefined,
configInput,
false,
false,
"",
"",
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
gitHubVersion,
sampleApiDetails,
createFeatures([]),
getRunnerLogger(true)
);
// Check resolveQueries was called correctly
// It'll be called once for the default queries
// and once for `./foo` from the config file.
t.deepEqual(resolveQueriesArgs.length, 2);
t.deepEqual(resolveQueriesArgs[1].queries.length, 1);
t.true(resolveQueriesArgs[1].queries[0].endsWith(`${path.sep}foo`));
t.deepEqual(config.packs as unknown, {
[Language.javascript]: ["a/b@1.2.3"],
});
// Now check that the end result contains the default queries and the query from config
t.deepEqual(config.queries["javascript"].builtin.length, 1);
t.deepEqual(config.queries["javascript"].custom.length, 1);
t.true(
config.queries["javascript"].builtin[0].endsWith(
"javascript-code-scanning.qls"
)
);
t.true(
config.queries["javascript"].custom[0].queries[0].endsWith(
`${path.sep}foo`
)
);
});
});
test("Using config input and file together, config input should be used.", async (t) => {
return await util.withTmpDir(async (tmpDir) => {
process.env["RUNNER_TEMP"] = tmpDir;
process.env["GITHUB_WORKSPACE"] = tmpDir;
const inputFileContents = `
name: my config
queries:
- uses: ./foo_file`;
const configFilePath = createConfigFile(inputFileContents, tmpDir);
const configInput = `
name: my config
queries:
  - uses: ./foo
packs:
  javascript:
    - a/b@1.2.3
  python:
    - c/d@1.2.3
`;
fs.mkdirSync(path.join(tmpDir, "foo"));
const resolveQueriesArgs: Array<{
queries: string[];
extraSearchPath: string | undefined;
}> = [];
const codeQL = setCodeQL({
async resolveQueries(
queries: string[],
extraSearchPath: string | undefined
) {
resolveQueriesArgs.push({ queries, extraSearchPath });
return queriesToResolvedQueryForm(queries);
},
async packDownload(): Promise<PackDownloadOutput> {
return { packs: [] };
},
});
// Only JS is analyzed, so the python packs will be ignored
const languages = "javascript";
const config = await configUtils.initConfig(
languages,
undefined,
undefined,
undefined,
undefined,
configFilePath,
configInput,
false,
false,
"",
"",
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
gitHubVersion,
sampleApiDetails,
createFeatures([]),
getRunnerLogger(true)
);
// Check resolveQueries was called correctly
// It'll be called once for the default queries
// and once for `./foo` from the config file.
t.deepEqual(resolveQueriesArgs.length, 2);
t.deepEqual(resolveQueriesArgs[1].queries.length, 1);
t.true(resolveQueriesArgs[1].queries[0].endsWith(`${path.sep}foo`));
t.deepEqual(config.packs as unknown, {
[Language.javascript]: ["a/b@1.2.3"],
});
// Now check that the end result contains the default queries and the query from config
t.deepEqual(config.queries["javascript"].builtin.length, 1);
t.deepEqual(config.queries["javascript"].custom.length, 1);
t.true(
config.queries["javascript"].builtin[0].endsWith(
"javascript-code-scanning.qls"
)
);
t.true(
config.queries["javascript"].custom[0].queries[0].endsWith(
`${path.sep}foo`
)
);
});
});
test("Invalid queries in workflow file handled correctly", async (t) => {
return await util.withTmpDir(async (tmpDir) => {
const queries = "foo/bar@v1@v3";
@@ -943,11 +1130,12 @@ test("Invalid queries in workflow file handled correctly", async (t) => {
undefined,
undefined,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -1015,11 +1203,12 @@ test("API client used when reading remote config", async (t) => {
undefined,
configFile,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -1046,11 +1235,12 @@ test("Remote config handles the case where a directory is provided", async (t) =
undefined,
repoReference,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
getCachedCodeQL(),
tmpDir,
@@ -1085,11 +1275,12 @@ test("Invalid format of remote config handled correctly", async (t) => {
undefined,
repoReference,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
getCachedCodeQL(),
tmpDir,
@@ -1128,11 +1319,12 @@ test("No detected languages", async (t) => {
undefined,
undefined,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -1160,11 +1352,12 @@ test("Unknown languages", async (t) => {
undefined,
undefined,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
getCachedCodeQL(),
tmpDir,
@@ -1217,11 +1410,12 @@ test("Config specifies packages", async (t) => {
undefined,
configFile,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -1278,6 +1472,7 @@ test("Config specifies packages for multiple languages", async (t) => {
undefined,
configFile,
undefined,
undefined,
false,
false,
"",
@@ -1350,11 +1545,12 @@ function doInvalidInputTest(
undefined,
configFile,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,
@@ -1934,11 +2130,12 @@ const mlPoweredQueriesMacro = test.macro({
undefined,
undefined,
undefined,
undefined,
false,
false,
"",
"",
{ owner: "github", repo: "example " },
{ owner: "github", repo: "example" },
tmpDir,
codeQL,
tmpDir,

View File

@@ -1690,6 +1690,7 @@ export async function initConfig(
registriesInput: string | undefined,
configFile: string | undefined,
dbLocation: string | undefined,
configInput: string | undefined,
trapCachingEnabled: boolean,
debugMode: boolean,
debugArtifactName: string,
@@ -1705,6 +1706,18 @@ export async function initConfig(
): Promise<Config> {
let config: Config;
// if configInput is set, it takes precedence over configFile
if (configInput) {
if (configFile) {
logger.warning(
`Both a config file and config input were provided. Ignoring config file.`
);
}
configFile = path.resolve(workspacePath, "user-config-from-action.yml");
fs.writeFileSync(configFile, configInput);
logger.debug(`Using config from action input: ${configFile}`);
}
// If no config file was provided create an empty one
if (!configFile) {
logger.debug("No configuration file was provided");

View File

@@ -8,13 +8,12 @@ import del from "del";
import { getRequiredInput } from "./actions-util";
import { dbIsFinalized } from "./analyze";
import { CODEQL_VERSION_NEW_TRACING, getCodeQL } from "./codeql";
import { getCodeQL } from "./codeql";
import { Config } from "./config-utils";
import { Language } from "./languages";
import { Logger } from "./logging";
import {
bundleDb,
codeQlVersionAbove,
doesDirectoryExist,
getCodeQLDatabasePath,
listFolder,
@@ -72,8 +71,6 @@ export async function uploadSarifDebugArtifact(
}
export async function uploadLogsDebugArtifact(config: Config) {
const codeql = await getCodeQL(config.codeQLCmd);
let toUpload: string[] = [];
for (const language of config.languages) {
const databaseDirectory = getCodeQLDatabasePath(config, language);
@@ -83,36 +80,20 @@ export async function uploadLogsDebugArtifact(config: Config) {
}
}
if (await codeQlVersionAbove(codeql, CODEQL_VERSION_NEW_TRACING)) {
// Multilanguage tracing: there are additional logs in the root of the cluster
const multiLanguageTracingLogsDirectory = path.resolve(
config.dbLocation,
"log"
);
if (doesDirectoryExist(multiLanguageTracingLogsDirectory)) {
toUpload = toUpload.concat(listFolder(multiLanguageTracingLogsDirectory));
}
// Multilanguage tracing: there are additional logs in the root of the cluster
const multiLanguageTracingLogsDirectory = path.resolve(
config.dbLocation,
"log"
);
if (doesDirectoryExist(multiLanguageTracingLogsDirectory)) {
toUpload = toUpload.concat(listFolder(multiLanguageTracingLogsDirectory));
}
await uploadDebugArtifacts(
toUpload,
config.dbLocation,
config.debugArtifactName
);
// Before multi-language tracing, we wrote a compound-build-tracer.log in the temp dir
if (!(await codeQlVersionAbove(codeql, CODEQL_VERSION_NEW_TRACING))) {
const compoundBuildTracerLogDirectory = path.resolve(
config.tempDir,
"compound-build-tracer.log"
);
if (doesDirectoryExist(compoundBuildTracerLogDirectory)) {
await uploadDebugArtifacts(
[compoundBuildTracerLogDirectory],
config.tempDir,
config.debugArtifactName
);
}
}
}
/**

View File

@@ -1,6 +1,6 @@
{
"bundleVersion": "codeql-bundle-20230403",
"cliVersion": "2.12.6",
"priorBundleVersion": "codeql-bundle-20230317",
"priorCliVersion": "2.12.5"
"bundleVersion": "codeql-bundle-20230428",
"cliVersion": "2.13.1",
"priorBundleVersion": "codeql-bundle-20230414",
"priorCliVersion": "2.13.0"
}

View File

@@ -13,17 +13,12 @@ import {
StatusReportBase,
} from "./actions-util";
import { getGitHubVersion } from "./api-client";
import {
CodeQL,
CODEQL_VERSION_NEW_TRACING,
enrichEnvironment,
} from "./codeql";
import { CodeQL } from "./codeql";
import * as configUtils from "./config-utils";
import { Feature, Features } from "./feature-flags";
import {
initCodeQL,
initConfig,
injectWindowsTracer,
installPythonDeps,
runInit,
ToolsSource,
@@ -35,7 +30,6 @@ import { getTotalCacheSize } from "./trap-caching";
import {
checkForTimeout,
checkGitHubVersionInRange,
codeQlVersionAbove,
DEFAULT_DEBUG_ARTIFACT_NAME,
DEFAULT_DEBUG_DATABASE_NAME,
getMemoryFlagValue,
@@ -252,7 +246,6 @@ async function run() {
toolsDownloadDurationMs = initCodeQLResult.toolsDownloadDurationMs;
toolsVersion = initCodeQLResult.toolsVersion;
toolsSource = initCodeQLResult.toolsSource;
await enrichEnvironment(codeql);
config = await initConfig(
getOptionalInput("languages"),
@@ -261,6 +254,7 @@ async function run() {
registriesInput,
getOptionalInput("config-file"),
getOptionalInput("db-location"),
getOptionalInput("config"),
getTrapCachingEnabled(),
// Debug mode is enabled if:
// - The `init` Action is passed `debug: true`.
@@ -356,19 +350,6 @@ async function run() {
for (const [key, value] of Object.entries(tracerConfig.env)) {
core.exportVariable(key, value);
}
if (
process.platform === "win32" &&
!(await codeQlVersionAbove(codeql, CODEQL_VERSION_NEW_TRACING))
) {
await injectWindowsTracer(
"Runner.Worker.exe",
undefined,
config,
codeql,
tracerConfig
);
}
}
core.setOutput("codeql-path", config.codeQLCmd);

View File

@@ -6,14 +6,13 @@ import * as safeWhich from "@chrisgavin/safe-which";
import * as analysisPaths from "./analysis-paths";
import { GitHubApiCombinedDetails, GitHubApiDetails } from "./api-client";
import { CodeQL, CODEQL_VERSION_NEW_TRACING, setupCodeQL } from "./codeql";
import { CodeQL, setupCodeQL } from "./codeql";
import * as configUtils from "./config-utils";
import { CodeQLDefaultVersionInfo, FeatureEnablement } from "./feature-flags";
import { Logger } from "./logging";
import { RepositoryNwo } from "./repository";
import { TracerConfig, getCombinedTracerConfig } from "./tracer-config";
import * as util from "./util";
import { codeQlVersionAbove } from "./util";
export enum ToolsSource {
Unknown = "UNKNOWN",
@@ -58,6 +57,7 @@ export async function initConfig(
registriesInput: string | undefined,
configFile: string | undefined,
dbLocation: string | undefined,
configInput: string | undefined,
trapCachingEnabled: boolean,
debugMode: boolean,
debugArtifactName: string,
@@ -79,6 +79,7 @@ export async function initConfig(
registriesInput,
configFile,
dbLocation,
configInput,
trapCachingEnabled,
debugMode,
debugArtifactName,
@@ -108,55 +109,43 @@ export async function runInit(
logger: Logger
): Promise<TracerConfig | undefined> {
fs.mkdirSync(config.dbLocation, { recursive: true });
try {
if (await codeQlVersionAbove(codeql, CODEQL_VERSION_NEW_TRACING)) {
// When parsing the codeql config in the CLI, we have not yet created the qlconfig file.
// So, create it now.
// If we are parsing the config file in the Action, then the qlconfig file was already created
// before the `pack download` command was invoked. It is not required for the init command.
let registriesAuthTokens: string | undefined;
let qlconfigFile: string | undefined;
if (await util.useCodeScanningConfigInCli(codeql, features)) {
({ registriesAuthTokens, qlconfigFile } =
await configUtils.generateRegistries(
registriesInput,
codeql,
config.tempDir,
logger
));
}
await configUtils.wrapEnvironment(
{
GITHUB_TOKEN: apiDetails.auth,
CODEQL_REGISTRIES_AUTH: registriesAuthTokens,
},
// Init a database cluster
async () =>
await codeql.databaseInitCluster(
config,
sourceRoot,
processName,
features,
qlconfigFile,
logger
)
);
} else {
for (const language of config.languages) {
// Init language database
await codeql.databaseInit(
util.getCodeQLDatabasePath(config, language),
language,
sourceRoot
);
}
// When parsing the codeql config in the CLI, we have not yet created the qlconfig file.
// So, create it now.
// If we are parsing the config file in the Action, then the qlconfig file was already created
// before the `pack download` command was invoked. It is not required for the init command.
let registriesAuthTokens: string | undefined;
let qlconfigFile: string | undefined;
if (await util.useCodeScanningConfigInCli(codeql, features)) {
({ registriesAuthTokens, qlconfigFile } =
await configUtils.generateRegistries(
registriesInput,
codeql,
config.tempDir,
logger
));
}
await configUtils.wrapEnvironment(
{
GITHUB_TOKEN: apiDetails.auth,
CODEQL_REGISTRIES_AUTH: registriesAuthTokens,
},
// Init a database cluster
async () =>
await codeql.databaseInitCluster(
config,
sourceRoot,
processName,
features,
qlconfigFile,
logger
)
);
} catch (e) {
throw processError(e);
}
return await getCombinedTracerConfig(config, codeql);
return await getCombinedTracerConfig(config);
}
/**
@@ -195,105 +184,6 @@ function processError(e: any): Error {
return e;
}
// Runs a powershell script to inject the tracer into a parent process
// so it can trace future processes, hopefully including the build process.
// If processName is given then injects into the nearest parent process with
// this name, otherwise uses the processLevel-th parent if defined, otherwise
// defaults to the 3rd parent as a rough guess.
export async function injectWindowsTracer(
processName: string | undefined,
processLevel: number | undefined,
config: configUtils.Config,
codeql: CodeQL,
tracerConfig: TracerConfig
) {
let script: string;
if (processName !== undefined) {
script = `
Param(
[Parameter(Position=0)]
[String]
$tracer
)
$id = $PID
while ($true) {
$p = Get-CimInstance -Class Win32_Process -Filter "ProcessId = $id"
Write-Host "Found process: $p"
if ($p -eq $null) {
throw "Could not determine ${processName} process"
}
if ($p[0].Name -eq "${processName}") {
Break
} else {
$id = $p[0].ParentProcessId
}
}
Write-Host "Final process: $p"
Invoke-Expression "&$tracer --inject=$id"`;
} else {
// If the level is not defined then guess at the 3rd parent process.
// This won't be correct in every setting but it should be enough in most settings,
// and overestimating is likely better in this situation so we definitely trace
// what we want, though this does run the risk of interfering with future CI jobs.
// Note that the default of 3 doesn't work on github actions, so we include a
// special case in the script that checks for Runner.Worker.exe so we can still work
// on actions if the runner is invoked there.
processLevel = processLevel || 3;
script = `
Param(
[Parameter(Position=0)]
[String]
$tracer
)
$id = $PID
for ($i = 0; $i -le ${processLevel}; $i++) {
$p = Get-CimInstance -Class Win32_Process -Filter "ProcessId = $id"
Write-Host "Parent process \${i}: $p"
if ($p -eq $null) {
throw "Process tree ended before reaching required level"
}
# Special case just in case the runner is used on actions
if ($p[0].Name -eq "Runner.Worker.exe") {
Write-Host "Found Runner.Worker.exe process which means we are running on GitHub Actions"
Write-Host "Aborting search early and using process: $p"
Break
} elseif ($p[0].Name -eq "Agent.Worker.exe") {
Write-Host "Found Agent.Worker.exe process which means we are running on Azure Pipelines"
Write-Host "Aborting search early and using process: $p"
Break
} else {
$id = $p[0].ParentProcessId
}
}
Write-Host "Final process: $p"
Invoke-Expression "&$tracer --inject=$id"`;
}
const injectTracerPath = path.join(config.tempDir, "inject-tracer.ps1");
fs.writeFileSync(injectTracerPath, script);
await new toolrunner.ToolRunner(
await safeWhich.safeWhich("powershell"),
[
"-ExecutionPolicy",
"Bypass",
"-file",
injectTracerPath,
path.resolve(
path.dirname(codeql.getPath()),
"tools",
"win64",
"tracer.exe"
),
],
{ env: { ODASA_TRACER_CONFIGURATION: tracerConfig.spec } }
).exec();
}
export async function installPythonDeps(codeql: CodeQL, logger: Logger) {
logger.startGroup("Setup Python dependencies");
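For context on the runInit hunks above: database initialization now always goes through codeql.databaseInitCluster, and any registry auth tokens produced by configUtils.generateRegistries are exposed only for the duration of that call via configUtils.wrapEnvironment. The sketch below illustrates that temporary-environment pattern with a hypothetical withTemporaryEnv helper; it is an assumption about the behaviour wrapEnvironment provides, not the Action's actual implementation.
// Hypothetical helper, for illustration only: sets environment variables for the
// duration of an async operation and restores the previous values afterwards.
async function withTemporaryEnv<T>(
  env: Record<string, string | undefined>,
  operation: () => Promise<T>
): Promise<T> {
  const previous: Record<string, string | undefined> = {};
  for (const [key, value] of Object.entries(env)) {
    previous[key] = process.env[key];
    if (value === undefined) {
      delete process.env[key];
    } else {
      process.env[key] = value;
    }
  }
  try {
    return await operation();
  } finally {
    for (const [key, value] of Object.entries(previous)) {
      if (value === undefined) {
        delete process.env[key];
      } else {
        process.env[key] = value;
      }
    }
  }
}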

View File

@@ -1,5 +1,5 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"$schema": "https://json-schema.org/draft/2020-12/schema",
"title": "Static Analysis Results Format (SARIF) Version 2.1.0 JSON Schema",
"$id": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json",
"description": "Static Analysis Results Format (SARIF) Version 2.1.0 JSON Schema: a standard format for the output of static analysis tools.",
@@ -15,13 +15,15 @@
"version": {
"description": "The SARIF format version of this log file.",
"enum": [ "2.1.0" ]
"enum": [ "2.1.0" ],
"type": "string"
},
"runs": {
"description": "The set of runs contained in this log file.",
"type": "array",
"type": [ "array", "null" ],
"minItems": 0,
"uniqueItems": false,
"items": {
"$ref": "#/definitions/run"
}
@@ -180,7 +182,8 @@
"userSpecifiedConfiguration",
"toolSpecifiedConfiguration",
"debugOutputFile"
]
],
"type": "string"
}
},
@@ -241,6 +244,7 @@
"description": "An array of replacement objects, each of which represents the replacement of a single region in a single artifact specified by 'artifactLocation'.",
"type": "array",
"minItems": 1,
"uniqueItems": false,
"items": {
"$ref": "#/definitions/replacement"
}
@@ -382,6 +386,7 @@
"description": "An array of one or more unique threadFlow objects, each of which describes the progress of a program through a thread of execution.",
"type": "array",
"minItems": 1,
"uniqueItems": false,
"items": {
"$ref": "#/definitions/threadFlow"
}
@@ -556,6 +561,7 @@
"description": "An array of exception objects each of which is considered a cause of this exception.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"$ref": "#/definitions/exception"
@@ -583,17 +589,18 @@
"version": {
"description": "The SARIF format version of this external properties object.",
"enum": [ "2.1.0" ]
"enum": [ "2.1.0" ],
"type": "string"
},
"guid": {
"description": "A stable, unique identifer for this external properties object, in the form of a GUID.",
"description": "A stable, unique identifier for this external properties object, in the form of a GUID.",
"type": "string",
"pattern": "^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[1-5][0-9a-fA-F]{3}-[89abAB][0-9a-fA-F]{3}-[0-9a-fA-F]{12}$"
},
"runGuid": {
"description": "A stable, unique identifer for the run associated with this external properties object, in the form of a GUID.",
"description": "A stable, unique identifier for the run associated with this external properties object, in the form of a GUID.",
"type": "string",
"pattern": "^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[1-5][0-9a-fA-F]{3}-[89abAB][0-9a-fA-F]{3}-[0-9a-fA-F]{12}$"
},
@@ -633,6 +640,7 @@
"description": "Describes the invocation of the analysis tool that will be merged with a separate run.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"$ref": "#/definitions/invocation"
@@ -665,6 +673,7 @@
"description": "An array of result objects that will be merged with a separate run.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"$ref": "#/definitions/result"
@@ -724,6 +733,7 @@
"description": "Addresses that will be merged with a separate run.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"$ref": "#/definitions/address"
@@ -771,7 +781,7 @@
},
"guid": {
"description": "A stable, unique identifer for the external property file in the form of a GUID.",
"description": "A stable, unique identifier for the external property file in the form of a GUID.",
"type": "string",
"pattern": "^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[1-5][0-9a-fA-F]{3}-[89abAB][0-9a-fA-F]{3}-[0-9a-fA-F]{12}$"
},
@@ -1079,6 +1089,7 @@
"description": "The sequences of edges traversed by this graph traversal.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"$ref": "#/definitions/edgeTraversal"
@@ -1111,6 +1122,7 @@
"description": "An array of strings, containing in order the command line arguments passed to the tool from the operating system.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"items": {
"type": "string"
}
@@ -1127,13 +1139,13 @@
},
"startTimeUtc": {
"description": "The Coordinated Universal Time (UTC) date and time at which the run started. See \"Date/time properties\" in the SARIF spec for the required format.",
"description": "The Coordinated Universal Time (UTC) date and time at which the invocation started. See \"Date/time properties\" in the SARIF spec for the required format.",
"type": "string",
"format": "date-time"
},
"endTimeUtc": {
"description": "The Coordinated Universal Time (UTC) date and time at which the run ended. See \"Date/time properties\" in the SARIF spec for the required format.",
"description": "The Coordinated Universal Time (UTC) date and time at which the invocation ended. See \"Date/time properties\" in the SARIF spec for the required format.",
"type": "string",
"format": "date-time"
},
@@ -1169,6 +1181,7 @@
"description": "A list of runtime conditions detected by the tool during the analysis.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"$ref": "#/definitions/notification"
@@ -1179,6 +1192,7 @@
"description": "A list of conditions detected by the tool that are relevant to the tool's configuration.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"$ref": "#/definitions/notification"
@@ -1211,27 +1225,27 @@
},
"machine": {
"description": "The machine that hosted the analysis tool run.",
"description": "The machine on which the invocation occurred.",
"type": "string"
},
"account": {
"description": "The account that ran the analysis tool.",
"description": "The account under which the invocation occurred.",
"type": "string"
},
"processId": {
"description": "The process id for the analysis tool run.",
"description": "The id of the process in which the invocation occurred.",
"type": "integer"
},
"executableLocation": {
"description": "An absolute URI specifying the location of the analysis tool's executable.",
"description": "An absolute URI specifying the location of the executable that was invoked.",
"$ref": "#/definitions/artifactLocation"
},
"workingDirectory": {
"description": "The working directory for the analysis tool run.",
"description": "The working directory for the invocation.",
"$ref": "#/definitions/artifactLocation"
},
@@ -1442,6 +1456,7 @@
"description": "An array of strings to substitute into the message string.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"type": "string"
@@ -1551,7 +1566,8 @@
"level": {
"description": "A value specifying the severity level of the notification.",
"default": "warning",
"enum": [ "none", "note", "warning", "error" ]
"enum": [ "none", "note", "warning", "error" ],
"type": "string"
},
"threadId": {
@@ -1762,7 +1778,13 @@
"properties": {
"description": "Key/value pairs that provide additional information about the region.",
"$ref": "#/definitions/propertyBag"
}
},
"anyOf": [
{ "required": [ "startLine" ] },
{ "required": [ "charOffset" ] },
{ "required": [ "byteOffset" ] }
]
}
},
@@ -1813,7 +1835,7 @@
},
"guid": {
"description": "A unique identifer for the reporting descriptor in the form of a GUID.",
"description": "A unique identifier for the reporting descriptor in the form of a GUID.",
"type": "string",
"pattern": "^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[1-5][0-9a-fA-F]{3}-[89abAB][0-9a-fA-F]{3}-[0-9a-fA-F]{12}$"
},
@@ -1912,7 +1934,8 @@
"level": {
"description": "Specifies the failure level for the report.",
"default": "warning",
"enum": [ "none", "note", "warning", "error" ]
"enum": [ "none", "note", "warning", "error" ],
"type": "string"
},
"rank": {
@@ -2017,7 +2040,7 @@
"properties": {
"ruleId": {
"description": "The stable, unique identifier of the rule, if any, to which this notification is relevant. This member can be used to retrieve rule metadata from the rules dictionary, if it exists.",
"description": "The stable, unique identifier of the rule, if any, to which this result is relevant.",
"type": "string"
},
@@ -2036,13 +2059,15 @@
"kind": {
"description": "A value that categorizes results by evaluation state.",
"default": "fail",
"enum": [ "notApplicable", "pass", "fail", "review", "open", "informational" ]
"enum": [ "notApplicable", "pass", "fail", "review", "open", "informational" ],
"type": "string"
},
"level": {
"description": "A value specifying the severity level of the result.",
"default": "warning",
"enum": [ "none", "note", "warning", "error" ]
"enum": [ "none", "note", "warning", "error" ],
"type": "string"
},
"message": {
@@ -2059,6 +2084,7 @@
"description": "The set of locations where the result was detected. Specify only one location unless the problem indicated by the result can only be corrected by making a change at every specified location.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"$ref": "#/definitions/location"
@@ -2066,7 +2092,7 @@
},
"guid": {
"description": "A stable, unique identifer for the result in the form of a GUID.",
"description": "A stable, unique identifier for the result in the form of a GUID.",
"type": "string",
"pattern": "^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[1-5][0-9a-fA-F]{3}-[89abAB][0-9a-fA-F]{3}-[0-9a-fA-F]{12}$"
},
@@ -2114,6 +2140,7 @@
"description": "An array of 'codeFlow' objects relevant to the result.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"$ref": "#/definitions/codeFlow"
@@ -2170,7 +2197,8 @@
"unchanged",
"updated",
"absent"
]
],
"type": "string"
},
"rank": {
@@ -2324,6 +2352,7 @@
"description": "Describes the invocation of the analysis tool.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"$ref": "#/definitions/invocation"
@@ -2341,6 +2370,7 @@
"default": "en-US",
"pattern": "^[a-zA-Z]{2}(-[a-zA-Z]{2})?$"
},
"versionControlProvenance": {
"description": "Specifies the revision in version control of the artifacts that were scanned.",
"type": "array",
@@ -2396,6 +2426,7 @@
"description": "The set of results contained in an SARIF log. The results array can be omitted when a run is solely exporting rules metadata. It must be present (but may be empty) if a log file represents an actual scan.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"items": {
"$ref": "#/definitions/result"
}
@@ -2457,7 +2488,8 @@
"columnKind": {
"description": "Specifies the unit in which the tool measures columns.",
"enum": [ "utf16CodeUnits", "unicodeCodePoints" ]
"enum": [ "utf16CodeUnits", "unicodeCodePoints" ],
"type": "string"
},
"externalPropertyFileReferences": {
@@ -2491,6 +2523,7 @@
"description": "Addresses associated with this run instance, if any.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"$ref": "#/definitions/address"
@@ -2572,7 +2605,7 @@
},
"guid": {
"description": "A stable, unique identifer for this object's containing run object in the form of a GUID.",
"description": "A stable, unique identifier for this object's containing run object in the form of a GUID.",
"type": "string",
"pattern": "^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[1-5][0-9a-fA-F]{3}-[89abAB][0-9a-fA-F]{3}-[0-9a-fA-F]{12}$"
},
@@ -2623,6 +2656,7 @@
"description": "An array of stack frames that represents a sequence of calls, rendered in reverse chronological order, that comprise the call stack.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"items": {
"$ref": "#/definitions/stackFrame"
}
@@ -2661,6 +2695,7 @@
"description": "The parameters of the call that is executing.",
"type": "array",
"minItems": 0,
"uniqueItems": false,
"default": [],
"items": {
"type": "string",
@@ -2682,7 +2717,7 @@
"properties": {
"guid": {
"description": "A stable, unique identifer for the suprression in the form of a GUID.",
"description": "A stable, unique identifier for the suprression in the form of a GUID.",
"type": "string",
"pattern": "^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[1-5][0-9a-fA-F]{3}-[89abAB][0-9a-fA-F]{3}-[0-9a-fA-F]{12}$"
},
@@ -2692,16 +2727,18 @@
"enum": [
"inSource",
"external"
]
],
"type": "string"
},
"state": {
"description": "A string that indicates the state of the suppression.",
"status": {
"description": "A string that indicates the review status of the suppression.",
"enum": [
"accepted",
"underReview",
"rejected"
]
],
"type": "string"
},
"justification": {
@@ -2759,6 +2796,7 @@
"description": "A temporally ordered array of 'threadFlowLocation' objects, each of which describes a location visited by the tool while producing the result.",
"type": "array",
"minItems": 1,
"uniqueItems": false,
"items": {
"$ref": "#/definitions/threadFlowLocation"
}
@@ -2853,7 +2891,8 @@
"importance": {
"description": "Specifies the importance of this location in understanding the code flow in which it occurs. The order from most to least important is \"essential\", \"important\", \"unimportant\". Default: \"important\".",
"enum": [ "important", "essential", "unimportant" ],
"default": "important"
"default": "important",
"type": "string"
},
"webRequest": {
@@ -2911,7 +2950,7 @@
"properties": {
"guid": {
"description": "A unique identifer for the tool component in the form of a GUID.",
"description": "A unique identifier for the tool component in the form of a GUID.",
"type": "string",
"pattern": "^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[1-5][0-9a-fA-F]{3}-[89abAB][0-9a-fA-F]{3}-[0-9a-fA-F]{12}$"
},
@@ -3051,7 +3090,8 @@
"enum": [
"localizedData",
"nonLocalizedData"
]
],
"type": "string"
}
},
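One behavioural consequence of the schema hunks above, beyond the explicit "type": "string" on enum-valued properties and the added "uniqueItems": false entries, is the new anyOf constraint on region: a region object must now supply at least one of startLine, charOffset, or byteOffset. A minimal illustrative value that would satisfy the updated schema (the property names come from the SARIF region definition; the concrete numbers are made up):
// Illustrative only: valid because it provides startLine; a region with none of
// startLine, charOffset, or byteOffset would now fail validation.
const exampleRegion = {
  startLine: 10,
  startColumn: 5,
  endLine: 10,
  endColumn: 42,
};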

Some files were not shown because too many files have changed in this diff.