Compare commits


35 Commits

Author SHA1 Message Date
Michael B. Gale
0d9416ee4e Run PR checks after unit-tests job.
Remove all `push` and `pull_request` triggers from the individual PR checks to avoid running everything twice.
2025-09-20 14:34:51 +01:00
Michael B. Gale
5dfad66c06 Add workflow_call trigger to collections 2025-09-20 14:30:08 +01:00
Michael B. Gale
9a22cc2c3f Add collection of all PR checks 2025-09-20 14:29:11 +01:00
Michael B. Gale
0337c4c06e Merge pull request #3123 from github/mbg/fix/upload-sarif-cq-only 2025-09-19 18:48:48 +01:00
Chuan-kai Lin
c22ae04dd3 Merge pull request #3125 from github/cklin/overlay-restore-timeout
Overlay: use restoreCache() timeout
2025-09-19 10:25:21 -07:00
Chuan-kai Lin
80273e2bc1 Overlay: use restoreCache() timeout
This commit changes overlay-base database download to pass the
segmentTimeoutInMs option to restoreCache(), so that restoreCache()
itself can properly abort slow downloads.

The waitForResultWithTimeLimit() wrapper around restoreCache() remains
as a second line of defense, but with a higher 10-minute time limit, to
guard against cache restore hangs outside segment downloads.
2025-09-19 09:40:09 -07:00
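The two lines of defense described in this commit message can be sketched as follows. This is a simplified model, not the action's actual code: `restoreCache` here is a stub standing in for `@actions/cache`'s function (whose download options include `segmentTimeoutInMs`), and the names and timings are illustrative.

```typescript
// Sketch of the layered timeouts (assumed names; not the action's exact code).
type DownloadOptions = { segmentTimeoutInMs?: number };

// Stub for @actions/cache's restoreCache(): with segmentTimeoutInMs set,
// it can abort slow segment downloads itself and resolve to undefined.
async function restoreCache(
  key: string,
  options: DownloadOptions,
): Promise<string | undefined> {
  return key; // pretend the cache entry was found
}

// Second line of defense: stop *waiting* after an overall time limit
// (10 minutes in the commit), guarding against hangs outside segment downloads.
async function waitForResultWithTimeLimit<T>(
  timeoutMs: number,
  promise: Promise<T>,
): Promise<T | undefined> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<undefined>((resolve) => {
    timer = setTimeout(() => resolve(undefined), timeoutMs);
  });
  const result = await Promise.race([promise, timeout]);
  clearTimeout(timer);
  return result;
}

async function downloadOverlayBase(): Promise<string | undefined> {
  return waitForResultWithTimeLimit(
    10 * 60 * 1000, // outer wait limit
    restoreCache("overlay-base", { segmentTimeoutInMs: 60 * 1000 }),
  );
}
```

Note that the outer wrapper only bounds the wait; actually aborting a slow download has to happen inside `restoreCache` itself, which is why the commit passes the segment timeout through.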
Michael B. Gale
db37d924ee Fix condition 2025-09-19 16:17:34 +01:00
Michael B. Gale
6249793233 Disable cpp in upload-quality-sarif check 2025-09-19 16:17:33 +01:00
Michael B. Gale
e33b0ab3ac Update upload-quality-sarif check to only use code-quality 2025-09-19 16:17:33 +01:00
Michael B. Gale
7bea0e2e12 Fix outdated comment 2025-09-19 16:17:33 +01:00
Michael B. Gale
d378195403 Add new sarif-ids output to upload-sarif action
Unlike `sarif-id`, which holds the single Code Scanning SARIF ID, `sarif-ids` contains a stringified JSON object with details of all SARIF IDs.
2025-09-19 16:17:31 +01:00
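A consumer of the new output might parse it like this. The shape is inferred from this PR's own workflow check (`fromJSON(steps.upload-sarif.outputs.sarif-ids)[0].analysis`), which implies an array of entries each carrying an `analysis` kind; the `id` field is an assumption for illustration.

```typescript
// Hypothetical consumer of the new `sarif-ids` output.
interface SarifIdEntry {
  analysis: string; // e.g. "code-scanning" or "code-quality"
  id?: string;      // assumed: the SARIF ID for that analysis kind
}

// The output is a stringified JSON value, so consumers must parse it first.
function parseSarifIds(output: string): SarifIdEntry[] {
  return JSON.parse(output) as SarifIdEntry[];
}
```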
Chuan-kai Lin
12dda79905 Merge pull request #3124 from github/cklin/rename-withtimeout
Rename withTimeout() to waitForResultWithTimeLimit()
2025-09-18 13:34:56 -07:00
Michael B. Gale
a2ce099060 Use findAndUpload for Code Scanning 2025-09-18 16:29:25 +01:00
Michael B. Gale
696b467654 Handle single file case in findAndUpload 2025-09-18 16:29:23 +01:00
Michael B. Gale
c8e017d3e7 Move isDirectory check into findAndUpload 2025-09-18 16:28:39 +01:00
Chuan-kai Lin
8185897cad Rename withTimeout() to waitForResultWithTimeLimit()
The name withTimeout() gives the impression that it would limit the
execution of the promise to the given time bound. But that is not the
case: it is only the _waiting_ that is limited, and the promise would
keep running beyond the time bound.

This commit renames withTimeout() to waitForResultWithTimeLimit() so
that developers are more likely to understand the actual behavior of
this function.
2025-09-18 08:27:36 -07:00
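The behavior the new name describes can be demonstrated in a few lines: racing a promise against a timer bounds only the waiting, while the losing promise keeps executing. A minimal sketch (simplified signature; not the action's exact implementation):

```typescript
// Only the *wait* is time-limited; the underlying promise is not cancelled.
function waitForResultWithTimeLimit<T>(
  timeoutMs: number,
  promise: Promise<T>,
): Promise<T | undefined> {
  return Promise.race([
    promise,
    new Promise<undefined>((resolve) =>
      setTimeout(() => resolve(undefined), timeoutMs),
    ),
  ]);
}

let sideEffect = false;
const slowTask = new Promise<string>((resolve) =>
  setTimeout(() => {
    sideEffect = true; // still executes after the wait has given up
    resolve("done");
  }, 100),
);

async function demo(): Promise<void> {
  const result = await waitForResultWithTimeLimit(10, slowTask);
  if (result !== undefined) throw new Error("expected the wait to time out");
  // The task is still running and completes on its own schedule.
  await new Promise((resolve) => setTimeout(resolve, 200));
  if (!sideEffect) throw new Error("task was unexpectedly cancelled");
}
```

This is exactly why `withTimeout` was a misleading name: nothing here stops `slowTask` from running to completion.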
Michael B. Gale
a6161a8092 Call lstatSync on sarifPath earlier and check that the path exists then 2025-09-18 14:13:17 +01:00
Michael B. Gale
35454d39b2 Refactor CQ SARIF upload in upload-sarif into a function 2025-09-18 14:13:14 +01:00
Henry Mercer
b73659a4ff Merge pull request #3122 from felickz/main
Update ref description in action.ymls to include expected format for uploads
2025-09-18 09:52:36 +01:00
Chad Bentz
2f35a47982 Update upload-sarif/action.yml
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-09-17 19:07:57 -04:00
Chad Bentz
242ca1c0a1 Update ref description in action.ymls to include expected format for uploads 2025-09-17 19:02:50 -04:00
Henry Mercer
573acd9552 Merge pull request #3115 from github/dependabot/npm_and_yarn/npm-75b7851ed5
Bump uuid from 12.0.0 to 13.0.0 in the npm group
2025-09-15 18:38:40 +01:00
github-actions[bot]
668f0f00da Rebuild 2025-09-15 17:18:08 +00:00
dependabot[bot]
0b263ec528 Bump uuid from 12.0.0 to 13.0.0 in the npm group
Bumps the npm group with 1 update: [uuid](https://github.com/uuidjs/uuid).


Updates `uuid` from 12.0.0 to 13.0.0
- [Release notes](https://github.com/uuidjs/uuid/releases)
- [Changelog](https://github.com/uuidjs/uuid/blob/main/CHANGELOG.md)
- [Commits](https://github.com/uuidjs/uuid/compare/v12.0.0...v13.0.0)

---
updated-dependencies:
- dependency-name: uuid
  dependency-version: 13.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-15 17:16:56 +00:00
Michael B. Gale
9e5383b3b1 Merge pull request #3113 from github/nickrolfe/minimize-jars-followup
Only enable Java dependency minimisation when caching is enabled
2025-09-15 16:57:27 +01:00
Henry Mercer
8279538f3d Merge pull request #3114 from github/henrymercer/pr-checks-codeql-2.22
Run PR checks over CodeQL v2.22 release series
2025-09-15 16:52:03 +01:00
Henry Mercer
86f23c3336 Run PR checks over CodeQL v2.22 release series 2025-09-15 16:34:20 +01:00
Henry Mercer
77c3d2533d Merge pull request #3112 from github/henrymercer/scan-python
CI: Configure Python analysis
2025-09-15 16:25:56 +01:00
Henry Mercer
1069ace04e Update .github/workflows/codeql.yml 2025-09-15 16:09:21 +01:00
Nick Rolfe
4014b75309 Only enable JAVA dependency minimisation when caching is enabled 2025-09-15 15:11:28 +01:00
Henry Mercer
bce0fa7b27 Remove build mode from matrix 2025-09-15 14:45:40 +01:00
Henry Mercer
8105843d42 Specify paths-ignore for other languages 2025-09-15 14:20:15 +01:00
Henry Mercer
61b8b636e3 Only upload a single matrix case for JS 2025-09-15 14:15:05 +01:00
Henry Mercer
73ead84d0a Reorder strategy properties 2025-09-15 14:12:47 +01:00
Henry Mercer
793fe1783c CI: Configure Python analysis 2025-09-15 14:10:32 +01:00
84 changed files with 847 additions and 763 deletions

@@ -1,4 +0,0 @@
-# Configuration for the CodeQL Actions Queries
-name: "CodeQL Actions Queries config"
-queries:
-- uses: security-and-quality
@@ -7,9 +7,9 @@ queries:
 # we include both even though one is a superset of the
 # other, because we're testing the parsing logic and
 # that the suites exist in the codeql bundle.
-- uses: security-and-quality
 - uses: security-experimental
 - uses: security-extended
+- uses: security-and-quality
 paths-ignore:
-- tests
 - lib
+- tests

@@ -8,16 +8,6 @@ env:
 GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
 GO111MODULE: auto
 on:
-push:
-branches:
-- main
-- releases/v*
-pull_request:
-types:
-- opened
-- synchronize
-- reopened
-- ready_for_review
 schedule:
 - cron: '0 5 * * *'
 workflow_dispatch:

.github/workflows/__all.yml generated vendored Normal file (449 lines)

@@ -0,0 +1,449 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# pr-checks/sync.sh
# to regenerate this file.
name: Manual Check - all
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GO111MODULE: auto
on:
push:
paths:
- .github/workflows/__all.yml
workflow_dispatch:
inputs:
go-version:
type: string
description: The version of Go to install
required: false
default: '>=1.21.0'
java-version:
type: string
description: The version of Java to install
required: false
default: '17'
workflow_call:
inputs:
go-version:
type: string
description: The version of Go to install
required: false
default: '>=1.21.0'
java-version:
type: string
description: The version of Java to install
required: false
default: '17'
jobs:
all-platform-bundle:
name: All-platform bundle
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__all-platform-bundle.yml
with:
go-version: ${{ inputs.go-version }}
analyze-ref-input:
name: "Analyze: 'ref' and 'sha' from inputs"
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__analyze-ref-input.yml
with:
go-version: ${{ inputs.go-version }}
autobuild-action:
name: autobuild-action
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__autobuild-action.yml
with: {}
autobuild-direct-tracing-with-working-dir:
name: Autobuild direct tracing (custom working directory)
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__autobuild-direct-tracing-with-working-dir.yml
with:
java-version: ${{ inputs.java-version }}
autobuild-direct-tracing:
name: Autobuild direct tracing
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__autobuild-direct-tracing.yml
with:
java-version: ${{ inputs.java-version }}
build-mode-autobuild:
name: Build mode autobuild
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__build-mode-autobuild.yml
with: {}
build-mode-manual:
name: Build mode manual
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__build-mode-manual.yml
with:
go-version: ${{ inputs.go-version }}
build-mode-none:
name: Build mode none
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__build-mode-none.yml
with: {}
build-mode-rollback:
name: Build mode rollback
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__build-mode-rollback.yml
with: {}
bundle-toolcache:
name: 'Bundle: Caching checks'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__bundle-toolcache.yml
with: {}
bundle-zstd:
name: 'Bundle: Zstandard checks'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__bundle-zstd.yml
with: {}
cleanup-db-cluster-dir:
name: Clean up database cluster directory
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__cleanup-db-cluster-dir.yml
with: {}
config-export:
name: Config export
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__config-export.yml
with: {}
config-input:
name: Config input
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__config-input.yml
with: {}
cpp-deptrace-disabled:
name: 'C/C++: disabling autoinstalling dependencies (Linux)'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__cpp-deptrace-disabled.yml
with: {}
cpp-deptrace-enabled-on-macos:
name: 'C/C++: autoinstalling dependencies is skipped (macOS)'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__cpp-deptrace-enabled-on-macos.yml
with: {}
cpp-deptrace-enabled:
name: 'C/C++: autoinstalling dependencies (Linux)'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__cpp-deptrace-enabled.yml
with: {}
diagnostics-export:
name: Diagnostic export
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__diagnostics-export.yml
with: {}
export-file-baseline-information:
name: Export file baseline information
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__export-file-baseline-information.yml
with:
go-version: ${{ inputs.go-version }}
extractor-ram-threads:
name: Extractor ram and threads options test
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__extractor-ram-threads.yml
with: {}
go-custom-queries:
name: 'Go: Custom queries'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__go-custom-queries.yml
with:
go-version: ${{ inputs.go-version }}
go-indirect-tracing-workaround-diagnostic:
name: 'Go: diagnostic when Go is changed after init step'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__go-indirect-tracing-workaround-diagnostic.yml
with:
go-version: ${{ inputs.go-version }}
go-indirect-tracing-workaround-no-file-program:
name: 'Go: diagnostic when `file` is not installed'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__go-indirect-tracing-workaround-no-file-program.yml
with:
go-version: ${{ inputs.go-version }}
go-indirect-tracing-workaround:
name: 'Go: workaround for indirect tracing'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__go-indirect-tracing-workaround.yml
with:
go-version: ${{ inputs.go-version }}
go-tracing-autobuilder:
name: 'Go: tracing with autobuilder step'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__go-tracing-autobuilder.yml
with:
go-version: ${{ inputs.go-version }}
go-tracing-custom-build-steps:
name: 'Go: tracing with custom build steps'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__go-tracing-custom-build-steps.yml
with:
go-version: ${{ inputs.go-version }}
go-tracing-legacy-workflow:
name: 'Go: tracing with legacy workflow'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__go-tracing-legacy-workflow.yml
with:
go-version: ${{ inputs.go-version }}
init-with-registries:
name: 'Packaging: Download using registries'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__init-with-registries.yml
with: {}
javascript-source-root:
name: Custom source root
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__javascript-source-root.yml
with: {}
job-run-uuid-sarif:
name: Job run UUID added to SARIF
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__job-run-uuid-sarif.yml
with: {}
language-aliases:
name: Language aliases
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__language-aliases.yml
with: {}
multi-language-autodetect:
name: Multi-language repository
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__multi-language-autodetect.yml
with:
go-version: ${{ inputs.go-version }}
overlay-init-fallback:
name: Overlay database init fallback
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__overlay-init-fallback.yml
with: {}
packaging-codescanning-config-inputs-js:
name: 'Packaging: Config and input passed to the CLI'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__packaging-codescanning-config-inputs-js.yml
with:
go-version: ${{ inputs.go-version }}
packaging-config-inputs-js:
name: 'Packaging: Config and input'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__packaging-config-inputs-js.yml
with:
go-version: ${{ inputs.go-version }}
packaging-config-js:
name: 'Packaging: Config file'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__packaging-config-js.yml
with:
go-version: ${{ inputs.go-version }}
packaging-inputs-js:
name: 'Packaging: Action input'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__packaging-inputs-js.yml
with:
go-version: ${{ inputs.go-version }}
quality-queries:
name: Quality queries input
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__quality-queries.yml
with: {}
remote-config:
name: Remote config file
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__remote-config.yml
with:
go-version: ${{ inputs.go-version }}
resolve-environment-action:
name: Resolve environment
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__resolve-environment-action.yml
with: {}
rubocop-multi-language:
name: RuboCop multi-language
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__rubocop-multi-language.yml
with: {}
ruby:
name: Ruby analysis
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__ruby.yml
with: {}
rust:
name: Rust analysis
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__rust.yml
with: {}
split-workflow:
name: Split workflow
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__split-workflow.yml
with:
go-version: ${{ inputs.go-version }}
start-proxy:
name: Start proxy
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__start-proxy.yml
with: {}
submit-sarif-failure:
name: Submit SARIF after failure
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__submit-sarif-failure.yml
with: {}
swift-autobuild:
name: Swift analysis using autobuild
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__swift-autobuild.yml
with: {}
swift-custom-build:
name: Swift analysis using a custom build command
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__swift-custom-build.yml
with:
go-version: ${{ inputs.go-version }}
test-autobuild-working-dir:
name: Autobuild working directory
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__test-autobuild-working-dir.yml
with: {}
test-local-codeql:
name: Local CodeQL bundle
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__test-local-codeql.yml
with:
go-version: ${{ inputs.go-version }}
test-proxy:
name: Proxy test
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__test-proxy.yml
with: {}
unset-environment:
name: Test unsetting environment variables
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__unset-environment.yml
with:
go-version: ${{ inputs.go-version }}
upload-quality-sarif:
name: 'Upload-sarif: code quality endpoint'
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__upload-quality-sarif.yml
with:
go-version: ${{ inputs.go-version }}
upload-ref-sha-input:
name: "Upload-sarif: 'ref' and 'sha' from inputs"
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__upload-ref-sha-input.yml
with:
go-version: ${{ inputs.go-version }}
with-checkout-path:
name: Use a custom `checkout_path`
permissions:
contents: read
security-events: read
uses: ./.github/workflows/__with-checkout-path.yml
with:
go-version: ${{ inputs.go-version }}

.github/workflows/__bundle-zstd.yml generated vendored

.github/workflows/__config-input.yml generated vendored
@@ -8,16 +8,6 @@ env:
 GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
 GO111MODULE: auto
 on:
-push:
-branches:
-- main
-- releases/v*
-pull_request:
-types:
-- opened
-- synchronize
-- reopened
-- ready_for_review
 schedule:
 - cron: '0 5 * * *'
 workflow_dispatch:
@@ -63,6 +53,10 @@ jobs:
 version: stable-v2.21.4
 - os: macos-latest
 version: stable-v2.21.4
+- os: ubuntu-latest
+version: stable-v2.22.4
+- os: macos-latest
+version: stable-v2.22.4
 - os: ubuntu-latest
 version: default
 - os: macos-latest

.github/workflows/__go.yml generated vendored

@@ -18,6 +18,13 @@ on:
 description: The version of Go to install
 required: false
 default: '>=1.21.0'
+workflow_call:
+inputs:
+go-version:
+type: string
+description: The version of Go to install
+required: false
+default: '>=1.21.0'
 jobs:
 go-custom-queries:
 name: 'Go: Custom queries'
@@ -8,16 +8,6 @@ env:
 GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
 GO111MODULE: auto
 on:
-push:
-branches:
-- main
-- releases/v*
-pull_request:
-types:
-- opened
-- synchronize
-- reopened
-- ready_for_review
 schedule:
 - cron: '0 5 * * *'
 workflow_dispatch:
@@ -63,6 +53,10 @@ jobs:
 version: stable-v2.21.4
 - os: ubuntu-latest
 version: stable-v2.21.4
+- os: macos-latest
+version: stable-v2.22.4
+- os: ubuntu-latest
+version: stable-v2.22.4
 - os: macos-latest
 version: default
 - os: ubuntu-latest

.github/workflows/__ruby.yml generated vendored

.github/workflows/__rust.yml generated vendored

.github/workflows/__start-proxy.yml generated vendored

.github/workflows/__test-proxy.yml generated vendored
View File

@@ -8,16 +8,6 @@ env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GO111MODULE: auto
on:
push:
branches:
- main
- releases/v*
pull_request:
types:
- opened
- synchronize
- reopened
- ready_for_review
schedule:
- cron: '0 5 * * *'
workflow_dispatch:
@@ -73,10 +63,8 @@ jobs:
- uses: ./../action/init
with:
tools: ${{ steps.prepare-test.outputs.tools-url }}
languages: cpp,csharp,java,javascript,python
config-file: ${{ github.repository }}/tests/multi-language-repo/.github/codeql/custom-queries.yml@${{ github.sha }}
analysis-kinds: code-scanning,code-quality
languages: csharp,java,javascript,python
analysis-kinds: code-quality
- name: Build code
run: ./build.sh
# Generate some SARIF we can upload with the upload-sarif step
@@ -86,8 +74,12 @@ jobs:
sha: 5e235361806c361d4d3f8859e3c897658025a9a2
upload: never
- uses: ./../action/upload-sarif
id: upload-sarif
with:
ref: refs/heads/main
sha: 5e235361806c361d4d3f8859e3c897658025a9a2
- name: Check output from `upload-sarif` step
if: fromJSON(steps.upload-sarif.outputs.sarif-ids)[0].analysis != 'code-quality'
run: exit 1
env:
CODEQL_ACTION_TEST_MODE: true
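Unlike the single `sarif-id` output, `sarif-ids` is a stringified JSON array of `{ analysis, id }` objects, which the check above reads with `fromJSON`. A minimal sketch of consuming that output in script code (the `findSarifId` helper is hypothetical, not part of the Action):

```typescript
interface SarifIdEntry {
  analysis: "code-scanning" | "code-quality";
  id: string;
}

// Hypothetical helper: pick the SARIF id recorded for one analysis kind
// out of the stringified `sarif-ids` output.
function findSarifId(
  output: string,
  kind: SarifIdEntry["analysis"],
): string | undefined {
  const entries: SarifIdEntry[] = JSON.parse(output);
  return entries.find((e) => e.analysis === kind)?.id;
}

// Example shape, matching what the check above asserts on.
const output = JSON.stringify([{ analysis: "code-quality", id: "abc-123" }]);
```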

View File

@@ -8,16 +8,6 @@ env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GO111MODULE: auto
on:
push:
branches:
- main
- releases/v*
pull_request:
types:
- opened
- synchronize
- reopened
- ready_for_review
schedule:
- cron: '0 5 * * *'
workflow_dispatch:

View File

@@ -8,16 +8,6 @@ env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GO111MODULE: auto
on:
push:
branches:
- main
- releases/v*
pull_request:
types:
- opened
- synchronize
- reopened
- ready_for_review
schedule:
- cron: '0 5 * * *'
workflow_dispatch:

View File

@@ -95,7 +95,7 @@ jobs:
id: init
with:
languages: javascript
config-file: ./.github/codeql/codeql-config.yml
config-file: ./.github/codeql/codeql-config-javascript.yml
tools: ${{ matrix.tools }}
# confirm steps.init.outputs.codeql-path points to the codeql binary
- name: Print CodeQL Version
@@ -107,13 +107,17 @@ jobs:
uses: ./analyze
with:
category: "/language:javascript"
upload: ${{ (matrix.os == 'ubuntu-24.04' && !matrix.tools && 'always') || 'never' }}
analyze-actions:
analyze-other:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
include:
- language: actions
- language: python
permissions:
contents: read
@@ -125,32 +129,15 @@ jobs:
- name: Initialize CodeQL
uses: ./init
with:
languages: actions
config-file: ./.github/codeql/codeql-actions-config.yml
languages: ${{ matrix.language }}
build-mode: none
config: >
paths-ignore:
- lib
- tests
queries:
- uses: security-and-quality
- name: Perform CodeQL Analysis
uses: ./analyze
with:
category: "/language:actions"
analyze-python:
runs-on: ubuntu-latest
strategy:
fail-fast: false
permissions:
contents: read
security-events: write
steps:
- name: Checkout
uses: actions/checkout@v5
- name: Initialize CodeQL
uses: ./init
with:
languages: python
config-file: ./.github/codeql/codeql-actions-config.yml
- name: Perform CodeQL Analysis
uses: ./analyze
with:
category: "/language:python"
category: "/language:${{ matrix.language }}"

View File

@@ -31,7 +31,7 @@ jobs:
run: git config --global core.autocrlf false
- uses: actions/checkout@v5
- name: Set up Node.js
uses: actions/setup-node@v4
with:
@@ -70,6 +70,12 @@ jobs:
sarif_file: eslint.sarif
category: eslint
pr-checks:
name: "Run all PR checks"
needs:
- unit-tests
uses: ./.github/workflows/__all.yml
check-node-version:
if: github.event.pull_request
name: Check Action Node versions

View File

@@ -58,7 +58,7 @@ inputs:
# If changing this, make sure to update workflow.ts accordingly.
default: ${{ github.workspace }}
ref:
description: "The ref where results will be uploaded. If not provided, the Action will use the GITHUB_REF environment variable. If provided, the sha input must be provided as well. This input is ignored for pull requests from forks."
description: "The ref where results will be uploaded. If not provided, the Action will use the GITHUB_REF environment variable. If provided, the sha input must be provided as well. This input is ignored for pull requests from forks. Expected format: refs/heads/<branch name>, refs/tags/<tag>, refs/pull/<number>/merge, or refs/pull/<number>/head."
required: false
sha:
description: "The sha of the HEAD of the ref where results will be uploaded. If not provided, the Action will use the GITHUB_SHA environment variable. If provided, the ref input must be provided as well. This input is ignored for pull requests from forks."
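The four accepted `ref` formats listed in the new description can be captured in a single pattern. A hedged sketch of a validator for them (the Action's actual validation logic may differ):

```typescript
// Matches the formats listed in the input description:
// refs/heads/<branch name>, refs/tags/<tag>,
// refs/pull/<number>/merge, and refs/pull/<number>/head.
const REF_PATTERN = /^refs\/(heads\/.+|tags\/.+|pull\/\d+\/(merge|head))$/;

function isValidRef(ref: string): boolean {
  return REF_PATTERN.test(ref);
}
```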

View File

@@ -26486,7 +26486,7 @@ var require_package = __commonJS({
"node-forge": "^1.3.1",
octokit: "^5.0.3",
semver: "^7.7.2",
uuid: "^12.0.0"
uuid: "^13.0.0"
},
devDependencies: {
"@ava/typescript": "6.0.0",

18
lib/analyze-action.js generated
View File

@@ -32335,7 +32335,7 @@ var require_package = __commonJS({
"node-forge": "^1.3.1",
octokit: "^5.0.3",
semver: "^7.7.2",
uuid: "^12.0.0"
uuid: "^13.0.0"
},
devDependencies: {
"@ava/typescript": "6.0.0",
@@ -89754,7 +89754,7 @@ async function tryGetFolderBytes(cacheDir, logger, quiet = false) {
}
}
var hadTimeout = false;
async function withTimeout(timeoutMs, promise, onTimeout) {
async function waitForResultWithTimeLimit(timeoutMs, promise, onTimeout) {
let finished2 = false;
const mainTask = async () => {
const result = await promise;
@@ -90872,7 +90872,7 @@ function computeChangedFiles(baseFileOids, overlayFileOids) {
}
var CACHE_VERSION = 1;
var CACHE_PREFIX = "codeql-overlay-base-database";
var MAX_CACHE_OPERATION_MS = 12e4;
var MAX_CACHE_OPERATION_MS = 6e5;
function checkOverlayBaseDatabase(config, logger, warningPrefix) {
const baseDatabaseOidsFilePath = getBaseDatabaseOidsFilePath(config);
if (!fs6.existsSync(baseDatabaseOidsFilePath)) {
@@ -90940,7 +90940,7 @@ async function uploadOverlayBaseDatabaseToCache(codeql, config, logger) {
`Uploading overlay-base database to Actions cache with key ${cacheSaveKey}`
);
try {
const cacheId = await withTimeout(
const cacheId = await waitForResultWithTimeLimit(
MAX_CACHE_OPERATION_MS,
actionsCache.saveCache([dbLocation], cacheSaveKey),
() => {
@@ -91498,7 +91498,7 @@ async function uploadTrapCaches(codeql, config, logger) {
process.env.GITHUB_SHA || "unknown"
);
logger.info(`Uploading TRAP cache to Actions cache with key ${key}`);
await withTimeout(
await waitForResultWithTimeLimit(
MAX_CACHE_OPERATION_MS2,
actionsCache2.saveCache([cacheDir], key),
() => {
@@ -91683,7 +91683,7 @@ var toolcache3 = __toESM(require_tool_cache());
var import_fast_deep_equal = __toESM(require_fast_deep_equal());
var semver7 = __toESM(require_semver2());
// node_modules/uuid/dist/stringify.js
// node_modules/uuid/dist-node/stringify.js
var byteToHex = [];
for (let i = 0; i < 256; ++i) {
byteToHex.push((i + 256).toString(16).slice(1));
@@ -91692,7 +91692,7 @@ function unsafeStringify(arr, offset = 0) {
return (byteToHex[arr[offset + 0]] + byteToHex[arr[offset + 1]] + byteToHex[arr[offset + 2]] + byteToHex[arr[offset + 3]] + "-" + byteToHex[arr[offset + 4]] + byteToHex[arr[offset + 5]] + "-" + byteToHex[arr[offset + 6]] + byteToHex[arr[offset + 7]] + "-" + byteToHex[arr[offset + 8]] + byteToHex[arr[offset + 9]] + "-" + byteToHex[arr[offset + 10]] + byteToHex[arr[offset + 11]] + byteToHex[arr[offset + 12]] + byteToHex[arr[offset + 13]] + byteToHex[arr[offset + 14]] + byteToHex[arr[offset + 15]]).toLowerCase();
}
// node_modules/uuid/dist/rng.js
// node_modules/uuid/dist-node/rng.js
var import_node_crypto = require("node:crypto");
var rnds8Pool = new Uint8Array(256);
var poolPtr = rnds8Pool.length;
@@ -91704,11 +91704,11 @@ function rng() {
return rnds8Pool.slice(poolPtr, poolPtr += 16);
}
// node_modules/uuid/dist/native.js
// node_modules/uuid/dist-node/native.js
var import_node_crypto2 = require("node:crypto");
var native_default = { randomUUID: import_node_crypto2.randomUUID };
// node_modules/uuid/dist/v4.js
// node_modules/uuid/dist-node/v4.js
function _v4(options, buf, offset) {
options = options || {};
const rnds = options.random ?? options.rng?.() ?? rng();
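The renamed helper implements a common pattern: race the wrapped promise against a timer, invoke a callback on timeout, and return `undefined` instead of the result. A minimal sketch of that pattern, with assumed names and without the bundle's `hadTimeout` bookkeeping:

```typescript
// Sketch only: signature inferred from the diff, not the Action's
// actual implementation.
async function waitForResultWithTimeLimit<T>(
  timeoutMs: number,
  promise: Promise<T>,
  onTimeout: () => void,
): Promise<T | undefined> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<undefined>((resolve) => {
    timer = setTimeout(() => {
      onTimeout();
      resolve(undefined);
    }, timeoutMs);
  });
  try {
    // Whichever settles first wins; the losing branch is not cancelled,
    // only ignored.
    return await Promise.race([promise, timeout]);
  } finally {
    if (timer !== undefined) {
      clearTimeout(timer);
    }
  }
}
```

This is why the overlay change above still needs `restoreCache()` to enforce its own segment timeout: the wrapper can stop *waiting* for a hung download, but it cannot abort the download itself.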

View File

@@ -26486,7 +26486,7 @@ var require_package = __commonJS({
"node-forge": "^1.3.1",
octokit: "^5.0.3",
semver: "^7.7.2",
uuid: "^12.0.0"
uuid: "^13.0.0"
},
devDependencies: {
"@ava/typescript": "6.0.0",

View File

@@ -32335,7 +32335,7 @@ var require_package = __commonJS({
"node-forge": "^1.3.1",
octokit: "^5.0.3",
semver: "^7.7.2",
uuid: "^12.0.0"
uuid: "^13.0.0"
},
devDependencies: {
"@ava/typescript": "6.0.0",
@@ -129626,7 +129626,7 @@ var toolcache3 = __toESM(require_tool_cache());
var import_fast_deep_equal = __toESM(require_fast_deep_equal());
var semver7 = __toESM(require_semver2());
// node_modules/uuid/dist/stringify.js
// node_modules/uuid/dist-node/stringify.js
var byteToHex = [];
for (let i = 0; i < 256; ++i) {
byteToHex.push((i + 256).toString(16).slice(1));
@@ -129635,7 +129635,7 @@ function unsafeStringify(arr, offset = 0) {
return (byteToHex[arr[offset + 0]] + byteToHex[arr[offset + 1]] + byteToHex[arr[offset + 2]] + byteToHex[arr[offset + 3]] + "-" + byteToHex[arr[offset + 4]] + byteToHex[arr[offset + 5]] + "-" + byteToHex[arr[offset + 6]] + byteToHex[arr[offset + 7]] + "-" + byteToHex[arr[offset + 8]] + byteToHex[arr[offset + 9]] + "-" + byteToHex[arr[offset + 10]] + byteToHex[arr[offset + 11]] + byteToHex[arr[offset + 12]] + byteToHex[arr[offset + 13]] + byteToHex[arr[offset + 14]] + byteToHex[arr[offset + 15]]).toLowerCase();
}
// node_modules/uuid/dist/rng.js
// node_modules/uuid/dist-node/rng.js
var import_node_crypto = require("node:crypto");
var rnds8Pool = new Uint8Array(256);
var poolPtr = rnds8Pool.length;
@@ -129647,11 +129647,11 @@ function rng() {
return rnds8Pool.slice(poolPtr, poolPtr += 16);
}
// node_modules/uuid/dist/native.js
// node_modules/uuid/dist-node/native.js
var import_node_crypto2 = require("node:crypto");
var native_default = { randomUUID: import_node_crypto2.randomUUID };
// node_modules/uuid/dist/v4.js
// node_modules/uuid/dist-node/v4.js
function _v4(options, buf, offset) {
options = options || {};
const rnds = options.random ?? options.rng?.() ?? rng();

52
lib/init-action.js generated
View File

@@ -32335,7 +32335,7 @@ var require_package = __commonJS({
"node-forge": "^1.3.1",
octokit: "^5.0.3",
semver: "^7.7.2",
uuid: "^12.0.0"
uuid: "^13.0.0"
},
devDependencies: {
"@ava/typescript": "6.0.0",
@@ -81687,7 +81687,7 @@ var core13 = __toESM(require_core());
var io6 = __toESM(require_io());
var semver8 = __toESM(require_semver2());
// node_modules/uuid/dist/stringify.js
// node_modules/uuid/dist-node/stringify.js
var byteToHex = [];
for (let i = 0; i < 256; ++i) {
byteToHex.push((i + 256).toString(16).slice(1));
@@ -81696,7 +81696,7 @@ function unsafeStringify(arr, offset = 0) {
return (byteToHex[arr[offset + 0]] + byteToHex[arr[offset + 1]] + byteToHex[arr[offset + 2]] + byteToHex[arr[offset + 3]] + "-" + byteToHex[arr[offset + 4]] + byteToHex[arr[offset + 5]] + "-" + byteToHex[arr[offset + 6]] + byteToHex[arr[offset + 7]] + "-" + byteToHex[arr[offset + 8]] + byteToHex[arr[offset + 9]] + "-" + byteToHex[arr[offset + 10]] + byteToHex[arr[offset + 11]] + byteToHex[arr[offset + 12]] + byteToHex[arr[offset + 13]] + byteToHex[arr[offset + 14]] + byteToHex[arr[offset + 15]]).toLowerCase();
}
// node_modules/uuid/dist/rng.js
// node_modules/uuid/dist-node/rng.js
var import_node_crypto = require("node:crypto");
var rnds8Pool = new Uint8Array(256);
var poolPtr = rnds8Pool.length;
@@ -81708,11 +81708,11 @@ function rng() {
return rnds8Pool.slice(poolPtr, poolPtr += 16);
}
// node_modules/uuid/dist/native.js
// node_modules/uuid/dist-node/native.js
var import_node_crypto2 = require("node:crypto");
var native_default = { randomUUID: import_node_crypto2.randomUUID };
// node_modules/uuid/dist/v4.js
// node_modules/uuid/dist-node/v4.js
function _v4(options, buf, offset) {
options = options || {};
const rnds = options.random ?? options.rng?.() ?? rng();
@@ -85619,7 +85619,7 @@ async function tryGetFolderBytes(cacheDir, logger, quiet = false) {
}
}
var hadTimeout = false;
async function withTimeout(timeoutMs, promise, onTimeout) {
async function waitForResultWithTimeLimit(timeoutMs, promise, onTimeout) {
let finished2 = false;
const mainTask = async () => {
const result = await promise;
@@ -86478,7 +86478,7 @@ function computeChangedFiles(baseFileOids, overlayFileOids) {
}
var CACHE_VERSION = 1;
var CACHE_PREFIX = "codeql-overlay-base-database";
var MAX_CACHE_OPERATION_MS = 12e4;
var MAX_CACHE_OPERATION_MS = 6e5;
function checkOverlayBaseDatabase(config, logger, warningPrefix) {
const baseDatabaseOidsFilePath = getBaseDatabaseOidsFilePath(config);
if (!fs6.existsSync(baseDatabaseOidsFilePath)) {
@@ -86521,9 +86521,39 @@ async function downloadOverlayBaseDatabaseFromCache(codeql, config, logger) {
let databaseDownloadDurationMs = 0;
try {
const databaseDownloadStart = performance.now();
const foundKey = await withTimeout(
const foundKey = await waitForResultWithTimeLimit(
// This ten-minute limit for the cache restore operation is mainly to
// guard against the possibility that the cache service is unresponsive
// and hangs outside the data download.
//
// Data download (which is normally the most time-consuming part of the
// restore operation) should not run long enough to hit this limit. Even
// for an extremely large 10GB database, at a download speed of 40MB/s
// (see below), the download should complete within five minutes. If we
// do hit this limit, there are likely more serious problems than mere
// slow download speed.
//
// This is important because we don't want any ongoing file operations
// on the database directory when we do hit this limit. Hitting this
// time limit takes us to a fallback path where we re-initialize the
// database from scratch at dbLocation, and having the cache restore
// operation continue to write into dbLocation in the background would
// really mess things up. We want to hit this limit only in the case
// of a hung cache service, not just slow download speed.
MAX_CACHE_OPERATION_MS,
actionsCache.restoreCache([dbLocation], cacheRestoreKeyPrefix),
actionsCache.restoreCache(
[dbLocation],
cacheRestoreKeyPrefix,
void 0,
{
// Azure SDK download (which is the default) uses 128MB segments; see
// https://github.com/actions/toolkit/blob/main/packages/cache/README.md.
// Setting segmentTimeoutInMs to 3000 translates to segment download
// speed of about 40 MB/s, which should be achievable unless the
// download is unreliable (in which case we do want to abort).
segmentTimeoutInMs: 3e3
}
),
() => {
logger.info("Timed out downloading overlay-base database from cache");
}
@@ -87106,7 +87136,7 @@ async function downloadTrapCaches(codeql, languages, logger) {
logger.info(
`Looking in Actions cache for TRAP cache with key ${preferredKey}`
);
const found = await withTimeout(
const found = await waitForResultWithTimeLimit(
MAX_CACHE_OPERATION_MS2,
actionsCache2.restoreCache([cacheDir], preferredKey, [
// Fall back to any cache with the right key prefix
@@ -90679,7 +90709,7 @@ exec ${goBinaryPath} "$@"`
logger.debug(
`${"CODEQL_EXTRACTOR_JAVA_OPTION_MINIMIZE_DEPENDENCY_JARS" /* JAVA_EXTRACTOR_MINIMIZE_DEPENDENCY_JARS */} is already set to '${process.env["CODEQL_EXTRACTOR_JAVA_OPTION_MINIMIZE_DEPENDENCY_JARS" /* JAVA_EXTRACTOR_MINIMIZE_DEPENDENCY_JARS */]}', so the Action will not override it.`
);
} else if (minimizeJavaJars && config.buildMode === "none" /* None */ && config.languages.includes("java" /* java */)) {
} else if (minimizeJavaJars && config.dependencyCachingEnabled && config.buildMode === "none" /* None */ && config.languages.includes("java" /* java */)) {
core13.exportVariable(
"CODEQL_EXTRACTOR_JAVA_OPTION_MINIMIZE_DEPENDENCY_JARS" /* JAVA_EXTRACTOR_MINIMIZE_DEPENDENCY_JARS */,
"true"

View File

@@ -26486,7 +26486,7 @@ var require_package = __commonJS({
"node-forge": "^1.3.1",
octokit: "^5.0.3",
semver: "^7.7.2",
uuid: "^12.0.0"
uuid: "^13.0.0"
},
devDependencies: {
"@ava/typescript": "6.0.0",

View File

@@ -26486,7 +26486,7 @@ var require_package = __commonJS({
"node-forge": "^1.3.1",
octokit: "^5.0.3",
semver: "^7.7.2",
uuid: "^12.0.0"
uuid: "^13.0.0"
},
devDependencies: {
"@ava/typescript": "6.0.0",

View File

@@ -45014,7 +45014,7 @@ var require_package = __commonJS({
"node-forge": "^1.3.1",
octokit: "^5.0.3",
semver: "^7.7.2",
uuid: "^12.0.0"
uuid: "^13.0.0"
},
devDependencies: {
"@ava/typescript": "6.0.0",

10
lib/upload-lib.js generated
View File

@@ -33632,7 +33632,7 @@ var require_package = __commonJS({
"node-forge": "^1.3.1",
octokit: "^5.0.3",
semver: "^7.7.2",
uuid: "^12.0.0"
uuid: "^13.0.0"
},
devDependencies: {
"@ava/typescript": "6.0.0",
@@ -89454,7 +89454,7 @@ var toolcache3 = __toESM(require_tool_cache());
var import_fast_deep_equal = __toESM(require_fast_deep_equal());
var semver7 = __toESM(require_semver2());
// node_modules/uuid/dist/stringify.js
// node_modules/uuid/dist-node/stringify.js
var byteToHex = [];
for (let i = 0; i < 256; ++i) {
byteToHex.push((i + 256).toString(16).slice(1));
@@ -89463,7 +89463,7 @@ function unsafeStringify(arr, offset = 0) {
return (byteToHex[arr[offset + 0]] + byteToHex[arr[offset + 1]] + byteToHex[arr[offset + 2]] + byteToHex[arr[offset + 3]] + "-" + byteToHex[arr[offset + 4]] + byteToHex[arr[offset + 5]] + "-" + byteToHex[arr[offset + 6]] + byteToHex[arr[offset + 7]] + "-" + byteToHex[arr[offset + 8]] + byteToHex[arr[offset + 9]] + "-" + byteToHex[arr[offset + 10]] + byteToHex[arr[offset + 11]] + byteToHex[arr[offset + 12]] + byteToHex[arr[offset + 13]] + byteToHex[arr[offset + 14]] + byteToHex[arr[offset + 15]]).toLowerCase();
}
// node_modules/uuid/dist/rng.js
// node_modules/uuid/dist-node/rng.js
var import_node_crypto = require("node:crypto");
var rnds8Pool = new Uint8Array(256);
var poolPtr = rnds8Pool.length;
@@ -89475,11 +89475,11 @@ function rng() {
return rnds8Pool.slice(poolPtr, poolPtr += 16);
}
// node_modules/uuid/dist/native.js
// node_modules/uuid/dist-node/native.js
var import_node_crypto2 = require("node:crypto");
var native_default = { randomUUID: import_node_crypto2.randomUUID };
// node_modules/uuid/dist/v4.js
// node_modules/uuid/dist-node/v4.js
function _v4(options, buf, offset) {
options = options || {};
const rnds = options.random ?? options.rng?.() ?? rng();

View File

@@ -26486,7 +26486,7 @@ var require_package = __commonJS({
"node-forge": "^1.3.1",
octokit: "^5.0.3",
semver: "^7.7.2",
uuid: "^12.0.0"
uuid: "^13.0.0"
},
devDependencies: {
"@ava/typescript": "6.0.0",

View File

@@ -32335,7 +32335,7 @@ var require_package = __commonJS({
"node-forge": "^1.3.1",
octokit: "^5.0.3",
semver: "^7.7.2",
uuid: "^12.0.0"
uuid: "^13.0.0"
},
devDependencies: {
"@ava/typescript": "6.0.0",
@@ -90155,7 +90155,7 @@ var toolcache3 = __toESM(require_tool_cache());
var import_fast_deep_equal = __toESM(require_fast_deep_equal());
var semver7 = __toESM(require_semver2());
// node_modules/uuid/dist/stringify.js
// node_modules/uuid/dist-node/stringify.js
var byteToHex = [];
for (let i = 0; i < 256; ++i) {
byteToHex.push((i + 256).toString(16).slice(1));
@@ -90164,7 +90164,7 @@ function unsafeStringify(arr, offset = 0) {
return (byteToHex[arr[offset + 0]] + byteToHex[arr[offset + 1]] + byteToHex[arr[offset + 2]] + byteToHex[arr[offset + 3]] + "-" + byteToHex[arr[offset + 4]] + byteToHex[arr[offset + 5]] + "-" + byteToHex[arr[offset + 6]] + byteToHex[arr[offset + 7]] + "-" + byteToHex[arr[offset + 8]] + byteToHex[arr[offset + 9]] + "-" + byteToHex[arr[offset + 10]] + byteToHex[arr[offset + 11]] + byteToHex[arr[offset + 12]] + byteToHex[arr[offset + 13]] + byteToHex[arr[offset + 14]] + byteToHex[arr[offset + 15]]).toLowerCase();
}
// node_modules/uuid/dist/rng.js
// node_modules/uuid/dist-node/rng.js
var import_node_crypto = require("node:crypto");
var rnds8Pool = new Uint8Array(256);
var poolPtr = rnds8Pool.length;
@@ -90176,11 +90176,11 @@ function rng() {
return rnds8Pool.slice(poolPtr, poolPtr += 16);
}
// node_modules/uuid/dist/native.js
// node_modules/uuid/dist-node/native.js
var import_node_crypto2 = require("node:crypto");
var native_default = { randomUUID: import_node_crypto2.randomUUID };
// node_modules/uuid/dist/v4.js
// node_modules/uuid/dist-node/v4.js
function _v4(options, buf, offset) {
options = options || {};
const rnds = options.random ?? options.rng?.() ?? rng();
@@ -92985,23 +92985,6 @@ function findSarifFilesInDir(sarifPath, isSarif) {
walkSarifFiles(sarifPath);
return sarifFiles;
}
function getSarifFilePaths(sarifPath, isSarif) {
if (!fs14.existsSync(sarifPath)) {
throw new ConfigurationError(`Path does not exist: ${sarifPath}`);
}
let sarifFiles;
if (fs14.lstatSync(sarifPath).isDirectory()) {
sarifFiles = findSarifFilesInDir(sarifPath, isSarif);
if (sarifFiles.length === 0) {
throw new ConfigurationError(
`No SARIF files found to upload in "${sarifPath}".`
);
}
} else {
sarifFiles = [sarifPath];
}
return sarifFiles;
}
function countResultsInSarif(sarif) {
let numResults = 0;
const parsedSarif = JSON.parse(sarif);
@@ -93097,20 +93080,6 @@ function buildPayload(commitOid, ref, analysisKey, analysisName, zippedSarif, wo
}
return payloadObj;
}
async function uploadFiles(inputSarifPath, checkoutPath, category, features, logger, uploadTarget) {
const sarifPaths = getSarifFilePaths(
inputSarifPath,
uploadTarget.sarifPredicate
);
return uploadSpecifiedFiles(
sarifPaths,
checkoutPath,
category,
features,
logger,
uploadTarget
);
}
async function uploadSpecifiedFiles(sarifPaths, checkoutPath, category, features, logger, uploadTarget) {
logger.startGroup(`Uploading ${uploadTarget.name} results`);
logger.info(`Processing sarif files: ${JSON.stringify(sarifPaths)}`);
@@ -93358,6 +93327,30 @@ function filterAlertsByDiffRange(logger, sarif) {
}
// src/upload-sarif-action.ts
async function findAndUpload(logger, features, sarifPath, pathStats, checkoutPath, analysis, category) {
let sarifFiles;
if (pathStats.isDirectory()) {
sarifFiles = findSarifFilesInDir(
sarifPath,
analysis.sarifPredicate
);
} else if (pathStats.isFile() && analysis.sarifPredicate(sarifPath)) {
sarifFiles = [sarifPath];
} else {
return void 0;
}
if (sarifFiles.length !== 0) {
return await uploadSpecifiedFiles(
sarifFiles,
checkoutPath,
category,
features,
logger,
analysis
);
}
return void 0;
}
async function sendSuccessStatusReport(startedAt, uploadStats, logger) {
const statusReportBase = await createStatusReportBase(
"upload-sarif" /* UploadSarif */,
@@ -93404,41 +93397,59 @@ async function run() {
const sarifPath = getRequiredInput("sarif_file");
const checkoutPath = getRequiredInput("checkout_path");
const category = getOptionalInput("category");
const uploadResult = await uploadFiles(
sarifPath,
checkoutPath,
category,
features,
logger,
CodeScanning
);
core13.setOutput("sarif-id", uploadResult.sarifID);
if (fs15.lstatSync(sarifPath).isDirectory()) {
const qualitySarifFiles = findSarifFilesInDir(
sarifPath,
CodeQuality.sarifPredicate
);
if (qualitySarifFiles.length !== 0) {
await uploadSpecifiedFiles(
qualitySarifFiles,
checkoutPath,
fixCodeQualityCategory(logger, category),
features,
logger,
CodeQuality
);
}
const pathStats = fs15.lstatSync(sarifPath, { throwIfNoEntry: false });
if (pathStats === void 0) {
throw new ConfigurationError(`Path does not exist: ${sarifPath}.`);
}
const sarifIds = [];
const uploadResult = await findAndUpload(
logger,
features,
sarifPath,
pathStats,
checkoutPath,
CodeScanning,
category
);
if (uploadResult !== void 0) {
core13.setOutput("sarif-id", uploadResult.sarifID);
sarifIds.push({
analysis: "code-scanning" /* CodeScanning */,
id: uploadResult.sarifID
});
}
const qualityUploadResult = await findAndUpload(
logger,
features,
sarifPath,
pathStats,
checkoutPath,
CodeQuality,
fixCodeQualityCategory(logger, category)
);
if (qualityUploadResult !== void 0) {
sarifIds.push({
analysis: "code-quality" /* CodeQuality */,
id: qualityUploadResult.sarifID
});
}
core13.setOutput("sarif-ids", JSON.stringify(sarifIds));
if (isInTestMode()) {
core13.debug("In test mode. Waiting for processing is disabled.");
} else if (getRequiredInput("wait-for-processing") === "true") {
await waitForProcessing(
getRepositoryNwo(),
uploadResult.sarifID,
logger
);
if (uploadResult !== void 0) {
await waitForProcessing(
getRepositoryNwo(),
uploadResult.sarifID,
logger
);
}
}
await sendSuccessStatusReport(startedAt, uploadResult.statusReport, logger);
await sendSuccessStatusReport(
startedAt,
uploadResult?.statusReport || {},
logger
);
} catch (unwrappedError) {
const error2 = isThirdPartyAnalysis("upload-sarif" /* UploadSarif */) && unwrappedError instanceof InvalidSarifUploadError ? new ConfigurationError(unwrappedError.message) : wrapError(unwrappedError);
const message = error2.message;

10
package-lock.json generated
View File

@@ -34,7 +34,7 @@
"node-forge": "^1.3.1",
"octokit": "^5.0.3",
"semver": "^7.7.2",
"uuid": "^12.0.0"
"uuid": "^13.0.0"
},
"devDependencies": {
"@ava/typescript": "6.0.0",
@@ -9076,16 +9076,16 @@
"license": "MIT"
},
"node_modules/uuid": {
"version": "12.0.0",
"resolved": "https://registry.npmjs.org/uuid/-/uuid-12.0.0.tgz",
"integrity": "sha512-USe1zesMYh4fjCA8ZH5+X5WIVD0J4V1Jksm1bFTVBX2F/cwSXt0RO5w/3UXbdLKmZX65MiWV+hwhSS8p6oBTGA==",
"version": "13.0.0",
"resolved": "https://registry.npmjs.org/uuid/-/uuid-13.0.0.tgz",
"integrity": "sha512-XQegIaBTVUjSHliKqcnFqYypAd4S+WCYt5NIeRs6w/UAry7z8Y9j5ZwRRL4kzq9U3sD6v+85er9FvkEaBpji2w==",
"funding": [
"https://github.com/sponsors/broofa",
"https://github.com/sponsors/ctavan"
],
"license": "MIT",
"bin": {
"uuid": "dist/bin/uuid"
"uuid": "dist-node/bin/uuid"
}
},
"node_modules/webidl-conversions": {

View File

@@ -48,7 +48,7 @@
"node-forge": "^1.3.1",
"octokit": "^5.0.3",
"semver": "^7.7.2",
"uuid": "^12.0.0"
"uuid": "^13.0.0"
},
"devDependencies": {
"@ava/typescript": "6.0.0",

View File

@@ -6,9 +6,8 @@ steps:
- uses: ./../action/init
with:
tools: ${{ steps.prepare-test.outputs.tools-url }}
languages: cpp,csharp,java,javascript,python
config-file: ${{ github.repository }}/tests/multi-language-repo/.github/codeql/custom-queries.yml@${{ github.sha }}
analysis-kinds: code-scanning,code-quality
languages: csharp,java,javascript,python
analysis-kinds: code-quality
- name: Build code
run: ./build.sh
# Generate some SARIF we can upload with the upload-sarif step
@@ -18,6 +17,10 @@ steps:
sha: '5e235361806c361d4d3f8859e3c897658025a9a2'
upload: never
- uses: ./../action/upload-sarif
id: upload-sarif
with:
ref: 'refs/heads/main'
sha: '5e235361806c361d4d3f8859e3c897658025a9a2'
- name: "Check output from `upload-sarif` step"
if: fromJSON(steps.upload-sarif.outputs.sarif-ids)[0].analysis != 'code-quality'
run: exit 1

View File

@@ -1,9 +1,8 @@
#!/usr/bin/env python
import ruamel.yaml
from ruamel.yaml.scalarstring import FoldedScalarString, SingleQuotedScalarString
from ruamel.yaml.scalarstring import SingleQuotedScalarString
import pathlib
import textwrap
import os
# The default set of CodeQL Bundle versions to use for the PR checks.
@@ -18,6 +17,8 @@ defaultTestVersions = [
"stable-v2.20.7",
# The last CodeQL release in the 2.21 series.
"stable-v2.21.4",
# The last CodeQL release in the 2.22 series.
"stable-v2.22.4",
# The default version of CodeQL for Dotcom, as determined by feature flags.
"default",
# The version of CodeQL shipped with the Action in `defaults.json`. During the release process
@@ -230,6 +231,13 @@ for file in sorted((this_dir / 'checks').glob('*.yml')):
checkJob['env']['CODEQL_ACTION_TEST_MODE'] = True
checkName = file.stem
# Add this check to the collection of all PR checks.
collections.setdefault("all", []).append({
'specification': checkSpecification,
'checkName': checkName,
'inputs': workflowInputs
})
# If this check belongs to a named collection, record it.
if 'collection' in checkSpecification:
collection_name = checkSpecification['collection']
@@ -249,12 +257,6 @@ for file in sorted((this_dir / 'checks').glob('*.yml')):
'GO111MODULE': 'auto'
},
'on': {
'push': {
'branches': ['main', 'releases/v*']
},
'pull_request': {
'types': ["opened", "synchronize", "reopened", "ready_for_review"]
},
'schedule': [{'cron': SingleQuotedScalarString('0 5 * * *')}],
'workflow_dispatch': {
'inputs': workflowInputs
@@ -323,6 +325,9 @@ for collection_name in collections:
'workflow_dispatch': {
'inputs': combinedInputs
},
'workflow_call': {
'inputs': combinedInputs
}
},
'jobs': jobs
}, output_stream)

View File

@@ -608,13 +608,17 @@ async function run() {
// If the feature flag to minimize Java dependency jars is enabled, and we are doing a Java
// `build-mode: none` analysis (i.e. the flag is relevant), then set the environment variable
// that enables the corresponding option in the Java extractor.
// that enables the corresponding option in the Java extractor. We also only do this if
// dependency caching is enabled, since the option is intended to reduce the size of
// dependency caches, but the jar-rewriting does have a performance cost that we'd like to avoid
// when caching is not being used.
if (process.env[EnvVar.JAVA_EXTRACTOR_MINIMIZE_DEPENDENCY_JARS]) {
logger.debug(
`${EnvVar.JAVA_EXTRACTOR_MINIMIZE_DEPENDENCY_JARS} is already set to '${process.env[EnvVar.JAVA_EXTRACTOR_MINIMIZE_DEPENDENCY_JARS]}', so the Action will not override it.`,
);
} else if (
minimizeJavaJars &&
config.dependencyCachingEnabled &&
config.buildMode === BuildMode.None &&
config.languages.includes(KnownLanguage.java)
) {
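The tightened condition enables jar minimization only when all four prerequisites hold. A small sketch of the predicate, using an assumed minimal `Config` shape (the real type has many more fields):

```typescript
// Minimal assumed config shape for illustration.
interface Config {
  dependencyCachingEnabled: boolean;
  buildMode?: string;
  languages: string[];
}

// Jar minimization is only worthwhile when dependency caching is on: the
// option shrinks dependency caches, but the jar rewriting has a runtime
// cost that is wasted without caching.
function shouldMinimizeJavaJars(featureEnabled: boolean, config: Config): boolean {
  return (
    featureEnabled &&
    config.dependencyCachingEnabled &&
    config.buildMode === "none" &&
    config.languages.includes("java")
  );
}
```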

View File

@@ -10,7 +10,11 @@ import { type CodeQL } from "./codeql";
import { type Config } from "./config-utils";
import { getCommitOid, getFileOidsUnderPath } from "./git-utils";
import { Logger, withGroupAsync } from "./logging";
import { isInTestMode, tryGetFolderBytes, withTimeout } from "./util";
import {
isInTestMode,
tryGetFolderBytes,
waitForResultWithTimeLimit,
} from "./util";
export enum OverlayDatabaseMode {
Overlay = "overlay",
@@ -154,7 +158,12 @@ function computeChangedFiles(
// Constants for database caching
const CACHE_VERSION = 1;
const CACHE_PREFIX = "codeql-overlay-base-database";
const MAX_CACHE_OPERATION_MS = 120_000; // Two minutes
// The purpose of this ten-minute limit is to guard against the possibility
// that the cache service is unresponsive, which would otherwise cause the
// entire action to hang. Normally we expect cache operations to complete
// within two minutes.
const MAX_CACHE_OPERATION_MS = 600_000;
/**
* Checks that the overlay-base database is valid by checking for the
@@ -268,7 +277,7 @@ export async function uploadOverlayBaseDatabaseToCache(
);
try {
const cacheId = await withTimeout(
const cacheId = await waitForResultWithTimeLimit(
MAX_CACHE_OPERATION_MS,
actionsCache.saveCache([dbLocation], cacheSaveKey),
() => {},
@@ -346,9 +355,39 @@ export async function downloadOverlayBaseDatabaseFromCache(
let databaseDownloadDurationMs = 0;
try {
const databaseDownloadStart = performance.now();
const foundKey = await withTimeout(
const foundKey = await waitForResultWithTimeLimit(
// This ten-minute limit for the cache restore operation is mainly to
// guard against the possibility that the cache service is unresponsive
// and hangs outside the data download.
//
// Data download (which is normally the most time-consuming part of the
// restore operation) should not run long enough to hit this limit. Even
// for an extremely large 10GB database, at a download speed of 40MB/s
// (see below), the download should complete within five minutes. If we
// do hit this limit, there are likely more serious problems than mere
// slow download speed.
//
// This is important because we don't want any ongoing file operations
// on the database directory when we do hit this limit. Hitting this
// time limit takes us to a fallback path where we re-initialize the
// database from scratch at dbLocation, and having the cache restore
// operation continue to write into dbLocation in the background would
// really mess things up. We want to hit this limit only in the case
// of a hung cache service, not just slow download speed.
MAX_CACHE_OPERATION_MS,
-actionsCache.restoreCache([dbLocation], cacheRestoreKeyPrefix),
+actionsCache.restoreCache(
+[dbLocation],
+cacheRestoreKeyPrefix,
+undefined,
+{
+// Azure SDK download (which is the default) uses 128MB segments; see
+// https://github.com/actions/toolkit/blob/main/packages/cache/README.md.
+// Setting segmentTimeoutInMs to 3000 translates to segment download
+// speed of about 40 MB/s, which should be achievable unless the
+// download is unreliable (in which case we do want to abort).
+segmentTimeoutInMs: 3000,
+},
+),
() => {
logger.info("Timed out downloading overlay-base database from cache");
},
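The throughput implied by `segmentTimeoutInMs` can be sanity-checked with a little arithmetic. The sketch below assumes the 128 MiB segment size documented in the `@actions/cache` README; the constant names are illustrative, not from the codebase.

```typescript
// Back-of-the-envelope check of the minimum sustained download speed that
// avoids tripping the segment timeout. Assumption: the Azure SDK downloads
// the cache in 128 MiB segments, per the @actions/cache README.
const SEGMENT_SIZE_MIB = 128;
const SEGMENT_TIMEOUT_MS = 3000;

// Each segment must arrive within the timeout window, so the required
// sustained speed is the segment size divided by the window length.
const minSpeedMiBps = SEGMENT_SIZE_MIB / (SEGMENT_TIMEOUT_MS / 1000);

console.log(minSpeedMiBps.toFixed(1)); // "42.7"
```

At that rate even an extremely large 10 GB database finishes in roughly four minutes, which is why the surrounding ten-minute limit should only ever fire on a hung cache service.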


@@ -16,7 +16,7 @@ import {
getErrorMessage,
isHTTPError,
tryGetFolderBytes,
-withTimeout,
+waitForResultWithTimeLimit,
} from "./util";
// This constant should be bumped if we make a breaking change
@@ -96,7 +96,7 @@ export async function downloadTrapCaches(
logger.info(
`Looking in Actions cache for TRAP cache with key ${preferredKey}`,
);
-const found = await withTimeout(
+const found = await waitForResultWithTimeLimit(
MAX_CACHE_OPERATION_MS,
actionsCache.restoreCache([cacheDir], preferredKey, [
// Fall back to any cache with the right key prefix
@@ -156,7 +156,7 @@ export async function uploadTrapCaches(
process.env.GITHUB_SHA || "unknown",
);
logger.info(`Uploading TRAP cache to Actions cache with key ${key}`);
-await withTimeout(
+await waitForResultWithTimeLimit(
MAX_CACHE_OPERATION_MS,
actionsCache.saveCache([cacheDir], key),
() => {


@@ -32,6 +32,55 @@ interface UploadSarifStatusReport
extends StatusReportBase,
upload_lib.UploadStatusReport {}
+/**
+ * Searches for SARIF files for the given `analysis` in the given `sarifPath`.
+ * If any are found, then they are uploaded to the appropriate endpoint for the given `analysis`.
+ *
+ * @param logger The logger to use.
+ * @param features Information about enabled feature flags.
+ * @param sarifPath The path to a SARIF file or directory containing SARIF files.
+ * @param pathStats Information about `sarifPath`.
+ * @param checkoutPath The checkout path.
+ * @param analysis The configuration of the analysis we should upload SARIF files for.
+ * @param category The SARIF category to use for the upload.
+ * @returns The result of uploading the SARIF file(s) or `undefined` if there are none.
+ */
+async function findAndUpload(
+logger: Logger,
+features: Features,
+sarifPath: string,
+pathStats: fs.Stats,
+checkoutPath: string,
+analysis: analyses.AnalysisConfig,
+category?: string,
+): Promise<upload_lib.UploadResult | undefined> {
+let sarifFiles: string[] | undefined;
+if (pathStats.isDirectory()) {
+sarifFiles = upload_lib.findSarifFilesInDir(
+sarifPath,
+analysis.sarifPredicate,
+);
+} else if (pathStats.isFile() && analysis.sarifPredicate(sarifPath)) {
+sarifFiles = [sarifPath];
+} else {
+return undefined;
+}
+if (sarifFiles.length !== 0) {
+return await upload_lib.uploadSpecifiedFiles(
+sarifFiles,
+checkoutPath,
+category,
+features,
+logger,
+analysis,
+);
+}
+return undefined;
+}
async function sendSuccessStatusReport(
startedAt: Date,
uploadStats: upload_lib.UploadStatusReport,
@@ -86,54 +135,71 @@ async function run() {
}
try {
// `sarifPath` can either be a path to a single file, or a path to a directory.
const sarifPath = actionsUtil.getRequiredInput("sarif_file");
const checkoutPath = actionsUtil.getRequiredInput("checkout_path");
const category = actionsUtil.getOptionalInput("category");
+const pathStats = fs.lstatSync(sarifPath, { throwIfNoEntry: false });
-const uploadResult = await upload_lib.uploadFiles(
-sarifPath,
-checkoutPath,
-category,
-features,
+if (pathStats === undefined) {
+throw new ConfigurationError(`Path does not exist: ${sarifPath}.`);
+}
+const sarifIds: Array<{ analysis: string; id: string }> = [];
+const uploadResult = await findAndUpload(
+logger,
+features,
+sarifPath,
+pathStats,
+checkoutPath,
+analyses.CodeScanning,
+category,
);
-core.setOutput("sarif-id", uploadResult.sarifID);
+if (uploadResult !== undefined) {
+core.setOutput("sarif-id", uploadResult.sarifID);
+sarifIds.push({
+analysis: analyses.AnalysisKind.CodeScanning,
+id: uploadResult.sarifID,
+});
+}
// If there are `.quality.sarif` files in `sarifPath`, then upload those to the code quality service.
// Code quality can currently only be enabled on top of security, so we'd currently always expect to
// have a directory for the results here.
-if (fs.lstatSync(sarifPath).isDirectory()) {
-const qualitySarifFiles = upload_lib.findSarifFilesInDir(
-sarifPath,
-analyses.CodeQuality.sarifPredicate,
-);
-if (qualitySarifFiles.length !== 0) {
-await upload_lib.uploadSpecifiedFiles(
-qualitySarifFiles,
-checkoutPath,
-actionsUtil.fixCodeQualityCategory(logger, category),
-features,
-logger,
-analyses.CodeQuality,
-);
-}
+const qualityUploadResult = await findAndUpload(
+logger,
+features,
+sarifPath,
+pathStats,
+checkoutPath,
+analyses.CodeQuality,
+actionsUtil.fixCodeQualityCategory(logger, category),
+);
+if (qualityUploadResult !== undefined) {
+sarifIds.push({
+analysis: analyses.AnalysisKind.CodeQuality,
+id: qualityUploadResult.sarifID,
+});
+}
+core.setOutput("sarif-ids", JSON.stringify(sarifIds));
// We don't upload results in test mode, so don't wait for processing
if (isInTestMode()) {
core.debug("In test mode. Waiting for processing is disabled.");
} else if (actionsUtil.getRequiredInput("wait-for-processing") === "true") {
-await upload_lib.waitForProcessing(
-getRepositoryNwo(),
-uploadResult.sarifID,
-logger,
-);
+if (uploadResult !== undefined) {
+await upload_lib.waitForProcessing(
+getRepositoryNwo(),
+uploadResult.sarifID,
+logger,
+);
+}
+// The code quality service does not currently have an endpoint to wait for SARIF processing,
+// so we can't wait for that here.
}
-await sendSuccessStatusReport(startedAt, uploadResult.statusReport, logger);
+await sendSuccessStatusReport(
+startedAt,
+uploadResult?.statusReport || {},
+logger,
+);
} catch (unwrappedError) {
const error =
isThirdPartyAnalysis(ActionName.UploadSarif) &&

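Because `sarif-ids` is emitted with `JSON.stringify` over an array of `{ analysis, id }` records, a downstream step can recover the per-analysis IDs by parsing that output. A hypothetical consumer (the payload values here are made up for illustration):

```typescript
// Shape of each entry the action pushes into its `sarifIds` array.
interface SarifIdEntry {
  analysis: string;
  id: string;
}

// Example `sarif-ids` output payload (values are illustrative only).
const rawOutput =
  '[{"analysis":"code-scanning","id":"id-123"},' +
  '{"analysis":"code-quality","id":"id-456"}]';

// Parse and index the entries by analysis kind for convenient lookup.
const entries: SarifIdEntry[] = JSON.parse(rawOutput);
const idsByAnalysis: Record<string, string> = Object.fromEntries(
  entries.map((e) => [e.analysis, e.id] as [string, string]),
);

console.log(idsByAnalysis["code-scanning"]); // "id-123"
```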

@@ -297,7 +297,7 @@ test("listFolder", async (t) => {
const longTime = 999_999;
const shortTime = 10;
-test("withTimeout on long task", async (t) => {
+test("waitForResultWithTimeLimit on long task", async (t) => {
let longTaskTimedOut = false;
const longTask = new Promise((resolve) => {
const timer = setTimeout(() => {
@@ -305,35 +305,43 @@ test("withTimeout on long task", async (t) => {
}, longTime);
t.teardown(() => clearTimeout(timer));
});
-const result = await util.withTimeout(shortTime, longTask, () => {
-longTaskTimedOut = true;
-});
+const result = await util.waitForResultWithTimeLimit(
+shortTime,
+longTask,
+() => {
+longTaskTimedOut = true;
+},
+);
t.deepEqual(longTaskTimedOut, true);
t.deepEqual(result, undefined);
});
-test("withTimeout on short task", async (t) => {
+test("waitForResultWithTimeLimit on short task", async (t) => {
let shortTaskTimedOut = false;
const shortTask = new Promise((resolve) => {
setTimeout(() => {
resolve(99);
}, shortTime);
});
-const result = await util.withTimeout(longTime, shortTask, () => {
-shortTaskTimedOut = true;
-});
+const result = await util.waitForResultWithTimeLimit(
+longTime,
+shortTask,
+() => {
+shortTaskTimedOut = true;
+},
+);
t.deepEqual(shortTaskTimedOut, false);
t.deepEqual(result, 99);
});
-test("withTimeout doesn't call callback if promise resolves", async (t) => {
+test("waitForResultWithTimeLimit doesn't call callback if promise resolves", async (t) => {
let shortTaskTimedOut = false;
const shortTask = new Promise((resolve) => {
setTimeout(() => {
resolve(99);
}, shortTime);
});
-const result = await util.withTimeout(100, shortTask, () => {
+const result = await util.waitForResultWithTimeLimit(100, shortTask, () => {
shortTaskTimedOut = true;
});
await new Promise((r) => setTimeout(r, 200));


@@ -864,7 +864,7 @@ let hadTimeout = false;
* @param onTimeout A callback to call if the promise times out.
* @returns The result of the promise, or undefined if the promise times out.
*/
-export async function withTimeout<T>(
+export async function waitForResultWithTimeLimit<T>(
timeoutMs: number,
promise: Promise<T>,
onTimeout: () => void,
@@ -894,7 +894,7 @@ export async function withTimeout<T>(
* Check if the global hadTimeout variable has been set, and if so then
* exit the process to ensure any background tasks that are still running
* are killed. This should be called at the end of execution if the
-* `withTimeout` function has been used.
+* `waitForResultWithTimeLimit` function has been used.
*/
export async function checkForTimeout() {
if (hadTimeout === true) {

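The diff above only shows the renamed signature; for intuition, a time-limit wrapper of this shape is typically built on `Promise.race`. The sketch below is an approximation (hence the `Sketch` suffix) and omits the global `hadTimeout` bookkeeping that `checkForTimeout` relies on:

```typescript
// Approximate sketch of a time-limit wrapper like waitForResultWithTimeLimit:
// race the real promise against a timer; if the timer wins, invoke the
// onTimeout callback and resolve to undefined instead of the result.
async function waitForResultWithTimeLimitSketch<T>(
  timeoutMs: number,
  promise: Promise<T>,
  onTimeout: () => void,
): Promise<T | undefined> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<undefined>((resolve) => {
    timer = setTimeout(() => {
      onTimeout();
      resolve(undefined);
    }, timeoutMs);
  });
  const result = await Promise.race([promise, timeout]);
  // Clear the timer so a task that finishes in time does not leave the
  // timeout callback pending in the background.
  if (timer !== undefined) {
    clearTimeout(timer);
  }
  return result;
}
```

Note that racing does not cancel the losing promise: the underlying task keeps running in the background, which is exactly why the real code checks a timeout flag at the end of execution and exits the process to kill stragglers.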

@@ -14,7 +14,7 @@ inputs:
required: false
default: ${{ github.workspace }}
ref:
-description: "The ref where results will be uploaded. If not provided, the Action will use the GITHUB_REF environment variable. If provided, the sha input must be provided as well. This input is ignored for pull requests from forks."
+description: "The ref where results will be uploaded. If not provided, the Action will use the GITHUB_REF environment variable. If provided, the sha input must be provided as well. This input is ignored for pull requests from forks. Expected format: refs/heads/<branch name>, refs/tags/<tag>, refs/pull/<number>/merge, or refs/pull/<number>/head."
required: false
sha:
description: "The sha of the HEAD of the ref where results will be uploaded. If not provided, the Action will use the GITHUB_SHA environment variable. If provided, the ref input must be provided as well. This input is ignored for pull requests from forks."
@@ -34,7 +34,12 @@ inputs:
default: "true"
outputs:
sarif-id:
-description: The ID of the uploaded SARIF file.
+description: The ID of the uploaded Code Scanning SARIF file, if any.
+sarif-ids:
+description: |
+A stringified JSON array containing the SARIF ID for each kind of analysis that produced an upload. For example:
+[ { "analysis": "code-scanning", "id": "some-id" }, { "analysis": "code-quality", "id": "some-other-id" } ]
runs:
using: node20
main: '../lib/upload-sarif-action.js'