Compare commits

...

129 Commits

Author SHA1 Message Date
Andrew Eisenberg
70df9def86 Merge pull request #2808 from github/aeisenberg/fix-dependabot
Fix dependabot errors
2025-03-14 13:49:58 -07:00
Andrew Eisenberg
5f98c40063 Fix dependabot errors
I explicitly had to downgrade "@octokit/plugin-retry" to "^6.0.0". Other
dependencies were upgraded.
2025-03-14 13:13:56 -07:00
Chuan-kai Lin
f338ec87a3 Merge pull request #2806 from github/cklin/delete-unused-git-utils
git-utils: deleted unused functions
2025-03-13 11:51:05 -07:00
Chuan-kai Lin
c31f6c89e8 git-utils: deleted unused functions 2025-03-13 10:45:14 -07:00
Andrew Eisenberg
dc49dcabdb Merge pull request #2800 from github/aeisenberg/remove-minimatch
Minimally remove micromatch
2025-03-11 16:01:07 -07:00
Andrew Eisenberg
7254660adc Merge pull request #2804 from github/dependabot/github_actions/actions-96d25c356e
build(deps): bump ruby/setup-ruby from 1.221.0 to 1.222.0 in the actions group
2025-03-11 08:53:45 -07:00
Chuan-kai Lin
13f2f96cdd Merge pull request #2801 from github/cklin/overlay-databases
Basic support for overlay databases
2025-03-11 08:33:33 -07:00
Chuan-kai Lin
0efe12d12c build: refresh js files 2025-03-10 13:31:46 -07:00
Chuan-kai Lin
ff5f0b9efd Support overlay database creation
This commit adds support for creating overlay-base and overlay
databases, controlled via the CODEQL_OVERLAY_DATABASE_MODE environment
variable.
2025-03-10 13:25:46 -07:00
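
The overlay support described above is driven by a single environment variable. A rough sketch of how the mode might be derived from it (the enum members mirror the OverlayDatabaseMode.None/OverlayBase/Overlay values visible in the compiled diffs further down this page; the string values and the helper itself are assumptions, not the action's actual code):

```typescript
// Hedged sketch: derive the overlay database mode from the environment
// variable named in the commit message above.
export enum OverlayDatabaseMode {
  None = "none",
  OverlayBase = "overlay-base",
  Overlay = "overlay",
}

export function getOverlayDatabaseMode(
  env: NodeJS.ProcessEnv = process.env,
): OverlayDatabaseMode {
  switch (env.CODEQL_OVERLAY_DATABASE_MODE) {
    case "overlay":
      return OverlayDatabaseMode.Overlay;
    case "overlay-base":
      return OverlayDatabaseMode.OverlayBase;
    default:
      return OverlayDatabaseMode.None;
  }
}
```
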
Chuan-kai Lin
270886f805 Pass overlay mode into databaseInitCluster()
This commit adds an OverlayDatabaseMode parameter to
databaseInitCluster(). The parameter controls the "codeql database init"
flags concerning overlay database creation.

There is no behavior change in this commit because we always pass
OverlayDatabaseMode.None to databaseInitCluster(). That will change in
the next commit.
2025-03-10 13:22:24 -07:00
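
The flag mapping that this parameter controls can be seen in the compiled codeql.js hunk later on this page. A self-contained sketch of the same logic (the helper name and the string union type are hypothetical):

```typescript
// "codeql database init" flag selection, mirroring the compiled codeql.js
// change further down: Overlay mode passes --overlay and skips the overwrite
// flag, OverlayBase passes --overlay-base, and None keeps today's behavior.
type OverlayMode = "none" | "overlay-base" | "overlay";

function overlayInitFlags(mode: OverlayMode, overwriteFlag: string): string[] {
  switch (mode) {
    case "overlay":
      // An overlay build reuses the overlay-base database, so no overwrite flag.
      return ["--overlay"];
    case "overlay-base":
      return ["--overlay-base", overwriteFlag];
    default:
      return [overwriteFlag];
  }
}
```
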
Andrew Eisenberg
d3762699d1 Update pr-check 2025-03-10 11:22:58 -07:00
Henry Mercer
b46b37a8a3 Merge pull request #2803 from github/dependabot/npm_and_yarn/npm-129f0c3752
build(deps-dev): bump the npm group with 3 updates
2025-03-10 18:01:08 +00:00
dependabot[bot]
aecf01557d build(deps): bump ruby/setup-ruby in the actions group
Bumps the actions group with 1 update: [ruby/setup-ruby](https://github.com/ruby/setup-ruby).


Updates `ruby/setup-ruby` from 1.221.0 to 1.222.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](32110d4e31...277ba2a127)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-10 17:57:35 +00:00
github-actions[bot]
053e2184a0 Update checked-in dependencies 2025-03-10 17:42:57 +00:00
dependabot[bot]
248ab9b811 build(deps-dev): bump the npm group with 3 updates
Bumps the npm group with 3 updates: [@eslint/js](https://github.com/eslint/eslint/tree/HEAD/packages/js), [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin) and [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser).


Updates `@eslint/js` from 9.21.0 to 9.22.0
- [Release notes](https://github.com/eslint/eslint/releases)
- [Changelog](https://github.com/eslint/eslint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslint/commits/v9.22.0/packages/js)

Updates `@typescript-eslint/eslint-plugin` from 8.26.0 to 8.26.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.26.1/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.26.0 to 8.26.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.26.1/packages/parser)

---
updated-dependencies:
- dependency-name: "@eslint/js"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-10 17:42:05 +00:00
Chuan-kai Lin
d76f393713 Do not set --expect-discarded-cache on "cleanup-level: overlay"
When a user specifies "cleanup-level: overlay", it suggests that the
user wishes to preserve the evaluation cache for future use. So in this
case we should not set --expect-discarded-cache when running queries.
2025-03-10 10:32:13 -07:00
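
A minimal sketch of that conditional, matching the compiled analyze.js hunk shown later on this page (parameter names here are simplified):

```typescript
// Build the query flags, preserving the evaluation cache when the user
// requested "cleanup-level: overlay".
function buildQueryFlags(
  cleanupLevel: string,
  memoryFlag: string,
  threadsFlag: string,
): string[] {
  const queryFlags = [memoryFlag, threadsFlag];
  if (cleanupLevel !== "overlay") {
    // The cache will be thrown away during cleanup, so tell the evaluator.
    queryFlags.push("--expect-discarded-cache");
  }
  return queryFlags;
}
```
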
Andrew Eisenberg
88676f2b14 Minimally remove micromatch 2025-03-07 10:07:08 -08:00
Chuan-kai Lin
b2e6519679 Merge pull request #2799 from github/mergeback/v3.28.11-to-main-6bb031af
Mergeback v3.28.11 refs/heads/releases/v3 into main
2025-03-07 08:34:57 -08:00
github-actions[bot]
ff91c9db25 Update checked-in dependencies 2025-03-07 16:12:00 +00:00
github-actions[bot]
d1b3f740d8 Update changelog and version after v3.28.11 2025-03-07 16:09:54 +00:00
Chuan-kai Lin
6bb031afdd Merge pull request #2798 from github/update-v3.28.11-56b25d5d5
Merge main into releases/v3
2025-03-07 08:09:23 -08:00
github-actions[bot]
6bca7dd940 Update changelog for v3.28.11 2025-03-07 14:28:04 +00:00
Chuan-kai Lin
56b25d5d52 Merge pull request #2793 from github/update-bundle/codeql-bundle-v2.20.6
Update default bundle to 2.20.6
2025-03-06 07:12:12 -08:00
Chuan-kai Lin
256aa16582 Merge branch 'main' into update-bundle/codeql-bundle-v2.20.6 2025-03-06 06:59:38 -08:00
Nick Fyson
911d845ab6 Merge pull request #2796 from github/nickfyson/adjust-rate-error-string
adjust string for handling rate limit error
2025-03-06 10:45:00 +00:00
nickfyson
7b7ed63503 adjust string for handling rate limit error 2025-03-06 10:33:25 +00:00
Henry Mercer
608ccd6cd9 Merge pull request #2794 from github/update-supported-enterprise-server-versions
Update supported GitHub Enterprise Server versions
2025-03-05 14:41:52 +00:00
github-actions[bot]
35d04d3627 Update supported GitHub Enterprise Server versions 2025-03-05 00:15:30 +00:00
Chuan-kai Lin
ec3b22164b Update supported GitHub Enterprise Server versions 2025-03-03 13:06:35 -08:00
github-actions[bot]
8dc01f6342 Add changelog note 2025-03-03 20:54:07 +00:00
github-actions[bot]
b378daf0bc Update default bundle to codeql-bundle-v2.20.6 2025-03-03 20:54:03 +00:00
Dave Bartolomeo
80f9930395 Merge pull request #2788 from github/dbartol/use-real-actions-extractor
Use embedded `actions` extractor only for old CLI versions
2025-03-03 13:59:30 -05:00
Angela P Wen
f544ec5e4a Merge pull request #2791 from github/dependabot/npm_and_yarn/npm-24c237cb71
build(deps): bump the npm group with 9 updates
2025-03-03 10:56:56 -08:00
Dave Bartolomeo
d37931ae65 Merge remote-tracking branch 'origin/main' into dbartol/use-real-actions-extractor 2025-03-03 13:01:21 -05:00
Angela P Wen
4b35b04661 Merge pull request #2792 from github/dependabot/github_actions/actions-f0e7f3112e
build(deps): bump actions/create-github-app-token from 1.11.5 to 1.11.6 in the actions group
2025-03-03 09:40:10 -08:00
dependabot[bot]
1a69221aeb build(deps): bump actions/create-github-app-token in the actions group
Bumps the actions group with 1 update: [actions/create-github-app-token](https://github.com/actions/create-github-app-token).


Updates `actions/create-github-app-token` from 1.11.5 to 1.11.6
- [Release notes](https://github.com/actions/create-github-app-token/releases)
- [Commits](https://github.com/actions/create-github-app-token/compare/v1.11.5...v1.11.6)

---
updated-dependencies:
- dependency-name: actions/create-github-app-token
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-03 17:26:51 +00:00
github-actions[bot]
452ffd6e8e Update checked-in dependencies 2025-03-03 17:25:47 +00:00
dependabot[bot]
a8ade63a2f build(deps): bump the npm group with 9 updates
Bumps the npm group with 9 updates:

| Package | From | To |
| --- | --- | --- |
| [@actions/cache](https://github.com/actions/toolkit/tree/HEAD/packages/cache) | `4.0.1` | `4.0.2` |
| [uuid](https://github.com/uuidjs/uuid) | `11.0.5` | `11.1.0` |
| [@eslint/eslintrc](https://github.com/eslint/eslintrc) | `3.2.0` | `3.3.0` |
| [@eslint/js](https://github.com/eslint/eslint/tree/HEAD/packages/js) | `9.20.0` | `9.21.0` |
| [@types/sinon](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/sinon) | `17.0.3` | `17.0.4` |
| [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin) | `8.24.1` | `8.26.0` |
| [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser) | `8.24.1` | `8.26.0` |
| [eslint-import-resolver-typescript](https://github.com/import-js/eslint-import-resolver-typescript) | `3.8.1` | `3.8.3` |
| [typescript](https://github.com/microsoft/TypeScript) | `5.7.3` | `5.8.2` |


Updates `@actions/cache` from 4.0.1 to 4.0.2
- [Changelog](https://github.com/actions/toolkit/blob/main/packages/cache/RELEASES.md)
- [Commits](https://github.com/actions/toolkit/commits/HEAD/packages/cache)

Updates `uuid` from 11.0.5 to 11.1.0
- [Release notes](https://github.com/uuidjs/uuid/releases)
- [Changelog](https://github.com/uuidjs/uuid/blob/main/CHANGELOG.md)
- [Commits](https://github.com/uuidjs/uuid/compare/v11.0.5...v11.1.0)

Updates `@eslint/eslintrc` from 3.2.0 to 3.3.0
- [Release notes](https://github.com/eslint/eslintrc/releases)
- [Changelog](https://github.com/eslint/eslintrc/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslintrc/compare/v3.2.0...v3.3.0)

Updates `@eslint/js` from 9.20.0 to 9.21.0
- [Release notes](https://github.com/eslint/eslint/releases)
- [Changelog](https://github.com/eslint/eslint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslint/commits/v9.21.0/packages/js)

Updates `@types/sinon` from 17.0.3 to 17.0.4
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/sinon)

Updates `@typescript-eslint/eslint-plugin` from 8.24.1 to 8.26.0
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.26.0/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.24.1 to 8.26.0
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.26.0/packages/parser)

Updates `eslint-import-resolver-typescript` from 3.8.1 to 3.8.3
- [Release notes](https://github.com/import-js/eslint-import-resolver-typescript/releases)
- [Changelog](https://github.com/import-js/eslint-import-resolver-typescript/blob/master/CHANGELOG.md)
- [Commits](https://github.com/import-js/eslint-import-resolver-typescript/compare/v3.8.1...v3.8.3)

Updates `typescript` from 5.7.3 to 5.8.2
- [Release notes](https://github.com/microsoft/TypeScript/releases)
- [Changelog](https://github.com/microsoft/TypeScript/blob/main/azure-pipelines.release.yml)
- [Commits](https://github.com/microsoft/TypeScript/compare/v5.7.3...v5.8.2)

---
updated-dependencies:
- dependency-name: "@actions/cache"
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: uuid
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@eslint/eslintrc"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@eslint/js"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@types/sinon"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: eslint-import-resolver-typescript
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: typescript
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-03 17:24:56 +00:00
Henry Mercer
2db5b5a35f Merge pull request #2786 from github/henrymercer/more-config-errors
Add some more configuration errors
2025-03-03 12:21:31 +00:00
Dave Bartolomeo
85e30fe57a Use embedded actions extractor only for old CLI versions 2025-02-27 15:42:11 -05:00
Paolo Tranquilli
83923549f6 Merge pull request #2776 from github/redsun82/just
Do some just+pre-commit tweaking
2025-02-27 12:40:34 +01:00
Paolo Tranquilli
96632630a9 Do some just+pre-commit tweaking
* pre-commit: move the linting check ahead of the compiling one, as a
  typescript lint can change the compiled javascript, so you can end up
  in a situation where the pre-commit check fails twice in a row
* just: add linting and make the default recipe run everything
2025-02-27 08:10:04 +01:00
Ian Lynagh
97aac9bb56 Merge pull request #2785 from github/igfoo/mb
Warn about small amounts of MB, not GB
2025-02-26 16:56:30 +00:00
Henry Mercer
d59d0eb99a Add CLI error for failure to create temp directory 2025-02-26 16:52:50 +00:00
Henry Mercer
0ae74e1ae0 Check for running out of disk space 2025-02-26 16:52:50 +00:00
Ian Lynagh
146dd5cfb0 npm run build 2025-02-26 15:12:53 +00:00
Ian Lynagh
32505c6f2d Warn about small amounts of MB, not GB
The number of GB is at most 2, and can be tiny. MB gives a more
comprehensible range of values.
2025-02-26 15:11:14 +00:00
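
As a rough illustration of the reworded warning (the threshold and message text below are assumptions, not the action's actual values):

```typescript
// Hypothetical sketch of a disk-space warning reported in MB rather than GB:
// with at most a gigabyte or two free, MB yields a more readable number.
function lowDiskSpaceWarning(freeBytes: number): string | undefined {
  const freeMb = Math.floor(freeBytes / (1024 * 1024));
  const thresholdMb = 2048; // assumed ~2 GB threshold
  return freeMb < thresholdMb
    ? `The runner has only ${freeMb} MB of disk space available.`
    : undefined;
}
```
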
Angela P Wen
8c69433c34 Merge pull request #2782 from github/angelapwen/fix-unversioned-immutable-action
PR Checks: use semantic versioning for `create-github-app-token`
2025-02-25 08:49:25 -08:00
Angela P Wen
c4f2a076e5 PR Checks: use semantic versioning for create-github-app-token 2025-02-24 17:06:31 -08:00
Angela P Wen
a8849fbe63 Merge pull request #2781 from github/angelapwen/fix-code-injection-warning
Fix code injection warnings in `check-codescanning-config` internal Action
2025-02-24 16:53:51 -08:00
Angela P Wen
628c1e669a Remove print debugging 2025-02-24 13:29:47 -08:00
Angela P Wen
e12eb8d7c1 Set environment variable in the correct step 2025-02-24 13:24:22 -08:00
Angela P Wen
3b348d9a54 Debug only: print environment variable 2025-02-24 13:18:08 -08:00
Angela P Wen
7567eab606 Fail when expected config does not exist 2025-02-24 13:17:24 -08:00
Angela P Wen
a9f7529f47 Quote expected-config-file-contents input 2025-02-24 13:05:29 -08:00
Angela P Wen
5e88a178fe Update .github/actions/check-codescanning-config/action.yml
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-02-24 12:52:19 -08:00
Angela P Wen
c0a8eb9a67 Use $RUNNER_TEMP for good measure
`runner.temp` is not user-controlled but we replace it with `$RUNNER_TEMP` in any case.
2025-02-24 12:35:51 -08:00
Angela P Wen
286fd68a67 Use env var for EXPECTED_CONFIG_FILE_CONTENTS 2025-02-24 12:35:17 -08:00
Angela P Wen
d3c7d03197 Merge pull request #2780 from github/angelapwen/fix-inconsistent-action-input
Unify `token` description for `resolve-environment`, `start-proxy`, and `upload-sarif`
2025-02-24 12:09:09 -08:00
Angela P Wen
03c921eac5 Unify token description for resolve-environment, start-proxy, and upload-sarif 2025-02-24 11:26:00 -08:00
Ian Lynagh
ff79de67cc Merge pull request #2779 from github/mergeback/v3.28.10-to-main-b56ba49b
Mergeback v3.28.10 refs/heads/releases/v3 into main
2025-02-21 16:32:29 +00:00
github-actions[bot]
5d1a3cb0ee Update checked-in dependencies 2025-02-21 16:12:38 +00:00
github-actions[bot]
2923046360 Update changelog and version after v3.28.10 2025-02-21 16:09:55 +00:00
Ian Lynagh
b56ba49b26 Merge pull request #2778 from github/update-v3.28.10-9856c48b1
Merge main into releases/v3
2025-02-21 16:09:01 +00:00
github-actions[bot]
60c9c77c33 Update changelog for v3.28.10 2025-02-21 15:15:06 +00:00
Paolo Tranquilli
9856c48b1a Merge pull request #2773 from github/redsun82/rust
Support rust analysis
2025-02-20 18:03:30 +01:00
Paolo Tranquilli
9572e09da4 Rust: fix log string 2025-02-20 17:38:35 +01:00
Paolo Tranquilli
1a529366ac Rust: special case default setup 2025-02-20 17:38:02 +01:00
Ian Lynagh
cf7e90952b Merge pull request #2772 from github/update-bundle/codeql-bundle-v2.20.5
Update default bundle to 2.20.5
2025-02-20 14:19:30 +00:00
Ian Lynagh
b7006aab6d Merge branch 'main' into update-bundle/codeql-bundle-v2.20.5 2025-02-20 13:27:14 +00:00
Paolo Tranquilli
cfedae723e Rust: throw configuration errors if requested and not correctly enabled 2025-02-20 11:49:32 +01:00
Paolo Tranquilli
3971ed2a74 Merge branch 'main' into redsun82/rust 2025-02-20 08:13:54 +01:00
Angela P Wen
d38c6e60df Merge pull request #2775 from github/angelapwen/bump-octokit
Upgrade `octokit` to v4.1.2
2025-02-19 11:31:42 -08:00
github-actions[bot]
c0d59dba56 Update checked-in dependencies 2025-02-19 19:16:52 +00:00
Angela P Wen
c1745a9831 Upgrade octokit to v4.1.2 2025-02-19 11:13:12 -08:00
Henry Mercer
67e48c1eaf Merge branch 'main' into update-bundle/codeql-bundle-v2.20.5 2025-02-19 18:38:45 +00:00
Chuan-kai Lin
dbbcbe019d Merge pull request #2765 from github/cklin/alert-diff-filtering
Perform consistent diff-informed alert filtering in the action
2025-02-19 10:15:01 -08:00
Paolo Tranquilli
fb3e7cdd88 Merge pull request #2774 from github/redsun82/sync
Fix sync recipes and add base `justfile`
2025-02-19 17:26:08 +01:00
Paolo Tranquilli
ff50469ca0 Add comments to the justfile 2025-02-19 17:13:51 +01:00
Paolo Tranquilli
d0aab9fc20 Fix sync recipes and add base justfile
Both the justfile and the pre-commit configuration for the `pr-check`
sync were broken:
* justfiles run recipes one line at a time in a fresh shell, so the venv
  activation was not working
* the pre-commit config was relying on an installed `ruamel.yaml`
  package, but the default one installable via `apt` on Ubuntu 24.04 is
  old and generates different output (with formatting differences).

Now:
* the venv dance is put in a separate bash script
* both just and pre-commit will use that same script, so both problems
  will be fixed

As a bonus, a root `justfile` is added exposing the `update-pr-checks`
recipes plus a `build` one. Running `just` without arguments will now
invoke the default recipe, whose `sync` step runs both of the above.
2025-02-19 16:51:46 +01:00
Paolo Tranquilli
c9ebc3bb8b Regenerate workflows with more recent ruamel.yaml 2025-02-19 16:21:48 +01:00
Paolo Tranquilli
a7b17782a9 Support rust analysis
This is supposed to enable rust analysis for the staff ship only.
2025-02-19 15:56:52 +01:00
Chuan-kai Lin
f85d8b5a74 build: refresh js files 2025-02-19 06:26:33 -08:00
Chuan-kai Lin
dae1626680 Filter alerts by pr-diff-range JSON file 2025-02-19 06:26:11 -08:00
Henry Mercer
d99c7e8e5b Merge pull request #2771 from github/revert-2767-cklin/prefer-gtar
Revert "Prefer gtar if available"
2025-02-18 16:05:36 +00:00
github-actions[bot]
eb88b40ca4 Add changelog note 2025-02-18 12:37:24 +00:00
github-actions[bot]
6b1da0d33e Update default bundle to codeql-bundle-v2.20.5 2025-02-18 12:37:20 +00:00
Henry Mercer
906452d251 Merge branch 'main' into revert-2767-cklin/prefer-gtar 2025-02-18 10:47:19 +00:00
Henry Mercer
0656d7fb91 Add changelog note for #2768 2025-02-18 10:45:37 +00:00
Henry Mercer
1bb15d06a6 Merge pull request #2768 from github/smowton/fix/zstd-tarball-trailing-zeros
Pass `--ignore-zeros` to `tar` when decompressing `zstd`-compressed tarballs
2025-02-18 10:42:42 +00:00
Henry Mercer
65a3aa1fbc Revert "Prefer gtar if available" 2025-02-18 10:38:41 +00:00
Henry Mercer
acadfedea5 Merge pull request #2770 from github/dependabot/npm_and_yarn/npm-17cd1da1dd
build(deps): bump the npm group with 5 updates
2025-02-17 19:30:47 +00:00
Henry Mercer
1930ca4359 Merge pull request #2769 from github/dependabot/github_actions/actions-60ccfc8cbe
build(deps): bump the actions group with 2 updates
2025-02-17 19:30:27 +00:00
Henry Mercer
1d4f241470 Update generated workflow source 2025-02-17 19:17:28 +00:00
github-actions[bot]
9dfa165835 Update checked-in dependencies 2025-02-17 18:21:02 +00:00
dependabot[bot]
47d5364431 build(deps): bump the npm group with 5 updates
Bumps the npm group with 5 updates:

| Package | From | To |
| --- | --- | --- |
| [@actions/cache](https://github.com/actions/toolkit/tree/HEAD/packages/cache) | `4.0.0` | `4.0.1` |
| [long](https://github.com/dcodeIO/long.js) | `5.3.0` | `5.3.1` |
| [@typescript-eslint/eslint-plugin](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/eslint-plugin) | `8.23.0` | `8.24.1` |
| [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser) | `8.23.0` | `8.24.1` |
| [eslint-import-resolver-typescript](https://github.com/import-js/eslint-import-resolver-typescript) | `3.7.0` | `3.8.1` |


Updates `@actions/cache` from 4.0.0 to 4.0.1
- [Changelog](https://github.com/actions/toolkit/blob/main/packages/cache/RELEASES.md)
- [Commits](https://github.com/actions/toolkit/commits/HEAD/packages/cache)

Updates `long` from 5.3.0 to 5.3.1
- [Release notes](https://github.com/dcodeIO/long.js/releases)
- [Commits](https://github.com/dcodeIO/long.js/compare/v5.3.0...v5.3.1)

Updates `@typescript-eslint/eslint-plugin` from 8.23.0 to 8.24.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/eslint-plugin/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.24.1/packages/eslint-plugin)

Updates `@typescript-eslint/parser` from 8.23.0 to 8.24.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.24.1/packages/parser)

Updates `eslint-import-resolver-typescript` from 3.7.0 to 3.8.1
- [Release notes](https://github.com/import-js/eslint-import-resolver-typescript/releases)
- [Changelog](https://github.com/import-js/eslint-import-resolver-typescript/blob/master/CHANGELOG.md)
- [Commits](https://github.com/import-js/eslint-import-resolver-typescript/compare/v3.7.0...v3.8.1)

---
updated-dependencies:
- dependency-name: "@actions/cache"
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: long
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@typescript-eslint/eslint-plugin"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: "@typescript-eslint/parser"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: eslint-import-resolver-typescript
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-17 18:20:12 +00:00
dependabot[bot]
30b1c2ae15 build(deps): bump the actions group with 2 updates
Bumps the actions group with 2 updates: [ruby/setup-ruby](https://github.com/ruby/setup-ruby) and [actions/create-github-app-token](https://github.com/actions/create-github-app-token).


Updates `ruby/setup-ruby` from 1.218.0 to 1.221.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](d781c1b4ed...32110d4e31)

Updates `actions/create-github-app-token` from 1.11.3 to 1.11.5
- [Release notes](https://github.com/actions/create-github-app-token/releases)
- [Commits](67e27a7eb7...0d564482f0)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
- dependency-name: actions/create-github-app-token
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-17 17:37:07 +00:00
github-actions[bot]
c4158ff890 Rebuild 2025-02-17 17:21:30 +00:00
Chris Smowton
2be5f244ff Pass --ignore-zeros to tar when decompressing zstd-compressed tarballs
See comment in the diff for full explanation.
2025-02-17 17:04:36 +00:00
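
GNU tar's --ignore-zeros option keeps reading past zero-filled blocks instead of treating them as end-of-archive, which is what trips up extraction of these bundles. A hedged sketch of where the flag fits (only the presence of --ignore-zeros reflects the change; the rest of the argument list is an assumption, not the action's real invocation):

```typescript
// Illustrative tar argument list for extracting a zstd-compressed CodeQL bundle.
function extractZstArgs(tarball: string, destination: string): string[] {
  return ["-x", "--zstd", "--ignore-zeros", "-f", tarball, "-C", destination];
}
```
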
Ian Lynagh
8c1551cdd4 Merge pull request #2767 from github/cklin/prefer-gtar
Prefer gtar if available
2025-02-17 12:31:54 +00:00
Chuan-kai Lin
fc4873bed7 Changelog entry: Prefer gtar if available 2025-02-14 13:57:09 -08:00
Chuan-kai Lin
c3ad6e9deb build: refresh js files 2025-02-14 13:40:54 -08:00
Chuan-kai Lin
61c77a48ff Prefer gtar if available 2025-02-14 13:34:30 -08:00
Chuan-kai Lin
4267fa66a2 getTarVersion(): add programName parameter
This commit changes getTarVersion() so that it receives the name of the
tar program from the caller instead of using the hardcoded string "tar".
2025-02-14 13:24:19 -08:00
Chuan-kai Lin
c4a8587f45 Add TarVersion.name field
This refactoring commit records the name of the tar program in the new
TarVersion.name field and makes extractTarZst() use the new field
instead of the hardcoded name "tar". Code behavior remains unchanged
because currently TarVersion.name is always "tar".

This is the first step toward supporting a tar program under a different
executable name.
2025-02-14 12:08:07 -08:00
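
Taken together, the two tar refactoring commits above amount to threading the program name through, so that the "Prefer gtar if available" change can probe either executable. A simplified, self-contained sketch (the action's actual implementation is asynchronous and parses versions more carefully):

```typescript
import { execFileSync } from "child_process";

// TarVersion records which executable was probed ("tar" or "gtar") so later
// code can prefer gtar. Version parsing here is deliberately naive.
interface TarVersion {
  name: string;
  type: "gnu" | "bsd";
  version: string;
}

function getTarVersion(programName: string): TarVersion {
  const output = execFileSync(programName, ["--version"], { encoding: "utf8" });
  const match = output.match(/\d+\.\d+(\.\d+)?/);
  return {
    name: programName,
    type: output.includes("GNU tar") ? "gnu" : "bsd",
    version: match ? match[0] : "unknown",
  };
}
```
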
Chuan-kai Lin
77bc2a595e Write pr-diff-range JSON file 2025-02-14 08:50:52 -08:00
Henry Mercer
1c15a48f3f Merge pull request #2762 from github/henrymercer/debug-upload-nit
Improve logs for combined SARIF debug artifact
2025-02-12 20:49:09 +00:00
Henry Mercer
3df6d20d31 Improve logs for combined SARIF debug artifact
Don't start a "Uploading combined SARIF debug artifact" log group if we aren't going to do the upload.
2025-02-12 16:27:40 +00:00
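
A minimal sketch of the tweak (withGroup below is a stand-in for the action's logging helper; the compiled form of the real change appears in the debug-artifacts.js diff later on this page):

```typescript
// Only open the "Uploading combined SARIF debug artifact" log group when the
// upload will actually happen. The stand-in emits Actions group markers directly.
async function withGroup<T>(name: string, body: () => Promise<T>): Promise<T> {
  console.log(`::group::${name}`);
  try {
    return await body();
  } finally {
    console.log("::endgroup::");
  }
}

async function maybeUploadCombinedSarif(upload: () => Promise<void>): Promise<void> {
  if (process.env["CODEQL_ACTION_DEBUG_COMBINED_SARIF"] !== "true") {
    return; // nothing to upload, so no log group is started
  }
  await withGroup("Uploading combined SARIF debug artifact", upload);
}
```
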
Michael B. Gale
affec202b3 Merge pull request #2656 from github/go/1.24
Go: Use Go `1.24` in PR checks
2025-02-12 10:03:43 +00:00
Owen Mansel-Chan
a963b41ebd Merge branch 'main' into go/1.24 2025-02-11 22:38:14 +00:00
Owen Mansel-Chan
683c0f5360 Update Go version to 1.24.0 2025-02-11 22:15:05 +00:00
Henry Mercer
6063925771 Merge pull request #2760 from github/dependabot/github_actions/actions-ee85065439
build(deps): bump the actions group with 2 updates
2025-02-10 17:48:20 +00:00
Henry Mercer
67eb53aecb Merge pull request #2759 from github/dependabot/npm_and_yarn/npm-692b17fb19
build(deps): bump the npm group with 5 updates
2025-02-10 17:39:57 +00:00
Henry Mercer
226ab86c29 Update generated workflow source 2025-02-10 17:36:44 +00:00
dependabot[bot]
078f43891a build(deps): bump the actions group with 2 updates
Bumps the actions group with 2 updates: [ruby/setup-ruby](https://github.com/ruby/setup-ruby) and [actions/create-github-app-token](https://github.com/actions/create-github-app-token).


Updates `ruby/setup-ruby` from 1.215.0 to 1.218.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](2654679fe7...d781c1b4ed)

Updates `actions/create-github-app-token` from 1.11.2 to 1.11.3
- [Release notes](https://github.com/actions/create-github-app-token/releases)
- [Commits](136412a57a...67e27a7eb7)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
- dependency-name: actions/create-github-app-token
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-10 17:31:53 +00:00
github-actions[bot]
ccc5046d0b Update checked-in dependencies 2025-02-10 17:23:18 +00:00
dependabot[bot]
8c70d43f73 build(deps): bump the npm group with 5 updates
Bumps the npm group with 5 updates:

| Package | From | To |
| --- | --- | --- |
| [long](https://github.com/dcodeIO/long.js) | `5.2.4` | `5.3.0` |
| [semver](https://github.com/npm/node-semver) | `7.7.0` | `7.7.1` |
| [@eslint/js](https://github.com/eslint/eslint/tree/HEAD/packages/js) | `9.19.0` | `9.20.0` |
| [eslint-plugin-github](https://github.com/github/eslint-plugin-github) | `5.1.7` | `5.1.8` |
| [nock](https://github.com/nock/nock) | `14.0.0` | `14.0.1` |


Updates `long` from 5.2.4 to 5.3.0
- [Release notes](https://github.com/dcodeIO/long.js/releases)
- [Commits](https://github.com/dcodeIO/long.js/compare/v5.2.4...v5.3.0)

Updates `semver` from 7.7.0 to 7.7.1
- [Release notes](https://github.com/npm/node-semver/releases)
- [Changelog](https://github.com/npm/node-semver/blob/main/CHANGELOG.md)
- [Commits](https://github.com/npm/node-semver/compare/v7.7.0...v7.7.1)

Updates `@eslint/js` from 9.19.0 to 9.20.0
- [Release notes](https://github.com/eslint/eslint/releases)
- [Changelog](https://github.com/eslint/eslint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/eslint/eslint/commits/v9.20.0/packages/js)

Updates `eslint-plugin-github` from 5.1.7 to 5.1.8
- [Release notes](https://github.com/github/eslint-plugin-github/releases)
- [Commits](https://github.com/github/eslint-plugin-github/compare/v5.1.7...v5.1.8)

Updates `nock` from 14.0.0 to 14.0.1
- [Release notes](https://github.com/nock/nock/releases)
- [Changelog](https://github.com/nock/nock/blob/main/CHANGELOG.md)
- [Commits](https://github.com/nock/nock/compare/v14.0.0...v14.0.1)

---
updated-dependencies:
- dependency-name: long
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: semver
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: "@eslint/js"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm
- dependency-name: eslint-plugin-github
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
- dependency-name: nock
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-10 17:22:20 +00:00
Arthur Baars
0a35e8f686 Merge pull request #2758 from github/mergeback/v3.28.9-to-main-9e8d0789
Mergeback v3.28.9 refs/heads/releases/v3 into main
2025-02-07 11:58:16 +01:00
github-actions[bot]
fb1a08b0c7 Update checked-in dependencies 2025-02-07 10:36:17 +00:00
github-actions[bot]
fc5ba27156 Update changelog and version after v3.28.9 2025-02-07 10:35:07 +00:00
Arthur Baars
9e8d0789d4 Merge pull request #2757 from github/update-v3.28.9-24e1c2d33
Merge main into releases/v3
2025-02-07 11:34:10 +01:00
github-actions[bot]
43d9be6701 Update changelog for v3.28.9 2025-02-07 10:18:39 +00:00
Owen Mansel-Chan
7b5dd253ad Update Go version to 1.24.0-rc.3 2025-02-06 17:07:29 +00:00
Arthur Baars
24e1c2d337 Merge pull request #2753 from github/update-bundle/codeql-bundle-v2.20.4
Update default bundle to 2.20.4
2025-02-06 11:59:36 +01:00
github-actions[bot]
57a08c0c7f Add changelog note 2025-02-04 11:22:54 +00:00
github-actions[bot]
52189d23af Update default bundle to codeql-bundle-v2.20.4 2025-02-04 11:22:50 +00:00
Owen Mansel-Chan
0d043c929c Update to rc2 2025-01-17 09:20:52 +00:00
Henry Mercer
695f3263e3 Merge branch 'main' into go/1.24 2024-12-19 16:14:19 +00:00
Michael B. Gale
7b4c9fef7d Go: Use 1.24rc1 in PR checks 2024-12-17 15:50:18 +00:00
5267 changed files with 339211 additions and 539067 deletions

View File

@@ -61,11 +61,12 @@ runs:
- name: Check config
working-directory: ${{ github.action_path }}
shell: bash
run: ts-node ./index.ts "${{ runner.temp }}/user-config.yaml" '${{ inputs.expected-config-file-contents }}'
env:
EXPECTED_CONFIG_FILE_CONTENTS: '${{ inputs.expected-config-file-contents }}'
run: ts-node ./index.ts "$RUNNER_TEMP/user-config.yaml" "$EXPECTED_CONFIG_FILE_CONTENTS"
- name: Clean up
shell: bash
if: always()
run: |
rm -rf ${{ runner.temp }}/codescanning-config-cli-test
rm -rf ${{ runner.temp }}/user-config.yaml
rm -rf $RUNNER_TEMP/codescanning-config-cli-test
rm -rf $RUNNER_TEMP/user-config.yaml

View File

@@ -8,7 +8,7 @@ const actualConfig = loadActualConfig()
const rawExpectedConfig = process.argv[3].trim()
if (!rawExpectedConfig) {
core.info('No expected configuration provided')
core.setFailed('No expected configuration provided')
} else {
core.startGroup('Expected generated user config')
core.info(yaml.dump(JSON.parse(rawExpectedConfig)))

View File

@@ -77,7 +77,7 @@ jobs:
setup-kotlin: 'true'
- uses: actions/setup-go@v5
with:
go-version: ~1.23.0
go-version: ~1.24.0
# to avoid potentially misleading autobuilder results where we expect it to download
# dependencies successfully, but they actually come from a warm cache
cache: false

View File

@@ -77,7 +77,7 @@ jobs:
setup-kotlin: 'true'
- uses: actions/setup-go@v5
with:
go-version: ~1.23.0
go-version: ~1.24.0
# to avoid potentially misleading autobuilder results where we expect it to download
# dependencies successfully, but they actually come from a warm cache
cache: false

View File

@@ -77,7 +77,7 @@ jobs:
setup-kotlin: 'true'
- uses: actions/setup-go@v5
with:
go-version: ~1.23.0
go-version: ~1.24.0
# to avoid potentially misleading autobuilder results where we expect it to download
# dependencies successfully, but they actually come from a warm cache
cache: false

View File

@@ -46,7 +46,7 @@ jobs:
use-all-platform-bundle: 'false'
setup-kotlin: 'true'
- name: Set up Ruby
uses: ruby/setup-ruby@2654679fe7f7c29875c669398a8ec0791b8a64a1 # v1.215.0
uses: ruby/setup-ruby@277ba2a127aba66d45bad0fa2dc56f80dbfedffa # v1.222.0
with:
ruby-version: 2.6
- name: Install Code Scanning integration

71
.github/workflows/__rust.yml generated vendored Normal file
View File

@@ -0,0 +1,71 @@
# Warning: This file is generated automatically, and should not be modified.
# Instead, please modify the template in the pr-checks directory and run:
# (cd pr-checks; pip install ruamel.yaml@0.17.31 && python3 sync.py)
# to regenerate this file.
name: PR Check - Rust analysis
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GO111MODULE: auto
on:
push:
branches:
- main
- releases/v*
pull_request:
types:
- opened
- synchronize
- reopened
- ready_for_review
schedule:
- cron: '0 5 * * *'
workflow_dispatch: {}
jobs:
rust:
strategy:
fail-fast: false
matrix:
include:
- os: ubuntu-latest
version: linked
- os: ubuntu-latest
version: default
- os: ubuntu-latest
version: nightly-latest
name: Rust analysis
permissions:
contents: read
security-events: read
timeout-minutes: 45
runs-on: ${{ matrix.os }}
steps:
- name: Check out repository
uses: actions/checkout@v4
- name: Prepare test
id: prepare-test
uses: ./.github/actions/prepare-test
with:
version: ${{ matrix.version }}
use-all-platform-bundle: 'false'
setup-kotlin: 'true'
- uses: ./../action/init
with:
languages: rust
tools: ${{ steps.prepare-test.outputs.tools-url }}
env:
CODEQL_ACTION_RUST_ANALYSIS: true
- uses: ./../action/analyze
id: analysis
with:
upload-database: false
- name: Check database
shell: bash
run: |
RUST_DB="${{ fromJson(steps.analysis.outputs.db-locations).rust }}"
if [[ ! -d "$RUST_DB" ]]; then
echo "Did not create a database for Rust."
exit 1
fi
env:
CODEQL_ACTION_TEST_MODE: true

View File

@@ -168,7 +168,7 @@ jobs:
--draft
- name: Generate token
uses: actions/create-github-app-token@136412a57a7081aa63c935a2cc2918f76c34f514
uses: actions/create-github-app-token@v1.11.6
id: app-token
with:
app-id: ${{ vars.AUTOMATION_APP_ID }}

View File

@@ -124,7 +124,7 @@ jobs:
pull-requests: write # needed to create pull request
steps:
- name: Generate token
uses: actions/create-github-app-token@136412a57a7081aa63c935a2cc2918f76c34f514
uses: actions/create-github-app-token@v1.11.6
id: app-token
with:
app-id: ${{ vars.AUTOMATION_APP_ID }}

View File

@@ -1,20 +1,20 @@
repos:
- repo: local
hooks:
- id: lint-ts
name: Lint typescript code
files: \.ts$
language: system
entry: npm run lint -- --fix
- id: compile-ts
name: Compile typescript
files: \.[tj]s$
language: system
entry: npm run build
pass_filenames: false
- id: lint-ts
name: Lint typescript code
files: \.ts$
language: system
entry: npm run lint -- --fix
- id: pr-checks-sync
name: Synchronize PR check workflows
files: ^.github/workflows/__.*\.yml$|^pr-checks
language: system
entry: python3 pr-checks/sync.py
entry: pr-checks/sync.sh
pass_filenames: false

View File

@@ -6,6 +6,19 @@ See the [releases page](https://github.com/github/codeql-action/releases) for th
No user facing changes.
## 3.28.11 - 07 Mar 2025
- Update default CodeQL bundle version to 2.20.6. [#2793](https://github.com/github/codeql-action/pull/2793)
## 3.28.10 - 21 Feb 2025
- Update default CodeQL bundle version to 2.20.5. [#2772](https://github.com/github/codeql-action/pull/2772)
- Address an issue where the CodeQL Bundle would occasionally fail to decompress on macOS. [#2768](https://github.com/github/codeql-action/pull/2768)
## 3.28.9 - 07 Feb 2025
- Update default CodeQL bundle version to 2.20.4. [#2753](https://github.com/github/codeql-action/pull/2753)
## 3.28.8 - 29 Jan 2025
- Enable support for Kotlin 2.1.10 when running with CodeQL CLI v2.20.3. [#2744](https://github.com/github/codeql-action/pull/2744)

17
justfile Normal file
View File

@@ -0,0 +1,17 @@
# Perform all working copy cleanup operations
all: lint sync
# Lint source typescript
lint:
npm run lint -- --fix
# Sync generated files (javascript and PR checks)
sync: build update-pr-checks
# Perform all necessary steps to update the PR checks
update-pr-checks:
pr-checks/sync.sh
# Transpile typescript code into javascript
build:
npm run build

View File

@@ -60,7 +60,7 @@ async function runWrapper() {
if (config !== undefined) {
const codeql = await (0, codeql_1.getCodeQL)(config.codeQLCmd);
const version = await codeql.getVersion();
await (0, logging_1.withGroup)("Uploading combined SARIF debug artifact", () => debugArtifacts.uploadCombinedSarifArtifacts(logger, config.gitHubVersion.type, version.version));
await debugArtifacts.uploadCombinedSarifArtifacts(logger, config.gitHubVersion.type, version.version);
}
}
}

View File

@@ -1 +1 @@
{"version":3,"file":"analyze-action-post.js","sourceRoot":"","sources":["../src/analyze-action-post.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA;;;;GAIG;AACH,oDAAsC;AAEtC,4DAA8C;AAC9C,6CAAgD;AAChD,qCAAqC;AACrC,iDAA2C;AAC3C,kEAAoD;AACpD,+CAAuC;AACvC,uCAAwD;AACxD,iCAAoE;AAEpE,KAAK,UAAU,UAAU;IACvB,IAAI,CAAC;QACH,WAAW,CAAC,aAAa,EAAE,CAAC;QAC5B,MAAM,MAAM,GAAG,IAAA,0BAAgB,GAAE,CAAC;QAClC,MAAM,aAAa,GAAG,MAAM,IAAA,6BAAgB,GAAE,CAAC;QAC/C,IAAA,gCAAyB,EAAC,aAAa,EAAE,MAAM,CAAC,CAAC;QAEjD,kFAAkF;QAClF,wFAAwF;QACxF,IAAI,OAAO,CAAC,GAAG,CAAC,oBAAM,CAAC,mBAAmB,CAAC,KAAK,MAAM,EAAE,CAAC;YACvD,MAAM,MAAM,GAAG,MAAM,IAAA,wBAAS,EAC5B,WAAW,CAAC,qBAAqB,EAAE,EACnC,MAAM,CACP,CAAC;YACF,IAAI,MAAM,KAAK,SAAS,EAAE,CAAC;gBACzB,MAAM,MAAM,GAAG,MAAM,IAAA,kBAAS,EAAC,MAAM,CAAC,SAAS,CAAC,CAAC;gBACjD,MAAM,OAAO,GAAG,MAAM,MAAM,CAAC,UAAU,EAAE,CAAC;gBAC1C,MAAM,IAAA,mBAAS,EAAC,yCAAyC,EAAE,GAAG,EAAE,CAC9D,cAAc,CAAC,4BAA4B,CACzC,MAAM,EACN,MAAM,CAAC,aAAa,CAAC,IAAI,EACzB,OAAO,CAAC,OAAO,CAChB,CACF,CAAC;YACJ,CAAC;QACH,CAAC;IACH,CAAC;IAAC,OAAO,KAAK,EAAE,CAAC;QACf,IAAI,CAAC,SAAS,CACZ,oCAAoC,IAAA,sBAAe,EAAC,KAAK,CAAC,EAAE,CAC7D,CAAC;IACJ,CAAC;AACH,CAAC;AAED,KAAK,UAAU,EAAE,CAAC"}
{"version":3,"file":"analyze-action-post.js","sourceRoot":"","sources":["../src/analyze-action-post.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA;;;;GAIG;AACH,oDAAsC;AAEtC,4DAA8C;AAC9C,6CAAgD;AAChD,qCAAqC;AACrC,iDAA2C;AAC3C,kEAAoD;AACpD,+CAAuC;AACvC,uCAA6C;AAC7C,iCAAoE;AAEpE,KAAK,UAAU,UAAU;IACvB,IAAI,CAAC;QACH,WAAW,CAAC,aAAa,EAAE,CAAC;QAC5B,MAAM,MAAM,GAAG,IAAA,0BAAgB,GAAE,CAAC;QAClC,MAAM,aAAa,GAAG,MAAM,IAAA,6BAAgB,GAAE,CAAC;QAC/C,IAAA,gCAAyB,EAAC,aAAa,EAAE,MAAM,CAAC,CAAC;QAEjD,kFAAkF;QAClF,wFAAwF;QACxF,IAAI,OAAO,CAAC,GAAG,CAAC,oBAAM,CAAC,mBAAmB,CAAC,KAAK,MAAM,EAAE,CAAC;YACvD,MAAM,MAAM,GAAG,MAAM,IAAA,wBAAS,EAC5B,WAAW,CAAC,qBAAqB,EAAE,EACnC,MAAM,CACP,CAAC;YACF,IAAI,MAAM,KAAK,SAAS,EAAE,CAAC;gBACzB,MAAM,MAAM,GAAG,MAAM,IAAA,kBAAS,EAAC,MAAM,CAAC,SAAS,CAAC,CAAC;gBACjD,MAAM,OAAO,GAAG,MAAM,MAAM,CAAC,UAAU,EAAE,CAAC;gBAC1C,MAAM,cAAc,CAAC,4BAA4B,CAC/C,MAAM,EACN,MAAM,CAAC,aAAa,CAAC,IAAI,EACzB,OAAO,CAAC,OAAO,CAChB,CAAC;YACJ,CAAC;QACH,CAAC;IACH,CAAC;IAAC,OAAO,KAAK,EAAE,CAAC;QACf,IAAI,CAAC,SAAS,CACZ,oCAAoC,IAAA,sBAAe,EAAC,KAAK,CAAC,EAAE,CAC7D,CAAC;IACJ,CAAC;AACH,CAAC;AAED,KAAK,UAAU,EAAE,CAAC"}

7
lib/analyze-action.js generated
View File

@@ -200,11 +200,12 @@ async function run() {
await (0, analyze_1.warnIfGoInstalledAfterInit)(config, logger);
await runAutobuildIfLegacyGoWorkflow(config, logger);
dbCreationTimings = await (0, analyze_1.runFinalize)(outputDir, threads, memory, codeql, config, logger);
const cleanupLevel = actionsUtil.getOptionalInput("cleanup-level") || "brutal";
if (actionsUtil.getRequiredInput("skip-queries") !== "true") {
runStats = await (0, analyze_1.runQueries)(outputDir, memory, util.getAddSnippetsFlag(actionsUtil.getRequiredInput("add-snippets")), threads, diffRangePackDir, actionsUtil.getOptionalInput("category"), config, logger, features);
runStats = await (0, analyze_1.runQueries)(outputDir, memory, util.getAddSnippetsFlag(actionsUtil.getRequiredInput("add-snippets")), threads, cleanupLevel, diffRangePackDir, actionsUtil.getOptionalInput("category"), config, logger, features);
}
if (actionsUtil.getOptionalInput("cleanup-level") !== "none") {
await (0, analyze_1.runCleanup)(config, actionsUtil.getOptionalInput("cleanup-level") || "brutal", logger);
if (cleanupLevel !== "none") {
await (0, analyze_1.runCleanup)(config, cleanupLevel, logger);
}
const dbLocations = {};
for (const language of config.languages) {

File diff suppressed because one or more lines are too long

21
lib/analyze.js generated
View File

@@ -55,6 +55,7 @@ const api_client_1 = require("./api-client");
const autobuild_1 = require("./autobuild");
const codeql_1 = require("./codeql");
const diagnostics_1 = require("./diagnostics");
const diff_filtering_utils_1 = require("./diff-filtering-utils");
const environment_1 = require("./environment");
const feature_flags_1 = require("./feature-flags");
const languages_1 = require("./languages");
@@ -368,23 +369,27 @@ extensions:
const extensionFilePath = path.join(diffRangeDir, "pr-diff-range.yml");
fs.writeFileSync(extensionFilePath, extensionContents);
logger.debug(`Wrote pr-diff-range extension pack to ${extensionFilePath}:\n${extensionContents}`);
// Write the diff ranges to a JSON file, for action-side alert filtering by the
// upload-lib module.
(0, diff_filtering_utils_1.writeDiffRangesJsonFile)(logger, ranges);
return diffRangeDir;
}
// Runs queries and creates sarif files in the given folder
async function runQueries(sarifFolder, memoryFlag, addSnippetsFlag, threadsFlag, diffRangePackDir, automationDetailsId, config, logger, features) {
async function runQueries(sarifFolder, memoryFlag, addSnippetsFlag, threadsFlag, cleanupLevel, diffRangePackDir, automationDetailsId, config, logger, features) {
const statusReport = {};
const queryFlags = [memoryFlag, threadsFlag];
if (cleanupLevel !== "overlay") {
queryFlags.push("--expect-discarded-cache");
}
statusReport.analysis_is_diff_informed = diffRangePackDir !== undefined;
const dataExtensionFlags = diffRangePackDir
? [
`--additional-packs=${diffRangePackDir}`,
"--extension-packs=codeql-action/pr-diff-range",
]
: [];
if (diffRangePackDir) {
queryFlags.push(`--additional-packs=${diffRangePackDir}`);
queryFlags.push("--extension-packs=codeql-action/pr-diff-range");
}
const sarifRunPropertyFlag = diffRangePackDir
? "--sarif-run-property=incrementalMode=diff-informed"
: undefined;
const codeql = await (0, codeql_1.getCodeQL)(config.codeQLCmd);
const queryFlags = [memoryFlag, threadsFlag, ...dataExtensionFlags];
for (const language of config.languages) {
try {
const sarifFile = path.join(sarifFolder, `${language}.sarif`);

File diff suppressed because one or more lines are too long

2
lib/analyze.test.js generated
View File

@@ -114,7 +114,7 @@ const util = __importStar(require("./util"));
fs.mkdirSync(util.getCodeQLDatabasePath(config, language), {
recursive: true,
});
const statusReport = await (0, analyze_1.runQueries)(tmpDir, memoryFlag, addSnippetsFlag, threadsFlag, undefined, undefined, config, (0, logging_1.getRunnerLogger)(true), (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.QaTelemetryEnabled]));
const statusReport = await (0, analyze_1.runQueries)(tmpDir, memoryFlag, addSnippetsFlag, threadsFlag, "brutal", undefined, undefined, config, (0, logging_1.getRunnerLogger)(true), (0, testing_utils_1.createFeatures)([feature_flags_1.Feature.QaTelemetryEnabled]));
t.deepEqual(Object.keys(statusReport).sort(), [
"analysis_is_diff_informed",
`analyze_builtin_queries_${language}_duration_ms`,

File diff suppressed because one or more lines are too long

2
lib/api-client.js generated
View File

@@ -206,7 +206,7 @@ async function deleteActionsCache(id) {
}
function wrapApiConfigurationError(e) {
if ((0, util_1.isHTTPError)(e)) {
if (e.message.includes("API rate limit exceeded for site ID installation") ||
if (e.message.includes("API rate limit exceeded for installation") ||
e.message.includes("commit not found") ||
/^ref .* not found in this repository$/.test(e.message)) {
return new util_1.ConfigurationError(e.message);

File diff suppressed because one or more lines are too long

View File

@@ -1 +1 @@
{ "maximumVersion": "3.16", "minimumVersion": "3.12" }
{ "maximumVersion": "3.17", "minimumVersion": "3.12" }

4
lib/cli-errors.js generated
View File

@@ -110,6 +110,7 @@ function extractAutobuildErrors(error) {
var CliConfigErrorCategory;
(function (CliConfigErrorCategory) {
CliConfigErrorCategory["AutobuildError"] = "AutobuildError";
CliConfigErrorCategory["CouldNotCreateTempDir"] = "CouldNotCreateTempDir";
CliConfigErrorCategory["ExternalRepositoryCloneFailed"] = "ExternalRepositoryCloneFailed";
CliConfigErrorCategory["GradleBuildFailed"] = "GradleBuildFailed";
CliConfigErrorCategory["IncompatibleWithActionVersion"] = "IncompatibleWithActionVersion";
@@ -139,6 +140,9 @@ exports.cliErrorsConfig = {
new RegExp("We were unable to automatically build your code"),
],
},
[CliConfigErrorCategory.CouldNotCreateTempDir]: {
cliErrorMessageCandidates: [new RegExp("Could not create temp directory")],
},
[CliConfigErrorCategory.ExternalRepositoryCloneFailed]: {
cliErrorMessageCandidates: [
new RegExp("Failed to clone external Git repository"),

File diff suppressed because one or more lines are too long

34
lib/codeql.js generated
View File

@@ -55,6 +55,7 @@ const environment_1 = require("./environment");
const feature_flags_1 = require("./feature-flags");
const git_utils_1 = require("./git-utils");
const languages_1 = require("./languages");
const overlay_database_utils_1 = require("./overlay-database-utils");
const setupCodeql = __importStar(require("./setup-codeql"));
const tools_features_1 = require("./tools-features");
const tracer_config_1 = require("./tracer-config");
@@ -133,7 +134,11 @@ async function setupCodeQL(toolsInput, apiDetails, tempDir, variant, defaultCliV
};
}
catch (e) {
throw new Error(`Unable to download and extract CodeQL CLI: ${(0, util_1.getErrorMessage)(e)}${e instanceof Error && e.stack ? `\n\nDetails: ${e.stack}` : ""}`);
const ErrorClass = e instanceof util.ConfigurationError ||
(e instanceof Error && e.message.includes("ENOSPC")) // out of disk space
? util.ConfigurationError
: Error;
throw new ErrorClass(`Unable to download and extract CodeQL CLI: ${(0, util_1.getErrorMessage)(e)}${e instanceof Error && e.stack ? `\n\nDetails: ${e.stack}` : ""}`);
}
}
/**
@@ -250,7 +255,7 @@ async function getCodeQLForCmd(cmd, checkVersion) {
async supportsFeature(feature) {
return (0, tools_features_1.isSupportedToolsFeature)(await this.getVersion(), feature);
},
async databaseInitCluster(config, sourceRoot, processName, qlconfigFile, logger) {
async databaseInitCluster(config, sourceRoot, processName, qlconfigFile, overlayDatabaseMode, logger) {
const extraArgs = config.languages.map((language) => `--language=${language}`);
if (await (0, tracer_config_1.shouldEnableIndirectTracing)(codeql, config)) {
extraArgs.push("--begin-tracing");
@@ -258,9 +263,17 @@ async function getCodeQLForCmd(cmd, checkVersion) {
extraArgs.push(`--trace-process-name=${processName}`);
}
if (config.languages.indexOf(languages_1.Language.actions) >= 0) {
extraArgs.push("--search-path");
const extractorPath = path.resolve(__dirname, "../actions-extractor");
extraArgs.push(extractorPath);
// We originally added an embedded version of the Actions extractor to the CodeQL Action
// itself in order to deploy the extractor between CodeQL releases. When we did add the
// extractor to the CLI, though, its autobuild script was missing the execute bit.
// 2.20.6 is the first CLI release with the fully-functional extractor in the CLI. For older
// versions, we'll keep using the embedded extractor. We can remove the embedded extractor
// once 2.20.6 is deployed in the runner images.
if (!(await util.codeQlVersionAtLeast(codeql, "2.20.6"))) {
extraArgs.push("--search-path");
const extractorPath = path.resolve(__dirname, "../actions-extractor");
extraArgs.push(extractorPath);
}
}
const codeScanningConfigFile = await generateCodeScanningConfig(config, logger);
const externalRepositoryToken = (0, actions_util_1.getOptionalInput)("external-repository-token");
@@ -278,10 +291,18 @@ async function getCodeQLForCmd(cmd, checkVersion) {
const overwriteFlag = (0, tools_features_1.isSupportedToolsFeature)(await this.getVersion(), tools_features_1.ToolsFeature.ForceOverwrite)
? "--force-overwrite"
: "--overwrite";
if (overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.Overlay) {
extraArgs.push("--overlay");
}
else if (overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.OverlayBase) {
extraArgs.push("--overlay-base");
}
await runCli(cmd, [
"database",
"init",
overwriteFlag,
...(overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.Overlay
? []
: [overwriteFlag]),
"--db-cluster",
config.dbLocation,
`--source-root=${sourceRoot}`,
@@ -446,7 +467,6 @@ async function getCodeQLForCmd(cmd, checkVersion) {
"run-queries",
...flags,
databasePath,
"--expect-discarded-cache",
"--intra-layer-parallelism",
"--min-disk-free=1024", // Try to leave at least 1GB free
"-v",

File diff suppressed because one or more lines are too long

11
lib/codeql.test.js generated
View File

@@ -53,6 +53,7 @@ const defaults = __importStar(require("./defaults.json"));
const doc_url_1 = require("./doc-url");
const languages_1 = require("./languages");
const logging_1 = require("./logging");
const overlay_database_utils_1 = require("./overlay-database-utils");
const setup_codeql_1 = require("./setup-codeql");
const testing_utils_1 = require("./testing-utils");
const tools_features_1 = require("./tools-features");
@@ -335,7 +336,7 @@ const injectedConfigMacro = ava_1.default.macro({
tempDir,
augmentationProperties,
};
await codeqlObject.databaseInitCluster(thisStubConfig, "", undefined, undefined, (0, logging_1.getRunnerLogger)(true));
await codeqlObject.databaseInitCluster(thisStubConfig, "", undefined, undefined, overlay_database_utils_1.OverlayDatabaseMode.None, (0, logging_1.getRunnerLogger)(true));
const args = runnerConstructorStub.firstCall.args[1];
// should have used an config file
const configArg = args.find((arg) => arg.startsWith("--codescanning-config="));
@@ -471,7 +472,7 @@ const injectedConfigMacro = ava_1.default.macro({
const runnerConstructorStub = stubToolRunnerConstructor();
const codeqlObject = await codeql.getCodeQLForTesting();
sinon.stub(codeqlObject, "getVersion").resolves((0, testing_utils_1.makeVersionInfo)("2.17.6"));
await codeqlObject.databaseInitCluster({ ...stubConfig, tempDir }, "", undefined, "/path/to/qlconfig.yml", (0, logging_1.getRunnerLogger)(true));
await codeqlObject.databaseInitCluster({ ...stubConfig, tempDir }, "", undefined, "/path/to/qlconfig.yml", overlay_database_utils_1.OverlayDatabaseMode.None, (0, logging_1.getRunnerLogger)(true));
const args = runnerConstructorStub.firstCall.args[1];
// should have used a config file
const hasCodeScanningConfigArg = args.some((arg) => arg.startsWith("--codescanning-config="));
@@ -487,7 +488,7 @@ const injectedConfigMacro = ava_1.default.macro({
const codeqlObject = await codeql.getCodeQLForTesting();
sinon.stub(codeqlObject, "getVersion").resolves((0, testing_utils_1.makeVersionInfo)("2.17.6"));
await codeqlObject.databaseInitCluster({ ...stubConfig, tempDir }, "", undefined, undefined, // undefined qlconfigFile
(0, logging_1.getRunnerLogger)(true));
overlay_database_utils_1.OverlayDatabaseMode.None, (0, logging_1.getRunnerLogger)(true));
const args = runnerConstructorStub.firstCall.args[1];
const hasQlconfigArg = args.some((arg) => arg.startsWith("--qlconfig-file="));
t.false(hasQlconfigArg, "should NOT have injected a qlconfig");
@@ -612,7 +613,7 @@ for (const { codeqlVersion, flagPassed, githubVersion, negativeFlagPassed, } of
sinon.stub(io, "which").resolves("");
await t.throwsAsync(async () => await codeqlObject.databaseRunQueries(stubConfig.dbLocation, []), {
instanceOf: cli_errors_1.CliError,
message: `Encountered a fatal error while running "codeql-for-testing database run-queries --expect-discarded-cache --intra-layer-parallelism --min-disk-free=1024 -v". Exit code was 1 and error was: Oops! A fatal internal error occurred. Details:
message: `Encountered a fatal error while running "codeql-for-testing database run-queries --intra-layer-parallelism --min-disk-free=1024 -v". Exit code was 1 and error was: Oops! A fatal internal error occurred. Details:
com.semmle.util.exception.CatastrophicError: An error occurred while evaluating ControlFlowGraph::ControlFlow::Root.isRootOf/1#dispred#f610e6ed/2@86282cc8
Severe disk cache trouble (corruption or out of space) at /home/runner/work/_temp/codeql_databases/go/db-go/default/cache/pages/28/33.pack: Failed to write item to disk. See the logs for more details.`,
});
@@ -638,7 +639,7 @@ for (const { codeqlVersion, flagPassed, githubVersion, negativeFlagPassed, } of
sinon.stub(io, "which").resolves("");
process.env["CODEQL_ACTION_EXTRA_OPTIONS"] =
'{ "database": { "init": ["--overwrite"] } }';
await codeqlObject.databaseInitCluster(stubConfig, "sourceRoot", undefined, undefined, (0, logging_1.getRunnerLogger)(false));
await codeqlObject.databaseInitCluster(stubConfig, "sourceRoot", undefined, undefined, overlay_database_utils_1.OverlayDatabaseMode.None, (0, logging_1.getRunnerLogger)(false));
t.true(runnerConstructorStub.calledOnce);
const args = runnerConstructorStub.firstCall.args[1];
t.is(args.filter((option) => option === "--overwrite").length, 1, "--overwrite should only be passed once");

File diff suppressed because one or more lines are too long

38
lib/debug-artifacts.js generated
View File

@@ -66,26 +66,28 @@ async function uploadCombinedSarifArtifacts(logger, gitHubVariant, codeQlVersion
const tempDir = (0, actions_util_1.getTemporaryDirectory)();
// Upload Actions SARIF artifacts for debugging when environment variable is set
if (process.env["CODEQL_ACTION_DEBUG_COMBINED_SARIF"] === "true") {
logger.info("Uploading available combined SARIF files as Actions debugging artifact...");
const baseTempDir = path.resolve(tempDir, "combined-sarif");
const toUpload = [];
if (fs.existsSync(baseTempDir)) {
const outputDirs = fs.readdirSync(baseTempDir);
for (const outputDir of outputDirs) {
const sarifFiles = fs
.readdirSync(path.resolve(baseTempDir, outputDir))
.filter((f) => f.endsWith(".sarif"));
for (const sarifFile of sarifFiles) {
toUpload.push(path.resolve(baseTempDir, outputDir, sarifFile));
await (0, logging_1.withGroup)("Uploading combined SARIF debug artifact", async () => {
logger.info("Uploading available combined SARIF files as Actions debugging artifact...");
const baseTempDir = path.resolve(tempDir, "combined-sarif");
const toUpload = [];
if (fs.existsSync(baseTempDir)) {
const outputDirs = fs.readdirSync(baseTempDir);
for (const outputDir of outputDirs) {
const sarifFiles = fs
.readdirSync(path.resolve(baseTempDir, outputDir))
.filter((f) => f.endsWith(".sarif"));
for (const sarifFile of sarifFiles) {
toUpload.push(path.resolve(baseTempDir, outputDir, sarifFile));
}
}
}
}
try {
await uploadDebugArtifacts(logger, toUpload, baseTempDir, "combined-sarif-artifacts", gitHubVariant, codeQlVersion);
}
catch (e) {
logger.warning(`Failed to upload combined SARIF files as Actions debugging artifact. Reason: ${(0, util_1.getErrorMessage)(e)}`);
}
try {
await uploadDebugArtifacts(logger, toUpload, baseTempDir, "combined-sarif-artifacts", gitHubVariant, codeQlVersion);
}
catch (e) {
logger.warning(`Failed to upload combined SARIF files as Actions debugging artifact. Reason: ${(0, util_1.getErrorMessage)(e)}`);
}
});
}
}
/**

File diff suppressed because one or more lines are too long


@@ -1,6 +1,6 @@
{
"bundleVersion": "codeql-bundle-v2.20.3",
"cliVersion": "2.20.3",
"priorBundleVersion": "codeql-bundle-v2.20.2",
"priorCliVersion": "2.20.2"
"bundleVersion": "codeql-bundle-v2.20.6",
"cliVersion": "2.20.6",
"priorBundleVersion": "codeql-bundle-v2.20.5",
"priorCliVersion": "2.20.5"
}

lib/diff-filtering-utils.js generated Normal file

@@ -0,0 +1,60 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
var ownKeys = function(o) {
ownKeys = Object.getOwnPropertyNames || function (o) {
var ar = [];
for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
return ar;
};
return ownKeys(o);
};
return function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
__setModuleDefault(result, mod);
return result;
};
})();
Object.defineProperty(exports, "__esModule", { value: true });
exports.writeDiffRangesJsonFile = writeDiffRangesJsonFile;
exports.readDiffRangesJsonFile = readDiffRangesJsonFile;
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const actionsUtil = __importStar(require("./actions-util"));
function getDiffRangesJsonFilePath() {
return path.join(actionsUtil.getTemporaryDirectory(), "pr-diff-range.json");
}
function writeDiffRangesJsonFile(logger, ranges) {
const jsonContents = JSON.stringify(ranges, null, 2);
const jsonFilePath = getDiffRangesJsonFilePath();
fs.writeFileSync(jsonFilePath, jsonContents);
logger.debug(`Wrote pr-diff-range JSON file to ${jsonFilePath}:\n${jsonContents}`);
}
function readDiffRangesJsonFile(logger) {
const jsonFilePath = getDiffRangesJsonFilePath();
if (!fs.existsSync(jsonFilePath)) {
logger.debug(`Diff ranges JSON file does not exist at ${jsonFilePath}`);
return undefined;
}
const jsonContents = fs.readFileSync(jsonFilePath, "utf8");
logger.debug(`Read pr-diff-range JSON file from ${jsonFilePath}:\n${jsonContents}`);
return JSON.parse(jsonContents);
}
//# sourceMappingURL=diff-filtering-utils.js.map
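The new module above is a small JSON round-trip helper for persisting pull-request diff ranges. A usage sketch, assuming it runs on an Actions runner (getTemporaryDirectory() needs RUNNER_TEMP) and that the range shape matches the path/startLine/endLine fields consumed by upload-lib.js further down; the example values are hypothetical.

import { getRunnerLogger } from "./logging";
import {
  readDiffRangesJsonFile,
  writeDiffRangesJsonFile,
} from "./diff-filtering-utils";

const logger = getRunnerLogger(true);

// Persist the line ranges touched by the pull request (hypothetical values).
writeDiffRangesJsonFile(logger, [
  { path: "/checkout/src/app.ts", startLine: 10, endLine: 42 },
]);

// Later, e.g. during SARIF upload, read them back; returns undefined if the
// init step never wrote the file.
const ranges = readDiffRangesJsonFile(logger);
logger.debug(`Loaded ${ranges?.length ?? 0} diff range(s)`);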


@@ -0,0 +1 @@
{"version":3,"file":"diff-filtering-utils.js","sourceRoot":"","sources":["../src/diff-filtering-utils.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAgBA,0DAUC;AAED,wDAaC;AAzCD,uCAAyB;AACzB,2CAA6B;AAE7B,4DAA8C;AAS9C,SAAS,yBAAyB;IAChC,OAAO,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,qBAAqB,EAAE,EAAE,oBAAoB,CAAC,CAAC;AAC9E,CAAC;AAED,SAAgB,uBAAuB,CACrC,MAAc,EACd,MAAwB;IAExB,MAAM,YAAY,GAAG,IAAI,CAAC,SAAS,CAAC,MAAM,EAAE,IAAI,EAAE,CAAC,CAAC,CAAC;IACrD,MAAM,YAAY,GAAG,yBAAyB,EAAE,CAAC;IACjD,EAAE,CAAC,aAAa,CAAC,YAAY,EAAE,YAAY,CAAC,CAAC;IAC7C,MAAM,CAAC,KAAK,CACV,oCAAoC,YAAY,MAAM,YAAY,EAAE,CACrE,CAAC;AACJ,CAAC;AAED,SAAgB,sBAAsB,CACpC,MAAc;IAEd,MAAM,YAAY,GAAG,yBAAyB,EAAE,CAAC;IACjD,IAAI,CAAC,EAAE,CAAC,UAAU,CAAC,YAAY,CAAC,EAAE,CAAC;QACjC,MAAM,CAAC,KAAK,CAAC,2CAA2C,YAAY,EAAE,CAAC,CAAC;QACxE,OAAO,SAAS,CAAC;IACnB,CAAC;IACD,MAAM,YAAY,GAAG,EAAE,CAAC,YAAY,CAAC,YAAY,EAAE,MAAM,CAAC,CAAC;IAC3D,MAAM,CAAC,KAAK,CACV,qCAAqC,YAAY,MAAM,YAAY,EAAE,CACtE,CAAC;IACF,OAAO,IAAI,CAAC,KAAK,CAAC,YAAY,CAAqB,CAAC;AACtD,CAAC"}

lib/feature-flags.js generated

@@ -68,6 +68,7 @@ var Feature;
Feature["ExtractToToolcache"] = "extract_to_toolcache";
Feature["PythonDefaultIsToNotExtractStdlib"] = "python_default_is_to_not_extract_stdlib";
Feature["QaTelemetryEnabled"] = "qa_telemetry_enabled";
Feature["RustAnalysis"] = "rust_analysis";
Feature["ZstdBundleStreamingExtraction"] = "zstd_bundle_streaming_extraction";
})(Feature || (exports.Feature = Feature = {}));
exports.featureConfig = {
@@ -132,6 +133,11 @@ exports.featureConfig = {
minimumVersion: undefined,
toolsFeature: tools_features_1.ToolsFeature.PythonDefaultIsToNotExtractStdlib,
},
[Feature.RustAnalysis]: {
defaultValue: false,
envVar: "CODEQL_ACTION_RUST_ANALYSIS",
minimumVersion: "2.19.3",
},
[Feature.QaTelemetryEnabled]: {
defaultValue: false,
envVar: "CODEQL_ACTION_QA_TELEMETRY",

File diff suppressed because one or more lines are too long

lib/git-utils.js generated

@@ -33,7 +33,7 @@ var __importStar = (this && this.__importStar) || (function () {
};
})();
Object.defineProperty(exports, "__esModule", { value: true });
exports.decodeGitFilePath = exports.getGitDiffHunkHeaders = exports.getAllGitMergeBases = exports.gitRepack = exports.gitFetch = exports.deepenGitHistory = exports.determineBaseBranchHeadCommitOid = exports.getCommitOid = void 0;
exports.getGitRoot = exports.decodeGitFilePath = exports.gitRepack = exports.gitFetch = exports.deepenGitHistory = exports.determineBaseBranchHeadCommitOid = exports.getCommitOid = void 0;
exports.getRef = getRef;
exports.isAnalyzingDefaultBranch = isAnalyzingDefaultBranch;
const core = __importStar(require("@actions/core"));
@@ -185,61 +185,6 @@ const gitRepack = async function (flags) {
}
};
exports.gitRepack = gitRepack;
/**
* Compute the all merge bases between the given refs. Returns an empty array
* if no merge base is found, or if there is an error.
*
* This function uses the `checkout_path` to determine the repository path and
* works only when called from `analyze` or `upload-sarif`.
*/
const getAllGitMergeBases = async function (refs) {
try {
const stdout = await runGitCommand((0, actions_util_1.getOptionalInput)("checkout_path"), ["merge-base", "--all", ...refs], `Cannot get merge base of ${refs}.`);
return stdout.trim().split("\n");
}
catch {
return [];
}
};
exports.getAllGitMergeBases = getAllGitMergeBases;
/**
* Compute the diff hunk headers between the two given refs.
*
* This function uses the `checkout_path` to determine the repository path and
* works only when called from `analyze` or `upload-sarif`.
*
* @returns an array of diff hunk headers (one element per line), or undefined
* if the action was not triggered by a pull request, or if the diff could not
* be determined.
*/
const getGitDiffHunkHeaders = async function (fromRef, toRef) {
let stdout = "";
try {
stdout = await runGitCommand((0, actions_util_1.getOptionalInput)("checkout_path"), [
"-c",
"core.quotePath=false",
"diff",
"--no-renames",
"--irreversible-delete",
"-U0",
fromRef,
toRef,
], `Cannot get diff from ${fromRef} to ${toRef}.`);
}
catch {
return undefined;
}
const headers = [];
for (const line of stdout.split("\n")) {
if (line.startsWith("--- ") ||
line.startsWith("+++ ") ||
line.startsWith("@@ ")) {
headers.push(line);
}
}
return headers;
};
exports.getGitDiffHunkHeaders = getGitDiffHunkHeaders;
/**
* Decode, if necessary, a file path produced by Git. See
* https://git-scm.com/docs/git-config#Documentation/git-config.txt-corequotePath
@@ -285,6 +230,23 @@ const decodeGitFilePath = function (filePath) {
return filePath;
};
exports.decodeGitFilePath = decodeGitFilePath;
/**
* Get the root of the Git repository.
*
* @param sourceRoot The source root of the code being analyzed.
* @returns The root of the Git repository.
*/
const getGitRoot = async function (sourceRoot) {
try {
const stdout = await runGitCommand(sourceRoot, ["rev-parse", "--show-toplevel"], `Cannot find Git repository root from the source root ${sourceRoot}.`);
return stdout.trim();
}
catch {
// Errors are already logged by runGitCommand()
return undefined;
}
};
exports.getGitRoot = getGitRoot;
function getRefFromEnv() {
// To workaround a limitation of Actions dynamic workflows not setting
// the GITHUB_REF in some cases, we accept also the ref within the
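The new getGitRoot helper added above shells out to `git rev-parse --show-toplevel` and swallows failures. A small usage sketch; the calling function is hypothetical.

import { getGitRoot } from "./git-utils";

// Returns the absolute repository root for a directory inside the checkout,
// or undefined if the directory is not inside a git repository (errors are
// already logged by runGitCommand inside getGitRoot).
async function resolveRepoRoot(sourceRoot: string): Promise<string | undefined> {
  return await getGitRoot(sourceRoot);
}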

File diff suppressed because one or more lines are too long

lib/init-action.js generated

@@ -37,6 +37,7 @@ const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const core = __importStar(require("@actions/core"));
const io = __importStar(require("@actions/io"));
const semver = __importStar(require("semver"));
const uuid_1 = require("uuid");
const actions_util_1 = require("./actions-util");
const api_client_1 = require("./api-client");
@@ -49,6 +50,7 @@ const feature_flags_1 = require("./feature-flags");
const init_1 = require("./init");
const languages_1 = require("./languages");
const logging_1 = require("./logging");
const overlay_database_utils_1 = require("./overlay-database-utils");
const repository_1 = require("./repository");
const setup_codeql_1 = require("./setup-codeql");
const status_report_1 = require("./status-report");
@@ -227,7 +229,12 @@ async function run() {
return;
}
try {
(0, init_1.cleanupDatabaseClusterDirectory)(config, logger);
const sourceRoot = path.resolve((0, util_1.getRequiredEnvParam)("GITHUB_WORKSPACE"), (0, actions_util_1.getOptionalInput)("source-root") || "");
const overlayDatabaseMode = await (0, init_1.getOverlayDatabaseMode)((await codeql.getVersion()).version, config, sourceRoot, logger);
logger.info(`Using overlay database mode: ${overlayDatabaseMode}`);
if (overlayDatabaseMode !== overlay_database_utils_1.OverlayDatabaseMode.Overlay) {
(0, init_1.cleanupDatabaseClusterDirectory)(config, logger);
}
if (zstdAvailability) {
await recordZstdAvailability(config, zstdAvailability);
}
@@ -345,6 +352,26 @@ async function run() {
logger.info(`Setting C++ build-mode: none to ${value}`);
core.exportVariable(bmnVar, value);
}
// Set CODEQL_ENABLE_EXPERIMENTAL_FEATURES for rust
if (config.languages.includes(languages_1.Language.rust)) {
const feat = feature_flags_1.Feature.RustAnalysis;
const minVer = feature_flags_1.featureConfig[feat].minimumVersion;
const envVar = "CODEQL_ENABLE_EXPERIMENTAL_FEATURES";
// if in default setup, it means the feature flag was on when rust was enabled
// if the feature flag gets turned off, let's not have rust analysis throwing a configuration error
// in that case rust analysis will be disabled only when default setup is refreshed
if ((0, actions_util_1.isDefaultSetup)() || (await features.getValue(feat, codeql))) {
core.exportVariable(envVar, "true");
}
if (process.env[envVar] !== "true") {
throw new util_1.ConfigurationError(`Experimental and not officially supported Rust analysis requires setting ${envVar}=true in the environment`);
}
const actualVer = (await codeql.getVersion()).version;
if (semver.lt(actualVer, minVer)) {
throw new util_1.ConfigurationError(`Experimental rust analysis is supported by CodeQL CLI version ${minVer} or higher, but found version ${actualVer}`);
}
logger.info("Experimental rust analysis enabled");
}
// Restore dependency cache(s), if they exist.
if ((0, caching_utils_1.shouldRestoreCache)(config.dependencyCachingEnabled)) {
await (0, dependency_caching_1.downloadDependencyCaches)(config.languages, logger);
@@ -387,8 +414,7 @@ async function run() {
core.exportVariable("CODEQL_EXTRACTOR_PYTHON_EXTRACT_STDLIB", "true");
}
}
const sourceRoot = path.resolve((0, util_1.getRequiredEnvParam)("GITHUB_WORKSPACE"), (0, actions_util_1.getOptionalInput)("source-root") || "");
const tracerConfig = await (0, init_1.runInit)(codeql, config, sourceRoot, "Runner.Worker.exe", (0, actions_util_1.getOptionalInput)("registries"), apiDetails, logger);
const tracerConfig = await (0, init_1.runInit)(codeql, config, sourceRoot, "Runner.Worker.exe", (0, actions_util_1.getOptionalInput)("registries"), apiDetails, overlayDatabaseMode, logger);
if (tracerConfig !== undefined) {
for (const [key, value] of Object.entries(tracerConfig.env)) {
core.exportVariable(key, value);
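The Rust gating added in this file boils down to three checks: export the experimental-features variable when default setup or the feature flag allows it, fail if the variable is still unset, and fail if the CLI is older than the feature's minimum version. The condensed, standalone restatement below is illustrative; it uses a plain Error where the Action throws ConfigurationError, and wiring the checks into one function is an assumption.

import * as semver from "semver";

async function assertRustAnalysisAllowed(
  cliVersion: string,        // (await codeql.getVersion()).version
  featureFlagEnabled: boolean, // features.getValue(Feature.RustAnalysis, codeql)
  isDefaultSetup: boolean,
): Promise<void> {
  const envVar = "CODEQL_ENABLE_EXPERIMENTAL_FEATURES";
  const minVer = "2.19.3"; // featureConfig[Feature.RustAnalysis].minimumVersion

  // Default setup implies the feature flag was on when Rust was enabled, so
  // keep working even if the flag is later turned off.
  if (isDefaultSetup || featureFlagEnabled) {
    process.env[envVar] = "true";
  }
  if (process.env[envVar] !== "true") {
    throw new Error(
      `Experimental and not officially supported Rust analysis requires setting ${envVar}=true in the environment`,
    );
  }
  if (semver.lt(cliVersion, minVer)) {
    throw new Error(
      `Experimental rust analysis is supported by CodeQL CLI version ${minVer} or higher, but found version ${cliVersion}`,
    );
  }
}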

File diff suppressed because one or more lines are too long

lib/init.js generated

@@ -35,6 +35,7 @@ var __importStar = (this && this.__importStar) || (function () {
Object.defineProperty(exports, "__esModule", { value: true });
exports.initCodeQL = initCodeQL;
exports.initConfig = initConfig;
exports.getOverlayDatabaseMode = getOverlayDatabaseMode;
exports.runInit = runInit;
exports.printPathFiltersWarning = printPathFiltersWarning;
exports.checkInstallPython311 = checkInstallPython311;
@@ -43,10 +44,13 @@ const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
const toolrunner = __importStar(require("@actions/exec/lib/toolrunner"));
const io = __importStar(require("@actions/io"));
const semver = __importStar(require("semver"));
const actions_util_1 = require("./actions-util");
const codeql_1 = require("./codeql");
const configUtils = __importStar(require("./config-utils"));
const git_utils_1 = require("./git-utils");
const languages_1 = require("./languages");
const overlay_database_utils_1 = require("./overlay-database-utils");
const tools_features_1 = require("./tools-features");
const tracer_config_1 = require("./tracer-config");
const util = __importStar(require("./util"));
@@ -73,7 +77,33 @@ async function initConfig(inputs, codeql) {
logger.endGroup();
return config;
}
async function runInit(codeql, config, sourceRoot, processName, registriesInput, apiDetails, logger) {
async function getOverlayDatabaseMode(codeqlVersion, config, sourceRoot, logger) {
const overlayDatabaseMode = process.env.CODEQL_OVERLAY_DATABASE_MODE;
if (overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.Overlay ||
overlayDatabaseMode === overlay_database_utils_1.OverlayDatabaseMode.OverlayBase) {
if (config.buildMode !== util.BuildMode.None) {
logger.warning(`Cannot build an ${overlayDatabaseMode} database because ` +
`build-mode is set to "${config.buildMode}" instead of "none". ` +
"Falling back to creating a normal full database instead.");
return overlay_database_utils_1.OverlayDatabaseMode.None;
}
if (semver.lt(codeqlVersion, overlay_database_utils_1.CODEQL_OVERLAY_MINIMUM_VERSION)) {
logger.warning(`Cannot build an ${overlayDatabaseMode} database because ` +
`the CodeQL CLI is older than ${overlay_database_utils_1.CODEQL_OVERLAY_MINIMUM_VERSION}. ` +
"Falling back to creating a normal full database instead.");
return overlay_database_utils_1.OverlayDatabaseMode.None;
}
if ((await (0, git_utils_1.getGitRoot)(sourceRoot)) === undefined) {
logger.warning(`Cannot build an ${overlayDatabaseMode} database because ` +
`the source root "${sourceRoot}" is not inside a git repository. ` +
"Falling back to creating a normal full database instead.");
return overlay_database_utils_1.OverlayDatabaseMode.None;
}
return overlayDatabaseMode;
}
return overlay_database_utils_1.OverlayDatabaseMode.None;
}
async function runInit(codeql, config, sourceRoot, processName, registriesInput, apiDetails, overlayDatabaseMode, logger) {
fs.mkdirSync(config.dbLocation, { recursive: true });
const { registriesAuthTokens, qlconfigFile } = await configUtils.generateRegistries(registriesInput, config.tempDir, logger);
await configUtils.wrapEnvironment({
@@ -81,7 +111,7 @@ async function runInit(codeql, config, sourceRoot, processName, registriesInput,
CODEQL_REGISTRIES_AUTH: registriesAuthTokens,
},
// Init a database cluster
async () => await codeql.databaseInitCluster(config, sourceRoot, processName, qlconfigFile, logger));
async () => await codeql.databaseInitCluster(config, sourceRoot, processName, qlconfigFile, overlayDatabaseMode, logger));
return await (0, tracer_config_1.getCombinedTracerConfig)(codeql, config);
}
function printPathFiltersWarning(config, logger) {
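getOverlayDatabaseMode above falls back to a normal full database unless every precondition holds. The compact restatement below is a sketch: the standalone signature and the isInsideGitRepo callback are assumptions for illustration, and the "none" comparison stands in for util.BuildMode.None.

import * as semver from "semver";
import {
  CODEQL_OVERLAY_MINIMUM_VERSION,
  OverlayDatabaseMode,
} from "./overlay-database-utils";

async function decideOverlayMode(
  requested: OverlayDatabaseMode | undefined, // parsed from CODEQL_OVERLAY_DATABASE_MODE
  buildMode: string | undefined,              // config.buildMode
  codeqlVersion: string,
  isInsideGitRepo: () => Promise<boolean>,
): Promise<OverlayDatabaseMode> {
  if (
    requested !== OverlayDatabaseMode.Overlay &&
    requested !== OverlayDatabaseMode.OverlayBase
  ) {
    return OverlayDatabaseMode.None; // overlay builds not requested
  }
  if (buildMode !== "none") {
    return OverlayDatabaseMode.None; // requires build-mode: none
  }
  if (semver.lt(codeqlVersion, CODEQL_OVERLAY_MINIMUM_VERSION)) {
    return OverlayDatabaseMode.None; // CLI older than 2.20.5
  }
  if (!(await isInsideGitRepo())) {
    return OverlayDatabaseMode.None; // source root must be inside a git repository
  }
  return requested;
}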

View File

@@ -1 +1 @@
{"version":3,"file":"init.js","sourceRoot":"","sources":["../src/init.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAoBA,gCAyCC;AAED,gCAgBC;AAED,0BAkCC;AAED,0DAeC;AAMD,sDAkBC;AAED,0EAkDC;AAhND,uCAAyB;AACzB,2CAA6B;AAE7B,yEAA2D;AAC3D,gDAAkC;AAElC,iDAAsE;AAEtE,qCAA+C;AAC/C,4DAA8C;AAE9C,2CAA0D;AAK1D,qDAAgD;AAChD,mDAAwE;AACxE,6CAA+B;AAExB,KAAK,UAAU,UAAU,CAC9B,UAA8B,EAC9B,UAA4B,EAC5B,OAAe,EACf,OAA2B,EAC3B,iBAA2C,EAC3C,QAA2B,EAC3B,MAAc;IAQd,MAAM,CAAC,UAAU,CAAC,oBAAoB,CAAC,CAAC;IACxC,MAAM,EACJ,MAAM,EACN,yBAAyB,EACzB,WAAW,EACX,YAAY,EACZ,gBAAgB,GACjB,GAAG,MAAM,IAAA,oBAAW,EACnB,UAAU,EACV,UAAU,EACV,OAAO,EACP,OAAO,EACP,iBAAiB,EACjB,MAAM,EACN,QAAQ,EACR,IAAI,CACL,CAAC;IACF,MAAM,MAAM,CAAC,YAAY,EAAE,CAAC;IAC5B,MAAM,CAAC,QAAQ,EAAE,CAAC;IAClB,OAAO;QACL,MAAM;QACN,yBAAyB;QACzB,WAAW;QACX,YAAY;QACZ,gBAAgB;KACjB,CAAC;AACJ,CAAC;AAEM,KAAK,UAAU,UAAU,CAC9B,MAAoC,EACpC,MAAc;IAEd,MAAM,MAAM,GAAG,MAAM,CAAC,MAAM,CAAC;IAC7B,MAAM,CAAC,UAAU,CAAC,6BAA6B,CAAC,CAAC;IACjD,MAAM,MAAM,GAAG,MAAM,WAAW,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC;IACpD,IACE,CAAC,CAAC,MAAM,MAAM,CAAC,eAAe,CAC5B,6BAAY,CAAC,kCAAkC,CAChD,CAAC,EACF,CAAC;QACD,uBAAuB,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IAC1C,CAAC;IACD,MAAM,CAAC,QAAQ,EAAE,CAAC;IAClB,OAAO,MAAM,CAAC;AAChB,CAAC;AAEM,KAAK,UAAU,OAAO,CAC3B,MAAc,EACd,MAA0B,EAC1B,UAAkB,EAClB,WAA+B,EAC/B,eAAmC,EACnC,UAAoC,EACpC,MAAc;IAEd,EAAE,CAAC,SAAS,CAAC,MAAM,CAAC,UAAU,EAAE,EAAE,SAAS,EAAE,IAAI,EAAE,CAAC,CAAC;IAErD,MAAM,EAAE,oBAAoB,EAAE,YAAY,EAAE,GAC1C,MAAM,WAAW,CAAC,kBAAkB,CAClC,eAAe,EACf,MAAM,CAAC,OAAO,EACd,MAAM,CACP,CAAC;IACJ,MAAM,WAAW,CAAC,eAAe,CAC/B;QACE,YAAY,EAAE,UAAU,CAAC,IAAI;QAC7B,sBAAsB,EAAE,oBAAoB;KAC7C;IAED,0BAA0B;IAC1B,KAAK,IAAI,EAAE,CACT,MAAM,MAAM,CAAC,mBAAmB,CAC9B,MAAM,EACN,UAAU,EACV,WAAW,EACX,YAAY,EACZ,MAAM,CACP,CACJ,CAAC;IACF,OAAO,MAAM,IAAA,uCAAuB,EAAC,MAAM,EAAE,MAAM,CAAC,CAAC;AACvD,CAAC;AAED,SAAgB,uBAAuB,CACrC,MAA0B,EAC1B,MAAc;IAEd,qEAAqE;IACrE,sEAAsE;IACtE,IACE,CAAC,MAAM,CAAC,iBAAiB,CAAC,KAAK,EAAE,MAAM;QACrC,MAAM,CAAC,iBAAiB,CAAC,cAAc,CAAC,EAAE,MAAM,CAAC;QACnD,CAAC,MAAM,CAAC,SAAS,CAAC,KAAK,CAAC,6BAAiB,CAAC,EAC1C,CAAC;QACD,MAAM,CAAC,OAAO,CACZ,mGAAmG,CACpG,CAAC;IACJ,CAAC;AACH,CAAC;AAED;;;GAGG;AACI,KAAK,UAAU,qBAAqB,CACzC,SAAqB,EACrB,MAAc;IAEd,IACE,SAAS,CAAC,QAAQ,CAAC,oBAAQ,CAAC,MAAM,CAAC;QACnC,OAAO,CAAC,QAAQ,KAAK,OAAO;QAC5B,CAAC,CAAC,MAAM,MAAM,CAAC,UAAU,EAAE,CAAC,CAAC,QAAQ,EAAE,iBAAiB,EACxD,CAAC;QACD,MAAM,MAAM,GAAG,IAAI,CAAC,OAAO,CACzB,SAAS,EACT,iBAAiB,EACjB,oBAAoB,CACrB,CAAC;QACF,MAAM,IAAI,UAAU,CAAC,UAAU,CAAC,MAAM,EAAE,CAAC,KAAK,CAAC,YAAY,EAAE,IAAI,CAAC,EAAE;YAClE,MAAM;SACP,CAAC,CAAC,IAAI,EAAE,CAAC;IACZ,CAAC;AACH,CAAC;AAED,SAAgB,+BAA+B,CAC7C,MAA0B,EAC1B,MAAc;AACd,+FAA+F;AAC/F,eAAe;AACf,MAAM,GAAG,EAAE,CAAC,MAAM;IAElB,IACE,EAAE,CAAC,UAAU,CAAC,MAAM,CAAC,UAAU,CAAC;QAChC,CAAC,EAAE,CAAC,QAAQ,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,MAAM,EAAE;YACtC,EAAE,CAAC,WAAW,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC,EAC3C,CAAC;QACD,MAAM,CAAC,OAAO,CACZ,kCAAkC,MAAM,CAAC,UAAU,4CAA4C,CAChG,CAAC;QACF,IAAI,CAAC;YACH,MAAM,CAAC,MAAM,CAAC,UAAU,EAAE;gBACxB,KAAK,EAAE,IAAI;gBACX,UAAU,EAAE,CAAC;gBACb,SAAS,EAAE,IAAI;aAChB,CAAC,CAAC;YAEH,MAAM,CAAC,IAAI,CACT,yCAAyC,MAAM,CAAC,UAAU,GAAG,CAC9D,CAAC;QACJ,CAAC;QAAC,OAAO,CAAC,EAAE,CAAC;YACX,MAAM,KAAK,GAAG,mEACZ,IAAA,+BAAgB,EAAC,aAAa,CAAC;gBAC7B,CAAC,CAAC,sCAAsC,MAAM,CAAC,UAAU,IAAI;gBAC7D,CAAC,CAAC,kCAAkC,MAAM,CAAC,UAAU,IAAI;oBACvD,yEACN,iEAAiE,CAAC;YAElE,kGAAkG;YAClG,IAAI,IAAA,iCAAkB,GAAE,EAAE,CAAC;gBACzB,MAAM,IAAI,IAAI,CAAC,kBAAkB,CAC/B,GAAG,KAAK,4GAA4G;oBAClH,sEAAsE,IAAI,CAAC,eAAe,CACxF,CAAC,CACF,EAAE,CACN,CAAC;YACJ,CAAC;iBAAM,CAAC;gBACN,MAAM,IAAI,KAAK,CACb,GAAG,KAAK,sDAAsD;oBAC5D,+EAA+E;oBAC/E,yCAAyC,IAA
I,CAAC,eAAe,CAAC,CAAC,CAAC,EAAE,CACrE,CAAC;YACJ,CAAC;QACH,CAAC;IACH,CAAC;AACH,CAAC"}
{"version":3,"file":"init.js","sourceRoot":"","sources":["../src/init.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AA0BA,gCAyCC;AAED,gCAgBC;AAED,wDAuCC;AAED,0BAoCC;AAED,0DAeC;AAMD,sDAkBC;AAED,0EAkDC;AAjQD,uCAAyB;AACzB,2CAA6B;AAE7B,yEAA2D;AAC3D,gDAAkC;AAClC,+CAAiC;AAEjC,iDAAsE;AAEtE,qCAA+C;AAC/C,4DAA8C;AAE9C,2CAAyC;AACzC,2CAA0D;AAE1D,qEAGkC;AAIlC,qDAAgD;AAChD,mDAAwE;AACxE,6CAA+B;AAExB,KAAK,UAAU,UAAU,CAC9B,UAA8B,EAC9B,UAA4B,EAC5B,OAAe,EACf,OAA2B,EAC3B,iBAA2C,EAC3C,QAA2B,EAC3B,MAAc;IAQd,MAAM,CAAC,UAAU,CAAC,oBAAoB,CAAC,CAAC;IACxC,MAAM,EACJ,MAAM,EACN,yBAAyB,EACzB,WAAW,EACX,YAAY,EACZ,gBAAgB,GACjB,GAAG,MAAM,IAAA,oBAAW,EACnB,UAAU,EACV,UAAU,EACV,OAAO,EACP,OAAO,EACP,iBAAiB,EACjB,MAAM,EACN,QAAQ,EACR,IAAI,CACL,CAAC;IACF,MAAM,MAAM,CAAC,YAAY,EAAE,CAAC;IAC5B,MAAM,CAAC,QAAQ,EAAE,CAAC;IAClB,OAAO;QACL,MAAM;QACN,yBAAyB;QACzB,WAAW;QACX,YAAY;QACZ,gBAAgB;KACjB,CAAC;AACJ,CAAC;AAEM,KAAK,UAAU,UAAU,CAC9B,MAAoC,EACpC,MAAc;IAEd,MAAM,MAAM,GAAG,MAAM,CAAC,MAAM,CAAC;IAC7B,MAAM,CAAC,UAAU,CAAC,6BAA6B,CAAC,CAAC;IACjD,MAAM,MAAM,GAAG,MAAM,WAAW,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC;IACpD,IACE,CAAC,CAAC,MAAM,MAAM,CAAC,eAAe,CAC5B,6BAAY,CAAC,kCAAkC,CAChD,CAAC,EACF,CAAC;QACD,uBAAuB,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IAC1C,CAAC;IACD,MAAM,CAAC,QAAQ,EAAE,CAAC;IAClB,OAAO,MAAM,CAAC;AAChB,CAAC;AAEM,KAAK,UAAU,sBAAsB,CAC1C,aAAqB,EACrB,MAA0B,EAC1B,UAAkB,EAClB,MAAc;IAEd,MAAM,mBAAmB,GAAG,OAAO,CAAC,GAAG,CAAC,4BAA4B,CAAC;IAErE,IACE,mBAAmB,KAAK,4CAAmB,CAAC,OAAO;QACnD,mBAAmB,KAAK,4CAAmB,CAAC,WAAW,EACvD,CAAC;QACD,IAAI,MAAM,CAAC,SAAS,KAAK,IAAI,CAAC,SAAS,CAAC,IAAI,EAAE,CAAC;YAC7C,MAAM,CAAC,OAAO,CACZ,mBAAmB,mBAAmB,oBAAoB;gBACxD,yBAAyB,MAAM,CAAC,SAAS,uBAAuB;gBAChE,0DAA0D,CAC7D,CAAC;YACF,OAAO,4CAAmB,CAAC,IAAI,CAAC;QAClC,CAAC;QACD,IAAI,MAAM,CAAC,EAAE,CAAC,aAAa,EAAE,uDAA8B,CAAC,EAAE,CAAC;YAC7D,MAAM,CAAC,OAAO,CACZ,mBAAmB,mBAAmB,oBAAoB;gBACxD,gCAAgC,uDAA8B,IAAI;gBAClE,0DAA0D,CAC7D,CAAC;YACF,OAAO,4CAAmB,CAAC,IAAI,CAAC;QAClC,CAAC;QACD,IAAI,CAAC,MAAM,IAAA,sBAAU,EAAC,UAAU,CAAC,CAAC,KAAK,SAAS,EAAE,CAAC;YACjD,MAAM,CAAC,OAAO,CACZ,mBAAmB,mBAAmB,oBAAoB;gBACxD,oBAAoB,UAAU,oCAAoC;gBAClE,0DAA0D,CAC7D,CAAC;YACF,OAAO,4CAAmB,CAAC,IAAI,CAAC;QAClC,CAAC;QACD,OAAO,mBAA0C,CAAC;IACpD,CAAC;IACD,OAAO,4CAAmB,CAAC,IAAI,CAAC;AAClC,CAAC;AAEM,KAAK,UAAU,OAAO,CAC3B,MAAc,EACd,MAA0B,EAC1B,UAAkB,EAClB,WAA+B,EAC/B,eAAmC,EACnC,UAAoC,EACpC,mBAAwC,EACxC,MAAc;IAEd,EAAE,CAAC,SAAS,CAAC,MAAM,CAAC,UAAU,EAAE,EAAE,SAAS,EAAE,IAAI,EAAE,CAAC,CAAC;IAErD,MAAM,EAAE,oBAAoB,EAAE,YAAY,EAAE,GAC1C,MAAM,WAAW,CAAC,kBAAkB,CAClC,eAAe,EACf,MAAM,CAAC,OAAO,EACd,MAAM,CACP,CAAC;IACJ,MAAM,WAAW,CAAC,eAAe,CAC/B;QACE,YAAY,EAAE,UAAU,CAAC,IAAI;QAC7B,sBAAsB,EAAE,oBAAoB;KAC7C;IAED,0BAA0B;IAC1B,KAAK,IAAI,EAAE,CACT,MAAM,MAAM,CAAC,mBAAmB,CAC9B,MAAM,EACN,UAAU,EACV,WAAW,EACX,YAAY,EACZ,mBAAmB,EACnB,MAAM,CACP,CACJ,CAAC;IACF,OAAO,MAAM,IAAA,uCAAuB,EAAC,MAAM,EAAE,MAAM,CAAC,CAAC;AACvD,CAAC;AAED,SAAgB,uBAAuB,CACrC,MAA0B,EAC1B,MAAc;IAEd,qEAAqE;IACrE,sEAAsE;IACtE,IACE,CAAC,MAAM,CAAC,iBAAiB,CAAC,KAAK,EAAE,MAAM;QACrC,MAAM,CAAC,iBAAiB,CAAC,cAAc,CAAC,EAAE,MAAM,CAAC;QACnD,CAAC,MAAM,CAAC,SAAS,CAAC,KAAK,CAAC,6BAAiB,CAAC,EAC1C,CAAC;QACD,MAAM,CAAC,OAAO,CACZ,mGAAmG,CACpG,CAAC;IACJ,CAAC;AACH,CAAC;AAED;;;GAGG;AACI,KAAK,UAAU,qBAAqB,CACzC,SAAqB,EACrB,MAAc;IAEd,IACE,SAAS,CAAC,QAAQ,CAAC,oBAAQ,CAAC,MAAM,CAAC;QACnC,OAAO,CAAC,QAAQ,KAAK,OAAO;QAC5B,CAAC,CAAC,MAAM,MAAM,CAAC,UAAU,EAAE,CAAC,CAAC,QAAQ,EAAE,iBAAiB,EACxD,CAAC;QACD,MAAM,MAAM,GAAG,IAAI,CAAC,OAAO,CACzB,SAAS,EACT,iBAAiB,EACjB,oBAAoB,CACrB,CAAC;QACF,MAAM,IAAI,UAAU,CAAC,UAAU,CAAC,MAAM,EAAE,CAAC,KAAK,CAAC,YAAY,EAAE,IAAI,CAAC,EAAE;YAClE,MAAM;SACP,CAAC,CAAC,IAAI,EAAE,CAAC;IACZ,CAAC;AACH,CAAC;AAED,SAAgB,+
BAA+B,CAC7C,MAA0B,EAC1B,MAAc;AACd,+FAA+F;AAC/F,eAAe;AACf,MAAM,GAAG,EAAE,CAAC,MAAM;IAElB,IACE,EAAE,CAAC,UAAU,CAAC,MAAM,CAAC,UAAU,CAAC;QAChC,CAAC,EAAE,CAAC,QAAQ,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,MAAM,EAAE;YACtC,EAAE,CAAC,WAAW,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC,EAC3C,CAAC;QACD,MAAM,CAAC,OAAO,CACZ,kCAAkC,MAAM,CAAC,UAAU,4CAA4C,CAChG,CAAC;QACF,IAAI,CAAC;YACH,MAAM,CAAC,MAAM,CAAC,UAAU,EAAE;gBACxB,KAAK,EAAE,IAAI;gBACX,UAAU,EAAE,CAAC;gBACb,SAAS,EAAE,IAAI;aAChB,CAAC,CAAC;YAEH,MAAM,CAAC,IAAI,CACT,yCAAyC,MAAM,CAAC,UAAU,GAAG,CAC9D,CAAC;QACJ,CAAC;QAAC,OAAO,CAAC,EAAE,CAAC;YACX,MAAM,KAAK,GAAG,mEACZ,IAAA,+BAAgB,EAAC,aAAa,CAAC;gBAC7B,CAAC,CAAC,sCAAsC,MAAM,CAAC,UAAU,IAAI;gBAC7D,CAAC,CAAC,kCAAkC,MAAM,CAAC,UAAU,IAAI;oBACvD,yEACN,iEAAiE,CAAC;YAElE,kGAAkG;YAClG,IAAI,IAAA,iCAAkB,GAAE,EAAE,CAAC;gBACzB,MAAM,IAAI,IAAI,CAAC,kBAAkB,CAC/B,GAAG,KAAK,4GAA4G;oBAClH,sEAAsE,IAAI,CAAC,eAAe,CACxF,CAAC,CACF,EAAE,CACN,CAAC;YACJ,CAAC;iBAAM,CAAC;gBACN,MAAM,IAAI,KAAK,CACb,GAAG,KAAK,sDAAsD;oBAC5D,+EAA+E;oBAC/E,yCAAyC,IAAI,CAAC,eAAe,CAAC,CAAC,CAAC,EAAE,CACrE,CAAC;YACJ,CAAC;QACH,CAAC;IACH,CAAC;AACH,CAAC"}

lib/overlay-database-utils.js generated Normal file

@@ -0,0 +1,11 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.CODEQL_OVERLAY_MINIMUM_VERSION = exports.OverlayDatabaseMode = void 0;
var OverlayDatabaseMode;
(function (OverlayDatabaseMode) {
OverlayDatabaseMode["Overlay"] = "overlay";
OverlayDatabaseMode["OverlayBase"] = "overlay-base";
OverlayDatabaseMode["None"] = "none";
})(OverlayDatabaseMode || (exports.OverlayDatabaseMode = OverlayDatabaseMode = {}));
exports.CODEQL_OVERLAY_MINIMUM_VERSION = "2.20.5";
//# sourceMappingURL=overlay-database-utils.js.map


@@ -0,0 +1 @@
{"version":3,"file":"overlay-database-utils.js","sourceRoot":"","sources":["../src/overlay-database-utils.ts"],"names":[],"mappings":";;;AAAA,IAAY,mBAIX;AAJD,WAAY,mBAAmB;IAC7B,0CAAmB,CAAA;IACnB,mDAA4B,CAAA;IAC5B,oCAAa,CAAA;AACf,CAAC,EAJW,mBAAmB,mCAAnB,mBAAmB,QAI9B;AAEY,QAAA,8BAA8B,GAAG,QAAQ,CAAC"}

lib/tar.js generated

@@ -143,7 +143,16 @@ async function extractTarZst(tar, dest, tarVersion, logger) {
: ""}`);
try {
// Initialize args
const args = ["-x", "--zstd"];
//
// `--ignore-zeros` means that trailing zero bytes at the end of an archive will be read
// by `tar` in case a further concatenated archive follows. Otherwise when a tarball built
// by GNU tar, which writes many trailing zeroes, is read by BSD tar, which expects less, then
// BSD tar can hang up the pipe to its filter program early, and if that program is `zstd`
// then it will try to write the remaining zeroes, get an EPIPE error because `tar` has closed
// its end of the pipe, return 1, and `tar` will pass the error along.
//
// See also https://github.com/facebook/zstd/issues/4294
const args = ["-x", "--zstd", "--ignore-zeros"];
if (tarVersion.type === "gnu") {
// Suppress warnings when using GNU tar to extract archives created by BSD tar
args.push("--warning=no-unknown-keyword");
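The comment block above explains why `--ignore-zeros` is now passed unconditionally. As a sketch, an equivalent standalone extraction might look like the following, assuming GNU tar is on the PATH (the --warning flag is GNU-specific) and the destination directory already exists; the Action's own code additionally handles streaming the download and capturing stderr.

import { spawn } from "child_process";

// Illustrative only: extracts a .tar.zst using the same flags built above.
function extractTarZstSketch(archive: string, dest: string): Promise<void> {
  const args = [
    "-x",
    "--zstd",
    "--ignore-zeros",               // tolerate long runs of trailing zero blocks
    "--warning=no-unknown-keyword", // GNU tar only: ignore keywords written by BSD tar
    "-f",
    archive,
    "-C",
    dest,
  ];
  return new Promise((resolve, reject) => {
    const proc = spawn("tar", args, { stdio: "inherit" });
    proc.on("error", reject);
    proc.on("exit", (code) =>
      code === 0 ? resolve() : reject(new Error(`tar exited with code ${code}`)),
    );
  });
}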

File diff suppressed because one or more lines are too long

lib/upload-lib.js generated

@@ -58,6 +58,7 @@ const api = __importStar(require("./api-client"));
const api_client_1 = require("./api-client");
const codeql_1 = require("./codeql");
const config_utils_1 = require("./config-utils");
const diff_filtering_utils_1 = require("./diff-filtering-utils");
const environment_1 = require("./environment");
const fingerprints = __importStar(require("./fingerprints"));
const gitUtils = __importStar(require("./git-utils"));
@@ -412,6 +413,7 @@ async function uploadFiles(sarifPath, checkoutPath, category, features, logger)
validateSarifFileSchema(file, logger);
}
let sarif = await combineSarifFilesUsingCLI(sarifFiles, gitHubVersion, features, logger);
sarif = filterAlertsByDiffRange(logger, sarif);
sarif = await fingerprints.addFingerprints(sarif, checkoutPath, logger);
const analysisKey = await api.getAnalysisKey();
const environment = actionsUtil.getRequiredInput("matrix");
@@ -607,4 +609,43 @@ class InvalidSarifUploadError extends Error {
}
}
exports.InvalidSarifUploadError = InvalidSarifUploadError;
function filterAlertsByDiffRange(logger, sarif) {
const diffRanges = (0, diff_filtering_utils_1.readDiffRangesJsonFile)(logger);
if (!diffRanges?.length) {
return sarif;
}
const checkoutPath = actionsUtil.getRequiredInput("checkout_path");
for (const run of sarif.runs) {
if (run.results) {
run.results = run.results.filter((result) => {
const locations = [
...(result.locations || []).map((loc) => loc.physicalLocation),
...(result.relatedLocations || []).map((loc) => loc.physicalLocation),
];
return locations.some((physicalLocation) => {
const locationUri = physicalLocation?.artifactLocation?.uri;
const locationStartLine = physicalLocation?.region?.startLine;
if (!locationUri || locationStartLine === undefined) {
return false;
}
// CodeQL always uses forward slashes as the path separator, so on Windows we
// need to replace any backslashes with forward slashes.
const locationPath = path
.join(checkoutPath, locationUri)
.replaceAll(path.sep, "/");
// Alert filtering here replicates the same behavior as the restrictAlertsTo
// extensible predicate in CodeQL. See the restrictAlertsTo documentation
// https://codeql.github.com/codeql-standard-libraries/csharp/codeql/util/AlertFiltering.qll/predicate.AlertFiltering$restrictAlertsTo.3.html
// for more details, such as why the filtering applies only to the first line
// of an alert location.
return diffRanges.some((range) => range.path === locationPath &&
((range.startLine <= locationStartLine &&
range.endLine >= locationStartLine) ||
(range.startLine === 0 && range.endLine === 0)));
});
});
}
}
return sarif;
}
//# sourceMappingURL=upload-lib.js.map
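The filtering added above keeps an alert only if the first line of one of its locations falls inside a diff range for the same file (or inside a whole-file range encoded as startLine = endLine = 0). The standalone restatement below mirrors the predicate in filterAlertsByDiffRange; the DiffRange interface name and the example values are assumptions for illustration.

// Mirrors the matching rule used by filterAlertsByDiffRange above.
interface DiffRange {
  path: string;      // absolute path using forward slashes (CodeQL's convention)
  startLine: number; // 1-based; a (0, 0) range means "the whole file"
  endLine: number;
}

function locationInDiffRange(
  ranges: DiffRange[],
  locationPath: string,
  locationStartLine: number,
): boolean {
  return ranges.some(
    (range) =>
      range.path === locationPath &&
      ((range.startLine <= locationStartLine &&
        range.endLine >= locationStartLine) ||
        (range.startLine === 0 && range.endLine === 0)),
  );
}

// Hypothetical example: only line 17 falls inside the changed range 10-42.
const exampleRanges = [{ path: "/checkout/src/app.ts", startLine: 10, endLine: 42 }];
console.log(locationInDiffRange(exampleRanges, "/checkout/src/app.ts", 17)); // true
console.log(locationInDiffRange(exampleRanges, "/checkout/src/app.ts", 99)); // false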

File diff suppressed because one or more lines are too long

lib/util.js generated

@@ -823,10 +823,11 @@ async function checkDiskUsage(logger) {
return undefined;
}
const diskUsage = await (0, check_disk_space_1.default)(getRequiredEnvParam("GITHUB_WORKSPACE"));
const mbInBytes = 1024 * 1024;
const gbInBytes = 1024 * 1024 * 1024;
if (diskUsage.free < 2 * gbInBytes) {
const message = "The Actions runner is running low on disk space " +
`(${(diskUsage.free / gbInBytes).toPrecision(4)} GB available).`;
`(${(diskUsage.free / mbInBytes).toPrecision(4)} MB available).`;
if (process.env[environment_1.EnvVar.HAS_WARNED_ABOUT_DISK_SPACE] !== "true") {
logger.warning(message);
}
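The change above switches the low-disk warning from GB to MB. A quick worked example of the new formatting, assuming a hypothetical 1.5 GiB of free space:

// 1.5 GiB free, expressed in bytes (hypothetical value).
const free = 1.5 * 1024 * 1024 * 1024;
const mbInBytes = 1024 * 1024;
// Four significant digits, so this prints "(1536 MB available)."
console.log(`(${(free / mbInBytes).toPrecision(4)} MB available).`);
// The previous code divided by gbInBytes and would have printed "(1.500 GB available)."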

File diff suppressed because one or more lines are too long

node_modules/.bin/dot-object generated vendored

@@ -1 +0,0 @@
../dot-object/bin/dot-object


@@ -1 +0,0 @@
../twirp-ts/protoc-gen-twirp_ts

node_modules/.package-lock.json generated vendored

File diff suppressed because it is too large


@@ -1,4 +1,4 @@
export * from './google/protobuf/timestamp';
export * from './google/protobuf/wrappers';
export * from './results/api/v1/artifact';
export * from './results/api/v1/artifact.twirp';
export * from './results/api/v1/artifact.twirp-client';


@@ -17,5 +17,5 @@ Object.defineProperty(exports, "__esModule", { value: true });
__exportStar(require("./google/protobuf/timestamp"), exports);
__exportStar(require("./google/protobuf/wrappers"), exports);
__exportStar(require("./results/api/v1/artifact"), exports);
__exportStar(require("./results/api/v1/artifact.twirp"), exports);
__exportStar(require("./results/api/v1/artifact.twirp-client"), exports);
//# sourceMappingURL=index.js.map


@@ -1 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/generated/index.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;AAAA,8DAA2C;AAC3C,6DAA0C;AAC1C,4DAAyC;AACzC,kEAA+C"}
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/generated/index.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;AAAA,8DAA2C;AAC3C,6DAA0C;AAC1C,4DAAyC;AACzC,yEAAsD"}


@@ -8,6 +8,66 @@ import { MessageType } from "@protobuf-ts/runtime";
import { Int64Value } from "../../../google/protobuf/wrappers";
import { StringValue } from "../../../google/protobuf/wrappers";
import { Timestamp } from "../../../google/protobuf/timestamp";
/**
* @generated from protobuf message github.actions.results.api.v1.MigrateArtifactRequest
*/
export interface MigrateArtifactRequest {
/**
* @generated from protobuf field: string workflow_run_backend_id = 1;
*/
workflowRunBackendId: string;
/**
* @generated from protobuf field: string name = 2;
*/
name: string;
/**
* @generated from protobuf field: google.protobuf.Timestamp expires_at = 3;
*/
expiresAt?: Timestamp;
}
/**
* @generated from protobuf message github.actions.results.api.v1.MigrateArtifactResponse
*/
export interface MigrateArtifactResponse {
/**
* @generated from protobuf field: bool ok = 1;
*/
ok: boolean;
/**
* @generated from protobuf field: string signed_upload_url = 2;
*/
signedUploadUrl: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactRequest
*/
export interface FinalizeMigratedArtifactRequest {
/**
* @generated from protobuf field: string workflow_run_backend_id = 1;
*/
workflowRunBackendId: string;
/**
* @generated from protobuf field: string name = 2;
*/
name: string;
/**
* @generated from protobuf field: int64 size = 3;
*/
size: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactResponse
*/
export interface FinalizeMigratedArtifactResponse {
/**
* @generated from protobuf field: bool ok = 1;
*/
ok: boolean;
/**
* @generated from protobuf field: int64 artifact_id = 2;
*/
artifactId: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.CreateArtifactRequest
*/
@@ -162,6 +222,12 @@ export interface ListArtifactsResponse_MonolithArtifact {
* @generated from protobuf field: google.protobuf.Timestamp created_at = 6;
*/
createdAt?: Timestamp;
/**
* The SHA-256 digest of the artifact, calculated on upload for upload-artifact v4 & newer
*
* @generated from protobuf field: google.protobuf.StringValue digest = 7;
*/
digest?: StringValue;
}
/**
* @generated from protobuf message github.actions.results.api.v1.GetSignedArtifactURLRequest
@@ -219,6 +285,46 @@ export interface DeleteArtifactResponse {
*/
artifactId: string;
}
declare class MigrateArtifactRequest$Type extends MessageType<MigrateArtifactRequest> {
constructor();
create(value?: PartialMessage<MigrateArtifactRequest>): MigrateArtifactRequest;
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: MigrateArtifactRequest): MigrateArtifactRequest;
internalBinaryWrite(message: MigrateArtifactRequest, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter;
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactRequest
*/
export declare const MigrateArtifactRequest: MigrateArtifactRequest$Type;
declare class MigrateArtifactResponse$Type extends MessageType<MigrateArtifactResponse> {
constructor();
create(value?: PartialMessage<MigrateArtifactResponse>): MigrateArtifactResponse;
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: MigrateArtifactResponse): MigrateArtifactResponse;
internalBinaryWrite(message: MigrateArtifactResponse, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter;
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactResponse
*/
export declare const MigrateArtifactResponse: MigrateArtifactResponse$Type;
declare class FinalizeMigratedArtifactRequest$Type extends MessageType<FinalizeMigratedArtifactRequest> {
constructor();
create(value?: PartialMessage<FinalizeMigratedArtifactRequest>): FinalizeMigratedArtifactRequest;
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: FinalizeMigratedArtifactRequest): FinalizeMigratedArtifactRequest;
internalBinaryWrite(message: FinalizeMigratedArtifactRequest, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter;
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactRequest
*/
export declare const FinalizeMigratedArtifactRequest: FinalizeMigratedArtifactRequest$Type;
declare class FinalizeMigratedArtifactResponse$Type extends MessageType<FinalizeMigratedArtifactResponse> {
constructor();
create(value?: PartialMessage<FinalizeMigratedArtifactResponse>): FinalizeMigratedArtifactResponse;
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: FinalizeMigratedArtifactResponse): FinalizeMigratedArtifactResponse;
internalBinaryWrite(message: FinalizeMigratedArtifactResponse, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter;
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactResponse
*/
export declare const FinalizeMigratedArtifactResponse: FinalizeMigratedArtifactResponse$Type;
declare class CreateArtifactRequest$Type extends MessageType<CreateArtifactRequest> {
constructor();
create(value?: PartialMessage<CreateArtifactRequest>): CreateArtifactRequest;


@@ -1,6 +1,6 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.ArtifactService = exports.DeleteArtifactResponse = exports.DeleteArtifactRequest = exports.GetSignedArtifactURLResponse = exports.GetSignedArtifactURLRequest = exports.ListArtifactsResponse_MonolithArtifact = exports.ListArtifactsResponse = exports.ListArtifactsRequest = exports.FinalizeArtifactResponse = exports.FinalizeArtifactRequest = exports.CreateArtifactResponse = exports.CreateArtifactRequest = void 0;
exports.ArtifactService = exports.DeleteArtifactResponse = exports.DeleteArtifactRequest = exports.GetSignedArtifactURLResponse = exports.GetSignedArtifactURLRequest = exports.ListArtifactsResponse_MonolithArtifact = exports.ListArtifactsResponse = exports.ListArtifactsRequest = exports.FinalizeArtifactResponse = exports.FinalizeArtifactRequest = exports.CreateArtifactResponse = exports.CreateArtifactRequest = exports.FinalizeMigratedArtifactResponse = exports.FinalizeMigratedArtifactRequest = exports.MigrateArtifactResponse = exports.MigrateArtifactRequest = void 0;
// @generated by protobuf-ts 2.9.1 with parameter long_type_string,client_none,generate_dependencies
// @generated from protobuf file "results/api/v1/artifact.proto" (package "github.actions.results.api.v1", syntax proto3)
// tslint:disable
@@ -14,6 +14,236 @@ const wrappers_1 = require("../../../google/protobuf/wrappers");
const wrappers_2 = require("../../../google/protobuf/wrappers");
const timestamp_1 = require("../../../google/protobuf/timestamp");
// @generated message type with reflection information, may provide speed optimized methods
class MigrateArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.MigrateArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "expires_at", kind: "message", T: () => timestamp_1.Timestamp }
]);
}
create(value) {
const message = { workflowRunBackendId: "", name: "" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string name */ 2:
message.name = reader.string();
break;
case /* google.protobuf.Timestamp expires_at */ 3:
message.expiresAt = timestamp_1.Timestamp.internalBinaryRead(reader, reader.uint32(), options, message.expiresAt);
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, runtime_1.WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string name = 2; */
if (message.name !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.name);
/* google.protobuf.Timestamp expires_at = 3; */
if (message.expiresAt)
timestamp_1.Timestamp.internalBinaryWrite(message.expiresAt, writer.tag(3, runtime_1.WireType.LengthDelimited).fork(), options).join();
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactRequest
*/
exports.MigrateArtifactRequest = new MigrateArtifactRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class MigrateArtifactResponse$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.MigrateArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "signed_upload_url", kind: "scalar", T: 9 /*ScalarType.STRING*/ }
]);
}
create(value) {
const message = { ok: false, signedUploadUrl: "" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* string signed_upload_url */ 2:
message.signedUploadUrl = reader.string();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, runtime_1.WireType.Varint).bool(message.ok);
/* string signed_upload_url = 2; */
if (message.signedUploadUrl !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.signedUploadUrl);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactResponse
*/
exports.MigrateArtifactResponse = new MigrateArtifactResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class FinalizeMigratedArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.FinalizeMigratedArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "size", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value) {
const message = { workflowRunBackendId: "", name: "", size: "0" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string name */ 2:
message.name = reader.string();
break;
case /* int64 size */ 3:
message.size = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, runtime_1.WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string name = 2; */
if (message.name !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.name);
/* int64 size = 3; */
if (message.size !== "0")
writer.tag(3, runtime_1.WireType.Varint).int64(message.size);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactRequest
*/
exports.FinalizeMigratedArtifactRequest = new FinalizeMigratedArtifactRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class FinalizeMigratedArtifactResponse$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.FinalizeMigratedArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "artifact_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value) {
const message = { ok: false, artifactId: "0" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* int64 artifact_id */ 2:
message.artifactId = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, runtime_1.WireType.Varint).bool(message.ok);
/* int64 artifact_id = 2; */
if (message.artifactId !== "0")
writer.tag(2, runtime_1.WireType.Varint).int64(message.artifactId);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactResponse
*/
exports.FinalizeMigratedArtifactResponse = new FinalizeMigratedArtifactResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class CreateArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.CreateArtifactRequest", [
@@ -395,7 +625,8 @@ class ListArtifactsResponse_MonolithArtifact$Type extends runtime_5.MessageType
{ no: 3, name: "database_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ },
{ no: 4, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 5, name: "size", kind: "scalar", T: 3 /*ScalarType.INT64*/ },
{ no: 6, name: "created_at", kind: "message", T: () => timestamp_1.Timestamp }
{ no: 6, name: "created_at", kind: "message", T: () => timestamp_1.Timestamp },
{ no: 7, name: "digest", kind: "message", T: () => wrappers_2.StringValue }
]);
}
create(value) {
@@ -428,6 +659,9 @@ class ListArtifactsResponse_MonolithArtifact$Type extends runtime_5.MessageType
case /* google.protobuf.Timestamp created_at */ 6:
message.createdAt = timestamp_1.Timestamp.internalBinaryRead(reader, reader.uint32(), options, message.createdAt);
break;
case /* google.protobuf.StringValue digest */ 7:
message.digest = wrappers_2.StringValue.internalBinaryRead(reader, reader.uint32(), options, message.digest);
break;
default:
let u = options.readUnknownField;
if (u === "throw")
@@ -458,6 +692,9 @@ class ListArtifactsResponse_MonolithArtifact$Type extends runtime_5.MessageType
/* google.protobuf.Timestamp created_at = 6; */
if (message.createdAt)
timestamp_1.Timestamp.internalBinaryWrite(message.createdAt, writer.tag(6, runtime_1.WireType.LengthDelimited).fork(), options).join();
/* google.protobuf.StringValue digest = 7; */
if (message.digest)
wrappers_2.StringValue.internalBinaryWrite(message.digest, writer.tag(7, runtime_1.WireType.LengthDelimited).fork(), options).join();
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
@@ -699,6 +936,8 @@ exports.ArtifactService = new runtime_rpc_1.ServiceType("github.actions.results.
{ name: "FinalizeArtifact", options: {}, I: exports.FinalizeArtifactRequest, O: exports.FinalizeArtifactResponse },
{ name: "ListArtifacts", options: {}, I: exports.ListArtifactsRequest, O: exports.ListArtifactsResponse },
{ name: "GetSignedArtifactURL", options: {}, I: exports.GetSignedArtifactURLRequest, O: exports.GetSignedArtifactURLResponse },
{ name: "DeleteArtifact", options: {}, I: exports.DeleteArtifactRequest, O: exports.DeleteArtifactResponse }
{ name: "DeleteArtifact", options: {}, I: exports.DeleteArtifactRequest, O: exports.DeleteArtifactResponse },
{ name: "MigrateArtifact", options: {}, I: exports.MigrateArtifactRequest, O: exports.MigrateArtifactResponse },
{ name: "FinalizeMigratedArtifact", options: {}, I: exports.FinalizeMigratedArtifactRequest, O: exports.FinalizeMigratedArtifactResponse }
]);
//# sourceMappingURL=artifact.js.map

File diff suppressed because one or more lines are too long


@@ -0,0 +1,30 @@
import { CreateArtifactRequest, CreateArtifactResponse, FinalizeArtifactRequest, FinalizeArtifactResponse, ListArtifactsRequest, ListArtifactsResponse, GetSignedArtifactURLRequest, GetSignedArtifactURLResponse, DeleteArtifactRequest, DeleteArtifactResponse } from "./artifact";
interface Rpc {
request(service: string, method: string, contentType: "application/json" | "application/protobuf", data: object | Uint8Array): Promise<object | Uint8Array>;
}
export interface ArtifactServiceClient {
CreateArtifact(request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export declare class ArtifactServiceClientJSON implements ArtifactServiceClient {
private readonly rpc;
constructor(rpc: Rpc);
CreateArtifact(request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export declare class ArtifactServiceClientProtobuf implements ArtifactServiceClient {
private readonly rpc;
constructor(rpc: Rpc);
CreateArtifact(request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export {};


@@ -0,0 +1,100 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.ArtifactServiceClientProtobuf = exports.ArtifactServiceClientJSON = void 0;
const artifact_1 = require("./artifact");
class ArtifactServiceClientJSON {
constructor(rpc) {
this.rpc = rpc;
this.CreateArtifact.bind(this);
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(request) {
const data = artifact_1.CreateArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "CreateArtifact", "application/json", data);
return promise.then((data) => artifact_1.CreateArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
FinalizeArtifact(request) {
const data = artifact_1.FinalizeArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "FinalizeArtifact", "application/json", data);
return promise.then((data) => artifact_1.FinalizeArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
ListArtifacts(request) {
const data = artifact_1.ListArtifactsRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "ListArtifacts", "application/json", data);
return promise.then((data) => artifact_1.ListArtifactsResponse.fromJson(data, { ignoreUnknownFields: true }));
}
GetSignedArtifactURL(request) {
const data = artifact_1.GetSignedArtifactURLRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "GetSignedArtifactURL", "application/json", data);
return promise.then((data) => artifact_1.GetSignedArtifactURLResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
DeleteArtifact(request) {
const data = artifact_1.DeleteArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "DeleteArtifact", "application/json", data);
return promise.then((data) => artifact_1.DeleteArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
}
exports.ArtifactServiceClientJSON = ArtifactServiceClientJSON;
class ArtifactServiceClientProtobuf {
constructor(rpc) {
this.rpc = rpc;
this.CreateArtifact.bind(this);
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(request) {
const data = artifact_1.CreateArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "CreateArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.CreateArtifactResponse.fromBinary(data));
}
FinalizeArtifact(request) {
const data = artifact_1.FinalizeArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "FinalizeArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.FinalizeArtifactResponse.fromBinary(data));
}
ListArtifacts(request) {
const data = artifact_1.ListArtifactsRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "ListArtifacts", "application/protobuf", data);
return promise.then((data) => artifact_1.ListArtifactsResponse.fromBinary(data));
}
GetSignedArtifactURL(request) {
const data = artifact_1.GetSignedArtifactURLRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "GetSignedArtifactURL", "application/protobuf", data);
return promise.then((data) => artifact_1.GetSignedArtifactURLResponse.fromBinary(data));
}
DeleteArtifact(request) {
const data = artifact_1.DeleteArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "DeleteArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.DeleteArtifactResponse.fromBinary(data));
}
}
exports.ArtifactServiceClientProtobuf = ArtifactServiceClientProtobuf;
//# sourceMappingURL=artifact.twirp-client.js.map
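For orientation: the generated clients above depend only on an object with a single request(service, method, contentType, data) method (the Rpc shape declared in the package's typings). Below is a minimal sketch, not the package's real transport, of wiring ArtifactServiceClientJSON to the conventional Twirp POST route; it assumes a Node 18+ runtime with global fetch, and the base URL handling is illustrative only.

import {ArtifactServiceClientJSON} from './artifact.twirp-client'

// Illustrative endpoint; the real client resolves this from the Actions runtime environment.
const baseUrl = process.env.ACTIONS_RESULTS_URL ?? 'http://localhost:8080'

const rpc = {
  async request(
    service: string,
    method: string,
    contentType: 'application/json' | 'application/protobuf',
    data: object | Uint8Array
  ): Promise<object | Uint8Array> {
    // Conventional Twirp routing: POST <baseUrl>/twirp/<package>.<Service>/<Method>
    const res = await fetch(`${baseUrl}/twirp/${service}/${method}`, {
      method: 'POST',
      headers: {'Content-Type': contentType},
      body: data instanceof Uint8Array ? data : JSON.stringify(data)
    })
    if (!res.ok) {
      throw new Error(`Twirp call ${service}/${method} failed with status ${res.status}`)
    }
    return contentType === 'application/json'
      ? ((await res.json()) as object)
      : new Uint8Array(await res.arrayBuffer())
  }
}

// The JSON client serializes with toJson/fromJson; ArtifactServiceClientProtobuf
// uses the same Rpc shape but toBinary/fromBinary and 'application/protobuf'.
const client = new ArtifactServiceClientJSON(rpc)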


@@ -0,0 +1 @@
{"version":3,"file":"artifact.twirp-client.js","sourceRoot":"","sources":["../../../../../src/generated/results/api/v1/artifact.twirp-client.ts"],"names":[],"mappings":";;;AAAA,yCAWoB;AA+BpB,MAAa,yBAAyB;IAEpC,YAAY,GAAQ;QAClB,IAAI,CAAC,GAAG,GAAG,GAAG,CAAC;QACf,IAAI,CAAC,cAAc,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QAC/B,IAAI,CAAC,gBAAgB,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QACjC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QAC9B,IAAI,CAAC,oBAAoB,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QACrC,IAAI,CAAC,cAAc,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;IACjC,CAAC;IACD,cAAc,CACZ,OAA8B;QAE9B,MAAM,IAAI,GAAG,gCAAqB,CAAC,MAAM,CAAC,OAAO,EAAE;YACjD,iBAAiB,EAAE,IAAI;YACvB,iBAAiB,EAAE,KAAK;SACzB,CAAC,CAAC;QACH,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,gBAAgB,EAChB,kBAAkB,EAClB,IAAc,CACf,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,iCAAsB,CAAC,QAAQ,CAAC,IAAW,EAAE;YAC3C,mBAAmB,EAAE,IAAI;SAC1B,CAAC,CACH,CAAC;IACJ,CAAC;IAED,gBAAgB,CACd,OAAgC;QAEhC,MAAM,IAAI,GAAG,kCAAuB,CAAC,MAAM,CAAC,OAAO,EAAE;YACnD,iBAAiB,EAAE,IAAI;YACvB,iBAAiB,EAAE,KAAK;SACzB,CAAC,CAAC;QACH,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,kBAAkB,EAClB,kBAAkB,EAClB,IAAc,CACf,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,mCAAwB,CAAC,QAAQ,CAAC,IAAW,EAAE;YAC7C,mBAAmB,EAAE,IAAI;SAC1B,CAAC,CACH,CAAC;IACJ,CAAC;IAED,aAAa,CAAC,OAA6B;QACzC,MAAM,IAAI,GAAG,+BAAoB,CAAC,MAAM,CAAC,OAAO,EAAE;YAChD,iBAAiB,EAAE,IAAI;YACvB,iBAAiB,EAAE,KAAK;SACzB,CAAC,CAAC;QACH,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,eAAe,EACf,kBAAkB,EAClB,IAAc,CACf,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,gCAAqB,CAAC,QAAQ,CAAC,IAAW,EAAE,EAAE,mBAAmB,EAAE,IAAI,EAAE,CAAC,CAC3E,CAAC;IACJ,CAAC;IAED,oBAAoB,CAClB,OAAoC;QAEpC,MAAM,IAAI,GAAG,sCAA2B,CAAC,MAAM,CAAC,OAAO,EAAE;YACvD,iBAAiB,EAAE,IAAI;YACvB,iBAAiB,EAAE,KAAK;SACzB,CAAC,CAAC;QACH,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,sBAAsB,EACtB,kBAAkB,EAClB,IAAc,CACf,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,uCAA4B,CAAC,QAAQ,CAAC,IAAW,EAAE;YACjD,mBAAmB,EAAE,IAAI;SAC1B,CAAC,CACH,CAAC;IACJ,CAAC;IAED,cAAc,CACZ,OAA8B;QAE9B,MAAM,IAAI,GAAG,gCAAqB,CAAC,MAAM,CAAC,OAAO,EAAE;YACjD,iBAAiB,EAAE,IAAI;YACvB,iBAAiB,EAAE,KAAK;SACzB,CAAC,CAAC;QACH,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,gBAAgB,EAChB,kBAAkB,EAClB,IAAc,CACf,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,iCAAsB,CAAC,QAAQ,CAAC,IAAW,EAAE;YAC3C,mBAAmB,EAAE,IAAI;SAC1B,CAAC,CACH,CAAC;IACJ,CAAC;CACF;AAzGD,8DAyGC;AAED,MAAa,6BAA6B;IAExC,YAAY,GAAQ;QAClB,IAAI,CAAC,GAAG,GAAG,GAAG,CAAC;QACf,IAAI,CAAC,cAAc,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QAC/B,IAAI,CAAC,gBAAgB,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QACjC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QAC9B,IAAI,CAAC,oBAAoB,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QACrC,IAAI,CAAC,cAAc,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;IACjC,CAAC;IACD,cAAc,CACZ,OAA8B;QAE9B,MAAM,IAAI,GAAG,gCAAqB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC;QACrD,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,gBAAgB,EAChB,sBAAsB,EACtB,IAAI,CACL,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,iCAAsB,CAAC,UAAU,CAAC,IAAkB,CAAC,CACtD,CAAC;IACJ,CAAC;IAED,gBAAgB,CACd,OAAgC;QAEhC,MAAM,IAAI,GAAG,kCAAuB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC;QACvD,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,kBAAkB,EAClB,sBAAsB,EACtB,IAAI,CACL,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,mCAAwB,CAAC,UAAU,CAAC,IAAkB,CAAC,CACxD,CAAC;IACJ,CAAC;IAED,aAAa,CAAC,OAA6B;QACzC,MAAM,IAAI,GAAG,+BAAoB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC;QACpD,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,eAAe,EACf,sBAAsB,EACtB,
IAAI,CACL,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,gCAAqB,CAAC,UAAU,CAAC,IAAkB,CAAC,CACrD,CAAC;IACJ,CAAC;IAED,oBAAoB,CAClB,OAAoC;QAEpC,MAAM,IAAI,GAAG,sCAA2B,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC;QAC3D,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,sBAAsB,EACtB,sBAAsB,EACtB,IAAI,CACL,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,uCAA4B,CAAC,UAAU,CAAC,IAAkB,CAAC,CAC5D,CAAC;IACJ,CAAC;IAED,cAAc,CACZ,OAA8B;QAE9B,MAAM,IAAI,GAAG,gCAAqB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC;QACrD,MAAM,OAAO,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAC9B,+CAA+C,EAC/C,gBAAgB,EAChB,sBAAsB,EACtB,IAAI,CACL,CAAC;QACF,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,EAAE,CAC3B,iCAAsB,CAAC,UAAU,CAAC,IAAkB,CAAC,CACtD,CAAC;IACJ,CAAC;CACF;AAlFD,sEAkFC"}


@@ -1,48 +0,0 @@
/// <reference types="node" />
import { TwirpContext, TwirpServer } from "twirp-ts";
import { CreateArtifactRequest, CreateArtifactResponse, FinalizeArtifactRequest, FinalizeArtifactResponse, ListArtifactsRequest, ListArtifactsResponse, GetSignedArtifactURLRequest, GetSignedArtifactURLResponse, DeleteArtifactRequest, DeleteArtifactResponse } from "./artifact";
interface Rpc {
request(service: string, method: string, contentType: "application/json" | "application/protobuf", data: object | Uint8Array): Promise<object | Uint8Array>;
}
export interface ArtifactServiceClient {
CreateArtifact(request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export declare class ArtifactServiceClientJSON implements ArtifactServiceClient {
private readonly rpc;
constructor(rpc: Rpc);
CreateArtifact(request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export declare class ArtifactServiceClientProtobuf implements ArtifactServiceClient {
private readonly rpc;
constructor(rpc: Rpc);
CreateArtifact(request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export interface ArtifactServiceTwirp<T extends TwirpContext = TwirpContext> {
CreateArtifact(ctx: T, request: CreateArtifactRequest): Promise<CreateArtifactResponse>;
FinalizeArtifact(ctx: T, request: FinalizeArtifactRequest): Promise<FinalizeArtifactResponse>;
ListArtifacts(ctx: T, request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(ctx: T, request: GetSignedArtifactURLRequest): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(ctx: T, request: DeleteArtifactRequest): Promise<DeleteArtifactResponse>;
}
export declare enum ArtifactServiceMethod {
CreateArtifact = "CreateArtifact",
FinalizeArtifact = "FinalizeArtifact",
ListArtifacts = "ListArtifacts",
GetSignedArtifactURL = "GetSignedArtifactURL",
DeleteArtifact = "DeleteArtifact"
}
export declare const ArtifactServiceMethodList: ArtifactServiceMethod[];
export declare function createArtifactServiceServer<T extends TwirpContext = TwirpContext>(service: ArtifactServiceTwirp<T>): TwirpServer<ArtifactServiceTwirp<TwirpContext<import("http").IncomingMessage, import("http").ServerResponse<import("http").IncomingMessage>>>, T>;
export {};


@@ -1,508 +0,0 @@
"use strict";
var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
return new (P || (P = Promise))(function (resolve, reject) {
function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }
step((generator = generator.apply(thisArg, _arguments || [])).next());
});
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.createArtifactServiceServer = exports.ArtifactServiceMethodList = exports.ArtifactServiceMethod = exports.ArtifactServiceClientProtobuf = exports.ArtifactServiceClientJSON = void 0;
const twirp_ts_1 = require("twirp-ts");
const artifact_1 = require("./artifact");
class ArtifactServiceClientJSON {
constructor(rpc) {
this.rpc = rpc;
this.CreateArtifact.bind(this);
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(request) {
const data = artifact_1.CreateArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "CreateArtifact", "application/json", data);
return promise.then((data) => artifact_1.CreateArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
FinalizeArtifact(request) {
const data = artifact_1.FinalizeArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "FinalizeArtifact", "application/json", data);
return promise.then((data) => artifact_1.FinalizeArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
ListArtifacts(request) {
const data = artifact_1.ListArtifactsRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "ListArtifacts", "application/json", data);
return promise.then((data) => artifact_1.ListArtifactsResponse.fromJson(data, { ignoreUnknownFields: true }));
}
GetSignedArtifactURL(request) {
const data = artifact_1.GetSignedArtifactURLRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "GetSignedArtifactURL", "application/json", data);
return promise.then((data) => artifact_1.GetSignedArtifactURLResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
DeleteArtifact(request) {
const data = artifact_1.DeleteArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "DeleteArtifact", "application/json", data);
return promise.then((data) => artifact_1.DeleteArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
}
exports.ArtifactServiceClientJSON = ArtifactServiceClientJSON;
class ArtifactServiceClientProtobuf {
constructor(rpc) {
this.rpc = rpc;
this.CreateArtifact.bind(this);
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(request) {
const data = artifact_1.CreateArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "CreateArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.CreateArtifactResponse.fromBinary(data));
}
FinalizeArtifact(request) {
const data = artifact_1.FinalizeArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "FinalizeArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.FinalizeArtifactResponse.fromBinary(data));
}
ListArtifacts(request) {
const data = artifact_1.ListArtifactsRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "ListArtifacts", "application/protobuf", data);
return promise.then((data) => artifact_1.ListArtifactsResponse.fromBinary(data));
}
GetSignedArtifactURL(request) {
const data = artifact_1.GetSignedArtifactURLRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "GetSignedArtifactURL", "application/protobuf", data);
return promise.then((data) => artifact_1.GetSignedArtifactURLResponse.fromBinary(data));
}
DeleteArtifact(request) {
const data = artifact_1.DeleteArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "DeleteArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.DeleteArtifactResponse.fromBinary(data));
}
}
exports.ArtifactServiceClientProtobuf = ArtifactServiceClientProtobuf;
var ArtifactServiceMethod;
(function (ArtifactServiceMethod) {
ArtifactServiceMethod["CreateArtifact"] = "CreateArtifact";
ArtifactServiceMethod["FinalizeArtifact"] = "FinalizeArtifact";
ArtifactServiceMethod["ListArtifacts"] = "ListArtifacts";
ArtifactServiceMethod["GetSignedArtifactURL"] = "GetSignedArtifactURL";
ArtifactServiceMethod["DeleteArtifact"] = "DeleteArtifact";
})(ArtifactServiceMethod || (exports.ArtifactServiceMethod = ArtifactServiceMethod = {}));
exports.ArtifactServiceMethodList = [
ArtifactServiceMethod.CreateArtifact,
ArtifactServiceMethod.FinalizeArtifact,
ArtifactServiceMethod.ListArtifacts,
ArtifactServiceMethod.GetSignedArtifactURL,
ArtifactServiceMethod.DeleteArtifact,
];
function createArtifactServiceServer(service) {
return new twirp_ts_1.TwirpServer({
service,
packageName: "github.actions.results.api.v1",
serviceName: "ArtifactService",
methodList: exports.ArtifactServiceMethodList,
matchRoute: matchArtifactServiceRoute,
});
}
exports.createArtifactServiceServer = createArtifactServiceServer;
function matchArtifactServiceRoute(method, events) {
switch (method) {
case "CreateArtifact":
return (ctx, service, data, interceptors) => __awaiter(this, void 0, void 0, function* () {
ctx = Object.assign(Object.assign({}, ctx), { methodName: "CreateArtifact" });
yield events.onMatch(ctx);
return handleArtifactServiceCreateArtifactRequest(ctx, service, data, interceptors);
});
case "FinalizeArtifact":
return (ctx, service, data, interceptors) => __awaiter(this, void 0, void 0, function* () {
ctx = Object.assign(Object.assign({}, ctx), { methodName: "FinalizeArtifact" });
yield events.onMatch(ctx);
return handleArtifactServiceFinalizeArtifactRequest(ctx, service, data, interceptors);
});
case "ListArtifacts":
return (ctx, service, data, interceptors) => __awaiter(this, void 0, void 0, function* () {
ctx = Object.assign(Object.assign({}, ctx), { methodName: "ListArtifacts" });
yield events.onMatch(ctx);
return handleArtifactServiceListArtifactsRequest(ctx, service, data, interceptors);
});
case "GetSignedArtifactURL":
return (ctx, service, data, interceptors) => __awaiter(this, void 0, void 0, function* () {
ctx = Object.assign(Object.assign({}, ctx), { methodName: "GetSignedArtifactURL" });
yield events.onMatch(ctx);
return handleArtifactServiceGetSignedArtifactURLRequest(ctx, service, data, interceptors);
});
case "DeleteArtifact":
return (ctx, service, data, interceptors) => __awaiter(this, void 0, void 0, function* () {
ctx = Object.assign(Object.assign({}, ctx), { methodName: "DeleteArtifact" });
yield events.onMatch(ctx);
return handleArtifactServiceDeleteArtifactRequest(ctx, service, data, interceptors);
});
default:
events.onNotFound();
const msg = `no handler found`;
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceCreateArtifactRequest(ctx, service, data, interceptors) {
switch (ctx.contentType) {
case twirp_ts_1.TwirpContentType.JSON:
return handleArtifactServiceCreateArtifactJSON(ctx, service, data, interceptors);
case twirp_ts_1.TwirpContentType.Protobuf:
return handleArtifactServiceCreateArtifactProtobuf(ctx, service, data, interceptors);
default:
const msg = "unexpected Content-Type";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceFinalizeArtifactRequest(ctx, service, data, interceptors) {
switch (ctx.contentType) {
case twirp_ts_1.TwirpContentType.JSON:
return handleArtifactServiceFinalizeArtifactJSON(ctx, service, data, interceptors);
case twirp_ts_1.TwirpContentType.Protobuf:
return handleArtifactServiceFinalizeArtifactProtobuf(ctx, service, data, interceptors);
default:
const msg = "unexpected Content-Type";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceListArtifactsRequest(ctx, service, data, interceptors) {
switch (ctx.contentType) {
case twirp_ts_1.TwirpContentType.JSON:
return handleArtifactServiceListArtifactsJSON(ctx, service, data, interceptors);
case twirp_ts_1.TwirpContentType.Protobuf:
return handleArtifactServiceListArtifactsProtobuf(ctx, service, data, interceptors);
default:
const msg = "unexpected Content-Type";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceGetSignedArtifactURLRequest(ctx, service, data, interceptors) {
switch (ctx.contentType) {
case twirp_ts_1.TwirpContentType.JSON:
return handleArtifactServiceGetSignedArtifactURLJSON(ctx, service, data, interceptors);
case twirp_ts_1.TwirpContentType.Protobuf:
return handleArtifactServiceGetSignedArtifactURLProtobuf(ctx, service, data, interceptors);
default:
const msg = "unexpected Content-Type";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceDeleteArtifactRequest(ctx, service, data, interceptors) {
switch (ctx.contentType) {
case twirp_ts_1.TwirpContentType.JSON:
return handleArtifactServiceDeleteArtifactJSON(ctx, service, data, interceptors);
case twirp_ts_1.TwirpContentType.Protobuf:
return handleArtifactServiceDeleteArtifactProtobuf(ctx, service, data, interceptors);
default:
const msg = "unexpected Content-Type";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceCreateArtifactJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
const body = JSON.parse(data.toString() || "{}");
request = artifact_1.CreateArtifactRequest.fromJson(body, {
ignoreUnknownFields: true,
});
}
catch (e) {
if (e instanceof Error) {
const msg = "the json request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.CreateArtifact(ctx, inputReq);
});
}
else {
response = yield service.CreateArtifact(ctx, request);
}
return JSON.stringify(artifact_1.CreateArtifactResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false,
}));
});
}
function handleArtifactServiceFinalizeArtifactJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
const body = JSON.parse(data.toString() || "{}");
request = artifact_1.FinalizeArtifactRequest.fromJson(body, {
ignoreUnknownFields: true,
});
}
catch (e) {
if (e instanceof Error) {
const msg = "the json request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.FinalizeArtifact(ctx, inputReq);
});
}
else {
response = yield service.FinalizeArtifact(ctx, request);
}
return JSON.stringify(artifact_1.FinalizeArtifactResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false,
}));
});
}
function handleArtifactServiceListArtifactsJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
const body = JSON.parse(data.toString() || "{}");
request = artifact_1.ListArtifactsRequest.fromJson(body, {
ignoreUnknownFields: true,
});
}
catch (e) {
if (e instanceof Error) {
const msg = "the json request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.ListArtifacts(ctx, inputReq);
});
}
else {
response = yield service.ListArtifacts(ctx, request);
}
return JSON.stringify(artifact_1.ListArtifactsResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false,
}));
});
}
function handleArtifactServiceGetSignedArtifactURLJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
const body = JSON.parse(data.toString() || "{}");
request = artifact_1.GetSignedArtifactURLRequest.fromJson(body, {
ignoreUnknownFields: true,
});
}
catch (e) {
if (e instanceof Error) {
const msg = "the json request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.GetSignedArtifactURL(ctx, inputReq);
});
}
else {
response = yield service.GetSignedArtifactURL(ctx, request);
}
return JSON.stringify(artifact_1.GetSignedArtifactURLResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false,
}));
});
}
function handleArtifactServiceDeleteArtifactJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
const body = JSON.parse(data.toString() || "{}");
request = artifact_1.DeleteArtifactRequest.fromJson(body, {
ignoreUnknownFields: true,
});
}
catch (e) {
if (e instanceof Error) {
const msg = "the json request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.DeleteArtifact(ctx, inputReq);
});
}
else {
response = yield service.DeleteArtifact(ctx, request);
}
return JSON.stringify(artifact_1.DeleteArtifactResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false,
}));
});
}
function handleArtifactServiceCreateArtifactProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
request = artifact_1.CreateArtifactRequest.fromBinary(data);
}
catch (e) {
if (e instanceof Error) {
const msg = "the protobuf request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.CreateArtifact(ctx, inputReq);
});
}
else {
response = yield service.CreateArtifact(ctx, request);
}
return Buffer.from(artifact_1.CreateArtifactResponse.toBinary(response));
});
}
function handleArtifactServiceFinalizeArtifactProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
request = artifact_1.FinalizeArtifactRequest.fromBinary(data);
}
catch (e) {
if (e instanceof Error) {
const msg = "the protobuf request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.FinalizeArtifact(ctx, inputReq);
});
}
else {
response = yield service.FinalizeArtifact(ctx, request);
}
return Buffer.from(artifact_1.FinalizeArtifactResponse.toBinary(response));
});
}
function handleArtifactServiceListArtifactsProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
request = artifact_1.ListArtifactsRequest.fromBinary(data);
}
catch (e) {
if (e instanceof Error) {
const msg = "the protobuf request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.ListArtifacts(ctx, inputReq);
});
}
else {
response = yield service.ListArtifacts(ctx, request);
}
return Buffer.from(artifact_1.ListArtifactsResponse.toBinary(response));
});
}
function handleArtifactServiceGetSignedArtifactURLProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
request = artifact_1.GetSignedArtifactURLRequest.fromBinary(data);
}
catch (e) {
if (e instanceof Error) {
const msg = "the protobuf request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.GetSignedArtifactURL(ctx, inputReq);
});
}
else {
response = yield service.GetSignedArtifactURL(ctx, request);
}
return Buffer.from(artifact_1.GetSignedArtifactURLResponse.toBinary(response));
});
}
function handleArtifactServiceDeleteArtifactProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
request = artifact_1.DeleteArtifactRequest.fromBinary(data);
}
catch (e) {
if (e instanceof Error) {
const msg = "the protobuf request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.DeleteArtifact(ctx, inputReq);
});
}
else {
response = yield service.DeleteArtifact(ctx, request);
}
return Buffer.from(artifact_1.DeleteArtifactResponse.toBinary(response));
});
}
//# sourceMappingURL=artifact.twirp.js.map

File diff suppressed because one or more lines are too long


@@ -1,4 +1,4 @@
import { DownloadArtifactOptions, DownloadArtifactResponse } from '../shared/interfaces';
export declare function streamExtractExternal(url: string, directory: string): Promise<void>;
import { DownloadArtifactOptions, DownloadArtifactResponse, StreamExtractResponse } from '../shared/interfaces';
export declare function streamExtractExternal(url: string, directory: string): Promise<StreamExtractResponse>;
export declare function downloadArtifactPublic(artifactId: number, repositoryOwner: string, repositoryName: string, token: string, options?: DownloadArtifactOptions): Promise<DownloadArtifactResponse>;
export declare function downloadArtifactInternal(artifactId: number, options?: DownloadArtifactOptions): Promise<DownloadArtifactResponse>;


@@ -37,6 +37,8 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
Object.defineProperty(exports, "__esModule", { value: true });
exports.downloadArtifactInternal = exports.downloadArtifactPublic = exports.streamExtractExternal = void 0;
const promises_1 = __importDefault(require("fs/promises"));
const crypto = __importStar(require("crypto"));
const stream = __importStar(require("stream"));
const github = __importStar(require("@actions/github"));
const core = __importStar(require("@actions/core"));
const httpClient = __importStar(require("@actions/http-client"));
@@ -73,8 +75,7 @@ function streamExtract(url, directory) {
let retryCount = 0;
while (retryCount < 5) {
try {
yield streamExtractExternal(url, directory);
return;
return yield streamExtractExternal(url, directory);
}
catch (error) {
retryCount++;
@@ -94,12 +95,18 @@ function streamExtractExternal(url, directory) {
throw new Error(`Unexpected HTTP response from blob storage: ${response.message.statusCode} ${response.message.statusMessage}`);
}
const timeout = 30 * 1000; // 30 seconds
let sha256Digest = undefined;
return new Promise((resolve, reject) => {
const timerFn = () => {
response.message.destroy(new Error(`Blob storage chunk did not respond in ${timeout}ms`));
};
const timer = setTimeout(timerFn, timeout);
response.message
const hashStream = crypto.createHash('sha256').setEncoding('hex');
const passThrough = new stream.PassThrough();
response.message.pipe(passThrough);
passThrough.pipe(hashStream);
const extractStream = passThrough;
extractStream
.on('data', () => {
timer.refresh();
})
@@ -111,7 +118,12 @@ function streamExtractExternal(url, directory) {
.pipe(unzip_stream_1.default.Extract({ path: directory }))
.on('close', () => {
clearTimeout(timer);
resolve();
if (hashStream) {
hashStream.end();
sha256Digest = hashStream.read();
core.info(`SHA256 digest of downloaded artifact is ${sha256Digest}`);
}
resolve({ sha256Digest: `sha256:${sha256Digest}` });
})
.on('error', (error) => {
reject(error);
@@ -124,6 +136,7 @@ function downloadArtifactPublic(artifactId, repositoryOwner, repositoryName, tok
return __awaiter(this, void 0, void 0, function* () {
const downloadPath = yield resolveOrCreateDirectory(options === null || options === void 0 ? void 0 : options.path);
const api = github.getOctokit(token);
let digestMismatch = false;
core.info(`Downloading artifact '${artifactId}' from '${repositoryOwner}/${repositoryName}'`);
const { headers, status } = yield api.rest.actions.downloadArtifact({
owner: repositoryOwner,
@@ -144,13 +157,20 @@ function downloadArtifactPublic(artifactId, repositoryOwner, repositoryName, tok
core.info(`Redirecting to blob download url: ${scrubQueryParameters(location)}`);
try {
core.info(`Starting download of artifact to: ${downloadPath}`);
yield streamExtract(location, downloadPath);
const extractResponse = yield streamExtract(location, downloadPath);
core.info(`Artifact download completed successfully.`);
if (options === null || options === void 0 ? void 0 : options.expectedHash) {
if ((options === null || options === void 0 ? void 0 : options.expectedHash) !== extractResponse.sha256Digest) {
digestMismatch = true;
core.debug(`Computed digest: ${extractResponse.sha256Digest}`);
core.debug(`Expected digest: ${options.expectedHash}`);
}
}
}
catch (error) {
throw new Error(`Unable to download and extract artifact: ${error.message}`);
}
return { downloadPath };
return { downloadPath, digestMismatch };
});
}
exports.downloadArtifactPublic = downloadArtifactPublic;
@@ -158,6 +178,7 @@ function downloadArtifactInternal(artifactId, options) {
return __awaiter(this, void 0, void 0, function* () {
const downloadPath = yield resolveOrCreateDirectory(options === null || options === void 0 ? void 0 : options.path);
const artifactClient = (0, artifact_twirp_client_1.internalArtifactTwirpClient)();
let digestMismatch = false;
const { workflowRunBackendId, workflowJobRunBackendId } = (0, util_1.getBackendIdsFromToken)();
const listReq = {
workflowRunBackendId,
@@ -180,13 +201,20 @@ function downloadArtifactInternal(artifactId, options) {
core.info(`Redirecting to blob download url: ${scrubQueryParameters(signedUrl)}`);
try {
core.info(`Starting download of artifact to: ${downloadPath}`);
yield streamExtract(signedUrl, downloadPath);
const extractResponse = yield streamExtract(signedUrl, downloadPath);
core.info(`Artifact download completed successfully.`);
if (options === null || options === void 0 ? void 0 : options.expectedHash) {
if ((options === null || options === void 0 ? void 0 : options.expectedHash) !== extractResponse.sha256Digest) {
digestMismatch = true;
core.debug(`Computed digest: ${extractResponse.sha256Digest}`);
core.debug(`Expected digest: ${options.expectedHash}`);
}
}
}
catch (error) {
throw new Error(`Unable to download and extract artifact: ${error.message}`);
}
return { downloadPath };
return { downloadPath, digestMismatch };
});
}
exports.downloadArtifactInternal = downloadArtifactInternal;
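The rewrite above turns streamExtractExternal into a tee: the blob response is piped both into the zip extractor and into a SHA-256 hash stream, and the resulting sha256Digest lets downloadArtifactPublic and downloadArtifactInternal set digestMismatch when options.expectedHash does not match. A condensed sketch of that pattern using only Node core modules; hashWhileExtracting, consume, and extractTo are illustrative names, not part of the package.

import * as crypto from 'crypto'
import {PassThrough, Readable} from 'stream'

// Pipe one source into both a consumer (the zip extractor above) and a SHA-256
// hash stream, then report the digest in the same `sha256:<hex>` form.
async function hashWhileExtracting(
  source: Readable,
  consume: (s: Readable) => Promise<void>
): Promise<string> {
  const hashStream = crypto.createHash('sha256').setEncoding('hex')
  const tee = new PassThrough()
  source.pipe(tee)
  tee.pipe(hashStream)
  await consume(tee) // resolves when extraction finishes ('close' in the code above)
  hashStream.end()
  return `sha256:${hashStream.read()}`
}

// const sha256Digest = await hashWhileExtracting(response, s => extractTo(s, directory))
// const digestMismatch = expectedHash !== undefined && expectedHash !== sha256Digest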

File diff suppressed because one or more lines are too long


@@ -80,13 +80,17 @@ function getArtifactPublic(artifactName, workflowRunId, repositoryOwner, reposit
name: artifact.name,
id: artifact.id,
size: artifact.size_in_bytes,
createdAt: artifact.created_at ? new Date(artifact.created_at) : undefined
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: artifact.digest
}
};
});
}
exports.getArtifactPublic = getArtifactPublic;
function getArtifactInternal(artifactName) {
var _a;
return __awaiter(this, void 0, void 0, function* () {
const artifactClient = (0, artifact_twirp_client_1.internalArtifactTwirpClient)();
const { workflowRunBackendId, workflowJobRunBackendId } = (0, util_1.getBackendIdsFromToken)();
@@ -113,7 +117,8 @@ function getArtifactInternal(artifactName) {
size: Number(artifact.size),
createdAt: artifact.createdAt
? generated_1.Timestamp.toDate(artifact.createdAt)
: undefined
: undefined,
digest: (_a = artifact.digest) === null || _a === void 0 ? void 0 : _a.value
}
};
});


@@ -1 +1 @@
{"version":3,"file":"get-artifact.js","sourceRoot":"","sources":["../../../src/internal/find/get-artifact.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,4CAA0C;AAC1C,wDAA2C;AAC3C,oDAAqC;AAErC,qDAA0E;AAC1E,mDAA+C;AAC/C,oEAAsD;AAEtD,yCAAqD;AACrD,qDAAuD;AACvD,2EAA2E;AAC3E,+CAA4E;AAC5E,6CAA4E;AAE5E,SAAsB,iBAAiB,CACrC,YAAoB,EACpB,aAAqB,EACrB,eAAuB,EACvB,cAAsB,EACtB,KAAa;;;QAEb,MAAM,CAAC,SAAS,EAAE,WAAW,CAAC,GAAG,IAAA,+BAAe,EAAC,gBAAoB,CAAC,CAAA;QAEtE,MAAM,IAAI,GAAmB;YAC3B,GAAG,EAAE,SAAS;YACd,SAAS,EAAE,IAAA,+BAAkB,GAAE;YAC/B,QAAQ,EAAE,SAAS;YACnB,KAAK,EAAE,SAAS;YAChB,OAAO,EAAE,WAAW;SACrB,CAAA;QAED,MAAM,MAAM,GAAG,IAAA,mBAAU,EAAC,KAAK,EAAE,IAAI,EAAE,oBAAK,EAAE,+BAAU,CAAC,CAAA;QAEzD,MAAM,eAAe,GAAG,MAAM,MAAM,CAAC,OAAO,CAC1C,kEAAkE,EAClE;YACE,KAAK,EAAE,eAAe;YACtB,IAAI,EAAE,cAAc;YACpB,MAAM,EAAE,aAAa;YACrB,IAAI,EAAE,YAAY;SACnB,CACF,CAAA;QAED,IAAI,eAAe,CAAC,MAAM,KAAK,GAAG,EAAE;YAClC,MAAM,IAAI,6BAAoB,CAC5B,qCAAqC,eAAe,CAAC,MAAM,KAAK,MAAA,eAAe,aAAf,eAAe,uBAAf,eAAe,CAAE,OAAO,0CAAG,qBAAqB,CAAC,GAAG,CACrH,CAAA;SACF;QAED,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;YAC/C,MAAM,IAAI,8BAAqB,CAC7B,gCAAgC,YAAY;;yIAEuF,CACpI,CAAA;SACF;QAED,IAAI,QAAQ,GAAG,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAA;QAChD,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YAC7C,QAAQ,GAAG,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAA;YACxE,IAAI,CAAC,KAAK,CACR,yEAAyE,QAAQ,CAAC,EAAE,GAAG,CACxF,CAAA;SACF;QAED,OAAO;YACL,QAAQ,EAAE;gBACR,IAAI,EAAE,QAAQ,CAAC,IAAI;gBACnB,EAAE,EAAE,QAAQ,CAAC,EAAE;gBACf,IAAI,EAAE,QAAQ,CAAC,aAAa;gBAC5B,SAAS,EAAE,QAAQ,CAAC,UAAU,CAAC,CAAC,CAAC,IAAI,IAAI,CAAC,QAAQ,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,SAAS;aAC3E;SACF,CAAA;;CACF;AA3DD,8CA2DC;AAED,SAAsB,mBAAmB,CACvC,YAAoB;;QAEpB,MAAM,cAAc,GAAG,IAAA,mDAA2B,GAAE,CAAA;QAEpD,MAAM,EAAC,oBAAoB,EAAE,uBAAuB,EAAC,GACnD,IAAA,6BAAsB,GAAE,CAAA;QAE1B,MAAM,GAAG,GAAyB;YAChC,oBAAoB;YACpB,uBAAuB;YACvB,UAAU,EAAE,uBAAW,CAAC,MAAM,CAAC,EAAC,KAAK,EAAE,YAAY,EAAC,CAAC;SACtD,CAAA;QAED,MAAM,GAAG,GAAG,MAAM,cAAc,CAAC,aAAa,CAAC,GAAG,CAAC,CAAA;QAEnD,IAAI,GAAG,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;YAC9B,MAAM,IAAI,8BAAqB,CAC7B,gCAAgC,YAAY;;yIAEuF,CACpI,CAAA;SACF;QAED,IAAI,QAAQ,GAAG,GAAG,CAAC,SAAS,CAAC,CAAC,CAAC,CAAA;QAC/B,IAAI,GAAG,CAAC,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YAC5B,QAAQ,GAAG,GAAG,CAAC,SAAS,CAAC,IAAI,CAC3B,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,MAAM,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,MAAM,CAAC,CAAC,CAAC,UAAU,CAAC,CACtD,CAAC,CAAC,CAAC,CAAA;YAEJ,IAAI,CAAC,KAAK,CACR,yEAAyE,QAAQ,CAAC,UAAU,GAAG,CAChG,CAAA;SACF;QAED,OAAO;YACL,QAAQ,EAAE;gBACR,IAAI,EAAE,QAAQ,CAAC,IAAI;gBACnB,EAAE,EAAE,MAAM,CAAC,QAAQ,CAAC,UAAU,CAAC;gBAC/B,IAAI,EAAE,MAAM,CAAC,QAAQ,CAAC,IAAI,CAAC;gBAC3B,SAAS,EAAE,QAAQ,CAAC,SAAS;oBAC3B,CAAC,CAAC,qBAAS,CAAC,MAAM,CAAC,QAAQ,CAAC,SAAS,CAAC;oBACtC,CAAC,CAAC,SAAS;aACd;SACF,CAAA;IACH,CAAC;CAAA;AA7CD,kDA6CC"}
{"version":3,"file":"get-artifact.js","sourceRoot":"","sources":["../../../src/internal/find/get-artifact.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,4CAA0C;AAC1C,wDAA2C;AAC3C,oDAAqC;AAErC,qDAA0E;AAC1E,mDAA+C;AAC/C,oEAAsD;AAEtD,yCAAqD;AACrD,qDAAuD;AACvD,2EAA2E;AAC3E,+CAA4E;AAC5E,6CAA4E;AAE5E,SAAsB,iBAAiB,CACrC,YAAoB,EACpB,aAAqB,EACrB,eAAuB,EACvB,cAAsB,EACtB,KAAa;;;QAEb,MAAM,CAAC,SAAS,EAAE,WAAW,CAAC,GAAG,IAAA,+BAAe,EAAC,gBAAoB,CAAC,CAAA;QAEtE,MAAM,IAAI,GAAmB;YAC3B,GAAG,EAAE,SAAS;YACd,SAAS,EAAE,IAAA,+BAAkB,GAAE;YAC/B,QAAQ,EAAE,SAAS;YACnB,KAAK,EAAE,SAAS;YAChB,OAAO,EAAE,WAAW;SACrB,CAAA;QAED,MAAM,MAAM,GAAG,IAAA,mBAAU,EAAC,KAAK,EAAE,IAAI,EAAE,oBAAK,EAAE,+BAAU,CAAC,CAAA;QAEzD,MAAM,eAAe,GAAG,MAAM,MAAM,CAAC,OAAO,CAC1C,kEAAkE,EAClE;YACE,KAAK,EAAE,eAAe;YACtB,IAAI,EAAE,cAAc;YACpB,MAAM,EAAE,aAAa;YACrB,IAAI,EAAE,YAAY;SACnB,CACF,CAAA;QAED,IAAI,eAAe,CAAC,MAAM,KAAK,GAAG,EAAE;YAClC,MAAM,IAAI,6BAAoB,CAC5B,qCAAqC,eAAe,CAAC,MAAM,KAAK,MAAA,eAAe,aAAf,eAAe,uBAAf,eAAe,CAAE,OAAO,0CAAG,qBAAqB,CAAC,GAAG,CACrH,CAAA;SACF;QAED,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;YAC/C,MAAM,IAAI,8BAAqB,CAC7B,gCAAgC,YAAY;;yIAEuF,CACpI,CAAA;SACF;QAED,IAAI,QAAQ,GAAG,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAA;QAChD,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YAC7C,QAAQ,GAAG,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAA;YACxE,IAAI,CAAC,KAAK,CACR,yEAAyE,QAAQ,CAAC,EAAE,GAAG,CACxF,CAAA;SACF;QAED,OAAO;YACL,QAAQ,EAAE;gBACR,IAAI,EAAE,QAAQ,CAAC,IAAI;gBACnB,EAAE,EAAE,QAAQ,CAAC,EAAE;gBACf,IAAI,EAAE,QAAQ,CAAC,aAAa;gBAC5B,SAAS,EAAE,QAAQ,CAAC,UAAU;oBAC5B,CAAC,CAAC,IAAI,IAAI,CAAC,QAAQ,CAAC,UAAU,CAAC;oBAC/B,CAAC,CAAC,SAAS;gBACb,MAAM,EAAE,QAAQ,CAAC,MAAM;aACxB;SACF,CAAA;;CACF;AA9DD,8CA8DC;AAED,SAAsB,mBAAmB,CACvC,YAAoB;;;QAEpB,MAAM,cAAc,GAAG,IAAA,mDAA2B,GAAE,CAAA;QAEpD,MAAM,EAAC,oBAAoB,EAAE,uBAAuB,EAAC,GACnD,IAAA,6BAAsB,GAAE,CAAA;QAE1B,MAAM,GAAG,GAAyB;YAChC,oBAAoB;YACpB,uBAAuB;YACvB,UAAU,EAAE,uBAAW,CAAC,MAAM,CAAC,EAAC,KAAK,EAAE,YAAY,EAAC,CAAC;SACtD,CAAA;QAED,MAAM,GAAG,GAAG,MAAM,cAAc,CAAC,aAAa,CAAC,GAAG,CAAC,CAAA;QAEnD,IAAI,GAAG,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;YAC9B,MAAM,IAAI,8BAAqB,CAC7B,gCAAgC,YAAY;;yIAEuF,CACpI,CAAA;SACF;QAED,IAAI,QAAQ,GAAG,GAAG,CAAC,SAAS,CAAC,CAAC,CAAC,CAAA;QAC/B,IAAI,GAAG,CAAC,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YAC5B,QAAQ,GAAG,GAAG,CAAC,SAAS,CAAC,IAAI,CAC3B,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,MAAM,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,MAAM,CAAC,CAAC,CAAC,UAAU,CAAC,CACtD,CAAC,CAAC,CAAC,CAAA;YAEJ,IAAI,CAAC,KAAK,CACR,yEAAyE,QAAQ,CAAC,UAAU,GAAG,CAChG,CAAA;SACF;QAED,OAAO;YACL,QAAQ,EAAE;gBACR,IAAI,EAAE,QAAQ,CAAC,IAAI;gBACnB,EAAE,EAAE,MAAM,CAAC,QAAQ,CAAC,UAAU,CAAC;gBAC/B,IAAI,EAAE,MAAM,CAAC,QAAQ,CAAC,IAAI,CAAC;gBAC3B,SAAS,EAAE,QAAQ,CAAC,SAAS;oBAC3B,CAAC,CAAC,qBAAS,CAAC,MAAM,CAAC,QAAQ,CAAC,SAAS,CAAC;oBACtC,CAAC,CAAC,SAAS;gBACb,MAAM,EAAE,MAAA,QAAQ,CAAC,MAAM,0CAAE,KAAK;aAC/B;SACF,CAAA;;CACF;AA9CD,kDA8CC"}


@@ -38,7 +38,7 @@ function listArtifactsPublic(workflowRunId, repositoryOwner, repositoryName, tok
};
const github = (0, github_1.getOctokit)(token, opts, plugin_retry_1.retry, plugin_request_log_1.requestLog);
let currentPageNumber = 1;
const { data: listArtifactResponse } = yield github.rest.actions.listWorkflowRunArtifacts({
const { data: listArtifactResponse } = yield github.request('GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts', {
owner: repositoryOwner,
repo: repositoryName,
run_id: workflowRunId,
@@ -57,14 +57,18 @@ function listArtifactsPublic(workflowRunId, repositoryOwner, repositoryName, tok
name: artifact.name,
id: artifact.id,
size: artifact.size_in_bytes,
createdAt: artifact.created_at ? new Date(artifact.created_at) : undefined
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: artifact.digest
});
}
// Move to the next page
currentPageNumber++;
// Iterate over any remaining pages
for (currentPageNumber; currentPageNumber < numberOfPages; currentPageNumber++) {
currentPageNumber++;
(0, core_1.debug)(`Fetching page ${currentPageNumber} of artifact list`);
const { data: listArtifactResponse } = yield github.rest.actions.listWorkflowRunArtifacts({
const { data: listArtifactResponse } = yield github.request('GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts', {
owner: repositoryOwner,
repo: repositoryName,
run_id: workflowRunId,
@@ -78,7 +82,8 @@ function listArtifactsPublic(workflowRunId, repositoryOwner, repositoryName, tok
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined
: undefined,
digest: artifact.digest
});
}
}
@@ -101,14 +106,18 @@ function listArtifactsInternal(latest = false) {
workflowJobRunBackendId
};
const res = yield artifactClient.ListArtifacts(req);
let artifacts = res.artifacts.map(artifact => ({
name: artifact.name,
id: Number(artifact.databaseId),
size: Number(artifact.size),
createdAt: artifact.createdAt
? generated_1.Timestamp.toDate(artifact.createdAt)
: undefined
}));
let artifacts = res.artifacts.map(artifact => {
var _a;
return ({
name: artifact.name,
id: Number(artifact.databaseId),
size: Number(artifact.size),
createdAt: artifact.createdAt
? generated_1.Timestamp.toDate(artifact.createdAt)
: undefined,
digest: (_a = artifact.digest) === null || _a === void 0 ? void 0 : _a.value
});
});
if (latest) {
artifacts = filterLatest(artifacts);
}
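With this change, each Artifact returned by the list (and get) paths also carries the digest recorded at upload time. A short usage sketch, assuming the package's DefaultArtifactClient entry point; digest availability depends on what the backend reports.

import {DefaultArtifactClient} from '@actions/artifact'

async function printArtifactDigests(): Promise<void> {
  const client = new DefaultArtifactClient()
  // `digest` now sits alongside name/id/size/createdAt on each listed artifact.
  const {artifacts} = await client.listArtifacts({latest: true})
  for (const artifact of artifacts) {
    console.log(`${artifact.name} (id ${artifact.id}): ${artifact.digest ?? 'no digest reported'}`)
  }
}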


@@ -1 +1 @@
{"version":3,"file":"list-artifacts.js","sourceRoot":"","sources":["../../../src/internal/find/list-artifacts.ts"],"names":[],"mappings":";;;;;;;;;;;;AAAA,wCAAkD;AAClD,4CAA0C;AAE1C,qDAAuD;AACvD,mDAA+C;AAC/C,qDAA0E;AAC1E,oEAAsD;AACtD,wDAA2C;AAE3C,2EAA2E;AAC3E,yCAAqD;AACrD,+CAA+D;AAE/D,oCAAoC;AACpC,MAAM,oBAAoB,GAAG,IAAI,CAAA;AACjC,MAAM,eAAe,GAAG,GAAG,CAAA;AAC3B,MAAM,gBAAgB,GAAG,oBAAoB,GAAG,eAAe,CAAA;AAE/D,SAAsB,mBAAmB,CACvC,aAAqB,EACrB,eAAuB,EACvB,cAAsB,EACtB,KAAa,EACb,MAAM,GAAG,KAAK;;QAEd,IAAA,WAAI,EACF,2CAA2C,aAAa,kBAAkB,eAAe,IAAI,cAAc,EAAE,CAC9G,CAAA;QAED,IAAI,SAAS,GAAe,EAAE,CAAA;QAC9B,MAAM,CAAC,SAAS,EAAE,WAAW,CAAC,GAAG,IAAA,+BAAe,EAAC,gBAAoB,CAAC,CAAA;QAEtE,MAAM,IAAI,GAAmB;YAC3B,GAAG,EAAE,SAAS;YACd,SAAS,EAAE,IAAA,+BAAkB,GAAE;YAC/B,QAAQ,EAAE,SAAS;YACnB,KAAK,EAAE,SAAS;YAChB,OAAO,EAAE,WAAW;SACrB,CAAA;QAED,MAAM,MAAM,GAAG,IAAA,mBAAU,EAAC,KAAK,EAAE,IAAI,EAAE,oBAAK,EAAE,+BAAU,CAAC,CAAA;QAEzD,IAAI,iBAAiB,GAAG,CAAC,CAAA;QACzB,MAAM,EAAC,IAAI,EAAE,oBAAoB,EAAC,GAChC,MAAM,MAAM,CAAC,IAAI,CAAC,OAAO,CAAC,wBAAwB,CAAC;YACjD,KAAK,EAAE,eAAe;YACtB,IAAI,EAAE,cAAc;YACpB,MAAM,EAAE,aAAa;YACrB,QAAQ,EAAE,eAAe;YACzB,IAAI,EAAE,iBAAiB;SACxB,CAAC,CAAA;QAEJ,IAAI,aAAa,GAAG,IAAI,CAAC,IAAI,CAC3B,oBAAoB,CAAC,WAAW,GAAG,eAAe,CACnD,CAAA;QACD,MAAM,kBAAkB,GAAG,oBAAoB,CAAC,WAAW,CAAA;QAC3D,IAAI,kBAAkB,GAAG,oBAAoB,EAAE;YAC7C,IAAA,cAAO,EACL,gBAAgB,aAAa,+EAA+E,oBAAoB,6BAA6B,CAC9J,CAAA;YACD,aAAa,GAAG,gBAAgB,CAAA;SACjC;QAED,8BAA8B;QAC9B,KAAK,MAAM,QAAQ,IAAI,oBAAoB,CAAC,SAAS,EAAE;YACrD,SAAS,CAAC,IAAI,CAAC;gBACb,IAAI,EAAE,QAAQ,CAAC,IAAI;gBACnB,EAAE,EAAE,QAAQ,CAAC,EAAE;gBACf,IAAI,EAAE,QAAQ,CAAC,aAAa;gBAC5B,SAAS,EAAE,QAAQ,CAAC,UAAU,CAAC,CAAC,CAAC,IAAI,IAAI,CAAC,QAAQ,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,SAAS;aAC3E,CAAC,CAAA;SACH;QAED,mCAAmC;QACnC,KACE,iBAAiB,EACjB,iBAAiB,GAAG,aAAa,EACjC,iBAAiB,EAAE,EACnB;YACA,iBAAiB,EAAE,CAAA;YACnB,IAAA,YAAK,EAAC,iBAAiB,iBAAiB,mBAAmB,CAAC,CAAA;YAE5D,MAAM,EAAC,IAAI,EAAE,oBAAoB,EAAC,GAChC,MAAM,MAAM,CAAC,IAAI,CAAC,OAAO,CAAC,wBAAwB,CAAC;gBACjD,KAAK,EAAE,eAAe;gBACtB,IAAI,EAAE,cAAc;gBACpB,MAAM,EAAE,aAAa;gBACrB,QAAQ,EAAE,eAAe;gBACzB,IAAI,EAAE,iBAAiB;aACxB,CAAC,CAAA;YAEJ,KAAK,MAAM,QAAQ,IAAI,oBAAoB,CAAC,SAAS,EAAE;gBACrD,SAAS,CAAC,IAAI,CAAC;oBACb,IAAI,EAAE,QAAQ,CAAC,IAAI;oBACnB,EAAE,EAAE,QAAQ,CAAC,EAAE;oBACf,IAAI,EAAE,QAAQ,CAAC,aAAa;oBAC5B,SAAS,EAAE,QAAQ,CAAC,UAAU;wBAC5B,CAAC,CAAC,IAAI,IAAI,CAAC,QAAQ,CAAC,UAAU,CAAC;wBAC/B,CAAC,CAAC,SAAS;iBACd,CAAC,CAAA;aACH;SACF;QAED,IAAI,MAAM,EAAE;YACV,SAAS,GAAG,YAAY,CAAC,SAAS,CAAC,CAAA;SACpC;QAED,IAAA,WAAI,EAAC,SAAS,SAAS,CAAC,MAAM,cAAc,CAAC,CAAA;QAE7C,OAAO;YACL,SAAS;SACV,CAAA;IACH,CAAC;CAAA;AA9FD,kDA8FC;AAED,SAAsB,qBAAqB,CACzC,MAAM,GAAG,KAAK;;QAEd,MAAM,cAAc,GAAG,IAAA,mDAA2B,GAAE,CAAA;QAEpD,MAAM,EAAC,oBAAoB,EAAE,uBAAuB,EAAC,GACnD,IAAA,6BAAsB,GAAE,CAAA;QAE1B,MAAM,GAAG,GAAyB;YAChC,oBAAoB;YACpB,uBAAuB;SACxB,CAAA;QAED,MAAM,GAAG,GAAG,MAAM,cAAc,CAAC,aAAa,CAAC,GAAG,CAAC,CAAA;QACnD,IAAI,SAAS,GAAe,GAAG,CAAC,SAAS,CAAC,GAAG,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC;YACzD,IAAI,EAAE,QAAQ,CAAC,IAAI;YACnB,EAAE,EAAE,MAAM,CAAC,QAAQ,CAAC,UAAU,CAAC;YAC/B,IAAI,EAAE,MAAM,CAAC,QAAQ,CAAC,IAAI,CAAC;YAC3B,SAAS,EAAE,QAAQ,CAAC,SAAS;gBAC3B,CAAC,CAAC,qBAAS,CAAC,MAAM,CAAC,QAAQ,CAAC,SAAS,CAAC;gBACtC,CAAC,CAAC,SAAS;SACd,CAAC,CAAC,CAAA;QAEH,IAAI,MAAM,EAAE;YACV,SAAS,GAAG,YAAY,CAAC,SAAS,CAAC,CAAA;SACpC;QAED,IAAA,WAAI,EAAC,SAAS,SAAS,CAAC,MAAM,cAAc,CAAC,CAAA;QAE7C,OAAO;YACL,SAAS;SACV,CAAA;IACH,CAAC;CAAA;AAhCD,sDAgCC;AAED;;;;GAIG;AACH,SAAS,YAAY,CAAC,SAAqB;IACzC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC,EAAE,CAAC,CAAA;IACrC,MAAM,eAAe,GAAe,EAAE,CAAA;IACtC,MAAM,iBAAiB,GAAG,IAAI,GAAG,EAAU,CAAA;IAC3C,K
AAK,MAAM,QAAQ,IAAI,SAAS,EAAE;QAChC,IAAI,CAAC,iBAAiB,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,CAAC,EAAE;YACzC,eAAe,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAA;YAC9B,iBAAiB,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAA;SACrC;KACF;IACD,OAAO,eAAe,CAAA;AACxB,CAAC"}
{"version":3,"file":"list-artifacts.js","sourceRoot":"","sources":["../../../src/internal/find/list-artifacts.ts"],"names":[],"mappings":";;;;;;;;;;;;AAAA,wCAAkD;AAClD,4CAA0C;AAE1C,qDAAuD;AACvD,mDAA+C;AAC/C,qDAA0E;AAC1E,oEAAsD;AACtD,wDAA2C;AAE3C,2EAA2E;AAC3E,yCAAqD;AACrD,+CAA+D;AAE/D,oCAAoC;AACpC,MAAM,oBAAoB,GAAG,IAAI,CAAA;AACjC,MAAM,eAAe,GAAG,GAAG,CAAA;AAC3B,MAAM,gBAAgB,GAAG,oBAAoB,GAAG,eAAe,CAAA;AAE/D,SAAsB,mBAAmB,CACvC,aAAqB,EACrB,eAAuB,EACvB,cAAsB,EACtB,KAAa,EACb,MAAM,GAAG,KAAK;;QAEd,IAAA,WAAI,EACF,2CAA2C,aAAa,kBAAkB,eAAe,IAAI,cAAc,EAAE,CAC9G,CAAA;QAED,IAAI,SAAS,GAAe,EAAE,CAAA;QAC9B,MAAM,CAAC,SAAS,EAAE,WAAW,CAAC,GAAG,IAAA,+BAAe,EAAC,gBAAoB,CAAC,CAAA;QAEtE,MAAM,IAAI,GAAmB;YAC3B,GAAG,EAAE,SAAS;YACd,SAAS,EAAE,IAAA,+BAAkB,GAAE;YAC/B,QAAQ,EAAE,SAAS;YACnB,KAAK,EAAE,SAAS;YAChB,OAAO,EAAE,WAAW;SACrB,CAAA;QAED,MAAM,MAAM,GAAG,IAAA,mBAAU,EAAC,KAAK,EAAE,IAAI,EAAE,oBAAK,EAAE,+BAAU,CAAC,CAAA;QAEzD,IAAI,iBAAiB,GAAG,CAAC,CAAA;QAEzB,MAAM,EAAC,IAAI,EAAE,oBAAoB,EAAC,GAAG,MAAM,MAAM,CAAC,OAAO,CACvD,2DAA2D,EAC3D;YACE,KAAK,EAAE,eAAe;YACtB,IAAI,EAAE,cAAc;YACpB,MAAM,EAAE,aAAa;YACrB,QAAQ,EAAE,eAAe;YACzB,IAAI,EAAE,iBAAiB;SACxB,CACF,CAAA;QAED,IAAI,aAAa,GAAG,IAAI,CAAC,IAAI,CAC3B,oBAAoB,CAAC,WAAW,GAAG,eAAe,CACnD,CAAA;QACD,MAAM,kBAAkB,GAAG,oBAAoB,CAAC,WAAW,CAAA;QAC3D,IAAI,kBAAkB,GAAG,oBAAoB,EAAE;YAC7C,IAAA,cAAO,EACL,gBAAgB,aAAa,+EAA+E,oBAAoB,6BAA6B,CAC9J,CAAA;YACD,aAAa,GAAG,gBAAgB,CAAA;SACjC;QAED,8BAA8B;QAC9B,KAAK,MAAM,QAAQ,IAAI,oBAAoB,CAAC,SAAS,EAAE;YACrD,SAAS,CAAC,IAAI,CAAC;gBACb,IAAI,EAAE,QAAQ,CAAC,IAAI;gBACnB,EAAE,EAAE,QAAQ,CAAC,EAAE;gBACf,IAAI,EAAE,QAAQ,CAAC,aAAa;gBAC5B,SAAS,EAAE,QAAQ,CAAC,UAAU;oBAC5B,CAAC,CAAC,IAAI,IAAI,CAAC,QAAQ,CAAC,UAAU,CAAC;oBAC/B,CAAC,CAAC,SAAS;gBACb,MAAM,EAAG,QAA6B,CAAC,MAAM;aAC9C,CAAC,CAAA;SACH;QACD,wBAAwB;QACxB,iBAAiB,EAAE,CAAA;QACnB,mCAAmC;QACnC,KACE,iBAAiB,EACjB,iBAAiB,GAAG,aAAa,EACjC,iBAAiB,EAAE,EACnB;YACA,IAAA,YAAK,EAAC,iBAAiB,iBAAiB,mBAAmB,CAAC,CAAA;YAE5D,MAAM,EAAC,IAAI,EAAE,oBAAoB,EAAC,GAAG,MAAM,MAAM,CAAC,OAAO,CACvD,2DAA2D,EAC3D;gBACE,KAAK,EAAE,eAAe;gBACtB,IAAI,EAAE,cAAc;gBACpB,MAAM,EAAE,aAAa;gBACrB,QAAQ,EAAE,eAAe;gBACzB,IAAI,EAAE,iBAAiB;aACxB,CACF,CAAA;YAED,KAAK,MAAM,QAAQ,IAAI,oBAAoB,CAAC,SAAS,EAAE;gBACrD,SAAS,CAAC,IAAI,CAAC;oBACb,IAAI,EAAE,QAAQ,CAAC,IAAI;oBACnB,EAAE,EAAE,QAAQ,CAAC,EAAE;oBACf,IAAI,EAAE,QAAQ,CAAC,aAAa;oBAC5B,SAAS,EAAE,QAAQ,CAAC,UAAU;wBAC5B,CAAC,CAAC,IAAI,IAAI,CAAC,QAAQ,CAAC,UAAU,CAAC;wBAC/B,CAAC,CAAC,SAAS;oBACb,MAAM,EAAG,QAA6B,CAAC,MAAM;iBAC9C,CAAC,CAAA;aACH;SACF;QAED,IAAI,MAAM,EAAE;YACV,SAAS,GAAG,YAAY,CAAC,SAAS,CAAC,CAAA;SACpC;QAED,IAAA,WAAI,EAAC,SAAS,SAAS,CAAC,MAAM,cAAc,CAAC,CAAA;QAE7C,OAAO;YACL,SAAS;SACV,CAAA;IACH,CAAC;CAAA;AAvGD,kDAuGC;AAED,SAAsB,qBAAqB,CACzC,MAAM,GAAG,KAAK;;QAEd,MAAM,cAAc,GAAG,IAAA,mDAA2B,GAAE,CAAA;QAEpD,MAAM,EAAC,oBAAoB,EAAE,uBAAuB,EAAC,GACnD,IAAA,6BAAsB,GAAE,CAAA;QAE1B,MAAM,GAAG,GAAyB;YAChC,oBAAoB;YACpB,uBAAuB;SACxB,CAAA;QAED,MAAM,GAAG,GAAG,MAAM,cAAc,CAAC,aAAa,CAAC,GAAG,CAAC,CAAA;QACnD,IAAI,SAAS,GAAe,GAAG,CAAC,SAAS,CAAC,GAAG,CAAC,QAAQ,CAAC,EAAE;;YAAC,OAAA,CAAC;gBACzD,IAAI,EAAE,QAAQ,CAAC,IAAI;gBACnB,EAAE,EAAE,MAAM,CAAC,QAAQ,CAAC,UAAU,CAAC;gBAC/B,IAAI,EAAE,MAAM,CAAC,QAAQ,CAAC,IAAI,CAAC;gBAC3B,SAAS,EAAE,QAAQ,CAAC,SAAS;oBAC3B,CAAC,CAAC,qBAAS,CAAC,MAAM,CAAC,QAAQ,CAAC,SAAS,CAAC;oBACtC,CAAC,CAAC,SAAS;gBACb,MAAM,EAAE,MAAA,QAAQ,CAAC,MAAM,0CAAE,KAAK;aAC/B,CAAC,CAAA;SAAA,CAAC,CAAA;QAEH,IAAI,MAAM,EAAE;YACV,SAAS,GAAG,YAAY,CAAC,SAAS,CAAC,CAAA;SACpC;QAED,IAAA,WAAI,EAAC,SAAS,SAAS,CAAC,MAAM,cAAc,CAAC,CAAA;QAE7C,OAAO;YACL,SAAS;SACV,CAAA;IACH,CAAC;CAAA;AAjCD,sDAiCC;AAcD;;;;GAIG;AACH,SAAS,YAAY,CAAC,SAAqB;IACzC,SAAS,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,C
AAC,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC,EAAE,CAAC,CAAA;IACrC,MAAM,eAAe,GAAe,EAAE,CAAA;IACtC,MAAM,iBAAiB,GAAG,IAAI,GAAG,EAAU,CAAA;IAC3C,KAAK,MAAM,QAAQ,IAAI,SAAS,EAAE;QAChC,IAAI,CAAC,iBAAiB,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,CAAC,EAAE;YACzC,eAAe,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAA;YAC9B,iBAAiB,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAA;SACrC;KACF;IACD,OAAO,eAAe,CAAA;AACxB,CAAC"}


@@ -5,6 +5,7 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
Object.defineProperty(exports, "__esModule", { value: true });
exports.getUploadChunkTimeout = exports.getConcurrency = exports.getGitHubWorkspaceDir = exports.isGhes = exports.getResultsServiceUrl = exports.getRuntimeToken = exports.getUploadChunkSize = void 0;
const os_1 = __importDefault(require("os"));
const core_1 = require("@actions/core");
// Used for controlling the highWaterMark value of the zip that is being streamed
// The same value is used as the chunk size that is use during upload to blob storage
function getUploadChunkSize() {
@@ -44,20 +45,42 @@ function getGitHubWorkspaceDir() {
return ghWorkspaceDir;
}
exports.getGitHubWorkspaceDir = getGitHubWorkspaceDir;
// Mimics behavior of azcopy: https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-optimize
// If your machine has fewer than 5 CPUs, then the value of this variable is set to 32.
// Otherwise, the default value is equal to 16 multiplied by the number of CPUs. The maximum value of this variable is 300.
// The maximum value of concurrency is 300.
// This value can be changed with ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY variable.
function getConcurrency() {
const numCPUs = os_1.default.cpus().length;
if (numCPUs <= 4) {
return 32;
let concurrencyCap = 32;
if (numCPUs > 4) {
const concurrency = 16 * numCPUs;
concurrencyCap = concurrency > 300 ? 300 : concurrency;
}
const concurrency = 16 * numCPUs;
return concurrency > 300 ? 300 : concurrency;
const concurrencyOverride = process.env['ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY'];
if (concurrencyOverride) {
const concurrency = parseInt(concurrencyOverride);
if (isNaN(concurrency) || concurrency < 1) {
throw new Error('Invalid value set for ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY env variable');
}
if (concurrency < concurrencyCap) {
(0, core_1.info)(`Set concurrency based on the value set in ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY.`);
return concurrency;
}
(0, core_1.info)(`ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY is higher than the cap of ${concurrencyCap} based on the number of cpus. Set it to the maximum value allowed.`);
return concurrencyCap;
}
// default concurrency to 5
return 5;
}
exports.getConcurrency = getConcurrency;
function getUploadChunkTimeout() {
return 30000; // 30 seconds
const timeoutVar = process.env['ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS'];
if (!timeoutVar) {
return 300000; // 5 minutes
}
const timeout = parseInt(timeoutVar);
if (isNaN(timeout)) {
throw new Error('Invalid value set for ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS env variable');
}
return timeout;
}
exports.getUploadChunkTimeout = getUploadChunkTimeout;
//# sourceMappingURL=config.js.map
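Read as a whole, the new getConcurrency and getUploadChunkTimeout behave as sketched below; this is a simplified restatement of the diff above, with illustrative function names, not the library source.

import * as os from 'os'

// Concurrency: default 5, overridable via ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY,
// clamped to a cap of 32 (4 CPUs or fewer) or min(16 * CPUs, 300) otherwise.
function resolveUploadConcurrency(env: NodeJS.ProcessEnv = process.env): number {
  const numCPUs = os.cpus().length
  const concurrencyCap = numCPUs > 4 ? Math.min(16 * numCPUs, 300) : 32
  const override = env['ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY']
  if (!override) {
    return 5
  }
  const concurrency = parseInt(override)
  if (isNaN(concurrency) || concurrency < 1) {
    throw new Error('Invalid value set for ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY env variable')
  }
  return Math.min(concurrency, concurrencyCap)
}

// Chunk timeout: ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS in milliseconds, defaulting to 300000 (5 minutes).
function resolveUploadChunkTimeout(env: NodeJS.ProcessEnv = process.env): number {
  const raw = env['ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS']
  if (!raw) {
    return 300000
  }
  const timeout = parseInt(raw)
  if (isNaN(timeout)) {
    throw new Error('Invalid value set for ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS env variable')
  }
  return timeout
}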


@@ -1 +1 @@
{"version":3,"file":"config.js","sourceRoot":"","sources":["../../../src/internal/shared/config.ts"],"names":[],"mappings":";;;;;;AAAA,4CAAmB;AAEnB,iFAAiF;AACjF,qFAAqF;AACrF,SAAgB,kBAAkB;IAChC,OAAO,CAAC,GAAG,IAAI,GAAG,IAAI,CAAA,CAAC,cAAc;AACvC,CAAC;AAFD,gDAEC;AAED,SAAgB,eAAe;IAC7B,MAAM,KAAK,GAAG,OAAO,CAAC,GAAG,CAAC,uBAAuB,CAAC,CAAA;IAClD,IAAI,CAAC,KAAK,EAAE;QACV,MAAM,IAAI,KAAK,CAAC,sDAAsD,CAAC,CAAA;KACxE;IACD,OAAO,KAAK,CAAA;AACd,CAAC;AAND,0CAMC;AAED,SAAgB,oBAAoB;IAClC,MAAM,UAAU,GAAG,OAAO,CAAC,GAAG,CAAC,qBAAqB,CAAC,CAAA;IACrD,IAAI,CAAC,UAAU,EAAE;QACf,MAAM,IAAI,KAAK,CAAC,oDAAoD,CAAC,CAAA;KACtE;IAED,OAAO,IAAI,GAAG,CAAC,UAAU,CAAC,CAAC,MAAM,CAAA;AACnC,CAAC;AAPD,oDAOC;AAED,SAAgB,MAAM;IACpB,MAAM,KAAK,GAAG,IAAI,GAAG,CACnB,OAAO,CAAC,GAAG,CAAC,mBAAmB,CAAC,IAAI,oBAAoB,CACzD,CAAA;IAED,MAAM,QAAQ,GAAG,KAAK,CAAC,QAAQ,CAAC,OAAO,EAAE,CAAC,WAAW,EAAE,CAAA;IACvD,MAAM,YAAY,GAAG,QAAQ,KAAK,YAAY,CAAA;IAC9C,MAAM,SAAS,GAAG,QAAQ,CAAC,QAAQ,CAAC,UAAU,CAAC,CAAA;IAC/C,MAAM,WAAW,GAAG,QAAQ,CAAC,QAAQ,CAAC,YAAY,CAAC,CAAA;IAEnD,OAAO,CAAC,YAAY,IAAI,CAAC,SAAS,IAAI,CAAC,WAAW,CAAA;AACpD,CAAC;AAXD,wBAWC;AAED,SAAgB,qBAAqB;IACnC,MAAM,cAAc,GAAG,OAAO,CAAC,GAAG,CAAC,kBAAkB,CAAC,CAAA;IACtD,IAAI,CAAC,cAAc,EAAE;QACnB,MAAM,IAAI,KAAK,CAAC,iDAAiD,CAAC,CAAA;KACnE;IACD,OAAO,cAAc,CAAA;AACvB,CAAC;AAND,sDAMC;AAED,gHAAgH;AAChH,uFAAuF;AACvF,2HAA2H;AAC3H,SAAgB,cAAc;IAC5B,MAAM,OAAO,GAAG,YAAE,CAAC,IAAI,EAAE,CAAC,MAAM,CAAA;IAEhC,IAAI,OAAO,IAAI,CAAC,EAAE;QAChB,OAAO,EAAE,CAAA;KACV;IAED,MAAM,WAAW,GAAG,EAAE,GAAG,OAAO,CAAA;IAChC,OAAO,WAAW,GAAG,GAAG,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,WAAW,CAAA;AAC9C,CAAC;AATD,wCASC;AAED,SAAgB,qBAAqB;IACnC,OAAO,KAAM,CAAA,CAAC,aAAa;AAC7B,CAAC;AAFD,sDAEC"}
{"version":3,"file":"config.js","sourceRoot":"","sources":["../../../src/internal/shared/config.ts"],"names":[],"mappings":";;;;;;AAAA,4CAAmB;AACnB,wCAAkC;AAElC,iFAAiF;AACjF,qFAAqF;AACrF,SAAgB,kBAAkB;IAChC,OAAO,CAAC,GAAG,IAAI,GAAG,IAAI,CAAA,CAAC,cAAc;AACvC,CAAC;AAFD,gDAEC;AAED,SAAgB,eAAe;IAC7B,MAAM,KAAK,GAAG,OAAO,CAAC,GAAG,CAAC,uBAAuB,CAAC,CAAA;IAClD,IAAI,CAAC,KAAK,EAAE;QACV,MAAM,IAAI,KAAK,CAAC,sDAAsD,CAAC,CAAA;KACxE;IACD,OAAO,KAAK,CAAA;AACd,CAAC;AAND,0CAMC;AAED,SAAgB,oBAAoB;IAClC,MAAM,UAAU,GAAG,OAAO,CAAC,GAAG,CAAC,qBAAqB,CAAC,CAAA;IACrD,IAAI,CAAC,UAAU,EAAE;QACf,MAAM,IAAI,KAAK,CAAC,oDAAoD,CAAC,CAAA;KACtE;IAED,OAAO,IAAI,GAAG,CAAC,UAAU,CAAC,CAAC,MAAM,CAAA;AACnC,CAAC;AAPD,oDAOC;AAED,SAAgB,MAAM;IACpB,MAAM,KAAK,GAAG,IAAI,GAAG,CACnB,OAAO,CAAC,GAAG,CAAC,mBAAmB,CAAC,IAAI,oBAAoB,CACzD,CAAA;IAED,MAAM,QAAQ,GAAG,KAAK,CAAC,QAAQ,CAAC,OAAO,EAAE,CAAC,WAAW,EAAE,CAAA;IACvD,MAAM,YAAY,GAAG,QAAQ,KAAK,YAAY,CAAA;IAC9C,MAAM,SAAS,GAAG,QAAQ,CAAC,QAAQ,CAAC,UAAU,CAAC,CAAA;IAC/C,MAAM,WAAW,GAAG,QAAQ,CAAC,QAAQ,CAAC,YAAY,CAAC,CAAA;IAEnD,OAAO,CAAC,YAAY,IAAI,CAAC,SAAS,IAAI,CAAC,WAAW,CAAA;AACpD,CAAC;AAXD,wBAWC;AAED,SAAgB,qBAAqB;IACnC,MAAM,cAAc,GAAG,OAAO,CAAC,GAAG,CAAC,kBAAkB,CAAC,CAAA;IACtD,IAAI,CAAC,cAAc,EAAE;QACnB,MAAM,IAAI,KAAK,CAAC,iDAAiD,CAAC,CAAA;KACnE;IACD,OAAO,cAAc,CAAA;AACvB,CAAC;AAND,sDAMC;AAED,2CAA2C;AAC3C,+EAA+E;AAC/E,SAAgB,cAAc;IAC5B,MAAM,OAAO,GAAG,YAAE,CAAC,IAAI,EAAE,CAAC,MAAM,CAAA;IAChC,IAAI,cAAc,GAAG,EAAE,CAAA;IAEvB,IAAI,OAAO,GAAG,CAAC,EAAE;QACf,MAAM,WAAW,GAAG,EAAE,GAAG,OAAO,CAAA;QAChC,cAAc,GAAG,WAAW,GAAG,GAAG,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,WAAW,CAAA;KACvD;IAED,MAAM,mBAAmB,GAAG,OAAO,CAAC,GAAG,CAAC,qCAAqC,CAAC,CAAA;IAC9E,IAAI,mBAAmB,EAAE;QACvB,MAAM,WAAW,GAAG,QAAQ,CAAC,mBAAmB,CAAC,CAAA;QACjD,IAAI,KAAK,CAAC,WAAW,CAAC,IAAI,WAAW,GAAG,CAAC,EAAE;YACzC,MAAM,IAAI,KAAK,CACb,wEAAwE,CACzE,CAAA;SACF;QAED,IAAI,WAAW,GAAG,cAAc,EAAE;YAChC,IAAA,WAAI,EACF,gFAAgF,CACjF,CAAA;YACD,OAAO,WAAW,CAAA;SACnB;QAED,IAAA,WAAI,EACF,iEAAiE,cAAc,oEAAoE,CACpJ,CAAA;QACD,OAAO,cAAc,CAAA;KACtB;IAED,2BAA2B;IAC3B,OAAO,CAAC,CAAA;AACV,CAAC;AAjCD,wCAiCC;AAED,SAAgB,qBAAqB;IACnC,MAAM,UAAU,GAAG,OAAO,CAAC,GAAG,CAAC,oCAAoC,CAAC,CAAA;IACpE,IAAI,CAAC,UAAU,EAAE;QACf,OAAO,MAAM,CAAA,CAAC,YAAY;KAC3B;IAED,MAAM,OAAO,GAAG,QAAQ,CAAC,UAAU,CAAC,CAAA;IACpC,IAAI,KAAK,CAAC,OAAO,CAAC,EAAE;QAClB,MAAM,IAAI,KAAK,CACb,uEAAuE,CACxE,CAAA;KACF;IAED,OAAO,OAAO,CAAA;AAChB,CAAC;AAdD,sDAcC"}


@@ -11,6 +11,10 @@ export interface UploadArtifactResponse {
* This ID can be used as input to other APIs to download, delete or get more information about an artifact: https://docs.github.com/en/rest/actions/artifacts
*/
id?: number;
/**
* The SHA256 digest of the artifact that was created. Not provided if no artifact was uploaded
*/
digest?: string;
}
/**
* Options for uploading an artifact
@@ -80,6 +84,10 @@ export interface DownloadArtifactResponse {
* The path where the artifact was downloaded to
*/
downloadPath?: string;
/**
* Returns true if the digest of the downloaded artifact does not match the expected hash
*/
digestMismatch?: boolean;
}
/**
* Options for downloading an artifact
@@ -89,6 +97,18 @@ export interface DownloadArtifactOptions {
* Denotes where the artifact will be downloaded to. If not specified then the artifact is downloaded to GITHUB_WORKSPACE
*/
path?: string;
/**
* The hash that was computed for the artifact during upload. If provided, the outcome of the download
* will provide a digestMismatch property indicating whether the hash of the downloaded artifact
* matches the expected hash.
*/
expectedHash?: string;
}
export interface StreamExtractResponse {
/**
* The SHA256 hash of the downloaded file
*/
sha256Digest?: string;
}
/**
* An Actions Artifact
@@ -110,6 +130,10 @@ export interface Artifact {
* The time when the artifact was created
*/
createdAt?: Date;
/**
* The digest of the artifact, computed at time of upload.
*/
digest?: string;
}
export interface FindOptions {
/**


@@ -95,6 +95,7 @@ function uploadArtifact(name, files, rootDirectory, options) {
core.info(`Artifact ${name}.zip successfully finalized. Artifact ID ${artifactId}`);
return {
size: uploadResult.uploadSize,
digest: uploadResult.sha256Hash,
id: Number(artifactId)
};
});
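Taken together with the interface changes above, the upload result now exposes the SHA-256 digest of the finalized zip, and the download types accept an `expectedHash` to verify against. A minimal sketch of how a caller might wire the two together, assuming the v2 `DefaultArtifactClient` export from `@actions/artifact` (the client class itself is not part of this diff):

```ts
import {DefaultArtifactClient} from '@actions/artifact' // assumed v2 client export; not shown in this diff

async function roundTrip(): Promise<void> {
  const client = new DefaultArtifactClient()

  // Upload: the response now also carries the SHA-256 digest computed while zipping.
  const {id, digest} = await client.uploadArtifact('my-artifact', ['out/report.txt'], 'out')
  if (id === undefined) {
    throw new Error('upload did not return an artifact ID')
  }

  // Download: hand the recorded digest back so the client can flag a mismatch.
  const {downloadPath, digestMismatch} = await client.downloadArtifact(id, {
    path: 'restored',
    expectedHash: digest
  })

  if (digestMismatch) {
    throw new Error(`artifact at ${downloadPath} does not match the digest recorded at upload`)
  }
}
```

The artifact name, file list, and paths here are placeholders; only the `digest`, `expectedHash`, and `digestMismatch` fields come from the diff above.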


@@ -1 +1 @@
{"version":3,"file":"upload-artifact.js","sourceRoot":"","sources":["../../../src/internal/upload/upload-artifact.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,oDAAqC;AAKrC,2CAAyC;AACzC,2FAAwE;AACxE,2EAA2E;AAC3E,yEAImC;AACnC,yCAAqD;AACrD,+CAAoD;AACpD,+BAA2C;AAC3C,+CAIwB;AACxB,6CAAyE;AAEzE,SAAsB,cAAc,CAClC,IAAY,EACZ,KAAe,EACf,aAAqB,EACrB,OAA2C;;QAE3C,IAAA,wDAAoB,EAAC,IAAI,CAAC,CAAA;QAC1B,IAAA,gDAAqB,EAAC,aAAa,CAAC,CAAA;QAEpC,MAAM,gBAAgB,GAA6B,IAAA,oDAAyB,EAC1E,KAAK,EACL,aAAa,CACd,CAAA;QACD,IAAI,gBAAgB,CAAC,MAAM,KAAK,CAAC,EAAE;YACjC,MAAM,IAAI,2BAAkB,CAC1B,gBAAgB,CAAC,OAAO,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CACpE,CAAA;SACF;QAED,+CAA+C;QAC/C,MAAM,UAAU,GAAG,IAAA,6BAAsB,GAAE,CAAA;QAE3C,6BAA6B;QAC7B,MAAM,cAAc,GAAG,IAAA,mDAA2B,GAAE,CAAA;QAEpD,sBAAsB;QACtB,MAAM,iBAAiB,GAA0B;YAC/C,oBAAoB,EAAE,UAAU,CAAC,oBAAoB;YACrD,uBAAuB,EAAE,UAAU,CAAC,uBAAuB;YAC3D,IAAI;YACJ,OAAO,EAAE,CAAC;SACX,CAAA;QAED,wDAAwD;QACxD,MAAM,SAAS,GAAG,IAAA,yBAAa,EAAC,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,aAAa,CAAC,CAAA;QACvD,IAAI,SAAS,EAAE;YACb,iBAAiB,CAAC,SAAS,GAAG,SAAS,CAAA;SACxC;QAED,MAAM,kBAAkB,GACtB,MAAM,cAAc,CAAC,cAAc,CAAC,iBAAiB,CAAC,CAAA;QACxD,IAAI,CAAC,kBAAkB,CAAC,EAAE,EAAE;YAC1B,MAAM,IAAI,6BAAoB,CAC5B,kDAAkD,CACnD,CAAA;SACF;QAED,MAAM,eAAe,GAAG,MAAM,IAAA,2BAAqB,EACjD,gBAAgB,EAChB,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,gBAAgB,CAC1B,CAAA;QAED,6BAA6B;QAC7B,MAAM,YAAY,GAAG,MAAM,IAAA,oCAAsB,EAC/C,kBAAkB,CAAC,eAAe,EAClC,eAAe,CAChB,CAAA;QAED,wBAAwB;QACxB,MAAM,mBAAmB,GAA4B;YACnD,oBAAoB,EAAE,UAAU,CAAC,oBAAoB;YACrD,uBAAuB,EAAE,UAAU,CAAC,uBAAuB;YAC3D,IAAI;YACJ,IAAI,EAAE,YAAY,CAAC,UAAU,CAAC,CAAC,CAAC,YAAY,CAAC,UAAU,CAAC,QAAQ,EAAE,CAAC,CAAC,CAAC,GAAG;SACzE,CAAA;QAED,IAAI,YAAY,CAAC,UAAU,EAAE;YAC3B,mBAAmB,CAAC,IAAI,GAAG,uBAAW,CAAC,MAAM,CAAC;gBAC5C,KAAK,EAAE,UAAU,YAAY,CAAC,UAAU,EAAE;aAC3C,CAAC,CAAA;SACH;QAED,IAAI,CAAC,IAAI,CAAC,4BAA4B,CAAC,CAAA;QAEvC,MAAM,oBAAoB,GACxB,MAAM,cAAc,CAAC,gBAAgB,CAAC,mBAAmB,CAAC,CAAA;QAC5D,IAAI,CAAC,oBAAoB,CAAC,EAAE,EAAE;YAC5B,MAAM,IAAI,6BAAoB,CAC5B,oDAAoD,CACrD,CAAA;SACF;QAED,MAAM,UAAU,GAAG,MAAM,CAAC,oBAAoB,CAAC,UAAU,CAAC,CAAA;QAC1D,IAAI,CAAC,IAAI,CACP,YAAY,IAAI,4CAA4C,UAAU,EAAE,CACzE,CAAA;QAED,OAAO;YACL,IAAI,EAAE,YAAY,CAAC,UAAU;YAC7B,EAAE,EAAE,MAAM,CAAC,UAAU,CAAC;SACvB,CAAA;IACH,CAAC;CAAA;AA3FD,wCA2FC"}
{"version":3,"file":"upload-artifact.js","sourceRoot":"","sources":["../../../src/internal/upload/upload-artifact.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,oDAAqC;AAKrC,2CAAyC;AACzC,2FAAwE;AACxE,2EAA2E;AAC3E,yEAImC;AACnC,yCAAqD;AACrD,+CAAoD;AACpD,+BAA2C;AAC3C,+CAIwB;AACxB,6CAAyE;AAEzE,SAAsB,cAAc,CAClC,IAAY,EACZ,KAAe,EACf,aAAqB,EACrB,OAA2C;;QAE3C,IAAA,wDAAoB,EAAC,IAAI,CAAC,CAAA;QAC1B,IAAA,gDAAqB,EAAC,aAAa,CAAC,CAAA;QAEpC,MAAM,gBAAgB,GAA6B,IAAA,oDAAyB,EAC1E,KAAK,EACL,aAAa,CACd,CAAA;QACD,IAAI,gBAAgB,CAAC,MAAM,KAAK,CAAC,EAAE;YACjC,MAAM,IAAI,2BAAkB,CAC1B,gBAAgB,CAAC,OAAO,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CACpE,CAAA;SACF;QAED,+CAA+C;QAC/C,MAAM,UAAU,GAAG,IAAA,6BAAsB,GAAE,CAAA;QAE3C,6BAA6B;QAC7B,MAAM,cAAc,GAAG,IAAA,mDAA2B,GAAE,CAAA;QAEpD,sBAAsB;QACtB,MAAM,iBAAiB,GAA0B;YAC/C,oBAAoB,EAAE,UAAU,CAAC,oBAAoB;YACrD,uBAAuB,EAAE,UAAU,CAAC,uBAAuB;YAC3D,IAAI;YACJ,OAAO,EAAE,CAAC;SACX,CAAA;QAED,wDAAwD;QACxD,MAAM,SAAS,GAAG,IAAA,yBAAa,EAAC,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,aAAa,CAAC,CAAA;QACvD,IAAI,SAAS,EAAE;YACb,iBAAiB,CAAC,SAAS,GAAG,SAAS,CAAA;SACxC;QAED,MAAM,kBAAkB,GACtB,MAAM,cAAc,CAAC,cAAc,CAAC,iBAAiB,CAAC,CAAA;QACxD,IAAI,CAAC,kBAAkB,CAAC,EAAE,EAAE;YAC1B,MAAM,IAAI,6BAAoB,CAC5B,kDAAkD,CACnD,CAAA;SACF;QAED,MAAM,eAAe,GAAG,MAAM,IAAA,2BAAqB,EACjD,gBAAgB,EAChB,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,gBAAgB,CAC1B,CAAA;QAED,6BAA6B;QAC7B,MAAM,YAAY,GAAG,MAAM,IAAA,oCAAsB,EAC/C,kBAAkB,CAAC,eAAe,EAClC,eAAe,CAChB,CAAA;QAED,wBAAwB;QACxB,MAAM,mBAAmB,GAA4B;YACnD,oBAAoB,EAAE,UAAU,CAAC,oBAAoB;YACrD,uBAAuB,EAAE,UAAU,CAAC,uBAAuB;YAC3D,IAAI;YACJ,IAAI,EAAE,YAAY,CAAC,UAAU,CAAC,CAAC,CAAC,YAAY,CAAC,UAAU,CAAC,QAAQ,EAAE,CAAC,CAAC,CAAC,GAAG;SACzE,CAAA;QAED,IAAI,YAAY,CAAC,UAAU,EAAE;YAC3B,mBAAmB,CAAC,IAAI,GAAG,uBAAW,CAAC,MAAM,CAAC;gBAC5C,KAAK,EAAE,UAAU,YAAY,CAAC,UAAU,EAAE;aAC3C,CAAC,CAAA;SACH;QAED,IAAI,CAAC,IAAI,CAAC,4BAA4B,CAAC,CAAA;QAEvC,MAAM,oBAAoB,GACxB,MAAM,cAAc,CAAC,gBAAgB,CAAC,mBAAmB,CAAC,CAAA;QAC5D,IAAI,CAAC,oBAAoB,CAAC,EAAE,EAAE;YAC5B,MAAM,IAAI,6BAAoB,CAC5B,oDAAoD,CACrD,CAAA;SACF;QAED,MAAM,UAAU,GAAG,MAAM,CAAC,oBAAoB,CAAC,UAAU,CAAC,CAAA;QAC1D,IAAI,CAAC,IAAI,CACP,YAAY,IAAI,4CAA4C,UAAU,EAAE,CACzE,CAAA;QAED,OAAO;YACL,IAAI,EAAE,YAAY,CAAC,UAAU;YAC7B,MAAM,EAAE,YAAY,CAAC,UAAU;YAC/B,EAAE,EAAE,MAAM,CAAC,UAAU,CAAC;SACvB,CAAA;IACH,CAAC;CAAA;AA5FD,wCA4FC"}


@@ -1,3 +1,5 @@
/// <reference types="node" />
import * as fs from 'fs';
export interface UploadZipSpecification {
/**
* An absolute source path that points to a file that will be added to a zip. Null if creating a new directory
@@ -7,6 +9,11 @@ export interface UploadZipSpecification {
* The destination path in a zip for a file
*/
destinationPath: string;
/**
* Information about the file
* https://nodejs.org/api/fs.html#class-fsstats
*/
stats: fs.Stats;
}
/**
* Checks if a root directory exists and is valid


@@ -79,10 +79,11 @@ function getUploadZipSpecification(filesToZip, rootDirectory) {
- file3.txt
*/
for (let file of filesToZip) {
if (!fs.existsSync(file)) {
const stats = fs.lstatSync(file, { throwIfNoEntry: false });
if (!stats) {
throw new Error(`File ${file} does not exist`);
}
if (!fs.statSync(file).isDirectory()) {
if (!stats.isDirectory()) {
// Normalize and resolve, this allows for either absolute or relative paths to be used
file = (0, path_1.normalize)(file);
file = (0, path_1.resolve)(file);
@@ -94,7 +95,8 @@ function getUploadZipSpecification(filesToZip, rootDirectory) {
(0, path_and_artifact_name_validation_1.validateFilePath)(uploadPath);
specification.push({
sourcePath: file,
destinationPath: uploadPath
destinationPath: uploadPath,
stats
});
}
else {
@@ -103,7 +105,8 @@ function getUploadZipSpecification(filesToZip, rootDirectory) {
(0, path_and_artifact_name_validation_1.validateFilePath)(directoryPath);
specification.push({
sourcePath: null,
destinationPath: directoryPath
destinationPath: directoryPath,
stats
});
}
}
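The specification entries now carry the `fs.Stats` from a single `lstatSync` call with `throwIfNoEntry: false`, replacing the separate `existsSync`/`statSync` checks. A small sketch of why `lstat` matters here, using a hypothetical path:

```ts
import * as fs from 'fs'

// lstat inspects the entry itself rather than following links, which is what later
// lets the zip step ask `stats.isSymbolicLink()` and resolve the real path.
const stats = fs.lstatSync('maybe-a-symlink', {throwIfNoEntry: false})
if (!stats) {
  throw new Error('File maybe-a-symlink does not exist')
} else if (stats.isSymbolicLink()) {
  console.log('symlink: will be resolved with realpath before zipping')
} else if (stats.isDirectory()) {
  console.log('directory: recorded as a directory entry in the zip')
} else {
  console.log('regular file: added to the zip directly')
}
```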


@@ -1 +1 @@
{"version":3,"file":"upload-zip-specification.js","sourceRoot":"","sources":["../../../src/internal/upload/upload-zip-specification.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,uCAAwB;AACxB,wCAAkC;AAClC,+BAAuC;AACvC,2FAAoE;AAcpE;;;GAGG;AACH,SAAgB,qBAAqB,CAAC,aAAqB;IACzD,IAAI,CAAC,EAAE,CAAC,UAAU,CAAC,aAAa,CAAC,EAAE;QACjC,MAAM,IAAI,KAAK,CACb,8BAA8B,aAAa,iBAAiB,CAC7D,CAAA;KACF;IACD,IAAI,CAAC,EAAE,CAAC,QAAQ,CAAC,aAAa,CAAC,CAAC,WAAW,EAAE,EAAE;QAC7C,MAAM,IAAI,KAAK,CACb,8BAA8B,aAAa,2BAA2B,CACvE,CAAA;KACF;IACD,IAAA,WAAI,EAAC,gCAAgC,CAAC,CAAA;AACxC,CAAC;AAZD,sDAYC;AAED;;;;GAIG;AACH,SAAgB,yBAAyB,CACvC,UAAoB,EACpB,aAAqB;IAErB,MAAM,aAAa,GAA6B,EAAE,CAAA;IAElD,sFAAsF;IACtF,aAAa,GAAG,IAAA,gBAAS,EAAC,aAAa,CAAC,CAAA;IACxC,aAAa,GAAG,IAAA,cAAO,EAAC,aAAa,CAAC,CAAA;IAEtC;;;;;;;;;;;;;;;;;;;;;;;;;MAyBE;IACF,KAAK,IAAI,IAAI,IAAI,UAAU,EAAE;QAC3B,IAAI,CAAC,EAAE,CAAC,UAAU,CAAC,IAAI,CAAC,EAAE;YACxB,MAAM,IAAI,KAAK,CAAC,QAAQ,IAAI,iBAAiB,CAAC,CAAA;SAC/C;QACD,IAAI,CAAC,EAAE,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,WAAW,EAAE,EAAE;YACpC,sFAAsF;YACtF,IAAI,GAAG,IAAA,gBAAS,EAAC,IAAI,CAAC,CAAA;YACtB,IAAI,GAAG,IAAA,cAAO,EAAC,IAAI,CAAC,CAAA;YACpB,IAAI,CAAC,IAAI,CAAC,UAAU,CAAC,aAAa,CAAC,EAAE;gBACnC,MAAM,IAAI,KAAK,CACb,sBAAsB,aAAa,2CAA2C,IAAI,EAAE,CACrF,CAAA;aACF;YAED,yHAAyH;YACzH,MAAM,UAAU,GAAG,IAAI,CAAC,OAAO,CAAC,aAAa,EAAE,EAAE,CAAC,CAAA;YAClD,IAAA,oDAAgB,EAAC,UAAU,CAAC,CAAA;YAE5B,aAAa,CAAC,IAAI,CAAC;gBACjB,UAAU,EAAE,IAAI;gBAChB,eAAe,EAAE,UAAU;aAC5B,CAAC,CAAA;SACH;aAAM;YACL,kBAAkB;YAClB,MAAM,aAAa,GAAG,IAAI,CAAC,OAAO,CAAC,aAAa,EAAE,EAAE,CAAC,CAAA;YACrD,IAAA,oDAAgB,EAAC,aAAa,CAAC,CAAA;YAE/B,aAAa,CAAC,IAAI,CAAC;gBACjB,UAAU,EAAE,IAAI;gBAChB,eAAe,EAAE,aAAa;aAC/B,CAAC,CAAA;SACH;KACF;IACD,OAAO,aAAa,CAAA;AACtB,CAAC;AAtED,8DAsEC"}
{"version":3,"file":"upload-zip-specification.js","sourceRoot":"","sources":["../../../src/internal/upload/upload-zip-specification.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,uCAAwB;AACxB,wCAAkC;AAClC,+BAAuC;AACvC,2FAAoE;AAoBpE;;;GAGG;AACH,SAAgB,qBAAqB,CAAC,aAAqB;IACzD,IAAI,CAAC,EAAE,CAAC,UAAU,CAAC,aAAa,CAAC,EAAE;QACjC,MAAM,IAAI,KAAK,CACb,8BAA8B,aAAa,iBAAiB,CAC7D,CAAA;KACF;IACD,IAAI,CAAC,EAAE,CAAC,QAAQ,CAAC,aAAa,CAAC,CAAC,WAAW,EAAE,EAAE;QAC7C,MAAM,IAAI,KAAK,CACb,8BAA8B,aAAa,2BAA2B,CACvE,CAAA;KACF;IACD,IAAA,WAAI,EAAC,gCAAgC,CAAC,CAAA;AACxC,CAAC;AAZD,sDAYC;AAED;;;;GAIG;AACH,SAAgB,yBAAyB,CACvC,UAAoB,EACpB,aAAqB;IAErB,MAAM,aAAa,GAA6B,EAAE,CAAA;IAElD,sFAAsF;IACtF,aAAa,GAAG,IAAA,gBAAS,EAAC,aAAa,CAAC,CAAA;IACxC,aAAa,GAAG,IAAA,cAAO,EAAC,aAAa,CAAC,CAAA;IAEtC;;;;;;;;;;;;;;;;;;;;;;;;;MAyBE;IACF,KAAK,IAAI,IAAI,IAAI,UAAU,EAAE;QAC3B,MAAM,KAAK,GAAG,EAAE,CAAC,SAAS,CAAC,IAAI,EAAE,EAAC,cAAc,EAAE,KAAK,EAAC,CAAC,CAAA;QACzD,IAAI,CAAC,KAAK,EAAE;YACV,MAAM,IAAI,KAAK,CAAC,QAAQ,IAAI,iBAAiB,CAAC,CAAA;SAC/C;QACD,IAAI,CAAC,KAAK,CAAC,WAAW,EAAE,EAAE;YACxB,sFAAsF;YACtF,IAAI,GAAG,IAAA,gBAAS,EAAC,IAAI,CAAC,CAAA;YACtB,IAAI,GAAG,IAAA,cAAO,EAAC,IAAI,CAAC,CAAA;YACpB,IAAI,CAAC,IAAI,CAAC,UAAU,CAAC,aAAa,CAAC,EAAE;gBACnC,MAAM,IAAI,KAAK,CACb,sBAAsB,aAAa,2CAA2C,IAAI,EAAE,CACrF,CAAA;aACF;YAED,yHAAyH;YACzH,MAAM,UAAU,GAAG,IAAI,CAAC,OAAO,CAAC,aAAa,EAAE,EAAE,CAAC,CAAA;YAClD,IAAA,oDAAgB,EAAC,UAAU,CAAC,CAAA;YAE5B,aAAa,CAAC,IAAI,CAAC;gBACjB,UAAU,EAAE,IAAI;gBAChB,eAAe,EAAE,UAAU;gBAC3B,KAAK;aACN,CAAC,CAAA;SACH;aAAM;YACL,kBAAkB;YAClB,MAAM,aAAa,GAAG,IAAI,CAAC,OAAO,CAAC,aAAa,EAAE,EAAE,CAAC,CAAA;YACrD,IAAA,oDAAgB,EAAC,aAAa,CAAC,CAAA;YAE/B,aAAa,CAAC,IAAI,CAAC;gBACjB,UAAU,EAAE,IAAI;gBAChB,eAAe,EAAE,aAAa;gBAC9B,KAAK;aACN,CAAC,CAAA;SACH;KACF;IACD,OAAO,aAAa,CAAA;AACtB,CAAC;AAzED,8DAyEC"}


@@ -34,6 +34,7 @@ var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, ge
Object.defineProperty(exports, "__esModule", { value: true });
exports.createZipUploadStream = exports.ZipUploadStream = exports.DEFAULT_COMPRESSION_LEVEL = void 0;
const stream = __importStar(require("stream"));
const promises_1 = require("fs/promises");
const archiver = __importStar(require("archiver"));
const core = __importStar(require("@actions/core"));
const config_1 = require("../shared/config");
@@ -66,8 +67,13 @@ function createZipUploadStream(uploadSpecification, compressionLevel = exports.D
zip.on('end', zipEndCallback);
for (const file of uploadSpecification) {
if (file.sourcePath !== null) {
// Add a normal file to the zip
zip.file(file.sourcePath, {
// Check if symlink and resolve the source path
let sourcePath = file.sourcePath;
if (file.stats.isSymbolicLink()) {
sourcePath = yield (0, promises_1.realpath)(file.sourcePath);
}
// Add the file to the zip
zip.file(sourcePath, {
name: file.destinationPath
});
}
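With the stats recorded on each entry, the zip step now resolves symlinks to their targets before adding the file. A standalone sketch of the same idea, with a hypothetical helper name:

```ts
import * as fs from 'fs'
import {realpath} from 'fs/promises'

// Hypothetical helper mirroring the change above: when an entry is a symlink,
// archive the target it points at rather than the link itself.
async function resolveSourceForZip(sourcePath: string, stats: fs.Stats): Promise<string> {
  return stats.isSymbolicLink() ? await realpath(sourcePath) : sourcePath
}
```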


@@ -1 +1 @@
{"version":3,"file":"zip.js","sourceRoot":"","sources":["../../../src/internal/upload/zip.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,+CAAgC;AAChC,mDAAoC;AACpC,oDAAqC;AAErC,6CAAmD;AAEtC,QAAA,yBAAyB,GAAG,CAAC,CAAA;AAE1C,qEAAqE;AACrE,iDAAiD;AACjD,MAAa,eAAgB,SAAQ,MAAM,CAAC,SAAS;IACnD,YAAY,UAAkB;QAC5B,KAAK,CAAC;YACJ,aAAa,EAAE,UAAU;SAC1B,CAAC,CAAA;IACJ,CAAC;IAED,8DAA8D;IAC9D,UAAU,CAAC,KAAU,EAAE,GAAQ,EAAE,EAAO;QACtC,EAAE,CAAC,IAAI,EAAE,KAAK,CAAC,CAAA;IACjB,CAAC;CACF;AAXD,0CAWC;AAED,SAAsB,qBAAqB,CACzC,mBAA6C,EAC7C,mBAA2B,iCAAyB;;QAEpD,IAAI,CAAC,KAAK,CACR,oDAAoD,gBAAgB,EAAE,CACvE,CAAA;QAED,MAAM,GAAG,GAAG,QAAQ,CAAC,MAAM,CAAC,KAAK,EAAE;YACjC,aAAa,EAAE,IAAA,2BAAkB,GAAE;YACnC,IAAI,EAAE,EAAC,KAAK,EAAE,gBAAgB,EAAC;SAChC,CAAC,CAAA;QAEF,iEAAiE;QACjE,GAAG,CAAC,EAAE,CAAC,OAAO,EAAE,gBAAgB,CAAC,CAAA;QACjC,GAAG,CAAC,EAAE,CAAC,SAAS,EAAE,kBAAkB,CAAC,CAAA;QACrC,GAAG,CAAC,EAAE,CAAC,QAAQ,EAAE,iBAAiB,CAAC,CAAA;QACnC,GAAG,CAAC,EAAE,CAAC,KAAK,EAAE,cAAc,CAAC,CAAA;QAE7B,KAAK,MAAM,IAAI,IAAI,mBAAmB,EAAE;YACtC,IAAI,IAAI,CAAC,UAAU,KAAK,IAAI,EAAE;gBAC5B,+BAA+B;gBAC/B,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC,UAAU,EAAE;oBACxB,IAAI,EAAE,IAAI,CAAC,eAAe;iBAC3B,CAAC,CAAA;aACH;iBAAM;gBACL,6BAA6B;gBAC7B,GAAG,CAAC,MAAM,CAAC,EAAE,EAAE,EAAC,IAAI,EAAE,IAAI,CAAC,eAAe,EAAC,CAAC,CAAA;aAC7C;SACF;QAED,MAAM,UAAU,GAAG,IAAA,2BAAkB,GAAE,CAAA;QACvC,MAAM,eAAe,GAAG,IAAI,eAAe,CAAC,UAAU,CAAC,CAAA;QAEvD,IAAI,CAAC,KAAK,CACR,kCAAkC,eAAe,CAAC,qBAAqB,EAAE,CAC1E,CAAA;QACD,IAAI,CAAC,KAAK,CACR,iCAAiC,eAAe,CAAC,qBAAqB,EAAE,CACzE,CAAA;QAED,GAAG,CAAC,IAAI,CAAC,eAAe,CAAC,CAAA;QACzB,GAAG,CAAC,QAAQ,EAAE,CAAA;QAEd,OAAO,eAAe,CAAA;IACxB,CAAC;CAAA;AA7CD,sDA6CC;AAED,8DAA8D;AAC9D,MAAM,gBAAgB,GAAG,CAAC,KAAU,EAAQ,EAAE;IAC5C,IAAI,CAAC,KAAK,CAAC,8DAA8D,CAAC,CAAA;IAC1E,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,CAAA;IAEhB,MAAM,IAAI,KAAK,CAAC,4DAA4D,CAAC,CAAA;AAC/E,CAAC,CAAA;AAED,8DAA8D;AAC9D,MAAM,kBAAkB,GAAG,CAAC,KAAU,EAAQ,EAAE;IAC9C,IAAI,KAAK,CAAC,IAAI,KAAK,QAAQ,EAAE;QAC3B,IAAI,CAAC,OAAO,CACV,wEAAwE,CACzE,CAAA;QACD,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,CAAA;KACjB;SAAM;QACL,IAAI,CAAC,OAAO,CACV,qEAAqE,KAAK,CAAC,IAAI,EAAE,CAClF,CAAA;QACD,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,CAAA;KACjB;AACH,CAAC,CAAA;AAED,MAAM,iBAAiB,GAAG,GAAS,EAAE;IACnC,IAAI,CAAC,KAAK,CAAC,qCAAqC,CAAC,CAAA;AACnD,CAAC,CAAA;AAED,MAAM,cAAc,GAAG,GAAS,EAAE;IAChC,IAAI,CAAC,KAAK,CAAC,kCAAkC,CAAC,CAAA;AAChD,CAAC,CAAA"}
{"version":3,"file":"zip.js","sourceRoot":"","sources":["../../../src/internal/upload/zip.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,+CAAgC;AAChC,0CAAoC;AACpC,mDAAoC;AACpC,oDAAqC;AAErC,6CAAmD;AAEtC,QAAA,yBAAyB,GAAG,CAAC,CAAA;AAE1C,qEAAqE;AACrE,iDAAiD;AACjD,MAAa,eAAgB,SAAQ,MAAM,CAAC,SAAS;IACnD,YAAY,UAAkB;QAC5B,KAAK,CAAC;YACJ,aAAa,EAAE,UAAU;SAC1B,CAAC,CAAA;IACJ,CAAC;IAED,8DAA8D;IAC9D,UAAU,CAAC,KAAU,EAAE,GAAQ,EAAE,EAAO;QACtC,EAAE,CAAC,IAAI,EAAE,KAAK,CAAC,CAAA;IACjB,CAAC;CACF;AAXD,0CAWC;AAED,SAAsB,qBAAqB,CACzC,mBAA6C,EAC7C,mBAA2B,iCAAyB;;QAEpD,IAAI,CAAC,KAAK,CACR,oDAAoD,gBAAgB,EAAE,CACvE,CAAA;QAED,MAAM,GAAG,GAAG,QAAQ,CAAC,MAAM,CAAC,KAAK,EAAE;YACjC,aAAa,EAAE,IAAA,2BAAkB,GAAE;YACnC,IAAI,EAAE,EAAC,KAAK,EAAE,gBAAgB,EAAC;SAChC,CAAC,CAAA;QAEF,iEAAiE;QACjE,GAAG,CAAC,EAAE,CAAC,OAAO,EAAE,gBAAgB,CAAC,CAAA;QACjC,GAAG,CAAC,EAAE,CAAC,SAAS,EAAE,kBAAkB,CAAC,CAAA;QACrC,GAAG,CAAC,EAAE,CAAC,QAAQ,EAAE,iBAAiB,CAAC,CAAA;QACnC,GAAG,CAAC,EAAE,CAAC,KAAK,EAAE,cAAc,CAAC,CAAA;QAE7B,KAAK,MAAM,IAAI,IAAI,mBAAmB,EAAE;YACtC,IAAI,IAAI,CAAC,UAAU,KAAK,IAAI,EAAE;gBAC5B,+CAA+C;gBAC/C,IAAI,UAAU,GAAG,IAAI,CAAC,UAAU,CAAA;gBAChC,IAAI,IAAI,CAAC,KAAK,CAAC,cAAc,EAAE,EAAE;oBAC/B,UAAU,GAAG,MAAM,IAAA,mBAAQ,EAAC,IAAI,CAAC,UAAU,CAAC,CAAA;iBAC7C;gBAED,0BAA0B;gBAC1B,GAAG,CAAC,IAAI,CAAC,UAAU,EAAE;oBACnB,IAAI,EAAE,IAAI,CAAC,eAAe;iBAC3B,CAAC,CAAA;aACH;iBAAM;gBACL,6BAA6B;gBAC7B,GAAG,CAAC,MAAM,CAAC,EAAE,EAAE,EAAC,IAAI,EAAE,IAAI,CAAC,eAAe,EAAC,CAAC,CAAA;aAC7C;SACF;QAED,MAAM,UAAU,GAAG,IAAA,2BAAkB,GAAE,CAAA;QACvC,MAAM,eAAe,GAAG,IAAI,eAAe,CAAC,UAAU,CAAC,CAAA;QAEvD,IAAI,CAAC,KAAK,CACR,kCAAkC,eAAe,CAAC,qBAAqB,EAAE,CAC1E,CAAA;QACD,IAAI,CAAC,KAAK,CACR,iCAAiC,eAAe,CAAC,qBAAqB,EAAE,CACzE,CAAA;QAED,GAAG,CAAC,IAAI,CAAC,eAAe,CAAC,CAAA;QACzB,GAAG,CAAC,QAAQ,EAAE,CAAA;QAEd,OAAO,eAAe,CAAA;IACxB,CAAC;CAAA;AAnDD,sDAmDC;AAED,8DAA8D;AAC9D,MAAM,gBAAgB,GAAG,CAAC,KAAU,EAAQ,EAAE;IAC5C,IAAI,CAAC,KAAK,CAAC,8DAA8D,CAAC,CAAA;IAC1E,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,CAAA;IAEhB,MAAM,IAAI,KAAK,CAAC,4DAA4D,CAAC,CAAA;AAC/E,CAAC,CAAA;AAED,8DAA8D;AAC9D,MAAM,kBAAkB,GAAG,CAAC,KAAU,EAAQ,EAAE;IAC9C,IAAI,KAAK,CAAC,IAAI,KAAK,QAAQ,EAAE;QAC3B,IAAI,CAAC,OAAO,CACV,wEAAwE,CACzE,CAAA;QACD,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,CAAA;KACjB;SAAM;QACL,IAAI,CAAC,OAAO,CACV,qEAAqE,KAAK,CAAC,IAAI,EAAE,CAClF,CAAA;QACD,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,CAAA;KACjB;AACH,CAAC,CAAA;AAED,MAAM,iBAAiB,GAAG,GAAS,EAAE;IACnC,IAAI,CAAC,KAAK,CAAC,qCAAqC,CAAC,CAAA;AACnD,CAAC,CAAA;AAED,MAAM,cAAc,GAAG,GAAS,EAAE;IAChC,IAAI,CAAC,KAAK,CAAC,kCAAkC,CAAC,CAAA;AAChD,CAAC,CAAA"}


@@ -0,0 +1,9 @@
The MIT License (MIT)
Copyright 2019 GitHub
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


@@ -0,0 +1,98 @@
# `@actions/github`
> A hydrated Octokit client.
## Usage
Returns an authenticated Octokit client that follows the machine [proxy settings](https://help.github.com/en/actions/hosting-your-own-runners/using-a-proxy-server-with-self-hosted-runners) and correctly sets GHES base urls. See https://octokit.github.io/rest.js for the API.
```js
const github = require('@actions/github');
const core = require('@actions/core');
async function run() {
// This should be a token with access to your repository scoped in as a secret.
// The YML workflow will need to set myToken with the GitHub Secret Token
// myToken: ${{ secrets.GITHUB_TOKEN }}
// https://help.github.com/en/actions/automating-your-workflow-with-github-actions/authenticating-with-the-github_token#about-the-github_token-secret
const myToken = core.getInput('myToken');
const octokit = github.getOctokit(myToken)
// You can also pass in additional options as a second parameter to getOctokit
// const octokit = github.getOctokit(myToken, {userAgent: "MyActionVersion1"});
const { data: pullRequest } = await octokit.rest.pulls.get({
owner: 'octokit',
repo: 'rest.js',
pull_number: 123,
mediaType: {
format: 'diff'
}
});
console.log(pullRequest);
}
run();
```
You can also make GraphQL requests. See https://github.com/octokit/graphql.js for the API.
```js
const result = await octokit.graphql(query, variables);
```
Finally, you can get the context of the current action:
```js
const github = require('@actions/github');
const context = github.context;
const newIssue = await octokit.rest.issues.create({
...context.repo,
title: 'New issue!',
body: 'Hello Universe!'
});
```
## Webhook payload typescript definitions
The npm module `@octokit/webhooks-definitions` provides type definitions for the response payloads. You can cast the payload to these types for better type information.
First, install the npm module `npm install @octokit/webhooks-definitions`
Then, assert the type based on the eventName
```ts
import * as core from '@actions/core'
import * as github from '@actions/github'
import {PushEvent} from '@octokit/webhooks-definitions/schema'
if (github.context.eventName === 'push') {
const pushPayload = github.context.payload as PushEvent
core.info(`The head commit is: ${pushPayload.head_commit}`)
}
```
## Extending the Octokit instance
`@octokit/core` now supports the [plugin architecture](https://github.com/octokit/core.js#plugins). You can extend the GitHub instance using plugins.
For example, using the `@octokit/plugin-enterprise-server` you can now access enterprise admin apis on GHES instances.
```ts
import * as core from '@actions/core'
import { GitHub, getOctokitOptions } from '@actions/github/lib/utils'
import { enterpriseServer220Admin } from '@octokit/plugin-enterprise-server'
const octokit = GitHub.plugin(enterpriseServer220Admin)
// or override some of the default values as well
// const octokit = GitHub.plugin(enterpriseServer220Admin).defaults({userAgent: "MyNewUserAgent"})
const myToken = core.getInput('myToken');
const myOctokit = new octokit(getOctokitOptions(myToken))
// Create a new user
myOctokit.rest.enterpriseAdmin.createUser({
login: "testuser",
email: "testuser@test.com",
});
```


@@ -0,0 +1,32 @@
import { WebhookPayload } from './interfaces';
export declare class Context {
/**
* Webhook payload object that triggered the workflow
*/
payload: WebhookPayload;
eventName: string;
sha: string;
ref: string;
workflow: string;
action: string;
actor: string;
job: string;
runNumber: number;
runId: number;
apiUrl: string;
serverUrl: string;
graphqlUrl: string;
/**
* Hydrate the context from the environment
*/
constructor();
get issue(): {
owner: string;
repo: string;
number: number;
};
get repo(): {
owner: string;
repo: string;
};
}
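The `Context` class hydrates all of these fields from `GITHUB_*` environment variables when it is constructed. A brief usage sketch; the values are only meaningful inside a workflow run, where those variables are set:

```ts
import {context} from '@actions/github'

// Log a few hydrated fields; `context.repo` throws unless GITHUB_REPOSITORY
// (in "owner/repo" form) or a repository payload is available.
console.log(`event=${context.eventName} sha=${context.sha} run=${context.runId}`)
const {owner, repo} = context.repo
console.log(`repository: ${owner}/${repo}`)
```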


@@ -0,0 +1,54 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.Context = void 0;
const fs_1 = require("fs");
const os_1 = require("os");
class Context {
/**
* Hydrate the context from the environment
*/
constructor() {
var _a, _b, _c;
this.payload = {};
if (process.env.GITHUB_EVENT_PATH) {
if (fs_1.existsSync(process.env.GITHUB_EVENT_PATH)) {
this.payload = JSON.parse(fs_1.readFileSync(process.env.GITHUB_EVENT_PATH, { encoding: 'utf8' }));
}
else {
const path = process.env.GITHUB_EVENT_PATH;
process.stdout.write(`GITHUB_EVENT_PATH ${path} does not exist${os_1.EOL}`);
}
}
this.eventName = process.env.GITHUB_EVENT_NAME;
this.sha = process.env.GITHUB_SHA;
this.ref = process.env.GITHUB_REF;
this.workflow = process.env.GITHUB_WORKFLOW;
this.action = process.env.GITHUB_ACTION;
this.actor = process.env.GITHUB_ACTOR;
this.job = process.env.GITHUB_JOB;
this.runNumber = parseInt(process.env.GITHUB_RUN_NUMBER, 10);
this.runId = parseInt(process.env.GITHUB_RUN_ID, 10);
this.apiUrl = (_a = process.env.GITHUB_API_URL) !== null && _a !== void 0 ? _a : `https://api.github.com`;
this.serverUrl = (_b = process.env.GITHUB_SERVER_URL) !== null && _b !== void 0 ? _b : `https://github.com`;
this.graphqlUrl = (_c = process.env.GITHUB_GRAPHQL_URL) !== null && _c !== void 0 ? _c : `https://api.github.com/graphql`;
}
get issue() {
const payload = this.payload;
return Object.assign(Object.assign({}, this.repo), { number: (payload.issue || payload.pull_request || payload).number });
}
get repo() {
if (process.env.GITHUB_REPOSITORY) {
const [owner, repo] = process.env.GITHUB_REPOSITORY.split('/');
return { owner, repo };
}
if (this.payload.repository) {
return {
owner: this.payload.repository.owner.login,
repo: this.payload.repository.name
};
}
throw new Error("context.repo requires a GITHUB_REPOSITORY environment variable like 'owner/repo'");
}
}
exports.Context = Context;
//# sourceMappingURL=context.js.map


@@ -0,0 +1 @@
{"version":3,"file":"context.js","sourceRoot":"","sources":["../src/context.ts"],"names":[],"mappings":";;;AAEA,2BAA2C;AAC3C,2BAAsB;AAEtB,MAAa,OAAO;IAmBlB;;OAEG;IACH;;QACE,IAAI,CAAC,OAAO,GAAG,EAAE,CAAA;QACjB,IAAI,OAAO,CAAC,GAAG,CAAC,iBAAiB,EAAE;YACjC,IAAI,eAAU,CAAC,OAAO,CAAC,GAAG,CAAC,iBAAiB,CAAC,EAAE;gBAC7C,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC,KAAK,CACvB,iBAAY,CAAC,OAAO,CAAC,GAAG,CAAC,iBAAiB,EAAE,EAAC,QAAQ,EAAE,MAAM,EAAC,CAAC,CAChE,CAAA;aACF;iBAAM;gBACL,MAAM,IAAI,GAAG,OAAO,CAAC,GAAG,CAAC,iBAAiB,CAAA;gBAC1C,OAAO,CAAC,MAAM,CAAC,KAAK,CAAC,qBAAqB,IAAI,kBAAkB,QAAG,EAAE,CAAC,CAAA;aACvE;SACF;QACD,IAAI,CAAC,SAAS,GAAG,OAAO,CAAC,GAAG,CAAC,iBAA2B,CAAA;QACxD,IAAI,CAAC,GAAG,GAAG,OAAO,CAAC,GAAG,CAAC,UAAoB,CAAA;QAC3C,IAAI,CAAC,GAAG,GAAG,OAAO,CAAC,GAAG,CAAC,UAAoB,CAAA;QAC3C,IAAI,CAAC,QAAQ,GAAG,OAAO,CAAC,GAAG,CAAC,eAAyB,CAAA;QACrD,IAAI,CAAC,MAAM,GAAG,OAAO,CAAC,GAAG,CAAC,aAAuB,CAAA;QACjD,IAAI,CAAC,KAAK,GAAG,OAAO,CAAC,GAAG,CAAC,YAAsB,CAAA;QAC/C,IAAI,CAAC,GAAG,GAAG,OAAO,CAAC,GAAG,CAAC,UAAoB,CAAA;QAC3C,IAAI,CAAC,SAAS,GAAG,QAAQ,CAAC,OAAO,CAAC,GAAG,CAAC,iBAA2B,EAAE,EAAE,CAAC,CAAA;QACtE,IAAI,CAAC,KAAK,GAAG,QAAQ,CAAC,OAAO,CAAC,GAAG,CAAC,aAAuB,EAAE,EAAE,CAAC,CAAA;QAC9D,IAAI,CAAC,MAAM,SAAG,OAAO,CAAC,GAAG,CAAC,cAAc,mCAAI,wBAAwB,CAAA;QACpE,IAAI,CAAC,SAAS,SAAG,OAAO,CAAC,GAAG,CAAC,iBAAiB,mCAAI,oBAAoB,CAAA;QACtE,IAAI,CAAC,UAAU,SACb,OAAO,CAAC,GAAG,CAAC,kBAAkB,mCAAI,gCAAgC,CAAA;IACtE,CAAC;IAED,IAAI,KAAK;QACP,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAE5B,uCACK,IAAI,CAAC,IAAI,KACZ,MAAM,EAAE,CAAC,OAAO,CAAC,KAAK,IAAI,OAAO,CAAC,YAAY,IAAI,OAAO,CAAC,CAAC,MAAM,IAClE;IACH,CAAC;IAED,IAAI,IAAI;QACN,IAAI,OAAO,CAAC,GAAG,CAAC,iBAAiB,EAAE;YACjC,MAAM,CAAC,KAAK,EAAE,IAAI,CAAC,GAAG,OAAO,CAAC,GAAG,CAAC,iBAAiB,CAAC,KAAK,CAAC,GAAG,CAAC,CAAA;YAC9D,OAAO,EAAC,KAAK,EAAE,IAAI,EAAC,CAAA;SACrB;QAED,IAAI,IAAI,CAAC,OAAO,CAAC,UAAU,EAAE;YAC3B,OAAO;gBACL,KAAK,EAAE,IAAI,CAAC,OAAO,CAAC,UAAU,CAAC,KAAK,CAAC,KAAK;gBAC1C,IAAI,EAAE,IAAI,CAAC,OAAO,CAAC,UAAU,CAAC,IAAI;aACnC,CAAA;SACF;QAED,MAAM,IAAI,KAAK,CACb,kFAAkF,CACnF,CAAA;IACH,CAAC;CACF;AA3ED,0BA2EC"}


@@ -0,0 +1,11 @@
import * as Context from './context';
import { GitHub } from './utils';
import { OctokitOptions, OctokitPlugin } from '@octokit/core/dist-types/types';
export declare const context: Context.Context;
/**
* Returns a hydrated octokit ready to use for GitHub Actions
*
* @param token the repo PAT or GITHUB_TOKEN
* @param options other options to set
*/
export declare function getOctokit(token: string, options?: OctokitOptions, ...additionalPlugins: OctokitPlugin[]): InstanceType<typeof GitHub>;


@@ -0,0 +1,37 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k in mod) if (k !== "default" && Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
__setModuleDefault(result, mod);
return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.getOctokit = exports.context = void 0;
const Context = __importStar(require("./context"));
const utils_1 = require("./utils");
exports.context = new Context.Context();
/**
* Returns a hydrated octokit ready to use for GitHub Actions
*
* @param token the repo PAT or GITHUB_TOKEN
* @param options other options to set
*/
function getOctokit(token, options, ...additionalPlugins) {
const GitHubWithPlugins = utils_1.GitHub.plugin(...additionalPlugins);
return new GitHubWithPlugins(utils_1.getOctokitOptions(token, options));
}
exports.getOctokit = getOctokit;
//# sourceMappingURL=github.js.map


@@ -0,0 +1 @@
{"version":3,"file":"github.js","sourceRoot":"","sources":["../src/github.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;AAAA,mDAAoC;AACpC,mCAAiD;AAKpC,QAAA,OAAO,GAAG,IAAI,OAAO,CAAC,OAAO,EAAE,CAAA;AAE5C;;;;;GAKG;AACH,SAAgB,UAAU,CACxB,KAAa,EACb,OAAwB,EACxB,GAAG,iBAAkC;IAErC,MAAM,iBAAiB,GAAG,cAAM,CAAC,MAAM,CAAC,GAAG,iBAAiB,CAAC,CAAA;IAC7D,OAAO,IAAI,iBAAiB,CAAC,yBAAiB,CAAC,KAAK,EAAE,OAAO,CAAC,CAAC,CAAA;AACjE,CAAC;AAPD,gCAOC"}


@@ -0,0 +1,40 @@
export interface PayloadRepository {
[key: string]: any;
full_name?: string;
name: string;
owner: {
[key: string]: any;
login: string;
name?: string;
};
html_url?: string;
}
export interface WebhookPayload {
[key: string]: any;
repository?: PayloadRepository;
issue?: {
[key: string]: any;
number: number;
html_url?: string;
body?: string;
};
pull_request?: {
[key: string]: any;
number: number;
html_url?: string;
body?: string;
};
sender?: {
[key: string]: any;
type: string;
};
action?: string;
installation?: {
id: number;
[key: string]: any;
};
comment?: {
id: number;
[key: string]: any;
};
}


@@ -0,0 +1,4 @@
"use strict";
/* eslint-disable @typescript-eslint/no-explicit-any */
Object.defineProperty(exports, "__esModule", { value: true });
//# sourceMappingURL=interfaces.js.map


@@ -0,0 +1 @@
{"version":3,"file":"interfaces.js","sourceRoot":"","sources":["../src/interfaces.ts"],"names":[],"mappings":";AAAA,uDAAuD"}


@@ -0,0 +1,6 @@
/// <reference types="node" />
import * as http from 'http';
import { OctokitOptions } from '@octokit/core/dist-types/types';
export declare function getAuthString(token: string, options: OctokitOptions): string | undefined;
export declare function getProxyAgent(destinationUrl: string): http.Agent;
export declare function getApiBaseUrl(): string;


@@ -0,0 +1,43 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k in mod) if (k !== "default" && Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
__setModuleDefault(result, mod);
return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.getApiBaseUrl = exports.getProxyAgent = exports.getAuthString = void 0;
const httpClient = __importStar(require("@actions/http-client"));
function getAuthString(token, options) {
if (!token && !options.auth) {
throw new Error('Parameter token or opts.auth is required');
}
else if (token && options.auth) {
throw new Error('Parameters token and opts.auth may not both be specified');
}
return typeof options.auth === 'string' ? options.auth : `token ${token}`;
}
exports.getAuthString = getAuthString;
function getProxyAgent(destinationUrl) {
const hc = new httpClient.HttpClient();
return hc.getAgent(destinationUrl);
}
exports.getProxyAgent = getProxyAgent;
function getApiBaseUrl() {
return process.env['GITHUB_API_URL'] || 'https://api.github.com';
}
exports.getApiBaseUrl = getApiBaseUrl;
//# sourceMappingURL=utils.js.map


@@ -0,0 +1 @@
{"version":3,"file":"utils.js","sourceRoot":"","sources":["../../src/internal/utils.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;AACA,iEAAkD;AAGlD,SAAgB,aAAa,CAC3B,KAAa,EACb,OAAuB;IAEvB,IAAI,CAAC,KAAK,IAAI,CAAC,OAAO,CAAC,IAAI,EAAE;QAC3B,MAAM,IAAI,KAAK,CAAC,0CAA0C,CAAC,CAAA;KAC5D;SAAM,IAAI,KAAK,IAAI,OAAO,CAAC,IAAI,EAAE;QAChC,MAAM,IAAI,KAAK,CAAC,0DAA0D,CAAC,CAAA;KAC5E;IAED,OAAO,OAAO,OAAO,CAAC,IAAI,KAAK,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,SAAS,KAAK,EAAE,CAAA;AAC3E,CAAC;AAXD,sCAWC;AAED,SAAgB,aAAa,CAAC,cAAsB;IAClD,MAAM,EAAE,GAAG,IAAI,UAAU,CAAC,UAAU,EAAE,CAAA;IACtC,OAAO,EAAE,CAAC,QAAQ,CAAC,cAAc,CAAC,CAAA;AACpC,CAAC;AAHD,sCAGC;AAED,SAAgB,aAAa;IAC3B,OAAO,OAAO,CAAC,GAAG,CAAC,gBAAgB,CAAC,IAAI,wBAAwB,CAAA;AAClE,CAAC;AAFD,sCAEC"}


@@ -0,0 +1,15 @@
import * as Context from './context';
import { Octokit } from '@octokit/core';
import { OctokitOptions } from '@octokit/core/dist-types/types';
export declare const context: Context.Context;
export declare const defaults: OctokitOptions;
export declare const GitHub: typeof Octokit & import("@octokit/core/dist-types/types").Constructor<import("@octokit/plugin-rest-endpoint-methods/dist-types/types").Api & {
paginate: import("@octokit/plugin-paginate-rest").PaginateInterface;
}>;
/**
* Convenience function to correctly format Octokit Options to pass into the constructor.
*
* @param token the repo PAT or GITHUB_TOKEN
* @param options other options to set
*/
export declare function getOctokitOptions(token: string, options?: OctokitOptions): OctokitOptions;


@@ -0,0 +1,54 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k in mod) if (k !== "default" && Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
__setModuleDefault(result, mod);
return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.getOctokitOptions = exports.GitHub = exports.defaults = exports.context = void 0;
const Context = __importStar(require("./context"));
const Utils = __importStar(require("./internal/utils"));
// octokit + plugins
const core_1 = require("@octokit/core");
const plugin_rest_endpoint_methods_1 = require("@octokit/plugin-rest-endpoint-methods");
const plugin_paginate_rest_1 = require("@octokit/plugin-paginate-rest");
exports.context = new Context.Context();
const baseUrl = Utils.getApiBaseUrl();
exports.defaults = {
baseUrl,
request: {
agent: Utils.getProxyAgent(baseUrl)
}
};
exports.GitHub = core_1.Octokit.plugin(plugin_rest_endpoint_methods_1.restEndpointMethods, plugin_paginate_rest_1.paginateRest).defaults(exports.defaults);
/**
* Convenience function to correctly format Octokit Options to pass into the constructor.
*
* @param token the repo PAT or GITHUB_TOKEN
* @param options other options to set
*/
function getOctokitOptions(token, options) {
const opts = Object.assign({}, options || {}); // Shallow clone - don't mutate the object provided by the caller
// Auth
const auth = Utils.getAuthString(token, opts);
if (auth) {
opts.auth = auth;
}
return opts;
}
exports.getOctokitOptions = getOctokitOptions;
//# sourceMappingURL=utils.js.map


@@ -0,0 +1 @@
{"version":3,"file":"utils.js","sourceRoot":"","sources":["../src/utils.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;AAAA,mDAAoC;AACpC,wDAAyC;AAEzC,oBAAoB;AACpB,wCAAqC;AAErC,wFAAyE;AACzE,wEAA0D;AAE7C,QAAA,OAAO,GAAG,IAAI,OAAO,CAAC,OAAO,EAAE,CAAA;AAE5C,MAAM,OAAO,GAAG,KAAK,CAAC,aAAa,EAAE,CAAA;AACxB,QAAA,QAAQ,GAAmB;IACtC,OAAO;IACP,OAAO,EAAE;QACP,KAAK,EAAE,KAAK,CAAC,aAAa,CAAC,OAAO,CAAC;KACpC;CACF,CAAA;AAEY,QAAA,MAAM,GAAG,cAAO,CAAC,MAAM,CAClC,kDAAmB,EACnB,mCAAY,CACb,CAAC,QAAQ,CAAC,gBAAQ,CAAC,CAAA;AAEpB;;;;;GAKG;AACH,SAAgB,iBAAiB,CAC/B,KAAa,EACb,OAAwB;IAExB,MAAM,IAAI,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,OAAO,IAAI,EAAE,CAAC,CAAA,CAAC,iEAAiE;IAE/G,OAAO;IACP,MAAM,IAAI,GAAG,KAAK,CAAC,aAAa,CAAC,KAAK,EAAE,IAAI,CAAC,CAAA;IAC7C,IAAI,IAAI,EAAE;QACR,IAAI,CAAC,IAAI,GAAG,IAAI,CAAA;KACjB;IAED,OAAO,IAAI,CAAA;AACb,CAAC;AAbD,8CAaC"}


@@ -0,0 +1,49 @@
{
"name": "@actions/github",
"version": "5.1.1",
"description": "Actions github lib",
"keywords": [
"github",
"actions"
],
"homepage": "https://github.com/actions/toolkit/tree/main/packages/github",
"license": "MIT",
"main": "lib/github.js",
"types": "lib/github.d.ts",
"directories": {
"lib": "lib",
"test": "__tests__"
},
"files": [
"lib",
"!.DS_Store"
],
"publishConfig": {
"access": "public"
},
"repository": {
"type": "git",
"url": "git+https://github.com/actions/toolkit.git",
"directory": "packages/github"
},
"scripts": {
"audit-moderate": "npm install && npm audit --json --audit-level=moderate > audit.json",
"test": "jest",
"build": "tsc",
"format": "prettier --write **/*.ts",
"format-check": "prettier --check **/*.ts",
"tsc": "tsc"
},
"bugs": {
"url": "https://github.com/actions/toolkit/issues"
},
"dependencies": {
"@actions/http-client": "^2.0.1",
"@octokit/core": "^3.6.0",
"@octokit/plugin-paginate-rest": "^2.17.0",
"@octokit/plugin-rest-endpoint-methods": "^5.13.0"
},
"devDependencies": {
"proxy": "^1.0.2"
}
}

Some files were not shown because too many files have changed in this diff.