
Add rules_pkg 1.0.1 (was 1.0.0) #2180

Merged
merged 9 commits into bazelbuild:main
Jul 9, 2024

Conversation

aiuto
Contributor

@aiuto commented Jun 3, 2024

Add rules_pkg 1.0.1

1.0.0 was a failed attempt because of a dependency loop.

@bazel-io
Member

bazel-io commented Jun 3, 2024

Hello @bazelbuild/bcr-maintainers, modules without existing maintainers (rules_pkg) have been updated in this PR. Please review the changes.

@hunshcn

hunshcn commented Jun 11, 2024

ping @aiuto

@Wyverald added the presubmit-auto-run label (presubmit jobs will be triggered for new changes automatically without reviewer's approval) on Jun 11, 2024
@Wyverald
Member

checks are red

@aiuto
Contributor Author

aiuto commented Jun 11, 2024

I have no idea how to fix what is broken.

So, here is the obvious problem with enforcing this check:

  • If I bump protobuf to depend on the new version, that will fail, because this PR is not submitted yet.
  • If I include protobuf in rules_pkg/MODULE.bazel, that won't help, because we'll still be at mismatched compatibility levels.

Opened #2232 to track solutions.

@aiuto
Contributor Author

aiuto commented Jun 12, 2024

Sigh. Each test iteration requires a new release of rules_pkg, because the MODULE.bazel checked into this repo must match the one in the downloaded archive. The error explains that:

```
BcrValidationResult.FAILED: Checked in MODULE.bazel file doesn't match the one in the extracted and patched sources.
Please fix the MODULE.bazel file or you can add the following patch to rules_pkg@1.0.0:
--- MODULE.bazel
+++ MODULE.bazel
@@ -9,6 +9,10 @@
 bazel_dep(name = "rules_license", version = "0.0.7")
 bazel_dep(name = "rules_python", version = "0.31.0")
 bazel_dep(name = "bazel_skylib", version = "1.4.2")
+
+# This is just to please module compatibility_level checking. We should not
+# specify it ourselves, but use what rules_python expects
+bazel_dep(name = "rules_proto", version = "6.0.0-rc3", dev_dependency = True)
 
 # Only for development
 bazel_dep(name = "platforms", version = "0.0.9", dev_dependency = True)
```
If we are going to do the test, then don't make us pre-create the release. We could do the entire BCR submission directly from the tarball.

I've run out of time to work on this for a few weeks. My thought is that I can keep making fake tarballs pointing to a special tag, and if the submit finally works I can create a 1.0.1 release with the new MODULE.bazel.
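
In registry terms, each such iteration amounts to re-pointing the module entry's source.json at a fresh archive and recomputing its integrity hash. A minimal sketch, assuming the standard BCR entry layout (the tag and hash here are placeholders, and other fields are elided):

```json
{
    "url": "https://github.com/bazelbuild/rules_pkg/releases/download/<test-tag>/rules_pkg-<test-tag>.tar.gz",
    "integrity": "sha256-<recomputed for each new tarball>"
}
```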

@alexeagle
Contributor

Instead of releasing rules_pkg, you can just apply the prospective fixes here as patches. Once it's green you can either submit this entry with the patches, or bake those patches into rules_pkg for one release cycle you already know will pass.
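
For reference, a minimal sketch of what that could look like in the registry entry, assuming the standard BCR layout (the patch file name and both integrity values are placeholders): the prospective fix is checked in as a patch file under modules/rules_pkg/1.0.1/patches/ and registered in the entry's source.json.

```json
{
    "url": "https://github.com/bazelbuild/rules_pkg/releases/download/1.0.1/rules_pkg-1.0.1.tar.gz",
    "integrity": "sha256-<archive hash>",
    "patches": {
        "module_dot_bazel.patch": "sha256-<patch hash>"
    },
    "patch_strip": 1
}
```

Once presubmit is green, the entry can land with the patch in place, or the same change can be baked into the next rules_pkg release and the patch dropped.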

aiuto added 4 commits July 9, 2024 14:28
This is not a true dependency. We do it to fake out the transitive dependency checking for compatibility.
What we really want is the dependency from rules_python to win.
@bazel-io
Member

bazel-io commented Jul 9, 2024

Hello @bazelbuild/bcr-maintainers, modules without existing maintainers (rules_pkg) have been updated in this PR. Please review the changes.

@aiuto changed the title from "Add rules_pkg 1.0.0" to "Add rules_pkg 1.0.1 (was 1.0.0)" on Jul 9, 2024
@Wyverald merged commit 69e58a4 into bazelbuild:main on Jul 9, 2024
20 checks passed