Bazel release-specific documentation

Is there any place that has the documentation for a specific Bazel release? In particular, the documentation at https://docs.bazel.build/versions/master/bazel-overview.html is not appropriate, because it is updated from master and thus may not be consistent with the version of Bazel that I am running. The URL structure looks like it would support more than one version (by replacing master), but the obvious replacements of 0.5.2 or release don't work.
My use case is reporting documentation inconsistencies against the actual code, so I need documentation that is intended to be consistent with a release, not documentation of as-yet-unreleased features (e.g. https://github.com/bazelbuild/bazel/issues/3016).

Related

What is a "jobset" in the parlance of the Hydra continuous integration tool?

I found definitions (see below) and how they are usually used (i.e., a Hydra jobset is typically tracking a git branch), but I still couldn't figure out what they are in the general sense. Maybe you could explain it in layman's terms with specific examples?
Eelco Dolstra, Eelco Visser: "Hydra: A Declarative Approach to Continuous Integration"
a specification of the location of a Nix expression, along with possible values for the function arguments of the jobs defined by the Nix expression.
NixOS Wiki: "Hydra"
Job Set
A list of jobs which will be run. Often a Jobset fits to a certain branch (master, staging, stable). A jobset is defined by its inputs and will trigger if these inputs change, e.g. like a new commit onto a branch is added. Job sets may depend on each other
Hydra User's Guide
3.2 Job Sets
A project can consist of multiple job sets (hereafter jobsets), separate tasks that can be built separately, but may depend on each other (without cyclic dependencies, of course).
My question may seem pointless after listing all these definitions, but here's an example to demonstrate my confusion: I looked at the project listing at https://hydra.nixos.org/ and I was under the impression that a project is a channel and jobsets are the branches in a repo. (I know, there is no mention of "channel" in there, and the channel page even says that "Nix channels are not supported by this Hydra server." :)
I could fool myself with that when looking at the Hydra project, but this argument fell apart when I clicked on the flakes one (that is, I couldn't find a supporting GitHub repo, and the jobset names generally didn't feel like branch names).
Also, in the Dolstra/Visser paper, Hydra was set up using SVN; I don't know if SVN even uses branches (mostly because the paper didn't mention them) but this does prove that Hydra can be set up with VCS/SCM other than git where the underlying concepts can be fundamentally different. Again, I could easily be wrong.
I think I found the best definition in the flox documentation:
Within channels, jobsets allow flox to build packages against multiple versions of dependencies simultaneously.
flox is a framework built around the Nix ecosystem to use Nix without having to install it; it seamlessly relays Nix commands to a remote Nix store and Hydra build farm as if they were issued on the local machine, so you consistently get the same results everywhere flox is installed.
If I understand this correctly,
the stable jobset would then specify as inputs the latest stable releases of all the dependencies of the channel's packages (e.g., from their release branch)
staging / unstable would take the latest revision of a development branch (e.g., master / main) of the dependencies (a.k.a., bleeding edge)
and so on.
Note to self: jobset specification applies to every package in a channel.
Some lingering questions if the above is correct:
What does a jobset specification look like? For example, would stable take the latest commits (i.e., HEAD) of release branches, whereas staging/unstable would do the same but for master/main (or other development) branches?
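Hydra's "declarative projects" feature gives a concrete answer to what a specification can look like as data; here is a sketch modelled on the examples in the Hydra manual (the description, repository URLs and branch names are made up):
{
  "enabled": 1,
  "hidden": false,
  "description": "stable: build against release branches",
  "nixexprinput": "src",
  "nixexprpath": "release.nix",
  "checkinterval": 300,
  "schedulingshares": 100,
  "enableemail": false,
  "emailoverride": "",
  "keepnr": 3,
  "inputs": {
    "src":     { "type": "git", "value": "https://example.org/my-channel.git release", "emailresponsible": false },
    "nixpkgs": { "type": "git", "value": "https://github.com/NixOS/nixpkgs.git nixos-23.11", "emailresponsible": false }
  }
}
A staging/unstable jobset would presumably be the same specification with the inputs pointing at master/main instead of the release branches.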

What tests should be run in preparation for making contributions to Bazel?

I am preparing to make a minor bug fix to Bazel's Java code. I am working on a Linux distribution.
I am following the instructions at https://bazel.build/contributing.html, but I run into problems with two of the test instructions:
In the section about "Compiling bazel", the third paragraph states: "In addition to the Bazel binary, you might want to build the various tools Bazel uses. They are located in //src/java_tools/..., //src/objc_tools/... and //src/tools/... and their directories contain README files describing their respective utility." If I follow this, //src/tools/... fails because there is no xcrun command in the Linux environment I am using. I suppose these are macOS platform-specific tests?
The next paragraph instructs you to build a distribution package, unpack it in a new directory, and then run: "bazel test //src/... //third_party/ijar/...". I now get an error that windows.h is missing, which I suppose comes from Windows platform-specific tests.
Some questions:
So is there an easy way to run tests only for the current platform?
Are the instructions good enough?
If the instructions should be updated, what is the best way to notify the ones managing that documentation page?
Thanks for your interest in contributing to Bazel! The bazel-dev mailing list is a better avenue for these questions.
The tests that you want to run largely depend on the changes you make, but when you make a pull request, the Bazel CI will run all of Bazel's tests to make sure that nothing breaks.
So is there an easy way to run tests only for the current platform?
It depends, and this is still a work in progress where we want to make Bazel more aware of platforms and toolchains without specifying additional flags.
In general, you don't need to modify or worry about the //src/*_tools packages unless you're making direct changes to them.
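For example (just a sketch, not an official recipe; the target patterns are guesses based on the repository layout), you can start by testing only the area your change touches, and use --keep_going if you do run the broad suite so that targets that cannot build on your platform don't stop the run:
# Run the tests closest to the code being changed, e.g. the Java tests:
bazel test //src/test/java/...
# Or run the broad suite and continue past targets that fail to build on this platform:
bazel test --keep_going //src/... //third_party/ijar/...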
Are the instructions good enough?
The instructions will never be perfect, and we're always looking for ways to make them clearer and more concise.
If the instructions should be updated, what is the best way to notify the ones managing that documentation page?
Please file an issue on the GitHub repository or email the bazel-dev mailing list for further discussion.

I have not changed requirements in my Divio project, so why does the build fail with a dependency conflict?

The last time I deployed the project, the build worked perfectly.
In the meantime I have changed nothing that would affect the pip requirements, yet I get an error when building:
Could not find a version that matches Django<1.10,<1.10.999,<1.11,
<1.12,<1.9.999,<2,<2.0,==1.9.13,>1.3,>=1.11,>=1.3,>=1.4,>=1.4.10,
>=1.4.2,>=1.5,>=1.6,>=1.7,>=1.8
I get the same error when building the project locally with docker-compose build web.
What could be the problem?
The problem here is that although you may not have modified any requirements, the dependencies of a project can sometimes change on their own.
You may even have pinned all of your own requirements (which is generally a good idea) but that still won't help if one of them itself has an unpinned dependency.
Anywhere an unpinned dependency exists, you can run into this.
Here's an example. Suppose your requirements.in contains super-django==1.2.4. That's better than simply specifying super-django, as you won't be taken by surprise if a new, incompatible version of the Super Django package is released.
But suppose that in turn Super Django 1.2.4, in its requirements, lists:
Django==1.11
django-super-admin
If a new version of Django Super Admin is released that requires, say, Django>=2.0, your next build will fail because of the mutually exclusive requirements.
To track down the culprit when you run into such a failure, you need to examine the build logs. There you'll see something like:
Could not find a version that matches Django==1.11,>=2.0 [etc].
So now you know to look back through the logs to find what is wanting to install Django>=2.0, and you'll find:
adding Django>=2.0
from django-super-admin==1.7.0
So now you know that it's django-super-admin==1.7.0 that is the key. Since you can't trust super-django to pin the correct version of django-super-admin, you'll have to do it yourself, by adding django-super-admin<1.7.0 to the requirements.in of your project.
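Using the hypothetical package names from this example, the fix in requirements.in would look like:
super-django==1.2.4
django-super-admin<1.7.0  # pinned so that Django==1.11 remains installable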
There's more information about this at How to identify and resolve a dependency conflict.
You can also Pin all of your project’s Python dependencies to ensure this never happens again with any other dependency, though you sacrifice some flexibility for the guarantee.
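One way to achieve that kind of pinning (a sketch assuming the pip-tools workflow; the Divio article linked above describes the recommended approach for Divio projects) is to compile requirements.in into a fully pinned requirements.txt and commit the result:
pip install pip-tools
pip-compile requirements.in  # writes requirements.txt with every transitive dependency pinned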
Note: I am a member of the Divio team. This question is one that we see quite regularly via our support channels.

CocoaPods - Is it possible to customise 3rd party iOS libraries?

I'm writing a project that uses a range of 3rd party iOS libraries (e.g. TWStatus, BSKeyChain), some of which I've modified (up to 10% custom code) in ways that wouldn't be suitable for the original GitHub projects.
Thinking about future updates from the library authors, I came across CocoaPods today, which looks to be a good dependency manager in the same vein as Bundler or Vundle (for Vim).
I'm curious whether it's possible for my custom code to coexist with future changes by the library authors when using CocoaPods?
CocoaPods by default pulls in a library as source, along with instructions to build that library. (There are some exceptions for closed-source libs.) This makes it easy to debug issues and test changes within the context of your project.
Having decided on a change, you should do one of the following:
Submit the change back to the upstream library. You'll get the benefit of easily being able to upgrade to new versions that include all of the testing provided by the wider community. The easiest way to do this is to fork the library, apply the change, and send a pull request.
If your changes aren't really of benefit to other users of the library, you can just fork the library, without submitting the change back up to master. If the license permits you can maintain a private fork, and still resolve it from CocoaPods. To do this:
Podfile:
# As long as the podspec is published at the root of the git repo, this works:
pod 'MyFantasticLib', :git => 'https://github.com/dogue/Doguetastic.git'
# ...and you can use this lib while you're waiting for your pull request to be approved.
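If you want the fork pinned to a known state rather than its default branch, the same line can reference a branch, tag or commit (the fork URL and names below are made up):
pod 'MyFantasticLib', :git => 'https://github.com/yourname/MyFantasticLib.git', :branch => 'my-customisations'
pod 'MyFantasticLib', :git => 'https://github.com/yourname/MyFantasticLib.git', :tag => 'v1.2.4-custom'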
You could also consider setting up a private spec repo.
The first option is usually the best.
So, to be clear:
You've made changes to a dependency.
You also want to keep those dependencies up to date.
--
Seems to me this would be a pain. You'd need to merge the new stuff into your own version for every update, for every pod/dependency.
However, if you've made changes... do you really need to? Consider using the adapter or façade pattern to write a wrapper around them, and then you can update them as much as you like (assuming their interface doesn't change).
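For instance, a minimal façade in Swift (all names here are made up, and the call into the wrapped pod is left as a comment) keeps the third-party API confined to a single file:
import Foundation

// The app talks only to StatusBanner; if the wrapped pod's API changes,
// this is the only file that needs to change.
final class StatusBanner {
    func show(_ message: String) {
        // Call into the wrapped/customised pod here.
        print("showing: \(message)")
    }

    func dismiss() {
        // And its dismiss call here.
        print("dismissed")
    }
}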
Just some thoughts

Can I use a git SHA for the vsn field in an Erlang application?

What are the requirements for the vsn key in an Erlang application?
The Erlang/OTP documentation simply says:
Version number, a string. Defaults to "".
Is there any required ordering between versions? If I use a git SHA, will I still be able to use relups or appups?
To rephrase:
Is there anything in Erlang/OTP that requires a well-defined partial or total ordering in the vsn key?
The version can be any string, but with your idea I see two problems:
You will lose the ability to compare versions easily: you would need to maintain a catalog of all existing versions just to know whether one version is older than another (though that information is accessible via git).
Because git works at the project level, you cannot know the SHA in advance, and since the app file is part of the project, you cannot fill in the version before committing unless the app file is kept outside the git repository, which is not really practical.
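If the goal is simply to stamp each build with its git revision, one way around that chicken-and-egg problem is to let the build tool fill in vsn at build time rather than committing it. A sketch, assuming rebar3 (the application name is made up; rebar3 documents deriving the version from git metadata):
%% src/myapp.app.src
{application, myapp,
 [{description, "Example application"},
  {vsn, git},  %% rebar3 replaces this at build time with a version derived from git (e.g. `git describe`)
  {registered, []},
  {applications, [kernel, stdlib]},
  {env, []}
 ]}.
Note that a git-derived version string still has no meaningful ordering, so the first point above about comparing versions still applies.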
