Appending to $PATH vs using aliases: Which is better?

In at least some cases, aliases and adding $PATH locations can be used interchangeably. For example, looking at the Python tool couchapp, I need to either alias the executable (as helpfully described here) or make the executable available via $PATH.
These are the two lines that can achieve this:
alias couchapp="~/Library/Python/2.7/bin/couchapp"
OR
export PATH=$PATH:~/Library/Python/2.7/bin/
Is one of these two options definitively better? Why or why not?

An alias is a shell feature: any environment that invokes utilities directly, without involving a shell, will not see aliases.
Note: Even when calling shell commands from languages such as Python (using, e.g., os.system()), user-specific shell initialization files are typically not sourced, so user-specific aliases still won't be visible.
A directory added to the $PATH environment variable is respected by any process that tries to invoke an executable by mere filename, whether via a shell or not.
Similarly, this assumes that the calling process actually sees the $PATH additions of interest: additions made in user-specific shell initialization files are typically not seen unless the calling process was launched from an interactive shell.
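A quick way to see the difference from an interactive bash session (a minimal sketch, assuming the couchapp paths from the question):
alias couchapp="$HOME/Library/Python/2.7/bin/couchapp"
bash -c 'couchapp'    # fails: the child shell never sees the alias
export PATH="$PATH:$HOME/Library/Python/2.7/bin"
bash -c 'couchapp'    # works: the exported $PATH is inherited by the child process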
Lookup cost
If you know that a shell will be involved in invoking your utility, you can keep overhead down by defining aliases that invoke your executables by their full path.
Of course, you need to do this for each executable you want to be callable by name only.
By contrast, adding directories to the $PATH variable potentially increases the overhead of locating a given executable by mere filename, because all directories listed must be searched one by one until one containing an executable by the specified name is found (if any).
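For example, one alias per tool, each resolved with no directory search at all (a sketch; the second tool name is hypothetical):
alias couchapp="$HOME/Library/Python/2.7/bin/couchapp"
alias couchpy="$HOME/Library/Python/2.7/bin/couchpy"    # hypothetical second executable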
Precedence
If a shell is involved, aliases take precedence over $PATH lookups.
Of course, later alias definitions can override earlier ones.
If no shell is involved or no alias by a given name exists, $PATH lookups happen in the order in which the directories are listed in the variable.
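In bash, type -a makes this order visible (assuming both an alias and a $PATH entry for couchapp exist):
type -a couchapp    # prints the alias first, then any $PATH matches in directory order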

As your example shows, $PATH lets you cover all of the executables in that location with a single line. For that reason I use the latter option. You can also chain many $PATH statements together, allowing you to easily add many more locations of "executables" from the command line.
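For instance (a sketch; the second directory is hypothetical):
export PATH="$PATH:$HOME/Library/Python/2.7/bin"
export PATH="$PATH:$HOME/tools/bin"    # each additional line appends one more location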
If for some reason you do not want to make all of the executables available, an alias would be better.

Related

Removing nonexistent path variables automatically

I've accumulated a lot of entries in my user and system Path variables. I'm sure some of them don't even exist anymore, so I'm going to check them one by one. But is there an automated way to do it?
There is no native Windows function to perform such a purge.
You would need to write a script which would:
split the %PATH%, as described in "How can I use a .bat file to remove specific tokens from the PATH environment variable?"
rebuild the string from only the folders that still exist
write the result back with setx PATH "<new string>"
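A minimal .bat sketch of those steps, assuming plain directory entries with no quoting edge cases (see the linked answer for more careful token handling):
@echo off
setlocal EnableDelayedExpansion
set "NEWPATH="
rem split %PATH% on semicolons and keep only the folders that still exist
for %%D in ("%PATH:;=" "%") do (
    if exist "%%~D\" (
        if defined NEWPATH (set "NEWPATH=!NEWPATH!;%%~D") else (set "NEWPATH=%%~D")
    )
)
rem note: setx truncates values longer than 1024 characters; use setx /M for the system Path (requires admin)
setx PATH "!NEWPATH!"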

Local variables implementation

I've been using the fish shell for a bit now, and I've recently had a conversation with a coworker over local variables. Apparently, Bash doesn't support local variables, and only uses environment variables to communicate dynamic data between processes. Are local variables also just environment variables, but with something extra? I'm curious as to how fish has created this behavior.
Bash doesn't support local variables
That's not true. Bash (and other shells, including dash; it's one of the few extensions beyond POSIX that dash has) has the local keyword to create local variables. Variables just default to global, while fish defaults to local.
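A quick bash illustration:
f() {
  local x=1                    # local: visible inside f (and functions f calls), but not outside
  echo "inside: $x"
}
f
echo "outside: ${x-unset}"     # prints "outside: unset": x did not leak into the global scope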
Also, when you say "environment variables", what you mean are "exported" variables, which require an explicit export step in POSIX-style shells, and the -x or --export flag to set in fish.
I.e., there are two different things at play here: whether a variable is available just in this function/block/whatever and not outside it, and whether it is passed on to children, including external processes.
Are local variables also just environment variables, but with something extra?
Non-exported variables are something less: they are never handed to the OS (e.g., via its setenv function), so they are not copied into the environments of child processes.
Local variables are removed when the block ends. In practice this can be done nicely by putting them on a stack and "popping" the top.
Note that, in fish at least, these concepts are entirely orthogonal:
You can have local-exported variables (with set -lx), and they'll be passed to external commands and copied to functions (so they get their own local version of them), but removed when the function ends. These are useful for temporary changes, e.g. setting $PATH just for a function, or overriding $EDITOR when calling something (see the sketch after this list).
And you can have global-unexported variables, which can be accessed by functions but not external commands. These are useful for shell settings like $fish_function_path, which isn't useful to external tools, or $COLUMNS, which might even break external tools if exported (because they would start reading it instead of checking the terminal size themselves).
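A small fish sketch of the local-exported case (the external command name is hypothetical):
function run_with_extra_path
    # -l: local to this function; -x: exported to child processes
    set -lx PATH /opt/mytool/bin $PATH
    mytool --version           # hypothetical external command; it sees the modified $PATH
end
# after run_with_extra_path returns, $PATH is back to its previous value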
There seem to be some misconceptions here:
bash can have variables that are local to a function: https://www.gnu.org/software/bash/manual/bash.html#index-local
Not every shell (bash/fish/etc) variable is in the environment. This is why the export (bash) and set -x (fish) commands exist.
For two separate processes to share the same variable value, you must pass it via the environment. The environment is the way to expose shell variables to other processes.
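The distinction is easy to demonstrate in any POSIX shell:
MYVAR=hello                      # a plain shell variable: not in the environment
sh -c 'echo "${MYVAR-unset}"'    # prints "unset": the child process never sees it
export MYVAR                     # now it is copied into the environment of children
sh -c 'echo "$MYVAR"'            # prints "hello"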

Custom environment variable

Is it possible to set a custom environment variable that will be accessible from any other plugin, the same way $platform and $path work?
There is a package, EnvironmentSettings by Daniele Niero, but it seems my task is simpler, so there is a probability that there is no need to dive deep into its code.
In Sublime, any plugin can modify the global process environment through os.environ from the Python runtime. All plugin code runs in the same process, so once one plugin sets an environment variable, any other plugin can access it. I would imagine that this is how the package that you linked to in your question modifies the environment.
A simple example of this in action can be found in Default/exec.py which you can open by using View Package File from the command palette. In the __init__ method of AsyncProcess() there is code that modifies the Sublime process environment if you pass the path argument in your sublime-build file.
A simple example that you can run from the Sublime console would be the following snippet. Once you execute that code, any plugin that you create can access os.environ["MY_VARIABLE"] to see the value.
import os
os.environ["MY_VARIABLE"] = "Some Value"
With that said, in Sublime $platform is not an environment variable; it's a special variable that Sublime knows how to expand itself, divorced from the system environment outlined above.
A complete list of such variables can be viewed by executing the following code from the Sublime console:
from pprint import pprint
pprint(window.extract_variables())
The list of variables you get and their content depends on application state (platform, whether there is currently a project open in the window, the current file, etc).
The names of the variables that this returns are hard coded in the Sublime core and can't be augmented, so if you wanted extra variables here you would need to communicate that to other plugins and they would have to be modified to know how to use them.
From the sounds of what you're trying to accomplish in the comments on your question, what you want might be a sublime-settings file that contains a setting specifying the directory to use for file actions in your custom plugins. If they all load the settings file to get the path, you can modify the location in the config and have it take effect immediately. Alternatively, you could do something like a top-level module variable in one plugin and import it into the others.
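A minimal sketch of the settings-file approach (the file name and setting key here are hypothetical):
import sublime
# every cooperating plugin loads the same settings file by name
settings = sublime.load_settings("MyPlugins.sublime-settings")
action_dir = settings.get("file_action_directory", "~")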

How to autoload environment variables specific to one file path?

I am working on developing a solution that simplifies hands-on debugging of failed Jenkins builds. This involves SSH-ing to the right Jenkins node and going directly on the WORKSPACE so you can interactively try different changes that could solve your problem.
While I solved the problem of starting an SSH session in the right directory, there is one missing bit: your shell is missing the original environment variables defined by Jenkins, and these are critical for running any commands after that. So now the first command of the build is a set > .envrc, which saves everything into this shell file.
My example refers to the direnv tool, which is able to auto-load .envrc files. Due to security concerns, it does not load unapproved files automatically and instead gives the message direnv: error .envrc is blocked. Run direnv allow to approve its content.
So my current solution is to manually run direnv allow after ending up in the right folder.
How can I automate this, so I don't have to type it? A prompt would be OK, because it would involve only pressing one key instead of typing ~12.
Please note that I am not forced to use direnv itself; I am open to other solutions.
As of v2.15.0, you can now use direnv's whitelist configuration to achieve what you described:
Specifying whitelist directives marks specific directory hierarchies
or specific directories as "trusted" -- direnv will evaluate any
matching .envrc files regardless of whether they have been
specifically allowed. This feature should be used with great care, as
anyone with the ability to write files to that directory (including
collaborators on VCS repositories) will be able to execute arbitrary
code on your computer.
For example, say that the directory hierarchy that contains the .envrcs you want to be evaluated without having to run direnv allow is located under /home/foo/bar.
Create the file /home/foo/.config/direnv/config.toml so that it contains the following:
[whitelist]
prefix = [ "/home/foo/bar" ]
Alternatively, if there is a fixed list of specific paths you want to whitelist, you can use exact rather than prefix:
[whitelist]
exact = [ "/home/foo/projectA", "/home/foo/projectB" ]

aliasing Jenkins artifact URLs

Jenkins artifact URLs allow abstracting the "last successful build", so that instead of
http://myjenkins.local/job/MyJob/38/artifact/build/MyJob-v1.0.1.zip
we can say
http://myjenkins.local/job/MyJob/lastSuccessfulBuild/artifact/build/MyJob-v1.0.1.zip
Is it possible to abstract this further? My artifacts have their version number in their filename, which can change from build to build. Ideally I'd like to have some kind of "alias" URL that looks like this:
http://myjenkins.local/job/MyJob/lastSuccessfulBuild/artifact/build/MyJob-latest.zip
MyJob-latest.zip would then resolve to MyJob-v1.0.1.zip.
If Jenkins itself can't do this, perhaps there's a plugin?
I've never seen such a plugin, but Jenkins already has similar functionality built in.
You can use /*zip*/filename.zip in your artifact path, where filename is anything you choose. It will take all found artifacts and download them in a zip file (you may end up with a zip inside a zip if your artifact is already a zip file).
In your case, it will be:
http://myjenkins.local/job/MyJob/lastSuccessfulBuild/artifact/build/*zip*/MyJob-latest.zip
This will get you the contents of /artifact/build/ returned in a zipped archive named MyJob-latest.zip. Note that if you have more than just that zip file in that directory, the other files will be returned too.
You can use wildcards in the path. A single * for a regular wildcard, a double ** for skipping any number of preceding directories.
For example, to get any file that starts with MyJob, ends with .zip, and to look for it in any artifact directory, you could use:
/lastSuccessfulBuild/artifact/**/MyJob*.zip/*zip*/MyJob-latest.zip
Edit:
You cannot do something like this without some form of a container (a zip in this case). With the container, you are telling the system:
Get all possible [undetermined count] wildcard matches and place them into this container, then give me the container. This is logical and possible, as there is only one single container, whether it is empty or not.
But you cannot tell the system:
Give me a link to a specific single file, but I don't know which one or how many there are. The system can't know in advance whether your wildcards will match one, more than one, or none. This is simply impossible from a logic perspective.
If you need it for some script automation, you can unzip the first level zip and be still left with your desired zipped artifact.
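For example, a script could fetch the wrapper zip and unpack it in one go (a sketch using the URLs from this question; the internal folder layout of the wrapper may vary by Jenkins version):
curl -sSo MyJob-latest.zip "http://myjenkins.local/job/MyJob/lastSuccessfulBuild/artifact/build/*zip*/MyJob-latest.zip"
unzip -o MyJob-latest.zip    # extracts the inner, versioned MyJob-v1.0.1.zip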
If you need to provide this link to someone else, you need an alternative solution.
Alternative 1:
After your build is complete, execute a post-build step that takes your artifact and renames it to MyJob-latest.zip, though you lose the versioning in the filename. You can also choose to copy instead of rename, but you end up with double the space used for storing these artifacts.
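As a shell build step, that could look like this (a sketch, assuming the build produces exactly one versioned zip):
# rename variant (the version disappears from the filename):
mv build/MyJob-v*.zip build/MyJob-latest.zip
# or, copy variant (keeps both files, at double the storage):
cp build/MyJob-v*.zip build/MyJob-latest.zip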
Alternative 2 (recommended):
As a post-build action, upload the artifact to a central repository. It can be Artifactory, or even plain SVN. When you upload it, it is renamed to MyJob-latest.zip and the previous one is overwritten. This way you have a static link that will always point to the latest artifact from lastSuccessfulBuild.
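With Artifactory, for instance, the upload is a plain HTTP PUT (a sketch; the server URL, repository path, and credentials are hypothetical):
curl -u user:apikey -T build/MyJob-v1.0.1.zip "https://artifactory.example.com/artifactory/releases/MyJob/MyJob-latest.zip"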
There is actually a plugin to assign aliases to builds you've run, and I have found it pretty handy: the Build Alias Setter Plugin.
You can use it for instance to assign an alias in the form of your own version number for a build, instead (or rather in addition) to the internal Jenkins-assigned build number.
I found that it is usually most practical to use it in conjunction with the EnvInject plugin (or your favorite variant): you would export an env variable (e.g. MY_VAR=xyz) whose value is the target version or moniker, and then use the form ${ENV,var="MY_VAR"} in the "Token Macro alias" config that the plugin provides in your job config.
You can also use it to assign aliases in the form of "lastSuccessful" if you have such a need, which allows you to distinguish between different types of successful (or other state) builds.
Wait, there's more! You can also use the /*zip*/ trick in conjunction with the alias setter.
