Can cons find executables in my path?

I'm trying to debug a cons script, and the problem I'm having is that an executable in my own $PATH doesn't seem to be located. In short: Can cons find executables in my path?
This might seem like a silly question, since the FAQ says
Cons does not pass the user's environment to the child processes that it forks to build the software. Anything you need or want to pass in from the user's environment must be done so explicitly.
It's not clear to me, however, that cons shouldn't be able to see my $PATH (the above is worded such that I wouldn't expect child processes to have access to any of the environment variables). In the execution of the Construct, $PATH evaluates to empty but $ENV{PATH} does contain my path.
It doesn't help that I know neither cons nor Perl, so I don't really know what I'm doing, nor where to start looking for what's causing the problem :) For what it's worth, the script that doesn't work for me, but does work for its original author, is located here.

For what it's worth, the guideline in the FAQ is correct in that something like this will fix the problem:
# A standard construction environment.
$env = new cons(
    ENV => { PATH => $ENV{PATH} }
);
which can then be followed by, e.g.,
Command $env 'foo', qq(echo =`which tex`=);
to run/install/build/whatever you're doing with the cons script.

Related

Rename environment variable key

I have the following environment variable, OLD_KEY=value, and want to rename it so that it is KEY=value.
I already tried export KEY=${${OLD_KEY}}.
Does anyone have an idea?
In a POSIX shell, like bash, you dereference a variable with a single dollar sign. So this is all you need:
export KEY="$OLD_KEY"
The double quotes are always a good idea, even if you don't think they're needed, because of quirks in the POSIX shell standard: specifically, how variable expansion interacts with $IFS.
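A quick illustration of that point (this snippet is my own, not part of the original answer): the splitting happens when the variable is expanded unquoted, not in the assignment itself.
OLD_KEY='a   b'
KEY=$OLD_KEY              # a plain assignment is not word-split
printf '[%s]\n' $KEY      # unquoted expansion splits on $IFS: prints [a] and [b]
printf '[%s]\n' "$KEY"    # quoted expansion keeps the value intact: prints [a   b]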
P.S. You should have told us which shell (or language) you're using rather than making us guess.

How to pass an array from Bazel cli to rules?

Let's say I have a rule like this.
foo(
    name = "helloworld",
    myarray = [
        ":bar",
        "//path/to:qux",
    ],
)
In this case, myarray is static.
However, I want it to be given by cli, like
bazel run //:helloworld --myarray=":bar,//path/to:qux,:baz,:another"
How is this possible?
Thanks
To get exactly what you're asking for, Bazel would need to support LABEL_LIST in Starlark-defined command line flags, which are documented here:
https://docs.bazel.build/versions/2.1.0/skylark/lib/config.html
and here: https://docs.bazel.build/versions/2.1.0/skylark/config.html
Unfortunately that's not implemented at the moment.
If you don't actually need a list of labels (i.e., to create dependencies between targets), then maybe STRING_LIST will work for you.
If you do need a list of labels, and the different possible values are known, then you can use --define, config_setting(), and select():
https://docs.bazel.build/versions/2.1.0/configurable-attributes.html
The question is what you are really after. Passing a variable or an array into the bazel build/run isn't really possible, at least not as such and not (mostly) without (very likely unwanted) side effects. Aren't you perhaps really just looking to pass arguments directly to what is being run by the run command, i.e. to the executable itself rather than to bazel?
There are a few ways you could sneak stuff in (in most cases you'd also need to come up with a syntax to pass the data on the CLI and unpack the array in a rule), but many of them come at a relatively substantial price.
You can define your array in a .bzl file and load it from where the rule uses it. You can then rewrite your build/run configuration by dumping new .bzl content (which also makes the change obvious and traceable) and load the bits from the rule (affecting only the rules that load and use the variable). E.g., BUILD file:
load(":myarray.bzl", "myarray")
foo(
name = "helloworld",
myarray = myarray,
],
)
And you can then call your build:
$ echo 'myarray=[":bar", "//path/to:qux", ":baz", ":another"]' > myarray.bzl
$ bazel run //:helloworld
Which you can of course put in a single wrapper script. If this really needs to be a bazel array, this one is probably the cleanest way to do that.
--workspace_status_command: you can collect information about your environment, add either or both of the resulting files as a dependency of your rule (depending on whether the inputs are meant to invalidate the rule results or not, you could use the volatile or the stable status file) and process the incoming file in what is being executed by the rule (at which point one would wonder why not pass the data as its command line arguments directly). If you use the stable status file, every other rule depending on it is also invalidated by any change.
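A minimal sketch of that route, with a hypothetical status script (the script path and the STABLE_MYARRAY key are made up for illustration); the script prints KEY VALUE lines, and keys prefixed with STABLE_ land in the stable status file while the rest go to the volatile one:
#!/bin/sh
# hypothetical tools/status.sh
echo "STABLE_MYARRAY :bar,//path/to:qux,:baz"
which you would then hook in with:
$ bazel run --workspace_status_command=tools/status.sh //:helloworld
The tool the rule runs still has to parse the value back out of the status file.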
You can do a similar thing using --action_env. From within the executable/tool/script underpinning the rule, you can directly access the defined environment variable. However, this also means the environment of every rule is affected (not just the one you're targeting); and again, why would it parse the information from the environment rather than accept arguments on its command line?
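For example (the variable name MYARRAY is arbitrary):
$ bazel run --action_env=MYARRAY=":bar,//path/to:qux,:baz" //:helloworld
and the tool executed by the rule's action reads $MYARRAY and splits it on the commas itself.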
There is also --define, but you would not really get direct access to its value so much as you could select() a choice out of the possible options.
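On the command line that looks like this (the key and value here are made up; the BUILD file then needs a matching config_setting() for select() to pick from):
$ bazel run --define myarray_variant=extended //:helloworld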

How to write Travis env variable to file?

I'm trying to write a custom Travis env variable to a file for a simple proof of concept thing that I need. However, I'm having trouble getting this to work.
How would I define this in the Travis YAML file if my variable is called VARIABLE_X?
Thanks!
One way to do this is using linux commands, something like:
printenv | grep VARIABLE > all_env
However, I don't know how Travis handles the environment (take a look at their docs here); it might not work as easily due to encryption, but it should work, since your apps wouldn't function if they didn't have the same level of access. If such a case occurs, modifying a few parameters (maybe TRAVIS_SECURE_ENV_VARS) is worth looking into.
If you solved the problem in another way, consider sharing with the community.
Write the environment variable as usual (Shell - Write variable contents to a file)
Define the following within script:
- echo "$VARIABLE_X" > example.txt

ocaml: Is there an environment variable to set the search path for #use and #load?

I've been wondering, if there is an environment variable to set the search path for #use and #load for the ocaml toplevel.
What I think I know so far:
I can use findlib instead of "raw" #use and #load. findlib looks at some environment variables for the search path.
I can set a search path with -I.
Experiments seem to indicate that CAML_LD_LIBRARY_PATH does not influence #use (script loading) and #load (byte code file loading).
(updated) I can use #directory to add the desired path, but unfortunately this only takes a string literal, so I can't pass in something I read from the environment at run time. (Update: I forgot to mention #directory originally and why it doesn't fit my use case.)
What I want to do:
Run ocaml programs as scripts
Point ocaml to script libraries and script fragments with an environment variable
Avoid, in some scenarios, creating a full findlib library.
Presently I'm not using ocaml as interpreter, but a wrapper that adds a path to the invocation. I want to avoid the wrapper.
(I hope the questions makes sense now, after you know my use case)
So: Is there an environment variable to set the search path for #use and #load without resorting to a wrapper?
More Details
What I'm currently doing:
#!/usr/bin/env oscript2
#use "MyScript"
#load "SomeModule.cmo"
(* ... more ocaml *)
oscript2 is a wrapper around ocaml that basically sets the search paths for #use and #load, but finally executes the toplevel ocaml with something like
exec ocaml -I .... ...some-byte-code-modules... "$@"
MyScript and SomeModule.cmo live outside of the "normal" OCaml search path. The actual location might change, but after login (and working through the profile scripts) there is an environment variable (today it's OSCRIPTLOAD_PATH) that tells me where all loadable byte code and OCaml script files might live.
This works well; a variant of that setup has been in use for years (at least 7).
The one thing that bothers me there is the wrapper: the simple fact of its presence looks homebrew, so I'd like to avoid it to make a better impression on potential future users of the script collection. What I'd like to do is
#!/usr/bin/env ocaml
#use "MyScript"
#load "SomeModule.cmo"
(* ... more ocaml *)
and have ocaml itself pick up the search path from some environment variable (I'm free to change the variable name, that is under my control, but I don't want to install script and byte code libs into the default search path, and, as already stated, I'm asking if I can do that without findlib).
Basically (as already stated at the very beginning) I'm looking for an environment variable that controls the search path for #use and #load. I'm not looking for toplevel directives and not for wrappers that retrofit this feature. (Thanks everyone for those suggestions, but unfortunately I've already gone down that road; it's feasible, but here I'm looking for an alternative purely for cosmetic reasons.)
Recent research didn't turn up such a variable, but I thought I'd ask here before giving up on it.
From inside the OCaml toplevel you can use the #directory "foo";; primitive to add an include directory.
Here's a shell script that runs the OCaml toplevel while adding a directory to the search path taken from an environment variable named EXTRA_OCAML_DIR.
#!/bin/sh
ocaml -I "$EXTRA_OCAML_DIR" "$@"
If you run this instead of ocaml, you will have a directory in the load path specified by an environment variable.
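Usage would look something like this, assuming the script above is saved as an executable file called ocaml-with-path (the name and the directory are arbitrary):
$ EXTRA_OCAML_DIR=$HOME/lib/oscript ./ocaml-with-path myscript.ml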
It seems a little obvious, but maybe it will spark an idea that is more helpful.

statically analysing Lua code for potential errors

I'm using a closed-source application that loads Lua scripts and allows some customization through modifying these scripts. Unfortunately that application is not very good at generating useful log output (all I get is 'script failed') if something goes wrong in one of the Lua scripts.
I realize that dynamic languages are pretty much resistant to static code analysis in the way C++ code can be analyzed for example.
I was hoping though, there would be a tool that runs through a Lua script and e.g. warns about variables that have not been defined in the context of a particular script.
Essentially what I'm looking for is a tool that for a script:
local a
print(b)
would output:
warning: script.lua(1): local 'a' is not used
warning: script.lua(2): 'b' may not be defined
It can only really be warnings for most things, but that would still be useful! Does such a tool exist? Or maybe a Lua IDE with a feature like that built in?
Thanks, Chris
Automated static code analysis for Lua is not an easy task in general. However, for a limited set of practical problems it is quite doable.
Quick googling for "lua lint" yields these two tools: lua-checker and Lua lint.
You may want to roll your own tool for your specific needs however.
Metalua is one of the most powerful tools for static Lua code analysis. For example, please see metalint, the tool for global variable usage analysis.
Please do not hesitate to post your question on the Metalua mailing list. People there are usually very helpful.
There is also lua-inspect, which is based on the already mentioned metalua. I've integrated it into the ZeroBrane Studio IDE, which generates output very similar to what you'd expect. See this SO answer for details: https://stackoverflow.com/a/11789348/1442917.
For checking globals, see this lua-l posting. Checking locals is harder.
You need to find a parser for Lua (one should be available as open source) and use it to parse the script into a proper AST. Use that tree and a simple variable visibility tracker to find out when a variable is or isn't defined.
Usually the scoping rules are simple:
start with the top AST node and an empty scope
look at the child statements for that node; every variable declaration should be added to the current scope.
if a new scope starts (for example via a { operator), create a new variable scope inheriting the variables in the current scope.
when a scope ends (for example via }), remove the current child variable scope and return to the parent.
Iterate carefully.
This will tell you which variables are visible where inside the AST. You can use this information, and if you also inspect the expression AST nodes (reads and writes of variables), you can derive the information you need.
I just started using luacheck and it is excellent!
The first release was in 2015.
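For reference, luacheck is distributed via LuaRocks, so getting it and running it over the script from the question looks like this (the file name is just the question's example):
$ luarocks install luacheck
$ luacheck script.lua
It reports unused locals and accesses to undefined globals, much like the warnings sketched in the question.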
