Why can't waf find a path that exists?

Let's say I have an x.y file in /mydir/a/b (on Linux). When I run waf, it does not find the file:
def configure(context):
    pass

def build(build_context):
    build_context(source='/mydir/a/b/x.y',
                  rule='echo ${SRC} > ${TGT}',
                  target='test.out')
Result: source not found: '/mydir/a/b/x.y' in bld(features=[], idx=1, meths=['process_rule', 'process_source'] ...
Ok, maybe you want a relative path, Waf? And you are not telling me?
def build(context):
    path_str = '/mydir/a/b'
    xy_node = context.path.find_dir(path_str)
    if xy_node is None:
        exit("Error: Failed to find path {}".format(path_str))
    # just refer to the current script
    orig_path = context.path.find_resource('wscript')
    rel_path = xy_node.path_from(orig_path)
    print "Relative path: ", rel_path
Result: Error: Failed to find path /mydir/a/b
But that directory exists! What's up with that?
And, by the way, the relative path for a subdirectory it can find is off by one level: a/b under the current directory results in the relative path "../a/b", where I'd expect "a/b".

In general there are (at least) two node objects in each context:
- path: points to the location of the wscript
- root: points to the filesystem root
So in your case the solution is to use context.root:
def build(context):
    print context.path.abspath()
    print context.root.abspath()
    print context.root.find_dir('/mydir/a/b')
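Building on that, here is a sketch of the relative-path experiment from the question done via context.root (assuming the usual waf Node API; note that path_from() is given the directory node context.path rather than the wscript file node, which may also explain the extra ".." observed above):
def build(context):
    # Resolve the absolute path through the filesystem root node
    xy_node = context.root.find_dir('/mydir/a/b')
    if xy_node is None:
        context.fatal('Failed to find path /mydir/a/b')
    # Compute the path relative to the directory that contains the wscript
    rel_path = xy_node.path_from(context.path)
    print "Relative path: ", rel_path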

Hmm, looks like I found an answer on the waf-users group forum, answered by Mr. Nagy himself:
The source files must be present under the top-level directory. You may either:
- create a symlink to the source directory
- copy the external source files into the build directory (which may cause problems if there is a structure of folders to copy)
- set top to a common folder such as '/' (may require superuser permissions, so it is a bad idea in general)
The recommendation, in conclusion, is to add a symlink to the outside directory during the configuration step. I wonder how that would work if I need this on both Linux and Windows...
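For reference, a minimal sketch of what such a configure-time symlink might look like (illustrative only; 'ext_src' is a made-up link name, and on Windows os.symlink may require extra privileges, so a junction or a plain copy might be needed instead):
import os

def configure(context):
    # 'ext_src' is a hypothetical name for the link inside the project tree
    link = os.path.join(context.path.abspath(), 'ext_src')
    target = '/mydir/a/b'
    if not os.path.lexists(link):
        os.symlink(target, link)  # may need a junction or copy on Windows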

Just pass the Node to the copy rule instead of passing the string representing the path:
def build(build_context):
    source_node = build_context.root.find_node('/mydir/a/b/x.y')
    build_context(source=source_node,
                  rule='echo ${SRC} > ${TGT}',
                  target='test.out')
Waf will be able to find the file even if it is outside the top-level directory.

Related

Reading the content of a directory declared with `actions.declare_directory`

Imagine I have a java_binary target triggered by a custom rule that generates source code and places the generated sources under a directory, let's call it "root".
So after the code generation we will have something like this:
// bazel-bin/...../src/com/example/root
root:
  -> Foo.java
  -> Bar.java
  -> utils
     -> Baz.java
Now, I have another target, a java_library, that depends on the previously generated sources, so it depends on the custom rule.
My custom rule definition currently looks something like this:
def _code_generator(ctx):
    outputDir = ctx.actions.declare_directory("root")
    files = [
        ctx.actions.declare_file("root/Foo.java"),
        ctx.actions.declare_file("root/Bar.java"),
        ctx.actions.declare_file("root/utils/Baz.java"),
        # and many,
        # many other files
    ]
    outputs = []
    outputs.append(outputDir)
    outputs.extend(files)
    ctx.actions.run(
        executable = ...,  # executable pointing to the java_binary
        outputs = outputs,
        # ....
    )
This works. But as you can see, every anticipated file that is to be generated is hard-coded in the rule definition. This makes it very fragile should the code generation produce a different set of files in the future (which it will).
(Without specifying each of the files, as shown above, Bazel will fail the build saying that the files have no generating action.)
So I was wondering, is there a way to read the content of the root directory and automatically, somehow, declare each of the files as an output?
What I tried:
The documentation of declare_directory says:
The contents of the directory are not directly accessible from Starlark, but can be expanded in an action command with Args.add_all().
And add_all says:
[...] Each directory File item is replaced by all Files recursively contained in that directory.
This sounds like there could be a way to get access to the individual files in the directory, but I am not sure how.
I tried:
outputDir = ctx.actions.declare_directory("root")
# ...
args = ctx.actions.args()
args.add_all(outputDir)
with the intention of accessing the individual files later from args, but the build fails with: "Error in add_all: expected value of type sequence or depset for values, got File".
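(For what it's worth, that particular error is only about the value type: add_all expects a sequence or depset, so the directory File would have to be wrapped, for example in a list. An illustrative sketch, which still leaves the real question open:)
args = ctx.actions.args()
args.add_all([outputDir])  # add_all takes a sequence/depset; per the docs quoted
                           # above, the directory File is expanded to the files it
                           # contains when the action runs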
Any other ideas on how to implement the rule, so that I don't have to hard-code each and every file that will be generated?

Lua require does not work but file is in the trace

I'm trying to require files in Lua. In one case it works, but when I try to simplify the requires by updating the LUA_PATH, the file is not found, even though it appears in the trace!
To reproduce my require problem I did the test with the package.searchpath function, which takes the required key and the Lua path as arguments.
So the code:
print('MY LUA PATH')
local myLuaPath = "?;?.lua;D:\\Projets\\wow-addon\\HeyThere\\?;D:\\Projets\\wow-addon\\HeyThere\\src\\HeyThere\\?;D:\\Projets\\wow-addon\\HeyThere\\test\\HeyThere\\?"
print(myLuaPath)
print('package search core.add-test')
print(package.searchpath('core.add-test', myLuaPath))
print('package search test.HeyThere.core.add-test')
print(package.searchpath('test.HeyThere.core.add-test', myLuaPath))
The result:
MY LUA PATH
?;?.lua;D:\Projets\wow-addon\HeyThere\?;D:\Projets\wow-addon\HeyThere\src\HeyThere\?;D:\Projets\wow-addon\HeyThere\test\HeyThere\?
package search core.add-test
nil no file 'core\add-test'
no file 'core\add-test.lua'
no file 'D:\Projets\wow-addon\HeyThere\core\add-test'
no file 'D:\Projets\wow-addon\HeyThere\src\HeyThere\core\add-test'
no file 'D:\Projets\wow-addon\HeyThere\test\HeyThere\core\add-test'
package search test.HeyThere.core.add-test
test\HeyThere\core\add-test.lua
So the first try with 'core.add-test' should work with the 'D:\Projets\wow-addon\HeyThere\test\HeyThere\?' value in the path but fails...
In the trace, there is the file I want!
no file 'D:\Projets\wow-addon\HeyThere\test\HeyThere\core\add-test'
But with the same LUA_PATH, starting from a parent folder, the file is found: the second test with 'test.HeyThere.core.add-test' finds it from 'D:\Projets\wow-addon\HeyThere\?':
-> test\HeyThere\core\add-test.lua
Can someone explain to me why it doesn't work the first time?
EDIT:
My current directory is D:\Projets\wow-addon\HeyThere
My lua.exe is in D:\Projets\wow-addon\HeyThere\bin\lua, and it is added to my PATH variable (I'm on Windows).
I set the LUA_PATH environment variable and execute
lua "test\test-suite.lua" -v
The code inside test-suite.lua is the test code described above
As @EgorSkriptunoff suggested, adding the file extension in the path resolves the problem.
Ex:
Wrong path: D:\Projets\wow-addon\HeyThere\?
Good path: D:\Projets\wow-addon\HeyThere\?.lua
The extension has to be in the path variable because in require the dots are replaced and used as folder separators.

How to write Bazel rules that work with external repositories?

The Bazel Starlark API does strange things with files in external repositories. I have the following Starlark snippet:
print(ctx.genfiles_dir)
print(ctx.genfiles_dir.path)
print(output_filename)
ret = ctx.new_file(ctx.genfiles_dir, output_filename)
print(ret.path)
It is creating the following output:
DEBUG: build_defs.bzl:292:5: <derived root>
DEBUG: build_defs.bzl:293:5: bazel-out/k8-fastbuild/genfiles
DEBUG: build_defs.bzl:294:5: google/protobuf/descriptor.upb.c
DEBUG: build_defs.bzl:296:5: bazel-out/k8-fastbuild/genfiles/external/com_google_protobuf/google/protobuf/descriptor.upb.c
That extra external/com_google_protobuf comes seemingly out of nowhere, and it makes my rule fail:
- I tell protoc to generate into ctx.genfiles_dir.path (which is bazel-out/k8-fastbuild/genfiles).
- So protoc generates bazel-out/k8-fastbuild/genfiles/google/protobuf/descriptor.upb.c.
- Bazel fails because I didn't generate bazel-out/k8-fastbuild/genfiles/external/com_google_protobuf/google/protobuf/descriptor.upb.c.
Likewise, when I try to call file.short_path on a source file from an external repository, I get a result like ../com_google_protobuf/google/protobuf/descriptor.proto. This seems quite unhelpful, so I just wrote some manual code to strip off the leading ../com_google_protobuf/.
Am I missing something? How can I write this rule in a way that doesn't feel like I'm fighting Bazel the whole time?
Am I missing something?
The basic problem, as you already realized, is that you have two path "namespaces": the one that protoc sees (i.e. import paths) and the one that Bazel sees (i.e. the path you pass to declare_file()).
Two things to note:
1) All paths declared with declare_file() get the path <bin dir>/<package path incl. workspace>/<path you passed to declare_file()>.
2) All actions are executed from <bin dir> (unless output_to_genfiles = True, in which case this switches to <gen dir>, as in your example).
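A small illustration of the two notes (hypothetical values; assume a rule in package google of an external repo named protobuf):
f = ctx.actions.declare_file("a/b.c")
# f.path       -> "<bin dir>/external/protobuf/google/a/b.c"   (note 1)
# f.short_path -> "../protobuf/google/a/b.c"                   (the form seen in the question)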
Trying to solve the exact same problem you encountered, I resorted to stripping the known path from the output_file's path to determine which directory to pass to protoc:
# This code is run from the context of the external protobuf dependency
proto_path = "google/a/b.proto"
output_file = ctx.actions.declare_file(proto_path)
# output_file.path would be `<gen_dir>/external/protobuf/google/a/b.proto`
# Strip the known proto_path from output_file.path
protoc_prefix = output_file.path[:-len(proto_path)]
print(protoc_prefix) # Prints: <gen_dir>/external/protobuf
command = "{protoc} {proto_paths} {cpp_out} {plugin} {plugin_options} {proto_file}".format(
    ...
    cpp_out = "--cpp_out=" + protoc_prefix,
    ...
)
Alternatives
You may also be able to construct the same path with ctx.bin_dir, ctx.label.workspace_name, ctx.label.package, and ctx.label.name.
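An untested sketch of that construction (the exact layout is an assumption and may differ between Bazel versions):
# Assemble the prefix from the rule context instead of string-stripping.
# Depending on output_to_genfiles this may need ctx.genfiles_dir.path
# instead of ctx.bin_dir.path.
parts = [ctx.bin_dir.path]
if ctx.label.workspace_name:
    parts.append("external/" + ctx.label.workspace_name)
if ctx.label.package:
    parts.append(ctx.label.package)
protoc_prefix = "/".join(parts)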
Misc.
proto_library recently gained an attribute strip_import_prefix. When used, the above is not correct, as all dependent files are symlinked into a new directory from which they have the relative paths declared with strip_import_prefix.
The path format is:
<bin dir>/<repo>/<package>/_virtual_imports/<label name>/<path `import`ed in .proto files>
i.e.
<bin dir>/external/protobuf/_virtual_imports/b_proto/google/a/b.proto
This assumes you are building an external repo called protobuf, which contains a BUILD file at its root with a target named b_proto, which in turn relies on a proto_library wrapping google/a/b.proto AND uses the strip_import_prefix attribute.

How can I use relative path names in Doxygen configuration

I have my Doxygen setup in my /utils directory, and my source is in another directory in the root (/code_with_doxygen). How can I make a relative path name for that, since the repository will be in different places on different computers? I can't document the whole root because I don't want the directory /code_without_doxygen built too.
The project tree looks like this:
root
  utils
  code
    code_with_doxygen
    code_without_doxygen
  documentation
Right now I have these settings, but they don't seem to work:
FULL_PATH_NAMES = YES
STRIP_FROM_PATH = ../
I can't seem to figure it out with: Relative files paths in doxygen-generated documentation
The relative paths depend on the directory from which you are executing doxygen. For example, if you have the following project tree:
+ project_root
  + documentation
    + config
      - doxyfile
    + pictures
    + output
    - run_doxygen.bat
  + code
    + code_with_doxygen
    + code_without_doxygen
In this case all relative paths have their root in the folder "documentation" because you are running the script "run_doxygen.bat" from this folder. So you would set the INPUT tag in the "doxyfile" to
INPUT = ./../code
and the OUTPUT_DIRECTORY tag in the doxyfile to
OUTPUT_DIRECTORY = ./output
The misleading thing is that even if the doxyfile is in the subfolder "config" the paths are NOT relative to the location of the doxyfile because the paths are relative to the location from where doxygen is called. In this case it is the folder "documentation" because this is the location of the script which is calling doxygen.
Doxygen allows including files into the doxyfile. You can generate a file using a script before actually calling doxygen. The content of this file has to look like this:
INPUT += path1
INPUT += path2
...
You seem to run Linux; I don't know the correct bash commands.
The file has to be integrated into your doxyfile:
INPUT = (project path)
@INCLUDE = generated filename
This will lead to doxygen using the content of your generated file.
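A minimal sketch of such a generator script in Python (illustrative; the directory and output file names are assumptions, so adjust them to your tree):
# generate_doxygen_inputs.py - write one "INPUT +=" line per source directory
import os

root = os.path.join("..", "code", "code_with_doxygen")  # assumed location
lines = []
for dirpath, dirnames, filenames in os.walk(root):
    lines.append("INPUT += " + dirpath.replace(os.sep, "/"))

with open("doxygen_inputs.cfg", "w") as f:  # the file referenced by @INCLUDE
    f.write("\n".join(lines) + "\n")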
@gmug was right. Don't forget to add comment blocks in your code as specified by Doxygen. For Python I needed to add this: """#package docstring""" at the beginning of the file.
I have been able to use relative paths in my doxygen.cfg file by setting INPUT to a string or set of strings. For example:
INPUT = "." "src"
will tell doxygen to look in both the current directory and its subdirectory, $HERE/src.

Difference in path xxxx/./xxxx and xxxx/xxxx

Is there a difference between the following two paths?
Path="/apps/WebLogicPPT/user_projects/wlsPPTDomain/./applications/ssc/sscoc_web_sos_11.1ps_v0.1.ear"
Path="/apps/WebLogicPPT/user_projects/wlsPPTDomain/applications/ssc/sscoc_web_sos_11.1ps_v0.1.ear"
In most operating systems, . represents the current directory and .. represents the parent directory.
Given a file structure like this:
C:.
└───foo
    └───bar
You can list the contents of bar in numerous ways:
dir c:\foo\bar
dir c:\foo\.\.\.\.\.\bar
dir c:\foo\..\foo\..\foo\bar\.
...
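So the two strings above refer to the same file; the "./" segment is a no-op. A quick illustrative check in Python:
import os.path

a = "/apps/WebLogicPPT/user_projects/wlsPPTDomain/./applications/ssc/sscoc_web_sos_11.1ps_v0.1.ear"
b = "/apps/WebLogicPPT/user_projects/wlsPPTDomain/applications/ssc/sscoc_web_sos_11.1ps_v0.1.ear"
print(os.path.normpath(a) == os.path.normpath(b))  # True: the "." segment is removed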
