How do I avoid symlinks using an Ant FileSet?

I have a directory tree that includes a symlink to . (the current directory). When I attempt to iterate over this using an Ant FileSet, I get the following error:
Caught error while checking for symbolic links
at org.apache.tools.ant.DirectoryScanner.causesIllegalSymlinkLoop(DirectoryScanner.java:1859)
The code that I am using to generate the scanner is:
FileSet files = new FileSet();
Project project = new Project();
project.setBasedir( dir );
files.setProject( project );
files.setDir( project.getBaseDir() );
files.getDirectoryScanner().setFollowSymlinks( false );
for( Iterator iter = files.iterator(); iter.hasNext(); ) {}

You can set the followsymlinks attribute of the FileSet to false, as documented in the Ant FileSet documentation.

You're setting the 'don't follow symlinks' flag in the wrong place. Use
files.setFollowSymlinks( false );
instead of
files.getDirectoryScanner().setFollowSymlinks( false );
The code you posted will set 'not-follow symlinks' only when scanning below the top-level directory of the fileset. Symlinks in the top-level directory will still be followed.
It's not clear from what you have posted precisely what's causing an IOException to be raised in the causesIllegalSymlinkLoop() method (that exception is being caught at line 1859).
But if you set not-follow on the fileset rather than its internal directory scanner, the method will not be called at all.
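Putting that together, a minimal sketch of the corrected snippet (same dir variable and loop as in the question):
FileSet files = new FileSet();
Project project = new Project();
project.setBasedir( dir );
files.setProject( project );
files.setDir( project.getBaseDir() );
files.setFollowSymlinks( false );   // set on the FileSet itself, not on its DirectoryScanner
for( Iterator iter = files.iterator(); iter.hasNext(); ) {
    // the scanner created for this iteration now skips symlinks
}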

Related

Reading the content of directory declared with `actions.declare_directory`

Imagine I have a java_binary target triggered by a custom rule that generates source code and places the generated sources under a directory, let's call it "root".
So after the code generation we will have something like this:
// bazel-bin/...../src/com/example/root
root:
-> Foo.java
-> Bar.java
-> utils
   -> Baz.java
Now, I have another target, a java_library, that depends on the previously generated sources, so it depends on the custom rule.
My custom rule definition currently looks something like this:
def _code_generator(ctx):
    outputDir = ctx.actions.declare_directory("root")
    files = [
        ctx.actions.declare_file("root/Foo.java"),
        ctx.actions.declare_file("root/Bar.java"),
        ctx.actions.declare_file("root/utils/Baz.java"),
        # and many,
        # many other files
    ]
    outputs = []
    outputs.append(outputDir)
    outputs.extend(files)
    ctx.actions.run(
        executable = ...,  # executable pointing to the java_binary
        outputs = outputs,
        # ....
    )
This works. But as you can see, every anticipated file that is to be generated is hard-coded in the rule definition. This makes it very fragile should the code generation produce a different set of files in the future (which it will).
(Without specifying each of the files, as shown above, Bazel will fail the build saying that the files have no generating action)
So I was wondering, is there a way to read the content of the root directory and automatically, somehow, declare each of the files as an output?
What I tried:
The documentation of declare_directory says:
The contents of the directory are not directly accessible from Starlark, but can be expanded in an action command with Args.add_all().
And add_all says:
[...] Each directory File item is replaced by all Files recursively contained in that directory.
This sounds like there could be a way to get access to the individual files in the directory, but I am not sure how.
I tried:
outputDir = ctx.actions.declare_directory("root")
# ...
args = ctx.actions.args()
args.add_all(outputDir)
with the intention to access the individual files later from args, but the build fails with: "Error in add_all: expected value of type sequence or depset for values, got File".
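The type error itself is only about the argument: add_all expects a sequence or depset, so the directory File has to be wrapped in one. A minimal sketch, using the same outputDir as above (this satisfies add_all, but the expansion into individual files still only happens on the command line when the action runs, not in Starlark):
args = ctx.actions.args()
args.add_all([outputDir])  # wrap the directory File in a list; it expands to the contained files at execution time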
Any other ideas on how to implement the rule, so that I don't have to hard-code each and every file that will be generated?

How to write Bazel rules that work with external repositories?

The Bazel Starlark API does strange things with files in external repositories. I have the following Starlark snippet:
print(ctx.genfiles_dir)
print(ctx.genfiles_dir.path)
print(output_filename)
ret = ctx.new_file(ctx.genfiles_dir, output_filename)
print(ret.path)
It is creating the following output:
DEBUG: build_defs.bzl:292:5: <derived root>
DEBUG: build_defs.bzl:293:5: bazel-out/k8-fastbuild/genfiles
DEBUG: build_defs.bzl:294:5: google/protobuf/descriptor.upb.c
DEBUG: build_defs.bzl:296:5: bazel-out/k8-fastbuild/genfiles/external/com_google_protobuf/google/protobuf/descriptor.upb.c
That extra external/com_google_protobuf comes seemingly out of nowhere, and it makes my rule fail:
I tell protoc to generate into ctx.genfiles_dir.path (which is bazel-out/k8-fastbuild/genfiles).
So protoc generates bazel-out/k8-fastbuild/genfiles/google/protobuf/descriptor.upb.c
Bazel fails because I didn't generate bazel-out/k8-fastbuild/genfiles/external/com_google_protobuf/google/protobuf/descriptor.upb.c
Likewise, when I try to call file.short_path on a source file from an external repository, I get a result like ../com_google_protobuf/google/protobuf/descriptor.proto. This seems quite unhelpful, so I just wrote some manual code to strip off the leading ../com_google_protobuf/.
Am I missing something? How can I write this rule in a way that doesn't feel like I'm fighting Bazel the whole time?
Am I missing something?
The basic problem, as you already realized, is that you have two path "namespaces": the one that protoc sees (i.e. import paths) and the one that Bazel sees (i.e. the path you pass to declare_file()).
Two things to note:
1) All paths declared with declare_file() get the path <bin dir>/<package path incl. workspace>/<path you passed to declare_file()>.
2) All actions are executed from <bin dir> (unless output_to_genfiles=True, in which case this switches to <gen dir>, as in your example).
Trying to solve the exact same problem you encountered, I resorted to stripping the known path from the output_file's path to determine which directory to pass to protoc:
# This code is run from the context of the external protobuf dependency
proto_path = "google/a/b.proto"
output_file = ctx.actions.declare_file(proto_path)
# output_file.path would be `<gen_dir>/external/protobuf/google/a/b.proto`
# Strip the known proto_path from output_file.path
protoc_prefix = output_file.path[:-len(proto_path)]
print(protoc_prefix) # Prints: <gen_dir>/external/protobuf
command = "{protoc} {proto_paths} {cpp_out} {plugin} {plugin_options} {proto_file}".format(
...
cpp_out = "--cpp_out=" + protoc_prefix,
...
)
Alternatives
You may also be able to construct the same path with ctx.bin_dir, ctx.label.workspace_name, ctx.label.package, and ctx.label.name.
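A sketch of that construction, assuming ctx is the rule context (the "external/" component only appears for targets in an external repository, and ctx.label.name is only needed if the label name itself is part of the path you want):
parts = [ctx.bin_dir.path]
if ctx.label.workspace_name:
    parts.append("external/" + ctx.label.workspace_name)
if ctx.label.package:
    parts.append(ctx.label.package)
gen_root = "/".join(parts)  # e.g. <bin dir>/external/protobuf when the package is at the repository root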
Misc.
proto_library recently gained an attribute strip_import_prefix. When used, the above is not correct, as all dependent files are symlinked into a new directory from which they have the relative paths declared with strip_import_prefix.
The path format is:
<bin dir>/<repo>/<package>/_virtual_base/<label name>/<path `import`ed in .proto files>
i.e.
<bin dir>/external/protobuf/_virtual_base/b_proto/google/a/b.proto
This assumes you are building an external repo called protobuf, which contains a BUILD file at its root with a target named b_proto which, in turn, relies on a proto_library wrapping google/a/b.proto and uses the strip_import_prefix attribute.

Zip files/Directories in Groovy with AntBuilder

I am trying to zip files and directories in Groovy using AntBuilder. I have the following code:
def ant = new AntBuilder()
ant.zip(basedir: "./Testing", destfile:"${file}.zip",includes:file.name)
This zips the file "blah.txt", but not the file "New Text Document.txt". I think the issue is the spaces. I've tried the following:
ant.zip(basedir: "./Testing", destfile:"${file}.zip",includes:"${file.name}")
ant.zip(basedir: "./Testing", destfile:"${file}.zip",includes:"\"${file.name}\"")
Neither of the above resolved the issue. I'm using Ant because it will zip directories, and I don't have access to org.apache.commons.io.compression at work.
If you look at the docs for the ant zip task, the includes parameter is described as:
comma- or space-separated list of patterns of files that must be included
So you're right: it's the space separator that's breaking it...
You need to use the longer route to get this to work:
new AntBuilder().zip( destFile: "${file}.zip" ) {
fileset( dir: './Testing' ) {
include( name:file.name )
}
}
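Since the original motivation for using Ant was that it can zip directories, note that a fileset with no includes pattern picks up everything under dir, spaces in filenames and all. A minimal sketch, assuming the same ./Testing directory:
new AntBuilder().zip( destFile: 'Testing.zip' ) {
    fileset( dir: './Testing' )   // no includes: every file in the tree is added
}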

load_plugin_textdomain not working

Hey, I'm trying to localize a plugin called Donate Plus (which is technically already localized).
The plugin came with en_CA and de_DE files; I've tried creating a he_IL file without success.
So I tried the de_DE files that came with the plugin, but that didn't work either.
I've set WPLANG in wp-config.php to de_DE, yet that doesn't change anything.
This is the loading code:
load_plugin_textdomain( 'dplus', '/wp-content/plugins/donate-plus' );
And I did check that all the strings are set up to be localized.
Anyone has a clue?
I just had a similar issue. Did you try renaming your files from de_DE.po and de_DE.mo to name-of-plugin-de_DE.po and name-of-plugin-de_DE.mo (replacing name-of-plugin with yours, of course)?
In your case that would be dplus-de_DE.mo and dplus-de_DE.po. It should work ;)
load_plugin_textdomain takes three parameters.
In your case it would be something like this (assuming the .po and .mo files are located in a subdir called 'languages')
load_plugin_textdomain( 'dplus', false, dirname( plugin_basename( __FILE__ ) ) . '/languages/' );
I checked the source of the DonatePlus plugin and found that it is doing localization incorrectly.
The load_plugin_textdomain() call is made inside the DonatePlus class's constructor, but it should be made from the 'init' hook. Try adding the following code (which is at the end of the file) inside an init callback.
if( class_exists('DonatePlus') )
$donateplus = new DonatePlus();
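A minimal sketch of that arrangement (the dplus_init function name is made up here; it simply defers construction, and therefore the load_plugin_textdomain() call in the constructor, to the 'init' hook):
function dplus_init() {
    // hypothetical wrapper: constructing DonatePlus here means its constructor's
    // load_plugin_textdomain() call now runs on 'init'
    if ( class_exists( 'DonatePlus' ) ) {
        $GLOBALS['donateplus'] = new DonatePlus();
    }
}
add_action( 'init', 'dplus_init' );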
Where are all the .po and .mo files stored? Are they inside the /wp-content/plugins/donate-plus folder itself? If not, change the path or move the files.
I had a similar issue where I was loading the translation files with the load_plugin_textdomain function from within a service class using PSR-4. This meant that the dirname( plugin_basename( __FILE__ ) ) string returned the wrong path.
The correct path is the relative path your-plugin/languages (assuming you are loading the translation files from the /languages directory).
Absolute paths such as /var/www/html/wp-content/plugins/my-plugin/languages won't work.
My plugin's file structure looks something like this:
- my-plugin
  - assets
  - languages
  - services
    - Api
    - Base
      Translation.php
    - ...
    Plugin.php
  - vendor
  - views
  composer.json
  composer.lock
  index.php
  my-plugin.php
  uninstall.php
Since my Translation service is placed in the /services/Base/ directory, this worked for me:
$root = plugin_basename(dirname(__FILE__, 3));
load_plugin_textdomain( 'my-plugin', false, "$root/languages/");
Also, I used no action hook at all (instead of init or plugins_loaded) and fired the load_plugin_textdomain function at the very beginning of the plugin, since those hooks don't fire early enough for the admin menu and action links to get translated.
Use:
load_textdomain( TEXT_DOMAIN, WP_PLUGIN_DIR . '/' . dirname( plugin_basename( __FILE__ ) ) . '/languages/' . get_locale() . '.mo' );

how to set the path to where aapt add command adds the file

I'm using the aapt tool to remove some files from different folders of my apk. This works fine.
But when I want to add files to the apk, the aapt add command doesn't let me specify the path where I want the file to be added, so I can only add files to the root folder of the apk.
This is strange, because surely developers sometimes want to add files to a subfolder of the apk (the res folder, for example). Is this possible with aapt or any other method? Removing files from any folder works fine, but adding only works for the root folder of the apk; I can't use it for any other folder.
Thanks
The aapt tool retains the directory structure specified in the add command. If you want to add something to an existing folder in an apk, you must have a matching folder on your system and must specify each file to add with its full directory path. Example:
$ aapt list test.apk
res/drawable-hdpi/pic1.png
res/drawable-hdpi/pic2.png
AndroidManifest.xml
$ aapt remove test.apk res/drawable-hdpi/pic1.png
$ aapt add test.apk res/drawable-hdpi/pic1.png
The pic1.png that is added resides in a folder under the terminal's current working directory, res/drawable-hdpi/. Hope this answers your question.
There is actually a bug in aapt that will make this randomly impossible. The way it is supposed to work is as the other answer claims: paths are kept, unless you pass -k. Let's see how this is implemented:
The flag that controls whether the path is ignored is mJunkPath:
bool mJunkPath;
This variable is in a class called Bundle, and is controlled by two accessors:
bool getJunkPath(void) const { return mJunkPath; }
void setJunkPath(bool val) { mJunkPath = val; }
If the user specified -k at the command line, it is set to true:
case 'k':
bundle.setJunkPath(true);
break;
And, when the data is being added to the file, it is checked:
if (bundle->getJunkPath()) {
    String8 storageName = String8(fileName).getPathLeaf();
    printf(" '%s' as '%s'...\n", fileName, storageName.string());
    result = zip->add(fileName, storageName.string(),
            bundle->getCompressionMethod(), NULL);
} else {
    printf(" '%s'...\n", fileName);
    result = zip->add(fileName, bundle->getCompressionMethod(), NULL);
}
Unfortunately, the one instance of Bundle used by the application is allocated in main on the stack, and there is no initialization of mJunkPath in the constructor, so the value of the variable is random; without a way to explicitly set it to false, on my system I (seemingly deterministically) am unable to add files at specified paths.
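The fix implied here is a one-liner; a sketch of the kind of change meant, assuming Bundle's constructor has an initializer list (its other members are elided):
// give the flag a deterministic default so paths are kept unless -k is passed
Bundle()
    : mJunkPath(false) // , ... other members ...
{ }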
However, you can also just use zip, as an APK is simply a Zip file, and the zip tool works fine.
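For example, mirroring the aapt session from the other answer (run from the directory that contains res/; the zip tool stores the relative path you give it):
$ aapt remove test.apk res/drawable-hdpi/pic1.png
$ zip test.apk res/drawable-hdpi/pic1.png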
(For the record, I have not submitted the trivial fix for this as a patch to Android yet, if someone else wants to the world would likely be a better place. My experience with the Android code submission process was having to put up with an incredibly complex submission mechanism that in the end took six months for someone to get back to me, in some cases with minor modifications that could have just been made on their end were their submission process not so horribly complex. Given that there is a really easy workaround to this problem, I do not consider it important enough to bother with all of that again.)
