Dart scripts that invoke other scripts by importing them

I have this setup:
├── bin
│   ├── all.dart
│   └── details
│       ├── script1.dart
│       ├── script2.dart
│       └── ...
all.dart simply imports script1.dart and script2.dart and calls their main functions. The goal is to have a number of scripts under details that can be run individually, plus a separate all.dart script that runs them all at once. This makes debugging individual scripts simpler while still allowing everything to run together.
all.dart
import 'details/script1.dart' as script1;
import 'details/script2.dart' as script2;
main() {
  script1.main();
  script2.main();
}
script1.dart
main() => print('script1 run');
script2.dart
main() => print('script2 run');
So this works, and I see the expected print statements when running all.dart, but I have two issues.
First, I have to symlink packages under details. Apparently pub does not propagate package symlinks down to subfolders. Is this expected, or is there a workaround?
Second, there are errors flagged in all.dart at the point of the second import statement. The analyzer error is:
The imported libraries 'script1.dart' and 'script2.dart' should not have the same name ''
So my guess is since I'm importing other scripts as if they are libraries and since they do not have the library script[12]; statement at the top they both have the same name - the empty name?
Note: Originally I had all of these under lib and I could run them as scripts by specifying a suitable --package-root on the command line, even though they were libraries with a main. But to debug I need to run in Dart Editor, which is why I'm moving them to bin. Perhaps the editor should allow libraries under lib with a main to be run as a script, since they run outside the editor just fine? The actual distinction between script and library seems a bit unnecessary (other scripting languages allow files to be both).
How do I clean this up?

I'm not sure what the actual question is.
If a library has no library statement, the empty string is used as its name.
Just add a library statement with a unique name to each file to fix this.
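For example (a sketch; the library names are arbitrary, they just need to be unique per file):

```dart
// details/script1.dart
library script1;

main() => print('script1 run');

// details/script2.dart gets `library script2;` at the top in the same way.
```

With distinct names, the analyzer no longer complains that both imported libraries share the empty name.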
Adding symlinks to the subdirectories solves the import problem for scripts in subdirectories; I do this regularly.
It has been mentioned several times on dartbug.com that symlinks should go away entirely, but I have no idea how long that will take.
I have never tried to put script files with a main in lib, but it is against the package layout conventions, and I guess that is why DartEditor doesn't support it.

Related

With Bazels `http_archive` - is there a way to add already existing files to the extracted sources?

With Bazel I'm building an external library using http_archive together with
some patches which bring additional features:
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
    name = "some-lib",
    build_file = "@my-project//some-lib:BUILD.some-lib",
    url = "https://example/download/some-lib.tar.gz",
    sha256 = "8d9405baf113a9f25e4fb961d56f9f231da02e3ada0f41dbb0fa4654534f717b",
    patches = [
        "//some-lib/patches:01-add-additional-features.dif",
    ],
    patch_args = ["-p1"],
    patch_tool = "patch",
)
The file structure looks like this:
some-lib
├── BUILD
├── BUILD.some-lib
├── include/additional_features.h
├── some-lib.bzl
└── patches
    ├── 01-add-additional-features.dif
    └── BUILD
This basically works but I still struggle with adding include/additional_features.h
into the extracted source folder.
I first tried to just list the file in the filegroup I use to later run
configure_make like this:
filegroup(
    name = "all_srcs",
    srcs = glob(["**"]) + ["@my-project//some-lib:include/additional_features.h"],
)
then I'm getting
no such target '//some-lib:includes/additional_features.h': target 'includes/additional_features.h' not declared in package 'some-lib'; however, a source file of this name exists.
My next idea was to use the tools http_archive provides to make the file part of the source folder.
While you can use patches to modify the extracted folder, you would need a dedicated
diff file just to create the extra header file, which you would then have to
generate in advance (i.e. via a dedicated Bazel rule), declare as a dependency, etc.,
which I'd like to avoid to keep things simple.
There is also patch_cmds which next to patches can be used to modify the
extracted source folder by running arbitrary bash commands so I tried something like this:
patch_cmds = [
    "cp @my-project//some-lib:include/additional_features.h include/",
],
but this does not work for me, I'm getting
Error in fail: Error applying patch command cp @my-project//some-lib:include/additional_features.h include/:
cp: cannot stat '@my-project//some-lib:include/additional_features.h': No such file or directory
So it looks like the label syntax I use with build_file does not work with patch_cmds, or that the file can't be accessed at that stage.
Does one of the approaches I tried actually work, and I just didn't use the right syntax?
What's the Bazel way to add (a bunch of) readily available files (i.e. files in the same repository as the Bazel rules I provide) to an http_archive-based source directory?
Try putting exports_files(["include/additional_features.h"], visibility = ["//visibility:public"]) in some-lib/BUILD, so that Bazel lets you reference source files from the //some-lib package in the external repository (and elsewhere).
I thought the "no such target" error message even suggested exports_files?
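A minimal sketch of what that looks like (the workspace name my-project is taken from the question; adjust it to your own WORKSPACE name):

```starlark
# some-lib/BUILD — export the header so other packages (and the
# external repository's build file) can reference it by label:
exports_files(
    ["include/additional_features.h"],
    visibility = ["//visibility:public"],
)
```

With that in place, a label such as @my-project//some-lib:include/additional_features.h becomes valid, e.g. inside the all_srcs filegroup in BUILD.some-lib.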

Glob patterns don't match any files in Workbox

I am trying to generate a service worker for Polymer 3 with Workbox 4.3.1.
I have some specific files inside bower and node_modules that I want to cache.
I tried adding "en-in/node_modules/**" to globIgnores and including specific files such as en-in/node_modules/@webcomponents/webcomponentsjs/custom-elements-es5-adapter.*.js in globPatterns.
The config I tried gives an error. I even tried adding globStrict: false, but that didn't help.
Below is my workbox config:
globDirectory: "dist",
globPatterns: [
  "en-in/**/*.{js,json,css}",
  "en-in/node_modules/@webcomponents/webcomponentsjs/custom-elements-es5-adapter.*.js"
],
globIgnores: [
  "en-in/sw-reg.js",
  "en-in/sw-custom.js",
  "en-in/rev-manifest.json",
  "en-in/package.json",
  "en-in/workbox-v4.3.1/**/*",
  "en-in/node_modules/**"
],
globStrict: false,
I am getting the below error:
One of the glob patterns doesn't match any files. Please remove or fix the following:
{
  "globDirectory": "dist",
  "globPattern": "en-in/node_modules/@webcomponents/webcomponentsjs/custom-elements-es5-adapter.*.js",
  "globIgnores": [
    "en-in/sw-reg.js",
    "en-in/sw-custom.js",
    "en-in/rev-manifest.json",
    "en-in/package.json",
    "en-in/workbox-v4.3.1/**/*",
    "en-in/node_modules/**",
    "**/service-worker.js"
  ]
}
The code that does glob-ing in Workbox looks like:
globbedFiles = glob.sync(globPattern, {
  cwd: globDirectory,
  follow: globFollow,
  ignore: globIgnores,
  strict: globStrict,
});
Because you're passing in "en-in/node_modules/**" as one of the globIgnores patterns, "en-in/node_modules/@webcomponents/webcomponentsjs/custom-elements-es5-adapter.*.js" is never going to match anything. In the glob module, ignore always takes precedence.
You have a number of approaches that would fix this:
As part of your build process, move the custom-elements-es5-adapter.*.js file out of node_modules and into a different directory, and load it from there.
Change "en-in/**/*.{js,json,css}" in globPatterns to something like "en-in/{dir1,dir2,dir3}/**/*.{js,json,css}" so that it does not match node_modules by default, and remove "en-in/node_modules/**" from globIgnores. You can then leave "en-in/node_modules/@webcomponents/webcomponentsjs/custom-elements-es5-adapter.*.js" in globPatterns, and it will no longer be ignored.
Change your "en-in/node_modules/**" in globIgnores so that it doesn't match en-in/node_modules/#webcomponents. (I forget the syntax for doing that, but you might be able to figure it out if the other two options don't work.)
There are probably a few other alternatives as well. But hopefully that explains the root cause.
Below are the Polymer 3 notes from its page on service workers; they make life easier :)
Service Worker
A Service Worker is loaded and registered in the index.html file. However, during development (to make debugging easier), the Service Worker does not actually exist, and only a stub file is used.
The production time Service Worker is automatically created during build time, i.e. by running npm run build or npm run build:static. This file is generated based on the polymer.json and sw-precache-config.js configuration files, and you can find it under each of the build directories:
build/
├── es5-bundled/
│   └── service-worker.js
├── es6-bundled/
│   └── service-worker.js
├── esm-bundled/
│   └── service-worker.js
└── ...
By default, all of the source files (inside the /src directory) will be pre-cached, as specified in the sw-precache-config.js configuration file. If you want to change this behaviour, check out the sw-precache-config docs.
Source: https://pwa-starter-kit.polymer-project.org/building-and-deploying
I had the same issue and it was resolved when I explicitly changed the globPatterns to the following
globPatterns: [
'**/*.js',
'**/*.html',
'**/*.css',
'**/*.json',
'**/*.svg',
'**/*.png',
'**/*.gif',
'**/*.txt',
],

.bzl file in external dependencies

I've an external dependency declared in WORKSPACE as a new_git_repository and provided a BUILD file for it.
proj/
├── BUILD
├── external
│   ├── BUILD.myDep
│   └── code.bzl
└── WORKSPACE
In the BUILD.myDep file, I want to load the nearby code.bzl, but when I load it with load("//:external/code.bzl", "some_func"), Bazel tries to load @myDep//:external/code.bzl instead!
Of course that is not a target in the @myDep repository, but one in my local workspace.
It seems I rubber-ducked Stack Overflow, since the solution appeared while writing the question!
The solution is to explicitly mention the local workspace when loading the .bzl file:
Suppose we have declared the name in the WORKSPACE as below:
workspace(name = "local_proj")
Now instead of load("//:external/code.bzl", "some_func"), just load it explicitly as a local workspace file:
load("@local_proj//:external/code.bzl", "some_func")
NOTE: When using this trick, be careful about potential dependency loops (i.e. loading a generated file that is itself produced by a rule depending on the same external repo!)

How to generate language fragment bundles to localize Carbon products

Following the blog post How to generate language fragment bundles to localize Carbon products by Tanya Madurapperuma, I am having the following problem: after generating the language bundles with the ant localize command, the bundles land in the CARBON_HOME/repository/components/dropins/ folder, but when I run the tool the language does not change to Spanish. I would appreciate help figuring out what I am missing.
Note: All resources.properties files are translated into Spanish.
If you have the jars with the translated resources.properties files in your dropins folder, you need to restart the server and set your browser's locale to Spanish.
The locale is changed in the browser; the server then picks the matching resource files to use.
UPDATE:
There are some problems here.
First, there's a bug if you have multiple directories in the resources directory. For now, make sure you have only one directory inside resources when you run the localize task.
The properties files should sit inside a directory named after the bundle, without the tree structure, so your resources directory should look like this:
../resources/
└── org.wso2.carbon.i18n_4.2.0
    ├── JSResources_es.properties
    └── Resources_es.properties
You need to include the locale code as _es in your files as shown above.
Also, the localize tool seems to append i18n to the folder structure of the built jar. This works for UI bundles, but in the case of org.wso2.carbon.i18n it ends up as org/wso2/carbon/i18n/i18n. So open the built jar in the dropins folder and remove the extra i18n folder, so that the jar's tree structure looks like the following:
../repository/components/dropins/org.wso2.carbon.i18n.languageBundle_4.2.0.jar
├── META-INF
│   └── MANIFEST.MF
└── org
    └── wso2
        └── carbon
            └── i18n
                ├── JSResources_es.properties
                └── Resources_es.properties
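If you prefer to script that clean-up instead of editing the jar by hand, here is a minimal sketch in Python (the path constants match this answer's bundle; adjust them to your build):

```python
import os
import zipfile

BAD = "org/wso2/carbon/i18n/i18n/"   # duplicated directory added by the tool
GOOD = "org/wso2/carbon/i18n/"       # where the files should actually live

def flatten_i18n(jar_path: str) -> None:
    """Rewrite jar_path so entries under BAD are moved up into GOOD."""
    tmp = jar_path + ".tmp"
    with zipfile.ZipFile(jar_path) as src, \
         zipfile.ZipFile(tmp, "w", zipfile.ZIP_DEFLATED) as dst:
        for info in src.infolist():
            name = info.filename
            if name == BAD:
                continue  # drop the extra directory entry itself
            # move files under BAD up one level, copy everything else as-is
            target = GOOD + name[len(BAD):] if name.startswith(BAD) else name
            dst.writestr(target, src.read(info))
    os.replace(tmp, jar_path)
```

Usage would be e.g. flatten_i18n("repository/components/dropins/org.wso2.carbon.i18n.languageBundle_4.2.0.jar").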
Did you get this to work?
The place where I suspect you might have gone wrong is the folder structure inside the resources folder. (You can place your resource files anywhere and run ant localize -Dresources.directory=path_to_your_resources_directory.)
Also note that a resource folder should follow the proper naming conventions of the OSGi bundle.
Ex: org.wso2.carbon.claim.mgt.ui_4.2.0 (this entire thing is the name of the folder)
If you still can't get this to work, mail me your resources folder at tanyamadurapperuma@gmail.com

Include *.sty file from a super/subdirectory of main *.tex file

I want to share a LaTeX document via git with many other people.
We therefore decided to put all the special .sty files that are not present in everyone's LaTeX installation into a resources directory. Ideally this directory would be a parent directory of the actual working directory.
How exactly can I import those style files?
It is important that even the dependencies of those shared styles are resolved against other shared styles.
You can import a style file (mystyle.sty) into your document in two ways:
If you have it in your path or in the same folder as the .tex file, simply include this line in your preamble: \usepackage{mystyle}
If you have it in a different folder, you can access it using its full path: \usepackage{/path/to/folder/mystyle}
That said, if you're not sure if the style file is in everyone's installation, simply include it in the same directory and make sure you do git add mystyle.sty and track it along with the rest of your files (although most likely there won't be any changes to it). There is no need for a parent directory. But if you insist on a different directory, see option 2 above.
It would be better if it were in a subdirectory than in a parent directory, as you can still call the file as \usepackage{subdir/mystyle} and be certain that you are invoking your style file. However, if you escape out to the parent directory, you never know if the other users have a similarly named folder that is not part of your package, which can result in errors.
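For instance, a minimal sketch (styles/mystyle.sty is a hypothetical path relative to the .tex file):

```latex
\documentclass{article}
% Option 1: style file in the same folder or in the TeX search path
% \usepackage{mystyle}
% Option 2: style file in an explicit subdirectory
\usepackage{styles/mystyle}
\begin{document}
Example body.
\end{document}
```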
This probably isn't relevant to you any more, but here is another way to do what you want.
Set up your git repository like this:
mystyle.sty
project/
makefile
project.tex
and put \usepackage{mystyle} in the preamble of project.tex.
Compiling project.tex manually won't work, of course, because mystyle.sty is not in the same directory as project.tex.
However, if makefile contains something along the lines of:
project.pdf: mystyle.sty project.tex
	pdflatex project

mystyle.sty: ../mystyle.sty
	cp ../$@ $@
then running make from within the project directory will cause mystyle.sty to be copied to the correct place before project.tex is (this time successfully) compiled.
This way might seem a little bit over the top, but it does combine the best features of other methods.
If several projects in the same repository require mystyle.sty then having a common mystyle.sty sitting above them all makes more sense than having a copy in each project directory; all these copies would have to be maintained.
The compilation is portable, in the sense that if you gave me your copies of mystyle.sty and project.tex then I would (in theory at least) be able to compile manually without needing to modify the files you gave me.
For example, I would not have to replace \usepackage{/your/path/mystyle} with \usepackage{/my/path/mystyle}.
You can use Makefiles as suggested above. Another option is CMake. I didn't test for parent directories.
If you have the following file structure:
├── CMakeLists.txt
├── cmake
│   └── UseLATEX.cmake
├── img
│   └── logo.jpg
├── lib
│   └── framed.sty
└── main.tex
you should have CMake installed, instructions on CMake resources
UseLATEX.cmake can be downloaded from here
then inside the CMakeLists.txt
╚═$ cat CMakeLists.txt
cmake_minimum_required (VERSION 2.6)
set(PROJECT_NAME_STR myProject)
project(${PROJECT_NAME_STR})
set(CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake")
include(UseLATEX)
ADD_LATEX_DOCUMENT(main.tex
    IMAGE_DIRS img
    DEFAULT_PDF
    MANGLE_TARGET_NAMES)
Some example content for main.tex (note the image)
╚═$ cat main.tex
\documentclass{report}
\begin{document}
\begin{center}
\includegraphics[width=300px]{img/logo.jpg}
\end{center}
\end{document}
The lib directory has the *.sty files
You can now compile:
cd /directory/that/has/CMakeLists.txt/
mkdir build
cd build
cmake ..
make
you can then view main.pdf which is in the build directory.
When you use a TeX distribution that uses kpathsea, you can use the TEXINPUTS environment variable to specify where TeX looks for files. The variable works as follows.
The paths in TEXINPUTS are separated by :. An empty path entry (i.e. just the colon) includes the default search paths. Two consecutive slashes mean that the directory and all its sub-directories are searched.
Thus, e.g., to build a file document.pdf which uses files in the current directory, all sub-directories of the resources directory and the default directories, you can use the following Makefile.
document.pdf: document.tex
	TEXINPUTS=.:./resources//: pdflatex document.tex
To speed up the filename lookup, you can build a ls-R database using the mktexlsr command.
For all the details on kpathsea take a look at the manual.
You can use latexmk and its facilities.
There is a feature documented under "Utility subroutines" (page 48 of the latexmk documentation) that can update TEXINPUTS during a run. If you use a .latexmkrc file to configure your chain and options, you can add ensure_path() to it.
Here is an example:
# .latexmkrc
ensure_path('TEXINPUTS', './path/to/something//', '/full/path/to/something/else//')
# [...] Other options goes here.
$pdf_update_method = 3;
$xelatex = 'xelatex -synctex=1 -interaction=nonstopmode -file-line-error %O %S';
$pdf_previewer = 'start "%ProgramFiles%/SumatraPDF/SumatraPDF.exe" %O %S';
$out_dir = 'build/';
Notice the // at the end of a path. This tells LaTeX to search for files in the specified directory and in all its subdirectories.
Please note that while this is a great feature, you need to take good care of your naming scheme: if you use the same file name in several places, you can run into trouble when importing files with, say, \include{somefile}.
