MATLAB: Set current folder to script's location - path

I have a handful of scripts and data in different folders and I use addpath and relative paths very often. My problem is that this only works if my current folder is the one where the script I execute is located. For example, if I execute script A, which adds path X, and later execute script B, which lies in path X, Matlab doesn't automatically change the folder, so the relative paths specified in script B no longer work.
Is there a way to automatically set my current folder to the location of the script I'm executing?
/edit: I should note that I use these scripts on different computers with different drive names, so using absolute paths probably won't help.

Put the following line at the top of the script; it will set the current directory to the script's directory:
cd(fileparts(mfilename('fullpath')))
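For example, script B could switch to its own folder first and then add any data folders relative to that location (the folder names below are just illustrative):
cd(fileparts(mfilename('fullpath')));   % jump to this script's own folder
addpath('../data');                     % relative paths now resolve from here
addpath('./helpers');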

Related

How to get Bazel output base from within rule Args map_each function?

I'm writing a Bazel rule which processes an input depset with lots of jar files in it. My goal is to get the full (absolute) path of the jars for launching an application.
I use java_binary for the initial application. However, the application itself later dynamically loads additional jars into the JVM. I need to give that application the absolute path to those jars.
As the depset is pretty expensive to process, I want to move that work into the execution phase. This requires the use of Args in combination with a map_each function like this:
args.add_joined("--jars", dependencies, join_with = ",", map_each = _expand_jars)
I'm now stuck on how to turn the jar path into an absolute path. In an older version (which ran during the analysis phase) I passed the path in using a rule attribute. The absolute path is required because (for some reason) the bootstrap application couldn't access the jars in the external/ directory. In the mapping function I currently have the prefix hard-coded to "bazel-myworkspace/../../", which I'd like to get rid of. It follows the symlink into the exec root, and from there I'm able to get to the output base.
def _expand_jars(file):
    if file.path.startswith("external/"):
        return "bazel-myworkspace/../../" + file.path
    else:
        return file.path
Any ideas?

Are absolute paths safe to use in Bazel?

I am experimenting with adding Bazel alongside an old, make/shell-based build system. I can easily write shell commands which return an absolute path to some tool or library built by the old build system as an early prerequisite. I can use these commands in a genrule(), which copies the needed files (like headers and libs) into Bazel proper so they can be exposed in the form of a cc_library().
I found out that genrule() does not detect a dependency if the command uses a file with an absolute path - it is not caught by the sandbox. In a way I am (ab)using that behavior.
Is it safe? Will some future update of Bazel refuse access to files referenced by absolute path in a genrule command like this?
Most of Bazel's sandboxes allow access to most paths outside of the source tree by default. Details depend on which sandbox implementation you're using. The docker sandbox, for example, allows access to all those paths inside of a docker image. It's kind of hard to make promises about future Bazel versions, but I think it's unlikely that a sandbox will prevent accessing /bin/bash (for example), which means other absolute paths will probably continue to work too.
--sandbox_block_path can be used to explicitly block a path if you want.
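For example (the blocked path here is purely illustrative):
bazel build //some:target --sandbox_block_path=/usr/local/legacy-output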
If you always have the files available on every machine you build on, your setup should work. Keep in mind that Bazel will not recognize when the contents of those files change, so you can easily get stale results in various caches. You can avoid that by ensuring the external paths change whenever their contents do.
new_local_repository might be a better fit to avoid those problems, if you know the paths ahead of time.
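A minimal sketch of that approach (the repository name, path, and globs below are made up for illustration) - this goes in your WORKSPACE file:
new_local_repository(
    name = "legacy_tools",
    path = "/opt/legacy-build/output",
    build_file_content = """
cc_library(
    name = "legacy_lib",
    srcs = glob(["lib/*.so"]),
    hdrs = glob(["include/**/*.h"]),
    includes = ["include"],
    visibility = ["//visibility:public"],
)
""",
)
Your own targets can then depend on @legacy_tools//:legacy_lib.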
If you don't know the paths ahead of time, you can write a custom repository rule which runs arbitrary commands via repository_ctx.execute to retrieve the paths and then symlinks them in with repository_ctx.symlink.
TensorFlow's third_party/sycl/sycl_configure.bzl has an example of doing something similar (you would do something other than looking at environment variables the way find_computecpp_root does, and you might symlink entire directories instead of all the files in them):
def _symlink_dir(repository_ctx, src_dir, dest_dir):
    """Symlinks all the files in a directory.

    Args:
      repository_ctx: The repository context.
      src_dir: The source directory.
      dest_dir: The destination directory to create the symlinks in.
    """
    files = repository_ctx.path(src_dir).readdir()
    for src_file in files:
        repository_ctx.symlink(src_file, dest_dir + "/" + src_file.basename)

def find_computecpp_root(repository_ctx):
    """Find ComputeCpp compiler."""
    sycl_name = ""
    if _COMPUTECPP_TOOLKIT_PATH in repository_ctx.os.environ:
        sycl_name = repository_ctx.os.environ[_COMPUTECPP_TOOLKIT_PATH].strip()
    if sycl_name.startswith("/"):
        return sycl_name
    fail("Cannot find SYCL compiler, please correct your path")

def _sycl_autoconf_imp(repository_ctx):
    <snip>
    computecpp_root = find_computecpp_root(repository_ctx)
    <snip>
    _symlink_dir(repository_ctx, computecpp_root + "/lib", "sycl/lib")
    _symlink_dir(repository_ctx, computecpp_root + "/include", "sycl/include")
    _symlink_dir(repository_ctx, computecpp_root + "/bin", "sycl/bin")

How to find and deploy the correct files with Bazel's pkg_tar() in Windows?

Please take a look at the bin-win target in my repository here:
https://github.com/thinlizzy/bazelexample/blob/master/demo/BUILD#L28
it seems to be properly packing the executable inside a file named bin-win.tar.gz, but I still have some questions:
1- on my machine, the file is being generated in this directory:
C:\Users\John\AppData\Local\Temp\_bazel_John\aS4O8v3V\execroot\__main__\bazel-out\x64_windows-fastbuild\bin\demo
which makes finding the tar.gz file a cumbersome task.
The question is how can I make my bin-win target to move the file from there to a "better location"? (perhaps defined by an environment variable or a cmd line parameter/flag)
2- how can I include more files with my executable? My actual use case is that I want to supply data files and some DLLs together with the executable. Should I use a filegroup() rule and refer to its name in the "srcs" attribute as well?
2a- for the DLLs, is there a way to make a filegroup() rule interpret environment variables? (e.g. the directories of the DLLs)
Thanks!
Look for the bazel-bin and bazel-genfiles directories in your workspace. These are actually junctions (directory symlinks) that Bazel updates after every build. If you bazel build //:demo, you can access its output as bazel-bin\demo.
(a) You can also set TMP and TEMP in your environment to point to e.g. c:\tmp. Bazel will pick those up instead of C:\Users\John\AppData\Local\Temp, so the full path for the output directory (that bazel-bin points to) will be c:\tmp\aS4O8v3V\execroot\__main__\bazel-out\x64_windows-fastbuild\bin.
(b) Or you can pass the --output_user_root startup flag, e.g. bazel --output_user_root=c:\tmp build //:demo. That will have the same effect as (a).
There's currently no way to get rid of the _bazel_John\aS4O8v3V\execroot part of the path.
Yes, I think you need to put those files in pkg_tar.srcs. Whether you use a filegroup() rule is irrelevant; filegroup just lets you group files together, so you can refer to the group by name, which is useful when you need to refer to the same files in multiple rules.
2.a. I don't think so.
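To illustrate point 2, here is a minimal BUILD sketch (the load line, file names, and the filegroup are assumptions - adjust them to your repository):
load("@bazel_tools//tools/build_defs/pkg:pkg.bzl", "pkg_tar")
filegroup(
    name = "runtime-files",
    srcs = glob(["data/**"]) + ["foo.dll", "bar.dll"],
)
pkg_tar(
    name = "bin-win",
    srcs = [
        ":demo",           # the executable
        ":runtime-files",  # data files and DLLs packed next to it
    ],
    extension = "tar.gz",
)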

Require dll using string path in F#

Is there a way to do this in F#:
let fakeToolsPath = "D:\tools\FAKE\tools\FakeLib.dll"
#r fakeToolsPath
The FAKE tools are on a different path depending on the build agent that builds the code, so I need to be able to set the path dynamically, from an environment variable or some config file.
Three ideas, in order of increasing hackiness - you'll be the judge of which one makes the most sense in your scenario:
In an .fsx script, you can use __SOURCE_DIRECTORY__ to get the directory where the script is located. If your dll is always located in the same directory relative to the script, you can use that as a "hook" to get to it (see the sketch below).
There's a command-line --reference argument to fsi.exe that should do what you want. If you're using fake.exe instead, you can use --fsiargs to pass it in (take a look at the link for details).
If everything else fails, create a symlink as a separate build step in your CI job configuration and just hardcode the path in the script.
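A minimal sketch of the first idea, assuming FakeLib.dll always sits at a fixed location relative to the script (the ..\tools path is made up - adjust it to your layout; relative #r paths are resolved against the script's own directory):
#r @"..\tools\FakeLib.dll"
// __SOURCE_DIRECTORY__ is also handy for building other paths relative to the script:
let dataFile = System.IO.Path.Combine(__SOURCE_DIRECTORY__, "data", "settings.json")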

Matlab 'addpath/rmpath' not working in my case

Let me explain my situation with some dummy file names.
I am working in directory 'A', which has a subdirectory 'a'. I am running a function 'func1' which is present in both folders. 'func1' needs 'file1' & 'file2' during its execution. 'file1' & 'file2' are present in both folders, with some parameters changed inside them. It is not possible for me to change the file names at all.
Now, the problem is that when I run 'func1' in 'A', everything works fine. But when I run 'func1' in 'a' using 'addpath/rmpath', rather than using 'file1' & 'file2' from 'a', it uses 'file1' & 'file2' from 'A', which produces wrong results.
Please tell me how I can change the path so that when I run 'func1' in subdirectory 'a', it always uses 'file1' & 'file2' from 'a' rather than from directory 'A'.
I hope I am clear in my explanation :S
If I have understood correctly, you are hoping that if you use addpath to add the subdirectory to the search path, Matlab will give the search path precedence over the current directory. Unfortunately, it is precisely the other way around, as per the Matlab documentation: "Functions in the current folder take precedence over functions with the same file name that reside anywhere on the search path." - and this also applies to the load function when reading data files. (Incidentally, I suspect that for this reason you are also not running the version of func1 that you think you are running - try typing which func1 to find out.)
Anyway, the solution here is to make sure that Matlab picks the right version of file1 and file2, which you could do in several ways:
Change your working directory to a, since the working directory has precedence: cd a (as shown in the snippet below)
Put the two versions into separate subfolders, e.g. a and b, and use addpath to add them separately
Change the different versions of func1 to have explicit references to the files, i.e. load('./a/file1')
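A minimal illustration of that first option (the names come from the question; the folder layout is assumed):
which func1 -all   % list every func1 on the path, in precedence order
cd a               % make subfolder 'a' the current folder so its copies win
func1              % now picks up file1 & file2 from 'a'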
With addpath and rmpath you modify the search path in Matlab. Your search path is basically a list of folders where Matlab looks for functions, not for files you want to open.
If your files are in folder A and that is your current working directory, Matlab will look for the files in A. If you change your working directory to a, Matlab will open the files in a - this has nothing to do with your search path. If you want to open files from a specific directory, use the entire path in the open command:
fileID = fopen('/path/to/A/file1');
In your case, it may be that fopen is being called in the way explained above (with an explicit path). If you want Matlab to always open files from the current working directory, change it to:
fileID = fopen('file1');
