Dymola mos script environment variables

Is there a way to use the Windows environment variables in Dymola's .mos scripts?
Something like this:
// Load libraries, last one determines the working directory
openModel(%USERPROFILE% + "Documents/Dymola/MyTestLib/package.mo");
Alternatively, does Dymola know some other predefined paths?
I would like to make the .mos scripts a bit more portable between PCs.

You can use the getEnvironmentVariable function from the MSL.
So this should do what you want:
user_profile = Modelica.Utilities.System.getEnvironmentVariable("USERPROFILE", convertToSlash=true);
openModel(user_profile + "/Documents/Dymola/MyTestLib/package.mo");
On startup Dymola also defines two useful environment variables:
DYMOLA: the Dymola installation directory, e.g. C:/Program Files/Dymola 2019 FD01
DYMOLAWORK: the startup directory, with C:/Users/<user>/Documents/Dymola as the default. See the User Manual, Volume 1, for details.
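For example, the same getEnvironmentVariable pattern should work with those as well (a sketch, assuming DYMOLAWORK is visible to the MSL function; the library location under it is only illustrative):
dymola_work = Modelica.Utilities.System.getEnvironmentVariable("DYMOLAWORK", convertToSlash=true);
// open a library kept below the startup directory (hypothetical path)
openModel(dymola_work + "/MyTestLib/package.mo");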

Related

How can I pass a pointer to a file in helm upgrade command?

I have a truststore file (a binary file) that I need to provide during helm upgrade. This file is different for each target environment (dev, qa, staging, or prod), so I can only provide it at deployment time. helm upgrade --set-file does not take a binary file. This seems to be the issue I found here: https://github.com/helm/helm/issues/3276. The truststore files are stored in the Jenkins credential store.
The flag itself is described as follows:
--set-file stringArray set values from respective files specified via the command line (can specify multiple or separate values with commas: key1=path1,key2=path2)
It is also important to know about The Format and Limitations of --set.
The error you see: Error: failed parsing --set-file data... means that the file you are trying to use does not meet the requirements. See the example below:
--set-file key=filepath is another variant of --set. It reads the
file and uses its content as the value. An example use case is to
inject multi-line text into values without dealing with indentation
in YAML. Say you want to create a brigade project with a certain value
containing 5 lines of JavaScript code; you might write a values.yaml
like:
defaultScript: |
  const { events, Job } = require("brigadier")
  function run(e, project) {
    console.log("hello default script")
  }
  events.on("run", run)
Being embedded in YAML makes it harder to use IDE features, testing frameworks, and other tooling that supports writing code.
Instead, you can use --set-file defaultScript=brigade.js with
brigade.js containing:
const { events, Job } = require("brigadier")
function run(e, project) {
  console.log("hello default script")
}
events.on("run", run)
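For completeness, a usage sketch of that variant (the chart and file names come from the brigade example above, not from your truststore setup):
helm install brigade/brigade-project -f values.yaml --set-file defaultScript=brigade.js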
I hope it helps.

How to get the path where the library is installed

I am working in Linux and I have a third-party library written in Fortran 90 that reads from a file in the current working directory. I would like to be able to call the resulting library from other folders, hence I need to know the path where the library is installed. How can I get the path to the compiled library from within the Fortran code?
I need to store in a variable the path within the code.
For those who know Python, I want to achieve the same as
import os
os.path.dirname(os.path.abspath(__file__))
but in f90 (see Get location of the .py source file)
Using the suggestions in the comments I have done the following:
export DATAPATH=`pwd`
make
in the Makefile
ifort -O3 -fpic -fpp -DDATAPATH -c mysource.f90
in mysource.f90
subroutine mysub
  character(len=100) :: dpath
#ifdef DATAPATH
  dpath=DATAPATH
#endif
  open(10,file=trim(dpath)//'initialise.dat')
  ....
  ....
the problem is that at compile time I get
mysource.f90(42): error #6054: A CHARACTER data type is required in this context. [1]
dpath=1
----------^
compilation aborted for mysource.f90 (code 1)
If you wish you can fix the path at compile time. Something like
gfortran -cpp mylib.f90 -DPREFIX=\"/usr/local/\"
open(newunit=u,file=PREFIX//'mylib/initialise.dat')
You must then make sure the library is indeed installed in that place, PREFIX/mylib/.
You can create an environment variable containing the path of your data. This variable can be set by hand, in your .bashrc or .bash_profile, or system-wide in /etc/profile.d/ or /etc/bash.bashrc; there are many ways, and the right one depends on whether the library is for a single user or for all users of a shared machine.
For example
export MYLIB_PATH='/usr/local/mylib'
Then you can read the variable in Fortran as
CALL get_environment_variable("MYLIB_PATH", mylib_path, status=stat)
and the path is now in variable mylib_path. You can check the success by checking if stat==0.
This is not the only possible approach. You can also have a configuration file for your library in your home directory:
mkdir $HOME/.config/mylib/
echo "/usr/local/mylib" > $HOME/.config/mylib/path
and then you can try to read the path from this file if the environment variable was not set
if (stat/=0) then
  CALL get_environment_variable("HOME", home_dir)
  open(newunit=path_unit, file=trim(home_dir)//'/.config/mylib/path', status='old', action='read', iostat=stat)
  if (stat/=0) complain
  read(path_unit,'(a)',iostat=stat) mylib_path
  if (stat/=0) complain
end if
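Putting the two lookups together, a self-contained sketch could look like this (subroutine and variable names are illustrative, and the error handling is deliberately crude):
subroutine get_mylib_path(mylib_path)
  implicit none
  character(len=*), intent(out) :: mylib_path
  character(len=256) :: home_dir
  integer :: stat, path_unit

  ! 1) try the environment variable first
  call get_environment_variable("MYLIB_PATH", mylib_path, status=stat)

  ! 2) fall back to the per-user config file
  if (stat /= 0) then
    call get_environment_variable("HOME", home_dir)
    open(newunit=path_unit, file=trim(home_dir)//'/.config/mylib/path', &
         status='old', action='read', iostat=stat)
    if (stat /= 0) stop 'mylib: cannot determine data path'
    read(path_unit, '(a)', iostat=stat) mylib_path
    close(path_unit)
    if (stat /= 0) stop 'mylib: cannot read data path'
  end if
end subroutine get_mylib_path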
When you compile with -DDATAPATH you have not passed the value of DATAPATH into your code; you have only defined a preprocessor symbol called DATAPATH, which ifort substitutes as 1. For the compilation to work you need to pass it as a value:
-DDATAPATH="$(DATAPATH)"
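A minimal sketch of that fix (assuming GNU make picks up the exported DATAPATH from the shell; note the escaped quotes, as in the PREFIX example above, so the macro expands to a Fortran string literal rather than a bare path):
# Makefile fragment (illustrative)
mysource.o: mysource.f90
	ifort -O3 -fpic -fpp -DDATAPATH=\"$(DATAPATH)\" -c mysource.f90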

excel xlwings_udfs module is empty

Using xlwings 0.7.1 UDFs on Windows, in a 64-bit virtual env with Python 2.7.6.
I see now that instead of requiring the full path to the module, it takes module names. However, it silently fails to import any UDFs when the module name is prefixed with the package name. E.g.:
PYTHONPATH = ThisWorkbook.Path & ";C:\pathTo\Pydev\myproj\src"
UDF_MODULES = "pkg.myudfs"
If I move the package name 'pkg' from UDF_MODULES to PYTHONPATH, then it fails at imports inside myudfs.py (like 'import pkg.module2').
After some trial and error, I fixed it by adding multiple source folders:
PYTHONPATH = ThisWorkbook.Path & ";C:\pathTo\Pydev\myproj\src\pkg;C:\pathTo\Pydev\myproj\src"
Am I expected to do this? Can't I just point UDF_MODULES at the base src folder and provide a qualified module name like 'pkg.myudfs'?
You're actually doing it right for now (v0.7.1). I have, however, opened an issue on GitHub so we might make this easier in a future release.

Sharing const variables across FAKE fsx scripts

Is there any way to share a variable by including an fsx script within another fsx script?
E.g. script buildConsts.fsx contains
let buildDir = "./build/"
I want to reference this in other build scripts e.g.
#load @".\buildConsts.fsx"
let testDlls = !! (buildDir + "*Test*.dll")
When I attempt to run the script, the 'buildDir' variable is not found and the script fails to compile.
This is a fairly common approach that is used with tools such as MSBuild and PSAKE to modularise scripts. Is this the correct approach with FAKE ?
What you're doing should work - what exactly is the error message that you're getting?
I suspect that the problem is that F# automatically puts the contents of a file in a module and you need to open the module before you can access the constants. The module is named based on the file name, so in your case buildConsts.fsx will generate a module named BuildConsts. You should be able to use it as follows:
#load @".\buildConsts.fsx"
open BuildConsts
let testDlls = !! (buildDir + "*Test*.dll")
You can also add an explicit module declaration to buildconsts.fsx, which is probably a better idea as it is less fragile (won't change when you rename the file):
module BuildConstants
let buildDir = "./build/"
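With an explicit module name like this, the consuming script would then open that name instead (a sketch, assuming the declaration above):
#load @".\buildConsts.fsx"
open BuildConstants

let testDlls = !! (buildDir + "*Test*.dll")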

Team Foundation Build activity "DownloadFiles" is giving an error

I am customizing the default build process template in TFS 2010.
I am using the "DownloadFiles" build activity, and as the server path I have given "$/TFS/Libraries/Foo.DLL". When I run the build definition it throws the error "Access to the path '\\ServerName\SharedFolder\BuildName\TempFolder' is denied.".
But when I give the server path as "$/TFS/Libraries" it downloads all the files in the Libraries folder into the shared TempFolder.
But I need to download only one file. Please help.
Thanks in advance.
As of now, DownloadFiles only works for a whole folder:
ServerPath="$/proj/path" - works great, all is downloaded to LocalPath.
ServerPath="$/proj/path/name.ext" - borked.
I've de-compiled DownloadFiles to see why: First it gets a list of server items, in our case just $/proj/path/name.ext. Then, it calculates the local path like this:
localItemPath = Path.Combine(LocalPath,VersionControlPath.MakeRelative(ServerItem, ServerPath));
In this line, the activity assumes that ServerPath is a folder path. If it's not, then MakeRelative will not recognize it, and the local path will be LocalPath/$/proj/path/name.ext, as the OP observed.
Also, if ServerPath is not canonical - for example, $/proj/path/../path2, the same will happen. Solution: use VersionControlPath.GetFullPath(myNonCanonicalPath).
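A rough illustration of that computation with plain strings (not the actual activity code; the MakeRelative fallback behaviour is the one described above):
// Illustrative sketch only, mirroring the decompiled logic described above;
// it uses plain strings instead of the real VersionControlPath helper.
using System;
using System.IO;

class DownloadFilesPathSketch
{
    static void Main()
    {
        string localPath  = @"\\ServerName\SharedFolder\BuildName\TempFolder";
        string serverItem = "$/TFS/Libraries/Foo.DLL";

        // For a folder ServerPath, MakeRelative("$/TFS/Libraries/Foo.DLL", "$/TFS/Libraries")
        // would yield "Foo.DLL" and the file would land in LocalPath\Foo.DLL.
        // For a file ServerPath, MakeRelative does not treat it as a parent and
        // (as observed above) effectively returns the server item unchanged:
        string relative = serverItem;

        string localItemPath = Path.Combine(localPath, relative);
        Console.WriteLine(localItemPath);
        // -> \\ServerName\SharedFolder\BuildName\TempFolder\$/TFS/Libraries/Foo.DLL
    }
}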
You need to grant the user running the build service write permissions on the shared folder.
http://msdn.microsoft.com/en-us/library/cc668757.aspx
There are two separate build activities: DownloadFiles for a folder ServerItem and DownloadFile for a single-file ServerItem. I'd expect it to work with DownloadFile.
