The term 'System.DefaultWorkingDirectory' is not recognized in YAML pipeline - tfs

I had a pipeline with a PowerShell task that ran some Python script. It worked without any problems.
After I converted my pipeline into YAML format to store it as code, I got something like this (a part of the whole YAML pipeline):
variables:
  Build.SyncSources: false
  REPO_PATH_DA: '/asdfg/qwerty'
  REPO_PATH_DS: '/zxcvbn/tyuio'
  PIP_REPO_HOST: 'bbbb.nnnn.yyyy.com'
  PIP_REPO_URL: 'https://$(PIP_REPO_HOST)/api/pypi/pypi/simple'
  PIP_VENV_NAME: 'my_test_venv'
  SelectedBranch: ''
  WorkingDirectory: $(System.DefaultWorkingDirectory)
…………………………………………….
- task: PowerShell@1
  displayName: 'Install package'
  inputs:
    scriptType: inlineScript
    inlineScript: |
      .\$(PIP_VENV_NAME)\Scripts\activate
      python.exe -m pip install --index-url=$(PIP_REPO_URL) --trusted-host=$(PIP_REPO_HOST) mypackage
      python.exe -m pip install --index-url=$(PIP_REPO_URL) --trusted-host=$(PIP_REPO_HOST) $(WorkingDirectory)$(REPO_PATH_DA)\qwerty
And after I run this pipeline I get an error:
##[error]System.DefaultWorkingDirectory : The term 'System.DefaultWorkingDirectory' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
I tried to change the variable reference to the format $(env:System_DefaultWorkingDirectory), but with no success. I suppose that predefined variables are not passed into the YAML pipeline. Do you have any ideas how to resolve it?

Based on my test, the same script works fine in my YAML pipeline. The $(WorkingDirectory) gets converted to a path like xxx/xx/s.
To check whether the predefined variable $(System.DefaultWorkingDirectory) has been passed to the YAML pipeline, you could add a task that lists all environment variables:
steps:
- script: SET | more
In the task log, you can search for SYSTEM_DEFAULTWORKINGDIRECTORY and check whether the variable exists.
If this environment variable exists, you could try to use the following format: $env:SYSTEM_DEFAULTWORKINGDIRECTORY
Here is the example:
variables:
  Build.SyncSources: false
  .....
  WorkingDirectory: '$env:SYSTEM_DEFAULTWORKINGDIRECTORY'
If you can't find this variable, you can also check whether there are equivalent variables, for example:
BUILD_SOURCESDIRECTORY, BUILD_REPOSITORY_LOCALPATH
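For instance, if this is a build pipeline, a minimal sketch of that mapping could look like the following (this assumes the checked-out sources are the directory you actually need; Build.SourcesDirectory is the pipeline-syntax counterpart of BUILD_SOURCESDIRECTORY):
variables:
  # sketch: point the custom variable at the sources directory instead
  WorkingDirectory: $(Build.SourcesDirectory)
In a build pipeline this typically resolves to the same agent folder as System.DefaultWorkingDirectory.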

Is this a build or a release pipeline? If it's a release, you may need to use $(Pipeline.Workspace) instead of $(System.DefaultWorkingDirectory).
Also, have you tried using $(System.DefaultWorkingDirectory) directly in your PowerShell task, instead of declaring a variable that references it?
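For reference, here is a minimal sketch of that last suggestion, based on the task from the question. It is not a verified fix; it simply inlines the predefined variable using macro syntax so the agent expands it before PowerShell runs:
- task: PowerShell@1
  displayName: 'Install package'
  inputs:
    scriptType: inlineScript
    inlineScript: |
      .\$(PIP_VENV_NAME)\Scripts\activate
      # $(System.DefaultWorkingDirectory) is replaced by the agent before this script executes
      python.exe -m pip install --index-url=$(PIP_REPO_URL) --trusted-host=$(PIP_REPO_HOST) $(System.DefaultWorkingDirectory)$(REPO_PATH_DA)\qwerty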

Related

How do I set an ENV VAR in my WORKSPACE in Bazel

I am trying to use Bazel with Pybind, and it requires that I set the following variables:
"""Repository rule for Python autoconfiguration.
`python_configure` depends on the following environment variables:
* `PYTHON_BIN_PATH`: location of python binary.
* `PYTHON_LIB_PATH`: Location of python libraries.
"""
https://github.com/pybind/pybind11_bazel/blob/master/python_configure.bzl
I don't want to have to pass them in manually when building my libraries; how can I hardcode these env vars in my WORKSPACE?
To (always) set an environment variable for repository rule consumption, you can use the --repo_env command line option. And if you want to include those flags with every invocation in your workspace, you can add them to your .bazelrc file therein.
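For example, a sketch of such a .bazelrc in the workspace root (the variable names are the ones python_configure.bzl documents; the paths are placeholders you would replace with the ones for your machine):
# .bazelrc -- applies to every build/run invocation in this workspace
build --repo_env=PYTHON_BIN_PATH=/usr/bin/python3
build --repo_env=PYTHON_LIB_PATH=/usr/lib/python3/dist-packages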
Now the wisdom of doing that could be questioned. If it's actually a project (repo) and not build host configuration, it would probably make more sense, be more targeted and more explicit, if it was an attribute of the given rule which was then checked in with the rest of the build configuration.
And looking at the name, there may be another question about specifying the Python configuration (from outside the Bazel build) instead of actually using a correctly resolved Python toolchain (but there I have to say I have no background in what the given rule is about and what it is trying to accomplish to render judgment; this is just a general comment).
To address your comment... I don't know what other factors make it "not accept" or what exactly that looks like, but if I have this mini-example:
.
├── BUILD
├── WORKSPACE
└── customrule.bzl
Where customrule.bzl reads:
def _run_me(repo_ctx):
    repo_ctx.file(
        "WORKSPACE",
        'workspace(name = "{}")\n'.format(repo_ctx.name),
        executable = False,
    )
    repo_ctx.file(
        "BUILD",
        'exports_files(["var.sh"], visibility=["//visibility:public"])',
        executable = False,
    )
    repo_ctx.file(
        "var.sh",
        "echo {}\n".format(repo_ctx.os.environ.get("var1")),
        executable = True,
    )

wsrule = repository_rule(
    implementation = _run_me,
    environ = ["var1"],
)
The WORKSPACE is:
load(":customrule.bzl", "wsrule")
wsrule(
    name = "extdep"
)
And BUILD:
sh_binary(
    name = "tgt",
    srcs = ["@extdep//:var.sh"],
)
Then I do get:
$ bazel run --repo_env var1=val1 tgt
val1
and:
$ bazel run --repo_env var1=val2 tgt
val2
I.e. this is a way to pass variables to a repo rule and it does (as such) work.
If you absolutely know you must call a build with some variable set to a certain value (which, as mentioned above, is itself a requirement worth closer examination) and you want this to be associated with the project/repo, you can always check in a build.sh or any such file that wraps your bazel call to be exactly what it must be. But again, this looks more likely not to be entirely "The Right Thing" to do or want.
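A sketch of such a wrapper, reusing the example flag from above (var1=val1 and the tgt target are just the placeholders from this answer):
#!/bin/sh
# build.sh -- checked-in wrapper that pins the repo_env value for every invocation
bazel run --repo_env var1=val1 tgt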

I am facing an issue while running a batch command to push the code from a local server to a repository (GitLab)

Here are the attached YML file and error description for more information.
Below is the YML code to execute the .bat file in GitLab:
job_1:
  tags:
    - ci
  before_script:
    - echo "This is the before_script."
    - echo "Attempting to run the WindowsCommand 35 version-application.bat file..."
    - call C:\ADM\appian-adm-versioning-client-2.5.9\version-application.bat
  script:
    - version-application.bat -action "addContents" -application_path "C:\Demo\Application Exports\ASD_App_12172019.zip" -commit_message "Sdlc application 12242019"
How can I make this job work?
The relevant error seems to be:
java.io.FileNotFoundException: version-manager.properties
(The system cannot find the file specified)
Check in the Appian documentation if/how that file should be present.
But you should at least version it and push it, in order for the gitlab-ci job to use it.

symfony/yaml backed symfony/config not parsing environment variables

I have recreated a simple example in this tiny GitHub repo. I am attempting to use symfony/dependency-injection to configure monolog/monolog to write logs to php://stderr. I am using a YAML file called services.yml to configure dependency injection.
This all works fine if my yml file looks like this:
parameters:
  log.file: 'php://stderr'
  log.level: 'DEBUG'

services:
  stream_handler:
    class: \Monolog\Handler\StreamHandler
    arguments:
      - '%log.file%'
      - '%log.level%'

  log:
    class: \Monolog\Logger
    arguments: [ 'default', ['@stream_handler'] ]
However, my goal is to read the path of the log files and the log level from environment variables, $APP_LOG and $LOG_LEVEL respectively. According to the Symfony documentation on external parameters, the correct way to do that in the services.yml file is like this:
parameters:
  log.file: '%env(APP_LOG)%'
  log.level: '%env(LOGGING_LEVEL)%'
In my sample app I verified PHP can read these environment variables with the following:
echo "Hello World!\n\n";
echo 'APP_LOG=' . (getenv('APP_LOG') ?? '__NULL__') . "\n";
echo 'LOG_LEVEL=' . (getenv('LOG_LEVEL') ?? '__NULL__') . "\n";
Which writes the following to the browser when I use my original services.yml with hard-coded values:
Hello World!
APP_LOG=php://stderr
LOG_LEVEL=debug
However, if I use the %env(VAR_NAME)% syntax in services.yml, I get the following error:
Fatal error: Uncaught UnexpectedValueException: The stream or file "env_PATH_a61e1e48db268605210ee2286597d6fb" could not be opened: failed to open stream: Permission denied in /var/www/vendor/monolog/monolog/src/Monolog/Handler/StreamHandler.php:107 Stack trace: #0 /var/www/vendor/monolog/monolog/src/Monolog/Handler/AbstractProcessingHandler.php(37): Monolog\Handler\StreamHandler->write(Array) #1 /var/www/vendor/monolog/monolog/src/Monolog/Logger.php(337): Monolog\Handler\AbstractProcessingHandler->handle(Array) #2 /var/www/vendor/monolog/monolog/src/Monolog/Logger.php(532): Monolog\Logger->addRecord(100, 'Initialized dep...', Array) #3 /var/www/html/index.php(17): Monolog\Logger->debug('Initialized dep...') #4 {main} thrown in /var/www/vendor/monolog/monolog/src/Monolog/Handler/StreamHandler.php on line 107
What am I doing wrong?
OK, you need a few things here. First of all, you need version 3.3 of Symfony, which is still in beta (3.2 was the released version when I encountered this). Second, you need to "compile" the environment variables.
Edit your composer.json with the following values and run composer update. You might need to update other dependencies. You can substitute ^3.3 with dev-master.
"symfony/config": "^3.3",
"symfony/console": "^3.3",
"symfony/dependency-injection": "^3.3",
"symfony/yaml": "^3.3",
You will likely have to do this for symfony/__WHATEVER__ if you have other symfony components.
Now, in your code, after you load your YAML configuration into your dependency container, you compile it.
So after your lines here (perhaps in bin/console):
$container = new ContainerBuilder();
$loader = new YamlFileLoader($container, new FileLocator(__DIR__ . DIRECTORY_SEPARATOR . '..'));
$loader->load('services.yml');
Do this:
$container->compile(true);
Your IDE's intellisense might tell you compile takes no parameters. That's ok. That's because compile() grabs its args indirectly via func_get_arg().
public function compile(/*$resolveEnvPlaceholders = false*/)
{
    if (1 <= func_num_args()) {
        $resolveEnvPlaceholders = func_get_arg(0);
    } else {
        . . .
}
References
Github issue where this was discussed
Pull request to add compile(true)
Using this command after loading your services.yaml file should help:
$containerBuilder->compile(true);
Your file also gets validated by the checks for proper configuration which this method performs. The parameter is $resolveEnvPlaceholders, which makes environment variables accessible to the YAML services configuration.

Trying to configure jmxeval plugin to Nagwin

I get this response from CMD after checking the jmxeval plugin during setup.
Command: check_jmxeval.bat
CMD output:
C:\Nagwin_x64\plugins>check_jmxeval.bat
org.kohsuke.args4j.CmdLineException: Argument "<filename>" is required
        at org.kohsuke.args4j.CmdLineParser.parseArgument(CmdLineParser.java:448)
        at com.adahas.tools.jmxeval.App.execute(App.java:43)
        at com.adahas.tools.jmxeval.App.main(App.java:110)
Argument "<filename>" is required
java -jar jmxeval.jar <filename> [--schema <version>] [--set (--define) <name=value>] [--validate <boolean>]
--schema <version> : set schema version
--set (--define) <name=value> : set variable name to value
--validate <boolean> : set validation true|false, default is false
I am following the steps in google code project: https://code.google.com/archive/p/jmxeval/wikis/GettingStarted.wiki
Argument "<filename>" is required
You need to pass as argument to the script the path to the xml file containing the instructions on what to check
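For example (checks\my-check.xml is a placeholder for your own jmxeval check definition file, assuming the batch wrapper forwards its arguments to jmxeval.jar):
C:\Nagwin_x64\plugins>check_jmxeval.bat checks\my-check.xml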

Integrate Specs2 results with Jenkins

I want to integrate the Specs2 test results with Jenkins.
I added the below properties in sbt:
resolver:
"maven specs2" at "http://mvnrepository.com/artifact"
libraryDependencies:
"org.specs2" %% "specs2" % "2.0-RC1" % "test",
System Property:
testOptions in Test += Tests.Setup(() => System.setProperty("specs2.outDir", "/target/specs2-reports")) //Option1
//testOptions in Test += Tests.Setup(() => System.setProperty("specs2.junit.outDir", "/target/specs2-reports")) //Option2
testOptions in Test += Tests.Argument(TestFrameworks.Specs2, "console", "junitxml")
If I run the below command, it is not generating any specs reports in the above-mentioned directory ("/target/specs2-reports").
sbt> test
If I run the below command, it is asking for the directory as shown in the below error message:
sbt> test-only -- junitxml
[error] Could not run test code.model.UserSpec: java.lang.IllegalArgumentException: junitxml requires directory to be specified, example: junitxml(directory="xxx")
And it is working only if I give the directory as shown below:
sbt> test-only -- junitxml(directory="\target\specs-reports")
But sometimes it is not generating all the specs report XMLs (sometimes generating only one report, sometimes only two reports, etc.).
If I give test-only -- junitxml(directory="\target\specs-reports") in the jenkins it is giving the below error.
[error] Not a valid key: junitxml (similar: ivy-xml)
[error] junitxml(
[error] ^
My main goal is to generate consolidated test reports in JUnit XML format and integrate them with Jenkins. Kindly help me solve my problem.
Best Regards,
Hari
The option for the junitxml output directory is "specs2.junit.outDir", and the default value is "target/test-reports".
So if you don't change anything, you can just instruct Jenkins to grab the XML files from "target/test-reports", which is what I usually do.
Also, you might have to enclose your sbt commands in Jenkins in quotes. This is what I typically do:
"test-only -- html junitxml console"
