We use ClearCase as our version control tool. While integrating a dynamic view with Jenkins through an Execute Shell build step in my job, the integration fails and throws errors.
My commands in the Execute Shell step:
/usr/atria/bin/cleartool setview johns
/usr/atria/bin/cleartool catcs
cd /vob1/pts/
ls
pwd
First, it does not identify the cleartool path and the view.
Second, it does not enter the VOB (/vob1/pts).
None of the commands work ... not even pwd and ls.
Don't use cleartool setview (as I mention in this answer): it spawns a sub-shell, which breaks every subsequent command (those commands are executed in the parent shell, for which /vobs has not been assigned to a view).
Always use the full path of the view: /view/yourView/vobs/yourVob
In your case:
cd /view/johns/vobs/pts
/usr/atria/bin/cleartool catcs
ls
pwd
If the Jenkins ClearCase Plugin is using setview anyway, then, as explained in this blog post, you must make sure it uses a -exec directive:
cleartool setview -exec /tmp/script.sh
(with /tmp/script.sh including all your other commands)
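A minimal sketch of what /tmp/script.sh could contain, reusing the commands from your question (the script itself is hypothetical):
#!/bin/sh
# Runs inside the sub-shell that setview spawns, so VOB paths resolve
/usr/atria/bin/cleartool catcs
cd /vob1/pts
ls
pwd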
Related
I am using Jenkins. When executing hg purge --all -R D:\path in a Jenkins pipeline, it returns hg: unknown command 'purge'.
But when executing the same command in the Windows terminal, it executes correctly.
Note: Jenkins is on the same machine where Mercurial exists, and the purge extension is enabled in both the main repo and the sub-repo.
How do I solve that issue?
A few wild guesses:
You may have more than one Mercurial on your host, and Jenkins may be configured to use a different instance (one without the extension) than your command prompt uses
You may run Jenkins under a different user, so a different mercurial.ini is used, one without the extension (and the per-repository config can't be read)
You can (pick any one method):
enable the extension system-wide (and disable it in the repositories where you don't want it)
replace the extension with an "old-style" Mercurial alias
forcibly enable the extension for a single command: hg purge --config extensions.purge=
From my POV, enabling the extension system-wide in mercurial.ini is the easiest and most logical solution; a sketch follows below.
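A minimal sketch of that system-wide mercurial.ini entry (purge ships with Mercurial, so nothing is needed after the = sign):
[extensions]
purge =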
I am using Jenkins to manipulate files and directories in Base ClearCase. I am executing this batch file:
cd /D M:\view\path\to\stuff\Jenkins
echo Test to see if Jenkins can add things to ClearCase> foo.txt
cleartool checkout .
cleartool mkelem foo.txt
and I get the output
M:\jenkins_dynamic\CSTS\01_Build\Automated_Build\Jenkins>cleartool checkout .
cleartool: Error: Element "." is already checked out to view "jenkins_dynamic".
M:\jenkins_dynamic\CSTS\01_Build\Automated_Build\Jenkins>cleartool mkelem foo.txt
cleartool: Error: Can't modify directory "." because it is not checked out.
What am I missing here?
Double-check your dynamic view config spec, as in this technote, with cleartool catcs:
cleartool catcs -tag jenkins_dynamic
The cause of this error relates to the current view's config_spec; it may have a -mkbranch rule or use an existing branch name for a branch type that is not mastered at the local site.
If the parent directory can't be checked out on the non-mastered branch, then new elements can't be created in that directory.
Example: this config spec (without the loading rules, since you are in a dynamic view); an illustrative sketch follows.
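A minimal, hypothetical config spec of the problematic kind (dev_branch stands in for a branch type that is not mastered at your site):
element * CHECKEDOUT
element * .../dev_branch/LATEST
element * /main/LATEST -mkbranch dev_branch
With such a spec, checking out "." tries to create dev_branch on the parent directory, which fails when that branch type's mastership is held at another replica.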
Make sure all the parent folders are accessible and visible.
That kind of error typically occurs in the context of a multi-site ClearCase setup.
I am trying to run a JMeter (.jmx) file from Jenkins, passing the number of threads as a parameter. The build succeeds, but the .jmx file does not run, and there is no error in the console. The following is my setup:
In the JMeter Thread Group properties, Number of Threads (Users) is set to ${__P(USERS,1)}
In the Jenkins job, I created a string build parameter: USER_COUNT
I build using Execute shell, and the following is my command:
cd /apache-jmeter-2.13/bin
./jmeter.sh -n -t /jmxFiles/Jbpm6Rest3Jenkins1.jmx -l /jmxFiles/SIP.jtl -JUSERS=%USER_COUNT%
When starting the build, I pass the USER_COUNT value from Jenkins.
Following is the Jenkins console output: [screenshot in the original post]
Not sure where I am going wrong.
Note: I am not using Ant/Maven to run the jmx file.
As the other answer mentioned, change %USER_COUNT% to ${USER_COUNT}.
But is there any specific reason you are not using Ant/Maven?
Even though you should be able to run your JMeter test with a simple shell script, using Ant/Maven might make your life easier when generating reports, charts, etc.; see the sketch after the links below.
I would advise you to check the links below.
http://www.testautomationguru.com/jmeter-continuous-performance-testing-part1/
http://www.testautomationguru.com/jmeter-continuous-performance-testing-part2/
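For illustration, a minimal sketch of the Maven route using the jmeter-maven-plugin (the plugin coordinates are real; the version is only an example, and your .jmx files would go under src/test/jmeter by default):
<plugin>
    <groupId>com.lazerycode.jmeter</groupId>
    <artifactId>jmeter-maven-plugin</artifactId>
    <version>2.9.0</version>
    <executions>
        <execution>
            <id>jmeter-tests</id>
            <goals>
                <goal>jmeter</goal>
            </goals>
        </execution>
    </executions>
</plugin>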
From the output, it seems you are running a shell build step ($ /bin/sh -xe ....), which means your Jenkins runs on Linux (?). The paths also use forward slashes (/).
You should put the string ${USER_COUNT} in your command (%USER_COUNT% is Windows style).
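Applied to the command from the question, the Execute shell step becomes:
cd /apache-jmeter-2.13/bin
./jmeter.sh -n -t /jmxFiles/Jbpm6Rest3Jenkins1.jmx -l /jmxFiles/SIP.jtl -JUSERS=${USER_COUNT}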
I hope this helps.
I clone a project from Git to my workspace, but I want the plugin I use (Ansible) to start in a subdirectory.
Given I have a structure like:
root
|-- dir 1
`-- dir 2
I want the plugin to run in dir 2. (The Ansible plugin seems to require the playbook to be in the root dir to work; correct me if I'm wrong. You cannot specify a path to the playbook in the call to the plugin.)
You should be able to specify the path to the playbook like this:
${WORKSPACE}/dir2/playbook.yml
This uses Jenkins' built-in variable for the workspace path and then goes relative from there.
That said, I've not used the Ansible plugin; we simply use a shell script in the Jenkins job that, for some of our repos, cds to the playbook root directory before calling the playbook, like this:
cd path/to/playbook_root
ansible-playbook -i inventory/environment playbook.yml
This seems to work fine but I'm not sure if you get some added benefit from the Ansible plugin.
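Putting both ideas together for the structure in your question (playbook.yml and the inventory path are assumptions; adjust to your actual file names):
cd "${WORKSPACE}/dir2"
ansible-playbook -i inventory/environment playbook.yml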
I'm looking at a Jenkins job and trying to understand it.
I have an Execute shell command box in my Build section:
mkdir mydir
cd mydir

svn export --force https://example.com/repo/mydir .
When Jenkins is done executing that command and moves on to the next build step, what is its working directory?
workspace-root/ or workspace-root/mydir?
As the next step, I have Invoke top-level Maven targets (still in the Build section).
What I really want to know is: why does that execute successfully?
Is it because Jenkins automatically moves back to the workspace-root/ folder after executing a shell command box, or is it because the next job is a "top-level" job, and Jenkins therefore changes back to the workspace-root/?
Each build step is a separate process that Jenkins spawns off. They don't share anything: neither the current directory nor environment variables set or changed within the build step. Each new build step starts by spawning a new process off the parent process (the one running Jenkins).
It's not that Jenkins "moves back" to $WORKSPACE; it's that Jenkins discards the previous session.
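To make that concrete, here is what two consecutive Execute shell steps would print (the workspace path /var/lib/jenkins/workspace/myjob is hypothetical):
# Build step 1: the cd only affects this step's own shell process
cd mydir
pwd    # /var/lib/jenkins/workspace/myjob/mydir

# Build step 2: a fresh process, started again in the workspace root
pwd    # /var/lib/jenkins/workspace/myjob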
I lately saw that if you print the CWD, you get the project name.
E.g
D:\jenkins\workspace\My_Project
Any script you might be running won't be found. Hence we can do a "cd path" before we start our scripts.
Slav's explanation is very good, and I thought I would complement it with a real-world example that shows what multiple Windows batch commands look like, even when they work in the same directory:
Command 1
REM #ensures that all npm packages are downloaded
cd "%WORKSPACE%"
npm install
Command 2
REM #performs a prod-mode build of the project
cd "%WORKSPACE%"
ng build --prod --aot=true --environment=pp
So, each one ensures that the current working directory points to the current project directory.