ClearCase claims directory is both checked out and not checked out - Jenkins

I am using Jenkins to manipulate files and directories in Base ClearCase. I am executing this batch file:
cd /D M:\view\path\to\stuff\Jenkins
echo Test to see if Jenkins can add things to ClearCase> foo.txt
cleartool checkout .
cleartool mkelem foo.txt
and I get the following output:
M:\jenkins_dynamic\CSTS\01_Build\Automated_Build\Jenkins>cleartool checkout .
cleartool: Error: Element "." is already checked out to view "jenkins_dynamic".
M:\jenkins_dynamic\CSTS\01_Build\Automated_Build\Jenkins>cleartool mkelem foo.txt
cleartool: Error: Can't modify directory "." because it is not checked out.
What am I missing here?

Double-check your dynamic view config spec, as in this technote, with cleartool catcs:
cleartool catcs -tag jenkins_dynamic
The cause of this error relates to the current view's config_spec; it may have a -mkbranch rule or use an existing branch name for a branch type that is not mastered at the local site.
If the parent directory can't be checked out on the non-mastered branch, then new elements can't be created in that directory.
Example: a config spec like the one in that technote (without the load rules, since you are in a dynamic view).
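As a hedged illustration only (the branch name dev_branch is a placeholder, not taken from the question), a config spec with such a -mkbranch rule typically looks like:
element * CHECKEDOUT
element * .../dev_branch/LATEST
element * /main/LATEST -mkbranch dev_branch
With a spec like this, checking out the directory selected by the /main/LATEST rule tries to create dev_branch; if that branch type is mastered at another replica, the checkout of the parent directory fails and mkelem cannot proceed.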
Make sure all the parent folders are accessible and visible.
So that kind of error can arise in a ClearCase MultiSite context.
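If MultiSite is in play, something like the following (run from inside the view, with dev_branch again a placeholder) should show who actually holds the checkout and which replica masters the branch type:
cleartool lscheckout -directory -long .
cleartool describe -long brtype:dev_branch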

Related

How to name local workspace using command line interfaces while checking out files from CVS repository by means of a tag

Right now I am using the below-mentioned CVS command line for checking out files from a CVS repository.
# Module1_1_20_2017 is the tag name.
# Test/user_Test/workload is the module name.
cvs checkout -r Module1_1_20_2017 Test/user_Test/workload
I want the contents of this Test/user_Test/workload module to be checked out into a local workspace folder named work, located at C:\Jenkins\jobs\workspace\work.
But every time I use the above command, it creates empty intermediate directories under the C:\Jenkins\jobs\workspace\work local workspace, i.e. C:\Jenkins\jobs\workspace\work\Test\user_Test\workload.
I want to get rid of the Test\user_Test\workload folders entirely: after checking out the Test/user_Test/workload module, the local workspace should look like C:\Jenkins\jobs\workspace\work (not C:\Jenkins\jobs\workspace\work\Test\user_Test\workload), and that local workspace should contain all the files of the Test/user_Test/workload module.
What CVS command line will satisfy this requirement? In short, I want to set a local name as in the Jenkins job configuration shown in the picture attached below.
Use the form cvs checkout -d <path> <module>.
In your case that is cvs checkout -d work Test/user_Test/workload
(Did cvs checkout --help not give you this answer?)
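Combining that with the tag from your original command (a sketch; the paths and names are the ones from the question), the full sequence would be something like:
cd C:\Jenkins\jobs\workspace
cvs checkout -r Module1_1_20_2017 -d work Test/user_Test/workload
After this, C:\Jenkins\jobs\workspace\work should contain the module's files directly, with no Test\user_Test\workload intermediate folders.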

Read DSL from file in Jenkins outside of workspace

I know it's possible to run a .dsl file from an external source instead of just writing the flow code in the job's description, but every time I try to run, let's say:
/home/flows/flow_script.dsl
I get the following error:
java.io.FileNotFoundException: /home/flows/flow_script.dsl (No such file or directory)
The path is correct, and I can see the file at that path from the shell, but it doesn't let me select anything outside the build's workspace, apparently.
I recently ran into this very issue: my DSL script was outside of my workspace (installed via a package). The problem is that the DSL Scripts path is an Ant-style pattern relative to the workspace, which only allows specific patterns (and not absolute paths).
My workaround is hacky, but it did work: add an Execute Shell step before the "Process Job DSLs" step that symlinks the external directory into the workspace.
Something like this:
echo "Creating a symlink from /home/flows to workspace"
ln -sf "/home/flows" .flows
Then you can set the DSL Scripts path to ".flows/flow_script.dsl".
This has some additional caveats, of course: the directory you're symlinking from will need to be accessible by the jenkins user. And it likely violates a lot of best practices.
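One extra caveat on repeated builds: if .flows already exists as a symlink to a directory, ln -sf on most Linux systems will create the new link inside the target rather than replacing it. A small guard keeps the step idempotent (a sketch, reusing the same paths as above):
# remove any stale link, then recreate it
rm -f .flows
ln -s /home/flows .flows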

Dynamic views of ClearCase not integrating in Jenkins

We use ClearCase as our version control tool. While integrating a dynamic view with Jenkins through my job's Execute Shell step, it is not getting integrated and keeps throwing errors.
My commands in the Execute Shell step:
/usr/atria/bin/cleartool setview johns
/usr/atria/bin/cleartool catcs
cd /vob1/pts/
ls
pwd
First, it is not identifying the cleartool path and view.
Second, it is not entering the VOB (/vob1/pts).
None of the commands are working, not even pwd and ls.
Don't use cleartool setview (as I mention in this answer): it spawns a sub-shell, which makes any subsequent command fail, because those commands are executed in the parent shell, which has no view set for /vobs.
Always use the full path of the view: /view/yourView/vobs/yourVob
In your case:
cd /view/johns/vob1/pts
/usr/atria/bin/cleartool catcs
ls
pwd
If the Jenkins ClearCase Plugin is using setview anyway, then, as explained in this blog post, you must make sure it uses a -exec directive:
cleartool setview -exec /tmp/script.sh
(with /tmp/script.sh including all your other commands)
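As a sketch of what that script could contain (reusing the commands from the question; /tmp/script.sh is only an example path):
#!/bin/sh
# these commands run inside the view context established by setview -exec
cd /vob1/pts
/usr/atria/bin/cleartool catcs
ls
pwd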

Jenkins and SCP

I have set up a Jenkins build and everything is working fine except the very last step.
The whole build creates a directory called: build
This directory contains a web-inf folder, and I would like to publish all the files in it via SCP to a different location, so that the content of the build/web-inf folder becomes the content of the target folder.
The settings for jenkins scp plugin are (it is a post-build step):
source: build/web-inf/**
destination: public_html/
that results in:
public_html/build/web-inf/...
but should be:
public_html/...
(the keep hierarchy box is ticked)
How can I make that happen?
EDIT
I was able to solve the problem without any additional script. The solution is so simple that my question turned out to be unnecessary.
All I did was tell Ant to copy all the web files to ./public_html instead of ./build/web-inf/, which made the Jenkins SCP step copy all files from public_html to public_html, exactly as intended.
If your goal is just to SCP files generated during the build, and the plugin doesn't seem to be working (I couldn't see anything wrong in your configuration), you can use an "Execute shell" build step and type the scp command yourself, something like the following (try it in a shell first in your job's build directory to get the syntax right):
scp -r build/web-inf/* user@host:/destination-directory

CVS checkout without all of extra folders

I want to check out a specific folder from deep within a CVS module into my Hudson / Jenkins workspace. Stripping off the other options (such as pruning, branch, etc.), the CVS command is ...
cvs checkout -d workspace module\a\b\c\d\e\f
This causes my folder to contain a child folder 'a' and that contains 'b' and that contains ... well you get the idea. All of them are empty until you get down to folder 'f'.
What I'd really like is for my folder to contain the contents of f. Does CVS support this functionality (without defining f as a module)?
And for bonus karma ... Can I get Jenkins to use this option with a .cvsrc or some other mechanism?
I don't get the behaviour you describe. When I move to an empty directory and do
cvs checkout -d fox modules/a/quick/brown/fox
I just get a new directory called fox containing the contents of the directory I requested. (Note the forward slashes.)
However, if I do
cvs checkout modules/a/quick/brown/fox
then I get what you describe.
I'm using the latest FSF build of CVS on windows, http://ftp.gnu.org/non-gnu/cvs/binary/feature/x86-woe/cvs-1-12-13a.zip .
There is a file called "modules" under your CVSROOT folder.
You can edit it and add a line like the following:
### shortcut name    actual path ###
f module/a/b/c/d/e/f
Check this file back in. Once it takes effect, you can just use
cvs checkout -d workspace f
Also, in Hudson, in the Module(s) box, you can just put f, and it should directly download only f instead of the entire structure.
Once that is done, you could rename it using a shell command.
More in general:
Go up one level above where you checked out, then:
cvs co -r "TAG"
