I would like to optimize my scp deployment, which currently copies all files, so that it only copies files that have changed since the last build. I believe this should be possible with my current setup somehow, but I don't know how to do it.
I have the following:
Project/src/blah/blah/ <---- files I am editing (mostly PHP in this case, some static assets)
Project/build <------- I have a local build step that I use to copy the files to here
I have an scp task right now that copies all of Project/build out to a remote server when I need it.
Is it possible to somehow take advantage of this extra "build" directory to accomplish what I want, meaning I only want to upload the "diff" between src/** and build/**? Is it possible to retrieve this as a fileset in Ant and then scp it?
I do realize this means that if I delete or mess around with files on the server in between, the Ant script would not notice, but that is okay for me.
You can tell the Ant scp task to only copy files that have been modified since the last push, using the <modified> selector like so:
<scp trust="true" sftp="true" ... >
    <fileset dir="${local.dir}">
        <modified>
            <param name="cache.cachefile" value="localdev.cache"/>
        </modified>
    </fileset>
</scp>
The first time you use this, it will send all files and store their signatures in the cachefile declared in the param (by default the modified selector compares a digest of the file contents rather than timestamps). After that, it will only send the modified ones.
Tested and verified in sftp mode.
I think you need to use rsync instead. I found the following article that answers your question.
In a nutshell, rsync only transfers what has changed (and will resume where it left off), and it can be tunneled over ssh.
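If you want to drive rsync from your existing Ant build, a minimal sketch is to shell out with <exec>; the server name and target path below are placeholders, not values from your setup:

<!-- sketch: push only changed files from the local build dir over ssh -->
<exec executable="rsync" failonerror="true">
    <arg value="-avz"/>
    <arg value="-e"/>
    <arg value="ssh"/>
    <arg value="${basedir}/build/"/>
    <arg value="user@yourserver:/var/www/app/"/>
</exec>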
I want to unleash the power of parallelism by running some Ivy-related Ant tasks on local Bamboo agents. Our Bamboo machine has plenty of CPU horsepower and RAM.
If I split my build task into parallel jobs, each of which produces a different artifact based on the result of ivy-retrieve, I have solved my problem in theory.
In practice, unfortunately, if two Ant tasks happen, for some reason, to run simultaneously on the same machine for the same organization/artifact, they clash and one of them gets an XML error.
I don't have the exact error message with me because 1) the problem is hard to reproduce reliably and 2) I have already done lots of work to put all the jobs into a single sequential one. But I have a clear idea of what is happening.
When Ant runs an ivy-retrieve, it uses the local user's cache directory, which happens to be /home/bamboo/.ivy2/cache. There I can find lots of resolved-[org]-[artifact]-[version].xml files (each is a different build version of my project). The problem occurs when I want to run the ivy-retrieve task twice, for example once for the compile configuration and once for runtime. The two XMLs clash and Ivy reports a SAX error while reading one of the files, because it looks like it is being written to at that moment.
If I run the job on remote agents I expect no problem, but hey I already have 5 local agents and Bamboo won't fire remote agents if the locals are free.
Unfortunately all of my jobs, independent of each other, require a different ivy-retrieve. Currently I run them in sequence.
The question is
Is it possible to tell Ivy, running on a Bamboo agent, to use a temporary, unique cache directory for its work on the dependencies.xml files rather than the global cache? Or, at the least, to synchronize access to those files?
The second option would have parallel Ant processes read and write the cached dependencies.xml file mutually exclusively, so what they read will always be a consistent file (since it is the same exact file, I don't care if one process overwrites another).
Ivy has two caches - the repository cache and the resolution cache. The second one is overwritten on each resolution and should never be used by multiple processes at the same time.
Set an environment variable pointing to a temporary directory in your Bamboo agent.
Create a separate ivysettings.xml file for your project.
Use the environment variable in the project's ivysettings.xml to set up the cache directory.
Here is an example of ivysettings.xml:
<ivysettings>
    <properties environment="env" />
    <caches resolutionCacheDir="${env.TEMP_RESOLUTION_CACHE}" />
    <settings defaultResolver="local" />
    <statuses default="development">
        <status name="release" integration="false"/>
        <status name="integration" integration="true"/>
        <status name="development" integration="true"/>
    </statuses>
    ...
</ivysettings>
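To wire this into the build, a rough sketch of the Ant side (assuming the Ivy Ant tasks are loaded under the usual antlib:org.apache.ivy.ant namespace, and that the agent exports TEMP_RESOLUTION_CACHE to a per-build temporary directory before Ant starts) could be:

<!-- sketch: point Ivy at the project-specific settings, then retrieve as usual -->
<ivy:settings file="${basedir}/ivysettings.xml"/>
<ivy:retrieve conf="compile" pattern="lib/[conf]/[artifact]-[revision].[ext]"/>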
Or you can give a lock strategy a try; I haven't tried it myself.
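If you do want to experiment with that, the caches element accepts a lockStrategy attribute; an untested sketch, so verify against the Ivy documentation for your version:

<!-- untested sketch: artifact-lock is one of Ivy's built-in lock strategies -->
<caches resolutionCacheDir="${env.TEMP_RESOLUTION_CACHE}" lockStrategy="artifact-lock" />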
I am invoking a Windows batch command from Jenkins after I get the latest version of my project from SVN. The batch command just performs some file copying once all the files are retrieved from SVN, and then runs an Ant build. In the Ant build process, I am generating a JSP file in which I have tried to capture the build information in the following fashion:
%BUILD_TAG%-%BUILD_NUMBER%-%BUILD_ID%-%SVN_REVISION%
Unfortunately none of this information is understood by the build process and it just writes %BUILD_TAG%-%BUILD_NUMBER%-%BUILD_ID%-%SVN_REVISION% into the file.
Could you please let me know if there is a way to capture this information into a file in the way I am trying to? If not, could you direct me on how this information could be captured into a JSP file during the process we are following?
BUILD_TAG, SVN_REVISION, etc. are all environment variables that are present during a Jenkins build, and to use them in Ant you reference them like any other environment variable.
First, add a line:
<property environment="env"/>
Then you can reference any environment variable with this prefix, like:
${env.VAR_NAME}
So in your case, you'd do:
${env.BUILD_TAG}-${env.BUILD_NUMBER}-${env.BUILD_ID}-${env.SVN_REVISION}
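For example, a minimal sketch of writing that string out to a file from the build (build-info.jsp is just a placeholder name; point it wherever your generated JSP should go):

<property environment="env"/>
<!-- sketch: writes the Jenkins build info into a hypothetical build-info.jsp -->
<echo file="${basedir}/build-info.jsp">${env.BUILD_TAG}-${env.BUILD_NUMBER}-${env.BUILD_ID}-${env.SVN_REVISION}</echo>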
I am trying to copy files from a remote server to local using the scp task in Ant. The thing is, I want to exclude certain files with the extension *.txt, so I tried using the exclude tag, but it seems not to work: it copies all the files, including the ones with the extension *.txt.
<scp file="username:pwd@remotemachine:/path/to/files/*" todir="copycontent" trust="true">
    <fileset dir="files">
        <exclude name="**/*.txt"/>
    </fileset>
</scp>
The Ant SCP task has some limitations for your scenario:
"FileSet only works for copying files from the local machine to a remote machine." (from the Ant SCP manual page)
The SCP element itself does not provide attributes for includes/excludes patterns
So options for selective copying from remote to local are limited; there is more flexibility for copying from local to remote (using a fileset).
Rather than excluding *.txt, you could instead include one or more file patterns, using one or more scp blocks.
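For example, a sketch that pulls only the PHP files; whether the wildcard is expanded this way depends on the remote side, so treat it as an assumption to verify, and add further scp blocks for any other extensions you need:

<!-- sketch: one scp block per pattern you want to include -->
<scp file="username:pwd@remotemachine:/path/to/files/*.php" todir="copycontent" trust="true"/>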
Or, if the local system is Unix-based, an alternative could be to exec rsync, as suggested in this answer to a similar question.
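A sketch of that approach, reusing the host, credentials and paths from the question as placeholders:

<!-- sketch: pull everything except *.txt from the remote directory -->
<exec executable="rsync" failonerror="true">
    <arg value="-avz"/>
    <arg value="--exclude=*.txt"/>
    <arg value="-e"/>
    <arg value="ssh"/>
    <arg value="username@remotemachine:/path/to/files/"/>
    <arg value="copycontent/"/>
</exec>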
I'm running an Ant build through Jenkins, and at the stage where it is deploying to a Windows share it is returning the following error:
Failed to copy FILE to FILE2 due to failed to create the parent directory for FILE2 (I've taken the paths out to keep the question shorter).
I'm guessing that there might be some problem with permissions for the default Jenkins user, but this problem has only just started occurring, and any help would be great.
Thanks
This is a pretty old question but I thought I'd come back and complete it with a short update as to what was actually going on. Someone had changed the password for the user that was logged on to the VM that Jenkins was running on, and when it tried to create the directory to put the files into, it ran into permission errors. The only problem was that the error message wasn't very descriptive.
So in the end this was an infrastructure problem rather than anything to do with the ant script.
I take it you're doing something like this:
<copy file="${from.dir}/${from.file}"
tofile="${to.dir}/${to.file}"/>
And, you're getting an error that ${to.dir} doesn't exist.
In earlier versions of Ant, you definitely had to create the directory before doing a copy:
<mkdir dir="${to.dir}"/>
<copy file="${from.dir}/${from.file}"
tofile="${to.dir}/${to.file}"/>
I think I have also noticed that later versions of Ant will create the directory for you when it doesn't exist. I've always been in the habit of putting <mkdir/> in front of any task that creates a new file in a new directory, including things like <zip/> and <tar/>.
Here are some questions:
Do your users also use Ant to run their builds? I know that this isn't always the case. Many users use Eclipse and don't bother with Ant.
Is the version of Ant your users have the same as the version Jenkins uses?
Do you do a clean? Jenkins can emulate a clean either by doing an update, reverting and removing unversioned elements, or by simply creating a brand new working directory. If your users don't remove the destination directory where ${to.file} is being placed, it might work locally, but not on Jenkins.
Can you manually run Ant from the Jenkins working directory? If so, what results do you get? (Remember to disable the Jenkins job before doing this. You don't want Jenkins to do a build while you're experimenting in the working directory.)
I need to create a war file through an Ant build without a manifest file. I want the war to be created without the manifest file.
I am using the <war> tag in build.xml to create the war.
You can use the <zip/> task with a .war extension for the destfile attribute to achieve the same result as the <war/> task (without MANIFEST.MF):
<zip destfile="..\...\WarFile.war" basedir="..\basedir" update="true"/>
In case WarFile.war already exists (although you've written "I need to create a war file"), the attribute update="true" will be of use, since it only updates the file rather than overwriting it.
All a war file is is a zip file in a specific format: libraries go in a particular place, class files go elsewhere, and so on. The <war> task has sub-elements like <classes> and <lib> that let you configure a war file correctly without knowing exactly where everything has to go.
However, you can correctly format the war yourself, and use <zip/>.
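For example, a sketch of laying the war out yourself with <zip/>; the source directories are placeholders for wherever your web content, classes and jars actually live:

<!-- sketch: build a war-shaped zip with no META-INF/MANIFEST.MF -->
<zip destfile="dist/MyApp.war">
    <fileset dir="web"/>
    <zipfileset dir="build/classes" prefix="WEB-INF/classes"/>
    <zipfileset dir="lib" prefix="WEB-INF/lib"/>
</zip>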
Why don't you want a manifest file? If you don't specify anything, the manifest will contain nothing but the Java version used for the build and the Ant version, and it won't affect the execution of your war at all.
What you can do is put useful information into the manifest. For example, we use Jenkins for our builds, and we put in the Jenkins project name and the build number, which helps us understand what was included in the war.
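For example, a sketch using the nested <manifest> element of the <war> task; the attribute names and directories here are just one way to do it, not a required convention:

<!-- sketch: record the Jenkins job name and build number in the war's manifest -->
<property environment="env"/>
<war destfile="dist/MyApp.war" webxml="web/WEB-INF/web.xml">
    <manifest>
        <attribute name="Implementation-Title" value="${env.JOB_NAME}"/>
        <attribute name="Implementation-Version" value="${env.BUILD_NUMBER}"/>
    </manifest>
    <classes dir="build/classes"/>
    <lib dir="lib"/>
</war>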
There's no reason not to use a manifest file. And, a manifest file can contain useful information (which is accessible to the Java program too).