I am trying to copy files from a remote server to my local machine using the scp task in Ant. The thing is, I want to exclude certain files with the extension *.txt, so I tried using an excludes tag, but it doesn't seem to work: all the files are copied, including the *.txt files.
<scp file="username:pwd@remotemachine:/path/to/files/*" todir="copycontent" trust="true">
<fileset dir="files" >
<exclude name="**/*.txt"/>
</fileset>
</scp>
The Ant scp task has some limitations for your scenario:
"FileSet only works for copying files from the local machine to a remote machine." (from the Ant scp manual page)
The scp element itself does not provide attributes for includes/excludes patterns.
So the options for selective copying from remote to local are limited; there is more flexibility when copying from local to remote (using a fileset).
Rather than excluding *.txt, you could instead include one or more file patterns in one or more scp blocks.
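As a sketch of that approach (the remote path and the two patterns are placeholders, and whether the remote glob expands depends on the remote shell):

```xml
<!-- hypothetical: one scp block per pattern you DO want,
     since remote-to-local scp ignores filesets -->
<scp file="username:pwd@remotemachine:/path/to/files/*.php"
     todir="copycontent" trust="true"/>
<scp file="username:pwd@remotemachine:/path/to/files/*.xml"
     todir="copycontent" trust="true"/>
```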
Alternatively, if the local system is Unix-based, you could exec rsync, as suggested in this answer to a similar question.
Related
I'm working with Jenkins 2 and trying to copy artifacts between jobs and in turn to an S3 bucket.
I have a simple web build which produces artifacts in /dist/public which I'd like to upload into the S3 bucket.
So once the job completes, I have a folder /dist in the workspace root. Jenkins gives you the ability to copy artifacts between jobs which leverages Ant's fileset.
The issue I'm having is that this is a restricted subset of Ant and all you're provided is include & exclude paths.
I can use dist/public/**/** however this copies the parent directories across also.
What I would prefer is to only copy the content of public/, but after doing some reading it seems this may be difficult to do without a custom Ant task, etc.
If you copy files with Ant, you should set:
<fileset dir="/dist/public"/>
in your copy task, or you can use the flatten attribute.
If you use the Jenkins artifact collector (as I do), I think you have to copy these files to the workspace root (see: Copy Artifact Plugin).
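For a full Ant build (outside the restricted Jenkins subset), a sketch of both variants; the target directory property is a placeholder:

```xml
<!-- copy only the contents of dist/public, without the parent dirs -->
<copy todir="${deploy.dir}">
    <fileset dir="dist/public"/>
</copy>

<!-- or keep the original fileset root and drop directory structure -->
<copy todir="${deploy.dir}" flatten="true">
    <fileset dir="dist" includes="public/**"/>
</copy>
```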
I have a project where I need to download a set of zip files from our nexus and unpack them.
I have the dependency and settings file set up and working but how do I get the files to my local directory so they can be unpacked?
/J
Found a way:
Create a copy task like:
<copy toDir="./dependencies/">
<fileset refId="local_build_deps" />
</copy>
Only problem is it copies more than I want, but at least I get the files I need.
Anyone with a better solution is welcome to respond.
/J
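As a sketch of one possible refinement (the directories and the includes pattern are illustrative, and it assumes the referenced fileset pulls in more than just the zips), you can copy and then unpack only the zips in the same build:

```xml
<!-- copy the resolved dependencies, then unpack only the zip files -->
<copy todir="./dependencies" flatten="true">
    <fileset refid="local_build_deps"/>
</copy>
<unzip dest="./dependencies/unpacked">
    <fileset dir="./dependencies" includes="*.zip"/>
</unzip>
```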
From what I've read, Ant doesn't support a fileset when downloading files from a remote machine via the scp task. It works when sending files from the local machine, but not in the remote-to-local direction; that's from the documentation. So on the remote machine I have some folders, with loads of files in every directory. I want to download all the folders and one specified file from each of them. Downloading all the files and then deleting the unneeded ones is not an option, because there are thousands of files.
So: how do I recreate all the folders from a specified directory on the remote machine (creating them on disk without their content), and then download one specified file from every remote directory into the corresponding local folder, using Ant?
Since you haven't specified, I'll assume that your local and remote systems are Unix-based and therefore support rsync and ssh.
A more cross-platform solution is challenging...
Example
Configure SSH
Generate an SSH key (specify an empty passphrase):
ssh-keygen -f rsync
This will generate 2 files, corresponding to the private and public keys:
|-- rsync
`-- rsync.pub
Install the public key on the remote server
ssh-copy-id -i rsync.pub user@remote
Test that you can now perform a password-less login, using the ssh private key to authenticate:
ssh -i rsync user@remote
Ant
The "download" target invokes rsync to copy the remote file system tree locally. If required one can additionally specify rsync exclusions (see the rsync doco).
<project name="rsync" default="download">
<target name="download">
<exec executable="rsync">
<arg line="-avz -e 'ssh -i rsync' user@remote:/path/to/data/ data"/>
</exec>
</target>
<target name="clean">
<delete dir="data"/>
</target>
</project>
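To combine this with the exclusions mentioned above, --exclude patterns can be passed straight through on the arg line (the patterns here are just examples):

```xml
<exec executable="rsync">
    <arg line="-avz --exclude='*.txt' --exclude='*.log' -e 'ssh -i rsync' user@remote:/path/to/data/ data"/>
</exec>
```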
Is there any way to use mget within Ant, without using the exec task?
Here is the rundown: I have to connect to a third-party server that does not support globbing with FTP get; the server requires the client to use mget to do a glob.
Here is my task:
<ftp server="host" userid="user" password="pass" action="get">
<fileset dir="mydir">
<include name="pdf/*_PDF.ZIP.pgp"/>
</fileset>
</ftp>
It does not return any files. When I log in directly (with the Linux command-line FTP client) I can see the files: "get *" fails but "mget *" works.
Any ideas how to get Ant to use mget instead of get?
Ant uses commons-net.jar for the FTPTask.
If you don't care about platform independence, the easiest way would be to use a specific executable with the exec task. You could check in the mget.exe along with the project so the user does not need to install it.
If you need platform independence, you will probably need to write your own FTP task. You could copy the one in the Ant source code and make the necessary modifications. You could also choose another FTP library if you want, but I think commons-net should have the necessary features.
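A sketch of the exec approach on a Unix-like system (the host name and the command file are placeholders): drive the stock command-line ftp client from a script file so that mget runs non-interactively.

```xml
<!-- ftp-commands.txt would contain something like:
       user myuser mypass
       cd pdf
       prompt
       mget *_PDF.ZIP.pgp
       bye
     -i disables per-file prompting, -n disables auto-login, -v is verbose -->
<exec executable="ftp" input="ftp-commands.txt">
    <arg line="-inv host"/>
</exec>
```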
I would like to optimize my scp deployment which currently copies all files to only copy files that have changed since the last build. I believe it should be possible with the current setup somehow, but I don't know how to do this.
I have the following:
Project/src/blah/blah/ <---- files I am editing (mostly PHP in this case, some static assets)
Project/build <------- I have a local build step that I use to copy the files to here
I have an scp task right now that copies all of Project/build out to a remote server when I need it.
Is it possible to somehow take advantage of this extra "build" directory to accomplish what I want? Meaning, I only want to upload the "diff" between src/** and build/**. Is it possible to retrieve this as a fileset in Ant and then scp that?
I do realize this means that if I delete or mess around with files on the server in between, the Ant script would not notice, but for me this is okay.
You can tell the Ant scp task to only copy files which have been modified since the last push, using the modified selector like so:
<scp trust="true" sftp="true"... >
<fileset dir="${local.dir}">
<modified>
<param name="cache.cachefile" value="localdev.cache"/>
</modified>
</fileset>
</scp>
The first time you use this, it will send all files and cache the timestamps in the cachefile declared in the param. After that, it will only send the modified ones.
Tested and verified in sftp mode.
I think you need to use rsync instead. I found the following article that answers your question.
In a nutshell, rsync will resume where it left off, and it can be tunneled over ssh.