I'm trying to use Jenkins' Publish Over SSH plugin to copy all files AND sub-directories of a given directory, but so far I've only been able to copy files, NOT directories.
I have a directory named foo in my workspace, and during the build, I want to copy everything in this directory to a remote server.
I've tried the pattern foo/**, but it doesn't copy all sub-directories.
Any suggestions? Or is this not the plugin I should be using?
Thanks
For a recursive copy of the directory you should use
foo/**/*
I verified this on my laptop with a locally deployed Jenkins. It works fine.
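For context, a minimal sketch of how the Transfer Set could look in the Publish over SSH step, assuming you want the contents of foo (not the folder itself) to land in the remote directory; the remote directory shown here is hypothetical:
Source files:     foo/**/*
Remove prefix:    foo
Remote directory: /var/www/myapp
With Remove prefix set to foo, only the contents of foo are recreated under the remote directory; leave it empty if you also want the foo folder itself on the remote side.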
I'm trying to deploy with Jenkins. I'm sending files to a server; the www folder receives the files from the root of the project, but doesn't receive any of the folders and subfolders. How can I fix this?
I thought it could be a permissions issue, so I tried chmod 777 on www, but that didn't work either.
I have configured my server (hostname, username, and remote directory). Then I created my job with a Git configuration (URL, credentials, and branch to watch); the job runs every minute. In the Build options, I set Source files to * and Exec command to run migrations and the like. When it runs, my www folder, which was empty, receives files but no folders.
I found this link that explains how to configure this properly in Jenkins: Jenkins transferring 0 files using publish over SSH plugin. That is where I discovered why Jenkins sends only files and not folders: the * should be **/* in Source files.
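In other words, the Transfer Set in the job would look roughly like this (the remote directory shown is an assumption based on the question):
Source files:     **/*
Remove prefix:    (empty)
Remote directory: www
The **/* pattern matches files at any depth, so the folder structure is recreated on the server instead of only the top-level files.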
I have a structure of artifacts in another build:
/
/bundle/docs
/bundle/bin
/bundle/bin/scripts
I want to copy all files and subdirectories from /bundle/bin into the subfolder 'product1' of the current job's workspace. I expect to see the contents of /bundle/bin in %WORKSPACE%/product1.
I've configured it like this:
Artifacts to copy: bundle/bin/**
But it creates %WORKSPACE%/product1/bundle/bin instead.
Is it possible?
Seems like that's just how the plugin works. Your options are:
Keep the same configuration and move the directories around afterwards using an sh mv (Linux) or cmd move (Windows) command; see the sketch after this list. This is the workaround used in my environment.
Check the "Flatten directories" option (but this will mix together the /bundle/bin and /bundle/bin/scripts)
Improve the plugin and contribute your code to the community :-)
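For the first option, a minimal sketch of the clean-up step on a Unix agent (the paths follow the question; on Windows you would use move and rmdir instead):
# move everything Copy Artifact placed under product1/bundle/bin up into product1,
# then drop the now-empty bundle prefix directories
mv product1/bundle/bin/* product1/
rm -rf product1/bundle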
We are setting up Continuous Integration for some projects. We are using TFS, and we now have a problem with some release definitions. We want to use the Web Deploy package created in the build process for the deployment.
So far, the build definition we have is the following:
(screenshot of the build definition)
The path for the creation of the package is the default, so we are able to find it inside the artifact directory. The problem is when we need to extract the files into the target folder for the website on the server.
The Release definition that we are using is:
(screenshot of the release definition)
In the Download artifact phase, the agent doing the release has access to the files published by the build, so we know where the .zip package is and can get its path from $(System.ArtifactsDirectory). But when we use the Deploy IIS App task, we connect to the servers we are releasing to, and $(System.ArtifactsDirectory) resolves to the local artifacts path on the agent, something like C:\agent_work\r1\a, where C: is local to the agent; the .zip file doesn't exist at that path on the target server. We also can't build a path like \\Myserver\$(System.ArtifactsDirectory)\..., because $(System.ArtifactsDirectory) is an absolute path, so the result would be \\Myserver\C:\Myfolder\....
We need another solution. We have considered creating the package in a different folder during the build; that way we always know where the package is, we don't depend on the agent folders, and we can use \\Myserver\packagefolder\file.zip as the Web Deploy package path. But we would prefer a different approach.
Is there any way to reference the artifact folder with a relative path, or something like that?
You could use the Windows Machine File Copy task to copy the package file from the agent to the servers where you are doing the release.
Use this task to copy application files and other artifacts such as PowerShell scripts and PowerShell-DSC modules that are required to install the application on Windows Machines. It uses RoboCopy, the command-line utility built for fast copying of data.
You could use a temporary folder on the agent to hold the package file, such as Build.StagingDirectory (see Build variables).
Add a package location such as /p:PackageLocation="$(Build.StagingDirectory)\\" to your MSBuild arguments. Then copy the files from the staging directory to the target folder on the remote server using the Windows Machine File Copy task. For example:
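A rough sketch of the two pieces, assuming the build produces a package named package.zip, the published artifact is called drop, and the site lives in D:\inetpub\wwwroot\MyApp on the target server (all three names are hypothetical):
MSBuild Arguments (build definition): /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:PackageLocation="$(Build.StagingDirectory)\package.zip"
Windows Machine File Copy (release definition):
Source:             $(System.ArtifactsDirectory)\drop\package.zip
Machines:           myserver.example.com
Admin Login:        an account with copy rights on the target machine
Destination Folder: D:\inetpub\wwwroot\MyApp
After the copy, the deployment step on the server can reference the package at a local path such as D:\inetpub\wwwroot\MyApp\package.zip instead of the agent's artifact directory.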
I am running Selenium test cases through Jenkins. Once the workspace is created, the src folder is also there. I do not want the src folder to be exposed to all users. How should I do this?
To restrict access by other users, revoke the RWX permissions for others (and for the group as well, if needed) on that directory and its subdirectories using an Execute shell build step:
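# recursively remove read, write, and execute for "others" on src
# (use go-rwx instead of o-rwx if you also want to drop group access)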
chmod -R o-rwx src
Jenkins itself has no role to play here.
Thanks.
I used the Workspace Cleanup plugin, and it is able to delete the workspace or specified files and folders.
The Jenkins Copy Artifact plugin follows the Ant includes attribute of a fileset.
If I give Output/**/*
it copies everything including the Output folder.
How can I tell it to copy only what is inside the Output folder, but not the Output folder itself?
Source: Output/v2.1/xxx/*.*
Destination: v2.1/xxx/*.*
The answer is: it's probably not possible with the Copy Artifact plugin.
But the same objective can be reached with another plugin, the Artifact Deployer plugin. With that plugin, when you deploy artifacts to a local or remote location, you can specify the base directory, so the artifacts are copied from the base directory without the base directory itself being recreated. In this case, if I specify my base directory as Output, then it copies what is inside the Output directory.
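Roughly, the Artifact Deployer entry would look like this; the field names are approximate and the remote location is hypothetical, so check the exact option labels in your plugin version:
Base directory:       Output
Files to deploy:      **/*
Remote file location: /deploy/myapp
With the base directory set to Output, the **/* pattern is resolved inside Output, so only its contents end up under the remote location.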