I have set up Jenkins and everything is fine. It is connected (JNLP) and builds fine.
But how can I get the build back onto the master (the server hosting Jenkins)?
One option would be to run a script on the slave/node to copy the build, but since we already have this nice JNLP connection, my first thought was to get it back through that connection. Is that possible?
Thanks in advance
Regards
christian
Usually, you'd use the artifacts mechanism to save off the results of the build (the .app, for example); Jenkins takes care of storing them for you, and another script or job can then retrieve them and take the next step.
To save them off, add a post-build action to Archive the artifacts and then give the path of the artifacts you want to save (optionally excluding some elements, etc).
When I store artifacts for iPhone builds, I usually store the -dSYM.zip and .ipa files.
If you want to use them in another build step, you can then use the Copy Artifact Plugin to copy them as a pre-build step and operate on them later (for example, if you want to manually release the .ipa and dSYM.zip files to TestFlightApp, HockeyApp, or another distribution mechanism).
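For illustration, here is a minimal sketch of that flow in Pipeline syntax; in a freestyle job the same thing is done through the "Archive the artifacts" post-build action and the Copy Artifact pre-build step. The job name, label, script, and file patterns below are assumptions, not anything from your setup:

    // Producing job: archive the outputs so Jenkins stores them on the
    // master, linked to this build number.
    pipeline {
        agent { label 'ios' }          // hypothetical slave label
        stages {
            stage('Build') {
                steps {
                    sh './build.sh'    // assumed to produce the .ipa and dSYM.zip
                    archiveArtifacts artifacts: 'build/*.ipa, build/*dSYM.zip'
                }
            }
        }
    }

    // Consuming job (needs the Copy Artifact Plugin; 'ios-build' is a
    // made-up job name): pull the last successful build's artifacts into
    // this job's workspace.
    pipeline {
        agent any
        stages {
            stage('Release') {
                steps {
                    copyArtifacts projectName: 'ios-build', selector: lastSuccessful()
                }
            }
        }
    }

Because archived artifacts live on the master, the consuming job can run on any node and still find them.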
I have a Jenkins job that gets the code from version control and builds it (like a normal pipeline does). What I was doing is: after building the project, I download the build, use FTP to transfer it to the client's server, then unzip it and copy the whole build over. Because I copy the whole build, my application's downtime is very high. (I have to use FTP because, as a service provider, we have some limitations and can't change this policy.)
What I want is for Jenkins to know what changed while it is building, so it can create a package containing only the changes, with the correct path for each file. Then I can download that package, copy it over, and apply just that package, so only what was changed gets updated.
Is that possible? Is there any plugin that I can use?
This really depends on the build tool/language you are using to build your application. I don't think there is a generic Jenkins plugin for this.
Another idea would be to upload your package to a local Nexus server, download it after the next build, and compare the files from the old and new builds. With this information you can create a patch package for your client's server.
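As a rough illustration of that comparison idea, here is a sketch of a Pipeline job with a shell step; the Nexus URL, repository, and archive names are all made up for the example:

    // Hypothetical sketch: fetch the previous package, diff it against the
    // new build, and zip only the files that are new or changed, keeping
    // their relative paths so the patch can be unzipped in place.
    pipeline {
        agent any
        stages {
            stage('Create patch package') {
                steps {
                    sh '''
                        curl -fsSL -o previous.zip "http://nexus.local/repository/builds/myapp-previous.zip"
                        mkdir -p old new
                        unzip -q previous.zip -d old
                        unzip -q build/myapp.zip -d new
                        cd new
                        find . -type f | while read f; do
                            cmp -s "$f" "../old/$f" || echo "$f"
                        done | zip ../patch.zip -@
                    '''
                }
            }
        }
    }

The client then only has to unzip patch.zip over the deployed application instead of replacing the whole build.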
I have 2 builds (A & B), which create their own artifacts which are dropped into the $(Build.ArtifactStagingDirectory) and then published to 'Visual Studio Team Services/TFS'
Everything works fine for build A, but when I try to download an artifact from build B, that artifact cannot be found. From the error message, I can see that TFS is actually looking for it in build A.
I don't want to point to a specific build number for build B; I just want to point to the latest build of B.
Anyone know how I can update the reference so that TFS is looking at build B?
If I use the 'Download Artifact' Task, I can get this to work if I point to a 'Specific Build', but it does not work if I use the option 'Current Build'
Try the steps below to achieve that:
Create two build definitions to queue Build A and Build B:
Build Definition A -- Build A
Build Definition B -- Build B
Create a release definition, add Build Definition A and Build Definition B as the artifacts source.
Trigger the release
The release works with multiple artifacts.
UPDATE1:
The Download Artifact task only works with a single artifact; it doesn't work with multiple artifacts.
Besides, why do you have to use the Download Artifact task? By default the release definition has Download Artifact enabled, which means it will download the multiple artifacts automatically; you can then use them directly in other tasks.
UPDATE2:
Since you have already linked multiple artifacts in your release definition, you have to download them to use them in subsequent phases/tasks. But based on your description, it seems you want to use the Download Artifact task to download the latest version of just one of them. That seems a bit contradictory with your requirements.
What I can think of is that you download the artifacts to a staging folder, then add a copy task to copy the artifacts you need in your phases.
Besides, if you want to download all the latest artifacts, you can try this extension: Download Artifacts.
I'm trying to understand what it does. Currently, this is the value that I see: dist/*.tgz
From what I understand, our grunt script makes a tgz file. However, I don't know what Jenkins does with it.
I got an error when I didn't specify any pattern:
ERROR: No artifacts are configured for archiving.
You probably forgot to set the file pattern, so please go back to the configuration and specify it.
If you really did mean to archive all the files in the workspace, please specify "**"
Build step 'Archive the artifacts' changed build result to FAILURE
Most importantly, it allows you to archive items from your job's workspace in a persistent and accessible way, linked to the specific build number.
E.g., if you have a job Build that compiles your sources into program.exe, archiving it linked to the build that produced it, and keeping it accessible for developers or other jobs, can come in very handy.
Additionally, archived artifacts are transferred to your Jenkins master, so your job can run on any slave, but your archived files will always be accessible, even when that particular slave is offline.
Also, with the right configuration and plugins, other projects can access archived artifacts from other projects. E.g., a job Deploy that uploads your program.exe to some location becomes as trivial as copying the archived artifact of the last successful build into its workspace for the upload.
There's quite some information on SO already, e.g. here.
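For the dist/*.tgz value above, here is a minimal sketch of the same configuration in Pipeline syntax (the grunt invocation is an assumption; in a freestyle job the pattern simply goes in the "Archive the artifacts" field):

    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    sh 'grunt build'   // assumed to produce dist/<name>.tgz
                }
            }
        }
        post {
            success {
                // Every .tgz under dist/ is stored on the master,
                // linked to this build number.
                archiveArtifacts artifacts: 'dist/*.tgz'
            }
        }
    }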
I'm using TFS 2013 on premises. I have four build agents configured on a Build machine. Several build definitions compile ASP .NET websites. I configured the msbuild parameters to deploy the IIS application to the integration server, which sits out there in Rackspace.
By default webdeploy does differential deployments by comparing file dates. In my case that's a big plus, because copying files from our network to Rackspace takes quite some time. Now, in order to preserve file dates, the build agent has to compile the same base set of source code; on every build only the changed source code yields new DLLs, minimizing the number of files deployed.
All of that works fine, with a caveat: a given build definition has to be assigned to a build agent (by agent name or tag). The problem is that I create a lot of contention when all builds assigned to the same agent are queued up; they wait in line until the previous build is done.
In an ideal world any agent should be able to take care of any build, but the source code being compiled has to be the same, regardless of the agent.
I tried changing the working folder of all agents to point to the same location but I get an error because two agents can't be mapped to the same folder. I guess there is one workspace per agent.
Any ideas?
Finally I found a way to do this. Here are all the changes you need to make:
By default the working folder of each agent is $(SystemDrive)\Builds\$(BuildAgentId)\$(BuildDefinitionPath). That means there's one working folder per BuildAgentId. I changed it so that all Agents share the same folder: $(SystemDrive)\Builds\WorkingFolder\$(BuildDefinitionPath)
By default, at runtime the workflow creates a workspace named "[BuildDefinitionId][AgentId][MachineName]". Because all agents share the same working folder, there's an error when trying to create each separate workspace. The solution is in the build definition: edit the XAML and look for an activity called "Get sources from Team Foundation Version Control". It has a property called WorkspaceName. Since I want one workspace per build definition, I set that property to BuildDetail.BuildDefinition.Name.
Save your customized build template and create a build that uses it.
Make sure the option "1. TF VersionControl/1. Clean workspace" is set to False. Otherwise the build will wipe out all the source code on every build.
Make sure the option "2. Build/3. Clean build" is set to false. Otherwise the build will wipeout the output binaries on every build.
With this setup you can queue the same build on any agent, and all of them will point to the same source code and bin output. When the source code changes, only the affected binaries are recompiled. I have a custom step in the template that deploys the output files to IIS, to all the servers in our web farm, using msdeploy.exe. Now my builds + deployments take one or two minutes, because only the DLLs or content that changed during the build are synchronized to the servers.
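For reference, the deployment step runs something along these lines; the paths, site name, server name, and credentials below are placeholders, not the actual setup. Note there is no -useCheckSum switch, so msdeploy keeps its default date-based comparison, which is what makes the differential sync work here:

    "%ProgramFiles%\IIS\Microsoft Web Deploy V3\msdeploy.exe" ^
      -verb:sync ^
      -source:contentPath="C:\Builds\WorkingFolder\MyApp\bin\_PublishedWebsites\MyApp" ^
      -dest:contentPath="Default Web Site/MyApp",computerName="webfarm-node1",userName="deploy",password="*****"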
You can't run two build agents in the same folder. The point of build agents is to run multiple builds in parallel, usually on separate PCs. If you try to run them on the same source code, then (a) it's pointless, as two builds of exactly the same source should produce identical results, and (b) they are almost certainly going to trip over each other and cause the builds to fail or produce unexpected results.
If you want to be able to build and then deploy a series of versions of your codebase, then there are two options:
if you queue up multiple builds, then the last one will "win", so the intermediate builds are of no real value. So if you check in new code before your first build completes, you may as well stop the active build and start a new one. You should be asking yourself why the build is so slow, or why you are checking in changes so often that this is necessary.
if each build produces an incremental update to the deployed result, then you need to pass the output of your builds to some deployment agent that is able to diff it against the deployed version and send only the changes to be deployed. This could be set up to gather results from multiple build agents if that would be beneficial.
But I wonder if perhaps your build is slow because you are doing a complete build each time (which cleans the build folder, gets all the sources, and does a full rebuild), when what you want is an incremental build (which gets the latest changes, compiles only what is affected, and completes quickly). Perhaps you should investigate making your build incremental.
1) I want to run my test suite on every commit, so Jenkins should poll SVN, update, and run the tests. That's possible without problems, BUT...
2) I also want to be able to create builds, which additionally involves creating zip archives and transferring them via scp to a remote server.
How is this possible within the same project?
Have you checked the Jenkins SCP Plugin? You can add an additional build step in the existing project itself which creates your builds, generates the zip archives, etc., and then, as a post-build action, publish the artifacts to an SCP repository. Will that help?
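As a rough sketch of how both requirements can live in one job, here is a Pipeline-style version; the polling schedule, script names, host, and the RELEASE parameter are all assumptions, and the scp is done as a plain shell call rather than through the SCP plugin's post-build action:

    pipeline {
        agent any
        parameters {
            booleanParam(name: 'RELEASE', defaultValue: false,
                         description: 'Also package and publish this build')
        }
        triggers { pollSCM('H/5 * * * *') }   // poll SVN every ~5 minutes
        stages {
            stage('Test') {
                steps {
                    sh './run-tests.sh'       // hypothetical test-suite entry point
                }
            }
            stage('Package and publish') {
                when { expression { params.RELEASE } }
                steps {
                    sh '''
                        zip -r build.zip dist/
                        scp build.zip deploy@remote.example.com:/var/releases/
                    '''
                }
            }
        }
    }

This way every commit triggers the tests, while the zip-and-scp stage only runs when you start the build with the RELEASE parameter set.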