As part of a release pipeline, I created a cloud load test task.
What should the path given under "Load Test files folder" be?
On deployment, it gives me an error:
ERR -> The path for the load test files
D:\a\r1\a\SourceCI\drop\LoadTestproject\bin\Release does not exist.
Please provide a valid path.
The error message indicates that the load test files cannot be found. You need to make sure that the load test files are published to the build artifact and downloaded during the release, if your release uses the build artifact. Alternatively, you can configure your release definition to use a Git artifact directly.
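If it is not obvious whether the files ever reached the agent, one hedged option is a small script step in the release, placed before the load test task, that lists what was actually downloaded. The folder names below are taken from the error message in the question; on a hosted agent, D:\a\r1\a is typically the release's $(System.DefaultWorkingDirectory), which scripts see as the SYSTEM_DEFAULTWORKINGDIRECTORY environment variable.

```python
# list_release_files.py - hedged sketch: run as a script step before the load test
# task to see which artifact files were actually downloaded to the agent.
import os

# Environment-variable form of $(System.DefaultWorkingDirectory); fallback is the
# path seen in the error message.
root = os.environ.get("SYSTEM_DEFAULTWORKINGDIRECTORY", r"D:\a\r1\a")
expected = os.path.join(root, "SourceCI", "drop", "LoadTestproject", "bin", "Release")

print("Expected load test folder:", expected, "exists:", os.path.isdir(expected))

# List everything under the working directory so the actual layout is visible in the log.
for dirpath, dirnames, filenames in os.walk(root):
    for name in filenames:
        print(os.path.join(dirpath, name))
```

Once the files show up in that listing, the Load Test files folder value can simply point at the folder they were downloaded to.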
I am getting the following error when I am queuing a build in TFS 2017.
"Microsoft.Web.Publishing.targets 3009,5): Error : Web deployment task failed. (Package file 'C:\agent_work\3\s\TestApp\TestApp\release' does not have a .zip file name extension.)"
I am not sure why I am getting this error. I have a hosted agent on the TFS server. I created the build definition according to this video:
www.youtube.com/watch?v=HjD4A-yeFTE
Does somebody have any idea? Appreciate your help!
I tested on my side and everything works correctly.
Please try the items below to narrow down the issue:
1. Check the drop folder to see whether the .zip file is actually in the place you expect.
2. Try specifying the output path, e.g. /p:PackageLocation="$(build.artifactstagingdirectory)\\", then set the Copy Root path to $(build.artifactstagingdirectory) in the Copy and Publish Build Artifacts task.
3. Explicitly specify the .zip extension in the MSBuild Arguments of the build definition, e.g. /p:PackageLocation="$(BuildConfiguration)\package.zip".
4. Check the build log to confirm that MSBuild ran correctly. You can also run the MSBuild command line locally to check whether the .zip package can be generated (see the sketch below).
5. Deploy a new agent on your development machine, create a new build definition, then build with the new agent.
If that still does not resolve the issue, share the build logs here for further troubleshooting.
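As a rough way to do the local MSBuild check in item 4, the sketch below shells out to MSBuild with the usual web-package properties and then looks for the generated .zip. It assumes msbuild.exe is on PATH; the project path and output folder are placeholders, and the properties should match whatever your build definition actually passes.

```python
# check_package.py - hedged sketch: run MSBuild locally and verify that a .zip
# web deployment package is produced. Paths below are placeholders.
import glob
import os
import subprocess
import sys

PROJECT = r"C:\src\TestApp\TestApp\TestApp.csproj"   # placeholder project file
PACKAGE_DIR = r"C:\temp\package"                     # placeholder output folder

result = subprocess.run([
    "msbuild", PROJECT,
    "/p:Configuration=Release",
    "/p:DeployOnBuild=true",
    "/p:WebPublishMethod=Package",
    "/p:PackageAsSingleFile=true",
    "/p:PackageLocation=" + PACKAGE_DIR + "\\",
])

# If MSBuild succeeded, a .zip package should now exist in the output folder.
zips = glob.glob(os.path.join(PACKAGE_DIR, "*.zip"))
if result.returncode != 0 or not zips:
    sys.exit("MSBuild did not produce a .zip package - check the MSBuild arguments.")
print("Package generated:", zips[0])
```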
I am new to Jenkins and I have tried downloading a zip archive of this workspace in Jenkins, but I only get part of it. Source folders like tensorflow or tools are not present inside the archive. Is this normal?
If so, how do I get all of them into a zip file?
Use the Archive Artifacts plugin to add your workspace to the archive, which will make it easily downloadable.
But be aware that an artifact, in the Jenkins sense, is the result of a build - the intended output of the build process.
A common convention is to put the result of a build into a build, target or bin directory.
The Jenkins archiver can use globs (target/*.jar) to easily pick up the right file even if you have a unique name per build.
Putting the complete workspace into the archive will take a lot of time.
I have been tasked with looking into using Jenkins as a build server. So far I have managed to pull a project from Git, restore the NuGet packages, build the project and run the unit tests. However, I am struggling to find out how to generate the artifact.
The business would like the build server to generate a zip file in a directory on the build server or on a remote server, which the systems team would then pick up and deploy to the relevant location. E.g. for a Windows service project, the built bin directory would be zipped up and put in the relevant artifact directory.
I thought that in order to do this I could add an 'Archive the artifacts' post-build action. However I am getting the below error:
‘Watchdog.WinService.Monitor/bin/Release/*.zip’ doesn’t match anything:
‘Watchdog.WinService.Monitor’ exists but not
‘Watchdog.WinService.Monitor/bin/Release/*.zip’
If I look in the workspace for this project I can browse to the bin directory and see all the files, so I am unsure what I have done wrong.
Can someone please let me know if what I am trying to accomplish is possible, and also if our approach to using Jenkins is correct?
The problem is that you are trying to create the artifact using the 'Archive the artifacts' step.
But that step only collects existing artifacts and shows them on the job page.
That means you need to create the zip first, e.g. using a shell or batch script (see the sketch below).
You can combine this with the Flexible Publish plugin.
When you select it as a post-build step, you can create a conditional action that runs the artifact archive task and, as its condition, executes the script that creates the zip file.
So if the script fails, the archive task won't be executed. It may also cause your job to fail, but that may not be an issue in your case.
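Here is a minimal sketch of that 'create the zip first' step, assuming it runs as a build step before the archive action. The service name and folders are taken from the error message in the question; the artifact output folder is a hypothetical choice.

```python
# make_artifact.py - hedged sketch: zip the built bin directory so a later
# 'Archive the artifacts' step (or the systems team) can pick it up.
import os
import shutil

# Placeholder paths based on the project layout in the question.
workspace = os.environ.get("WORKSPACE", ".")   # Jenkins sets WORKSPACE for each build
bin_dir = os.path.join(workspace, "Watchdog.WinService.Monitor", "bin", "Release")
out_dir = os.path.join(workspace, "artifact")  # hypothetical folder to hold the zip
os.makedirs(out_dir, exist_ok=True)

# Creates artifact/Watchdog.WinService.Monitor.zip containing the contents of bin/Release.
archive = shutil.make_archive(
    os.path.join(out_dir, "Watchdog.WinService.Monitor"), "zip", root_dir=bin_dir
)
print("Created artifact:", archive)
```

With this layout, the 'Archive the artifacts' pattern would be artifact/*.zip instead of Watchdog.WinService.Monitor/bin/Release/*.zip.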
When an ANT build step fails in my build I'd like to archive the logs in order to determine the problem. The relevant logs, however, are not located in the workspace, so I have to use a full path to them.
The standard artifact archiving feature does not work well with full paths, so first I have to copy the logs into the workspace within some build step so that I can later archive them. I do not want to incorporate the copying code into the original ANT script (it does not really belong there). On the other hand, since the failing build step fails the whole build, I can't copy the artifacts into the workspace in a separate build step, because that step is never reached.
I am considering using the ANT -keep-going option, but how would I then fail the build?
Any other ideas (artifact plugins that handle full paths gracefully, for example)?
Update: I've worked around the problem by creating a symbolic link in the workspace to the directory that contains the files to be archived. Kludgy, but effective.
I would recommend using the Flexible Publish plugin in conjunction with the Conditional BuildStep plugin.
The Flexible Publish plugin allows you to schedule build steps AFTER the build steps have normally run. This allows you to catch both successful and failed builds and execute something - say, a script that copies the files from OUTSIDE the workspace to INSIDE the workspace. The Conditional BuildStep plugin allows conditionalizing the steps so that they only run when the build fails. Using these two plugins, you can copy the files into the workspace upon failure, then archive them with the usual Jenkins mechanisms.
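The copy script itself can be very small. Below is a hedged sketch of what the conditional post-build step could run; the log directory is a placeholder, since the question does not say where the Ant logs actually live.

```python
# collect_logs.py - hedged sketch: copy logs from a fixed path outside the
# workspace into the workspace so the normal artifact archiving can pick them up.
import glob
import os
import shutil

LOG_DIR = r"C:\buildtools\logs"                 # placeholder: wherever the Ant logs are written
workspace = os.environ.get("WORKSPACE", ".")    # Jenkins sets WORKSPACE for each build
dest = os.path.join(workspace, "archived-logs")
os.makedirs(dest, exist_ok=True)

for log_file in glob.glob(os.path.join(LOG_DIR, "*.log")):
    shutil.copy2(log_file, dest)                # copy2 preserves timestamps for later inspection
    print("Copied", log_file)
```

You would then archive archived-logs/** with the standard 'Archive the artifacts' step.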
I'm using Jenkins and have the "Archive the Artifacts" step at the end of my builds to archive them into a zip file.
Instead of using this step, I'd like to use a script to push the artifacts to a remote server at the end of the build. The server I'm pushing to uses a REST API / HTTP PUT request in a script to upload files.
Note that I'm looking to access the artifact created in the same build. So if I'm on build #5, I want the artifacts from build #5, not build #4.
Is there any way to access this zip file with a script, in the same build that it was created in?
I need to upload this zip remotely and don't want to create another job to do so.
You can install one of the "Publish Over..." plugins to upload your artifacts at the end of a build.
The goal of the Publish Over plugins is to provide a consistent set of features and behaviours when sending build artifacts ... somewhere.
See also the full list of "upload" plugins for other methods of publishing your artifacts.
Like @Christopher said, you can use any of the Publish Over plugins on the Jenkins Plugins page to upload the artifact to any of the supported destinations.
If you want to access the archived zip file from within the build itself, you can use the following link to access it:
http://<server>/job/${JOB_NAME}/lastSuccessfulBuild/artifact/<artifact name w/folder>
For example:
server = myserver.com
job name = myproject
artifact = del/project.zip
Your URL would be:
http://myserver.com/job/myproject/lastSuccessfulBuild/artifact/del/project.zip
EDIT: Question was changed. In any case, this would work for accessing the artifact of the previous build in the current one.
There is no way that I have found to access the "Archive the Artifacts" package of the build that generates it. This step always occurs last in the build. Accessing the URL prior to the build ending (during the build via script for example) results in a blank zip file. To get around this limitation, I'm making a second linked build job to grab the zip and run my script to deploy it.
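As a rough sketch of what that second, linked job's deploy script might look like, the snippet below downloads the zip that the upstream job archived (using the artifact URL pattern from the earlier answer) and pushes it to the remote server with an HTTP PUT. The Jenkins host, job name, artifact path, and upload endpoint are placeholders, and the exact PUT semantics depend on your server's REST API.

```python
# deploy_artifact.py - hedged sketch for the downstream job: fetch the upstream
# build's archived zip and upload it to a remote server via HTTP PUT.
import urllib.request

# Placeholder values: replace with your Jenkins host, job, artifact path and target API.
ARTIFACT_URL = "http://myserver.com/job/myproject/lastSuccessfulBuild/artifact/del/project.zip"
UPLOAD_URL = "https://deploy.example.com/api/releases/project.zip"

# Download the archived artifact from the upstream job.
with urllib.request.urlopen(ARTIFACT_URL) as response:
    payload = response.read()

# Push it to the remote server; many REST APIs accept a raw PUT of the file body.
request = urllib.request.Request(
    UPLOAD_URL,
    data=payload,
    method="PUT",
    headers={"Content-Type": "application/zip"},
)
with urllib.request.urlopen(request) as response:
    print("Upload finished with HTTP status", response.status)
```

If the upload has to happen within the same build, the zip would have to be created by a script (as in the earlier Jenkins answers) rather than by the 'Archive the Artifacts' step, since that step always runs last.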