How do I add external files to a Jenkins job?

I'm working with Jenkins to make a build of a Visual Studio C++ project I have in a git repository. However, my project needs SDL2's external libraries and DLL, as well as some assets, and I don't upload those to GitHub.
How can I add them to my Jenkins job so it can build my project? I want to add SDL2's libs and DLL, as well as my assets folder, and place them in the job workspace in a way that doesn't require me to upload the files every time Jenkins builds my project. But I haven't found anything that clears this up for me.
Thanks!!

If your project needs assets (something like pixel art), these should probably be uploaded to GitHub along with your code. Another option is to upload the assets to some other public/private repository that Jenkins can access.
As for the SDL2 libraries and DLL, you are correct that these should not be uploaded to GitHub. Instead, I would recommend using something like Docker to package your C++ project with its dependencies. Manually installing them on the Jenkins server is also an option, but it isn't ideal because you'll have to do this on every machine where you want your code to run.
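As a rough sketch of the "install on the build machine" route, a Windows PowerShell build step could fetch the SDL2 development libraries into the workspace on demand instead of committing them to git; the SDL2 version, output folder, and architecture below are assumptions:

# Hypothetical Jenkins PowerShell build step: download SDL2's development
# libraries once per workspace instead of storing the binaries in git.
$sdlVersion = "2.0.22"  # assumed version; use the one your project targets
$zip = "SDL2-devel-$sdlVersion-VC.zip"
if (-not (Test-Path "SDL2-$sdlVersion")) {
    Invoke-WebRequest "https://www.libsdl.org/release/$zip" -OutFile $zip
    Expand-Archive $zip -DestinationPath .
}
# Put the runtime DLL next to the build output so the exe can load it.
Copy-Item "SDL2-$sdlVersion\lib\x64\SDL2.dll" "Release\" -Force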
Hope that gives you somewhere to start!

Related

How to create a package of whatever is updated after the build in Jenkins?

I have a Jenkins job that gets the code from version control and builds it (as a normal pipeline does). What I have been doing is this: after building the project, I download the build and use FTP to transfer it to the client's server, where I unzip it and then copy the whole build over. Because I copy the whole build, my application's downtime is very high. (I have to use FTP because, as a service provider, we have some limitations and can't change this policy.)
What I want is for Jenkins to know what changed while it is building, so that it creates a package containing only the changes, with each file under the correct path, and I can download that package, copy it over, and apply it so that only what changed gets updated.
Is that possible? Is there any plugin that I can use?
This really depends on the build tool/language you are using to build your application; I don't think there is a generic Jenkins plugin for this.
Another idea would be to upload your package to a local Nexus server, download it after the next build, and compare the files from the old and new builds. With this information you can create a patch package for your client's server.
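A minimal sketch of that compare step in PowerShell, assuming the old and new builds are already extracted to local folders (all paths are placeholders):

# Compare the previous and current build outputs by file hash and stage only
# new or changed files, keeping their relative paths, into a patch folder.
$old = "C:\builds\previous"; $new = "C:\builds\current"; $patch = "C:\builds\patch"
Get-ChildItem $new -Recurse -File | ForEach-Object {
    $rel = $_.FullName.Substring($new.Length + 1)
    $oldFile = Join-Path $old $rel
    if (-not (Test-Path $oldFile) -or
        (Get-FileHash $_.FullName).Hash -ne (Get-FileHash $oldFile).Hash) {
        $dest = Join-Path $patch $rel
        New-Item (Split-Path $dest) -ItemType Directory -Force | Out-Null
        Copy-Item $_.FullName $dest
    }
}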

How to set up Artifactory in a Jenkins Free Style Project?

I am using Free Style Projects (in Jenkins) to schedule a regression test.
1. Get Source From BitBucket
2. Execute Windows Batch Command.
Earlier we were allowed to upload the jar files to Bitbucket, so we did not face any issues. Presently, due to some changes in the process, we are not allowed to upload binaries, which prevents us from keeping the jars in Bitbucket.
Now they have given us an Artifactory URL to set up for Maven, but we don't have any Maven projects.
It seems that Artifactory gets populated when it is hosted locally, but we want to use the shared Artifactory.
Can anyone tell me how to set this up for a free style project when Artifactory is hosted on another machine and we only have its URL?
Thanks
Here is the documentation:
https://www.jfrog.com/confluence/display/RTF/Jenkins+Artifactory+Plug-in
I recommend using a Maven project.
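If a Maven project isn't an option, a rough sketch for a freestyle job is to pull the jars from the shared Artifactory URL in a step that runs before your batch command; Artifactory serves artifacts over plain HTTP, so a simple PowerShell download works. The repository path, jar coordinates, and token variable below are assumptions for illustration:

# Hypothetical PowerShell step run before "Execute Windows Batch Command":
# download the jar from the shared Artifactory instance into the workspace.
$base = "https://artifactory.example.com/artifactory/libs-release-local"
New-Item "lib" -ItemType Directory -Force | Out-Null
Invoke-WebRequest "$base/com/example/app/1.0/app-1.0.jar" `
    -Headers @{ Authorization = "Bearer $env:ARTIFACTORY_TOKEN" } `
    -OutFile "lib\app-1.0.jar"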

Build and Deploy a Web Application with TFS 2015 Build

We have just installed TFS 2015 (Update 1) on-premises and are trying to create a continuous integration/build system using the new TFS Build system. The build works fine and gives me a green light, but when I look at the default build, it has only built the binaries from the bin directory, and there seems to be no easy way to deploy the app to a local on-premises server.
There are two deploy options, a filesystem copy and a PowerShell script, and it would certainly be easy enough to use them to copy files to a new server; but since the build only built the binaries, I don't see a tool to gather up the web artifacts (cshtml, images, scripts, css, etc.) for this.
After an exhaustive Google search, I've found only one article that talks about this:
http://www.deliveron.com/blog/building-websites-team-foundation-build-2015/
However, this uses WebDeploy and creates a rather messy deploy package.
How can I deploy the site (standard MVC web application, in fact my tests are using the default boilerplate site created by the create project wizard) complete with artifacts to a local server in the easiest possible way? I don't want to have to install WebDeploy on the servers, and would rather use PowerShell or something to deploy the final artifacts.
The build is just the standard Visual Studio build template, with 4 steps (Build, Test, Index & Publish, Publish Build Artifacts).
We use "Visual Studio Build" step and as Arguments for MSBuild we use following line:
/p:DeployOnBuild=True /p:PublishProfile=$(DeploymentConfiguration)
On the Variables tab, DeploymentConfiguration has to be configured. It must be the name of the publish profile (the filename of the .pubxml file). If the filename is Build.pubxml, the publish profile is Build.
For example:
/p:DeployOnBuild=True /p:PublishProfile=Build
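For reference, a minimal file-system publish profile along those lines might look like this sketch; the publishUrl and site name are assumptions for illustration:

<!-- Properties\PublishProfiles\Build.pubxml (publishUrl is a placeholder) -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <WebPublishMethod>FileSystem</WebPublishMethod>
    <publishUrl>\\staging-server\wwwroot\MySite</publishUrl>
    <DeleteExistingFiles>True</DeleteExistingFiles>
  </PropertyGroup>
</Project>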
I wanted to add that Ben Day has an excellent write-up that helped us package quickly and then release to multiple environments through Release Manager.
His MSBuild arguments look like this:
/p:DeployOnBuild=True /p:DeployDefaultTarget=WebPublish /p:WebPublishMethod=FileSystem /p:DeleteExistingFiles=True /p:publishUrl=$(build.artifactstagingdirectory)\for-deploy\website
The difference between this and the accepted answer is that this parameter set stages everything in an artifacts folder, and then saves it as part of the build. We can then deploy exactly the same code repeatedly.
We capture the web.env.config files alongside the for-deploy folder and then use xdt transforms in the release process to ensure everything gets updated for whichever environment we're deploying to. It works well for all our web projects.
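As a sketch of what such a transform can look like (the file name, connection string name, and server are assumptions), a web.Staging.config applied during the release might contain:

<!-- web.Staging.config: retarget a connection string for this environment -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="Default"
         connectionString="Server=staging-sql;Database=App;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>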
We use WebDeploy/MSDeploy for 40+ applications and love it. We do install WebDeploy on all our servers so we can deploy more easily, but you could also use the Web Deploy On Demand feature, which doesn't require WebDeploy to be pre-installed.

Using Octopack on a TFS build with a website + windows service

I have a website, a Windows service, and some shared class libraries in a single Visual Studio solution. I use OctoPack on both the website and the Windows service, and on my machine these builds work as expected.
When using the TFS build server, the website NuGet package is generated as expected, but the Windows service NuGet package contains all the files from the website as well as the service. E.g., it includes the _PublishedWebsites folder as well.
This is because TFS uses a single location to build projects.
What is the best way around this?
I know this question has since been closed, but I came across this issue and solved it in a different way.
My solution comprises a number of websites and Windows services, and it had the same issue: the OctoPack-created NuGet packages included all the solution assemblies from the 'pooled' output folder when building with Team Build. The reason the NuGet packages get all the assemblies is that OctoPack uses the OutDir MSBuild argument as the location to pull assemblies from.
The way I got around it was to use the MSBuild argument GenerateProjectSpecificOutputFolder=true. This instructs Team Build to create a folder for each project in your output folder, in the same way Visual Studio uses the bin folders under each project when building locally.
My build definition's MSBuild arguments look like:
/p:GenerateProjectSpecificOutputFolder=true;RunOctoPack=true;OctoPackPublishPackageToFileShare=\\<NugetServer>
I currently just push the packages to a shared folder, but the OctoPackPublishPackageToHttp and OctoPackPublishApiKey parameters can also be used.
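For example, the HTTP variant might look like this (the feed URL and API key are placeholders):
/p:GenerateProjectSpecificOutputFolder=true;RunOctoPack=true;OctoPackPublishPackageToHttp=http://octopus.example.com/nuget/packages;OctoPackPublishApiKey=API-XXXXXXXXXXXXXXXX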
The benefit of this solution over the one above is that you don't need to specify the files to include in the NuGet package.
Hope this helps someone.
I ended up using this NuGet package to ensure the console app built to a separate directory on the TFS server:
https://nuget.org/packages/PublishedApplications/2.1.0.0
I then had to specify in the nuspec file which files should be included for the console app. A sketch of such a <files> section, with assumed file names, looks like this:
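<!-- In the console app's .nuspec: include only this project's own output -->
<files>
  <file src="bin\Release\MyService.exe" />
  <file src="bin\Release\MyService.exe.config" />
  <file src="bin\Release\*.dll" />
</files>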
This works, and I can now deploy using Octopus Deploy.
The downside of this approach is that the PublishedApplications build only works on the TFS build server, so I can't build the project locally in release mode. I'm still looking into how to overcome this.

Trouble building Orchard on a Team Foundation Service Build Controller

I am trying to set up a Team Foundation Service build of Orchard and auto-deploy to Azure. The structure of Orchard comes by default with the source and lib directories as siblings. The solution file does NOT directly reference the lib files.
.\lib\(modules)\(files)
.\src\solution.sln
.\src\(projects)\(files)
When executing a build on the Elastic Build of Team Foundation Service, the build errors report that none of the library files can be found. It appears that they are not being downloaded during the Get Source operation even though the workspace mapping is at the parent directory of lib and src. Without visibility into the build server, I cannot verify that.
Does anybody have any ideas on the cause?
Any way to force verify the lib files are downloaded for the build?
An obvious mistake on my part: some of the DLLs were excluded from source control and therefore were never downloaded. :-( A gentle reminder to verify all assumptions.
