I've been reading a few articles and watching a few videos on Jenkins. I'm wondering how easy it would be, once the master branch has been deployed to a staging server, to automatically send an email to the client giving them the URL of the staging server along with a link to "deploy live". That way the client can see the changes, make sure they're happy with them, and then deploy themselves without having to email anyone to request the release.
Does anyone have any idea how easy this would be to do with Jenkins? There may be a plugin that does this, but so far I haven't come across anything.
I saw a talk where a guy does this to notify QA of a new build to test, as well as to notify when a build is ready to be published to production.
Basically, the last automated job (the deploy-to-staging job) has a post-build step that sends an email to some address. The body of the email contains a link back to the REST API for the "deploy to production" job, and following it triggers a build.
The email recipient tests things and, if satisfied, clicks the link and Jenkins runs the production job. Obviously this requires that the recipient has some kind of access to (at the very least, the REST API of) the Jenkins instance. That said, there's no reason you couldn't set up your own system to take limited external requests and forward them to your Jenkins API.
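For what it's worth, a minimal C# sketch of such a forwarder might look like this. The Jenkins URL, job name and token are placeholders, and it assumes the production job has "Trigger builds remotely" enabled with an authentication token; depending on your Jenkins security settings you may also need credentials or a CSRF crumb.

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class DeployForwarder
    {
        // Placeholder values -- substitute your own Jenkins instance, job and token.
        const string JenkinsBaseUrl = "https://jenkins.example.com";
        const string JobName = "deploy-to-production";
        const string TriggerToken = "SECRET_TOKEN";

        // Called by your own limited external endpoint: the link in the email points
        // at your system, and your system forwards the request to Jenkins.
        static async Task TriggerProductionDeploy()
        {
            using (var client = new HttpClient())
            {
                // POST to the remote-trigger URL of the "deploy to production" job.
                var url = JenkinsBaseUrl + "/job/" + JobName + "/build?token=" + TriggerToken;
                var response = await client.PostAsync(url, null);
                response.EnsureSuccessStatusCode();
            }
        }

        static void Main()
        {
            TriggerProductionDeploy().GetAwaiter().GetResult();
        }
    }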
The video link (including time reference of the relevant part) is: https://youtu.be/3HI7mv_791k?t=3169
If you've been watching a few videos you might have already come across it, but it's quite long so you might not have watched it all.
We have a Jenkins CI platform and we want to manage our deployments with an email confirmation. A user requests that a deployment plan be started, but as a first step the plan sends a confirmation email to an administrator. If the admin clicks the link in the request, the deployment starts; otherwise it doesn't.
Is there a way to do this with Jenkins or any Jenkins plugin?
Yes, there is a way to do that; honestly, there is more than one way, and you can choose according to your preferences. What I would suggest is to consider the approach explained in this Jenkins issue: https://issues.jenkins-ci.org/browse/JENKINS-33793
Although it has the status Unresolved, as far as I can tell it should work even now, because it is just a URL.
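As a rough sketch of that idea (the addresses, host names and token below are placeholders, and depending on your Jenkins version and security settings the trigger URL may need to be POSTed rather than just clicked): the first step of the plan emails the administrator a link that points at the deployment job's remote-trigger URL, so following the link starts the deployment.

    using System.Net.Mail;

    class DeploymentConfirmation
    {
        // Placeholder values.
        const string JenkinsBaseUrl = "https://jenkins.example.com";
        const string DeployJob = "deploy-to-production";
        const string TriggerToken = "SECRET_TOKEN";

        static void SendConfirmationMail(string adminAddress, string requestedBy)
        {
            // The approval link is simply the job's remote-trigger URL.
            var approveUrl = JenkinsBaseUrl + "/job/" + DeployJob + "/build?token=" + TriggerToken;

            var message = new MailMessage("jenkins@example.com", adminAddress)
            {
                Subject = "Deployment requested by " + requestedBy + " - confirmation needed",
                Body = "A deployment has been requested by " + requestedBy + ".\n" +
                       "Follow this link to approve and start it:\n" + approveUrl
            };

            using (var smtp = new SmtpClient("smtp.example.com"))
            {
                smtp.Send(message);
            }
        }
    }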
I am developing a dashboard on top of Jenkins. The dashboard lists all the available jobs and also has a trigger button to start a build, which sends a POST request using the secret token. The problem is that every build then has the same cause, which says "Started by remote host 19.XX.XX.XX". Since the dashboard needs to display the logged-in user as the person who triggered the job, is there a way to pass a username as well in the Jenkins remote trigger URL, like below, so that Jenkins records that username as the cause?
https://jenkinsurl:port/job/testLDAP/build?token=DDJjk$###*bB&userName=abc
There is no parameter you can use for this. A workaround I've used is adding &cause=This+was+started+by+abc to the URL, which results in
Started by remote host 192.168.x.x with note: This was started by abc
Perhaps this might help you.
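If it does, here is a minimal sketch of how the dashboard could build and send that URL; the Jenkins URL, job name and token are placeholders, and the username is whatever your dashboard knows about the logged-in user.

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class DashboardTrigger
    {
        // Placeholder values -- replace with your Jenkins URL, job name and token.
        const string JenkinsBaseUrl = "https://jenkins.example.com:8080";
        const string JobName = "testLDAP";
        const string Token = "SECRET_TOKEN";

        static async Task TriggerBuild(string loggedInUser)
        {
            // URL-encode the cause text so spaces and special characters are safe.
            var cause = Uri.EscapeDataString("This was started by " + loggedInUser);
            var url = JenkinsBaseUrl + "/job/" + JobName + "/build"
                      + "?token=" + Uri.EscapeDataString(Token)
                      + "&cause=" + cause;

            using (var client = new HttpClient())
            {
                var response = await client.PostAsync(url, null);
                response.EnsureSuccessStatusCode();
            }
        }
    }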
I'm in the process of planning the development of a mail server to handle the sending of email across our multiple websites. Below is a description of what we are planning to implement, and I'd like your opinion/suggestions.
We use ASP.NET MVC and have many websites hosted on Azure. We currently send mail internally within each of the web applications using SMTPServer.Send(). Obviously this is not the ideal way to send emails when you have a decently busy set of websites, because the send-mail call is blocking and cannot guarantee mails are sent. With this, I'm worried about getting an influx of mail requests when we launch our next website (we think it'll get a decent amount of traffic and lots of emails will be sent).
My plan of action: build a centralised mail server that runs in the background (we use Azure, and this will simply be another web application). When one of our web applications wants to send a mail, instead of doing it internally, it'll call a web method on the mail server called sendMail(); this function will accept certain parameters and insert the mail parameters, content, etc. into a database. The mail server will then poll the database at fixed intervals, select a set of unsent emails, and attempt to send them using the same SMTPServer.Send() function. If an email fails for some reason, we won't flag it as sent, so in the next poll interval it will be selected again and another send attempt will be made (we will cap the number of send attempts at, say, 20).
This will allow each of the websites to run smoothly without loads of blocking send-mail calls, and the mail server will handle all the sending sequentially and in a controlled environment as a separate standalone web application.
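Roughly, the polling loop I have in mind looks like the sketch below; the SMTP host is a placeholder and the queue table sits behind a made-up IMailQueueStore so the retry logic stays visible.

    using System.Collections.Generic;
    using System.Net.Mail;

    // Hypothetical row from the mail queue table.
    class QueuedMail
    {
        public int Id;
        public string From, To, Subject, Body;
    }

    // Hypothetical data-access layer over the mail queue table.
    interface IMailQueueStore
    {
        IEnumerable<QueuedMail> GetUnsent(int maxAttempts, int batchSize);
        void MarkSent(int id);
        void IncrementAttempts(int id);
    }

    class MailPoller
    {
        const int MaxAttempts = 20;   // cap on send attempts

        // Runs once per polling interval (e.g. from a timer or a scheduled job).
        static void ProcessPendingMail(IMailQueueStore store)
        {
            using (var smtp = new SmtpClient("smtp.example.com"))   // placeholder SMTP host
            {
                foreach (var mail in store.GetUnsent(MaxAttempts, 50))
                {
                    try
                    {
                        smtp.Send(new MailMessage(mail.From, mail.To, mail.Subject, mail.Body));
                        store.MarkSent(mail.Id);
                    }
                    catch (SmtpException)
                    {
                        // Leave it unsent so the next poll retries, but record the attempt
                        // so we eventually stop at MaxAttempts.
                        store.IncrementAttempts(mail.Id);
                    }
                }
            }
        }
    }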
Thanks in advance!
Looks like a good design. I don't know the entire scenario that led you to build something like an email server; this problem has also been solved well by existing services such as Office 365.
Your design is good. My suggestions would be the following:
You can use Azure WebJobs to build the polling agent. Run it as a scheduled WebJob that does the polling and sends the mail; it can be written very cleanly as a simple console app.
You can use an Azure API App to build the SendMail() call, and use Azure AD auth on the API (via the Authentication and Authorization feature) to authenticate callers and easily secure your email server. You can also enable CORS so that you can receive and process requests from your other websites.
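To make that concrete, the API side could be little more than a controller that drops the request into the mail queue table; everything the websites need is a single POST. The model and store below are hypothetical names of mine, not part of any Azure SDK.

    using System.Web.Http;

    // Hypothetical request model for the sendMail() call.
    public class SendMailRequest
    {
        public string From { get; set; }
        public string To { get; set; }
        public string Subject { get; set; }
        public string Body { get; set; }
    }

    // Hypothetical access layer over the mail queue table.
    public interface IMailQueueStore
    {
        void Enqueue(string from, string to, string subject, string body);
    }

    public class MailController : ApiController
    {
        private readonly IMailQueueStore _store;

        public MailController(IMailQueueStore store) { _store = store; }

        // POST api/mail -- called by the websites instead of sending SMTP themselves.
        [HttpPost]
        public IHttpActionResult SendMail(SendMailRequest request)
        {
            if (request == null || string.IsNullOrWhiteSpace(request.To))
                return BadRequest("A recipient is required.");

            // Only enqueue here; the scheduled WebJob does the actual sending.
            _store.Enqueue(request.From, request.To, request.Subject, request.Body);
            return Ok();
        }
    }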
Some issues I foresee for you:
Volume and scaling: you can only process so much email between polls. If that isn't enough, you will need to add another polling agent, which complicates things because the agents have to make sure they pick different sets of emails to send. If your volume is going to be low, you should be fine.
Challenge: why can't the websites send the mail themselves, and then record it in the database for tracking? All you would have to do is build a module or component that they use on their web pages to create and send the mail. Polymer 1.0 works well for this scenario.
Hope this helps to get you started.
Is it possible to setup TFS/Test Manager so that it sends out an email after a test fails?
Yes, it is possible but it requires quite a lot of changes/additions to the process template and possibly a custom-made activity.
After the tests have run, we check whether BuildDetail.BuildPhaseStatus has the status Failed.
We send mail to everyone who has changesets committed to this build, so the activity goes through BuildDetail.AssociatedChangesets (you need to have AssociateChangesetsAndWorkItems on) and gets the committer usernames.
Unfortunately for us, there's no good correlation between TFS username and email address at our place, so we had to create a custom activity that looks that up in the AD.
The actual email is sent with the BuildReport action from Community TFS Build Extensions. We modified the xslt, but that's not really necessary. We also wanted to include a listing of the failed tests, and that required modification of the action itself (test data isn't included by default).
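For reference, the changeset/committer part looks roughly like this against the build API (simplified, not our exact activity; the same information that the AssociatedChangesets workflow variable exposes can be read via InformationNodeConverters, and the AD lookup is only stubbed out):

    using System.Collections.Generic;
    using Microsoft.TeamFoundation.Build.Client;

    static class FailureNotification
    {
        // Collect the usernames of everyone with changesets associated with this build.
        // Requires AssociateChangesetsAndWorkItems to be enabled in the build definition.
        public static IEnumerable<string> GetCommitters(IBuildDetail buildDetail)
        {
            var committers = new HashSet<string>();
            foreach (IChangesetSummary changeset in
                     InformationNodeConverters.GetAssociatedChangesets(buildDetail))
            {
                committers.Add(changeset.CheckedInBy);
            }
            return committers;
        }

        // In our case the TFS username had to be resolved to an email address via AD;
        // that lookup is environment-specific, so it is only stubbed here.
        public static string ResolveEmail(string tfsUserName)
        {
            return tfsUserName + "@example.com";   // placeholder
        }
    }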
Looking at this description and all the work it took to get this working, I'm beginning to wonder if it was worth it ;).
I am using the TFS 2010 build service. I need to send an email if a build is running for too long.
For example: suppose a build normally runs for 10 minutes; if it has now been running for more than 20 minutes, I need to send an email notification.
May I have your help on this?
This functionality is not available out of the box. It could, however, make a great feature request; raise it for consideration here: http://visualstudio.uservoice.com/forums/121579-visual-studio
However, to get this to work, here is what you can do: write a TFS build activity which uses the TFS API to extract the last build's execution time, and insert it at various places in the process workflow, ideally before and after each workflow task, to check how much time the build has already consumed against the expected time. Then use the email notification task to send out an email accordingly.
Here is an example that shows you how to get the last build's details: http://blogs.microsoft.co.il/blogs/shair/archive/2011/01/11/tfs-api-part-33-get-build-definitions-and-build-details.aspx and here is a custom task example: http://msdn.microsoft.com/en-us/library/t9883dzc.aspx
Alternatively, query the TFS build queue and check the runtime of the builds in progress. When any of the builds exceeds the defined threshold, send the email. This can be done in a Windows service with relative ease.
You'd use the TFS Client Object Model to query the builds; Tarun has already provided a nice link for that.
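A rough sketch of that query, from memory, is below; the collection URL, project name, SMTP details and threshold are placeholders, and in a real service you would also want to avoid re-sending the mail on every check.

    using System;
    using System.Net.Mail;
    using Microsoft.TeamFoundation.Build.Client;
    using Microsoft.TeamFoundation.Client;

    class LongRunningBuildMonitor
    {
        static readonly TimeSpan Threshold = TimeSpan.FromMinutes(20);   // placeholder threshold

        // Call this periodically from the Windows service.
        static void CheckBuilds()
        {
            var collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
                new Uri("http://tfsserver:8080/tfs/DefaultCollection"));   // placeholder URL
            var buildServer = collection.GetService<IBuildServer>();

            // Everything currently queued or running for the team project.
            IQueuedBuildSpec spec = buildServer.CreateBuildQueueSpec("MyTeamProject");
            IQueuedBuildQueryResult result = buildServer.QueryQueuedBuilds(spec);

            foreach (IQueuedBuild queued in result.QueuedBuilds)
            {
                if (queued.Status != QueueStatus.InProgress)
                    continue;

                // Build details may not be available until an agent has picked the build up.
                DateTime started = queued.Build != null ? queued.Build.StartTime : queued.QueueTime;

                if (DateTime.Now - started > Threshold)
                {
                    new SmtpClient("smtp.example.com").Send(   // placeholder SMTP host and addresses
                        "tfs@example.com", "team@example.com",
                        "Build running longer than expected",
                        queued.BuildDefinition.Name + " has been running for more than "
                            + Threshold.TotalMinutes + " minutes.");
                }
            }
        }
    }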