Tool to mine Plastic SCM data

Is there any external tool to mine Plastic SCM data? I need to identify hot spots in my projects, i.e. files that go through frequent changes. If not, is it possible to query this information in a structured way from Plastic SCM itself?

I'm not aware of any external tools to mine Plastic SCM data, but depending on your needs I'm almost sure you can use the "cm find" command to get what you need.
The "cm find" command provides a very flexible way to extract Plastic SCM information from the repository; here is a guide with plenty of examples: https://www.plasticscm.com/documentation/cmfind/plastic-scm-version-control-query-system-guide.shtml
Hope it helps!

Related

Is there a way to set/change the changeSet (changelog) content from pipeline script? Needed for preflight type of job

I have a preflight job using Perforce in which I retrieve a branch, unshelve (apply) a given changelist on it, and then build to validate that the change in question has not broken the build. Very similar to what you would do for a GitHub pull request type of CI.
I use the official checkout() pipeline call to get the branch, as it simplifies dealing with the Perforce credentials, and that causes the Jenkins build to include the changelog of that branch in the build. However, those entries are of no interest to me; what I care about is the changelist I am unshelving on top of that branch.
Can I, from the pipeline script, clear and fill currentBuild.changeSet? If so, does someone have an example, and which fields can I set under currentBuild.changeSet.items?
Or is this only possible by going down the plugin road, the same way the p4/git plugins do it?
My advice: don't play with currentBuild.changeSet. It also contains the changesets of the shared libraries you are using. I personally don't rely on it anymore.
However, here is an article on how to access the changeSet:
https://support.cloudbees.com/hc/en-us/articles/217630098-How-to-access-Changelogs-in-a-Pipeline-Job-
Here is an example of how to implement that in a pipeline:
https://issues.jenkins-ci.org/browse/JENKINS-58441
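For reference, here is roughly what reading (not writing) those change sets from a scripted pipeline looks like, following the pattern in the CloudBees article; as far as I know, currentBuild.changeSets is effectively read-only from pipeline code:

    // Iterate over what Jenkins recorded for this build and print it
    def changeLogSets = currentBuild.changeSets   // one set per checkout, including shared libraries
    for (int i = 0; i < changeLogSets.size(); i++) {
        def entries = changeLogSets[i].items
        for (int j = 0; j < entries.length; j++) {
            def entry = entries[j]
            echo "${entry.commitId} by ${entry.author} on ${new Date(entry.timestamp)}: ${entry.msg}"
            def files = new ArrayList(entry.affectedFiles)
            for (int k = 0; k < files.size(); k++) {
                def file = files[k]
                echo "  ${file.editType.name} ${file.path}"
            }
        }
    }

These objects are populated by the SCM plugins during checkout(), which is why replacing them cleanly really means going down the plugin route, or sidestepping the problem with an external dashboard as described below.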
Finally, in an ideal world, don't share your Jenkins with management or with non-developers/testers; share only a dashboard connected to a database that you fill with the relevant information you need. I use InfluxDB + Grafana for that, together with the InfluxDB plugin.

How to do continuous delivery with Jenkins?

I have been working for a company for a couple of weeks now. The build process is done mostly manually and takes several hours spread over several days. The languages in use are C#, COBOL, Delphi, Visual Basic 6, and of course the database with T-SQL. For version control we use Apache Subversion (SVN), except for the COBOL code and the documentation, which are kept in Microsoft Visual SourceSafe (VSS). My idea is to improve the process using a continuous delivery tool. Do you think Jenkins would do the job?
Thank you for your reply.
Jenkins is undoubtedly a tool that can help with CI/CD.
Whether it is the right tool for your particular needs is something you should be able to determine by doing your own research into the capabilities of Jenkins and the tooling it supports. You may find that you struggle to find adequate support for the older technologies you mention, and you will likely need to uplift some of that legacy to make it usefully available to any viable, modern CI/CD tool.
E.g. get your code out of SourceSafe. You should do that anyway, because... SourceSafe. :)
Don't get bogged down in how to migrate your history. Just shutter SourceSafe (make it read-only) to keep it as a reference to your history, and move the tip/head into a new repo (SVN if you have to, though I'd highly recommend Git).
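As a rough illustration of that last step (the folder names are made up, and the one-time export from VSS is whatever recursive Get you already use):

    rem One-time export of the current VSS head into a clean folder, then VSS goes read-only
    cd C:\export\Project

    rem Start the new history from that snapshot
    git init
    git add .
    git commit -m "Initial import: head of the old SourceSafe project"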
More generally, I would be surprised if you could not find some immediate quick-win improvements that can be made, without needing to invest time/effort/money into a "Silver Bullet" tool, just by putting some scripting in place to automate current manual processes.
Jenkins is definitely the right tool. We use Jenkins as a CI tool for building our Delphi (+ DUnit + InnoSetup), C#, and Cordova/PhoneGap applications (all code in SVN).
I don't know the dependencies between the code in SVN and VSS, but if they depend on each other, I would advise putting all the code in a single SVN or Git repository.
There are some simple examples of integrating Delphi with Jenkins; see the following links:
https://community.embarcadero.com/blogs/entry/continuous-integration-with-svn-jenkins-and-dunit-delphi-with-craig-chapman
http://www.ictexpertise.com/blog/2016/02/10/continuous-integration-of-delphi-project-with-jenkins/
http://chapmanworld.com/2015/01/18/use-radstudio-with-jenkins-no-plugin/
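The gist of the last link ("no plugin") is that RAD Studio projects are MSBuild projects, so a Jenkins job can build them with a plain Windows batch step along these lines; the RAD Studio path/version, project name, and configuration are examples, not fixed values:

    rem Load the RAD Studio build environment (adjust the path to your installed version)
    call "C:\Program Files (x86)\Embarcadero\Studio\20.0\bin\rsvars.bat"

    rem Build the Delphi project the same way the IDE does
    msbuild MyApp.dproj /t:Build /p:Config=Release /p:Platform=Win32

DUnit test runners and the Inno Setup compiler can be driven from additional batch steps in the same job.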

Using Mercurial locally with TFS Team Foundation Version Control Server Workspace

At work we are using TFS Team Foundation Version Control (TFVC) and the workspace is a server workspace (very large codebase). The limitation of our setup is that files I check out are locked for edit by other people. There is also a culture of not committing until work is complete, as many changesets complicate merging later.
I am in no position to change the global rules or culture. I would like to set up a Mercurial (hg) repo locally on my machine. The idea is that I can work on my local copy and make as many check-ins to hg as I like. When I am done, I would like to bundle my changes into one changeset and send it off to the TFS workspace (also on my local machine), then immediately check the changes in to the TFS server.
That way, to the outside world I appear to check out and then immediately check in all of my code, only briefly locking the files I changed. But locally, in hg, I get the full ability to make small check-ins and work without worrying about locking files out for edit.
In short: somehow chain two version control systems, giving me the flexibility of hg locally while continuing to use the global TFVC for final check-ins.
Any ideas on how this could be achieved?
You can use git-tf and hg-git. When we built git-tf, supporting this scenario was an intentional design decision.
That said... this seems a bit... icky.
You may want to write a few shell scripts to make this workflow a little bit easier.
But even with that, it's hard to imagine troubleshooting this when something inevitably goes wrong.
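To make the shape of it concrete, here is a very rough, untested sketch; the server URL, paths, and branch name are placeholders, and the exact hg-git and git-tf invocations depend on the versions you have installed:

    # 1. Bridge TFVC <-> Git locally with git-tf
    git tf clone http://tfs:8080/tfs/DefaultCollection $/BigProject bridge

    # 2. Work in Mercurial on top of that Git repo via hg-git
    hg clone bridge work        # hg-git converts the Git history into hg changesets
    cd work
    # ... as many small local hg commits as you like ...
    hg push                     # push the accumulated hg work back into the Git bridge

    # 3. Send everything to TFS in one go
    cd ../bridge
    git merge <branch-pushed-by-hg-git>
    git tf checkin              # shallow checkin (the default, I believe) rolls it into a single TFS changeset

Steps 2 and 3 are exactly what those few shell scripts would wrap.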
TFS doesn't have Mercurial support, but apparently does have Git support.
You can use the hg-git plugin to access TFS this way.
More details about the lack of support:
https://hglabhq.com/blog/2014/1/17/mercurial-support-in-tfs-declined
https://visualstudio.uservoice.com/forums/121579-visual-studio/suggestions/3607357-add-mercurial-support-to-team-foundation-server

TFS Branching & Merging Strategies

I have a Team Project in TFS where tasks are submitted daily. I would like to work on each task independently and then merge it into the main line after testing.
Currently there is a MAIN branch and a DEV branch which is a child of MAIN. Changes are worked on in the DEV branch and then merged into MAIN when they are ready. This is done via a "cherry-pick" merge. I've been reading everywhere that cherry-pick merges are bad and you should avoid them whenever possible.
I am having trouble wrapping my head around branching and merging in TFS and was wondering if anyone had any suggestions on how to achieve this goal in TFS without having to do cherry pick merges.
Any help is appreciated.
If I left out any key information please leave a comment and I will edit my post.
I think this Codeplex documentation will be a big help:
http://tfsbranchingguideiii.codeplex.com/
The download has several PDFs that outline different scenarios and strategies and give excellent Q&A on different approaches.
The key for your scenario would be to merge all changes up to a specified version from Dev to Main. Run all tests each time code is checked into Dev (and have developers get the latest Dev code and run all tests before checking in). Ideally, if the build in the Dev branch succeeds after a Dev check-in, merge into Main. Merge frequently from Dev to Main, and run all the tests in Main after each check-in.
So even though developers work individually on specific pieces, once they check into the Dev branch they are essentially saying "this code is ready to integrate." And when merging from Dev to Main, you no longer deal with specific pieces--you merge the whole enchilada. If Developers need source control for work-in-process code, they should use TFS shelvesets and wait to check into Dev until they are "done."
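Concretely, "merge all changes up to a specified version" is an ordinary (non-cherry-pick) tf merge with a version spec; the server paths and changeset number below are placeholders:

    rem Merge everything in Dev up to and including changeset 1234 into Main
    tf merge $/TeamProject/Dev $/TeamProject/Main /recursive /version:C1234

    rem Resolve any conflicts, run the tests in Main, then commit the merge
    tf checkin /comment:"Merge Dev -> Main up to C1234"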
You might find Timpani Software's MergeMagician tool interesting. It is a branch management and automated merging solution that works with TFS (and also Subversion). You create publish/subscribe relationships between branches, and then the server automates the merges.
MergeMagician can be used to implement all of the patterns discussed in the TFS Branching Guide that Shawn mentioned.
FYI, it is a commercial tool. I don't know of any open source tools that do anything like this that work with TFS.
Check it out at http://www.timpanisoftware.com. There's a good overview video on the home page.

Kiln integration with JIRA

We are happy with JIRA and there is no willingness to move away from it. At present we have JIRA integrated well with Perforce. However, we are considering moving to Kiln.
Losing the integration with JIRA would be a blocker.
Kiln has an API that could be used to integrate with JIRA, but I would also consider using Fisheye (from Atlassian) to interact directly with the Mercurial repositories used by Kiln. You could also use the JIRA Mercurial plugin that I wrote to interact with those repositories if you didn't want Fisheye for some reason.
This is an area where I know Atlassian is interested in finding out what people want. If you want to drop me a note about this, I can forward it to their Dev Tools group.
Since Kiln uses Mercurial under the hood, you might also want to consider Atlassian's BitBucket, which is a hosted Mercurial repository.
