Why does the Jenkins API have packages for jenkins and hudson?

I'm trying to get into Groovy scripting in Jenkins, but there seem to be no docs about this, and the API is split between the hudson and jenkins packages. I understand that Hudson is Jenkins' former name, and my first guess is that the devs didn't rename the old packages but used the new name for newer code - effectively creating a mess. Is this true, or am I missing something?

You are somewhat right. Jenkins originated from Hudson, so the old packages and classes were left as they are, for two obvious reasons:
To support the legacy classes and code written and used all over the world. Otherwise every developer would have had to either migrate their code or litter it with things like @Deprecated and @SuppressWarnings.
Renaming or restructuring all the classes for Jenkins would take a long time. Even the Java platform avoids such things and only restructures code when it is absolutely necessary - as with the Dictionary and Vector classes, or the Collections framework when generics were introduced.
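You can see the split directly in the Jenkins script console. A minimal Groovy sketch (these are standard Jenkins core classes; the output will vary with your installation):

    import jenkins.model.Jenkins   // the entry point lives in the newer jenkins.* namespace
    import hudson.model.Job        // ...but most model classes kept their hudson.* names

    // Jobs are fetched through the jenkins.* API, yet their concrete types
    // are still the original hudson.* classes, e.g. hudson.model.FreeStyleProject.
    Jenkins.instance.getAllItems(Job).each { job ->
        println "${job.fullName} -> ${job.getClass().name}"
    }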
You can also refer to this page for more answers: How to choose between Hudson and Jenkins?

Related

Multiple jandex maven plugin (JBoss vs SmallRye)

I use Weld in a Java SE application, and I was wondering whether generating the Jandex index at build time with a plugin would improve startup.
For now, I didn't notice a performance improvement.
But I found two plugins for generating the index:
https://github.com/smallrye/jandex
and
https://github.com/wildfly/jandex-maven-plugin
Any idea which one to use?
Thanks!
"For now, I didn't notice a performance improvement."
This would only be noticeable for large deployments. The reasoning is that a Jandex index makes it possible to skip the reflection-based discovery phase and instead lets Weld browse a pre-built index.
That being said, there is no harm in using Jandex even on smaller deployments, it's just that the difference won't really be noticeable.
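To make the startup path concrete, here is a minimal Weld SE bootstrap sketch; the index only changes what happens inside initialize(), not the API you call:

    import org.jboss.weld.environment.se.Weld;
    import org.jboss.weld.environment.se.WeldContainer;

    public class Main {
        public static void main(String[] args) {
            // Bean discovery happens inside initialize(). When a META-INF/jandex.idx
            // is present (and Jandex is on the classpath), Weld reads the pre-built
            // index instead of scanning class files reflectively - that scan is
            // where large deployments lose startup time.
            try (WeldContainer container = new Weld().initialize()) {
                // look up beans here, e.g. container.select(MyService.class).get()
                // (MyService being a hypothetical bean in your deployment)
            }
        }
    }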
"Any idea which one to use?"
The short answer: as of July 2022, if you want the Maven plugin variant, use the WildFly one. If you want the core artifact, use the one with the org.jboss.jandex group ID.
The longer answer: the SmallRye repo is the original one, recently migrated from https://github.com/wildfly/jandex to https://github.com/smallrye/jandex.
It holds the sources of what used to be the org.jboss.jandex group ID (and is now io.smallrye). It also has a Maven plugin module, but there is no public release of it yet, and it will come with a major version bump as well. Going forward, there will therefore be a group ID change, but otherwise the artifact ID stays the same and everything will be hosted in one repo.
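For reference, wiring in the WildFly-hosted plugin looks roughly like this (a sketch following its documented usage; check the repo for the current version number):

    <plugin>
      <!-- the 1.x plugin releases use the org.jboss.jandex group ID -->
      <groupId>org.jboss.jandex</groupId>
      <artifactId>jandex-maven-plugin</artifactId>
      <version>1.2.3</version>
      <executions>
        <execution>
          <id>make-index</id>
          <goals>
            <!-- writes META-INF/jandex.idx into the build output -->
            <goal>jandex</goal>
          </goals>
        </execution>
      </executions>
    </plugin>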

How to do continuous delivery with Jenkins?

I have been working at a company for a couple of weeks now. The build process is done mostly manually and takes several hours spread over several days. The languages in use are C#, COBOL, Delphi, Visual Basic 6, and of course the database with T-SQL. For version control, we use Apache Subversion (SVN), except for the COBOL code and the documentation, which are kept in Microsoft Visual SourceSafe (VSS). I have the idea of improving the process using a continuous delivery tool. Do you think that Jenkins would do the job?
Thank you for your reply.
Jenkins is undoubtedly a tool that can help with CI/CD.
Whether it is the right tool for your particular needs is something you should be able to determine by doing your own research into the capabilities of Jenkins and the tooling it supports. You may find that you struggle to find adequate support for the older technologies you mention, and you will likely find that you need to uplift some of that legacy to make it usefully available to any viable, modern CI/CD tool.
E.g., get your code out of SourceSafe. You should do that anyway because... SourceSafe. :)
Don't get bogged down in how to migrate your history. Just shutter SourceSafe (make it read-only) to retain it as a reference to your history, and move tip/head into a new repo (SVN if you have to, though I'd highly recommend Git).
More generally, I would be surprised if you could not find some immediate quick-win improvements that can be made, without needing to invest time/effort/money into a "Silver Bullet" tool, just by putting some scripting in place to automate current manual processes.
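As an illustration of that last point: once the manual steps live in scripts, a tool like Jenkins mostly just strings them together. A hedged sketch of a declarative pipeline (the stage names and script paths are placeholders for whatever your manual process becomes):

    pipeline {
        agent any
        stages {
            stage('Checkout') { steps { checkout scm } }
            // each stage just wraps a script that captures one of today's manual steps
            stage('Build')    { steps { bat 'build\\compile_all.cmd' } }
            stage('Database') { steps { bat 'build\\apply_tsql_scripts.cmd' } }
            stage('Package')  { steps { archiveArtifacts artifacts: 'dist/**' } }
        }
    }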
Jenkins is definitely the right tool. We use Jenkins as a CI tool for building our Delphi (+Dunit+Innosetup), C# and Cordova/PhoneGap applications (all code in SVN).
I have no idea about the dependencies between the code in SVN and VSS, but if they depend on each other, I would advise putting all the code into a single SVN or Git repository.
There are some simple examples of integrating Delphi with Jenkins; see the following links:
https://community.embarcadero.com/blogs/entry/continuous-integration-with-svn-jenkins-and-dunit-delphi-with-craig-chapman
http://www.ictexpertise.com/blog/2016/02/10/continuous-integration-of-delphi-project-with-jenkins/
http://chapmanworld.com/2015/01/18/use-radstudio-with-jenkins-no-plugin/
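The recurring pattern in those links is a plain Windows build step that loads the RAD Studio environment and then calls MSBuild on the project file; a sketch (the installation path and project name are assumptions for your setup):

    rem load the RAD Studio environment variables, then build the Delphi project
    call "C:\Program Files (x86)\Embarcadero\Studio\22.0\bin\rsvars.bat"
    msbuild MyProject.dproj /t:Build /p:Config=Release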

Continuous Integration with BitBucket

I'm developing a private webapp in JSF which is available over the internet, and I have now reached a stage where I want to introduce CI (which I'm fairly new to) into the whole process. My current project setup looks like this:
myApp-persistence: maven project that handles DB access (DAOs and hibernate stuff)
myApp-core: maven project, that includes all the Java code (Beans and Utils). It has a dependency on myApp-persistence.jar
myApp-a: maven project just with frontend code (xhtml, css, JS). Has a dependency on myApp-core.jar
myApp-b: maven project just with frontend code (xhtml, css, JS). Has a dependency on myApp-core.jar
myApp-a and myApp-b are independent from each other, they are just different instances of the core for two different platforms and only display certain components differently or call different bean-methods.
Currently I'm deploying manually, i.e. I use the Eclipse built-in export-as-WAR function and then manually upload the WAR to the deployments dir of my WildFly server on prod. I'm using BitBucket for version control and just recently discovered pipelines in BitBucket and implemented one for each repository (every project is a separate repo). Now myApp-persistence builds perfectly fine because all its dependencies are accessible via the public Maven repo, but myApp-core (and hence myApp-a and myApp-b, too) fails, of course, because myApp-persistence isn't published on the central Maven repo.
Is it possible to tell BitBucket somehow to use the myApp-persistence.jar in the corresponding repo on BitBucket?
If yes, how? And can I also tell BitBucket to deploy directly to prod if the build, including tests, runs fine?
If no, what would be best practice for doing that? I was thinking of using a second dev server (already available, so no big deal) as a CI server, but then I would still need some advice or recommendations on which tools (Jenkins, Artifactory, etc.) to use.
One important note maybe: I'm the only person working on this project, so this might seem like overkill, but for me the process of setting it all up is valuable experience. That said, I'm not necessarily looking for the quickest solution but for the most professional and convenient one.
From my point of view, you can find the solution in this post: https://christiangalsterer.wordpress.com/2015/04/23/continuous-integration-for-pull-requests-with-jenkins-and-stash/. It guides you step by step through setting everything up. The post is from 2015, but the process and idea are still the same. Hope it helps.

Teambuild / MSBuild and stamping QA-approved builds

We have an automated build and QA process for our software, using tfs/teambuild and msbuild, and we want to be able to know (for audit purposes) whether a component has gone through that process or not.
For example, if a library is installed on a user's machine, I'd like to be able to inspect it in some way to tell that it went through the build. In particular, I want to be able to distinguish it from components built directly on a developer's machine, and then manually installed.
What is the best way to do this? Code signing as part of the build process seems closest to these requirements, but presumably this would not cover any 3rd-party libraries that might be used? I also read about the ILMerge tool to merge all assemblies into one, but then I don't know enough to work out whether they can then be signed or not?
I'm sure we're not the first people to have this requirement, so I'm casting around for any ideas or hints from others who might have done such a thing.
Thanks!
Our developer builds are set to keep the versions at "0.0.0.0", but our build server marks the build with a pre-configured version and an automagically generated build string, e.g. "1.0.3.xxx". Doesn't your build server allow for this?
Your build process should be updating each of your projects' AssemblyInfo.cs files (or a globally linked equivalent). You can do this with the TFS changeset number, so, as the previous poster indicated, you end up with a version on each DLL of 1.0.changeset.buildno or something similar. This is easy to do in MSBuild.
You could set the values in each AssemblyInfo file in source control to something obvious, like 0 or 999.
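For example, the checked-in file might carry obviously fake placeholders that the build server rewrites before compiling (a sketch):

    // AssemblyInfo.cs as committed to source control; the build step
    // replaces these literals with e.g. 1.0.<changeset>.<buildno>
    [assembly: System.Reflection.AssemblyVersion("0.0.0.0")]
    [assembly: System.Reflection.AssemblyFileVersion("0.0.0.0")]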
A lot of what you're asking about is process and training as well, though.
If you're using installers or zips to package your deliverables, you can also label them with the build number as part of your build process.
And once you have the changeset, you have the link from DLL back to code, so everything is traceable, along with the third-party DLL references as defined in each .csproj.

MyEclipse builders and CI

I'm picking up support on a project that is currently built with MyEclipse and has a decent sized development team that has been working without any CI processes.
From what I can tell, the MyEclipse folks don't see any value in being able to build outside of the Eclipse platform, which makes no sense at all to me. Continuous Integration is extremely helpful when you have to integrate changes into a codebase from more than one development environment, and it's pretty tough to automate builds when you're tied to a GUI.
Does anyone have continuous integration processes set up around MyEclipse style project-sets? If so, what strategy did you use to accomplish it?
AFAICT there is no OOTB feature that can generate an Ant script (or equivalent headless-build script) from MyEclipse, nor is there an exposed way to invoke MyEclipse builders from a build-script platform.
This would lead me to believe that I'll need to reverse engineer the scripts based on what MyEclipse generates, which I'd rather not have to do.
I'm not particularly concerned with a Maven-style solution for my needs, but if you know of one I'd like to hear about it. From my initial research it looks like Maven/MyEclipse integration is even worse.
This is remarkably similar to the problems I had working with a WebSphere 5.1 application that could only be built from WSAD 6 running on a build machine created from a disk image from the company IT dept. WSAD did have a headless mode. It was a real pain to get it working from Hudson.
I would not be surprised if there was a Maven plugin and/or Ant task for each of the builders you are using. I would start there.
Here is a Maven-based solution, so maybe a bit off-topic for you...
In our company, we use MyEclipse as our IDE, and Hudson and TeamCity for continuous integration. The projects are Maven-based, so Hudson and TC can work with them.
When you want to open the project in Eclipse, you have to check out the sources, set up the Maven repository path for Eclipse with mvn eclipse:add-maven-repo, build the sources with mvn install, and then run mvn eclipse:eclipse, which creates the Eclipse project setup from the Maven POM configuration. Then it is possible to import the project into Eclipse and work with it seamlessly...
More information can be found on maven-eclipse-plugin project page
...seamlessly until you change something in the POM configuration, that is - then you have to run mvn eclipse:eclipse again to have the Eclipse project configuration recreated according to the new POM. It's important not to forget this step; otherwise your project in the IDE won't work properly and you'll be wondering why ;)
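In command form, the cycle described above looks roughly like this (the repository URL is a placeholder):

    svn checkout http://example.org/svn/myproject && cd myproject
    mvn eclipse:add-maven-repo   # one-time: registers the M2_REPO classpath variable
    mvn install                  # build the modules into the local repository
    mvn eclipse:eclipse          # generate .project/.classpath from the POM
    # re-run mvn eclipse:eclipse whenever the POM changes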
Personally, I don't find this solution the best, but that's how the Eclipse folks work with Maven :/
Hope it inspires you at least :)
This is another reason why I intensely dislike Eclipse. The fact that an IDE can force you away from something that's acknowledged to be a best practice is shameful.
"AFAIKT there is no OOTB feature that can generate an Ant script (or equivalent headless-build script) from MyEclipse" - I'm not sure I understand why this is a problem. It's possible to write a simple Ant build.xml in an hour or two that would do the job for most Java EE apps packaged as WAR files. I don't know if you're using EJBs, but even adding app server specific tasks such as EJB and JSP compilation wouldn't be much of a challenge. If you can agree on a common directory structure it would even be reusable across projects.
With that Ant build.xml in hand, you should be able to drive your CI process simply by checking into Subversion. The Eclipse plug-ins to do that work well, I hear.
If it's really a problem, I'd recommend IntelliJ. It works nicely with CI based on CruiseControl, Hudson, or JetBrains' own TeamCity. The cost isn't excessive, and it'll pay for itself quickly.
If I'm misreading your question, I apologize. But if I've got it right, there's no way I'd let the IDE dictate to the team this way.
