Continuous Integration with BitBucket - Jenkins

I'm developing a private webapp in JSF which is available over the internet, and it has now reached a stage where I want to introduce CI (which I'm fairly new to) into the whole process. My current project setup looks like this:
myApp-persistence: maven project that handles DB access (DAOs and hibernate stuff)
myApp-core: maven project, that includes all the Java code (Beans and Utils). It has a dependency on myApp-persistence.jar
myApp-a: maven project just with frontend code (xhtml, css, JS). Has a dependency on myApp-core.jar
myApp-b: maven project just with frontend code (xhtml, css, JS). Has a dependency on myApp-core.jar
myApp-a and myApp-b are independent of each other; they are just different instances of the core for two different platforms and only display certain components differently or call different bean methods.
Currently I'm deploying manually, i.e. I use Eclipse's built-in "export as WAR" function and then manually upload the WAR to the deployments directory of my WildFly server on prod. I'm using BitBucket for version control and just recently discovered Pipelines in BitBucket and implemented one for each repository (every project is a separate repo). Now myApp-persistence builds perfectly fine because all its dependencies are available from the public Maven repository, but myApp-core (and hence myApp-a and myApp-b, too) fails, of course, because myApp-persistence isn't published to the central Maven repository.
Is it possible to tell BitBucket somehow to use the myApp-persistence.jar in the corresponding repo on BitBucket?
If yes, how? And can I also tell BitBucket to deploy directly to prod if the build, including tests, runs fine?
If no, what would be a best practice to do that? I was thinking of using a second dev server (already available, so no big deal) as a CI server, but then I would still need some advice or recommendations on which tools (Jenkins, Artifactory, etc.) to use.
One important note maybe: I'm the only person working on this project, so this might seem like overkill, but for me the process of setting it up is quite valuable experience. That said, I'm not necessarily looking for the quickest solution but for the most professional and convenient solution.

From my point of view, you can find the solution in this post: https://christiangalsterer.wordpress.com/2015/04/23/continuous-integration-for-pull-requests-with-jenkins-and-stash/. It guides you step by step through setting everything up. The post is from 2015, but the process and idea are still the same. Hope it helps.
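If you want to stay on Bitbucket Pipelines rather than move to Jenkins, the usual pattern is to publish myApp-persistence to a private Maven repository (e.g. a hosted Artifactory or Nexus) with mvn deploy, and let myApp-core, myApp-a and myApp-b resolve it from there; Pipelines alone cannot "see" another repository's jar. A rough sketch, where the image, settings file and variable names are assumptions and not something Bitbucket provides out of the box:

```yaml
# bitbucket-pipelines.yml for myApp-persistence (sketch only; image, repo URL and
# variable names are assumptions - adapt to your own Artifactory/Nexus instance)
image: maven:3.8.6   # any Maven + JDK image that matches your project

pipelines:
  default:
    - step:
        name: Build and test
        caches:
          - maven
        script:
          - mvn -B verify
  branches:
    master:
      - step:
          name: Build, test and publish
          caches:
            - maven
          script:
            # settings.xml holds <server> credentials filled from the repository
            # variables REPO_USER / REPO_PASSWORD; the POM's distributionManagement
            # section points at the private repository
            - mvn -B deploy --settings settings.xml
```

The downstream pipelines then only need the same private repository declared in their POMs (or a shared settings.xml), and a final step can push the built WAR to the WildFly deployments directory over scp/rsync to cover the deploy-to-prod part.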

Related

Multiple jandex maven plugin (JBoss vs SmallRye)

I use Weld in a Java SE application, and I was wondering whether generating the Jandex index at build time with a plugin would improve startup.
For now, I didn't notice a performance improvement.
But I found two plugins for generating the index:
https://github.com/smallrye/jandex
and
https://github.com/wildfly/jandex-maven-plugin
Any idea which one to use?
Thanks!
For now, I didn't notice a performance improvement.
This would only be noticeable for large deployments. The reasoning behind it is that having a Jandex index makes it possible to skip the discovery phase via reflection and instead allow Weld to browse a pre-built index.
That being said, there is no harm in using Jandex even on smaller deployments, it's just that the difference won't really be noticeable.
Any idea which one to use?
The short answer is: as of July 2022, if you want the Maven plugin variant, use the WildFly one. If you want the core artifact, use the one with the org.jboss.jandex group ID.
The longer answer is: the SmallRye one is the original repo, recently migrated from https://github.com/wildfly/jandex to https://github.com/smallrye/jandex.
It holds the sources of what used to be the org.jboss.jandex group ID (and is now io.smallrye). It also has a Maven plugin module, but there is no public release of it yet, and it will come with a major version bump as well. Therefore, going forward, there will be a group ID change, but otherwise it will retain the same artifact ID and it will all be hosted in one repo.
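For reference, the WildFly variant is typically wired into a build roughly like this (the version number is whatever the current release is; check the plugin's README before copying):

```xml
<!-- Generates META-INF/jandex.idx during the build (sketch; verify the current version) -->
<plugin>
  <groupId>org.jboss.jandex</groupId>
  <artifactId>jandex-maven-plugin</artifactId>
  <version>1.2.3</version>
  <executions>
    <execution>
      <id>make-index</id>
      <goals>
        <goal>jandex</goal>
      </goals>
      <!-- by default the goal binds to process-classes and indexes target/classes -->
    </execution>
  </executions>
</plugin>
```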

grails 3 creating sub projects

It is common to have a web admin project which produces a WAR, and an API project which produces a different WAR. Each can run on different servers with different firewall rules (in production). The common part is the service and domain layer. Additionally, there may be other optional components which also benefit from being exploded plugins. Exploded plugins allow separation but let developers see and modify all source together as if it were one giant project.
In grails 2.5 setting this up was trivial:
create your web admin app in your project root
create your core services app as a plugin in the project root
Add one line to your web admin project's BuildConfig.groovy to use the services project as an exploded plugin, e.g. grails.plugin.location.coreservices = "../coreservices"
To build the project, you just do grails war in the web admin app folder.
Brilliant. Effortless and effective. Developers just check out both projects from git and off they go. As a bonus, it also works seamlessly with IntelliJ 14 (we don't have a license for 15+, unfortunately, so no Grails 3 support).
Before we can consider moving to grails 3, we need to be able to do the same thing.
We could only find one post on the subject.
This requires extensive "hacking" of Gradle scripts and the creation of scripts in the directory above the two projects, which is not ideal for use with git.
In the section "keeping things DRY", they move some stuff from the sub-projects' build.gradle files into a build.gradle file above the projects. Is this required?
The new master Gradle file has repositories { mavenLocal() ... } twice: once at the top under buildscript, then again under subprojects { project -> ... }. Is this correct? Should it not be either only on the main project or only on the two sub-projects, not all three?
If we introduce optional exploded plugins (with different dependencies), the parent Gradle file will have to be edited by hand by each developer. This makes it hard to version and control.
The article adds Spring Security Core to the "plugin-domain", not the web app project. Surely security belongs in the web app, not the services/domain-layer plugin? An API app project would have different security requirements.
Does anyone have a better way with Grails 3, or shall we stick to Grails 2.5? There are no features in Grails 3 we need, but at some point 2.5 will become too old, and migration looks to be infeasible for the most part. The fact that there is no affordable IDE with integrated Grails 3 support, similar to IntelliJ Ultimate or GGTS, is a big negative as well.
"hacking" is not necessary.
Here is the official multi-project tutorial:
http://guides.grails.org/grails-quickcasts-multi-project-builds/guide/index.html
mavenLocal() points to your local Maven repository, a local folder used to cache your project's dependencies. The buildscript block only controls dependencies for the build script process itself, not for the application code, which the top-level dependencies block controls. So you can have different repositories for buildscript and for your application dependencies.
Read the Gradle User Guide for more information. Gradle is harder than the old Grails build system, but more powerful.
I moved a project from Grails 2 to 3 and I was pleased with the result.
IntelliJ 2016-2017 works perfectly with Grails 3.
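For orientation, the setup from that guide boils down to roughly the following; the project names are placeholders and the exact blocks should be checked against the guide for your Grails 3.x version:

```groovy
// settings.gradle in the repository root (sketch)
include 'webadmin', 'coreservices'
```

```groovy
// webadmin/build.gradle (sketch): consume the core services plugin as an in-place project,
// roughly the Grails 3 equivalent of grails.plugin.location.* from 2.x
grails {
    exploded = true
    plugins {
        compile project(':coreservices')
    }
}
```

Running ./gradlew :webadmin:war from the root then builds against the plugin sources directly.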
I found and followed this tutorial, which is different from most of the other tutorials in that it uses create-plugin instead of create-app for the plugin part.
The project then works correctly with Eclipse Neon 2.

Is there a trick to debug shared groovy libraries without pushing?

I'm adding to, and maintaining, Groovy files to build a set of repositories - previously they were built with freestyle Jenkins jobs. I keep some code in shared libraries and, to be honest (mainly for DRY reasons), I want to do that more.
However, the only way I know to test and debug those library files is to push the changes on a git branch. I know about the "replay" trick to test the main Jenkinsfile. Is there some approach I've missed to do something similar for library code?
If you set up the job to load the shared library itself, instead of relying on a globally configured shared library (you can have both at once for this particular job), then it is possible to hit "replay" and have all your shared library steps show up as editable files.
This can be helpful for iterative development without a million commits.
EDIT: Here's how that looks on an Organization job in Jenkins.
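One way to do this from the Jenkinsfile itself (rather than in the job configuration) is the library step; the library identifier, branch, remote and the step name below are placeholders:

```groovy
// Jenkinsfile (sketch): load the shared library at a specific branch instead of relying
// only on the globally configured library, so Replay exposes its files as editable.
library identifier: 'my-shared-lib@feature/my-change',
        retriever: modernSCM([
            $class: 'GitSCMSource',
            remote: 'https://bitbucket.org/yourteam/my-shared-lib.git',  // placeholder
            credentialsId: 'bitbucket-creds'                              // placeholder
        ])

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                mySharedStep()   // hypothetical step defined in vars/mySharedStep.groovy
            }
        }
    }
}
```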
There is the third-party Jenkins Pipeline Unit testing framework.
While it does not yet cover all features of Pipeline, it is well documented and maintained, so I would consider starting to use it (once I revisit our Jenkins setup).
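A minimal test class in the shape the framework's README suggests (class name and script path are placeholders):

```groovy
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

// Sketch of a JenkinsPipelineUnit test; 'job/exampleJob.jenkins' is a placeholder path.
class ExampleJobTest extends BasePipelineTest {

    @Before
    void setUp() throws Exception {
        super.setUp()
        // register mock steps here if the script calls them, e.g.
        // helper.registerAllowedMethod('sh', [String]) { cmd -> println cmd }
    }

    @Test
    void should_execute_without_errors() throws Exception {
        runScript('job/exampleJob.jenkins')   // loads and runs the pipeline script
        printCallStack()                      // prints the recorded step calls
        assertJobStatusSuccess()
    }
}
```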

Any quick way to convert VS .net manual build into Jenkins?

We are migrating 50+ .NET projects from TFS to GitHub, and at the same time we want to use Jenkins to automate the builds. Currently all the builds are done manually inside Visual Studio. I know how to automate these builds using MSBuild, and we already have a lot of these projects building inside Jenkins.
My question: is there a way to set up these 50+ projects quickly without creating them one by one manually? Any way to script them? E.g. a Jenkins project has everything inside a folder, so could I copy a sample project/folder to create a new one and modify something? Or create a Jenkins project using a script that reads a config file? Any idea that can save some time is appreciated.
Not a direct answer but too long for a comment so here it goes anyway. Following the Joel test (which in no way is dogmatic for me but does make a lot of good points), and in my experience, you should already have an msbuild file now to build all those projects 'in one click'. Then, setting up a build server, in fact any build server, is just a matter of making it build that single parent project. This might not work for everyone, but for several projects I've worked on this had the following advantages:
the entire build process gets defined by developers, working locally on their machines, using 'standard' tools
as such they don't need to spend hours in a web interface figuring out the appropriate build steps, dependencies and whatnot (also those hours would have been worthless in the end if switching to a different build server)
since a complete build is now just a matter of msbuild master.proj, possibly along with some options to define configuration/platform/output directories getting this running on any build server should be painless and quick
in the same manner this makes it easy to test different build servers with a minimum of time and migrate between them (also no need to ask SO questions on how to set everything up :)
this also makes it easy for other developers to get complete builds without having to go through a build server
Anecdote: we once had Jenkins running on multiple different projects as well. It took us days to get everything running, with the templates etc., and we found the web interface slow and cumbersome (and getting to know the API would have taken even more days). Then one day I got sick of this and made a bunch of MSBuild scripts which could build everything from one msbuild command. That took much less time than setting up Jenkins, a couple of hours or so. Then I took a TeamCity installation we already had and made it build the new master project. It took like an hour and everything worked. Just recently I took the same project and got it working on Visual Studio Online, again in no time.
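A master project of the kind described can be very small; a sketch, where the file name and glob are placeholders for your own layout:

```xml
<!-- master.proj (sketch): builds every solution it finds in one msbuild invocation -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Build">
  <ItemGroup>
    <Solutions Include="**\*.sln" />
  </ItemGroup>
  <Target Name="Build">
    <MSBuild Projects="@(Solutions)" Properties="Configuration=Release" />
  </Target>
</Project>
```

On any build server the whole job then reduces to running msbuild master.proj.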
If those projects are more or less similar to build, you will probably be interested in the template plug-in for Jenkins. There you configure a dummy project such that it does what is common to (most of) the 50+ projects.
Afterwards you create a separate project for each: create the first project and make it use the template project for each of the steps which can be shared with the template project (use build step from other project). All subsequent projects can then be created as slightly adapted copies of this first 'real' project.
I use it such that the variable $JOB_NAME (the actual project name in jenkins that is) is part of the repository path and I can thus clone from http://example.org/$JOB_NAME/
Configured that way, I can include the source code management step in the templating job and use it unmodified. Similarly with the build step and post-build step: they are run by a script which is fairly universal across all my projects (mostly calling make and guessing deployment/publication paths from $JOB_NAME again).

MyEclipse builders and CI

I'm picking up support on a project that is currently built with MyEclipse and has a decent sized development team that has been working without any CI processes.
From what I can tell, the MyEclipse folks don't see any value in being able to build outside of the Eclipse platform, which makes no sense at all to me. Continuous Integration is extremely helpful when you have to integrate changes into a codebase from more than one development environment, and it's pretty tough to automate builds when you're tied to a GUI.
Does anyone have continuous integration processes set up around MyEclipse style project-sets? If so, what strategy did you use to accomplish it?
AFAICT there is no OOTB feature that can generate an Ant script (or equivalent headless-build script) from MyEclipse, nor is there an exposed way to invoke MyEclipse builders from a build-script platform.
This would lead me to believe that I'll need to reverse engineer the scripts based on what MyEclipse generates, which I'd rather not have to do.
I'm not particularly concerned with a Maven-style solution for my needs, but if you know of one I'd like to hear about it. From my initial research it looks like Maven/MyEclipse integration is even worse.
This is remarkably similar to the problems I had working with a WebSphere 5.1 application that could only be built from WSAD 6 running on a build machine created from a disk image from the company IT dept. WSAD did have a headless mode. It was a real pain to get that working from Hudson.
I would not be surprised if there was a Maven plugin and/or Ant task for each of the builders you are using. I would start there.
Here is a Maven-based solution, so it may be a bit off topic for you.
In our company, we use MyEclipse as the IDE and Hudson and TeamCity for continuous integration. The projects are Maven-based, so Hudson and TC can work with them.
When you want to open the project in Eclipse, you have to check out the sources, set up the Maven repository path for Eclipse with mvn eclipse:add-maven-repo, build with mvn install, and then run mvn eclipse:eclipse, which creates the Eclipse project setup from Maven's POM configuration. Then it is possible to import the project into Eclipse and work with it seamlessly...
More information can be found on the maven-eclipse-plugin project page.
...seamlessly until you change something in the POM configuration - then you have to run mvn eclipse:eclipse again to have the Eclipse project configuration recreated according to the new POM. It's important not to forget this step; otherwise your project in the IDE won't work properly and you'll be wondering why ;)
Personally, I don't find this solution the best, but that's the way the Eclipse folks work with Maven :/
Hope this inspires you at least :)
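The command sequence from that workflow, spelled out (the workspace path is a placeholder; check the maven-eclipse-plugin docs for your version, as some goals were later renamed):

```
# one-time: teach Eclipse where the local Maven repository lives
mvn -Declipse.workspace=/path/to/your/eclipse/workspace eclipse:add-maven-repo

# per project: build, then generate the Eclipse project files from the POM
mvn install
mvn eclipse:eclipse
```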
This is another reason why I intensely dislike Eclipse. The fact that an IDE can force you away from something that's acknowledged to be a best practice is shameful.
"AFAIKT there is no OOTB feature that can generate an Ant script (or equivalent headless-build script) from MyEclipse" - I'm not sure I understand why this is a problem. It's possible to write a simple Ant build.xml in an hour or two that would do the job for most Java EE apps packaged as WAR files. I don't know if you're using EJBs, but even adding app server specific tasks such as EJB and JSP compilation wouldn't be much of a challenge. If you can agree on a common directory structure it would even be reusable across projects.
With that Ant build.xml in hand, you should be able to drive your CI process simply by checking into Subversion. The Eclipse plug-ins to do that work well, I hear.
If it's really a problem, I'd recommend IntelliJ. It works nicely with CI based on CruiseControl, Hudson, or JetBrains' own TeamCity. The cost isn't excessive, and it'll pay for itself quickly.
If I'm misreading your question, I apologize. But if I've got it right, there's no way I'd let the IDE dictate to the team this way.
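For illustration, a bare-bones build.xml for a WAR-packaged app might look like this; the directory names are assumptions about a typical dynamic web project layout:

```xml
<!-- Minimal Ant build for a WAR (sketch; adjust paths to your project layout) -->
<project name="myapp" default="war" basedir=".">
  <property name="src.dir"   value="src"/>
  <property name="web.dir"   value="WebContent"/>
  <property name="build.dir" value="build"/>

  <target name="compile">
    <mkdir dir="${build.dir}/classes"/>
    <javac srcdir="${src.dir}" destdir="${build.dir}/classes" includeantruntime="false">
      <classpath>
        <fileset dir="${web.dir}/WEB-INF/lib" includes="*.jar"/>
      </classpath>
    </javac>
  </target>

  <target name="war" depends="compile">
    <war destfile="${build.dir}/myapp.war" webxml="${web.dir}/WEB-INF/web.xml">
      <fileset dir="${web.dir}" excludes="WEB-INF/classes/**"/>
      <classes dir="${build.dir}/classes"/>
    </war>
  </target>
</project>
```

Once checking in to version control triggers that script on the CI server, the IDE is out of the build path entirely.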
