Dagger: code generation or reflection? - dependency-injection

How do I know if Dagger's code generation is working correctly? I see several threads where users have eventually discovered that reflection was being used instead.
I have run the example coffee maker application in Eclipse, and when I set breakpoints in, e.g., Thermosiphon, I cannot see any generated classes in the stack. I do see ReflectiveAtInjectBinding, which makes me suspect that my setup is not correct.

So there are a few aspects to this.
Dagger has recently removed (or is about to remove) the reflection fallback for modules, so as of the next release you should never end up with reflective module adapters. If a module adapter was not generated, there will be a specific error that prevents things from proceeding.
As to verifying code generation, it is probably worth creating a small verification script that confirms that any source file containing @Module has a $$ModuleAdapter class generated. Assuming you build with Maven, this could be attached to the verify phase of your project.
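For example, a rough sketch of such a check in Groovy (the source and generated-sources directories are assumptions; adjust them to your layout):

// Hypothetical check: every source file declaring @Module should have a
// matching $$ModuleAdapter generated by Dagger's annotation processor.
def srcDir = new File('src/main/java')
def genDir = new File('target/generated-sources/annotations')
def generated = [] as Set
if (genDir.exists()) {
    genDir.eachFileRecurse { generated << it.name }
}
def missing = []
srcDir.eachFileRecurse { f ->
    if (f.name.endsWith('.java') && f.text.contains('@Module')) {
        def adapter = f.name.replace('.java', '$$ModuleAdapter.java')
        if (!(adapter in generated)) {
            missing << f.name
        }
    }
}
if (missing) {
    throw new RuntimeException('No $$ModuleAdapter generated for: ' + missing)
}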
If you're running in Eclipse, you need the m2e plugin, and you need to enable the option in your Maven settings that lets m2e configure the annotation processing settings in Eclipse.
One caveat: if you have m2e manage the annotation processing configuration, and you have Dagger itself open as an Eclipse project, then you must disable "Resolve dependencies from Workspace projects" in your project's Maven settings.

Related

Is there a trick to debug shared Groovy libraries without pushing?

I'm adding to, and maintaining, Groovy files that build a set of repositories; previously they were built with freestyle Jenkins jobs. I keep some code in shared libraries and, to be honest (mainly for DRY reasons), I want to do that more.
However, the only way I know of to test and debug those library files is to push the changes to a git branch. I know about the "replay" trick to test the main Jenkinsfile. Is there an approach I've missed for doing something similar with library code?
If you set up a job to load the shared library explicitly instead of relying on a globally configured shared library (you can have both going, for this particular job), then it is possible to hit "Replay" and have all your shared library steps show up as editable files.
This can be helpful for iterative development without a million commits.
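For instance, a Jenkinsfile along these lines (the library name, branch and repository URL are placeholders for your own):

// Load the library explicitly so Replay exposes its sources as editable files.
library identifier: 'my-shared-lib@main',
        retriever: modernSCM([$class: 'GitSCMSource',
                              remote: 'https://example.com/org/my-shared-lib.git'])

node {
    mySharedStep()   // hypothetical step defined in vars/mySharedStep.groovy
}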
EDIT: the same approach also works on an Organization job in Jenkins.
There is also the third-party Jenkins Pipeline Unit testing framework.
While it does not yet cover all pipeline features, it is well documented and maintained, so I would consider using it (once I revisit our Jenkins setup).
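To give a flavour of it, a minimal test sketch (class, file and stubbed step names are made up; BasePipelineTest comes from the com.lesfurets:jenkins-pipeline-unit test dependency):

import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class ExampleJobTest extends BasePipelineTest {

    @Before
    void setUp() {
        super.setUp()
        // Stub out steps the pipeline calls so the test runs without a Jenkins instance.
        helper.registerAllowedMethod('sh', [String]) { cmd -> println "stub: ${cmd}" }
    }

    @Test
    void pipeline_runs_successfully() {
        runScript('Jenkinsfile')   // path to the pipeline script under test
        printCallStack()           // prints the recorded step invocations
        assertJobStatusSuccess()
    }
}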

Jenkins pipeline shared library vs plugin

I am working on Jenkins pipelines for two projects. I built some customized alert messages via Slack and email. We expect my code to be used not only for my projects but for several other projects as well, so I am thinking of making it a small library so that others don't need to ask me every time they onboard a Jenkins pipeline job. I was thinking of using a shared library with @Library() for others to use, as described in the docs.
However, since my library depends on the existence of the Slack and email plugins, it will not be usable when those plugins are not installed.
My question is: is there a way to declare plugin dependencies in pipeline shared libraries, or do I have to write a Jenkins plugin to address this issue?
As far as I know there is currently no way to declare dependencies on plugins (or on a version of Jenkins). Instead, what you can do is add a check for the plugin and give a proper error to the user of your library:
// "slack" is the plugin's short name as listed in the update center
if (Jenkins.getInstance().getPluginManager().getPlugin("slack") == null) {
    error "This shared library function requires the Slack plugin!"
}
Put this at the start of your shared library script, before any uses of the plugin. Note, though, that this gets tricky if you need to import classes from a plugin (since imports go first in the Groovy file). In that situation, make two scripts: the first script has the check and is the one the user calls; the second contains all the logic and imports, and is called by the first script once the checks pass.
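A minimal sketch of that two-script split (the step names notifySlack and notifySlackImpl are hypothetical):

// vars/notifySlack.groovy -- the entry point, deliberately free of plugin imports
def call(Map args = [:]) {
    if (Jenkins.getInstance().getPluginManager().getPlugin("slack") == null) {
        error "notifySlack requires the Slack plugin to be installed"
    }
    // Only reached when the plugin is present; notifySlackImpl
    // (vars/notifySlackImpl.groovy) is free to import the plugin's classes.
    notifySlackImpl(args)
}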

How to modify the TFS build automation workflow to use Team Developer's language compiler instead of the .NET compiler

I am trying to set up build automation for a project developed in a legacy language called Team Developer 6, where each file needs to be compiled into an exe. I also need to do some filtering before building the exes; there are 300 of them.
I could do this in a simple .NET utility that does the filtering and invokes the Team Developer compiler for the required files.
Is it possible to put this into the TFS build workflow? What is the best approach?
Write an MSBuild project that invokes the necessary commands for the tooling you require and check it in. In the TFS build definition, make use of the default template (at first) and set the MSBuild project file you created as the 'project to build'.
This way you can test your build process locally with MSBuild on the command line and determine which command-line switches you might need. You can set command-line switches in the build definition, or if you need further control you can modify the default template to inject the switches directly into the MSBuild activity.
I recommend this way, as then you won't have to create any customized workflow, and can avoid having to go down the road of using custom workflow activities in TFS (which is absolutely supported, but in my opinion a bit difficult to diagnose/debug/maintain/upgrade).
You would ideally want to use an InvokeProcess activity to call an executable which does the filtering and invoking. An alternative but more complex approach would be to create a custom activity, but that requires installation of binaries on the build servers.
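For what it's worth, the filter-and-invoke step itself is only a few lines in any scripting language; a rough sketch in Groovy, where the compiler path, source directory and file extension are assumptions for illustration only:

// Hypothetical stand-in for the filtering utility: select the sources that need
// building and invoke the Team Developer compiler once per file.
def compiler = 'C:\\TeamDeveloper6\\compiler.exe'   // assumed install location
def sources = new File('src').listFiles().findAll {
    it.name.endsWith('.apt') && !it.name.startsWith('test_')   // example filter rule
}
sources.each { src ->
    def proc = [compiler, src.absolutePath].execute()
    proc.waitFor()
    if (proc.exitValue() != 0) {
        throw new RuntimeException("Compile failed for ${src.name}")
    }
}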

Dart Editor equivalent of Eclipse command "Project >> Clean..."?

Is there an equivalent of the Eclipse "Project" menu "Clean..." command in the Dart Editor? If not, how can a project be cleaned of the files generated by the various tools?
I've never heard of such functionality in Dart.
Which files from which tools do you mean? I guess the output from Run as JavaScript?
My understanding is that development is heading in the direction of generating files only into the build directory, and those files are purged before each rebuild.
But this is still work in progress.
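Until then, a small script can stand in for a Clean command; a rough sketch in Groovy (the file patterns for dart2js output are assumptions, adjust them to your tool chain):

// Hypothetical clean-up: remove the build directory and generated JavaScript files.
def projectDir = new File('.')
new File(projectDir, 'build').deleteDir()   // build output directory, if any
projectDir.eachFileRecurse { f ->
    if (f.name.endsWith('.dart.js') || f.name.endsWith('.js.map') || f.name.endsWith('.js.deps')) {
        f.delete()
    }
}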
I have the same question. However, I come to it with a suggestion: I would like people to consider the build model implemented by Apache Maven for Dart builds.
The nice thing about the Maven model is that you can:
Define custom actions aside from the built-in ones.
Implement custom handlers for actions.
As for the Clean action in Eclipse: add my vote for having it in the Dart Editor.
Eclipse normally uses either Apache Ant or Apache Maven for build actions. 'Clean' is a standard action and found its way into the Build menu at a very early stage. Which gives me an idea:
Add some UI meta-management to the Dart Editor
Build tool = Current Dart build (default), OR user specified: Ant, Maven, ...
Build menu = Set standard actions, allow custom actions against the current build tool; e.g. a "database load".
I know that with new frameworks and languages there are always more things to do than time to do them. There are probably better examples than just Ant or Maven; I'm just pondering some flexible options to make the Dart development environment a little "future proof". ;-)

Ant/Ivy for project building

I am considering switching a Maven project that I manage to Apache Ant/Ivy. I need more control over the build process and am getting very frustrated with Maven. Please, no comments about how great Maven is; my question is about Ivy.
I would like to set up a "standard" Ant build template that can later be used for other projects with minimal changes.
I will set up a central "enterprise" repository where we can place third-party libraries that are not available in the public Maven repositories (e.g. commercial libraries, Sun libraries, proprietary libraries, etc.). This enterprise repository will be available on our local LAN, but may not be available from outside the office.
Each developer will have a private repository in ~/.ivy/repository. I would like the Ant build to automatically update this private repository with changed versions of libraries from the enterprise repository.
In ~/.ivy/ant, I plan on placing "standard" modules for including in the individual project build.xml files, using the include task in Ant 1.8. These modules will provide things like Scala and Clojure compilation targets with different versions for different Scala and Clojure versions (e.g.: scala-compile-2.9.1.xml, clojure-compile-1.3.xml, etc.) The build modules will be available in the enterprise repository and should be updated automatically in the private repositories if they change.
Each project will follow a standard Maven directory structure: ${project}/src/main/java, ${project}/target/classes, etc.
In the past, I tried using Ivy but the Ant build files got to be very large (> 500 lines for the template build file) and hard to manage/edit. I am hoping that by putting standard targets in their own build modules in the ~/.ivy/ant directory, I can avoid that code bloat.
Can this be done? Am I way off base? The only documentation I can find on Ivy is at the Apache web site (http://ant.apache.org/ivy). Is there any other documentation available, including books?
Dividing the template build file into includable helper files is a rather sensible idea. Personally, I'm currently switching a really large project from Ant (no dependency management at all, only copying files from FTP) to an Ant/Ivy solution, and I've done it this way: I have a file with milestone targets, i.e. ready-to-compile, compiled, ready-to-archive, archived, and so on; I think you get the idea. I've configured dependencies between all these targets (dependencies in Ant terms, don't get me wrong), so that compiled depends on ready-to-compile, ready-to-compile depends on initialized, and so forth. These targets have no body; they are meant to be included in the build file of every module of your multi-module project. Their sole purpose is to maintain the STATE of the build, because with Ant's import mechanism things become rather tricky and it is hard to know which target was overridden and when it will run. With this file I can hook into the build at every sensible milestone. Say I want one module to compile help files with an external exe: no problem, in that project ready-to-archive simply depends on the target that compiles the help. And since the milestone targets are included, I can override only some of them; all the others preserve the desired way of building the project.
Another part of my strategy is mixin build files, one for each specific area. For example, I have a file for Ivy, where I put initialization, resolving, publishing and so on. When I want to use Ivy, I just include this file and manage dependencies through my milestone targets. If the build is typical, including this one file gives me convention-over-configuration behaviour out of the box. How? By combining it with other mixins; mixins may include other mixins to depend on them. Each mixin is thus a reusable part of my build strategy, the single-concern unit idea from OOP. In your case it would be a Scala mixin with targets specific to the Scala stuff.
Then I have a delegate.xml that delegates common build activities to the child projects. I have dist, all, test and whatever you want for a multi-module project. The build order is computed with the Ant/Ivy buildlist task.
There are also some other files, but these are the strategically basic ones that gave me a reusable and maintainable build for this big and very conservative project. So if you are interested in the details, don't be shy, contact me; I will be very pleased to help you, because the Ivy docs really are complicated and incomplete.
EDIT: About books: Ant in Action may help you. I took several ideas from it, and I really highly recommend everyone to read it; you can find Ivy material there as well. As for the Ivy docs, sorry, that is all there is. But when I was struggling with the cumbersome Ivy+Ant combination I found several interesting articles on private blogs, so those may fill the gap to some extent.
