Jenkins HelloWorld plugin does not persist config after restart

I have been trying to create my first Jenkins plugin. Everything is great except that the global config does not persist after the Jenkins service is restarted.
The config saves fine as long as the service is not restarted.
The global config jelly file...
<j:jelly xmlns:j="jelly:core" xmlns:st="jelly:stapler" xmlns:d="jelly:define" xmlns:l="/lib/layout" xmlns:t="/lib/hudson" xmlns:f="/lib/form">
<!--
Jenkins uses a set of tag libraries to provide uniformity in forms.
To determine where this tag is defined, first check the namespace URI,
and then look under $JENKINS/views/. For example, <f:section> is defined
in $JENKINS/views/lib/form/section.jelly.
It's also often useful to just check other similar scripts to see what
tags they use. Views are always organized according to their owner class,
so it should be straightforward to find them.
-->
<f:section title="Hello World Builder">
<f:entry title="French" field="useFrench"
description="Check if we should say hello in French">
<f:checkbox />
</f:entry>
</f:section>
</j:jelly>
After saving, Jenkins creates a config file named
examplePlugin.examplePlugin.HelloWorldBuilder.xml
with the content:
false
The descriptor itself is the following.
// Overridden for better type safety.
// If your plugin doesn't really define any property on Descriptor,
// you don't have to do this.
@Override
public DescriptorImpl getDescriptor() {
return (DescriptorImpl)super.getDescriptor();
}
/**
* Descriptor for {@link HelloWorldBuilder}. Used as a singleton.
* The class is marked as public so that it can be accessed from views.
*
* <p>
* See <tt>src/main/resources/hudson/plugins/hello_world/HelloWorldBuilder/*.jelly</tt>
* for the actual HTML fragment for the configuration screen.
*/
@Extension // This indicates to Jenkins that this is an implementation of an extension point.
public static final class DescriptorImpl extends BuildStepDescriptor<Builder> {
/**
* To persist global configuration information,
* simply store it in a field and call save().
*
* <p>
* If you don't want fields to be persisted, use <tt>transient</tt>.
*/
private boolean useFrench;
/**
* Performs on-the-fly validation of the form field 'name'.
*
* @param value
* This parameter receives the value that the user has typed.
* @return
* Indicates the outcome of the validation. This is sent to the browser.
*/
public FormValidation doCheckName(@QueryParameter String value)
throws IOException, ServletException {
if (value.length() == 0)
return FormValidation.error("Please set a name");
if (value.length() < 4)
return FormValidation.warning("Isn't the name too short?");
return FormValidation.ok();
}
public boolean isApplicable(Class<? extends AbstractProject> aClass) {
// Indicates that this builder can be used with all kinds of project types
return true;
}
/**
* This human readable name is used in the configuration screen.
*/
public String getDisplayName() {
return "Say hello world";
}
@Override
public boolean configure(StaplerRequest req, JSONObject formData) throws FormException {
// To persist global configuration information,
// set that to properties and call save().
useFrench = formData.getBoolean("useFrench");
// ^Can also use req.bindJSON(this, formData);
// (easier when there are many fields; need set* methods for this, like setUseFrench)
save();
return super.configure(req,formData);
}
/**
* This method returns true if the global configuration says we should speak French.
*
* The method name is a bit awkward because global.jelly calls this method to determine
* the initial state of the checkbox by the naming convention.
*/
public boolean getUseFrench() {
return useFrench;
}
}
Any help with why this is not reloading on reboot would be very helpful, since this seems to be a problem with the example project created by the maven archetype.

So this is a problem with the hello world application: you need to load the saved configuration in the descriptor's constructor.
public DescriptorImpl() {
    load();
}
That fixes the issue I was seeing with the configuration not being persisted.
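For reference, here is a minimal sketch of the descriptor with that constructor in place (everything apart from the constructor mirrors the archetype code from the question):
@Extension
public static final class DescriptorImpl extends BuildStepDescriptor<Builder> {
    private boolean useFrench;

    public DescriptorImpl() {
        load(); // reads the previously saved global configuration back from its XML file
    }

    @Override
    public boolean configure(StaplerRequest req, JSONObject formData) throws FormException {
        useFrench = formData.getBoolean("useFrench");
        save(); // writes the configuration XML under JENKINS_HOME
        return super.configure(req, formData);
    }

    public boolean getUseFrench() {
        return useFrench;
    }
}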

Related

How can I inject with Guice my api into dataflow jobs without needed to be serializable?

This question is a follow-on to the great answer to "Is there a way to upload jars for a dataflow job so we don't have to serialize everything?"
That made me realize: ok, what I want is injection with no serialization so that I can mock and test.
Our current method requires our APIs/mocks to be serializable, BUT THEN I have to put static fields in the mock because it gets serialized and deserialized, creating a new instance that Dataflow uses.
My colleague pointed out that perhaps this needs to be a sink and that is treated differently? <- We may try that later and update but we are not sure right now.
My desire is from the top to replace the apis with mocks during testing. Does someone have an example for this?
Here is our bootstrap code, which does not know whether it is running in production or inside a feature test. We test end-to-end results with no Apache Beam imports in our tests, meaning we can swap to any tech if we want to pivot and keep all our tests. Not only that, we catch way more integration bugs and can refactor without rewriting tests, since the contracts we test are customer ones we can't easily change.
public class App {
private Pipeline pipeline;
private RosterFileTransform transform;
@Inject
public App(Pipeline pipeline, RosterFileTransform transform) {
this.pipeline = pipeline;
this.transform = transform;
}
public void start() {
pipeline.apply(transform);
pipeline.run();
}
}
Notice that everything we do is Guice-injection based, so the Pipeline may be the direct runner or not. I may need to modify this class to pass things through :( but anything that works for now would be great.
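For illustration, a module along these lines is one way the Pipeline can be bound to the direct runner during feature tests (a sketch only; FeatureTestModule and its bindings are illustrative, not our real module):
import com.google.inject.AbstractModule;
import com.google.inject.Provides;
import org.apache.beam.runners.direct.DirectRunner;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class FeatureTestModule extends AbstractModule {
    @Override
    protected void configure() {
        // bind RosterFileTransform here (or a test double wired with mocked APIs)
    }

    @Provides
    Pipeline providePipeline() {
        PipelineOptions options = PipelineOptionsFactory.create();
        options.setRunner(DirectRunner.class); // in-process runner for feature tests
        return Pipeline.create(options);
    }
}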
The function I am trying to get our api (and its mock and impl) into, with no serialization, is this:
private class ValidRecordPublisher extends DoFn<Validated<PractitionerDataRecord>, String> {
    @ProcessElement
    public void processElement(@Element Validated<PractitionerDataRecord> element) {
        microServiceApi.writeRecord(element.getValue());
    }
}
I am not sure how to pass in microServiceApi in a way that avoids serialization. I would also be ok with delayed creation after deserialization, using a Guice Provider and calling provider.get(), if there is a solution there too.
Solved it in such a way that mocks no longer need statics or serialization, by one single class bridging the world of Dataflow (in prod and in test), like so.
NOTE: There is additional company-specific magic in there that passes headers from service to service and through Dataflow, which you can ignore (i.e. the RouterRequest request = Current.request();). Anyone else will have to pass projectId into getInstance each time.
public abstract class DataflowClientFactory implements Serializable {
private static final Logger log = LoggerFactory.getLogger(DataflowClientFactory.class);
public static final String PROJECT_KEY = "projectKey";
private transient static Injector injector;
private transient static Module overrides;
private static int counter = 0;
public DataflowClientFactory() {
counter++;
log.info("creating again(usually due to deserialization). counter="+counter);
}
public static void injectOverrides(Module dfOverrides) {
overrides = dfOverrides;
}
private synchronized void initialize(String project) {
if(injector != null)
return;
/********************************************
* The hardest part is this piece since this is specific to each Dataflow
* so each project subclasses DataflowClientFactory
* This solution is the best ONLY because of a time crunch, and it works
* decently for end to end testing without developers needing fancy
* wrappers around mocks anymore.
***/
Module module = loadProjectModule();
Module modules = Modules.combine(module, new OrderlyDataflowModule(project));
if(overrides != null) {
modules = Modules.override(modules).with(overrides);
}
injector = Guice.createInjector(modules);
}
protected abstract Module loadProjectModule();
public <T> T getInstance(Class<T> clazz) {
if(!Current.isContextSet()) {
throw new IllegalStateException("Someone on the stack is extending DoFn instead of OrderlyDoFn so you need to fix that first");
}
RouterRequest request = Current.request();
String project = (String)request.requestState.get(PROJECT_KEY);
initialize(project);
return injector.getInstance(clazz);
}
}
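For illustration, a DoFn can then hold a concrete factory (which is Serializable) and look up the API lazily per element, so the API itself never has to be serialized (a sketch; the concrete factory wiring and the MicroServiceApi type name are illustrative):
private class ValidRecordPublisher extends DoFn<Validated<PractitionerDataRecord>, String> {
    // the factory is Serializable; its static injector is rebuilt lazily after deserialization
    private final DataflowClientFactory clientFactory;
    private transient MicroServiceApi microServiceApi;

    ValidRecordPublisher(DataflowClientFactory clientFactory) {
        this.clientFactory = clientFactory;
    }

    @ProcessElement
    public void processElement(@Element Validated<PractitionerDataRecord> element) {
        if (microServiceApi == null) {
            // per the NOTE above, getInstance relies on our company-specific request context
            microServiceApi = clientFactory.getInstance(MicroServiceApi.class);
        }
        microServiceApi.writeRecord(element.getValue());
    }
}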
I suppose this may not be what you're looking for, but your use case makes me think of using factory objects. They may depend on the pipeline options that you pass (i.e. your PipelineOptions object), or on some other configuration object.
Perhaps something like this:
class MicroserviceApiClientFactory implements Serializable {
    private final PipelineOptions options;

    MicroserviceApiClientFactory(PipelineOptions options) {
        this.options = options;
    }

    public MicroserviceApiClient getClient() {
        MySpecialOptions specialOpts = options.as(MySpecialOptions.class);
        if (specialOpts.getMockMicroserviceApi()) {
            return new MockedMicroserviceApiClient(...); // Or whatever
        } else {
            return new MicroserviceApiClient(specialOpts.getMicroserviceEndpoint()); // Or whatever parameters it needs
        }
    }
}
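For completeness, the custom options interface referenced above could be declared roughly like this (a sketch; Beam generates the implementation, and the property names just mirror the getters used in the factory):
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;

public interface MySpecialOptions extends PipelineOptions {
    @Description("Return a mocked microservice API client instead of the real one")
    @Default.Boolean(false)
    boolean getMockMicroserviceApi();
    void setMockMicroserviceApi(boolean value);

    @Description("Endpoint of the real microservice API")
    String getMicroserviceEndpoint();
    void setMicroserviceEndpoint(String value);
}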
And for your DoFns and any other execution-time objects that need it, you would pass the factory:
private class ValidRecordPublisher extends DoFn<Validated<PractitionerDataRecord>, String> {
    private final MicroserviceApiClientFactory msFactory;
    private transient MicroserviceApiClient microServiceApi;

    ValidRecordPublisher(MicroserviceApiClientFactory msFactory) {
        this.msFactory = msFactory;
    }

    @ProcessElement
    public void processElement(@Element Validated<PractitionerDataRecord> element) {
        if (microServiceApi == null) microServiceApi = msFactory.getClient();
        microServiceApi.writeRecord(element.getValue());
    }
}
This should allow you to encapsulate the mocking functionality into a single class that lazily creates your mock or your client at pipeline execution time.
Let me know if this matches what you want somewhat, or if we should try to iterate further.
I have no experience with Guice, so I don't know if Guice configurations can easily pass the boundary between pipeline construction and pipeline execution (serialization / submitting JARs / etc.).
Should this be a sink? Maybe, if you have an external service, and you're writing to it, you can write a PTransform that takes care of it - but the question of how you inject various dependencies will remain.

Obtain CpsScript instance in workflow-cps groovy code?

Currently I am coding a lot of Groovy for very specific Jenkins scenarios.
The problem is that I have to keep track of the current CpsScript instance for the context (getting properties, the environment and so on) and its invokeMethod (workflow steps and the like).
Currently this means I pass this from the pipeline Groovy script into my entry class, and from there it's passed on to every class separately, which is very annoying.
The script instance is created by the CpsFlowExecution and stored within the Continuable instance and the CpsThreadGroup, neither of which allows you to retrieve it.
It seems that GlobalVariable-derived extensions receive it so that they have a context, but I'm currently not knowledgeable enough to write my own extension to leverage that.
So the question is:
Does anyone know of a way to keep track of the CpsScript-instance that doesn't require me to pass it on to every new class I create? (Or alternatively: obtain it from anywhere - does this really need to be so hard?)
I continued looking into ways to accomplish this. I even wrote a Jenkins plugin that provides a cpsScript global variable. Unfortunately you need the instance to provide a context for that call, so it's useless.
So as the "least bad solution"(tm) I created a class called ScriptContext that I can use as a base class for my pipeline classes (it implements Serializable).
When you write your pipeline script you either pass it the CpsScript statically once:
ScriptContext.script = this
Or, if you derived from it (make sure to call super()):
new MyPipeline(this)
If your class is derived from the ScriptContext your work is done. Everything will work as though you didn't create a class but just used the automagic conversion. If you use any CpsScript-level functions besides println, you might want to add these in here as well.
Anywhere else you can just call ScriptContext.script to get the script instance.
The class code (removed most of the comments to keep it as short as possible):
package ...
import org.jenkinsci.plugins.workflow.cps.*
class ScriptContext implements Serializable {
protected static CpsScript _script = null
ScriptContext(CpsScript script = null) {
if (!_script && script) {
_script = script
}
}
ScriptContext withScript(CpsScript script) {
setScript(script)
this
}
static void setScript(CpsScript script) {
if (!_script && script) {
_script = script
}
}
static CpsScript getScript()
{
_script
}
// functions defined in CpsScript itself are not automatically found
void println(what) {
_script.println(what)
}
/**
* For derived classes we provide missing method functionality by trying to
* invoke the method in script context
*/
def methodMissing(String name, args) {
if (!_script) {
throw new GroovyRuntimeException('ScriptContext: No script instance available.')
}
return _script.invokeMethod(name, args)
}
/**
* For derived classes we provide missing property functionality.
* Note: Since it's sometimes unclear whether a property is an actual property or
* just a function name without brackets, use evaluate for this instead of getProperty.
* @param name
* @param args
* @return
*/
def propertyMissing(String name) {
if (!_script) {
throw new GroovyRuntimeException('ScriptContext: No script instance available.')
}
_script.evaluate(name)
}
/**
* Wrap in node if needed
* #param body
* #return
*/
protected <V> V node(Closure<V> body) {
if (_script.env.NODE_NAME != null) {
// Already inside a node block.
body()
} else {
_script.node {
body()
}
}
}
}

Typo3 Call to a member function on null

I have seen similar problems on Stackoverflow but none of those answers has worked for me (including clearing cache, clearing PHP opcode cache systems, de-activating and re-activating the extension). Hopefully someone can point me in the right direction.
I am running a scheduled command for an extension. At some point my command will need to call the method test() from the MyController class.
I have tried to create a reference to the class via an inheritance call AND by ALL injection methods but no matter which way I try it I always get the same issue...:
Call to a member function test() on null
Most recently I used the injection method that is not recommended, but it simplifies my example below so I'll use it for now (VendorName and ExtensionName are obviously dummy names):
/**
 * @var \VendorName\ExtensionName\Controller\MyController
 * @inject
 */
protected $mc;
public function myCommand()
{
return $this->mc->test(); //should return true
}
...and inside MyController
public function test()
{
return true;
}
The issue isn't the injection call in the command class, but some automatically generated code in the MyController class. It seems Extension Builder can cause the error by incorrectly creating the @inject annotation in the wrong place. Here is the code it created automatically:
/**
 * @var \VendorName\ExtensionKey\Domain\Repository\ExampleRepository
 * @inject
 */
protected $importService = null;
/**
 * @inject
 */
protected $exampleRepository = null;
...that second @inject annotation creates the error. It should be just:
/**
 * @var \VendorName\ExtensionKey\Domain\Repository\ExampleRepository
 * @inject
 */
protected $importService = null;
protected $exampleRepository = null;
Unfortunately the debugging doesn't tell you which class is causing the issue, so I naturally thought it was my own code.

Passing parameter to a vaadin 7 web application

I was working with Vaadin 6, so I used the following method to retrieve the web application parameters:
public abstract class Application implements URIHandler,
Terminal.ErrorListener, Serializable {
/**
* Searches for the property with the specified name in this application.
* This method returns <code>null</code> if the property is not found.
*
* See {@link #start(URL, Properties, ApplicationContext)} how properties
* are defined.
*
* @param name
* the name of the property.
* @return the value in this property list with the specified key value.
*/
public String getProperty(String name) {
return properties.getProperty(name);
}
//...
}
After migrating to Vaadin 7 I want to use the same functionality, but I couldn't find it.
It seems you are looking for
VaadinServlet.getCurrent().getServletContext().getInitParameter("name");
It grants access to parameters defined in web.xml, for example:
<context-param>
<param-name>name</param-name>
<param-value>John</param-value>
</context-param>
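A minimal sketch of reading that parameter from a Vaadin 7 UI (the MyUI class and the Notification call are only illustrative):
import com.vaadin.server.VaadinRequest;
import com.vaadin.server.VaadinServlet;
import com.vaadin.ui.Notification;
import com.vaadin.ui.UI;

public class MyUI extends UI {
    @Override
    protected void init(VaadinRequest request) {
        // reads the <context-param> named "name" from web.xml ("John" in the example above)
        String name = VaadinServlet.getCurrent().getServletContext().getInitParameter("name");
        Notification.show("Hello " + name);
    }
}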

Lazy/Eager loading/fetching in Neo4j/Spring-Data

I have a simple setup and encountered a puzzling (at least for me) problem:
I have three pojos which are related to each other:
@NodeEntity
public class Unit {
    @GraphId Long nodeId;
    @Indexed int type;
    String description;
}
@NodeEntity
public class User {
    @GraphId Long nodeId;
    @RelatedTo(type="user", direction = Direction.INCOMING)
    @Fetch private Iterable<Worker> worker;
    @Fetch Unit currentUnit;
    String name;
}
@NodeEntity
public class Worker {
    @GraphId Long nodeId;
    @Fetch User user;
    @Fetch Unit unit;
    String description;
}
So you have User-Worker-Unit, where User has a "currentUnit" reference that allows jumping directly to the current unit. Each User can have multiple workers, but one worker is only assigned to one unit (one unit can have multiple workers).
What I was wondering is how to control the @Fetch annotation on "User.worker". I actually want this to be loaded only when needed, because most of the time I only work with "Worker".
I went through http://static.springsource.org/spring-data/data-neo4j/docs/2.0.0.RELEASE/reference/html/ and it isn't really clear to me:
worker is Iterable because it should be read-only (incoming relation) - in the documentation this is stated clearly, but in the examples Set is used most of the time. Why? Or doesn't it matter...
How do I get worker to only load on access? (lazy loading)
Why do I need to annotate even the simple relations (worker.unit) with @Fetch? Isn't there a better way? I have another entity with MANY such simple relations - I really want to avoid having to load the entire graph just because I want the properties of one object.
Am I missing a spring configuration so it works with lazy loading?
Is there any way to load any relationships (which are not marked as @Fetch) via an extra call?
From how I see it, this construct loads the whole database as soon as I want a Worker, even if I don't care about the User most of the time.
The only workaround I found is to use repository and manually load the entities when needed.
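For illustration, that workaround looks roughly like this (a sketch; WorkerRepository and the service method are just illustrative):
public interface WorkerRepository extends GraphRepository<Worker> {}

class WorkerService {
    @Autowired private WorkerRepository workerRepository;
    @Autowired private Neo4jTemplate template;

    public Worker loadWorkerWithUser(Long id) {
        Worker worker = workerRepository.findOne(id); // only simple properties are populated
        template.fetch(worker.user);                  // explicitly load the relation when needed
        return worker;
    }
}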
------- Update -------
I have been working with Neo4j for quite some time now and found a solution for the above problem that does not require calling fetch all the time (and thus does not load the whole graph). The only downside: it is a runtime aspect:
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Pointcut;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.neo4j.annotation.NodeEntity;
import org.springframework.data.neo4j.support.Neo4jTemplate;
import my.modelUtils.BaseObject;
@Aspect
public class Neo4jFetchAspect {
// the Neo4j template - make sure it gets injected
@Autowired private Neo4jTemplate template;
@Around("modelGetter()")
public Object autoFetch(ProceedingJoinPoint pjp) throws Throwable {
Object o = pjp.proceed();
if(o != null) {
if(o.getClass().isAnnotationPresent(NodeEntity.class)) {
if(o instanceof BaseObject<?>) {
BaseObject<?> bo = (BaseObject<?>)o;
if(bo.getId() != null && !bo.isFetched()) {
return template.fetch(o);
}
return o;
}
try {
return template.fetch(o);
} catch(MappingException me) {
me.printStackTrace();
}
}
}
return o;
}
@Pointcut("execution(public my.model.package.*.get*())")
public void modelGetter() {}
}
You just have to adapt the package in the pointcut (execution(public my.model.package.*.get*())) so it matches where your model classes live.
I apply the aspect to ALL get methods on my model classes. This requires a few prerequisites:
You MUST use getters in your model classes (the aspect does not work on public attributes - which you shouldn't use anyways)
all model classes are in the same package (so you need to adapt the code a little) - I guess you could adapt the filter
aspectj as a runtime component is required (a little tricky when you use tomcat) - but it works :)
ALL model classes must implement the BaseObject interface which provides:
public interface BaseObject {
public boolean isFetched();
}
This prevents double-fetching. I just check for an attribute that is mandatory (i.e. the name, or something else except nodeId) to see if the object is actually fetched. Neo4j will create an object but only fill the nodeId and leave everything else untouched (so everything else is NULL).
For example:
@NodeEntity
public class User implements BaseObject {
    @GraphId
    private Long nodeId;
    String username = null;

    @Override
    public boolean isFetched() {
        return username != null;
    }
}
If someone finds a way to do this without that weird workaround please add your solution :) because this one works, but I would love one without aspectj.
Base object design that doesn't require a custom field check
One optimization would be to create a base class instead of an interface that actually uses a Boolean field (Boolean loaded) and checks on that (so you don't need to worry about manual checking):
public abstract class BaseObject {
private Boolean loaded;
public boolean isFetched() {
return loaded != null;
}
/**
* getLoaded will always return true (is read when saving the object)
*/
public Boolean getLoaded() {
return true;
}
/**
* setLoaded is called when loading from neo4j
*/
public void setLoaded(Boolean val) {
this.loaded = val;
}
}
This works because "true" is returned for loaded when the object is saved. When the aspect looks at the object it uses isFetched(), which returns false as long as the object has not yet been retrieved (loaded is still null). Once the object is retrieved, setLoaded is called and the loaded variable is set to true.
How to prevent Jackson from triggering the lazy loading?
(As an answer to the question in the comment - note that I didn't try this out yet since I did not have the issue.)
With Jackson I suggest using a custom serializer (see e.g. http://www.baeldung.com/jackson-custom-serialization ). This lets you check the entity before getting the values: you simply check whether it is already fetched and either go on with the whole serialization or write just the id:
public class ItemSerializer extends JsonSerializer<BaseObject> {
@Override
public void serialize(BaseObject value, JsonGenerator jgen, SerializerProvider provider)
throws IOException, JsonProcessingException {
// serialize the whole object
if(value.isFetched()) {
super.serialize(value, jgen, provider);
return;
}
// only serialize the id
jgen.writeStartObject();
jgen.writeNumberField("id", value.nodeId);
jgen.writeEndObject();
}
}
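Registering such a serializer is then a one-time Jackson setup, roughly like this (a sketch; the module and mapper names are illustrative):
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.module.SimpleModule;

ObjectMapper mapper = new ObjectMapper();
SimpleModule module = new SimpleModule();
module.addSerializer(BaseObject.class, new ItemSerializer());
mapper.registerModule(module);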
Spring Configuration
This is a sample Spring configuration I use - you need to adjust the packages to your project:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:neo4j="http://www.springframework.org/schema/data/neo4j"
xmlns:tx="http://www.springframework.org/schema/tx"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd
http://www.springframework.org/schema/data/neo4j http://www.springframework.org/schema/data/neo4j/spring-neo4j-2.0.xsd http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-2.5.xsd">
<context:annotation-config/>
<context:spring-configured/>
<neo4j:repositories base-package="my.dao"/> <!-- repositories = dao -->
<context:component-scan base-package="my.controller">
<context:exclude-filter type="annotation" expression="org.springframework.stereotype.Controller"/> <!-- that would be our services -->
</context:component-scan>
<tx:annotation-driven mode="aspectj" transaction-manager="neo4jTransactionManager"/>
<bean class="corinis.util.aspects.Neo4jFetchAspect" factory-method="aspectOf"/>
</beans>
AOP config
This is the /META-INF/aop.xml needed for this to work:
<!DOCTYPE aspectj PUBLIC
"-//AspectJ//DTD//EN" "http://www.eclipse.org/aspectj/dtd/aspectj.dtd">
<aspectj>
<weaver>
<!-- only weave classes in our application-specific packages -->
<include within="my.model.*" />
</weaver>
<aspects>
<!-- weave in just this aspect -->
<aspect name="my.util.aspects.Neo4jFetchAspect" />
</aspects>
</aspectj>
Found the answer to all the questions myself:
Iterable: yes, Iterable can be used for read-only relationships.
Load on access: by default nothing is loaded, and automatic lazy loading is not available (at least as far as I can gather).
For the rest:
When I need a relationship I either have to use @Fetch or use the Neo4jTemplate fetch method:
@NodeEntity
public class User {
    @GraphId Long nodeId;
    @RelatedTo(type="user", direction = Direction.INCOMING)
    private Iterable<Worker> worker;
    @Fetch Unit currentUnit;
    String name;
}
class GetService {
    @Autowired private Neo4jTemplate template;
    public void doSomethingFunction() {
        User u = ....;
        // worker is not available here
        template.fetch(u.worker);
        // do something with the worker
    }
}
Not transparent, but still lazy fetching.
template.fetch(person.getDirectReports());
And @Fetch does the eager fetching, as was already stated in your answer.
I like the aspect approach to work around the limitation of the current Spring Data way of handling lazy loading.
@niko - I have put your code sample in a basic Maven project and tried to get that solution to work, with little success:
https://github.com/samuel-kerrien/neo4j-aspect-auto-fetching
For some reason the aspect is initialising but the advice doesn't seem to get executed. To reproduce the issue, just run the following JUnit test:
playground.neo4j.domain.UserTest
