Is there a way to track usage of a global shared library in Jenkins?

Context:
At my work most developers are free to write their own Jenkinsfile for their own team's projects.
As the Jenkins admin, I provide developers with a global shared library.
Most projects use v1, v2, v3, or some other version of this library, loading it with the idiom library("theSharedLib#v#").
Question: Is there a way for me to find out which Jenkinsfile is using which version of the shared library without having to actually look into all those Jenkinsfiles (50+ files in as many git repos)?
What I would like best is some mechanism that records (in a file on the Jenkins master or in a DB) which project/Jenkinsfile is using which version at the time the library is loaded.
A possible solution would be to add some code to every function inside the library that actually does this reporting; I could then see which function is used by whom. Any better solution?
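A rough sketch of what such a reporting step inside the library could look like (the step name, log location, and format here are purely hypothetical):

// vars/reportLibraryUsage.groovy -- hypothetical step added to the shared library.
// Writing to a flat file on the master is just one option; a DB or HTTP endpoint would work the same way.
def call(String libraryVersion) {
    // Trusted global libraries run outside the sandbox, so plain file I/O on the master is possible here.
    def entry = "${new Date()} ${env.JOB_NAME} ${libraryVersion}\n"
    new File('/var/lib/jenkins/shared-lib-usage.log') << entry
}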

I wrote https://github.com/CiscoDevNet/es-logger to gather information such as this from Jenkins. It has a plugin that runs a regex against the console log of a completed job and can then post events to Elasticsearch.
Jenkins helpfully prints library loads at the start of the log, such as:
Loading library sharedLib#version
So a simple regex like
"^Loading library\s+(?P<library_name>.*?)#(?P<library_version>.*?)\s*$"
added to the console_log_events plugin would generate an event in Elasticsearch for each usage and each version.

Related

Add condition to transition using script runner

I am using the ScriptRunner plugin for Jira.
Is it possible to add a condition to a transition using ScriptRunner?
Currently, my condition is in a script which I have manually added to the workflow.
But I was wondering if there is a way to do it automatically?
I was looking through documentation on: https://docs.atlassian.com/
I came across this method:
replaceConditionInTransition, which is a method of WorkflowManager.
But I'm unsure how to use it.
Any help would be appreciated.
Conditions, like any other scripts, can be added from the file system. You can store the scripts in any VCS (Bitbucket, GitHub, GitLab, etc.) and automatically deploy them to the Jira server file system through any CI/CD system (TeamCity, Jenkins, Bamboo, GitLab, etc.).
As a result, the process looks like this:
1. Commit the change to your script in the VCS.
2. Wait a bit for the auto-deploy (e.g. triggered by the commit).
3. Done.
Additionally, you can write a script/service to commit these changes automatically if needed.
Also look at script roots; they are a helpful way to reuse script fragments through helper classes.
This is a rather conceptual answer, because the implementation depends on your environment, but I hope it gives you at least one more point of view on how to solve this task.
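For illustration, a minimal condition script of the kind that could live on the file system might look like this (a sketch only; verify the exact bindings such as issue and passesCondition against the ScriptRunner docs for your version):

// Hypothetical condition script stored under a ScriptRunner script root on the Jira server,
// so it can be kept in VCS and deployed there by your CI/CD system.
// ScriptRunner condition scripts typically expose the current issue and a passesCondition flag.
passesCondition = issue.assignee != null && issue.priority?.name != 'Blocker'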
I think that using the Java API to modify Jira workflows is pretty tough. You could dig around in the workflow editor to see how conditions are added there. Remember that you have to do this in a draft workflow and then publish it, which takes some time in large projects.
I like the idea of replacing a script file as the easier route, if it can be done when no issues are transitioning.

Performing denodo tasks from Jenkins

I am trying to create a working prototype for performing Denodo activities from my Jenkins server.
The steps I want to perform are:
Import a VSQL file from Git to Denodo from Jenkins.
Create a view in Denodo from Jenkins.
Run this VSQL file in Denodo from Jenkins.
I am new to the Denodo world and I am not sure whether Denodo has any APIs for doing this.
Can someone let me know if this is really possible? If so, where can I find a solution for this requirement? I have been searching the internet for the last few days but couldn't find a solution.
The reason you don't find much on the web about this is that the file format and query language in Denodo is called VQL, not VSQL. Try searching for that and you will find a lot.
Anyway, about your problem:
You have two options for CI and CD with Denodo. If you use Jenkins and just want to create views based on actions in other systems (e.g. create a base view as soon as a new table is created in the source), you can simply send the VQL create script (containing CREATE WRAPPER and CREATE VIEW) to the server via JDBC or ODBC. For that, create a technical user in Denodo and load the driver onto the Jenkins server.
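As a very rough sketch of that first option, the Jenkins job could run a small Groovy script on an agent that has the driver on its classpath (the driver class, JDBC URL format, port, and credentials handling below are assumptions; check the documentation for your Denodo version):

// deploy_vql.groovy -- hypothetical sketch; adapt the connection details to your Denodo installation.
import java.sql.DriverManager

def vql = new File('views/create_view.vql').text             // VQL script checked out from Git
def conn = DriverManager.getConnection(
        'jdbc:vdb://denodo-host:9999/my_database',            // assumed Denodo VDP JDBC URL format
        System.getenv('DENODO_USER'),                          // the technical user created in Denodo
        System.getenv('DENODO_PASS'))
try {
    conn.createStatement().execute(vql)                        // runs the CREATE WRAPPER / CREATE VIEW statements
} finally {
    conn.close()
}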
The other option, if you are using Denodo 7, is to use the Solution Manager. It provides a REST API where you can create revisions, test them on different environments, and deploy them. I'm not sure whether you can create a revision from VQL code coming from Jenkins, but I think it should be possible.

How can I serve HTML files for development using maven?

I'm working on a Scala app (built with Maven) where the UI is HTML and JavaScript and the back end is a REST API. For deployment, the HTML/JavaScript will just get thrown into nginx as static resources, but for development I just want something that serves the files from local disk. Other teams use gulp-connect for this, but I'm hoping to avoid adding a second build tool (i.e., gulp) to my stack if I can.
What are my options for going about this? I see there's an nginx plugin for Maven, but it's poorly documented. NanoHttpd seems promising, but it looks like I'd have to write my own Maven plugin.

Jenkins: Do 'Static' and '(blocking)' imply that the respective plugins are used?

New to Jenkins and new to the way it is used at my new workplace, I encountered the following on a job's page:
I would like to understand what I am seeing, so a quick Google search yielded the following regarding the use of 'Static' and '(blocking)' in Jenkins:
Static Code Analysis Plug-ins
Build Blocker Plugin
Are these the right places to look in order to understand the meaning of these keywords in the context of Jenkins?
If not, what is the right place to find out about them?

archiver/publisher for external artifact URL

I'm generating a presigned S3 URL as part of a workflow job, which is passed into a build step that essentially runs outside of the workspace (via ssh). I've been unable to identify an existing publish or archive plugin (workflow compatible or not) that will allow setting an external URL. Is there a plugin or workaround that enables setting a URL as an artifact or the addition of simple metadata to the build results?
I'm a bit surprised that there isn't a way to publish metadata directly from a workflow; it seems like this would be insanely useful functionality. Have I missed something obvious?
JENKINS-26918 proposes support for some of the features in the Groovy Postbuild plugin. Using core Workflow, you can call
currentBuild.description = "Published"
(or whatever your installation’s markup formatter allows).
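For example, a Workflow script could record the presigned URL in the build description; whether it renders as a clickable link depends on the markup formatter configured on your instance (the URL below is only a placeholder):

// Sketch: surface the externally generated artifact URL on the build page.
def presignedUrl = 'https://my-bucket.s3.amazonaws.com/build-output.tgz?X-Amz-Signature=...'  // placeholder
currentBuild.description = "Artifact: <a href=\"${presignedUrl}\">download</a>"  // needs an HTML-capable markup formatter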
