Jenkins shared library and Jenkinsfiles in the same repo

Is it possible to have a global shared library and jenkinsFiles at the same repo?
I want to have something like
root
├── all-jenkins-files
│   ├── dir1
│   │   └── Jenkinsfile1
│   └── dir2
│       └── Jenkinsfile2
└── shared-libraries
    ├── src
    └── vars
I was trying to use the global shared libraries configuration, but I believe it failed because of the directory structure: global shared libraries expect the src and vars folders to be directly under the repository root.
Any idea how to overcome this?

Yes, it's possible to keep a shared-library-like structure in a subfolder of the same repository, but I don't think you can use it as a real shared library (implicit or dynamic loading).
In this scenario you can use the load step instead.
If the repository layout in SCM looks like:
.
├── shared-library
│   ├── src
│   └── vars
│       ├── log.groovy
│       └── myPipeline.groovy
└── all-jenkins-file
    └── Jenkinsfile
shared-library/vars/myPipeline.groovy
stage('01') {
    echo "01"
}
stage('02') {
    echo "02"
}
stage('03') {
    echo "03"
}
shared-library/vars/log.groovy
def info(message) {
    echo "INFO: ${message}"
}
def warning(message) {
    echo "WARNING: ${message}"
}
return this
all-jenkins-file/Jenkinsfile
node {
    checkout scm
    load "${env.WORKSPACE}/shared-library/vars/myPipeline.groovy"
    def log = load "${env.WORKSPACE}/shared-library/vars/log.groovy"
    log.info('Hello')
}
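If you would rather keep real shared-library semantics (implicit loading, @Library), note that newer versions of the Pipeline: Groovy Libraries plugin can load a library from a subdirectory of the repository via a "Library Path" setting, which is also exposed to the dynamic library step. This is a hedged sketch — verify that your plugin version supports libraryPath before relying on it; the library name and remote URL are placeholders:

```groovy
// Jenkinsfile -- sketch; requires a Pipeline: Groovy Libraries plugin
// version that supports libraryPath (check your installation first).
library identifier: 'mylib@main',
        retriever: modernSCM(
            scm: [$class: 'GitSCMSource',
                  remote: 'https://git.example.com/your-repo.git'],
            libraryPath: 'shared-libraries/')  // folder containing src/ and vars/
```

With this in place the src/ and vars/ conventions work as usual even though they are not at the repository root.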

Related

Change test execution directory in Bazel?

I have a simple Bazel project layout like this:
.
├── foo
│   ├── BUILD.bazel
│   ├── testdata
│   │   └── a.txt
│   └── test.sh
└── WORKSPACE
The test checks that a.txt exists:
foo/test.sh
#!/bin/bash
FILE=foo/testdata/a.txt
test -f "$FILE"
And it is defined for Bazel like this:
foo/BUILD.bazel
sh_test(
    name = "foo",
    size = "small",
    srcs = ["test.sh"],
    data = glob(["testdata/*.txt"]),
)
However, suppose I don't want my test to depend on its location within the workspace:
#!/bin/bash
FILE=testdata/a.txt # <-------- Path relative to the package directory
test -f "$FILE"
This does not work of course.
$ bazel test --cache_test_results=no --test_output=streamed //foo
...
//foo:foo FAILED in 0.0s
Is there a way to define my test target in Bazel so that it works, without modifying my test script?
In case it matters:
$ bazel --version
bazel 5.3.1
Bazel expects files to be addressed relative to the WORKSPACE file, so the path has to be stored somewhere. For instance, you can move the path from the .sh file into the BUILD file:
Add Skylib to your project, i.e. extend your WORKSPACE file:
WORKSPACE
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
    name = "bazel_skylib",
    sha256 = "74d544d96f4a5bb630d465ca8bbcfe231e3594e5aae57e1edbf17a6eb3ca2506",
    urls = [
        "https://mirror.bazel.build/github.com/bazelbuild/bazel-skylib/releases/download/1.3.0/bazel-skylib-1.3.0.tar.gz",
        "https://github.com/bazelbuild/bazel-skylib/releases/download/1.3.0/bazel-skylib-1.3.0.tar.gz",
    ],
)
load("@bazel_skylib//:workspace.bzl", "bazel_skylib_workspace")
bazel_skylib_workspace()
Replace the path in the bash script using text replacement:
foo/BUILD.bazel
load("@bazel_skylib//rules:expand_template.bzl", "expand_template")
expand_template(
    name = "modify_for_bazel",
    out = "test_modified_for_bazel.sh",
    substitutions = {
        "FILE=testdata/a.txt": "FILE=foo/testdata/a.txt",
    },
    template = "test.sh",
)
sh_test(
    name = "foo",
    size = "small",
    srcs = ["test_modified_for_bazel.sh"],
    data = glob(["testdata/*.txt"]),
)
Now bazel test --cache_test_results=no --test_output=streamed //foo works.
A similar problem is described here: Bazel: Reading a file with relative path to package, not workspace
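A different sketch, if a one-line change to the script is acceptable: the env attribute of sh_test is subject to $(rootpath ...) expansion, so the path can live entirely in the BUILD file without a template step. The target name foo_env is made up for illustration, and test.sh would read "$FILE" from the environment instead of hard-coding a path:

```starlark
# foo/BUILD.bazel -- hypothetical variant; assumes test.sh does
# `test -f "$FILE"` with FILE taken from the environment.
sh_test(
    name = "foo_env",
    size = "small",
    srcs = ["test.sh"],
    data = ["testdata/a.txt"],
    env = {"FILE": "$(rootpath testdata/a.txt)"},
)
```

This keeps the workspace-relative path out of the script, at the cost of one environment lookup in it.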

How can I copy files from parameterised directory using maven archetypes?

I need to copy files from a parameterised local directory to a specific directory inside the project. I currently have this archetype-metadata.xml.
<?xml version="1.0" encoding="UTF-8"?>
<archetype-descriptor
    xsi:schemaLocation="http://maven.apache.org/plugins/maven-archetype-plugin/archetype-descriptor/1.0.0 http://maven.apache.org/xsd/archetype-descriptor-1.0.0.xsd"
    xmlns="http://maven.apache.org/plugins/maven-archetype-plugin/archetype-descriptor/1.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    name="custom-maven-archetype">
  <fileSets>
    <fileSet filtered="true" packaged="true" encoding="UTF-8">
      <directory>src/main/java</directory>
    </fileSet>
    <fileSet filtered="true" packaged="true" encoding="UTF-8">
      <directory>src/test/java</directory>
    </fileSet>
    <fileSet filtered="true" packaged="true" encoding="UTF-8">
      <directory>${wsdlFile}</directory>
      <outputDirectory>${project.basedir}</outputDirectory>
    </fileSet>
  </fileSets>
  <requiredProperties>
    <requiredProperty key="codigoMunicipal" />
    <requiredProperty key="wsdlFile" />
  </requiredProperties>
</archetype-descriptor>
This is my current filetree.
.
├── pom.xml
├── src
│   └── main
│       └── resources
│           ├── META-INF
│           │   └── maven
│           │       └── archetype-metadata.xml
│           └── archetype-resources
│               ├── README.md
│               └── pom.xml
└── target
    ├── agency-lib-archetype-1.0.0.jar
    └── classes
        ├── META-INF
        │   └── maven
        │       └── archetype-metadata.xml
        └── archetype-resources
            ├── README.md
            └── pom.xml
You can also use archetype-post-generate.groovy to copy the files manually.
See the Post-Generation Script section at the end of this page: https://maven.apache.org/archetype/maven-archetype-plugin/advanced-usage.html
As @Devendra suggested, something like this in your Groovy script:
def command = "cp ${request.getOutputDirectory()}/subfolderName ./destinationFolder"
def proc = command.execute()
proc.waitFor()
if (proc.exitValue() == 0) {
    println "Post Script: Moving files... SUCCESS"
} else {
    println "Std Err: ${proc.err.text}"
    println "Post Script: ... KO, error moving files"
}
Where request is the instance of https://maven.apache.org/archetype/archetype-common/apidocs/org/apache/maven/archetype/ArchetypeGenerationRequest.html attached to this generation.
Or simply use variables in the Groovy script:
def command = "cp ./${artifactId}/subfolderName ./destinationFolder"
def proc = command.execute()
proc.waitFor()
if (proc.exitValue() == 0) {
    println "Post Script: Moving files... SUCCESS"
} else {
    println "Std Err: ${proc.err.text}"
    println "Post Script: ... KO, error moving files"
}
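Shelling out to cp ties the script to Unix-like systems. A portable sketch of the same idea, using Java NIO inside archetype-post-generate.groovy — the folder names subfolderName and destinationFolder are hypothetical placeholders, as above:

```groovy
// archetype-post-generate.groovy -- hypothetical portable variant using
// Java NIO instead of shelling out to `cp` (which fails on Windows).
import java.nio.file.*

def outputDir = Paths.get(request.outputDirectory, request.artifactId)
def source = outputDir.resolve("subfolderName")      // hypothetical source folder
def target = outputDir.resolve("destinationFolder")  // hypothetical destination

Files.createDirectories(target)
// Walk the source tree and mirror every entry under the target.
Files.walk(source).forEach { p ->
    def dest = target.resolve(source.relativize(p).toString())
    if (Files.isDirectory(p)) {
        Files.createDirectories(dest)
    } else {
        Files.copy(p, dest, StandardCopyOption.REPLACE_EXISTING)
    }
}
```

request is the same ArchetypeGenerationRequest binding used in the cp-based snippets.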

Jenkins pipeline shared library can't invoke method in src directory

I want to invoke a method from the src directory inside a vars script. It works in my IDE, but it seems not to work in Jenkins.
1. Project structure
.
├── src
│   └── foo
│       └── DemoClass.groovy
└── vars
    └── varDemo.groovy
2. Content of DemoClass.groovy
#!groovy
package foo

class DemoClass {
    def testDemoMethod() {
        println("src DemoClass testDemoMethod")
    }
}
3. Content of varDemo.groovy
#!groovy
import foo.DemoClass

def testVarsDemo() {
    println("vars varDemo.groovy testVarsDemo")
}

def testVarsInvokeDemoMethod() {
    println("vars varDemo.groovy testVarsInvokeDemoMethod")
    def demoClass = new DemoClass()
    demoClass.testDemoMethod()
    println("end vars varDemo.groovy testVarsInvokeDemoMethod")
}
4. Jenkins pipeline
@Library('tools') _
varDemo.testVarsDemo()
varDemo.testVarsInvokeDemoMethod()
5. Execution result in the pipeline
> git checkout -f b6176268be99abe300d514e1703ff8a08e3ef8da
Commit message: "test"
> git rev-list --no-walk c1a50961228ca071d43134854548841a056e16c9 # timeout=10
[Pipeline] echo
vars varDemo.groovy testVarsDemo
[Pipeline] echo
vars varDemo.groovy testVarsInvokeDemoMethod
[Pipeline] echo
end vars varDemo.groovy testVarsInvokeDemoMethod
[Pipeline] End of Pipeline
It seems demoClass.testDemoMethod() does not work. Why can't it be invoked? If I want to invoke a method in the src directory, what should I do? Thank you!
Without reimplementing your code locally, here are some differences I notice between your setup and mine, which works.
JENKINSFILE
I don't have a space before the underscore on my @Library line.
Immediately after my @Library line I import the shared-library class that implements the methods I want to call. In your case this would be import foo.DemoClass.
My call to the method is of the form (new DemoClass(config, this)).testVarsInvokeDemoMethod().
SHARED LIBRARY CLASSES
I don't have #!groovy in any of my Groovy classes.
My class is public and implements Serializable.
Hopefully one of these differences is the source of why it's not getting called.
We need to follow the structure below:
src/packagename/GroovyFile.groovy
vars/callFromJenkinsPipelne.groovy
The Jenkins pipeline should look like this:
library('libraryname@branchname')
callFromJenkinsPipelne()
Inside the vars file (callFromJenkinsPipelne.groovy) we can call the Groovy file that lives in the src folder:
// code in the vars folder Groovy file (callFromJenkinsPipelne.groovy)
def call() {
    def groovyFile = new com.jenkins.mypackagename.GroovyFile()
    groovyFile.methodName(parameters)
}
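The usual cause of the "silent" call in the question is that println inside a src class is not routed to the build log the way it is in a vars script. The pattern documented for shared libraries is to pass the pipeline script (this) into the class and call real steps on it; a minimal sketch:

```groovy
// src/foo/DemoClass.groovy -- sketch: inject the pipeline script so the
// class can call real pipeline steps such as echo.
package foo

class DemoClass implements Serializable {
    def steps

    DemoClass(steps) {
        this.steps = steps
    }

    def testDemoMethod() {
        steps.echo "src DemoClass testDemoMethod"
    }
}
```

In vars/varDemo.groovy the call then becomes new DemoClass(this).testDemoMethod(), and the message shows up in the build log.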

Integrate Angular and Webpack in an ASP.NET MVC Application [closed]

I am looking for a write-up with steps I can follow to add Angular to an existing MVC application that lives in an area of my solution. I found a write-up that shows how to add Angular using a gulp.js file that pulls the necessary node_modules and transpiles my .ts files into .js files. I would like to do the exact same thing using webpack instead.
Currently my package.json looks like this:
{
  "version": "1.0.0",
  "name": "aspnet",
  "private": true,
  "scripts": {},
  "dependencies": {
    "@angular/animations": "4.3.5",
    "@angular/common": "4.3.5",
    "@angular/compiler": "4.3.5",
    "@angular/compiler-cli": "4.3.5",
    "@angular/core": "4.3.5",
    "@angular/forms": "4.3.5",
    "@angular/http": "4.3.5",
    "@angular/platform-browser": "4.3.5",
    "@angular/platform-browser-dynamic": "4.3.5",
    "@angular/platform-server": "4.3.5",
    "@angular/router": "4.3.5",
    "@angular/upgrade": "4.3.5",
    "angular-in-memory-web-api": "0.3.2",
    "bootstrap": "3.3.7",
    "core-js": "2.5.0",
    "ie-shim": "0.1.0",
    "rxjs": "5.4.3",
    "zone.js": "0.8.16",
    "systemjs": "^0.20.18"
  },
  "devDependencies": {
    "gulp": "^3.9.1",
    "gulp-clean": "^0.3.2",
    "gulp-concat": "^2.6.1",
    "gulp-tsc": "~1.3.2",
    "gulp-typescript": "^3.2.2",
    "path": "^0.12.7",
    "typescript": "^2.4.2"
  }
}
This is the write-up I followed that uses gulp, and it works.
But I would much rather use webpack than a gulp task.
I hope this can help you find the idea, because our projects also needed to integrate webpack and ASP.NET MVC together. Note that this is my own approach, so there might be a better way to do it. Below is what we do.
Check the source code on GitHub.
1. Structure the project
We separated our project into two folders, Client and Server, located in the mvc5-angular-webpack folder, which is committed to the repository:
mvc5-angular-webpack/
├── Server/
│   ├── WebApplication/
│   │   ├── Controllers/
│   │   ├── Scripts/
│   │   ├── Web.config
│   │   └── many more folders and files...
│   └── Web-Core.sln
│
└── Client/
    ├── modules/
    │   ├── angularModule-1/
    │   │   ├── main.ts
    │   │   ├── app.modules.ts
    │   │   ├── app.component.ts
    │   │   └── many more files...
    │   ├── angularModule-2/
    │   │   ├── main.ts
    │   │   ├── app.modules.ts
    │   │   ├── app.component.ts
    │   │   └── many more files...
    │   ├── polyfill.ts
    │   └── vendor.ts
    ├── build.bat
    ├── npm-shrinkwrap.json
    ├── package.json
    ├── tsconfig.json
    ├── tslint.json
    └── webpack.config.js
In the Server folder we added the MVC solution, named Web-Core.sln, and all the common library projects written in C#.
The Client folder contains only front-end-related files. To build the front-end project, simply call build.bat; I will talk about this file's content later. Inside the modules folder, the project has one subfolder per module.
Our website still has some modules using server-side rendering with pure Razor, and some modules written in client-side code with AngularJS and Angular.
2. Configure webpack
Assuming you have already configured TypeScript and npm, let's see what is inside webpack.config.js:
const webpack = require('webpack')
const path = require('path')
const UglifyJsPlugin = require('uglifyjs-webpack-plugin')

const entryPath = path.resolve(__dirname, 'modules')
const corePath = path.resolve(__dirname, '../Server/WebApplication/Scripts/ng2')
const module1 = `${entryPath}/angularModule-1`
const module2 = `${entryPath}/angularModule-2`

module.exports = (envOptions) => {
    envOptions = envOptions || {};
    const config = {
        entry: {
            'polyfills': `${entryPath}/polyfill.ts`,
            'vendors': `${entryPath}/vendor.ts`,
            'module1': `${module1}/main.ts`,
            'module2': `${module2}/main.ts`
        },
        output: {
            path: corePath,
            filename: '[name].js',
            sourceMapFilename: "[name].js.map"
        },
        resolve: {
            extensions: ['.ts', '.js', '.html']
        },
        module: {
            rules: [
                {
                    test: /\.ts$/,
                    loaders: ['awesome-typescript-loader', 'angular2-template-loader']
                },
                {
                    test: /\.html$/,
                    loader: 'raw-loader'
                },
                {
                    test: /\.css$/,
                    loader: 'raw-loader'
                }
            ]
        },
        devtool: 'source-map',
        plugins: [
            new webpack.NoEmitOnErrorsPlugin(),
            new webpack.optimize.CommonsChunkPlugin({
                name: ['vendors', 'polyfills']
            })
        ]
    }
    if (envOptions.MODE === 'prod') {
        config.plugins.push(
            new UglifyJsPlugin()
        )
    }
    return config;
}
So basically it resolves the directories one level up and puts all the compiled files into Scripts/ng2 inside the Server folder.
3. Configure npm script
After configuring webpack, we add some scripts to run during the build process. Add the following to the package.json file:
"scripts": {
    "tsc": "tsc",
    "tsc:w": "tsc -w",
    "dev": "webpack-dev-server --https --open",
    "watch": "webpack --config webpack.config.js --watch",
    "build": "webpack --config webpack.config.js",
    "build:html": "webpack --config webpack-html-plugin.config.js --env.MODE=prod",
    "build:prod": "webpack --config webpack.config.js --env.MODE=prod"
}
4. Configure build.bat
At the beginning of the Angular 2 integration, we created an empty web application project for front-end purposes and added it as a dependency of our WebApplication. But our back-end team later complained about how slow the front-end step made every build, because they don't need the front-end project built every time the WebApplication is built.
The idea of the build.bat file is to run it manually to get the latest version of the front-end code on a machine, not on every single project build.
call npm install --production --loglevel verbose
echo "Build Angular projects"
npm run build:prod
call is used so the script continues, because some of the commands would otherwise abort the command line (refer here).
The script is very simple. First, npm install restores all the necessary dependencies. Then build:prod, as defined in package.json, runs; webpack bundles our TypeScript code into three big JavaScript files: vendors.js, polyfills.js and module1.js.
Our team uses Jenkins for deployment, so our DevOps just needs to include running build.bat and we are all set. If you want to run it every time your project is built, you can set it in the pre-build or post-build event.
5. Reference compiled JavaScript file in the view.
Normally we return only one view per area, as below. my-angular-app is the selector we defined in app.component.ts.
Index.cshtml
<script src="~/Scripts/ng2/polyfills.js"></script>
<script src="~/Scripts/ng2/vendors.js"></script>
<script src="~/Scripts/ng2/module1.js"></script>
<my-angular-app>
Loading...
</my-angular-app>
HomeController.cs
public ActionResult Module1()
{
    return View();
}
There is a drawback if we deploy to production: after the compiled JavaScript is updated, the browser sometimes still keeps the old version of the file because of caching. We need a mechanism to give the files a unique name after each deployment. There are three options to do so.
i. html-webpack-plugin
For a pure front-end project, webpack provides html-webpack-plugin to take care of this, as below. Technically, it automatically injects the JavaScript files, with unique hashes, into our view.
webpack-html-webpack-plugin.config.js
...
const HtmlWebpackPlugin = require('html-webpack-plugin');
const viewPath = path.resolve(
    __dirname,
    "../Server/WebApplication/Views/Home"
);
...
entry: {
    polyfills: `${entryPath}/polyfill.ts`,
    vendors: `${entryPath}/vendor.ts`,
    module1: `${module1}/main.ts`
},
output: {
    path: corePath,
    filename: "[name].[hash].js",
    sourceMapFilename: "[name].[hash].js.map"
},
...
plugins: [
    ...,
    new HtmlWebpackPlugin({
        template: viewPath + "/loader.cshtml",
        filename: viewPath + "/Module1.cshtml",
        inject: false
    })
]
In the same folder as the designated view, we created a cshtml file named loader.cshtml
loader.cshtml
<% for (var chunk in htmlWebpackPlugin.files.chunks) { %>
<script src="<%= htmlWebpackPlugin.files.chunks[chunk].entry %>"></script>
<% } %>
<my-angular-app>
Loading...
</my-angular-app>
Then run npm run build:html as defined in package.json. If it succeeds, the result will look like this:
@{
    ViewBag.Title = "Module 1";
}
<script src="../../Scripts/ng2/polyfills.1470de344a0f2260b700.js"></script>
<script src="../../Scripts/ng2/vendors.1470de344a0f2260b700.js"></script>
<script src="../../Scripts/ng2/module1.1470de344a0f2260b700.js"></script>
<my-angular-app>Loading....</my-angular-app>
ii. Define your own JavaScript version
Our ASP.NET MVC project is a bit different because we use more than one Angular app, so we define a version in a class and append it to the file name when loading the JS. This makes sure the latest version is loaded into the browser. It is outside the scope of this question, so I won't go further; basically, it looks like the code below.
<script src="@string.Format("{0}?v={1}", "~/Scripts/ng2/polyfills.js", VersionHelper.CurrentVersion())"></script>
<script src="@string.Format("{0}?v={1}", "~/Scripts/ng2/vendors.js", VersionHelper.CurrentVersion())"></script>
<script src="@string.Format("{0}?v={1}", "~/Scripts/ng2/module1.js", VersionHelper.CurrentVersion())"></script>
When served in the browser, it will look like:
<script src="~/Scripts/ng2/polyfills.js?v=1.1.0"></script>
<script src="~/Scripts/ng2/vendors.js?v=1.1.0"></script>
<script src="~/Scripts/ng2/module1.js?v=1.1.0"></script>
iii. ASP.NET Bundling
At Zyllem, our team didn't use it because of our JavaScript files is configure inside the model and render it later to the view.
You can just open App_Start\BundleConfig.cs in your project and config a bundle. Let say the name is module1
bundles.Add(new ScriptBundle("~/bundles/module1").Include(
    "~/Scripts/ng2/polyfills.js",
    "~/Scripts/ng2/vendors.js",
    "~/Scripts/ng2/module1.js"));
And render it inside the view:
Index.cshtml
@Scripts.Render("~/bundles/module1")
When served in the browser, the bundle URL has a unique token at the end, and it changes whenever you change any script in the bundle:
<script src="/bundles/module1?v=2PbUw0z_gh_Ocp_Vz13rdmmBSG6utLJAXm2lThxYAow1"></script>
Update
The repository is on GitHub: https://github.com/trungk18/mvc5-angular-webpack

Jenkins Job DSL - load groovy library from git repo

I want to keep my seed job as small as possible and keep all the logic in a central git repository. Also, I have several independent Jenkins instances that then could share the code. How can I load a groovy library in a Jenkins Job DSL script?
Is there something like the Pipeline Remote File Loader Plugin, so that you only need to do fileLoader.fromGit('lib.groovy', 'https://git.repo')?
Here is my quick sheet for achieving this in a parameterized Pipeline job,
using "Pipeline script from SCM" from git.repo.
What may be of interest to you:
loading mechanism: stash/unstash
"from SCM" location: src = "../${env.JOB_NAME}@script/"
Jenkins
Pipeline
Definition: "Pipeline script from SCM"
SCM: Git
Repository URL git.repo
Branches to build */master
Script Path jobs/build.groovy
This project is parameterized:
String Parameter PARAM0
String Parameter PARAM1
git.repo
├── jobs
│   ├── helpers
│   │   └── utils.groovy
│   └── build.groovy
└── scripts
    ├── build
    │   └── do_build.sh
    └── inc.sh
Contents : utils.groovy
├── jobs
│ ├── helpers
│ │ └── utils.groovy
def log(msg) {
    println("========== " + msg)
}
return this
Contents : build.groovy
├── jobs
│ └── build.groovy
stage('Init') {
    /* Loads */
    def src = "../${env.JOB_NAME}@script/"
    def helpers_dir = 'jobs/helpers'
    def scripts_dir = 'scripts'
    /* Stage scripts */
    def do_build = 'build/do_build.sh'
    utils = load src + helpers_dir + "/utils.groovy"
    dir(src) {
        stash name: scripts_dir, includes: "${scripts_dir}/"
    }
}
stage('Build') {
    node() {
        unstash scripts_dir
        build_return = sh(returnStdout: true, script: """
            ./${scripts_dir}/${do_build} \
                "${PARAM0}" \
                "${PARAM1}"
        """).readLines()
        builded = build_return.get(build_return.size() - 1).tokenize(',')
        utils.log("PARAM0: " + builded[0])
        utils.log("PARAM1: " + builded[1])
    }
}
Contents : inc.sh
└── scripts
└── inc.sh
#!/bin/sh
## scripts common includes
common=included
Contents : do_build.sh
└── scripts
├── build
│ └── do_build.sh
#!/bin/sh
## includes
. $(dirname $(dirname ${0}))/inc.sh
echo ${common}
## ${1} : PARAM0
## ${2} : PARAM1
echo "${1},${2}"
The Job DSL Gradle Example shows how to maintain DSL code in a Git repository.
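If you only want to share helper code between seed-job scripts without a Gradle setup, one sketch (file names and the makeJob helper are hypothetical): check the shared repo out into the seed job's workspace, for instance as an additional SCM, and evaluate the helper from the DSL script with readFileFromWorkspace, which is part of the Job DSL API:

```groovy
// seed.groovy -- sketch; assumes the shared repo is checked out into
// the seed job's workspace under lib/.
def utils = evaluate(readFileFromWorkspace('lib/utils.groovy'))
utils.makeJob(this, 'my-generated-job')
```

with a helper like:

```groovy
// lib/utils.groovy -- the helper receives the DSL factory (the seed
// script itself) so it can create jobs on its behalf.
def makeJob(dslFactory, String name) {
    dslFactory.job(name) {
        steps {
            shell('echo hello')
        }
    }
}
return this
```

Since the library lives in a plain Git repository, each of your independent Jenkins instances can point its seed job at the same repo.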