Is it possible for my Waf wscript to call other wscripts in the current directory?

I am building a project with Waf. It has several third-party dependencies, and I would like to build each dependency from my main wscript. My project is organized like this:
/boost/
/gtest/
/source/
/waf
/wscript
/wscript_boost
/wscript_gtest
Each "dependency" wscript_* has commands defined, just like my main wscript.
def options(opt): pass
def configure(conf): pass
def build(bld): pass
If I had put the dependency wscript files in the dependency folders, I could just call opt.recurse('boost gtest'), but I don't want to mix my files with third-party files.
Is there any way for my wscript to call into wscript_boost and wscript_gtest?

I don't think it is possible in the current Waf implementation (waflib.Context.Context.recurse uses a global WSCRIPT_FILE variable).
Though ugly, it is possible to hack this at the beginning of your main wscript:
import waflib.Context

original_recurse = waflib.Context.Context.recurse

def new_recurse(ctx, *args, **kwargs):
    original_wscript_file = waflib.Context.WSCRIPT_FILE
    try:
        waflib.Context.WSCRIPT_FILE = (original_wscript_file +
                                       kwargs.pop('suffix', ''))
        original_recurse(ctx, *args, **kwargs)
    finally:
        waflib.Context.WSCRIPT_FILE = original_wscript_file

waflib.Context.Context.recurse = new_recurse
Your main wscript would then be something like:
def configure(cfg):
    cfg.recurse(suffix='_boost')
    cfg.recurse(suffix='_gtest')
Note that there are side effects: WSCRIPT_FILE stays suffixed while execution is inside the wscript_boost file, so any further recursion from there will also look for suffixed files.
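The swap-and-restore trick itself is plain Python and easy to check in isolation. Here is a minimal, Waf-free sketch of the same pattern; the module-level WSCRIPT_FILE below only stands in for waflib.Context.WSCRIPT_FILE, and the callback stands in for the real recursion:

```python
# Stand-in for waflib.Context.WSCRIPT_FILE.
WSCRIPT_FILE = "wscript"

def recurse(callback, suffix=""):
    """Temporarily suffix the module-level 'constant', then restore it."""
    global WSCRIPT_FILE
    original = WSCRIPT_FILE
    try:
        WSCRIPT_FILE = original + suffix
        callback(WSCRIPT_FILE)  # stands in for running the suffixed wscript
    finally:
        # Restored even if the callback raises, just like the wrapper above.
        WSCRIPT_FILE = original

seen = []
recurse(seen.append, suffix="_boost")
recurse(seen.append, suffix="_gtest")
# seen is now ["wscript_boost", "wscript_gtest"]; WSCRIPT_FILE is back to "wscript"
```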
You can also submit an issue to the waf project.

The usual way to do this is to put a wscript into each subdirectory and use recurse:
/boost/
/boost/wscript
/gtest/
/gtest/wscript
/source/
/waf
/wscript
Then call recurse from the main wscript:
def build(bld):
    # ...
    bld.recurse("boost")
    bld.recurse("gtest")
    # ...
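With that layout, recurse also accepts a space-separated list of directories (as in the question's opt.recurse('boost gtest')), so the main wscript can forward every command in one call. A sketch, assuming a C++ project; the compiler_cxx load and the program target are illustrative, not taken from the question:

```python
# wscript (main) -- forwards each command into boost/ and gtest/

def options(opt):
    opt.load('compiler_cxx')
    opt.recurse('boost gtest')

def configure(conf):
    conf.load('compiler_cxx')
    conf.recurse('boost gtest')

def build(bld):
    bld.recurse('boost gtest')
    bld.program(source='source/main.cpp', target='app')  # hypothetical target
```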

How to get a list of shared libraries used in the pipeline?

I can get all the libraries in Jenkins like this:
Jenkins.getInstance().getDescriptor("org.jenkinsci.plugins.workflow.libs.GlobalLibraries").getLibraries()
but it gives me all the libraries, even if they are not used in the current project. I can see in the console log that only one of those libraries was loaded. How do I get its name?
Loading library name_of_the_library
It might not be the answer you were expecting, but a simple solution would be to keep a variable with the name of your library (or libraries) and println it:
def inst = Jenkins.getInstance()
def libs = inst.getDescriptor("org.jenkinsci.plugins.workflow.libs.GlobalLibraries").getLibraries()
for (lib in libs) {
    def lib_path = lib.getRetriever().getScm().getRemote()
}
But this gives me all the libraries, not only shared ones
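Another pragmatic route is to scrape the build's console log for the "Loading library" line the question mentions; the extraction itself is a one-line regex. Sketched here in Python rather than Groovy, assuming the exact log format shown above (the surrounding log lines are illustrative):

```python
import re

# A fragment of a build's console log (format as shown in the question).
console_log = """\
Started by user admin
Loading library name_of_the_library
Running on built-in node
"""

# Collect the name of every library the build actually loaded.
loaded = re.findall(r"^Loading library (\S+)", console_log, flags=re.MULTILINE)
print(loaded)  # ['name_of_the_library']
```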

Jenkins Shared Library - Importing classes from the /src folder in /vars

I am trying to write a Jenkins Shared Library for my CI process. I'd like to reference a class from the /src folder inside a global function defined in the /vars folder, since that would allow me to put most of the logic in classes instead of in the global functions. I am following the repository structure documented in the official Jenkins documentation:
Jenkins Shared Library structure
Here's a simplified example of what I have:
/src/com/example/SrcClass.groovy
package com.example

class SrcClass {
    def aFunction() {
        return "Hello from src folder!"
    }
}
/vars/classFromVars.groovy
import com.example.SrcClass

def call(args) {
    def sc = new SrcClass()
    return sc.aFunction()
}
Jenkinsfile
@Library('<lib-name>') _
pipeline {
    ...
    post {
        always {
            classFromVars()
        }
    }
}
My goal was for the global classes in the /vars folder to act as a sort of public facade and to use it in my Jenkinsfile as a custom step without having to instantiate a class in a script block (making it compatible with declarative pipelines). It all seems pretty straightforward to me, but I am getting this error when running the classFromVars file:
<root>\vars\classFromVars.groovy: 1: unable to resolve class com.example.SrcClass
 @ line 1, column 1.
   import com.example.SrcClass
   ^
1 error
I tried running the classFromVars class directly with the groovy CLI locally and on the Jenkins server and I have the same error on both environments. I also tried specifying the classpath when running the /vars script, getting the same error, with the following command:
<root>>groovy -cp <root>\src\com\example vars\classFromVars.groovy
Is what I'm trying to achieve possible? Or should I simply put all of my logic in the /vars class and avoid using the /src folder?
I have found several repositories on GitHub that seem to indicate this is possible, for example this one: https://github.com/fabric8io/fabric8-pipeline-library, which uses the classes in the /src folder in many of the classes in the /vars folder.
As @Szymon Stepniak pointed out, the -cp parameter in my groovy command was incorrect: the classpath must point at the source root (<root>\src), not at the package directory. It now works locally and on the Jenkins server. I have yet to explain why it wasn't working on the Jenkins server before, though.
I found that when I wanted to import a class from my shared library into a script step in /vars, I needed to do it like this:
// thanks to '_', the classes are imported automatically.
// MUST have the '@' at the beginning, otherwise it will not work.
// when not using "@BRANCH" it will use the default branch of the git repo.
@Library('my-shared-library@BRANCH') _
// only by calling them can you tell whether they exist or not.
def exampleObject = new example.GlobalVars()
// then call methods or attributes from the class.
exampleObject.runExample()

Why does using "local_repository" in a .bzl file tell me name 'local_repository' is not defined?

I want to build Envoy with Bazel. I manually downloaded some packages onto my machine and changed http_archive to local_repository, but Bazel tells me name 'local_repository' is not defined. Does local_repository need a load statement?
local_repository can be used in WORKSPACE, but not in my .bzl file.
WORKSPACE:
workspace(name = "envoy")
load("//bazel:api_repositories.bzl", "envoy_api_dependencies")
envoy_api_dependencies()
load("//bazel:repositories.bzl", "GO_VERSION", "envoy_dependencies")
load("//bazel:cc_configure.bzl", "cc_configure")
envoy_dependencies()
`repositories.bzl`:
local_repository(
    name = "com_google_protobuf",
    path = "/home/user/com_google_protobuf",
)
local_repository is a workspace rule so I think it's not available outside of the WORKSPACE file.
If you want to call local_repository from a .bzl file you can define a function in there, using native, and call it from WORKSPACE, e.g.:
# repositories.bzl
def deps():
    native.local_repository(
        name = "com_google_protobuf",
        path = "/home/user/com_google_protobuf",
    )

# WORKSPACE
load("//:repositories.bzl", "deps")

deps()
I've seen this pattern, for example, in the grpc project.
In a .bzl file, you have to use native.local_repository instead of just local_repository.
All symbols in .bzl files are expected to be defined in Starlark, but local_repository is a special rule that is defined natively within Bazel.

Sharing const variables across FAKE fsx scripts

Is there any way to share a variable by including an .fsx script within another .fsx script?
e.g script buildConsts.fsx contains
let buildDir = "./build/"
I want to reference this in other build scripts e.g.
#load @".\buildConsts.fsx"
let testDlls = !! (buildDir + "*Test*.dll")
When I attempt to run the second script, it fails to compile because 'buildDir' is not defined.
This is a fairly common approach that is used with tools such as MSBuild and PSAKE to modularise scripts. Is this the correct approach with FAKE ?
What you're doing should work - what exactly is the error message that you're getting?
I suspect that the problem is that F# automatically puts the contents of a file in a module and you need to open the module before you can access the constants. The module is named based on the file name, so in your case buildConsts.fsx will generate a module named BuildConsts. You should be able to use it as follows:
#load @".\buildConsts.fsx"
open BuildConsts
let testDlls = !! (buildDir + "*Test*.dll")
You can also add an explicit module declaration to buildconsts.fsx, which is probably a better idea as it is less fragile (won't change when you rename the file):
module BuildConstants
let buildDir = "./build/"

How do I create a new compiler profile with Waf?

I've found a very helpful page in the API docs of the Waf build system.
My wscript looks like this:
def options(opt):
    opt.load('compiler_c')

def configure(conf):
    from waflib.Tools.compiler_c import c_compiler
    c_compiler['linux'] = ['mycc']
    conf.load('compiler_c')

def build(bld):
    bld.program(source='main.c', target='nop')
I've tried creating the file mycc.py, placing it in waflib/extras and recompiling Waf. However, when I try to configure my project using this new profile, I get the following error:
Setting top to : /home/user/waf/example
Setting out to : /home/user/waf/example/build
Checking for 'mycc' (c compiler) : not found
could not configure a c compiler!
(complete log in /home/user/waf/example/build/config.log)
I've also tried creating waflib/extras/mycc.py in the root of my project (the place where Waf is and from where it gets called). No good.
How do I do this?
