Makefile foreach removes path from previous item

I'm having a bit of a problem trying to use foreach in a Makefile (probably just inexperience on my part).
I have a project with this structure
Project_root
+- Makefile
+- main.c
+- main.h
+- Module_1
| +- src
| | +- unit_test
| | | +- Module_1_test.c
| | | +- Module_1_test.h
| | +- Module_1.c
| | +- Module_1.h
+- Module_2
| +- src
| | +- unit_test
| | | +- Module_2_test.c
| | | +- Module_2_test.h
| | +- Module_2.c
| | +- Module_2.h
...
The objective is to have each module encapsulated and then have a makefile like so:
TARGET_LIBS += Module_1
TARGET_LIBS += Module_2
...
DEPS = ${foreach SRC, $(basename $(wildcard $(TARGET_LIBS)/src/*.c)), $(addsuffix .o, $(SRC))}
DEPS_TEST = ${foreach SRC, $(basename $(wildcard $(TARGET_LIBS)/src/unit_test/*.c)), $(addsuffix .o, $(SRC))}
DEPS_PATH = ${foreach LIB, $(TARGET_LIBS), $(LIB)/src}
all: $(DEPS) main.o
gcc $(CFLAGS) main.o $(DEPS) $(CLIBS) -o main
test: $(DEPS) main.o
gcc $(CFLAGS) main.o $(DEPS) $(CLIBS) -o main
...
$(DEPS):
gcc $(CFLAGS) -c $(addsuffix .c, $(basename $(DEPS))) -I $(DEPS_PATH)
For a simple program where we use 1 module I have:
Makefile:
TARGET_LIBS += Module_1
...
Variables:
TARGET_LIBS -> Module_1
DEPS -> Module_1/src/Module_1.o
DEPS_TEST -> Module_1/src/unit_test/Module_1_test.o
DEPS_PATH -> Module_1/src
But when using more modules, by having more "TARGET_LIBS += ..." lines, I get this problem:
Makefile:
TARGET_LIBS += Module_1
TARGET_LIBS += Module_2
...
Variables:
TARGET_LIBS -> Module_1 Module_2
DEPS -> Module_1.o Module_2/src/Module_2.o
DEPS_TEST -> Module_1_test.o Module_2/src/unit_test/Module_2_test.o
DEPS_PATH -> Module_1/src Module_2/src
With 2 modules, the first module added in the Makefile loses its path in DEPS.
DEPS should be 'Module_1/src/Module_1.o Module_2/src/Module_2.o', but I have 'Module_1.o Module_2/src/Module_2.o'.
Is there something wrong in the makefile?
Am I making some wrong assumption?
I'd appreciate any "you should do this instead" type of answers, but if someone can also explain why this doesn't work, so I don't run into the same problem again, that would be the perfect answer.
Thanks in advance

Well, it's not too hard to figure out. Let's see what you have:
$(wildcard $(TARGET_LIBS)/src/*.c)
OK, well, what is TARGET_LIBS?
TARGET_LIBS += Module_1
TARGET_LIBS += Module_2
so the value of TARGET_LIBS is Module_1 Module_2. So what does that wildcard function expand to?
$(wildcard Module_1 Module_2/src/*.c)
and what is your output?
Module_1 Module_2/src/Module_2.c
just as you'd expect.
Simply adding some extra text after a variable doesn't magically cause that extra text to be appended to every word in that variable... when working with variables that contain multiple words you need to use functions that modify all the words.
I would write it like this:
DEPS := $(patsubst %.c,%.o,$(wildcard $(addsuffix /src/*.c,$(TARGET_LIBS))))
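The other variables from the question can be built the same way; a sketch using the same word-wise approach (variable names taken from the question):

```make
# addsuffix appends to every word of TARGET_LIBS, so each module keeps its path
DEPS_TEST := $(patsubst %.c,%.o,$(wildcard $(addsuffix /src/unit_test/*.c,$(TARGET_LIBS))))
DEPS_PATH := $(addsuffix /src,$(TARGET_LIBS))
```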

Related

Include path has been specified but still failed to include the header in the path in a Bazel C++ project

I have a project with a directory structure like this:
---root
|--src
|  |--project1
|     |--model
|     |  |--include
|     |  |  |--model
|     |  |     |--modelA.hpp
|     |  |     |--modelB.hpp
|     |  |--modelA.cpp
|     |  |--modelB.cpp
|     |  |--BUILD #1
|     |...
|     |--view
|     |...
|     |--common
|        |--include
|        |  |--common
|        |     |--data_type.hpp
|        |--BUILD #2
|--WORKSPACE
As I have other packages in this project and some of them use the same self-defined data types, I defined those types in a package named common.
Now I include data_type.hpp in modelA.hpp:
...
#include "common/data_type.hpp"
...
Referring to the stage 3 example in the tutorial, BUILD (#1) looks like this:
cc_library(
name = "modelA",
hdrs = ["include/model/modelA.hpp"],
deps = ["//src/project/common:data_type"],
copts = ["-Isrc/project/common/include"],
)
and BUILD (#2), which defines the dependency module data_type, looks like this:
cc_library(
name = "data_type",
hdrs = ["include/common/data_type.hpp"],
visibility = ["//visibility:public"],
)
However, when I built the code, I got
src/project/model/include/model/modelA.hpp: fatal error: common/data_type.hpp: No such file or directory
Why do I still get this error when I have defined copts = ["-Isrc/heimdallr/common/include"]?
Please check the Header inclusion checking section of the C/C++ Rules in the Bazel documentation: all include paths are interpreted relative to the workspace directory. Kindly refer to this issue for more information.
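A sketch of one way to fix it under the layout in the question: instead of repeating a copts flag on every consumer, let the common library export its include directory with the includes attribute (the paths below follow the question's tree and are otherwise assumptions):

```python
cc_library(
    name = "data_type",
    hdrs = ["include/common/data_type.hpp"],
    # 'includes' paths are resolved relative to this package and are
    # propagated to dependents, so modelA.hpp can do
    # #include "common/data_type.hpp" without any copts of its own
    includes = ["include"],
    visibility = ["//visibility:public"],
)
```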

How to refer to shell script inside shared jenkins library

I have a Jenkinsfile that uses a shared Jenkins library and calls a method inside the groovy file defined below.
shared-jenkins-lib
|
|-----------vars
| |-------pipeline_script.groovy
|
|-----------scripts
|-------test_script.groovy
This is the structure of the library; pipeline_script.groovy has a method test, which is called from the Jenkinsfile.
def test(){
dockerArgs = "--entrypoint /bin/bash"
dockerCommand = "`../scripts/test_script.sh`"
dockerOut = sh (script: "docker run ${dockerArgs} ${image} ${dockerCommand}", returnStatus: true)
}
I refer to test_script.sh, but it doesn't seem to find the file in that location.
I am running Jenkins within Docker. What is the correct way to refer to the script?
As stated in the docs of the shared libraries plugin, a library repo has a very specific structure. The root of the repo can have the following 3 folders:
+- src # Groovy source files
| +- org
| +- foo
| +- Bar.groovy # for org.foo.Bar class
+- vars
| +- foo.groovy # for global 'foo' variable
| +- foo.txt # help for 'foo' variable
+- resources # resource files (external libraries only)
| +- org
| +- foo
| +- bar.json # static helper data for org.foo.Bar
If you want to refer to a helper code/data from your library method, you should put it in the resources directory, and then you can retrieve it with the libraryResource step, also described in the docs.
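A sketch of what that could look like, assuming the script is moved to resources/scripts/test_script.sh in the library repo (that path, and the image variable, are assumptions, not from the question):

```groovy
def test() {
    // Load the helper script's contents from the library's resources/ directory
    def scriptText = libraryResource('scripts/test_script.sh')
    // Materialize it in the workspace so it can be mounted into the container
    writeFile(file: 'test_script.sh', text: scriptText)
    sh 'chmod +x test_script.sh'
    def dockerArgs = "--entrypoint /bin/bash -v ${env.WORKSPACE}:/ws"
    def dockerOut = sh(script: "docker run ${dockerArgs} ${image} /ws/test_script.sh",
                       returnStatus: true)
}
```

libraryResource returns the file's contents as a string, so it has to be written to disk (or passed some other way) before the container can execute it.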
You need to pass the workflow script object into test().
def test(def wfs) {
...
wfs.sh ''
wfs.echo ''
wfs.withCredentials()
}
// your Jenkinsfile, which calls test()
test(this) // 'this' is the workflow script instance

Global Header Staging in Bazel

Our current project structure is as follows:
thirdparty(ws_root)
|_WORKSPACE
|_comp1
| |_BUILD
| |_src
| |_ a.c
| |_include
| |_ a.h
|_comp2
| |_BUILD
| |_src
| |_ b.c
| |_include
| |_ b.h
|_inc
| |_comp1
| |_a.h
| |_comp2
| |_b.h
Contents of a.c :
#include <comp1/a.h>
With our current build system the header file under thirdparty/comp1/include/a.h is staged with the following path:
thirdparty/inc/comp1/a.h
Here the thirdparty/inc folder is a global location where all header files in the workspace are staged under the respective components.
I would like to know if Bazel provides a mechanism by which we can stage the header files in a similar manner.
I'm not sure I understand: by "global location" do you mean a folder in each library (library in the Bazel sense; you might call it a component or package)? That's what I understood from the example. If that is the case, and each library just provides a -isystem include to all its dependents, that is possible in Bazel: look at the includes attribute.
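A sketch of what that could look like for comp1, assuming the header is moved under comp1/include/comp1/ so the path matches the `#include <comp1/a.h>` in a.c (the layout change is an assumption, not from the question):

```python
cc_library(
    name = "comp1",
    srcs = ["src/a.c"],
    hdrs = ["include/comp1/a.h"],
    # 'includes' adds this directory (workspace-relative: comp1/include) as a
    # -isystem path for this target and everything that depends on it, so
    # dependents can #include <comp1/a.h> without a staging step
    includes = ["include"],
    visibility = ["//visibility:public"],
)
```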

Jenkins Pipeline: How to add help for Global Shared Library

According to documentation, it should be possible to include *.txt file with help/documentation:
(root)
+- src # Groovy source files
| +- org
| +- foo
| +- Bar.groovy # for org.foo.Bar class
+- vars
| +- foo.groovy # for global 'foo' variable/function
| +- foo.txt # help for 'foo' variable/function
+- resources # resource files (external libraries only)
| +- org
| +- foo
| +- bar.json # static helper data for org.foo.Bar
...
The vars directory hosts scripts that define global variables
accessible from Pipeline scripts. The basename of each *.groovy file
should be a Groovy (~ Java) identifier, conventionally camelCased. The
matching *.txt, if present, can contain documentation, processed
through the system’s configured markup formatter (so may really be
HTML, Markdown, etc., though the txt extension is required).
Unfortunately, simply creating a *.txt file with some arbitrary content hasn't worked for me.
Am I missing something? Or does the Global Shared Library not show documentation in the usual Jenkins places?
Please note that the Pipeline Syntax / Global Variables Reference page is ONLY updated when the pipeline's run is successful, and therefore ONLY for that pipeline (and not any others).
Here's a link!

Gulp losing relative path for files from gulp.src(Array) to gulp.dest

I want to use gulp to copy all HTML files from a collection of sources in a parent directory to a child directory, keeping their paths:
Project
+-- /Development
| +- gulpfile.js
|
+-- /Source
+- /ComponentA
| +- /DirA
| +- fileA.html
|
+- /ComponentB
| +- /DirB
| +- fileB.html
|
+- /ComponentC
+- /DirC
+- fileC.html
I need gulp to copy all HTML files to the relative paths into Development/public_html
Project
+-- /Development
+- gulpfile.js
|
+- /public_html
+- /ComponentA
| +- /DirA
| +- fileA.html
|
+- /ComponentC
+- /DirC
+- fileC.html
My gulp task
gulp.task('copyHTMLwrong', function() {
return gulp.src([
'../Source/ComponentA/**/*.html',
'../Source/ComponentC/**/*.html'
])
.pipe(gulp.dest('public_html'));
});
But I get this (the paths are lost):
Project
+-- /Development
+- gulpfile.js
|
+- /public_html
+- fileA.html
+- fileC.html
PS: I know that if I use 'Source/**/*.html' it will copy all files correctly, and I'm sure I can also exclude ComponentB using !, but I need to define each Component in my gulp.src, so I can compile a group of files for each website.
gulp.task('copyHTMLnotUseful', function() {
return gulp.src([
'../Source/**/*.html',
'!../Source/ComponentB/**/*.html'
])
.pipe(gulp.dest('public_html'));
});
I've tried setting cwd or base, but that didn't work either.
Thanks
I figured out how to use base to fix my issue.
It's simple: just add a base built from __dirname, like so. Just make sure you wrap it in path.resolve():
gulp.task('copyHTML', function() {
return gulp.src([
'../Source/ComponentA/**/*.html',
'../Source/ComponentC/**/*.html'
],
{base: path.resolve(__dirname + '/../Source')})
.pipe(gulp.dest('public_html'));
});
