Bower and folder structure

Let's say I want to install jQuery UI. I run the command bower install jquery-ui and Bower downloads:
.
├── bower_components
│   ├── jquery
│   │   ├── dist
│   │   │   ├── jquery.js
│   │   │   └── jquery.min.js
│   │   └── src
│   └── jquery-ui
│       ├── themes
│       │   ├── smoothness
│       │   │   ├── jquery-ui.css
│       │   │   └── jquery-ui.min.css
│       │   └── [The rest of jQuery UI's themes]
│       ├── ui
│       │   ├── accordion.js
│       │   ├── autocomplete.js
│       │   └── ...
│       ├── jquery-ui.js
│       └── jquery-ui.min.js
└── index.html
After the download is finished, to include the files in my index.html I'll have to write paths like bower_components/jquery/dist/jquery.min.js and bower_components/jquery-ui/jquery-ui.min.js, and for the CSS bower_components/jquery-ui/themes/smoo... etc.
I'm used to working with a much simpler folder structure, like this:
.
├── css
│   ├── main.css
│   └── slider.css
├── js
│   ├── jquery.min.js
│   └── jquery-ui.min.js
├── index.html
├── contact.html
└── pricing.html
Is there any way I can make Bower automatically place the dist files into my css and js folders, regardless of what I'm installing?

Bower is used for one thing - to grab the latest version of those components and make sure you get all the files you need.
Bower does one job and is used as part of a "build pipeline". You're supposed to then use a second tool like Grunt or Gulp, or just a batch file/shell script with copy commands if you'd like, to copy only the files you need from bower_components into your desired folder structure. With Grunt and Gulp, this step can also include things like bundling or minification of scripts and stylesheets or even turning images into sprite sheets.
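For example, a minimal gulp task for that copy step might look like the sketch below (the task name and the js destination folder are just illustrations matching the structure from the question):

// gulpfile.js - sketch of a task that copies only the dist files
// out of bower_components into a flat js/ folder.
var gulp = require('gulp');

gulp.task('copy-libs', function () {
  return gulp.src([
      'bower_components/jquery/dist/jquery.min.js',
      'bower_components/jquery-ui/jquery-ui.min.js'
    ])
    .pipe(gulp.dest('js'));
});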
That said, if you don't mind dumping all the files Bower will pull down from a certain component into your structure (probably leaving a lot of crap you don't want, considering the details in your question), you can use the .bowerrc file to change the output directory.
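For example, a .bowerrc file next to your bower.json can point Bower at a different install directory (the js/lib path here is just an illustration):

{
  "directory": "js/lib"
}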
(TL;DR: Bower is like IKEA, delivering parts in flatpack - you can't put the package in the living room and expect it to be a table already, but you can write something that knows what's in the package and assemble exactly what you want without having to go hunt for the individual parts manually. Or you can just unpack the package once and take what you want manually and never use Bower again - there's nothing wrong with that.)

I use wiredep with gulp.js or Grunt.js to include whatever dependencies my app needs:
http://www.gulpjs.com
http://www.npmjs.com/package/wiredep
It lets you automatically inject any Bower package your app needs to run into your HTML. Add wiredep as a task to any of your gulp.js or Grunt.js builds.
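A minimal gulp task using wiredep's stream interface might look like this sketch (the file paths are assumptions; wiredep fills in the tags between the <!-- bower:js --> / <!-- endbower --> placeholder comments in your HTML):

// gulpfile.js - sketch of a wiredep injection task
var gulp = require('gulp');
var wiredep = require('wiredep').stream;

gulp.task('bower-inject', function () {
  // Rewrites index.html in place, filling the bower: placeholder blocks
  // with script/link tags for the installed Bower packages.
  return gulp.src('./index.html')
    .pipe(wiredep())
    .pipe(gulp.dest('.'));
});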

Related

How to overwrite the structure of an existing directory when using the tar command to unarchive a directory?

Suppose I have a file tree like the following:
workspace
├── project
│   ├── a
│   ├── b
│   └── c (c is not included in project.tar)
└── project.tar
and project.tar was created by tar -cf project.tar project-new, where project-new has the following structure:
project-new
├── a
└── b
So I was wondering whether there is any way that, when I unarchive project.tar in workspace, I can completely overwrite the structure of the directory project. In other words, after unarchiving, is it possible to make the sub-directory c disappear?

go mod unable to find modules

I'm trying to build a Docker image using a compiled Go binary as the ENTRYPOINT, but I'm unable to compile the binary because go mod is unable to find one of the required packages.
The project structure looks like this:
editor/
├── container
│   ├── Dockerfile
│   └── src
│       ├── install-browsers.sh
│       ├── selenium-server-standalone-3.141.59.jar
│       └── webCopy
│           ├── go.mod
│           ├── go.sum
│           └── main.go
├── copier
│   ├── copier.go
│   ├── internal
│   │   └── utils.go
│   └── scripts
│       └── load.go
└── resource
    └── handler.go
The file I'm trying to compile is webCopy/main.go
Inside that file I need to import the module editor/copier
The path to the editor module is:
bitbucket.org/backend/editor
which is inside the GOPATH
The error go mod tidy gives me is:
go: finding module for package bitbucket.org/mvps/backend/editor/copier
bitbucket.org/backend/editor/container/src/webCopy imports
bitbucket.org/backend/editor/copier: cannot find module providing package bitbucket.org/mvps/backend/editor/copier: reading https://api.bitbucket.org/2.0/repositories/mvps/backend?fields=scm: 404 Not Found
I really don't want to mix the copier module into the container's src; I feel the sub-modules should be kept separate from the main one, yet still inside the editor module.
Furthermore, I'm using go.mod as a way to get a clean image by compiling main.go and using the binary to create a new clean artifact, so I would like to have the go.mod and go.sum files inside editor/container/src/webCopy/
btw. I have checked the package names and everything is properly named.
FYI: if you are using a Go modules build, you are no longer using GOPATH, so that is not the issue.
If you want a custom build, and don't want to set up laborious git key access to repos from within a Docker build, you can leverage the replace directive in go.mod.
So add to .../webCopy/go.mod the following line:
replace bitbucket.org/backend/editor/copier => ../../../copier/
This will instruct the Go build to use this relative path (instead of a direct HTTPS download).
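For illustration, a minimal webCopy/go.mod using that replace might look like the sketch below (the Go version and the v0.0.0 placeholder version are assumptions; for a directory replace to work, editor/copier must itself contain a go.mod):

module bitbucket.org/backend/editor/container/src/webCopy

go 1.16

require bitbucket.org/backend/editor/copier v0.0.0

// Build against the local copy of the copier module instead of
// downloading it from Bitbucket.
replace bitbucket.org/backend/editor/copier => ../../../copier/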

Vapor "Cannot find 'Model' in scope" error

I created a model in the Models folder of my Vapor project. Then, when I try to create an instance of this model somewhere else, I get the error "Cannot find 'model_name' in scope".
It seems that Xcode does not see the contents of the Models folder for some reason, and I don't know which settings I should fix to access the models from other classes.
The project's template is default and its structure is correct.
Please compare your project's folder structure with the Vapor docs: https://docs.vapor.codes/3.0/getting-started/structure/
Your Models folder (like any other folders and files of your project) should be inside the "App" folder of the Vapor project:
.
├── Public
├── Sources
│   ├── App
│   │   ├── Controllers
│   │   ├── Models
│   │   ├── boot.swift
│   │   ├── configure.swift
│   │   └── routes.swift
│   └── Run
│       └── main.swift
├── Tests
│   └── AppTests
└── Package.swift

How to run a Jenkins pipeline defined in a single repo by navigating through folders?

Problem: I have one single repository where I have to walk through the repo to find a specific Jenkinsfile to run the pipeline. Note that I want to define the path to this Jenkinsfile explicitly, so I thought about having a jenkinsfilePath.yml in the root directory of the repo, reading the YAML, changing directory and running the Jenkinsfile from that path. The folder structure is as follows:
testingSingleRepo
├── Jenkinsfile
├── feature_flagging
│   ├── Jenkinsfile
│   ├── __init__.py
│   ├── src
│   └── tests
└── jenkinsfilePath.yml
I am having issues running the Jenkinsfile inside feature_flagging from the root Jenkinsfile in testingSingleRepo. I was able to change directory to the feature_flagging folder using dir. After a lot of googling through similar questions, I came across the build step, but I could not make that work. Any suggestions/solutions?
To call a Jenkinsfile from a main pipeline, we can use load:
load 'feature_flagging/Jenkinsfile'
So after looking around, I have decided to go with another approach: a master Jenkinsfile in the root with a generic pipeline setup. It reads the YAML file, changes directory and executes the shell scripts inside the Jenkins/ folder accordingly. That folder consists of generic scripts that map to the root Jenkins pipeline, such as setup.sh, test.sh, deploy.sh etc. The folder structure will look something like below; a rough sketch of such a root Jenkinsfile follows the tree.
testingSingleRepo
├── Jenkinsfile
├── feature_flagging
│   ├── Jenkins/
│   ├── __init__.py
│   ├── src
│   └── tests
└── jenkinsfilePath.yml
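As a rough sketch (assuming the Pipeline Utility Steps plugin provides readYaml, and that jenkinsfilePath.yml holds a single path key, which is an assumption here):

// Root Jenkinsfile - sketch of the generic pipeline
pipeline {
    agent any
    stages {
        stage('Run project scripts') {
            steps {
                script {
                    // e.g. jenkinsfilePath.yml contains: path: feature_flagging
                    def config = readYaml file: 'jenkinsfilePath.yml'
                    dir(config.path) {
                        sh 'Jenkins/setup.sh'
                        sh 'Jenkins/test.sh'
                        sh 'Jenkins/deploy.sh'
                    }
                }
            }
        }
    }
}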

Bazel working directory differs from Maven. How to migrate?

I have an existing project which is built with Maven. It typically defines several modules. I want to migrate this project to Bazel.
In a first attempt, I use the following layout:
└── project
    ├── moduleA
    │   ├── BUILD
    │   ├── pom.xml
    │   └── src
    │       ├── main
    │       │   └── java
    │       └── test
    │           ├── data
    │           └── java
    ├── moduleB
    │   ├── BUILD
    │   ├── pom.xml
    │   └── src
    │       ├── main
    │       │   └── java
    │       └── test
    │           └── java
    ├── pom.xml
    └── WORKSPACE
It was not too hard to make the project build with Bazel. My problem now is that the tests fail to find their test data.
Indeed, with Maven (or Ant), the working directory is the one that contains the pom.xml (or build.xml). So, in that case, moduleA can do:
new File("src/test/data/foo.txt");
However, when the test runs in Bazel, the working directory is the sandboxed runfiles directory, which is rooted like the workspace, i.e. the test must now open:
new File("moduleA/src/test/data/foo.txt");
This is all fine after migration, but how do you handle this situation during migration, i.e. how do you make the tests pass both in Maven and in Bazel?
Is there any facility offered by the Bazel test runner to adapt the paths to legacy behaviour?
The current workaround I have is to check whether the directory returned by new File(".").getAbsoluteFile().getParentFile() is named __main__.
See TestFileUtil.java
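A minimal sketch of that kind of workaround (the helper class name and the moduleA prefix are placeholders, not the actual TestFileUtil.java):

import java.io.File;

// Resolves test data paths both under Maven/Ant (cwd = the module directory)
// and under Bazel (cwd = the runfiles root, whose directory is named "__main__").
public final class TestPaths {

    private TestPaths() {}

    public static File testFile(String moduleName, String relativePath) {
        File cwd = new File(".").getAbsoluteFile().getParentFile();
        if ("__main__".equals(cwd.getName())) {
            // Bazel: paths are relative to the workspace root, so prefix
            // the module directory (e.g. "moduleA").
            return new File(moduleName, relativePath);
        }
        // Maven/Ant: the working directory is already the module directory.
        return new File(relativePath);
    }
}

// Usage in a test:
// File data = TestPaths.testFile("moduleA", "src/test/data/foo.txt");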

Resources