Background:
I have a Lerna monorepo using Yarn workspaces, with two packages. I am using Rollup as the bundler.
packages/module1/package.json:
{
  "scripts": {
    "watch": "rollup -c rollup.config.js --watch",
    "build": "NODE_ENV=production rollup -c rollup.config.js"
  }
}
packages/module2/package.json:
{
  "scripts": {
    "watch": "rollup -c rollup.config.js --watch",
    "build": "NODE_ENV=production rollup -c rollup.config.js"
  }
}
Expected Behavior:
lerna run build will run the build scripts for each package.
lerna run watch will run the watch scripts for each package in watch mode.
Current Behavior:
lerna run build works as expected. The build script runs properly for both packages.
lerna run watch just hangs there:
lerna notice cli v3.13.1
lerna info Executing command in 2 packages: "yarn run watch"
[[just hangs here]]
I have tried lerna run --parallel watch, and this only runs once. It exits after rollup completes. In other words, it never seems to be watching.
I believe the command you are looking for is lerna exec. It will run whatever command is passed to it across every package in your monorepo.
lerna exec --parallel -- yarn build
If each package has the same build step, you could abstract it to the top-level package.json like so:
lerna exec --parallel -- rollup -c rollup.config.js
This will go into each package and run that rollup command.
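For example, the root package.json could define a build script like this (a minimal sketch; the script name is an assumption):
{
  "scripts": {
    "build": "lerna exec --parallel -- rollup -c rollup.config.js"
  }
}
Running yarn build from the root then bundles every package in parallel.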
Sources:
Adding Rollup to a Monorepo
Creating a Monorepo with Lerna & Yarn Workspaces
It will need some tweaks to enable Rollup to watch in parallel across the Lerna monorepo.
lerna run --parallel watch
The command above will only run the watch for one package and block the rest. Here is the relevant piece of Rollup's inner workings: the following snippet is the Watcher class constructor from the Rollup GitHub code base. As you can see, the watcher accepts an array of configs, so you only need to write some wrapper code that gathers all of your packages' configs into one array and runs a single watch over them (a sketch of such a wrapper follows the snippet).
constructor(configs: GenericConfigObject[] | GenericConfigObject) {
  this.emitter = new (class extends EventEmitter implements RollupWatcher {
    close: () => void;
    constructor(close: () => void) {
      super();
      this.close = close;
      // Allows more than 10 bundles to be watched without
      // showing the `MaxListenersExceededWarning` to the user.
      this.setMaxListeners(Infinity);
    }
  })(this.close.bind(this));
  this.tasks = (Array.isArray(configs) ? configs : configs ? [configs] : []).map(
    config => new Task(this, config)
  );
  this.running = true;
  process.nextTick(() => this.run());
}
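A minimal sketch of such a wrapper, assuming the two package names above and that each package's rollup.config.js has a default export (note that relative paths inside each config resolve against the directory the wrapper runs from, so they may need adjusting):

// watch-all.mjs, run from the repo root
import { watch } from 'rollup';

const packages = ['module1', 'module2'];

// Collect each package's default-exported config into a single array.
const configs = await Promise.all(
  packages.map(async name =>
    (await import(`./packages/${name}/rollup.config.js`)).default
  )
);

// rollup.watch() accepts an array of configs, so one watcher process
// rebuilds every package on change instead of blocking on the first one.
const watcher = watch(configs);

watcher.on('event', event => {
  if (event.code === 'BUNDLE_END') {
    console.log('rebuilt:', event.input);
    event.result.close(); // avoids Rollup's "bundle not closed" warning
  }
  if (event.code === 'ERROR') console.error(event.error);
});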
Related
After following the solution on GitHub and the solution on Stack Overflow, I am still experiencing the same issue when building a code pipeline with AWS CDK.
Error:
This CDK CLI is not compatible with the CDK library used by your application. Please upgrade the CLI to the latest version.
(Cloud assembly schema version mismatch: Maximum schema version supported is 21.0.0, but found 22.0.0)
This error appears in the CodeBuild stage of the CodePipeline. The first stage, sourcing the code from CodeCommit, completes successfully.
CDK Pipeline Code:
As you can see in the code below, the install commands uninstall the CDK and then install it again. This was the solution recommended by the document above; re-ordering the commands does not change the outcome.
this.codePipeline = new CodePipeline(this, `${environment}-${appName}-`, {
  pipelineName: `${environment}-${appName}-`,
  selfMutation: true,
  crossAccountKeys: false,
  role: this.codePipelineRole,
  synth: new ShellStep("Deployment", {
    input: CodePipelineSource.codeCommit(this.codeRepository, environment, {
      codeBuildCloneOutput: true
    }),
    installCommands: ["npm uninstall -g aws-cdk", "npm i -g npm@latest", "npm install -g aws-cdk"],
    commands: [
      "cd backend",
      "npm ci",
      "npm run build",
      "npx cdk synth",
    ],
    primaryOutputDirectory: "backend/cdk.out",
  })
});
Dependencies in the package.json file:
"dependencies": {
"#aws-cdk/aws-appsync-alpha": "^2.55.1-alpha.0",
"aws-cdk-lib": "^2.58.0",
"aws-sdk": "^2.1278.0",
"constructs": "^10.1.204",
"git-branch": "^2.0.1",
"source-map-support": "^0.5.21"
}
The solution was to drop the npx from npx cdk synth. Once removed, the pipeline worked. The same error also occurred when running npx cdk synth locally.
Solution: cdk synth
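For reference, the only change needed in the pipeline code was the last command of the synth step; a sketch of the relevant part (everything elided is unchanged), on the presumption that npx was resolving an older aws-cdk from node_modules while the bare command picks up the freshly installed global CLI:

synth: new ShellStep("Deployment", {
  // ...input and installCommands unchanged...
  commands: [
    "cd backend",
    "npm ci",
    "npm run build",
    "cdk synth", // bare `cdk` resolves to the globally installed CLI
  ],
  primaryOutputDirectory: "backend/cdk.out",
})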
We are trying to get our multi-stack application deployed using the cdk pipeline library.
We recently disabled the publishAssetsInParallel flag because, with the default setting, our pipeline would create more than 20 FileAsset objects under the Assets stage, which AWS rejects as too many CodeBuild projects running in parallel.
However, with this property now disabled, I'm getting the following error for the Assets stage:
[Container] 2022/11/14 12:04:24 Phase complete: DOWNLOAD_SOURCE State: FAILED
[Container] 2022/11/14 12:04:24 Phase context status code: YAML_FILE_ERROR Message: stat /codebuild/output/src112668013/src/buildspec-c866864112c35d54804951dbe96b99440c9b891fde-FileAsset.yaml: no such file or directory
I'm assuming this is supposed to be a build spec created by CDK Pipelines, as we didn't need to create one when things were running in parallel.
Here is the current pipeline code:
const pipeline = new CodePipeline(this, 'Pipeline', {
  publishAssetsInParallel: false,
  selfMutation: false,
  pipelineName: fullStackName('Pipeline', app),
  synth: new CodeBuildStep('SynthStep', {
    input: CodePipelineSource.codeCommit(repo, repoBranchName, {codeBuildCloneOutput: true}),
    buildEnvironment: {computeType: ComputeType.MEDIUM},
    installCommands: [
      'npm install -g yarn',
      'yarn install',
      'cd apps/cloud-app',
      'yarn install',
      'yarn global add aws-cdk'
    ],
    commands: [
      'yarn build',
      'cdk synth'
    ],
    primaryOutputDirectory: 'apps/cloud-app/cdk.out'
  })
});
UPDATE:
I reverted the publishAssetsInParallel flag to its default setting to compare, and there is a fundamental difference in the way the FileAsset CodeBuild projects are created based on this flag. With it enabled, inspecting the build details for one of the FileAsset projects shows that the buildspec section contains a concrete, inline build spec, e.g.:
{
  "version": "0.2",
  "phases": {
    "install": {
      "commands": [
        "npm install -g cdk-assets@2"
      ]
    },
    "build": {
      "commands": [
        "cdk-assets --path \"MyStack.assets.json\" --verbose publish \"2357296280127ce793d8dbb13e6c907db22f5dcc57a173ba77fcd19a76d8f444:12345678910-eu-west-2\""
      ]
    }
  }
}
With the flag disabled, the buildspec simply contains a pointer to a buildspec file as below, which it then fails to find...
buildspec-c866864112c35d54804951dbe96b99440c9b891fde-FileAsset.yaml
Self-mutation has to be enabled; currently, asset updates mutate the pipeline.
Reference: https://github.com/aws/aws-cdk/issues/9080
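Applied to the pipeline above, the fix is a one-line change; a minimal sketch, on the understanding from the linked issue that asset updates currently go through the self-mutate step, so with selfMutation: false the generated buildspec-*-FileAsset.yaml never gets created:

const pipeline = new CodePipeline(this, 'Pipeline', {
  publishAssetsInParallel: false,
  selfMutation: true, // must stay enabled so asset updates can mutate the pipeline
  // ...everything else unchanged...
});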
I am using an agent setup with multiple versions of Python (3.6, 3.7, 3.8, and 3.9) and have installed the withPythonEnv plugin to see if it can switch runtimes during builds. The plugin project is here: https://github.com/jenkinsci/pyenv-pipeline-plugin.
When I try to run some simple commands in Jenkins:
stage('Unit Test') {
    steps {
        withPythonEnv('/usr/bin/python3.8') {
            script {
                sh """
                    pwd
                    env
                    python --version
                    pip install --upgrade pip
                    pip install -r requirements-test.txt
                    python -m pytest foo/tests/ --cov foo --cov-report=xml --junitxml=junit.xml
                """
            }
        }
    }
    post {
        always {
            script {
                junit "junit.xml"
            }
        }
    }
}
The build constantly fails, and I never get any additional logging as to why. The only message I see is this:
ERROR: Error while creating virtualenv: Error: Command '['/home/jenkins/workspace/foo/.pyenv-usr-bin-python3.8/bin/python3.8', '-Im', 'ensurepip', '--upgrade', '--default-pip']' returned non-zero exit status 1.
Does anyone know how to get around this using withPythonEnv? The docs really don't say much more than the examples provided, and I've tried a few of those already.
Could you use withEnv and set PYTHONPATH? Something like: withEnv(['PYTHONPATH=/usr/bin/python3.8'])
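Note that withEnv expects a list of KEY=value strings. A sketch of that suggestion (with the caveat that PYTHONPATH controls module lookup, not which interpreter runs, so it may not be a drop-in replacement for withPythonEnv):

stage('Unit Test') {
    steps {
        // Hypothetical variant using withEnv instead of withPythonEnv.
        withEnv(['PYTHONPATH=/usr/bin/python3.8']) {
            sh 'python --version'
        }
    }
}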
I am trying to execute Bower commands in a shell script that runs in the after_success phase of a Travis build. I installed Bower in the install phase:
install:
  - npm install -g bower
[...]
after_success:
  - if [ ${TRAVIS_PULL_REQUEST} = "false" ] && [ "$TRAVIS_BRANCH" = "master" ]; then
      ./my-script.sh;
    fi
Unfortunately, if I call bower in the sh script it produces the following output:
./my-script.sh: line 30: ./node_modules/.bin/bower: No such file or directory
I do not know how to proceed to fix the error. Any help would be greatly appreciated, thank you already!
I had to call the script using
bash my-script.sh;
instead of
./my-script.sh;
Now everything is working fine.
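Applied to the original configuration, the after_success phase would then read (a sketch; the branch guard is unchanged):

after_success:
  - if [ ${TRAVIS_PULL_REQUEST} = "false" ] && [ "$TRAVIS_BRANCH" = "master" ]; then
      bash my-script.sh;
    fi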
I'm having issues getting Coveralls to work. I've created a simple project here.
It seems to be outputting the report correctly, but I'm definitely missing a step somewhere, because Coveralls doesn't see me as being set up.
No branches show up, and it simply gives instructions on how to set it up.
I've tried to copy what QUnit is doing, because they obviously have it working.
Here is what I've done so far.
I created the project (which uses Node/Grunt/QUnit), created the Coveralls account, and toggled the project on.
I then replaced the qunit reference in the devDependencies section of package.json with this:
"grunt-coveralls": "0.3.0",
"grunt-qunit-istanbul": "^0.4.0"
I've added this to my package.json.
"scripts": {
"ci": "grunt && grunt coveralls"
}
I've added this config for qunit in my Gruntfile.js.
options: {
  timeout: 30000,
  "--web-security": "no",
  coverage: {
    src: [ "src/<%= pkg.name %>.js" ],
    instrumentedFiles: "temp/",
    coberturaReport: "report/",
    htmlReport: "build/report/coverage",
    lcovReport: "build/report/lcov",
    linesThresholdPct: 70
  }
},
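One piece not shown above is the configuration for the coveralls task itself, which grunt coveralls runs; a minimal sketch for Gruntfile.js, assuming the lcovReport path above produces an .info file in that directory (the exact filename may differ):

coveralls: {
  all: {
    src: "build/report/lcov/*.info"
  }
},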
I then added this to my .travis.yml.
language: node_js
node_js:
  - "0.10"
before_install:
  - npm install -g grunt-cli
install:
  - npm install
before_script:
  - grunt
after_script:
  - npm run-script coveralls
I got it working; check the repo for the example: https://github.com/thorst/Code-Coverage-Qunit
While it's not always an option, I found Jasmine to be easier in multiple ways. I have a complete example here: https://github.com/thorst/Code-Coverage-Jasmine
I still haven't gotten Mocha to work, though. That (broken) repo is here: https://github.com/thorst/Code-Coverage-Mocha