Jest configuration setupFilesAfterEnv option was not found - path

I'm trying to get Jest working again on a project that was developed a year ago and hasn't been maintained since.
I'm getting an error about the path used for setupFilesAfterEnv or transform.
This is the error I get when I run "yarn test":
$ jest __testsv2__ --config=./jest.config.js
● Validation Error:
Module <rootDir>/jest/setup.js in the setupFilesAfterEnv option was not found.
<rootDir> is: /Users/alain/dev/ddf/release
Configuration Documentation:
https://jestjs.io/docs/configuration.html
error Command failed with exit code 1.
My filesystem: in /Users/alain/dev/ddf/release/ I have
babel.config.js
jest.config.js
/jest
/setup
setup.js (so the full path is /Users/alain/dev/ddf/release/jest/setup.js)
staticFileAssetTransform.js (so the full path is /Users/alain/dev/ddf/release/jest/staticFileAssetTransform.js)
My package.json
{
  ...
  "scripts": {
    "test": "jest __testsv2__ --config=./jest.config.js"
    ...
  }
}
babel.config.js
module.exports = function(api) {
  api.cache(false);
  const presets = ['@babel/preset-env', '@babel/preset-react'];
  const plugins = [['@babel/proposal-object-rest-spread']];
  return {
    presets,
    plugins,
    sourceMaps: 'inline',
    ignore: [process.env.NODE_ENV !== 'test' ? '**/*.test.js' : null].filter(n => n)
  };
};
jest.config.js
module.exports = {
  resolver: 'browser-resolve',
  clearMocks: true,
  moduleNameMapper: { '\\.(css|less|styl|md)$': 'identity-obj-proxy' },
  // A list of paths to modules that run some code to configure or set up the testing framework before each test
  // setupFilesAfterEnv: ['./jest/setup.js'], // doesn't work either
  setupFilesAfterEnv: ['<rootDir>/jest/setup.js'],
  // An array of regexp pattern strings that are matched against all test paths; matched tests are skipped
  testPathIgnorePatterns: ['/node_modules/', '/__gql_mocks__/'],
  // A map from regular expressions to paths to transformers
  transform: {
    '^.+\\.js$': './jest/babelRootModeUpwardTransform.js',
    '\\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga)$': '<rootDir>/jest/staticFileAssetTransform.js',
  },
};

You might want to delete your node_modules folder and package-lock.json and run npm i again; I had a similar issue and that fixed it for me. Also try npm cache clean --force.

For those coming here, make sure you prefix the path with <rootDir>, like this:
setupFilesAfterEnv: ['<rootDir>/node_modules/@hirez_io/observer-spy/dist/setup-auto-unsubscribe.js']
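As a sanity check, the file that <rootDir>/jest/setup.js points to only needs to exist and run whatever per-test-file setup you want. A minimal sketch (the contents below are illustrative, not from the original question):
// jest/setup.js
// Runs once per test file, after the test framework is installed.
jest.setTimeout(10000); // example: raise the default test timeout
// require('@testing-library/jest-dom'); // example: register extra matchers if you use them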

Related

AWS CDK Pipeline: Assets stage fails to find buildSpec FileAsset.yaml when publishAssetsInParallel is disabled

We are trying to get our multi-stack application deployed using the cdk pipeline library.
We recently disabled the publishAssetsInParallel flag, since with the default setting our pipeline would create more than 20 FileAsset objects under the Assets stage, which AWS then complains about as too many CodeBuild projects running in parallel.
However, with this property now disabled, I'm getting the following error for the Assets stage:
[Container] 2022/11/14 12:04:24 Phase complete: DOWNLOAD_SOURCE State: FAILED
[Container] 2022/11/14 12:04:24 Phase context status code: YAML_FILE_ERROR Message: stat /codebuild/output/src112668013/src/buildspec-c866864112c35d54804951dbe96b99440c9b891fde-FileAsset.yaml: no such file or directory
I'm assuming this is supposed to be a buildspec created by the CDK pipeline, as we didn't need to create a buildspec when things were running in parallel.
Here is the current pipeline code:
const pipeline = new CodePipeline(this, 'Pipeline', {
  publishAssetsInParallel: false,
  selfMutation: false,
  pipelineName: fullStackName('Pipeline', app),
  synth: new CodeBuildStep('SynthStep', {
    input: CodePipelineSource.codeCommit(repo, repoBranchName, { codeBuildCloneOutput: true }),
    buildEnvironment: { computeType: ComputeType.MEDIUM },
    installCommands: [
      'npm install -g yarn',
      'yarn install',
      'cd apps/cloud-app',
      'yarn install',
      'yarn global add aws-cdk'
    ],
    commands: [
      'yarn build',
      'cdk synth'
    ],
    primaryOutputDirectory: 'apps/cloud-app/cdk.out'
  })
});
UPDATE:
I reverted the publishAssetsInParallel flag to its default setting to compare, and it seems there is a fundamental difference in the way it creates the FileAsset CodeBuild projects based on this flag. With it enabled, when I inspect the build details for one of the FileAsset projects that is created, I can see under the buildspec section it contains a concrete implementation of a build spec, eg:
{
  "version": "0.2",
  "phases": {
    "install": {
      "commands": [
        "npm install -g cdk-assets@2"
      ]
    },
    "build": {
      "commands": [
        "cdk-assets --path \"MyStack.assets.json\" --verbose publish \"2357296280127ce793d8dbb13e6c907db22f5dcc57a173ba77fcd19a76d8f444:12345678910-eu-west-2\""
      ]
    }
  }
}
With the flag disabled, the buildspec simply contains a pointer to a buildspec file as below, which it then fails to find...
buildspec-c866864112c35d54804951dbe96b99440c9b891fde-FileAsset.yaml
Self-mutation has to be enabled - currently, asset updates mutate the pipeline.
Reference: https://github.com/aws/aws-cdk/issues/9080
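In practice that means leaving selfMutation at its default (true) in the pipeline above. A sketch of the change, reusing the names from the question (synthStep stands in for the same CodeBuildStep('SynthStep', { ... }) shown above):
const pipeline = new CodePipeline(this, 'Pipeline', {
  publishAssetsInParallel: false,
  // selfMutation defaults to true; dropping the explicit `false` lets the
  // pipeline update itself, which the generated FileAsset buildspecs rely on
  pipelineName: fullStackName('Pipeline', app),
  synth: synthStep, // the same CodeBuildStep('SynthStep', { ... }) as in the question
});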

Cannot get webpack --watch or dev server to work using Lando to run a local Drupal environment

I've scoured the internet and have bits and pieces but nothing is coming together for me. I have a local Drupal environment running with Lando. I've successfully installed and configured webpack. Everything is working except when I try to watch or hot reload.
When I run lando npm run build-dev (which currently uses webpack --watch), I can see my changes compiled successfully into the correct folder. However, when I refresh my Drupal site, I do not see those changes. The only time I see my updated JS is after I run lando drush cr to clear the cache. The same thing happens when I try to configure webpack-dev-server: I can get everything to watch for changes and compile correctly, but I cannot get my browser to reload my files; they stay cached. I'm at a loss.
I've tried configuring a proxy in my .lando.yml and have tried different things with the devServer config options. I'm just not finding a concise answer, and I don't have the knowledge to understand exactly what is happening. I believe it has to do with the Docker containers not being exposed to webpack (??), but I don't understand how to configure this properly.
These are the scripts I have set up in my package.json: build outputs my production-ready files into i_screamz/js/dist, build:dev starts a watch and compiles non-minified versions to i_screamz/js/dist-dev, and start is in here from trying to get the devServer to work. I'd like to get webpack-dev-server running, as I'd love to have reloading working.
"scripts": {
"start": "npm run build:dev",
"build:dev": "webpack --watch --progress --config webpack.config.js",
"build": "NODE_ENV=production webpack --progress --config webpack.config.js"
},
This is my webpack.config.js - no sass yet, this is just a working modular js build at this point.
const path = require("path");
const BrowserSyncPlugin = require('browser-sync-webpack-plugin');
const isDevMode = process.env.NODE_ENV !== 'production';
module.exports = {
mode: isDevMode ? 'development' : 'production',
devtool: isDevMode ? 'source-map' : false,
entry: {
main: ['./src/index.js']
},
output: {
filename: isDevMode ? 'main-dev.js' : 'main.js',
path: isDevMode ? path.resolve(__dirname, 'js/dist-dev') : path.resolve(__dirname, 'js/dist'),
publicPath: '/web/themes/custom/[MYSITE]/js/dist-dev'
},
module: {
rules: [
{
test: /\.js$/,
exclude: /node_modules/,
use: {
loader: 'babel-loader'
}
}
]
},
plugins: [
new BrowserSyncPlugin({
proxy: {
target: 'http://[MYSITE].lndo.site/',
proxyReq: [
function(proxyReq) {
proxyReq.setHeader('Cache-Control', 'no-cache, no-store');
}
]
},
open: false,
https: false,
files: [
{
match: ['**/*.css', '**/*.js'],
fn: (event, file) => {
if (event == 'change') {
const bs = require("browser-sync").get("bs-webpack-plugin");
if (file.split('.').pop()=='js') {
bs.reload();
} else {
bs.stream();
}
}
}
}
]
}, {
// prevent BrowserSync from reloading the page
// and let Webpack Dev Server take care of this
reload: false,
injectCss: true,
name: 'bs-webpack-plugin'
}),
],
watchOptions: {
aggregateTimeout: 300,
ignored: ['**/*.woff', '**/*.json', '**/*.woff2', '**/*.jpg', '**/*.png', '**/*.svg', 'node_modules'],
}
};
And here is the config I have set up in my .lando.yml. I did have the proxy key in here, but it's been removed since I couldn't get it set up right.
name: [MYSITE]
recipe: pantheon
config:
  framework: drupal8
  site: [MYPANTHEONSITE]
services:
  node:
    type: node
    build:
      - npm install
tooling:
  drush:
    service: appserver
    env:
      DRUSH_OPTIONS_URI: "http://[MYSITE].lndo.site"
  npm:
    service: node
settings.local.php
<?php

/**
 * Disable CSS and JS aggregation.
 */
$config['system.performance']['css']['preprocess'] = FALSE;
$config['system.performance']['js']['preprocess'] = FALSE;
I've updated my code files above to reflect a final working setup with webpack. The main answer was a setting in
/web/sites/default/settings.local.php
Disable CSS & JS aggregation:
$config['system.performance']['css']['preprocess'] = FALSE;
$config['system.performance']['js']['preprocess'] = FALSE;
I found a working setup from saschaeggi and just tinkered around until I found this setting. So thank you! I also found more about what this means here. This issue took me way longer than I want to admit and it was so simple. I don't know why the 'Disabling Caching css/js aggregation' page never came up when I was furiously googling a caching issue. Hopefully this answer helps anyone else in this very edge case predicament.
I have webpack set up within my theme root folder with my Drupal theme files. I run everything with Lando, including npm. I found a nifty trick from thinkshout to switch the dist-dev and dist libraries for development / production builds.
I should note my setup does not include hot reloading, but I can at least compile my files, refresh immediately, and see my changes. The issue I was having before is that I would have to stop my watches to run drush cr, and that workflow was ridiculous. I've never gotten hot reloading to work with either BrowserSync or Webpack Dev Server, and I might try again, but I need to move on with my life at this point.
I've also not included Sass yet, so these file paths will change to include compilation and output for both .scss and .js files, but this is the basic bare-minimum setup working.

How to skip Javascript output in Webpack 4?

I use Webpack 4 in a project where I only need to compile and bundle styles so far. There's no JavaScript.
Here's the config I have:
const path = require('path');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');

module.exports = {
  entry: {
    'css/bundle': path.resolve(__dirname, 'static/scss/index.scss'),
  },
  output: {
    path: path.resolve(__dirname, 'static'),
  },
  module: {
    rules: [
      {
        test: /\.s[ac]ss$/,
        include: path.resolve(__dirname, 'static/scss'),
        use: [MiniCssExtractPlugin.loader, 'css-loader', 'sass-loader'],
      },
    ],
  },
  plugins: [
    new MiniCssExtractPlugin(),
  ],
};
The problem is that it outputs two files: bundle.css and bundle.js. Is there a way to configure Webpack so that it doesn't output the Javascript bundle? I tried to navigate the docs, tried a dozen different things, but it didn't really work.
One important note here is that if I remove the css-loader, bundling fails. So while css-loader is most likely responsible for outputting the bundle.js file, I'm not entirely sure how to avoid using it.
webpack-extraneous-file-cleanup-plugin has no effect with webpack 4.12.0.
I can suggest removing bundle.js manually with the on-build-webpack plugin:
var fs = require('fs');
var path = require('path');
var WebpackOnBuildPlugin = require('on-build-webpack');

// ...
plugins: [
  // ...
  new WebpackOnBuildPlugin(function () {
    // remove the unwanted JS bundle once the build has finished
    fs.unlinkSync(path.join('path/to/build', 'bundle.js'));
  }),
],
March 2021:
In Webpack 5, the on-build-webpack plugin did not work for me.
I found this:
Webpack Shell Plugin Next
On the project I'm working on, we're using Webpack 5 as a build tool for a CSS pattern library, so we didn't need the main.js in our dist.
Run npm i -D webpack-shell-plugin-next
Then in webpack.config.ts (just showing the pertinent parts):
import fs from "fs";
import path from "path";
import WebpackShellPluginNext from "webpack-shell-plugin-next";

const config = {
  output: {
    path: path.resolve(__dirname, "static/dist")
  },
  plugins: [
    // Run commands before or after webpack 5 builds:
    new WebpackShellPluginNext({
      onBuildEnd: {
        scripts: [
          () => {
            fs.unlinkSync(path.join(config.output.path, "main.js"));
          }
        ]
      }
    })
  ]
};

export default config;
Unfortunately, this is just the way that webpack currently works. However, we are not alone in this problem! There's a plugin to clean up any unwanted files:
install the plugin:
yarn add webpack-extraneous-file-cleanup-plugin -D
and then in your config:
const ExtraneousFileCleanupPlugin = require('webpack-extraneous-file-cleanup-plugin');

plugins: [
  new ExtraneousFileCleanupPlugin({
    extensions: ['.js'],
    minBytes: 1024,
    paths: ['./static']
  }),
]
I simply delete the unneeded output with rm in package.json:
"scripts": {
"build": "npm run clean && webpack -p && rm ./dist/unneeded.js"
},
The webpack-remove-empty-scripts plugin, compatible with webpack 5, covers this issue. It removes the unexpected empty JS files.
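For reference, a minimal sketch of that approach applied to the config from the question (plugin options left at their defaults; treat this as illustrative rather than a verified setup):
const path = require('path');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
const RemoveEmptyScriptsPlugin = require('webpack-remove-empty-scripts');

module.exports = {
  entry: {
    'css/bundle': path.resolve(__dirname, 'static/scss/index.scss'),
  },
  output: {
    path: path.resolve(__dirname, 'static'),
  },
  module: {
    rules: [
      {
        test: /\.s[ac]ss$/,
        use: [MiniCssExtractPlugin.loader, 'css-loader', 'sass-loader'],
      },
    ],
  },
  plugins: [
    // drops the empty JS file emitted for the style-only entry
    new RemoveEmptyScriptsPlugin(),
    new MiniCssExtractPlugin(),
  ],
};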

Requiring test files with webpack for use with mocha or similar

I'm trying to use webpack to bundle tests into a pack that I can pass directly to mocha. My webpack config looks something like this:
module.exports = {
  entry: ...,
  output: ...,
  module: {
    rules: [
      {
        test: /\.jsx?(.erb)?$/,
        exclude: /node_modules/,
        loader: 'babel-loader',
        options: {
          presets: [
            'react',
            [ 'latest', { 'es2015': { 'modules': false } } ]
          ]
        }
      }
    ]
  },
  plugins: [],
  resolve: {
    extensions: [ '.js', '.jsx' ],
    modules: [
      path.resolve('../app/javascript'),
      path.resolve('../vendor/node_modules')
    ]
  },
  resolveLoader: {
    modules: [ path.resolve('../vendor/node_modules') ]
  }
}
(From https://github.com/rails/webpacker)
My entry point looks like this, which I've seen references to elsewhere:
var context = require.context('../path/to/tests', true, /.+\.test\.js?$/);
context.keys().forEach(context);
module.exports = context;
Now this works, and produces a bundle, say tests.js, that I can pass to mocha:
$ mocha tests.js
However, this makes webpack recompile all test files every time something is changed, which is really slow. For my application code, where I import modules using regular import statements, webpack only recompiles files that have changed.
Changing my entry point to something like:
require('../path/to/tests/foo.test.js');
require('../path/to/tests/bar.test.js');
require('../path/to/tests/baz.test.js');
...
This seems to have the desired effect, but there are hundreds of test files, and it would be cumbersome to manually import or require each one.
Webpack is recompiling your tests because you are using require.context. The docs for using it are here. Normally, when running a series of tests with mocha, you can just use mocha ../path/**.test.js to run all of them. This, of course, won't work if you are using newer JS features that are not supported by Node.
One way to get around this is to pass compiler options to mocha, which will then use Babel under the hood. A more detailed explanation can be found in this blog post. The general idea is to run mocha with --compilers js:babel-core/register, which picks up your .babelrc file.
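For instance, a .babelrc that babel-core/register could pick up might mirror the presets from the webpack config in the question. Note that, unlike the webpack build, code run directly under Node needs ES modules transformed to CommonJS, so the modules: false override is dropped here (a sketch, not from the original answer):
.babelrc
{
  "presets": [
    "react",
    "latest"
  ]
}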
If you are on a Mac or Linux, you can use the following command to make writing a bash expression that finds all of your tests a little easier:
find ./path -name '*.test.js' | xargs mocha

Clarification on grunt-protractor-coverage syntax for rails app backend?

Background
I'm trying to use grunt-protractor-coverage in my Grunt script to get code coverage for Protractor functional e2e tests. To get started, I used this tutorial with some minor modifications, and it works perfectly. Using this as a guide, I created a new Gruntfile, substituting a Rails app backend for the "express" app.
The Problem
When running my gruntfile, I get the following stack trace:
../dummy/node_modules/grunt-protractor-coverage/node_modules/protractor/node_modules/glob/glob.js:130
throw new Error("must provide pattern")
^
Error: must provide pattern
at new Glob (../dummy/node_modules/grunt-protractor-coverage/node_modules/protractor/node_modules/glob/glob.js:130:11)
at glob (../dummy/node_modules/grunt-protractor-coverage/node_modules/protractor/node_modules/glob/glob.js:57:11)
at Function.globSync [as sync] (../dummy/node_modules/grunt-protractor-coverage/node_modules/protractor/node_modules/glob/glob.js:76:10)
at Function.ConfigParser.resolveFilePatterns (../dummy/node_modules/grunt-protractor-coverage/node_modules/protractor/lib/configParser.js:89:26)
at Runner.run (../dummy/node_modules/grunt-protractor-coverage/node_modules/protractor/lib/runner.js:323:24)
at process.<anonymous> (../dummy/node_modules/grunt-protractor-coverage/node_modules/protractor/lib/runFromLauncher.js:32:14)
at process.EventEmitter.emit (events.js:98:17)
at handleMessage (child_process.js:318:10)
at Pipe.channel.onread (child_process.js:345:11)
[launcher] Runner Process Exited With Error Code: 8
Tracing through the code in grunt's task.js file via node-inspector (https://github.com/node-inspector/node-inspector), it seems there are two possible issues:
1. I'm missing some parameter from my config file which would correctly retrieve the files needed
2. There is a syntax issue
Any idea why it's throwing this error?
My Config File
protractor_coverage: {
  options: {
    configFile: '/usr/local/lib/node_modules/protractor/referenceConf.js', // Default config file
    keepAlive: true, // If false, the grunt process stops when the test fails.
    noColor: false, // If true, protractor will not use colors in its output.
    coverageDir: '<%= dirs.instrumentedE2E %>',
    args: {}
  },
  phantom: {
    options: {
      args: {
        baseUrl: 'http://localhost:3000/',
        // Arguments passed to the command
        'browser': 'phantomjs'
      }
    }
  },
  chrome: {
    options: {
      args: {
        baseUrl: 'http://localhost:3000/',
        // Arguments passed to the command
        'browser': 'chrome'
      }
    }
  }
},
This error means that the plugin was unable to locate your spec files. There were a couple of bugs related to this that have been fixed in recent releases.
You'll generally want your protractorConf.js to be part of your project. I generally put it in a directory called 'tests'.
So your options would then look like:
options: {
  configFile: 'tests/protractorConf.js',
  keepAlive: true, // If false, the grunt process stops when the test fails.
  noColor: false, // If true, protractor will not use colors in its output.
  coverageDir: '<%= dirs.instrumentedE2E %>',
  args: {}
},
You could then put your specs in a 'tests/specs' directory, and in this protractorConf.js, reference them as:
specs: [
  'tests/specs/**/*.spec.js',
  '!**/exclude.spec.js'
],
This would get any file that ends with .spec.js in /tests/specs and any subdirectories therein, unless the file is named exclude.spec.js.
I had this issue also; for me, I had to change the path in my protractorConf file.
I originally had
specs: [
  './e2e/**/*.spec.js'
],
which worked fine with protractor and grunt-protractor-runner.
But for some reason this did not run with protractor-coverage. I changed it to
specs: [
  'test/e2e/**/*.spec.js'
],
and this solved my issue. Basically, protractor-coverage looks at the path from the base rather than from the config file.
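One way to sidestep that base-path-versus-config-file ambiguity is to build the spec globs from __dirname so they resolve the same way regardless of where the runner starts from. A sketch, assuming protractorConf.js lives in a tests/ directory (not from the original answers):
// tests/protractorConf.js
const path = require('path');

exports.config = {
  // absolute globs resolve identically whether patterns are taken from the
  // project base or from the config file's own directory
  specs: [path.join(__dirname, 'e2e/**/*.spec.js')],
};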
