I am using Grunt + BrowserSync + grunt-php. The server starts normally, but whenever I make changes to PHP files the changes are not reloaded automatically in the browser; I have to reload the page manually despite having the settings in place. I've been trying to solve this issue for the past week with no success, and the other online sources I tried didn't help either. Please help.
Directory structure:
my_app/
  src/
    index.php
    about.php
  dist/
Gruntfile.js:
"use strict";
module.exports = function (grunt) {
grunt.initConfig({
pkg: grunt.file.readJSON('package.json'),
watch: {
php: {
files: ['src/**/*.php']
}
},
browserSync: {
dev: {
bsFiles: {
src: 'src/**/*.php'
},
options: {
proxy: '127.0.0.1:8010', //our PHP server
port: 8080, // our new port
open: true,
watchTask: true
}
}
},
php: {
dev: {
options: {
port: 8010,
base: 'src'
}
}
}
});
grunt.registerTask('default', [
'php', // Using the PHP instance as a proxy
'browserSync',
'watch' // Any other watch tasks you want to run
]);
};
A kind soul helped me with the answer. I don't take credit for it and would like to share the solution so that it may help someone in need. Here it is:
1) Make sure that the PHP file you want to reload has a body tag.
2) Include the following JS snippet in the page:
<script id="__bs_script__">
//<![CDATA[
document.write("<script async src='/browser-sync/browser-sync-client.js?v=2.17.5'><\/script>".replace("HOST", location.hostname));
//]]>
</script>
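For what it's worth, this snippet is the same client script BrowserSync normally injects on its own, and that injection only happens in responses that contain a body tag, which is why step 1 matters. As a stand-alone illustration (my own sketch, not part of the original answer), the same proxy-plus-watch setup can be expressed directly with the browser-sync Node API; it assumes the PHP server started by grunt-php is already listening on 127.0.0.1:8010:

// Minimal browser-sync proxy sketch (illustration only, not the accepted fix).
// Assumes a PHP server is already serving src/ on 127.0.0.1:8010.
const browserSync = require('browser-sync').create();

browserSync.init({
  proxy: '127.0.0.1:8010',  // forward requests to the PHP server
  port: 8080,               // browse the site through http://localhost:8080
  files: ['src/**/*.php'],  // reload the browser whenever a PHP file changes
  open: true
});

The grunt-browser-sync config above does essentially the same thing through its bsFiles and proxy options.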
I've scoured the internet and have bits and pieces, but nothing is coming together for me. I have a local Drupal environment running with Lando. I've successfully installed and configured webpack. Everything works except when I try to watch or hot reload.
When I run lando npm run build:dev (which currently uses webpack --watch) I can see my changes compiled successfully into the correct folder. However, when I refresh my Drupal site, I do not see those changes. The only time I see my updated JS is after I run lando drush cr to clear the cache. The same thing happens when I try to configure webpack-dev-server: I can get everything to watch for changes and compile correctly, but I cannot get my browser to reload my files; they stay cached. I'm at a loss.
I've tried configuring a proxy in my .lando.yml, and have tried different things with the config options for devServer. I'm just not getting a concise answer, and I don't have the knowledge to understand exactly what is happening. I believe it has to do with the Docker containers not being exposed to webpack (??), but I don't understand how to configure this properly.
These are the scripts I have set up in my package.json: build outputs my production-ready files into i_screamz/js/dist, build:dev starts a watch and compiles non-minified versions to i_screamz/js/dist-dev, and start is in here from trying to get the devServer to work. I'd like to get webpack-dev-server running, as I'd love to have reloading working.
"scripts": {
"start": "npm run build:dev",
"build:dev": "webpack --watch --progress --config webpack.config.js",
"build": "NODE_ENV=production webpack --progress --config webpack.config.js"
},
This is my webpack.config.js - no sass yet, this is just a working modular js build at this point.
const path = require("path");
const BrowserSyncPlugin = require('browser-sync-webpack-plugin');

const isDevMode = process.env.NODE_ENV !== 'production';

module.exports = {
  mode: isDevMode ? 'development' : 'production',
  devtool: isDevMode ? 'source-map' : false,
  entry: {
    main: ['./src/index.js']
  },
  output: {
    filename: isDevMode ? 'main-dev.js' : 'main.js',
    path: isDevMode ? path.resolve(__dirname, 'js/dist-dev') : path.resolve(__dirname, 'js/dist'),
    publicPath: '/web/themes/custom/[MYSITE]/js/dist-dev'
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: {
          loader: 'babel-loader'
        }
      }
    ]
  },
  plugins: [
    new BrowserSyncPlugin({
      proxy: {
        target: 'http://[MYSITE].lndo.site/',
        proxyReq: [
          function (proxyReq) {
            proxyReq.setHeader('Cache-Control', 'no-cache, no-store');
          }
        ]
      },
      open: false,
      https: false,
      files: [
        {
          match: ['**/*.css', '**/*.js'],
          fn: (event, file) => {
            if (event == 'change') {
              const bs = require("browser-sync").get("bs-webpack-plugin");
              if (file.split('.').pop() == 'js') {
                bs.reload();
              } else {
                bs.stream();
              }
            }
          }
        }
      ]
    }, {
      // prevent BrowserSync from reloading the page
      // and let Webpack Dev Server take care of this
      reload: false,
      injectCss: true,
      name: 'bs-webpack-plugin'
    }),
  ],
  watchOptions: {
    aggregateTimeout: 300,
    ignored: ['**/*.woff', '**/*.json', '**/*.woff2', '**/*.jpg', '**/*.png', '**/*.svg', 'node_modules'],
  }
};
And here is the config I have set up in my .lando.yml - I did have the proxy key in here, but it's been removed since I couldn't get it set up right.
name: [MYSITE]
recipe: pantheon
config:
  framework: drupal8
  site: [MYPANTHEONSITE]
services:
  node:
    type: node
    build:
      - npm install
tooling:
  drush:
    service: appserver
    env:
      DRUSH_OPTIONS_URI: "http://[MYSITE].lndo.site"
  npm:
    service: node
settings.local.php
<?php
/**
* Disable CSS and JS aggregation.
*/
$config['system.performance']['css']['preprocess'] = FALSE;
$config['system.performance']['js']['preprocess'] = FALSE;
I've updated my code files above to reflect a final working setup with webpack. The main answer was a setting in
/web/sites/default/settings.local.php
Disable CSS & JS aggregation:
$config['system.performance']['css']['preprocess'] = FALSE;
$config['system.performance']['js']['preprocess'] = FALSE;
I found a working setup from saschaeggi and just tinkered around until I found this setting. So thank you! I also found more about what this means here. This issue took me way longer than I want to admit and it was so simple. I don't know why the 'Disabling Caching css/js aggregation' page never came up when I was furiously googling a caching issue. Hopefully this answer helps anyone else in this very edge case predicament.
I have webpack setup within my theme root folder with my Drupal theme files. I run everything with Lando, including NPM. I found a nifty trick to switch the dist-dev and dist libraries for development / production builds from thinkshout.
I should note my setup does not include hot reloading, but I can at least compile my files, refresh immediately, and see my changes. The issue I was having before was that I had to stop my watches to run drush cr, and that workflow was ridiculous. I've never gotten hot reloading to work with either BrowserSync or Webpack Dev Server; I might try again, but I need to move on with my life at this point.
I also have not included Sass yet, so these file paths will change to include compilation and output for both .scss and .js files, but this is the basic bare-minimum setup working.
I wanted to tackle file-cache updating in a Svelte app and wanted this to be part of the rollup build. I decided to add a querystring parameter to the file references (such as 'index.html?v=0.1') in the distributed build scripts. I created a constant '__cVersion__' in my rollup.config.js script and tried to use 'rollup-plugin-modify', but that only updated my main.js and App.svelte code (the files being compiled). I also tried the '@rollup/plugin-replace' plugin with the same results. I needed the files I was copying (not building) from src to public to also have instances of '__cVersion__' replaced in the scripts.
The following was my initial rollup.config.js export function (the string replacement that did not work):
export default {
  input: 'src/main.js',
  output: {
    sourcemap: true,
    format: 'iife',
    name: 'app',
    file: 'public/build/bundle.js'
  },
  plugins: [
    // this only seems to work on the main.js and .svelte files
    modify({
      '__cVersion__': 'c0.1.19'
    }),
    svelte({
      dev: !production,
      css: css => {
        css.write('public/build/bundle.css');
      }
    }),
    copy({
      targets: [
        { src: 'src/bs4.4.1.css', dest: 'public/' },
        { src: 'src/sw.js', dest: 'public/' },
        { src: 'src/index.html', dest: 'public/' },
        { src: 'src/manifest.json', dest: 'public/' },
        { src: 'src/images/*', dest: 'public/images/' }
      ]
    }),
    resolve({
      browser: true,
      dedupe: ['svelte']
    }),
    commonjs(),
    // cache files
    workbox({
      mode: 'injectManifest',
      options: {
        swSrc: 'src/sw.js',
        swDest: 'public/sw.js',
        globDirectory: 'public',
        globPatterns: [
          '**/*.{html,json,js,css,png,map}',
          './manifest.json',
          './images/**',
          './bs4.4.1.css',
          './index.html'
        ]
      }
    }),
    !production && serve(),
    !production && livereload('public'),
    production && terser()
  ],
  watch: {
    clearScreen: false
  }
};
By default, the rollup-plugin-copy plugin will trigger on rollup's buildEnd hook.
Setting the hook to writeBundle fixed this issue for me, like so:
copy({
  targets: [
    {
      src: ...,
      dest: ...
    },
    ...
  ],
  hook: "writeBundle",
}),
I decided to take a different approach since I realized the build process needed to complete before I tried to replace '__cVersion__' in the files. After some trial and error I settled on this code:
https://github.com/kuhlaid/svelte2/releases/tag/v0.1.7
If you search the source code for '__cVersion__' you will see where I am adding the file revision string to try and force a file cache update...however, that didn't fully fix the issue.
I then looked at the service worker (sw.js) and realized the Workbox 'injectManifest' was actually handling the file revisions. The only problem with my current setup was that I had added '__cVersion__' constants to my scripts, but Workbox never saw the replacements since Workbox processed the service worker before I replaced the constants.
What I probably need to do is copy the src files to a 'staging directory' where I can replace the 'cache' constants in the scripts, and then run the rollup build off of the staging files. This 'should' cause Workbox to treat the files as updated and thus assign them different revision numbers in the service worker file. I will try to update this thread when I have that issue worked out.
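To make that staging idea concrete, here is a rough sketch (my own, not code from the release linked above) of the copy-then-replace step; the .staging directory name, the file extensions, and the version string are assumptions for illustration, and rollup's input and copy() targets would then point at .staging instead of src:

// Copy src/ to .staging/ and replace the __cVersion__ placeholder in the copies,
// so the later rollup/Workbox pass sees the already-versioned files.
const fs = require('fs');
const path = require('path');

const VERSION = 'c0.1.19'; // example version string

function replaceVersion(dir) {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      replaceVersion(full);
    } else if (/\.(js|html|json|css)$/.test(entry.name)) {
      const text = fs.readFileSync(full, 'utf8');
      fs.writeFileSync(full, text.split('__cVersion__').join(VERSION));
    }
  }
}

fs.cpSync('src', '.staging', { recursive: true }); // fs.cpSync needs Node 16.7+
replaceVersion('.staging');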
My current application is set up using Ruby on Rails and React/Typescript. I am trying to set up hot reloading.
Here is the current folder structure
Project Root
- app => all the rails code
- frontend => all the react code
- webpack => list of configuration files, like development.js and production.js
This project isn't using react_on_rails or webpacker. The frontend code is kept separate from the backend code. The Rails backend serves up an HTML page containing
<div id='root' />
and the react code will run off of that.
This is the command I tried to run to get hot reloading to work
node_modules/.bin/webpack-dev-server --config=./webpack/development.js --hotOnly --entry=../frontend/Entry.tsx --allowedHosts=localhost:3000
However, not only is hot reloading not working, but the changes I made are not showing up in the browser either. Everything looks fine in the terminal.
My issue here is I technically have two servers running at the same time.
localhost:3000 => Rails server
localhost:8080 => Webpack dev server.
If I change the webpack server to point to 3000 as well, the Rails app will not work properly.
Is there a way I can get hot reloading to work with this setup?
Here are the webpack versions:
"webpack": "^4.20.1",
"webpack-cli": "^3.1.1",
"webpack-dev-server": "^3.7.1"
webpack.development.config.js
const webpack = require('webpack');
const path = require('path');
const HtmlWebpackPlugin = require('html-webpack-plugin');
const ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');
const CaseSensitivePathsPlugin = require('case-sensitive-paths-webpack-plugin');

module.exports = {
  context: __dirname,
  entry: '../frontend/Entry.tsx',
  devtool: 'source-maps',
  resolve: {
    extensions: ['*', '.js', '.jsx', '.ts', '.tsx'],
    modules: [
      'node_modules',
      path.resolve(__dirname, '../frontend'),
      path.resolve(__dirname, '../node_modules')
    ]
  },
  output: {
    path: path.join(__dirname, `../public/javascripts/`),
    publicPath: `/javascripts/`,
    filename: '[name]-[hash].js'
  },
  module: {
    rules: [
      {
        test: /\.(t|j)sx?$/,
        loader: 'ts-loader',
        options: {
          // disable type checker - we will use it in fork plugin
          transpileOnly: true
        }
      },
      {
        enforce: 'pre',
        test: /\.(t|j)sx?$/,
        loader: 'source-map-loader'
      },
      {
        test: /\.css$/,
        use: ['style-loader', 'css-loader']
      },
      {
        test: /\.scss$/,
        use: ['style-loader', 'css-loader', 'sass-loader']
      },
      {
        test: /\.(png|svg|jpg|gif)$/,
        use: [
          {
            loader: 'file-loader',
            options: {
              name: '[name]-[hash].[ext]',
              outputPath: 'images/'
            }
          },
          {
            loader: 'image-webpack-loader',
            options: {
              pngquant: {
                quality: '40',
                speed: 4
              }
            }
          }
        ]
      }
    ]
  },
  plugins: [
    new webpack.DefinePlugin({
      'process.env': {
        NODE_ENV: JSON.stringify('development')
      }
    }),
    new HtmlWebpackPlugin({
      template: path.join(__dirname, '..', 'application.html'),
      filename: path.join(__dirname, '..', 'app', 'views', 'layouts', '_javascript.html.erb')
    }),
    // runs typescript type checker on a separate process.
    new ForkTsCheckerWebpackPlugin({
      checkSyntacticErrors: true,
      tsconfig: '../tsconfig.json'
    }),
    new CaseSensitivePathsPlugin()
  ],
  optimization: {
    splitChunks: { chunks: 'all' }
  }
};
Since you are setting up webpack dev server for the first time, the problem is twofold:
Set up webpack dev server
Configure hot reload
Setting up webpack dev server
I presume your Rails app is the API server. webpack-dev-server is also an HTTP server; in fact it's just a wrapper around Express.
While using webpack dev server during development, the bundles are served by the dev server, and all XHR requests are made to it. In order to route these requests to your app server, you need to add proxy rules to your webpack config.
At a high level the flow looks as follows:
browser ---(xhr requests)-----> webpack-dev-server -----(proxy api requests)---> app server
In order to proxy all API requests to your Rails server, your API routes should be prepended with /api, e.g. /api/customers, so that every request matching /api is forwarded to the Rails server.
A sample config to support the above flow would look something like this in your webpack config file:
module.exports = {
  // ...your other configs
  devServer: {
    contentBase: path.join(__dirname, 'public/'),
    port: 8080,
    publicPath: 'http://localhost:8080/', // Path of your dev server
    historyApiFallback: true, // add this if you are not using browser router
    proxy: {
      '/api': { // string to look for proxying requests to api
        target: 'http://localhost:3000', // Path of your rails api server
      },
    },
  },
  // ...your other configs
}
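For illustration (my addition, not part of the original answer): with the /api proxy rule above, a request made from the React code goes to webpack-dev-server on port 8080 and is transparently forwarded to the Rails server on port 3000. Using the /api/customers example route from above:

// Runs in the browser, which is talking to http://localhost:8080 (webpack-dev-server).
// The dev server matches the '/api' prefix and proxies the request to
// http://localhost:3000/api/customers on the Rails side.
fetch('/api/customers')
  .then(response => response.json())
  .then(customers => console.log(customers))
  .catch(error => console.error(error));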
Setting up hot reload
In order to set up hot reloading, I would recommend using Dan Abramov's react-hot-loader, as it's less buggy with HMR patching.
Setting up HMR is easy:
Add the dependency: yarn add react-hot-loader
Add the Babel plugin to your .babelrc:
{
"plugins": ["react-hot-loader/babel"]
}
Mark your root component as hot-exported:
import { hot } from 'react-hot-loader/root'; // this should be imported before react and react-dom
import React from 'react';

const App = () => <div>Hello World!</div>;

export default hot(App);
Note: it's safe to add react-hot-loader to your regular dependencies, because in a production build the hot reload package will be stripped out.
To start the webpack server in hot mode, you can add a script like below in your package.json.
"scripts": {
"start": "webpack-dev-server --hot --mode development --config ./webpack.dev.config"
}
I use Webpack 4 in a project where, so far, I only need to compile and bundle styles. There's no JavaScript.
Here's the config I have:
const path = require('path');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');

module.exports = {
  entry: {
    'css/bundle': path.resolve(__dirname, 'static/scss/index.scss'),
  },
  output: {
    path: path.resolve(__dirname, 'static'),
  },
  module: {
    rules: [
      {
        test: /\.s[ac]ss$/,
        include: path.resolve(__dirname, 'static/scss'),
        use: [MiniCssExtractPlugin.loader, 'css-loader', 'sass-loader'],
      },
    ],
  },
  plugins: [
    new MiniCssExtractPlugin(),
  ],
};
The problem is that it outputs two files: bundle.css and bundle.js. Is there a way to configure Webpack so that it doesn't output the JavaScript bundle? I tried to navigate the docs and tried a dozen different things, but nothing really worked.
One important note here is that if I remove the css-loader, bundling fails. So while css-loader is most likely responsible for outputting the bundle.js file, I'm not entirely sure how to avoid using it.
webpack-extraneous-file-cleanup-plugin has no effect with webpack 4.12.0.
I can suggest removing bundle.js manually with the on-build-webpack plugin:
var fs = require('fs');
var path = require('path');
var WebpackOnBuildPlugin = require('on-build-webpack');

// ...
plugins: [
  // ...
  new WebpackOnBuildPlugin(function () {
    fs.unlinkSync(path.join('path/to/build', 'bundle.js'));
  }),
],
March 2021:
In Webpack 5, on-build-webpack plugin did not work for me.
I found this:
Webpack Shell Plugin Next
On the project I'm working on, we're using Webpack 5 as a build tool for a CSS pattern library, so we didn't need the main.js in our dist.
Run npm i -D webpack-shell-plugin-next
Then in webpack.config.ts (just showing the pertinent parts):
import fs from "fs";
import path from "path";
import WebpackShellPluginNext from "webpack-shell-plugin-next";

const config = {
  output: {
    path: path.resolve(__dirname, "static/dist")
  },
  plugins: [
    // Run commands before or after webpack 5 builds:
    new WebpackShellPluginNext({
      onBuildEnd: {
        scripts: [
          () => {
            fs.unlinkSync(path.join(config.output.path, "main.js"));
          }
        ]
      }
    })
  ]
};

export default config;
Unfortunately, this is just the way that webpack currently works. However, we are not alone in this problem! There's a plugin to clean up any unwanted files:
install the plugin:
yarn add webpack-extraneous-file-cleanup-plugin -D
and then in your config:
const ExtraneousFileCleanupPlugin = require('webpack-extraneous-file-cleanup-plugin');

plugins: [
  new ExtraneousFileCleanupPlugin({
    extensions: ['.js'],
    minBytes: 1024,
    paths: ['./static']
  }),
]
I simply delete the unneeded output with rm in package.json:
"scripts": {
"build": "npm run clean && webpack -p && rm ./dist/unneeded.js"
},
The webpack-remove-empty-scripts plugin, compatible with webpack 5, covers the current issue: it removes the unexpected empty JS file.
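For completeness, wiring it up looks roughly like this (a minimal sketch based on the plugin's documented default usage, so check its README for the options your version supports):

const MiniCssExtractPlugin = require('mini-css-extract-plugin');
const RemoveEmptyScriptsPlugin = require('webpack-remove-empty-scripts');

module.exports = {
  // ...same entry/output/module rules as the config in the question...
  plugins: [
    new MiniCssExtractPlugin(),
    // Removes the empty .js file webpack emits for the CSS-only entry point.
    new RemoveEmptyScriptsPlugin(),
  ],
};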
I am planning on running my app in Docker. I want to dynamically start, stop, build and run commands on Docker containers. I found a tool named dockerode. Here is the project repo. The project has docs, but I am not understanding them very well. I would like to understand a few things. This is how to create and start a container:
docker.createContainer({Image: 'ubuntu', Cmd: ['/bin/bash'], name: 'ubuntu-test'}, function (err, container) {
  container.start(function (err, data) {
    //...
  });
});
Is it possible to do RUN apt-get update like when we use a Dockerfile, or ADD /path/host /path/docker during a build? How do I move my app into the container after the build?
Let's look at this code:
// tty: true
docker.createContainer({ /*...*/ Tty: true /*...*/ }, function (err, container) {
  /* ... */
  container.attach({stream: true, stdout: true, stderr: true}, function (err, stream) {
    stream.pipe(process.stdout);
  });
  /* ... */
});
How can I know which params I can put in { /*...*/ Tty: true /*...*/ }?
Has anyone tried this package? Please help me get started with it.
Dockerode is just a Node wrapper for the Docker API. You can find all the params you can use for each command in the API docs.
For example, docker.createContainer will call POST /containers/create (docs are here: https://docs.docker.com/engine/reference/api/docker_remote_api_v1.24/#/create-a-container).
Check the files in the lib folder of the dockerode repo to see which API command is wrapped by each dockerode method.
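To illustrate (my own sketch, not part of the answer above): the object passed to docker.createContainer is essentially the JSON body of POST /containers/create, so anything listed in that API doc (Env, WorkingDir, HostConfig.Binds, and so on) can go in it. There is no direct RUN/ADD equivalent at this stage; one rough workaround is to bind-mount your app into the container and run your setup commands as the container's Cmd. The image name and paths below are placeholders:

// Create and start a container, mounting the app from the host and running a
// setup command, roughly standing in for Dockerfile ADD and RUN steps.
var Docker = require('dockerode');
var docker = new Docker({ socketPath: '/var/run/docker.sock' });

docker.createContainer({
  Image: 'ubuntu',
  name: 'ubuntu-test',
  WorkingDir: '/app',
  Cmd: ['/bin/bash', '-c', 'apt-get update && ./start.sh'], // stand-in for RUN steps
  HostConfig: {
    Binds: ['/path/on/host:/app'] // stand-in for ADD: mount your app into the container
  },
  Tty: true
}, function (err, container) {
  if (err) return console.error(err);
  container.start(function (err) {
    if (err) return console.error(err);
    console.log('container started');
  });
});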