How can one split API documentation into multiple files using Swagger 2.0?

According to the Swagger 2.0 spec, it should be possible to do this: I am referencing a PathObject using $ref, which points to another file. We used to be able to do this nicely with Swagger 1.2, but Swagger UI does not seem to be able to read the referenced PathObject from the other file.
Is this part of the spec too new and not yet supported? Is there a way to split each path's documentation into a separate file?
{
  "swagger": "2.0",
  "basePath": "/rest/json",
  "schemes": [
    "http",
    "https"
  ],
  "info": {
    "title": "REST APIs",
    "description": "desc",
    "version": "1.0"
  },
  "paths": {
    "/time": {
      "$ref": "anotherfile.json"
    }
  }
}

To support multiple files, your tooling has to support dereferencing the $ref fields. However, I would not recommend delivering the Swagger file with unresolved references. Our Swagger definition spans around 30-40 files, and delivering them individually via HTTP/1.1 could slow down any reading application.
Since we are building JavaScript libraries too, we already had a Node.js-based build system using gulp. On npm you can find several libraries that support dereferencing, which we use to build one big Swagger file.
Our base file looks like this (shortened):
swagger: '2.0'
info:
  version: 2.0.0
  title: App
  description: Example
basePath: /api/2
paths:
  $ref: "routes.json"
definitions:
  example:
    $ref: "schema/example.json"
The routes.json file is generated from our routing file. For this we use a gulp task based on swagger-jsdoc, like this:
var gulp = require('gulp');
var fs = require('fs');
var gutil = require('gulp-util');
var swaggerJSDoc = require('swagger-jsdoc');

gulp.task('routes-swagger', [], function (done) {
  var options = {
    swaggerDefinition: {
      info: {
        title: 'Routes only, do not use, only for reference',
        version: '1.0.0',
      },
    },
    apis: ['./routing.php'], // Path to the API docs
  };
  var swaggerSpec = swaggerJSDoc(options);
  fs.writeFile('public/doc/routes.json', JSON.stringify(swaggerSpec.paths, null, "\t"), function (error) {
    if (error) {
      gutil.log(gutil.colors.red(error));
      done(error);
    } else {
      gutil.log(gutil.colors.green("Successfully generated routes include."));
      done();
    }
  });
});
And for generating the Swagger file, we use a build task using SwaggerParser, like this:
var gulp = require('gulp');
var bootprint = require('bootprint');
var bootprintSwagger = require('bootprint-swagger');
var SwaggerParser = require('swagger-parser');
var gutil = require('gulp-util');
var fs = require('fs');

gulp.task('swagger', ['routes-swagger'], function () {
  SwaggerParser.bundle('public/doc/swagger.yaml', {
    "cache": {
      "fs": false
    }
  })
  .then(function (api) {
    fs.writeFile('public/doc/swagger.json', JSON.stringify(api, null, "\t"), function (error) {
      if (error) {
        gutil.log(gutil.colors.red(error));
      } else {
        gutil.log("Bundled API %s, Version: %s", gutil.colors.magenta(api.info.title), api.info.version);
      }
    });
  })
  .catch(function (err) {
    gutil.log(gutil.colors.red.bold(err));
  });
});
With this implementation we can maintain a rather large Swagger specification, and we are not tied to a specific programming language or framework, since we define the paths in comments next to the real routing definitions. An example annotation is sketched below. (Note: the gulp tasks are split across multiple files too.)
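For illustration, a route annotation that swagger-jsdoc picks up looks roughly like this. This is a minimal sketch, not taken from our actual routing file; the /time path and descriptions are invented:

/**
 * @swagger
 * /time:
 *   get:
 *     description: Returns the current server time (hypothetical example)
 *     responses:
 *       200:
 *         description: The current time
 */

swagger-jsdoc collects these YAML fragments from the comments and merges them into the paths object that the gulp task above writes to routes.json.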

While it may theoretically be possible to do this in the future, the solution is still not fully baked into the supporting tools, so for now I'd highly recommend keeping it in one file.
If you're looking for a way to manage and navigate the Swagger definition, I'd recommend using the YAML format of the spec, where you can add comments; that may ease navigation and the eventual splitting of a large definition.

You can also use the JSON Refs library to resolve such a multi-file Swagger spec.
I've written about it in this blog post.
There is also this GitHub repo that demonstrates how all of this works.
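As a rough sketch (mine, not from the blog post or repo), a bundling script with json-refs could look like the following. It assumes an index.yaml entry file, js-yaml for parsing, and json-refs' resolveRefsAt API with a path-loader processContent hook; check the json-refs docs for the exact option names in your version:

var JsonRefs = require('json-refs');
var YAML = require('js-yaml');
var fs = require('fs');

JsonRefs.resolveRefsAt('./index.yaml', {
  // parse each loaded file as YAML before its $refs are resolved
  // (use YAML.load with js-yaml v4+, safeLoad with v3)
  loaderOptions: {
    processContent: function (res, callback) {
      callback(null, YAML.safeLoad(res.text));
    }
  }
}).then(function (results) {
  // results.resolved is the fully dereferenced document
  fs.writeFileSync('swagger.json', JSON.stringify(results.resolved, null, 2));
}).catch(function (err) {
  console.error(err);
});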

My solution to this problem is to use the package below to resolve the references:
https://www.npmjs.com/package/json-schema-ref-parser
Here is the code snippet that generates the Swagger UI using that library. I was using Express.js for my Node server.
import express from 'express';
import * as path from 'path';
import refParser from '@apidevtools/json-schema-ref-parser';
import swaggerUi from 'swagger-ui-express';

const port = 3100;
const app = express();

app.get('/', async (req, res) => {
  res.redirect('/api-docs');
});

app.use(
  '/api-docs',
  async function (req: express.Request, res: express.Response, next: express.NextFunction) {
    const schemaFilePath = path.join(__dirname, 'schema', 'openapi.yml');
    try {
      // Resolve $ref in schema
      const swaggerDocument = await refParser.dereference(schemaFilePath);
      (req as any).swaggerDoc = swaggerDocument;
      next();
    } catch (err) {
      console.error(err);
      next(err);
    }
  },
  swaggerUi.serve,
  swaggerUi.setup()
);

app.listen(port, () => console.log(`Local web server listening on port ${port}!`));
Take a look at my GitHub repository to see how it works.

Related

NestJS / Swagger - Not Getting Complete Descriptions

I'm having issues getting Swagger to work correctly. I can get an example of the request body, but not the response or the swagger-json. The response shows as {} and the swagger-json is:
{
  "statusCode": 404,
  "message": "Cannot GET /swagger-json",
  "error": "Not Found"
}
My nest-cli.json file is:
{
  "collection": "@nestjs/schematics",
  "sourceRoot": "src",
  "compilerOptions": {
    "deleteOutDir": true,
    "plugins": ["@nestjs/swagger/plugin"]
  }
}
And in my main.ts I have:
import { DocumentBuilder, SwaggerModule } from '@nestjs/swagger';
...
const options = new DocumentBuilder()
  .setTitle('ILuvCoffee')
  .setDescription('Coffee Application')
  .setVersion('1.0')
  .build();
const document = SwaggerModule.createDocument(app, options);
SwaggerModule.setup('api', app, document);
I'm doing the Nest.js course, and I have followed along pretty faithfully so far. When I hit this snag, I double-checked my code was correct, even copy/pasting. I also double-checked that my DTO files follow *.dto.ts and my Entity files *.entity.ts. But I still can't get Swagger to show anything more than the request. Thoughts? Who's seeing what I'm not?
Here is my repo for it if you'd like to take a deeper peek: https://github.com/jstrother/iluvcoffee
Thanks!
Looks like you did not specify the actual response types in your controller; instead you're using <any>, e.g.:
@Get(':id')
findOne(@Param('id') id: string): Promise<any> {
  console.log(id);
  return this.coffeesService.findOne(id);
}
Try changing those any types to the actual types:
@Get(':id')
findOne(@Param('id') id: string): Promise<Coffee> {
  console.log(id);
  return this.coffeesService.findOne(id);
}
Note that you can also use the ApiResponse decorator to explicitly define responses; check the official example for more details.
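For illustration, a sketch of what explicit response decorators could look like on that controller. The import paths for Coffee and CoffeesService are assumptions about the project layout, not taken from the linked repo:

import { Controller, Get, Param } from '@nestjs/common';
import { ApiOkResponse, ApiNotFoundResponse } from '@nestjs/swagger';
import { Coffee } from './entities/coffee.entity';
import { CoffeesService } from './coffees.service';

@Controller('coffees')
export class CoffeesController {
  constructor(private readonly coffeesService: CoffeesService) {}

  @Get(':id')
  @ApiOkResponse({ type: Coffee, description: 'The coffee with the given id' })
  @ApiNotFoundResponse({ description: 'Coffee not found' })
  findOne(@Param('id') id: string): Promise<Coffee> {
    return this.coffeesService.findOne(id);
  }
}

With the response type declared (either via the return type plus the CLI plugin, or via decorators like these), Swagger can render the response schema instead of {}.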

How do I set up Swashbuckle v5 with Swagger when I have a custom base URL?

I am upgrading a .NET API to .NET Core 3.1 and using Swashbuckle.AspNetCore 5.4.1. The API is running inside a Service Fabric app. I found https://github.com/domaindrivendev/Swashbuckle.AspNetCore/issues/1173 and tried to follow it; Swagger gets generated, but if I try to use the Swagger UI to send requests, the request URL has the wrong IP, so the requests fail.
In the old Swashbuckle 4.0.1 setup we did not specify host, only the relative basePath. How can I achieve the same?
Startup.cs
var swaggerBasePath = "/MySfApp/SfApp.ClientApi/";

app.UseSwagger(c =>
{
    c.SerializeAsV2 = serializeAsSwaggerV2;
    c.RouteTemplate = "swagger/{documentName}/swagger.json";
    c.PreSerializeFilters.Add((swaggerDoc, httpReq) =>
    {
        swaggerDoc.Servers = new List<OpenApiServer> { new OpenApiServer { Url = $"{httpReq.Scheme}://{httpReq.Host.Value}{swaggerBasePath}" } };
    });
});

app.UseSwaggerUI(options =>
{
    options.SwaggerEndpoint("api/swagger.json", "My API V1");
});
The result is that the Swagger UI loads correctly on URL:
http://145.12.23.1:54000/MySfApp/SfApp.ClientApi/swagger/index.html
and under the name it says the base URL is:
[ Base URL: 10.0.0.4:10680/MySfApp/SfApp.ClientApi/ ]
The 10.0.0.4:10680 is the node inside the Service Fabric cluster; the correct IP to reach it from outside is 145.12.23.1:54000. In the older version (4.0.1) of Swashbuckle the base URL was shown without an IP in front: "/MySfApp/SfApp.ClientApi".
Swagger.json is located at:
http://40.68.213.118:19081/MySfApp/SfApp.ClientApi/swagger/api/swagger.json
and it says:
"swagger": "2.0",
...
"host": "10.0.0.4:10680",
"basePath": "/MySfApp/SfApp.ClientApi/",
"schemes": [
"http"
],
"paths": {
"/activity/{activityId}": {
"get"
...etc
If I try to send a GET request from the Swagger UI, the request is sent to the wrong IP:
curl -X GET "http://10.0.0.4:10680/MySfApp/MySfApp/activity/3443"
EDIT 1:
After some digging I have now changed the setup in Startup.cs to this:
var swaggerBasePath = "/MySfApp/SfApp.ClientApi/";

app.UsePathBase($"/{swaggerBasePath}");
app.UseMvc();

app.UseSwagger(c =>
{
    c.SerializeAsV2 = serializeAsSwaggerV2;
    c.PreSerializeFilters.Add((swaggerDoc, httpReq) =>
    {
        if (!httpReq.Headers.ContainsKey("X-Original-Host"))
            return;

        var serverUrl = $"{httpReq.Headers["X-Original-Proto"]}://" +
                        $"{httpReq.Headers["X-Original-Host"]}/" +
                        $"{httpReq.Headers["X-Original-Prefix"]}";

        swaggerDoc.Servers = new List<OpenApiServer>()
        {
            new OpenApiServer { Url = serverUrl }
        };
    });
});

app.UseSwaggerUI(options =>
{
    options.SwaggerEndpoint("api/swagger.json", "My API V1");
});
This now leads to the Swagger UI loading properly with the baseUrl
http://145.12.23.1:54000/MySfApp/SfApp.ClientApi/swagger/index.html
and also swagger.json is served correctly with the correct baseUrl.
http://145.12.23.1:54000/MySfApp/SfApp.ClientApi/swagger/api/swagger.json
So the wrong hostname issue is resolved, thanks to the idea from this thread.
However, when I try to call an endpoint from the Swagger UI page, the curl URL does not include the base path. So closer... but it is currently still not possible to use the Swagger UI.
curl -X GET "http://10.0.0.4:10680/activity/3443"
The swagger.json does not have 'host' nor 'basePath' defined.
We're using Swashbuckle version 6.1.4, the latest as of this writing, and we're still having the same issue when our API is deployed in an Azure App Service that is mapped through Azure Front Door and APIM. The "Try it out" functionality does not work, as the base path / API route prefix is stripped from the Swagger UI. For example:
Instead of https://{DOMAIN}.com/{BASEPATH}/v1/Foo, the Swagger UI uses this: https://{DOMAIN}.com/v1/Foo. You can see that the /BASEPATH is missing.
I spent the whole day trying to fix this by trial and error, trying various approaches with no luck; I couldn't find an elegant way to get the base path from the Swagger configuration. For the time being, here's what I did to fix it:
app.UseSwagger(options =>
{
    // Workaround to use the Swagger UI "Try it out" functionality when deployed
    // behind a reverse proxy (APIM) with an API prefix / sub-context configured
    options.PreSerializeFilters.Add((swagger, httpReq) =>
    {
        if (httpReq.Headers.ContainsKey("X-Forwarded-Host"))
        {
            // httpReq.PathBase and httpReq.Headers["X-Forwarded-Prefix"] are what we need to get the base path.
            // For some reason they come back null/blank, perhaps because of how the proxy is configured, which we don't control.
            // For the time being, the base path is set manually here to match the APIM API URL prefix.
            // In this case we set it to 'sample-app'.
            var basePath = "sample-app";
            var serverUrl = $"{httpReq.Scheme}://{httpReq.Headers["X-Forwarded-Host"]}/{basePath}";
            swagger.Servers = new List<OpenApiServer> { new OpenApiServer { Url = serverUrl } };
        }
    });
})
.UseSwaggerUI(options =>
{
    options.RoutePrefix = string.Empty;
    options.SwaggerEndpoint("swagger/v1/swagger.json", "My Api (v1)");
});
There's an open discussion related to this issue here.
I was having something similar in my solution and I worked around it this way; it works well for me, in case it helps someone.
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    var pathBase = Configuration["PATH_BASE"];
    if (!string.IsNullOrWhiteSpace(pathBase))
    {
        app.UsePathBase($"/{pathBase.TrimStart('/')}");
        app.Use((context, next) =>
        {
            context.Request.PathBase = new PathString($"/{pathBase.TrimStart('/')}");
            return next();
        });

        if (env.IsDevelopment())
        {
            app.UseSwagger(c =>
            {
                c.PreSerializeFilters.Add((swaggerDoc, httpReq) =>
                {
                    if (!httpReq.Headers.ContainsKey("X-Original-Host"))
                        return;

                    var serverUrl = $"{httpReq.Headers["X-Original-Proto"]}://" +
                                    $"{httpReq.Headers["X-Original-Host"]}/" +
                                    $"{httpReq.Headers["X-Original-Prefix"]}";

                    swaggerDoc.Servers = new List<OpenApiServer>()
                    {
                        new OpenApiServer { Url = serverUrl }
                    };
                });
            });

            app.UseSwaggerUI(c => c.SwaggerEndpoint($"/{pathBase.TrimStart('/')}/swagger/v1/swagger.json", "My.API v1"));
        }
    }
}
Check the last line:
app.UseSwaggerUI(c => c.SwaggerEndpoint($"/{pathBase.TrimStart('/')}/swagger/v1/swagger.json", "My.API v1"));
Try this:
serverUrl = $"{httpReq.Headers["X-Forwarded-Proto"]}://" +
$"{httpReq.Headers["X-Forwarded-Host"]}" + _basePath;
where _basePath can be set using the ServiceName property of StatelessServiceContext.
Please note that the original value of X-Forwarded-Proto may be overridden by Service Fabric.

Electron ES6 module import

Electron 3.0.0-beta.1
Node 10.2.0
Chromium 66.0.3359.181
The problem I'm having is importing a module. I created the following protocol:
protocol.registerFileProtocol('client', (request, callback) => {
  var url = request.url.substr(8);
  callback({ path: path.join(__dirname, url) });
});
The output of the protocol is the correct path
"/Users/adviner/Projects/Client/src/ClientsApp/app.js"
I have a module app.js with the following code:
export function square() {
  return 'hello';
}
In my index.html I import the module like so:
<script type="module">
  import square from 'client://app.js';
  console.log(square());
</script>
But I keep getting the error:
app.js/:1 Failed to load module script: The server responded with a non-JavaScript MIME type of "". Strict MIME type checking is enforced for module scripts per HTML spec.
I've done searches but can't seem to find a solution. Can anyone suggest a way to make this work?
Thanks
This is a tricky question, and I will refer to Electron#12011 and this GitHub Gist for a deeper explanation, but the core point is that the HTML spec disallows import via file:// (for XSS reasons), and a protocol must have its MIME types defined.
The client:// protocol you use has to set the correct MIME types when serving the files. Currently, I would guess they are not set when you define the protocol via protocol.registerBufferProtocol, thus you receive "The server responded with a non-JavaScript MIME type of """. The gist above has a code sample showing how to do it.
Edit: I just want to emphasize that the other answers here only cover the absolute minimum implementation, with no consideration of exceptions, security, or future changes. I highly recommend taking the time to read through the gist I linked.
To confirm: this is there for security reasons.
However, in the event that you just need to get it deployed:
Change "target": "es2015" to "target": "es5" in your tsconfig.json file
Quick Solution:
const { protocol } = require('electron')
const nfs = require('fs')
const npjoin = require('path').join
const es6Path = npjoin(__dirname, 'www')

// <= v4.x
// protocol.registerStandardSchemes(['es6'])

// >= v5.x
protocol.registerSchemesAsPrivileged([
  { scheme: 'es6', privileges: { standard: true } }
])

app.on('ready', () => {
  protocol.registerBufferProtocol('es6', (req, cb) => {
    nfs.readFile(
      npjoin(es6Path, req.url.replace('es6://', '')),
      (e, b) => { cb({ mimeType: 'text/javascript', data: b }) }
    )
  })
})
<script type="module" src="es6://main.js"></script>
Based on flcoder's solution for older Electron versions. For Electron 5.0:
const { protocol } = require('electron')
const nfs = require('fs')
const npjoin = require('path').join
const es6Path = npjoin(__dirname, 'www')

protocol.registerSchemesAsPrivileged([{ scheme: 'es6', privileges: { standard: true, secure: true } }])

app.on('ready', async () => {
  protocol.registerBufferProtocol('es6', (req, cb) => {
    nfs.readFile(
      npjoin(es6Path, req.url.replace('es6://', '')),
      (e, b) => { cb({ mimeType: 'text/javascript', data: b }) }
    )
  })
  await createWindow()
})
Attention! The path always seems to be transformed to lowercase
<script type="module" src="es6://path/main.js"></script>
Sorry Viziionary, not enough reputation to answer the comment.
I've now done it like this:
https://gist.github.com/jogibear9988/3349784b875c7d487bf4f43e3e071612
My problem was that I also wanted to support modules that are imported via non-relative paths, so that I don't need to transpile my code.

Resolve a version from the stable/dev channel of the Dart SDK

Is there some shared/simple logic to implement something like:
Future<Version> getLatestStable();
Future<Version> getLatestDev();
Just curious. I'd rather not parse the download page. I guess another option would be releasing either a feed.json or feed.xml as part of the website to make it easier for tools.
In Dart Code I pull the version from storage.googleapis.com:
https://storage.googleapis.com/dart-archive/channels/stable/release/latest/VERSION
https://storage.googleapis.com/dart-archive/channels/dev/release/latest/VERSION
I think I took this from the source of the Dart downloads page, and it's been stable since I implemented it, so it should be good to use!
Here's my (TypeScript) code that pulls it:
export function getLatestSdkVersion(): PromiseLike<string> {
  return new Promise<string>((resolve, reject) => {
    const options: https.RequestOptions = {
      hostname: "storage.googleapis.com",
      port: 443,
      path: "/dart-archive/channels/stable/release/latest/VERSION",
      method: "GET",
    };
    let req = https.request(options, resp => {
      if (resp.statusCode < 200 || resp.statusCode >= 300) {
        reject({ message: `Failed to get Dart SDK Version ${resp.statusCode}: ${resp.statusMessage}` });
      } else {
        resp.on('data', (d) => {
          resolve(JSON.parse(d.toString()).version);
        });
      }
    });
    req.end();
  });
}
Get the VERSION file of the latest directory
http://gsdview.appspot.com/dart-archive/channels/dev/release/latest/
Available channels are listed in
http://gsdview.appspot.com/dart-archive/channels/
Some code that makes use of it can be found in https://pub.dartlang.org/packages/bwu_dart_archive_downloader
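As a rough sketch (mine, not from the package above), fetching the dev channel's VERSION file with the built-in fetch of Node 18+ could look like this; the URL follows the dart-archive bucket layout shown above, and the VERSION file is JSON with a "version" field (the same field the code above reads):

async function getLatestDev(): Promise<string> {
  const url = 'https://storage.googleapis.com/dart-archive/channels/dev/release/latest/VERSION';
  const resp = await fetch(url);
  if (!resp.ok) {
    throw new Error(`Failed to get Dart SDK version: ${resp.status} ${resp.statusText}`);
  }
  const body = await resp.json() as { version: string };
  return body.version;
}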

How to create multiple output paths in Webpack config

Does anyone know how to create multiple output paths in a webpack.config.js file? I'm using bootstrap-sass, which comes with a few different font files, etc. For webpack to process these I've included file-loader, which is working correctly; however, the files it outputs are being saved to the output path I specified for the rest of my files:
output: {
  path: __dirname + "/js",
  filename: "scripts.min.js"
}
I'd like to be able to look at the extension of whatever webpack is outputting and, for things ending in .woff, .eot, etc., divert them to a different output path. Is this possible?
I did a little googling and came across this issue on GitHub* where a couple of solutions are offered. Edit: it looks as if you need to know the entry point to be able to specify an output using the hash method, e.g.:
var entryPointsPathPrefix = './src/javascripts/pages';

var WebpackConfig = {
  entry: {
    a: entryPointsPathPrefix + '/a.jsx',
    b: entryPointsPathPrefix + '/b.jsx',
    c: entryPointsPathPrefix + '/c.jsx',
    d: entryPointsPathPrefix + '/d.jsx'
  },
  // send to distribution
  output: {
    path: './dist/js',
    filename: '[name].js'
  }
}
*https://github.com/webpack/webpack/issues/1189
However, in my case, as far as the font files are concerned, the input process is kind of abstracted away and all I know is the output. In the case of my other files undergoing transformations, there's a known point where I require them in, to then be handled by my loaders. If there were a way of finding out where this step happens, I could use the hash method to customize output paths, but I don't know where these files are being required in.
Webpack does support multiple output paths.
Set the output paths as the entry keys, and use the [name] placeholder in the output filename template.
webpack config:
entry: {
  'module/a/index': 'module/a/index.js',
  'module/b/index': 'module/b/index.js',
},
output: {
  path: path.resolve(__dirname, 'dist'),
  filename: '[name].js'
}
generated:
└── module
├── a
│   └── index.js
└── b
└── index.js
I'm not sure if we have the same problem, since webpack only supports one output per configuration as of June 2016. I guess you have already seen the issue on GitHub.
But you can separate the output paths by using the multi-compiler (i.e. exporting an array of configuration objects from webpack.config.js):
var config = {
  // TODO: Add common Configuration
  module: {},
};

var fooConfig = Object.assign({}, config, {
  name: "a",
  entry: "./a/app",
  output: {
    path: "./a",
    filename: "bundle.js"
  },
});
var barConfig = Object.assign({}, config, {
  name: "b",
  entry: "./b/app",
  output: {
    path: "./b",
    filename: "bundle.js"
  },
});

// Return Array of Configurations
module.exports = [
  fooConfig, barConfig,
];
If you have common configuration among them, you could use the extend library, Object.assign in ES6, or the {...} spread operator, as sketched below.
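For instance, a minimal sketch of sharing common options with the spread operator; the entry and output paths here are placeholders, not taken from the question:

const path = require('path');

// shared options (loaders, resolve settings, etc.) go here
const common = {
  module: { rules: [] },
};

module.exports = [
  { ...common, name: 'a', entry: './a/app.js', output: { path: path.resolve(__dirname, 'a'), filename: 'bundle.js' } },
  { ...common, name: 'b', entry: './b/app.js', output: { path: path.resolve(__dirname, 'b'), filename: 'bundle.js' } },
];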
You can now (as of Webpack v5.0.0) specify a unique output path for each entry using the new "descriptor" syntax (https://webpack.js.org/configuration/entry-context/#entry-descriptor):
module.exports = {
  entry: {
    home: { import: './home.js', filename: 'unique/path/1/[name][ext]' },
    about: { import: './about.js', filename: 'unique/path/2/[name][ext]' }
  }
};
If you can live with multiple output paths having the same depth and folder structure, there is a way to do this in webpack 2 (I have yet to test it with webpack 1.x).
Basically, you don't follow the doc's advice and you provide a path as part of the filename.
module.exports = {
  entry: {
    foo: 'foo.js',
    bar: 'bar.js'
  },
  output: {
    path: path.join(__dirname, 'components'),
    // Hacky way to force webpack to have multiple output folders vs multiple files per one path
    filename: '[name]/dist/[name].bundle.js'
  }
};
That will take this folder structure
/-
foo.js
bar.js
And turn it into
/-
foo.js
bar.js
components/foo/dist/foo.js
components/bar/dist/bar.js
Please don't use a workaround like that, because it will impact build performance.
Webpack File Manager Plugin
It's easy to install. Add this require at the top of webpack.config.js:
const FileManagerPlugin = require('filemanager-webpack-plugin');
Install
npm install filemanager-webpack-plugin --save-dev
Add the plugin
module.exports = {
  plugins: [
    new FileManagerPlugin({
      onEnd: {
        copy: [
          { source: 'www', destination: './vinod test 1/' },
          { source: 'www', destination: './vinod testing 2/' },
          { source: 'www', destination: './vinod testing 3/' },
        ],
      },
    }),
  ],
};
If it's not obvious after all the answers: you can also output to completely different directories (for example, a directory outside your standard dist folder). You can do that by using your root as the path (because you only have one path) and by moving the full "directory part" of each output path into the entry option (because you can have multiple entries):
entry: {
  'dist/main': './src/index.js',
  'docs/main': './src/index.js'
},
output: {
  filename: '[name].js',
  path: path.resolve(__dirname, './'),
}
This config results in the ./dist/main.js and ./docs/main.js being created.
In my case I had this scenario:
const config = {
  entry: {
    moduleA: './modules/moduleA/index.js',
    moduleB: './modules/moduleB/index.js',
    moduleC: './modules/moduleB/v1/index.js',
    moduleC: './modules/moduleB/v2/index.js',
  },
}
And I solved it like this (webpack 4):
const config = {
  entry: {
    moduleA: './modules/moduleA/index.js',
    moduleB: './modules/moduleB/index.js',
    'moduleC/v1/moduleC': './modules/moduleB/v1/index.js',
    'moduleC/v2/moduleC': './modules/moduleB/v2/index.js',
  },
}
You definitely can return an array of configurations from your webpack.config file. But it's not an optimal solution if you just want a copy of the artifacts in your project's documentation folder, since it makes webpack build your code twice, doubling the overall build time.
In this case I'd recommend using the FileManagerWebpackPlugin plugin instead:
const FileManagerPlugin = require('filemanager-webpack-plugin');
// ...
plugins: [
  // ...
  new FileManagerPlugin({
    onEnd: {
      copy: [{
        source: './dist/*.*',
        destination: './public/',
      }],
    },
  }),
],
You can only have one output path.
From the docs: https://github.com/webpack/docs/wiki/configuration#output
Options affecting the output of the compilation. output options tell Webpack how to write the compiled files to disk. Note, that while there can be multiple entry points, only one output configuration is specified.
If you use any hashing ([hash] or [chunkhash]) make sure to have a consistent ordering of modules. Use the OccurenceOrderPlugin or recordsPath.
I wrote a plugin that can hopefully do what you want, you can specify known or unknown entry points (using glob) and specify exact outputs or dynamically generate them using the entry file path and name. https://www.npmjs.com/package/webpack-entry-plus
I actually wound up just going into index.js in the file-loader module and changing where the contents were emitted to. This is probably not the optimal solution, but until there's some other way, this is fine since I know exactly what's being handled by this loader, which is just fonts.
// index.js
var loaderUtils = require("loader-utils");

module.exports = function (content) {
  this.cacheable && this.cacheable();
  if (!this.emitFile) throw new Error("emitFile is required from module system");

  var query = loaderUtils.parseQuery(this.query);
  var url = loaderUtils.interpolateName(this, query.name || "[hash].[ext]", {
    context: query.context || this.options.context,
    content: content,
    regExp: query.regExp
  });

  // changed path to emit contents to "fonts" folder rather than project root
  this.emitFile("fonts/" + url, content);

  return "module.exports = __webpack_public_path__ + " + JSON.stringify(url) + ";";
}

module.exports.raw = true;
You can do it like this:
var config = {
  // TODO: Add common Configuration
  module: {},
};

var x = Object.assign({}, config, {
  name: "x",
  entry: "./public/x/js/x.js",
  output: {
    path: __dirname + "/public/x/jsbuild",
    filename: "xbundle.js"
  },
});
var y = Object.assign({}, config, {
  name: "y",
  entry: "./public/y/js/FBRscript.js",
  output: {
    path: __dirname + "/public/fbr/jsbuild",
    filename: "ybundle.js"
  },
});

var list = [x, y];

// Export the whole array so webpack runs a multi-compiler build
// (reassigning module.exports in a loop would only keep the last config)
module.exports = list;
The problem is already in the language:
entry (an object of key/value pairs used to define the inputs)
output (an object of key/value pairs used to define the outputs)
The idea of differentiating the output based on a limited placeholder like '[name]' imposes limitations.
I like the core functionality of webpack, but its usage requires a rewrite with abstract definitions based on logic and simplicity... the hardest thing in software development.
All this could be solved by just providing a list of input/output definitions... A LIST OF INPUT/OUTPUT DEFINITIONS.
Vinod Kumar's good workaround is:
module.exports = {
  plugins: [
    new FileManagerPlugin({
      events: {
        onEnd: {
          copy: [
            { source: 'www', destination: './vinod test 1/' },
            { source: 'www', destination: './vinod testing 2/' },
            { source: 'www', destination: './vinod testing 3/' },
          ],
        },
      }
    }),
  ],
};
