When I hit http://localhost:3001/api-docs, it loads the Swagger JSON docs:
{
  swagger: "2.0",
  info: {
    version: "1.0.0",
    title: "Auth-gateway services",
    contact: {
      name: "swagger docs",
      url: "https://www.google.com"
    }
  },
  host: "127.0.0.1:3001",
  basePath: "/",
  ...
}
But how do I load a UI like http://petstore.swagger.io/ for my APIs?
To view your API through swagger-ui, do one of the following.
Option 1: Using online swagger-ui
Go to the online swagger-ui at http://petstore.swagger.io/.
In the input box at the top of the page, provide the URL of your Swagger JSON. In your case, enter http://localhost:3001/api-docs in place of the default http://petstore.swagger.io/v2/swagger.json and click Explore.
Now you can see the swagger-ui generated for your API.
Option 2: Setting up swagger-ui project locally
You have to set up swagger-ui locally. Clone the swagger-ui project and set it up with the instructions below.
Windows users: please install Python before following the guidelines below, so that node-gyp rebuild can run.
1. npm install
2. npm run build
3. You should see the distribution under the dist folder. Open ./dist/index.html to launch Swagger UI in a browser
Development
Use npm run serve to make a new build, watch for changes, and serve the result at http://localhost:8080/.
Now you should see something exactly like the online swagger-ui.
Do the same as in option 1 to provide the swagger-json URL and see the swagger-ui generated for your API.
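If you'd rather serve the built UI from your own gateway instead of opening the file directly, here is a minimal sketch, assuming your gateway is an Express app (the /docs mount point and the dist path are illustrative):
// sketch: serve the swagger-ui dist folder from your own Node app
// adjust the path to wherever you cloned swagger-ui
const express = require("express");
const path = require("path");

const app = express();
app.use("/docs", express.static(path.join(__dirname, "swagger-ui", "dist")));
app.listen(3001, () => console.log("Swagger UI at http://localhost:3001/docs"));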
Related
I've been updating my company's projects to the latest versions of all the packages. When I try to run swagger (http://localhost:4000/swagger/?url=/swagger/swagger.json), it now redirects me to the Petstore.
After searching Google for an answer, it seems like I need to set the following parameter:
queryConfigEnabled=true
According to the Swagger documentation, I can set the parameter in the following ways:
Swagger UI accepts configuration parameters in four locations.
From lowest to highest precedence:
1. The swagger-config.yaml in the project root directory, if it exists, is baked into the application
2. The configuration object passed as an argument to Swagger UI (SwaggerUI({ ... }))
3. The configuration document fetched from a specified configUrl
4. Configuration items passed as key/value pairs in the URL query string
I tried the following URL
http://localhost:4000/swagger/?queryConfigEnabled=true&url=http://localhost:4000/swagger/swagger.json
but that still redirected me to the Petstore.
I then tried to create a swagger-config.yaml file in the project root directory with the following contents:
queryConfigEnabled: "true"
url: "/swagger/swagger.json"
dom_id: "#swagger-ui"
validatorUrl: "https://validator.swagger.io/validator"
That didn't work either. I even tried to copy the config file to the project /src folder to see if that made a difference.
I tried other things: using a swagger-config.json file instead; copying the config file to the app root in the Docker image; setting configUrl to point to the config file; and setting the Docker environment variable CONFIG_URL to point to the Swagger config file. None of these worked. I'm still being redirected to the Petstore.
As a last resort, I modified the dist/swagger-initializer.js file and added the queryConfigEnabled parameter.
That worked.
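The edit was roughly along these lines (a sketch of the stock swagger-ui-dist initializer with the parameter added; the url and presets are the defaults):
// dist/swagger-initializer.js (sketch)
window.onload = function () {
  window.ui = SwaggerUIBundle({
    url: "/swagger/swagger.json",   // spec to load by default
    dom_id: "#swagger-ui",          // element the UI mounts into
    queryConfigEnabled: true,       // honor ?url=... overrides in the query string
    presets: [SwaggerUIBundle.presets.apis, SwaggerUIStandalonePreset],
    layout: "StandaloneLayout"
  });
};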
Obviously, if I delete the /node_modules folder and re-run npm install, that change will go away.
What am I doing wrong? How do I fix this?
I built a website using Gatsby and Contentful, and deployed it on Netlify.
I am going to run this website with multiple domain aliases.
e.g.:
alias1.example.com
alias2.example.com
The aliases themselves work fine, but the website has to show only the content that belongs to its own alias in Contentful.
For example, if the current alias is alias1, the website should fetch only the entries tagged alias1 from Contentful.
What I tried was adding code to gatsby-config.js to identify the alias using window.location.href and setting siteUrl dynamically, but it didn't work.
I am not sure whether this is possible, or how to implement it.
Thank you.
The best (and almost the only) approach is to use environment variables for each site/alias and configure each site's deploy command to use them. That way, each deploy fetches its data from its own Contentful environment.
In your gatsby-config.js (above the module.exports) add:
require("dotenv").config({
  path: `.env.${process.env.GATSBY_ACTIVE_ENV || process.env.NODE_ENV}`,
})
The next step is to create one environment file for each alias. In your project root:
.env.alias1
.env.alias2
Each file should contain your environment variables from Contentful:
CONTENTFUL_ACCESS_TOKEN=12345
CONTENTFUL_SPACE_ID=12345
Then, in your gatsby-config.js, just replace your hardcoded values with the ones from your environment files:
{
  resolve: `gatsby-source-contentful`,
  options: {
    spaceId: process.env.CONTENTFUL_SPACE_ID,
    accessToken: process.env.CONTENTFUL_ACCESS_TOKEN,
  },
},
The last step is to configure the deploy scripts to trigger each desired alias. In your package.json:
"scripts": {
"clean": "gatsby clean",
"test": "jest",
"format": "prettier --write \"**/*.{js,jsx,json,md}\""
"develop-alias1": "gatsby develop GATSBY_ACTIVE_ENV=alias1"
"build-alias1": "gatsby build GATSBY_ACTIVE_ENV=alias1"
"develop-alias2": "gatsby develop GATSBY_ACTIVE_ENV=alias2"
"build-alias2": "gatsby build GATSBY_ACTIVE_ENV=alias2"
},
Note that you will replace the default gatsby develop and gatsby build commands with your aliased ones, and that the GATSBY_ACTIVE_ENV assignment must come before the command so the shell sets it for the process.
With this configuration, each develop or build/deploy tells your Gatsby project which environment file to look at (it will pick up the matching .env.alias* file). Each file contains the keys for a separate Contentful environment with different content, allowing you to deploy aliased sites with different content from a single CMS.
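On Netlify, this means one site per alias, each with its own build command (set in the Netlify UI or in a netlify.toml). A sketch of what the alias1 site's file might look like (values illustrative):
# netlify.toml for the alias1 site
[build]
  command = "npm run build-alias1"
  publish = "public"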
This might be the most critical gotcha in Gatsby, and almost everyone has a hard time with it.
The core problem is that the "browser environment" is not available when you build a Gatsby project, and gatsby-config.js runs in the Node.js environment. In other words, everything attached to the window variable is not accessible.
You should read the official docs about the Gatsby build process here:
https://www.gatsbyjs.com/docs/overview-of-the-gatsby-build-process/#build-time-vs-runtime.
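As a general rule (this is the pattern from the Gatsby docs), any browser-only code has to be guarded so it never runs during the Node.js build:
// window only exists in the browser; during `gatsby build` (Node.js) it is undefined
if (typeof window !== "undefined") {
  console.log(window.location.href);
}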
Solution: define a different script in package.json for each alias, passing an environment variable to the Node.js environment. Then, in gatsby-config.js, use the dotenv package to read the passed variables (as shown in the answer above).
You can read more about using environment variables here: https://www.gatsbyjs.com/docs/environment-variables/
I've been thinking about using React as the frontend for a Grails application, but I'm having a bit of trouble getting started.
So far, I've become accustomed to writing a React app using Node/NPM with the help of Webpack, and that's been pretty easy because there is plenty of documentation for that setup.
However, I'm struggling to find anything concrete about integrating React seamlessly with Grails.
Ideally, I would just do grails run-app and it should take care of everything. I do not want other team members to worry about starting up two different servers or something along those lines.
Please let me know if anyone has done this before.
Webpack can be configured to work quite well with Grails. The key is to have webpack generate its bundle whenever the app is started up, and to output the bundle to a directory where it can be served from the GSP. You do not want your source JavaScript (i.e., React/ES6 code) in the asset pipeline if you're using Webpack; instead, keep those source files in another directory (such as src/webapp) and configure Webpack to bundle them and output the result to the asset pipeline (assuming you're using the AP at all).
Here's an example configuration for Webpack:
var path = require('path');

module.exports = {
  entry: {
    index: './src/webapp/index.js'
  },
  output: {
    // resolve to an absolute path inside the Grails asset pipeline
    path: path.resolve(__dirname, 'grails-app/assets/javascripts'),
    publicPath: '/assets/',
    filename: 'bundle.js'
  }
};
Finally, to achieve the integrated webpack/Grails startup, you can use the Gradle node plugin and attach the webpack run script to the application startup in a custom task in your build.gradle (this assumes that you have an npm script named "webpack" defined to run webpack):
assetCompile.dependsOn(['npmInstall', 'npm_run_webpack'])
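The assumed "webpack" npm script would look something like this in package.json (illustrative):
"scripts": {
  "webpack": "webpack --config webpack.config.js"
}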
Please note that if you want to run webpack in "watch" mode, you'll need to do that separately from starting up the Grails app, so that the script can run continuously (there actually is support for this in the Gradle node plugin, but it's currently broken).
See this link for a more in-depth explanation of this approach, with a sample application: http://grailsblog.objectcomputing.com/posts/2016/05/28/using-react-with-grails.html
Also check out the React profile for Grails 3: https://github.com/grails-profiles/react
It has not been released yet but should be in the next few days. It makes use of the same approach outlined here and in the linked post.
You could use npm's scripts feature to combine all steps necessary to start up the development environment into a single command, e.g.:
// package.json
{
...
"scripts": {
"start": "npm start-grails & npm start-react",
"start-grails": "grails run-app",
"start-react": "node server.js"
},
...
}
Now all it takes is a simple npm start to launch all relevant applications. (Note that the single & backgrounds the Grails process, which assumes a Unix-like shell.)
I'm trying to run this example. But it won't run.
I have the latest version (45), along with JPM installed. From the command line, if I run jpm run, it gives a couple of errors, such as the name should be in all lowercase letters and no content script is specified, etc.
How do I make it run?
I want an options popup when its browser icon is clicked, but this part in package.json isn't working:
"browser_action": {
"default_icon": "icons/beasts-32.png",
"default_title": "Beastify",
"default_popup": "popup/choose_beast.html"
},
WebExtensions don't use JPM.
You just pack them into a ZIP file and rename it to XPI and that's it. You can also use the web-ext command-line tool to do it (web-ext build).
For testing your extension, you don't even need to pack it - just open about:debugging in your Firefox, click "Load add-on temporarily" and select your extension's main folder.
See https://developer.mozilla.org/en-US/Add-ons/WebExtensions/Packaging_and_installation for details.
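Also note that the browser_action block from your question belongs in manifest.json, not the SDK-style package.json that jpm uses. A minimal sketch (the name and version fields are illustrative):
{
  "manifest_version": 2,
  "name": "beastify",
  "version": "1.0",
  "browser_action": {
    "default_icon": "icons/beasts-32.png",
    "default_title": "Beastify",
    "default_popup": "popup/choose_beast.html"
  }
}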
I have trouble connecting to sites over SSL, i.e. https. Bower can successfully download artifacts from the internet if the URL begins with http.
bower install downloads dependencies via https. Is there any way to make it download via http?
I had trouble with this too, and I couldn't find an elegant way to fix it. My workaround was:
Go to your global npm folder and find the "bower" folder (on Windows 7 that is C:\Users\<username>\AppData\Roaming\npm\node_modules).
In that folder, find the default.js file at node_modules\bower-config\lib\util\default.js.
Inside that file you will find a var defaults declaration. Change the "registry" URL property from "https" to "http".
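The relevant bit looks roughly like this (a sketch; exact contents vary by bower-config version):
// node_modules/bower-config/lib/util/default.js (sketch)
var defaults = {
  // ...
  'registry': 'http://bower.herokuapp.com'  // changed from 'https://bower.herokuapp.com'
  // ...
};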
Yes, I know, this shouldn't be done like this, but at least it helped me bypass the connection error.
Hope that helps!
You can change the registry used by Bower in the .bowerrc file. The default registry is: https://bower.herokuapp.com and is defined in node_modules/bower-config/lib/util/default.js (as described by Jean Manuel Arias in his answer).
To override for your project, add a value for the registry setting in .bowerrc. An example file might be:
{
"directory": "<YOUR LIBRARY INSTALL DIRECTORY>",
"registry":"http://bower.herokuapp.com"
}
In the above example, the default https registry is being overridden with the http version. A full list of the available .bowerrc settings can be found at: Bower Spec.
You can do a global override for the current user by creating a %USERPROFILE%\.bowerrc file (on Windows; on Linux it is ~/.bowerrc). Bower follows a search path similar to NPM's when applying settings (see the npmrc settings). This is probably a better route, as it avoids cluttering your project with local settings.