Generate static docs with swagger

Is there a method for creating static documentation for swagger 2.0?
Perhaps like the 'preview' on editor.swagger.io.
I need to get static HTML files so I can include them in some static documents.
So far I haven't found a way to do this. I see there is swagger-codegen's static-docs,
but this only works for Swagger <= 1.2.

Use swagger-codegen:
swagger-codegen generate -i <path to your swagger file> -l html2 -o <path to output location>
If you decide to customize the HTML template:
Clone the swagger-codegen project from github
Copy modules/swagger-codegen/src/main/resources/htmlDocs2 folder to another place, for example: cp -R modules/swagger-codegen/src/main/resources/htmlDocs2 ~/templates
Modify the .mustache templates in ~/templates to fit your requirements.
Run: swagger-codegen generate -i <path to your swagger file> -l html2 -o <path to output location> -t <templates path>, where <templates path> would be ~/templates in the example above.
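Putting the steps together, the whole customization workflow might look like this sketch (all paths here are examples; adjust them to your layout):
```
git clone https://github.com/swagger-api/swagger-codegen.git
cp -R swagger-codegen/modules/swagger-codegen/src/main/resources/htmlDocs2 ~/templates
# edit the .mustache files in ~/templates, then:
swagger-codegen generate -i api/swagger.yaml -l html2 -o docs/ -t ~/templates
```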

If you'd just like to generate static docs in a straightforward way, consider Spectacle.
npm install spectacle-docs if you want to put a script in your package.json, or npm install -g spectacle-docs if it should be available everywhere.
Then you can just run spectacle spec.yaml, with options to build to a specific directory, run a server, and/or watch the specfile and update as needed.
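For example (the -t flag writes the build to a target directory; other flags may differ between Spectacle versions, so check spectacle --help):
```
spectacle -t docs spec.yaml    # build the static site into docs/
spectacle -s spec.yaml         # serve it locally (flag assumed; verify with --help)
```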

You can use the swagger-codegen command as others have mentioned, but the output that you get from using -l html or -l html2 is not interactive like the Swagger UI.
To get an interactive static page like the Swagger UI, follow these steps:
Install
Go to https://github.com/swagger-api/swagger-ui/releases and download the latest release as a .zip file.
Unzip the file and copy everything in the ./dist folder over to the directory that you want the webserver to serve up. For example, GitLab Pages needs it to be in the ./public folder in your repository.
Copy over your swagger.yml file to the ./public folder.
Open up the ./public/index.html file and update the URL to where the swagger file will be on the webserver. For a local server, it might be this: url: "http://localhost:8000/swagger.yml"
Test
To test this out, you can run a simple HTTP server using python3.
python3 -m http.server 8000 --directory public
Open up http://localhost:8000/ and check it out!
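The steps above can be sketched as a small script (the release version and archive name are assumptions; check the releases page for the current ones, and note that newer swagger-ui releases set the URL in dist/swagger-initializer.js rather than index.html):
```
curl -LO https://github.com/swagger-api/swagger-ui/archive/refs/tags/v4.15.5.zip
unzip v4.15.5.zip
mkdir -p public
cp swagger-ui-4.15.5/dist/* public/
cp swagger.yml public/
# edit public/index.html (or public/swagger-initializer.js in newer releases)
# and point url to http://localhost:8000/swagger.yml
python3 -m http.server 8000 --directory public
```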

I usually do it with https://editor.swagger.io/. No installation or anything required.
Copy your yml file into the editor and choose 'Generate Client > html2' and it will generate static html files in a zip file.

static-docs is now implemented for Swagger 2.0; see ./bin/static-docs.sh here:
https://github.com/swagger-api/swagger-codegen/tree/master/bin

You can use:
swagger-ui: just git clone the project and copy your JSON into the base directory
swagger-codegen

OpenAPI 3
For rendering an OpenAPI 3 specification into a self-contained HTML file, redoc-cli can be used. You can use ReDoc's Petstore OpenAPI 3 spec as an example.
mkdir -p -m 777 build && docker run --rm --user 1000 \
-v $PWD/build:/tmp/build -w /tmp/build \
-v $PWD/openapi.yaml:/tmp/openapi.yaml node:14-slim npx -q \
redoc-cli bundle /tmp/openapi.yaml
This will generate build/redoc-static.html in your current directory.
To avoid waiting for installation, you can also build yourself a Docker image with redoc-cli according to their Dockerfile, or installing redoc-cli to your OS, if you have NodeJS there, with npm install -g redoc-cli.
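Without Docker, the local equivalent would be something like this (the -o flag names the output file):
```
npm install -g redoc-cli
redoc-cli bundle openapi.yaml -o redoc-static.html
```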
OpenAPI 2
ReDoc also has compatibility mode for OpenAPI 2/Swagger, so the above also works for Petstore OpenAPI 2 spec.
[ReDoc Compatibility mode]: Converting OpenAPI 2.0 to OpenAPI 3.0
Alternatively, there's the OpenAPI 2-only Spectacle and its official Docker image. It can be used similarly:
mkdir -p -m 777 build && docker run --rm --user 1000 \
-v $PWD/build:/tmp/build \
-v $PWD/swagger.yaml:/tmp/swagger.yaml sourcey/spectacle \
spectacle -t /tmp/build /tmp/swagger.yaml
It generates a static build which is almost self-contained (except that it loads jQuery from code.jquery.com, which was slow for some reason on my end).
├── index.html
├── javascripts
│   ├── spectacle.js
│   └── spectacle.min.js
└── stylesheets
    ├── foundation.css
    ├── foundation.min.css
    ├── spectacle.css
    └── spectacle.min.css

I've been using the OpenAPI Generator CLI Docker image: https://github.com/OpenAPITools/openapi-generator
It can generate servers, clients and docs. For each language there are template files, so you can modify the behaviour as needed.
I managed to get a python-flask server generated and self-hosting its generated documentation.
The below will generate an HTML file that includes code examples for a client:
USER=$(id -u)
GROUP=$(id -g)
MDIR=$(pwd)
docker run --rm --user ${USER}:${GROUP} \
-v ${MDIR}:/local openapitools/openapi-generator-cli generate \
--package-name EXAMPLE \
-t /local/.openapi-generator-html-client/ \
-i /local/EXAMPLE.v1.yaml \
-g html2 \
-o /local/openapi_docs

If you're specifically looking for Swagger 2.0, I'd like to point you to my answer in
Converting Swagger specification JSON to HTML documentation
, although I believe that Swagger-Codegen supports Swagger 2.0 by now.

"static" docs can mean several things.
If you're looking for an interactive display (like the editor's preview), you have swagger-ui (https://github.com/swagger-api/swagger-ui).
The code in the codegen that does the more static docs (without the "Try it now" button, for example) is not implemented yet for 2.0, though it should be available in the coming weeks.

Click on 'Preview Docs', then use the Chrome addon 'Save Page WE' (right-click on the page -> 'Save Page WE'); the result is a single HTML file (it's not clickable, so you have to expand everything you want to be seen before saving).

Include the Springfox Swagger dependencies in your pom:
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger2</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger-ui</artifactId>
    <version>2.4.0</version>
</dependency>
And try hitting -> https://editor.swagger.io/


Using JMeter plugins with justb4/jmeter Docker image results in error

Goal
I am using Docker to run JMeter in Azure Devops. I am trying to use Blazemeter's Parallel Controller, which is not native to JMeter. So, according to the justb4/jmeter image documentation, I used the following command to get the image going and run the JMeter test:
docker run --name jmetertest -i -v /home/vsts/work/1/s/plugins:/plugins -v $ROOTPATH:/test -w /test justb4/jmeter ${@:2}
Error
However, it produces the following error while trying to accommodate for the plugin (I know the plugin makes the difference due to testing without the plugin):
cp: can't create '/test/lib/ext': No such file or directory
As far as I understand, this is an error produced when one of the parent directories of the directory you are trying to make does not exist. Is there something I am doing wrong, or is there actually something wrong with the image?
References
For reference, I will include links to the image documentation and the repository.
Image: https://hub.docker.com/r/justb4/jmeter
Repository: https://github.com/justb4/docker-jmeter
Looking into the Dockerfile:
ENV JMETER_HOME /opt/apache-jmeter-${JMETER_VERSION}
Looking into entrypoint.sh
if [ -d /plugins ]
then
    for plugin in /plugins/*.jar; do
        cp $plugin $(pwd)/lib/ext
    done;
fi
It basically copies the plugins from the /plugins folder (if present) into the lib/ext folder relative to the current working directory.
I don't know why you added the -w /test stanza to your command line, but it explicitly "tells" the container that the working directory is /test, not /opt/apache-jmeter-xxxx; that's why the script fails to copy the files.
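A minimal reproduction with throwaway directories (all paths here are made up for the demo) shows why the working directory matters for that cp:

```shell
# Simulate the image layout: plugins to copy, a JMeter home with lib/ext,
# and an empty dir like the one -w /test switches to
mkdir -p /tmp/jm/plugins /tmp/jm/home/lib/ext /tmp/jm/test
touch /tmp/jm/plugins/example-plugin.jar

# Working directory = JMeter home: lib/ext exists, the copy succeeds
cd /tmp/jm/home
cp /tmp/jm/plugins/*.jar "$(pwd)/lib/ext"

# Working directory = /test: no lib/ext underneath it, cp fails
cd /tmp/jm/test
cp /tmp/jm/plugins/*.jar "$(pwd)/lib/ext" 2>/dev/null || echo "cp: can't create lib/ext here"
```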
In general I don't think that the approach is very valid because:
In Azure DevOps you won't have your "local" folder (unless you want to add plugins binaries under the version control system)
Some JMeter Plugins have other .jars as the dependencies so when you're installing the plugin you should:
put the plugin itself under /lib/ext folder of your JMeter installation
put the plugin dependencies under /lib folder of your JMeter installation
So I would recommend amending the Dockerfile to download the JMeter Plugins Manager and install the plugin(s) you need from the command line.
Something like:
RUN wget https://jmeter-plugins.org/get/ -O /opt/apache-jmeter-${JMETER_VERSION}/lib/ext/jmeter-plugins-manager.jar
RUN wget https://repo1.maven.org/maven2/kg/apc/cmdrunner/2.2/cmdrunner-2.2.jar -P /opt/apache-jmeter-${JMETER_VERSION}/lib/
RUN java -cp /opt/apache-jmeter-${JMETER_VERSION}/lib/ext/jmeter-plugins-manager.jar org.jmeterplugins.repository.PluginManagerCMDInstaller
RUN /opt/apache-jmeter-${JMETER_VERSION}/bin/PluginsManagerCMD.sh install bzm-parallel
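Combined into a single Dockerfile, it might look like this sketch (the base image tag and JMETER_VERSION value are assumptions for your setup; bzm-parallel is the Parallel Controller plugin):
```
FROM justb4/jmeter:5.4
ENV JMETER_VERSION 5.4
ENV JMETER_HOME /opt/apache-jmeter-${JMETER_VERSION}
RUN wget https://jmeter-plugins.org/get/ -O ${JMETER_HOME}/lib/ext/jmeter-plugins-manager.jar && \
    wget https://repo1.maven.org/maven2/kg/apc/cmdrunner/2.2/cmdrunner-2.2.jar -P ${JMETER_HOME}/lib/ && \
    java -cp ${JMETER_HOME}/lib/ext/jmeter-plugins-manager.jar org.jmeterplugins.repository.PluginManagerCMDInstaller && \
    ${JMETER_HOME}/bin/PluginsManagerCMD.sh install bzm-parallel
```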

Is there an alternative link instead of the one in the docker book?

I am new to Docker and was working through The Docker Book, and just got to the part about building a Sinatra web application. But the link provided to download the source code doesn't exist. I tried using the GitHub link for the code, but that isn't working either. I need to make a binary executable as the next step and I am unable to do so.
$ cd sinatra
$ wget --cut-dirs=3 -nH -r -e robots=off --reject="index.html","Dockerfile" \
    --no-parent http://dockerbook.com/code/5/sinatra/webapp/
This is what I am supposed to do, but if you copy-paste the link into your browser, it doesn't work. I also tried making each folder I needed and creating the files by hand, but the next step, chmod +x webapp/bin/webapp, says that the directory does not exist even though it does.

How to install karaf features using Dockerfile

I am trying to create a Dockerfile that will automatically install Apache Karaf and configure it, and that part is working fine.
I also want to install a list of features. I can do it manually with:
docker exec -it 7447419c89da /opt/karaf/bin/client
but I want to automate the process. What command can I run that will allow me to install the features?
You can use an XML file (a feature repository) and copy it into the .../apache-karaf-4.1.5/deploy folder. It will then be picked up by Karaf at start time, and the features described in the file will be installed automatically if they have the attribute install="auto" specified.
Sample file:
<features
    name="AET Features"
    xmlns="http://karaf.apache.org/xmlns/features/v1.3.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://karaf.apache.org/xmlns/features/v1.3.0 http://karaf.apache.org/xmlns/features/v1.3.0">
    <repository>mvn:org.apache.cxf.karaf/apache-cxf/3.2.0/xml/features</repository>
    <feature name="fooo" version="1.0.0" description="Features that should be installed" install="auto">
        <feature>cxf-core</feature>
        <feature>webconsole</feature>
        <bundle>mvn:org.apache.karaf.webconsole/org.apache.karaf.webconsole.features/4.1.2</bundle>
    </feature>
</features>
This will install a new feature called fooo, which consists of the cxf-core feature (just for the purpose of this example; this one needs its own repository location), the webconsole feature that is available in Karaf, and an additional bundle that provides a view of features in the Web Console.
To summarise:
Download and unzip Karaf
create a file with some name, e.g. required-features.xml, with the features description, and put it in the deploy folder
Start Karaf instance
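The summarised steps could be baked into a Dockerfile like this sketch (the base image, Karaf version, download URL, and file name are all assumptions; features in the copied file with install="auto" are installed at boot):
```
FROM openjdk:8-jre
ENV KARAF_VERSION 4.1.5
RUN wget https://archive.apache.org/dist/karaf/${KARAF_VERSION}/apache-karaf-${KARAF_VERSION}.tar.gz && \
    tar xzf apache-karaf-${KARAF_VERSION}.tar.gz -C /opt && \
    mv /opt/apache-karaf-${KARAF_VERSION} /opt/karaf
COPY required-features.xml /opt/karaf/deploy/
CMD ["/opt/karaf/bin/karaf"]
```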
You can find examples of feature files in Karaf source, e.g.:
https://github.com/apache/karaf/blob/master/assemblies/features/spring/src/main/feature/feature.xml
TL;DR - pass the features as a parameter to the client as docker exec -it 7447419c89da /opt/karaf/bin/client -r 7 "feature:install http; feature:install webconsole"
I think that Witek's answer is correct and that is how it should work.
However, when I was building my own Dockerfile for iDempiere Micro on Karaf to automatically install Apache Karaf, install other features (e.g. WebConsole) and deploy my bundles, I found that the only way to achieve this was:
install Apache Karaf in the Dockerfile, also including other helper shell scripts; do not try to install the features there
start the container with Karaf, let it boot and wait a while (in my testing environment it took up to 120 s to be ready)
run /opt/karaf/bin/client as in your question and pass the required features as parameters, as in docker exec -i idempiere-micro-karaf /opt/karaf/bin/client -r 7 "feature:install http; feature:install http-whiteboard; feature:install war; feature:install webconsole"
wait again, then restart Apache Karaf using docker exec -i idempiere-micro-karaf /opt/karaf/bin/client "system:shutdown -f -r"
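That sequence, scripted (the container name, image name, and the 120 s wait are from my setup and are assumptions for yours):
```
docker run -d --name idempiere-micro-karaf my-karaf-image
sleep 120   # let Karaf boot; adjust to your environment
docker exec -i idempiere-micro-karaf /opt/karaf/bin/client -r 7 \
  "feature:install http; feature:install http-whiteboard; feature:install war; feature:install webconsole"
docker exec -i idempiere-micro-karaf /opt/karaf/bin/client "system:shutdown -f -r"
```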

Check Syntax errors in Dockerfile [duplicate]

If a Dockerfile is written with mistakes for example:
CMD ["service", "--config", "/etc/service.conf] (missing quote)
Is there a way to lint it to detect such mistake before building?
Try:
Either the Haskell Dockerfile Linter ("hadolint"), also available online. hadolint parses the Dockerfile into an AST and performs checking and validation based on best-practice Docker image rules. It also uses ShellCheck to lint the Bash code in RUN commands.
Or dockerlinter (node.js-based).
I've performed a simple test against a simple Dockerfile with RUN, ADD, ENV and CMD. dockerlinter was smart about grouping violations of the same rule together, but it was not able to inspect as thoroughly as hadolint, possibly due to the lack of ShellCheck to statically analyze the Bash code.
Although dockerlinter falls short in the scope it can lint, it does seem to be much easier to install: npm install -g dockerlinter will do, while compiling hadolint requires a Haskell compiler and a build environment that takes forever to set up.
$ hadolint ./api/Dockerfile
L9 SC2046 Quote this to prevent word splitting.
L11 SC2046 Quote this to prevent word splitting.
L8 DL3020 Use COPY instead of ADD for files and folders
L10 DL3020 Use COPY instead of ADD for files and folders
L13 DL3020 Use COPY instead of ADD for files and folders
L18 DL3020 Use COPY instead of ADD for files and folders
L21 DL3020 Use COPY instead of ADD for files and folders
L6 DL3008 Pin versions in apt get install. Instead of `apt-get install <package>` use `apt-get install <package>=<version>`
L6 DL3009 Delete the apt-get lists after installing something
L6 DL3015 Avoid additional packages by specifying `--no-install-recommends`
$ dockerlint ./api/Dockerfile
WARN: ADD instruction used instead of COPY on line 8, 10, 13, 18, 21
ERROR: ./api/Dockerfile failed.
Update in 2018: since hadolint now has an official Docker repository, you can get the executable quickly:
id=$(docker create hadolint/hadolint:latest)
docker cp "$id":/bin/hadolint .
docker rm "$id"
or you can use this command
docker container run --rm -i hadolint/hadolint hadolint - < Dockerfile
This is a statically compiled executable (according to ldd hadolint), so it should run regardless of installed libraries. A reference on how the executable is built: https://github.com/hadolint/hadolint/blob/master/docker/Dockerfile.
If you have a RedHat subscription, you can access the "Linter for Dockerfile" application directly at https://access.redhat.com/labs/linterfordockerfile/; information about the application is located at https://access.redhat.com/labsinfo/linterfordockerfile
This Node.js application is also available on GitHub https://github.com/redhataccess/dockerfile_lint if you prefer to run it locally.
I use npm's dockerfile_lint very successfully in my CI pipeline. You can add or extend rules. Using the package.json you can create different configs for different jobs. There are both
Docker CLI
docker run -it --rm --privileged -v `pwd`:/root/ \
projectatomic/dockerfile-lint \
dockerfile_lint [-f Dockerfile]
docker run -it --rm --privileged -v `pwd`:/root/ \
-v /var/run/docker.sock:/var/run/docker.sock \
projectatomic/dockerfile-lint \
dockerfile_lint image <imageid>
and Atomic CLI available
atomic run projectatomic/dockerfile-lint
atomic run projectatomic/dockerfile-lint image <imageid>
Also you can lint your images for tagging.
I created dockerfile-validator as an extension for VS Code, which uses the dockerfile-lint mentioned in a previous answer. By default it uses dockerfile-lint default rules, but in VS code User Settings (dockerfile-validator.rulefile.path) you can specify a path to a custom rule file with your own coding standards.
Recently, I came across dockerfilelint, which is NodeJS-based.
dockerfilelint Dockerfile
It supports the following rules and rudimentary CMD checks:
required_params
uppercase_commands
from_first
invalid_line
sudo_usage
apt-get_missing_param
apt-get_recommends
apt-get-upgrade
apt-get-dist-upgrade
apt-get-update_require_install
apkadd-missing_nocache_or_updaterm
apkadd-missing-virtual
invalid_port
invalid_command
expose_host_port
label_invalid
missing_tag
latest_tag
extra_args
missing_args
add_src_invalid
add_dest_invalid
invalid_workdir
invalid_format
apt-get_missing_rm
deprecated_in_1.13
Hadolint seems like a better option, but this may suffice for simple needs. Also, GitHub's super-linter uses this.
I'm not too familiar with Go, but it looks like you can simply call the Parse method as is done in the test suite here. If that does not return an err, then your lint passes. I'm assuming it's trivial to expose that to a script or something to call during development.

How to build LLVM doxygen in HTML ? I tried but failed

I want to get a copy of the Doxygen web pages of LLVM, so I can work with them without the internet.
I did as follows:
$ cd LLVM_ROOT_DIR
$ mkdir out
$ cd out/
$ ../configure --enable-doxygen
$ make ENABLE_OPTIMIZED=1
But it only built llvm without documentation. I also tried
$ make BUILD_FOR_WEBSITE=1 ENABLE_OPTIMIZED=1
and
$ make ENABLE_OPTIMIZED=1 EXTRA_DIST=1
All of them did not work.
How could I build the web pages ?
Thanks a lot.
With recent versions of LLVM, an in-source build is prohibited by configure. Luckily the documentation can be built using CMake.
$ mkdir out
$ cd out/
$ cmake -DLLVM_ENABLE_DOXYGEN=On ../
$ make doxygen-llvm
The process will take a while, but after it you should have the full documentation.
Once you enable doxygen in the configure step, you can run make doxygen-llvm on the docs/ folder in your build directory.
You can run make help to check the available options.
I collected the website with wget.
