How to make Trino (prev. Presto) log in JSON format - log4j2

I'm looking for a way to make Trino (worker and coordinator) log in JSON format. I tried replacing the log4j jars with log4j2, or with logback, with an appropriate configuration, but neither worked. From the Dockerfile:
RUN tar --exclude="*log4j*.jar" --directory /opt/ -xzf /opt/presto-server-${PRESTO_VERSION}.tar.gz
COPY log4j/*.jar /opt/presto/lib/
Any input is appreciated.

Add log.format=json to the server configuration file.
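A minimal sketch of what that looks like, assuming a Trino version recent enough to support the log.format property (the exact file location depends on your deployment; etc/config.properties is the conventional spot):

```
# etc/config.properties (location depends on your deployment)
log.format=json
```

Restart the coordinator and workers for the change to take effect; no jar swapping is needed.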

Using JMeter plugins with justb4/jmeter Docker image results in error

Goal
I am using Docker to run JMeter in Azure DevOps. I am trying to use Blazemeter's Parallel Controller, which is not native to JMeter. So, according to the justb4/jmeter image documentation, I used the following command to get the image going and run the JMeter test:
docker run --name jmetertest -i -v /home/vsts/work/1/s/plugins:/plugins -v $ROOTPATH:/test -w /test justb4/jmeter ${#:2}
Error
However, it produces the following error while trying to copy the plugin (I know the plugin makes the difference, because the test runs fine without it):
cp: can't create '/test/lib/ext': No such file or directory
As far as I understand, this is an error produced when one of the parent directories of the directory you are trying to make does not exist. Is there something I am doing wrong, or is there actually something wrong with the image?
References
For reference, I will include links to the image documentation and the repository.
Image: https://hub.docker.com/r/justb4/jmeter
Repository: https://github.com/justb4/docker-jmeter
Looking into the Dockerfile:
ENV JMETER_HOME /opt/apache-jmeter-${JMETER_VERSION}
Looking into entrypoint.sh
if [ -d /plugins ]
then
    for plugin in /plugins/*.jar; do
        cp $plugin $(pwd)/lib/ext
    done;
fi
It basically copies the plugins from the /plugins folder (if present) to the lib/ext folder relative to the current working directory.
I don't know why you added the -w /test stanza to your command line, but it explicitly "tells" the container that the working directory is /test, not /opt/apache-jmeter-xxxx; that's why the script is failing to copy the files.
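The failure mode can be reproduced outside Docker: the cp fails when the current working directory has no lib/ext. A sketch mirroring the entrypoint logic (file names here are illustrative):

```shell
# cp into lib/ext fails when the working directory has no such path
work=$(mktemp -d); cd "$work"
mkdir plugins
touch plugins/demo.jar

cp plugins/demo.jar "$(pwd)/lib/ext" 2>/dev/null \
    || echo "cp: can't create lib/ext here"

mkdir -p lib/ext            # what the real JMeter home already provides
cp plugins/demo.jar "$(pwd)/lib/ext" && echo "copy ok"
```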
In general I don't think the approach is very valid, because:
In Azure DevOps you won't have your "local" folder (unless you want to put the plugin binaries under version control)
Some JMeter plugins have other .jars as dependencies, so when you install a plugin you should:
put the plugin itself under the /lib/ext folder of your JMeter installation
put the plugin's dependencies under the /lib folder of your JMeter installation
So I would recommend amending the Dockerfile to download the JMeter Plugins Manager and install the plugin(s) you need from the command line.
Something like:
RUN wget https://jmeter-plugins.org/get/ -O /opt/apache-jmeter-${JMETER_VERSION}/lib/ext/jmeter-plugins-manager.jar
RUN wget https://repo1.maven.org/maven2/kg/apc/cmdrunner/2.2/cmdrunner-2.2.jar -P /opt/apache-jmeter-${JMETER_VERSION}/lib/
RUN java -cp /opt/apache-jmeter-${JMETER_VERSION}/lib/ext/jmeter-plugins-manager.jar org.jmeterplugins.repository.PluginManagerCMDInstaller
RUN /opt/apache-jmeter-${JMETER_VERSION}/bin/PluginsManagerCMD.sh install bzm-parallel

Apache Jena Commands not found

I'm trying to set up my system (Ubuntu 16.04) with Apache Jena 3.10.0, and followed the provided instructions, but I'm unable to access any of the commands that I should have access to.
For example, sparql --version and bin/sparql --version both return:
sparql: command not found
I have downloaded and extracted the files to /home/[user]/apache-jena-3.10.0, then run:
export JENA_HOME=/home/[user]/apache-jena-3.10.0
export PATH=$PATH:$JENA_HOME/bin
The command cd $JENA_HOME successfully goes to the apache-jena-3.10.0 directory.
I feel that there is a basic linux thing here that I'm missing, but I've tried a lot of things and had no luck so far. Any help would be greatly appreciated. Thanks!
The files in the Apache download were not marked as executable. From the main apache-jena-3.10.0 directory, chmod -R 775 bin changed the permissions so I could run the scripts from the command line.
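The root cause can be reproduced in isolation: a script without the execute bit cannot run until chmod is applied. A sketch (the script name and version string are illustrative, not the real Jena tool):

```shell
# A script without the execute bit fails; chmod -R 775 fixes it
tmpdir=$(mktemp -d)
printf '#!/bin/sh\necho "sparql 3.10.0"\n' > "$tmpdir/sparql"

"$tmpdir/sparql" 2>/dev/null || echo "not executable yet"

chmod -R 775 "$tmpdir"      # same fix as chmod -R 775 bin for Jena
"$tmpdir/sparql"
```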

How do I extract a TAR to a different destination directory

On server A, I created a tar file (backup.tar.gz) of the entire website /www. The tar file includes the top-level directory www.
On server B, I want to put those files into /public_html, without the top-level directory www.
Of course, tar -xzif backup.tar.gz places everything into /public_html/www
How do I do this?
Thanks!
You can use the --transform option to change the beginning of the archived file names to something else. As an example, in my case I had installed owncloud in directory named sscloud instead of owncloud. This caused problems when upgrading from the *.tar file. So I used the transform option like so:
tar xvf owncloud-10.3.2.tar.bz2 --transform='s/owncloud/sscloud/' --overwrite
The transform option takes sed-like commands. The above will replace the first occurrence of owncloud with sscloud.
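The effect of --transform can be checked with a throwaway archive (directory names here mirror the owncloud/sscloud example above):

```shell
# Archive a directory, then extract it under a renamed top-level dir
work=$(mktemp -d); cd "$work"
mkdir owncloud
echo "config" > owncloud/config.php
tar -cf owncloud.tar owncloud

tar -xf owncloud.tar --transform='s/owncloud/sscloud/'
ls      # now contains both the original owncloud and the new sscloud
```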
Answer is:
tar --strip-components=1 -xzvf backup.tar.gz -C /public_html
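The --strip-components behaviour can be demonstrated with a throwaway archive that mimics the backup.tar.gz layout from the question:

```shell
# Build an archive with a www/ wrapper, then extract without it
work=$(mktemp -d); cd "$work"
mkdir -p www/css
echo "hello" > www/index.html
tar -czf backup.tar.gz www

mkdir public_html
tar --strip-components=1 -xzf backup.tar.gz -C public_html
ls public_html      # the contents of www/, without the www/ wrapper
```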

Generate static docs with swagger

Is there a method for creating static documentation for swagger 2.0?
Perhaps like the 'preview' on editor.swagger.io.
I need to get static html files so I can include them in some static documents.
So far I've not found a way to do this. I see there is swagger-codegen's static-docs,
but this only works for swagger <= 1.2.
Use swagger-codegen:
swagger-codegen generate -i <path to your swagger file> -l html2 -o <path to output location>
If you decide to customize the HTML template:
Clone the swagger-codegen project from github
Copy modules/swagger-codegen/src/main/resources/htmlDocs2 folder to another place, for example: cp -R modules/swagger-codegen/src/main/resources/htmlDocs2 ~/templates
Modify the .mustache templates in ~/templates to fit your requirements.
Run: swagger-codegen generate -i <path to your swagger file> -l html2 -o <path to output location> -t <templates path>, where <templates path> is ~/templates in the example above.
If you'd just like to generate static docs in a straightforward way, consider Spectacle.
npm install spectacle-docs if you want to put a script in your package.json, or npm install -g spectacle-docs if it should be available everywhere.
Then you can just run spectacle spec.yaml, with options to build to a specific directory, run a server, and/or watch the specfile and update as needed.
You can use the swagger-codegen command as others have mentioned, but the output that you get from using -l html or -l html2 is not interactive like the Swagger UI.
To get an interactive static page like the Swagger UI, follow these steps:
Install
Go to https://github.com/swagger-api/swagger-ui/releases and download the latest release as a .zip file.
Unzip the file and copy everything in the ./dist folder over to the directory that you want the webserver to serve up. For example, GitLab Pages needs it to be in the ./public folder of your repository.
Copy your swagger.yml file to the ./public folder.
Open up the ./public/index.html file and update the URL to point at the swagger file on the webserver. For a local server, it might be: url: "http://localhost:8000/swagger.yml"
Test
To test this out, you can run a simple HTTP server using python3.
python3 -m http.server 8000 --directory public
Open up http://localhost:8000/ and check it out!
I usually do it with https://editor.swagger.io/. No installation or anything required.
Copy your yml file into the editor and choose 'Generate Client > html2' and it will generate static html files in a zip file.
static-docs is now implemented for 2.0 as well; see ./bin/static-docs.sh here:
https://github.com/swagger-api/swagger-codegen/tree/master/bin
You can use:
swagger-ui: just git clone the project and copy your JSON into the base directory
swagger-codegen
OpenAPI 3
For rendering an OpenAPI 3 specification into a self-contained HTML file, redoc-cli can be used. You can use ReDoc's Petstore OpenAPI 3 spec as an example.
mkdir -p -m 777 build && docker run --rm --user 1000 \
-v $PWD/build:/tmp/build -w /tmp/build \
-v $PWD/openapi.yaml:/tmp/openapi.yaml node:14-slim npx -q \
redoc-cli bundle /tmp/openapi.yaml
This will generate build/redoc-static.html in your current directory.
To avoid waiting for installation, you can also build yourself a Docker image with redoc-cli according to their Dockerfile, or install redoc-cli on your OS (if you have NodeJS there) with npm install -g redoc-cli.
OpenAPI 2
ReDoc also has compatibility mode for OpenAPI 2/Swagger, so the above also works for Petstore OpenAPI 2 spec.
[ReDoc Compatibility mode]: Converting OpenAPI 2.0 to OpenAPI 3.0
Alternatively, there's the OpenAPI 2-only Spectacle and its official Docker image. It can be used similarly:
mkdir -p -m 777 build && docker run --rm --user 1000 \
-v $PWD/build:/tmp/build \
-v $PWD/swagger.yaml:/tmp/swagger.yaml sourcey/spectacle \
spectacle -t /tmp/build /tmp/swagger.yaml
It generates a static build which is almost self-contained (except for loading jQuery from code.jquery.com, which was slow for some reason on my end).
├── index.html
├── javascripts
│   ├── spectacle.js
│   └── spectacle.min.js
└── stylesheets
    ├── foundation.css
    ├── foundation.min.css
    ├── spectacle.css
    └── spectacle.min.css
I've been using the OpenAPI Generator CLI Docker image: https://github.com/OpenAPITools/openapi-generator
It can generate servers, clients, and docs. For each language there are template files, so you can modify the behaviour as needed.
I managed to get a python-flask server generated that self-hosts its generated documentation.
The below will generate an HTML file that includes code examples for a client:
USER=$(shell id -u)
GROUP=$(shell id -g)
MDIR=$(shell pwd)
docker run --rm --user ${USER}:${GROUP} \
    -v ${MDIR}:/local openapitools/openapi-generator-cli generate \
    --package-name EXAMPLE \
    -t /local/.openapi-generator-html-client/ \
    -i /local/EXAMPLE.v1.yaml \
    -g html2 \
    -o /local/openapi_docs
If you're specifically looking for Swagger 2.0, I'd like to point you to my answer in
Converting Swagger specification JSON to HTML documentation,
although I believe that Swagger-Codegen supports Swagger 2.0 by now.
"static" docs can mean several things.
If you're looking for an interactive display (like the editor's preview), you have swagger-ui (https://github.com/swagger-api/swagger-ui).
The code in the codegen that does the more static docs (without the "Try it now" button, for example) is not implemented yet for 2.0, though it should be available in the upcoming few weeks.
Click on 'Preview Docs', then use the Chrome add-on 'Save Page WE' (right-click on the page -> 'Save Page WE'). The result is a single HTML file (it's not interactive, so expand everything you want to be visible before saving).
Include the Swagger dependencies in your pom.xml:
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger2</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger-ui</artifactId>
    <version>2.4.0</version>
</dependency>
And try hitting -> https://editor.swagger.io/

Tar Error [can not open: not a directory]

I have made some archive files with the tar GNOME GUI on Ubuntu, but when I try to extract them with
tar zxvf archive_name
I get the following error:
Cannot open: Not a directory
What is the problem?
Try extracting the archive in an empty directory; any existing files/directories in the extract target usually cause problems if names overlap.
I encountered the same issue (for each file within an archive) and I solved it by appending ".tar.gz" to the archive filename as I'd managed to download a PECL package without a file extension:
mv pecl_http pecl_http.tar.gz
I was then able to issue the following command to extract the contents of the archive:
tar -xzf pecl_http.tar.gz
You probably already have a file with the same name as a directory that tar is extracting.
Try extracting the tar in a different location:
tar zxvf tar_name.tgz --one-top-level=new_directory_name
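The --one-top-level behaviour can be checked with a throwaway archive (requires GNU tar >= 1.28; the names here are illustrative):

```shell
# Extract an archive into a fresh top-level directory of our choosing
work=$(mktemp -d); cd "$work"
mkdir data
echo "hi" > data/a.txt
tar -czf tar_name.tgz data

tar -zxf tar_name.tgz --one-top-level=new_directory_name
ls new_directory_name       # data
```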
Try using tar -zxvf archive_name instead. I believe the command format has changed, and it now requires the z (unzip), x (extract), v (verbose), and f (filename) parts as switches instead of plain text; otherwise tar may try to do something with a file named zxvf, which of course does not exist.
