Logging incoming Keycloak Requests in Docker Container - docker

I am currently working on a personal project and would like to implement a honeypot for Keycloak, as I have not found a framework offering this (please correct me if I am wrong). The idea is to have Keycloak running and to log all incoming HTTP requests (including headers, body, etc.). Currently I have Keycloak running within Docker. So far I have not found a suitable solution and would really appreciate your help.
Here is the docker-compose.yml
version: '3'
volumes:
  postgres_data:
    driver: local
services:
  postgres:
    image: postgres
    volumes:
      - postgres_data:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: keycloak
      POSTGRES_USER: keycloak
      POSTGRES_PASSWORD: password
  keycloak:
    image: quay.io/keycloak/keycloak:latest
    environment:
      DB_VENDOR: POSTGRES
      DB_ADDR: postgres
      DB_DATABASE: keycloak
      DB_USER: keycloak
      DB_SCHEMA: public
      DB_PASSWORD: password
      KEYCLOAK_USER: user
      KEYCLOAK_PASSWORD: Pa55w0rd
      KEYCLOAK_ADMIN: admin
      KEYCLOAK_ADMIN_PASSWORD: password
    entrypoint: /opt/keycloak/bin/kc.sh start-dev
    ports:
      - 8080:8080
    depends_on:
      - postgres
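One approach that could work for the honeypot idea (a sketch only, not something I have verified against this exact setup) is to stop publishing Keycloak's port directly and put a logging reverse proxy in front of it. The snippet below uses the mitmproxy/mitmproxy image in reverse-proxy mode; the service name, the ./request-logs directory and the flow file name are placeholders I made up:

  request-logger:
    image: mitmproxy/mitmproxy
    # reverse-proxy everything to the keycloak service and record full flows
    # (method, path, headers, body) both to stdout and to a flow file
    command: mitmdump --mode reverse:http://keycloak:8080 --listen-port 8080 --set flow_detail=3 -w /logs/requests.flows
    volumes:
      - ./request-logs:/logs
    ports:
      - 8080:8080   # clients now reach the proxy; drop the ports mapping from the keycloak service
    depends_on:
      - keycloak

The recorded flow file can be opened later with mitmproxy for inspection; any other reverse proxy with request/body logging (nginx, Traefik) would work the same way.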

Related

How to import keycloak user via docker-compose?

I am trying to import keycloak users with docker-compose.
What I have tried:
keycloak:
  container_name: keycloak
  image: quay.io/keycloak/keycloak:15.0.2
  ports:
    - 8079:8080
  # restart: on-failure
  volumes:
    - ./keycloak/realms/keycloak-realm.json:/opt/jboss/keycloak/imports/keycloak-realm.json
    - ./keycloak/realms/keycloak-users-0.json:/opt/jboss/keycloak/imports/keycloak-users-0.json
  command:
    - "-b 0.0.0.0 -Djboss.http.port=8080 -Dkeycloak.import=/opt/jboss/keycloak/imports/keycloak-users-0.json"
  environment:
    DB_VENDOR: POSTGRES
    DB_ADDR: postgresdb
    DB_DATABASE: keycloak
    DB_USER: keycloak
    DB_SCHEMA: public
    DB_PASSWORD: keycloak#12345
    KEYCLOAK_USER: admin
    KEYCLOAK_PASSWORD: admin123
    KEYCLOAK_IMPORT: /opt/jboss/keycloak/imports/keycloak-users-0.json -Dkeycloak.profile.feature.upload_scripts=enabled
    KEYCLOAK_IMPORT: /opt/jboss/keycloak/imports/keycloak-realm.json -Dkeycloak.profile.feature.upload_scripts=enabled
  depends_on:
    postgresdb:
      condition: service_healthy
My problem with the above setup is that I can successfully import the realm and client settings, but importing users fails.
I could add the users to the realm file, but I would prefer not to.
KEYCLOAK_IMPORT: /opt/jboss/keycloak/imports/keycloak-users-0.json -Dkeycloak.profile.feature.upload_scripts=enabled
KEYCLOAK_IMPORT: /opt/jboss/keycloak/imports/keycloak-realm.json -Dkeycloak.profile.feature.upload_scripts=enabled
It seems you are overriding your first KEYCLOAK_IMPORT environment variable entry with the second KEYCLOAK_IMPORT entry. Could you try combining them into one, perhaps using only one JSON file that contains both the realm and the users?
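For what it's worth, since separate <realm>-users-N.json files come from Keycloak's directory export format, another option (my assumption, not tested against this exact compose file) is to point the legacy migration properties at the whole imports directory instead of a single file, and drop both KEYCLOAK_IMPORT entries:

  command:
    - "-b 0.0.0.0 -Djboss.http.port=8080 -Dkeycloak.migration.action=import -Dkeycloak.migration.provider=dir -Dkeycloak.migration.dir=/opt/jboss/keycloak/imports -Dkeycloak.migration.strategy=OVERWRITE_EXISTING"

The dir provider should pick up both keycloak-realm.json and keycloak-users-0.json as long as they keep the <realm>-realm.json / <realm>-users-N.json naming.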

Keycloak: Import realm when creating Docker container

I have a docker-compose configuration for Keycloak and Postgres that works fine.
Now, I have exported the config to realm-export.json in order to restart Keycloak with this configuration. Unfortunately, the Keycloak container does not work as expected.
Original docker-compose:
version: '3.1'
services:
  postgres:
    image: postgres
    volumes:
      - ~/docker/volumes/keycloak-db:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: keycloak
      POSTGRES_USER: keycloak
      POSTGRES_PASSWORD: ${KEYCLOAK_DB_PASS}
  keycloak:
    image: mihaibob/keycloak:14.0.0
    environment:
      DB_VENDOR: POSTGRES
      DB_ADDR: postgres
      DB_DATABASE: keycloak
      DB_USER: keycloak
      DB_SCHEMA: public
      DB_PASSWORD: ${KEYCLOAK_DB_PASS}
      KEYCLOAK_USER: admin
      KEYCLOAK_PASSWORD: ${KEYCLOAK_PASS}
      # Uncomment the line below if you want to specify JDBC parameters. The parameter below is just an example, and it shouldn't be used in production without knowledge. It is highly recommended that you read the PostgreSQL JDBC driver documentation in order to use it.
      #JDBC_PARAMS: "ssl=true"
    ports:
      - 8080:8080
    depends_on:
      - postgres
New docker-compose to import config:
version: '3.1'
services:
  postgres:
    image: postgres
    container_name: postgres
    environment:
      POSTGRES_DB: keycloak
      POSTGRES_USER: keycloak
      POSTGRES_PASSWORD: ${KEYCLOAK_DB_PASS}
  keycloak:
    image: mihaibob/keycloak:14.0.0
    volumes:
      - ./realm-export.json:/tmp/realm-export.json
    environment:
      DB_VENDOR: POSTGRES
      DB_ADDR: postgres
      DB_DATABASE: keycloak
      DB_USER: keycloak
      DB_SCHEMA: public
      DB_PASSWORD: ${KEYCLOAK_DB_PASS}
      KEYCLOAK_USER: admin
      KEYCLOAK_PASSWORD: ${KEYCLOAK_PASS}
      KEYCLOAK_MIGRATION_ACTION: IMPORT
      KEYCLOAK_IMPORT: /tmp/realm-export.json
      # Uncomment the line below if you want to specify JDBC parameters. The parameter below is just an example, and it shouldn't be used in production without knowledge. It is highly recommended that you read the PostgreSQL JDBC driver documentation in order to use it.
      #JDBC_PARAMS: "ssl=true"
    ports:
      - 9080:8080
    depends_on:
      - postgres
The Keycloak container terminates with the following logs:
OpenJDK Server VM warning: No monotonic clock was available - timed services may be adversely affected if the time-of-day clock changes
Added 'admin' to '/opt/jboss/keycloak/standalone/configuration/keycloak-add-user.json', restart server to load user
-b 0.0.0.0
=========================================================================
Using PostgreSQL database
=========================================================================
Grateful for every idea.
Try using an absolute path for the volume mount.
I would suggest a better Keycloak image vendor. After using the JBoss one for some time, I ended up going with the quay.io version described here: https://www.keycloak.org/server/containers
It is straightforward to mount a volume containing your realm's JSON file and add the --import-realm option to the command.
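As a rough sketch of that suggestion (the file name and tag are just examples; adjust them to your setup), the quay.io image picks up realm files placed under /opt/keycloak/data/import when started with --import-realm:

  keycloak:
    image: quay.io/keycloak/keycloak:latest
    command: start-dev --import-realm
    volumes:
      - ./realm-export.json:/opt/keycloak/data/import/realm-export.json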

How to link Keycloak docker image to MariaDB docker

I have a MariaDB Docker image for my application.
I've now pulled the Keycloak image. Keycloak is using the default H2 database, but I want it to use my existing MariaDB instead.
The documentation asks me to create a network etc., but I am not sure how to do that in the cloud, so I am looking for a configuration-based solution, i.e. changing the Keycloak image config to link to the MariaDB image. I am not using docker-compose; I only pulled the image.
https://github.com/keycloak/keycloak-containers/blob/master/server/README.md#environment-variables
I am not sure where the environment variables go - inside the Keycloak image or on the host machine?
Start command: docker run -p 7080:8080 -e KEYCLOAK_USER=admin -e KEYCLOAK_PASSWORD=admin jboss/keycloak
I find this highly insecure. Is there a secure way?
Edit:
I opened a CLI from the Docker dashboard and typed
env
but I do not know how I can add more environment variables, such as:
PROXY_ADDRESS_FORWARDING: 'true'
# PostgreSQL DB settings
DB_VENDOR: postgres
DB_ADDR: 172.17.0.1
DB_PORT: 5432
DB_DATABASE: keycloak
DB_SCHEMA: public
DB_USER: keycloak
DB_PWD: keycloak
(How do I change PROXY_ADDRESS_FORWARDING from false to true?)
I was able to do it like this. You need to define a network and add the database and keycloak services to that network.
To add the environment variables, you have to define them under the environment block.
version: '3.7'
services:
  demo_db:
    container_name: demo-maria-db
    image: mariadb:10.5.8-focal
    restart: always
    ports:
      - 3306:3306
    volumes:
      - /apps/demo/db:/data/db
    environment:
      MYSQL_ROOT_PASSWORD: root
      MYSQL_DATABASE: mydb
      MYSQL_USER: user
      MYSQL_PASSWORD: password
    networks:
      demo_mesh:
        aliases:
          - demo-db
  demo_keycloak:
    container_name: demo-keycloak
    image: jboss/keycloak:10.0.1
    restart: always
    ports:
      - 8180:8080
    environment:
      PROXY_ADDRESS_FORWARDING: "true"
      DB_VENDOR: mariadb
      DB_ADDR: demo-db
      DB_DATABASE: keycloak
      DB_USER: user
      DB_PASSWORD: password
      KEYCLOAK_USER: admin
      KEYCLOAK_PASSWORD: admin
    depends_on:
      - demo_db
    networks:
      - demo_mesh
networks:
  demo_mesh: {}
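If you would rather stay with plain docker run, as the question mentions, roughly the same wiring can be sketched with a user-defined network; the network name keycloak-net and the MariaDB container name my-mariadb below are placeholders, and note that the legacy image expects DB_PASSWORD rather than DB_PWD:

docker network create keycloak-net
docker network connect keycloak-net my-mariadb   # attach the already-running MariaDB container
docker run -p 7080:8080 --network keycloak-net \
  -e KEYCLOAK_USER=admin -e KEYCLOAK_PASSWORD=admin \
  -e PROXY_ADDRESS_FORWARDING=true \
  -e DB_VENDOR=mariadb -e DB_ADDR=my-mariadb -e DB_PORT=3306 \
  -e DB_DATABASE=keycloak -e DB_USER=keycloak -e DB_PASSWORD=keycloak \
  jboss/keycloak

To keep credentials off the command line, the -e flags can be moved into a file and passed with --env-file.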

Importing keycloak configuration files while using docker-compose

I'm trying to import configuration from one Keycloak instance into many different Keycloak instances (each instance is for the same application, just a different stage in my CI/CD flow).
I'm running Keycloak through Docker and finding it difficult to import the required JSON file.
To get the actual data I want imported, I went to the required realm and simply clicked the export button with clients etc. selected. This downloaded a file to my browser, which I now want imported when I build my Docker containers.
I've tried a lot of different methods I've found online and nothing seems to be working, so I'd appreciate some help.
The first thing I tried was to import the file through the docker-compose file using the following
KEYCLOAK_IMPORT: /realm-export.json
The next thing I tried, also in my docker-compose file, was
command: "-b 0.0.0.0 -Djboss.http.port=8080 -Dkeycloak.migration.action=import -Dkeycloak.import=realm-export.json"
Finally, I tried going into my Dockerfile and running the import as my CMD using the following
CMD ["-b 0.0.0.0", "-Dkeycloak.import=/opt/jboss/keycloak/realm-export.json"]
Below are my current docker-compose file and Dockerfile without the imports added; they might be of some help in answering this question. Thanks in advance.
# Dockerfile
FROM jboss/keycloak:4.8.3.Final
COPY keycloak-metrics-spi-1.0.1-SNAPSHOT.jar keycloak/standalone/deployments
And the keycloak-related section of my docker-compose file:
postgres:
  image: postgres
  volumes:
    - postgres_data:/var/lib/postgresql/data
  environment:
    POSTGRES_DB: keycl0ak
    POSTGRES_USER: keycl0ak
    POSTGRES_PASSWORD: password
  ports:
    - 5431:5431
keycloak:
  build:
    context: services/keycloak
  environment:
    DB_VENDOR: POSTGRES
    DB_ADDR: postgres
    DB_DATABASE: keycl0ak
    DB_USER: keycl0ak
    DB_PASSWORD: password
    KEYCLOAK_USER: administrat0r
    KEYCLOAK_PASSWORD: asc88a8c0ssssqs
  ports:
    - 8080:8080
  depends_on:
    - postgres
volumes:
  postgres_data:
    driver: local
Explanation
First you need to copy the file into your container before you can import it into Keycloak. You could place your realm-export.json in a folder next to the docker-compose.yml; let's say we call it imports. Getting it into the container can be achieved using volumes:. Once the file is available inside the container, you can use command: as you were before, pointing at the correct file path within the container.
File Structure
/your_computer/keycloak_stuff/
|-- docker-compose.yml
|-- imports -> realm-export.json
Docker-Compose
This is how the docker-compose.yml should look with the changes:
postgres:
image: postgres
volumes:
- postgres_data:/var/lib/postgresql/data
environment:
POSTGRES_DB: keycl0ak
POSTGRES_USER: keycl0ak
POSTGRES_PASSWORD: password
ports:
- 5431:5431
keycloak:
build:
context: services/keycloak
volumes:
- ./imports:/opt/jboss/keycloak/imports
command:
- "-b 0.0.0.0 -Dkeycloak.import=/opt/jboss/keycloak/imports/realm-export.json"
environment:
DB_VENDOR: POSTGRES
DB_ADDR: postgres
DB_DATABASE: keycl0ak
DB_USER: keycl0ak
DB_PASSWORD: password
KEYCLOAK_USER: administrat0r
KEYCLOAK_PASSWORD: asc88a8c0ssssqs
ports:
- 8080:8080
depends_on:
- postgres
volumes:
postgres_data:
driver: local
To wrap up the answers of @JesusBenito and @raujonas, the docker-compose file could be changed so that you make use of the Keycloak environment variable KEYCLOAK_IMPORT:
keycloak:
  volumes:
    - ./imports:/opt/jboss/keycloak/imports
  # command: not needed anymore
  #   - "-b 0.0.0.0 -Dkeycloak.import=/opt/jboss/keycloak/imports/realm-export.json"
  environment:
    KEYCLOAK_IMPORT: /opt/jboss/keycloak/imports/realm-export.json -Dkeycloak.profile.feature.upload_scripts=enabled
    DB_VENDOR: POSTGRES
    DB_ADDR: postgres
    DB_DATABASE: keycl0ak
    DB_USER: keycl0ak
    DB_PASSWORD: password
    KEYCLOAK_USER: administrat0r
    KEYCLOAK_PASSWORD: asc88a8c0ssssqs
This config worked for me:
keycloak:
  image: mihaibob/keycloak:15.0.1
  container_name: keycloak
  ports:
    - "9091:8080"
  volumes:
    - ./src/test/resources/keycloak:/tmp/import
  environment:
    ...
    KEYCLOAK_IMPORT: /tmp/import/global.json

Best way to share Docker volume between different computers?

I'm using Docker along with the jboss/keycloak-ha-postgres and postgres images.
I have two developers who want to share the postgres data, and I'm trying to figure out the best way to do this.
I've already figured out how to persist data locally using the volumes attribute in my docker-compose.yml file:
version: '2'
services:
  db:
    container_name: keycloak-postgres
    image: postgres
    environment:
      POSTGRES_DB: keycloak
      POSTGRES_USER: keycloak
      POSTGRES_PASSWORD: password
    ports:
      - "5432:5432"
    volumes_from:
      - data
  keycloak:
    container_name: keycloak
    image: jboss/keycloak-ha-postgres
    depends_on:
      - "db"
    environment:
      POSTGRES_DATABASE: keycloak
      POSTGRES_USER: keycloak
      POSTGRES_PASSWORD: password
      POSTGRES_PORT_5432_TCP_ADDR: postgres
      POSTGRES_PORT_5432_TCP_PORT: 5432
      KEYCLOAK_USER: admin
      KEYCLOAK_PASSWORD: admin123
    links:
      - "db"
    ports:
      - "8080:8080"
  data:
    container_name: keycloak-postgres-db-data
    image: cogniteev/echo
    command: echo 'Data Container for PostgreSQL'
    volumes:
      - /var/lib/postgresql/data
One approach I'm thinking of involves creating my own Docker image of the volume (using a Dockerfile with FROM cogniteev/echo) and hosting it on DockerHub, committing and pushing changes to the volume data to DockerHub. I'd then have to update my docker-compose.yml file to grab that particular image instead of cogniteev/echo.
But I'm not sure what is the best thing to do in this situation.
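As a simpler alternative to baking the volume into an image (a different technique from the one described above, sketched on the assumption that the compose file above is in use), the developers could share a plain SQL dump instead:

# developer A dumps the database from the running keycloak-postgres container
docker exec keycloak-postgres pg_dump -U keycloak keycloak > keycloak-dump.sql
# developer B loads the dump into their own freshly started container
docker exec -i keycloak-postgres psql -U keycloak keycloak < keycloak-dump.sql

The dump file can live in the project repository or any shared storage, which avoids pushing a data-only image to DockerHub.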
