Running Spring Boot test inside Docker container - docker

I am wondering if it is possible to run a Spring Boot test inside a Docker container. I would like to test my service in an environment as close to the real one as possible. I am already using Testcontainers and Localstack to set up my environment (Postgres, AWS Secrets Manager etc.), but couldn't find any way to run my service itself inside Docker. Ideally I would like to test my service inside Localstack's EC2 instance, but just testing it as a Docker container would be sufficient. I have a Dockerfile at the root of my project and am using the Gradle Palantir plugin to build the container. The bootstrap section of my test:
@Testcontainers
@ActiveProfiles(profiles = {"test", "jpa"})
@SpringBootTest(classes = PayoutsApplication.class, webEnvironment = WebEnvironment.RANDOM_PORT,
        properties = {"aws.paramstore.enabled=false", "aws.secretsmanager.enabled=false"})
@EnableAutoConfiguration(exclude = {
        org.springframework.cloud.aws.autoconfigure.context.ContextInstanceDataAutoConfiguration.class,
        org.springframework.cloud.aws.autoconfigure.context.ContextStackAutoConfiguration.class,
        org.springframework.cloud.aws.autoconfigure.context.ContextRegionProviderAutoConfiguration.class
})
@TestMethodOrder(OrderAnnotation.class)
@ContextConfiguration(loader = PayoutsApplicationTests.CustomLoader.class)
@TestExecutionListeners(listeners = {DependencyInjectionTestExecutionListener.class, PayoutsApplicationTests.class})
class MyApplicationTests extends AbstractTestExecutionListener {

    private static AWSSecretsManager awsSecretsManager;
    private static AWSSimpleSystemsManagement ssmClient;

    // we need it to execute the listeners
    public static class CustomLoader extends SpringBootContextLoader {
        @Override
        protected SpringApplication getSpringApplication() {
            PropertiesListener listener = new PropertiesListener();
            ReflectionTestUtils.setField(listener, "awsSecretsManager", awsSecretsManager);
            ReflectionTestUtils.setField(listener, "ssmClient", ssmClient);
            SpringApplication app = super.getSpringApplication();
            app.addListeners(listener);
            return app;
        }
    }

    static DockerImageName localstackImage = DockerImageName.parse("localstack/localstack:0.11.3");

    private static final LocalStackContainer.Service[] TEST_SERVICES = {
            LocalStackContainer.Service.SECRETSMANAGER,
            LocalStackContainer.Service.SSM,
            LocalStackContainer.Service.EC2,
    };

    static LocalStackContainer awsLocalStackContainers = new LocalStackContainer(localstackImage)
            .withServices(TEST_SERVICES)
            .withEnv("LOCALSTACK_HOSTNAME", "localhost")
            .withEnv("HOSTNAME", "localhost");

    @Container
    public static PostgreSQLContainer<?> postgreSQL =
            new PostgreSQLContainer<>("postgres:13.1")
                    .withUsername("clusteradmin")
                    .withPassword("testPassword")
                    .withDatabaseName("public");

    @DynamicPropertySource
    static void postgresqlProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", postgreSQL::getJdbcUrl);
        registry.add("spring.datasource.password", postgreSQL::getPassword);
        registry.add("spring.datasource.username", postgreSQL::getUsername);
    }
...

Related

REST call not working with Camel running in Docker

I have this Camel Rest Route:
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;
import org.apache.camel.model.rest.RestBindingMode;
import spark.Spark;

public class MainCamel {
    public static void main(final String[] args) throws Exception {
        final Main camelMain = new Main();
        camelMain.configure().addRoutesBuilder(new RouteBuilder() {
            @Override
            public void configure() throws Exception {
                this.getContext().getRegistry().bind("healthcheck", CheckUtil.class);
                this.restConfiguration()
                        .bindingMode(RestBindingMode.auto)
                        .component("netty-http")
                        .host("localhost")
                        .port(11010);
                this.rest("/healthcheck")
                        .get()
                        .description("Healthcheck for docker")
                        .outType(Integer.class)
                        .to("bean:healthcheck?method=healthCheck");
            }
        });

        // spark
        Spark.port(11011);
        Spark.get("/hello", (req, res) -> "Hello World");
        System.out.println("ready");
        camelMain.run(args);
    }

    public static class CheckUtil {
        public Integer healthCheck() {
            return 0;
        }
    }
}
I also created a second REST server with Spark.
The Camel route does NOT work if the code is executed in a Docker container.
Exception: org.apache.http.NoHttpResponseException: localhost:11010 failed to respond
The Spark server works fine.
However, when executing the code directly in IntelliJ, both REST servers work. Of course, both ports are exposed in the container.
You are binding the Netty HTTP server to localhost, meaning that it will not be able to serve requests that originate from outside of the container.
Change .host("localhost") to .host("0.0.0.0") so that the server listens on all available network interfaces.
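The underlying behaviour can be reproduced with plain JDK sockets, independently of Camel (a minimal sketch; port 0 lets the OS pick a free port):

```java
import java.io.IOException;
import java.net.InetAddress;
import java.net.ServerSocket;
import java.net.Socket;

public class BindDemo {
    public static void main(String[] args) throws IOException {
        // Bind to the wildcard address 0.0.0.0: the server is reachable from
        // every network interface, which is what Docker's port mapping needs.
        try (ServerSocket server = new ServerSocket(0, 50, InetAddress.getByName("0.0.0.0"))) {
            int port = server.getLocalPort();
            // A loopback client still connects fine.
            try (Socket client = new Socket("127.0.0.1", port)) {
                System.out.println("connected=" + client.isConnected());
            }
        }
        // Had the server been bound to the loopback address instead, only
        // clients inside the same network namespace could reach it -- exactly
        // the symptom seen when Docker forwards traffic from outside the container.
    }
}
```
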

JSON.SET in redis-cli and StackExchange.Redis is throwing an exception

I'm trying to set and get JSON results in Redis with redis-om; under the hood it uses StackExchange.Redis, and even when using that directly, the same exception occurs.
using System;
using StackExchange.Redis;

namespace test
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Hello World!");
            var muxe = ConnectionMultiplexer.Connect("localhost");
            var db = muxe.GetDatabase();
            var JSONResult = db.CreateTransaction();
            db.Execute("JSON.SET", "dog:1", "$", "{\"name\":\"Honey\",\"breed\":\"Greyhound\"}");
            db.Execute("JSON.GET", "dog:1", "$.breed");
            JSONResult.Execute();
            Console.WriteLine("I'm ok");
        }
    }
}
Exception: StackExchange.Redis.RedisServerException: 'ERR unknown command `JSON.SET`,
The Redis module was missing: the Alpine image only ships the core commands, so modules such as RedisJSON are not present. Check which modules are loaded:
redis-cli info modules

How to run a Testcontainer with a dynamic port for Spring Data Elasticsearch

My test case uses the @SpringBootTest annotation to bring up the context and autowires some repository. The Testcontainer is started in a @BeforeAll method. The problem is that RestClientConfig is initialized/injected before @BeforeAll runs in the test case. When the Testcontainer starts, it exports some dynamic port.
As a workaround I have to set a fixed port (34343) in the Testcontainer and use the same port in the properties file for RestClientConfig.
container = new ElasticsearchContainer(ELASTICSEARCH_IMAGE)
        .withEnv("discovery.type", "single-node")
        .withExposedPorts(9200)
        .withCreateContainerCmdModifier(cmd -> cmd.withHostConfig(
                new HostConfig().withPortBindings(
                        new PortBinding(Ports.Binding.bindPort(34343), new ExposedPort(9200)))));
Is there a way to start container and get its dynamic port then use it to initialize RestClientConfig?
I didn't use the @Testcontainers annotation though. Is it needed?
Newer versions of Spring provide @DynamicPropertySource for exactly this use case:
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/test/context/DynamicPropertySource.html
Your code should look roughly like this:
@SpringJUnitConfig(...)
@Testcontainers
class ExampleIntegrationTests {

    @Container
    static ElasticsearchContainer elastic = new ElasticsearchContainer(ELASTICSEARCH_IMAGE)
            .withEnv("discovery.type", "single-node");

    // ...

    @DynamicPropertySource
    static void elasticProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.elasticsearch.uris", elastic::getHttpHostAddress);
    }
}
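Note that registry.add takes a Supplier, not a value: the method reference is only evaluated when the environment actually asks for the property, by which time the container is running and its mapped port is known. The deferral can be sketched in plain Java (a toy stand-in for illustration, not Spring's actual implementation; the port number is made up):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

public class LazyRegistryDemo {
    // Toy stand-in for DynamicPropertyRegistry: it stores suppliers, not values.
    static class Registry {
        private final Map<String, Supplier<Object>> props = new HashMap<>();
        void add(String name, Supplier<Object> value) { props.put(name, value); }
        Object resolve(String name) { return props.get(name).get(); }
    }

    // Toy stand-in for a container whose mapped port is unknown until started.
    static class FakeContainer {
        private int mappedPort = -1;
        void start() { mappedPort = 34343; } // port allocated at start time
        String getHttpHostAddress() { return "localhost:" + mappedPort; }
    }

    public static void main(String[] args) {
        FakeContainer elastic = new FakeContainer();
        Registry registry = new Registry();
        // Registered BEFORE the container starts -- nothing is evaluated yet.
        registry.add("spring.elasticsearch.uris", elastic::getHttpHostAddress);
        elastic.start();
        // Resolved AFTER start, so the real mapped port is visible.
        System.out.println(registry.resolve("spring.elasticsearch.uris"));
    }
}
```

Had the registration stored the eager string value instead of the supplier, it would have captured the placeholder port -1.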
You can use a context configuration initializer to set properties at runtime, which you can later use in your RestClientConfig.
Let me show you using the example of a PostgreSQL container setup:
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT, classes = Application.class)
@ContextConfiguration(initializers = AbstractTestcontainersTest.DockerPostgreDataSourceInitializer.class)
public abstract class AbstractTestcontainersTest {

    protected static final String DB_CONTAINER_NAME = "postgres-auth-test";

    protected static PostgreSQLContainer<?> postgreDBContainer =
            new PostgreSQLContainer<>(DockerImageName.parse("public.ecr.aws/docker/library/postgres:12.10-alpine")
                    .asCompatibleSubstituteFor("postgres"))
                    .withUsername("postgres")
                    .withPassword("change_me")
                    .withInitScript("db.sql")
                    .withCreateContainerCmdModifier(cmd -> cmd.withName(DB_CONTAINER_NAME))
                    .withDatabaseName("zpot_main");

    @BeforeAll
    public static void beforeAll() throws ShellExecutionException {
        postgreDBContainer.start();
    }

    @AfterAll
    public static void afterAll() {
        postgreDBContainer.stop();
    }

    public static class DockerPostgreDataSourceInitializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {
        @Override
        public void initialize(ConfigurableApplicationContext applicationContext) {
            TestPropertySourceUtils.addInlinedPropertiesToEnvironment(
                    applicationContext,
                    "spring.datasource.url=" + postgreDBContainer.getJdbcUrl(),
                    "spring.datasource.username=" + postgreDBContainer.getUsername(),
                    "spring.datasource.password=" + postgreDBContainer.getPassword()
            );
        }
    }
}
All the configuration is done in DockerPostgreDataSourceInitializer, where I set all the properties I need. You also need to annotate your test class with the @ContextConfiguration annotation. You can do something similar with your ElasticsearchContainer. As I just checked, ElasticsearchContainer has a method getHttpHostAddress() which returns the host + dynamic-port combination for your container. You can get that host-port pair and set it in the properties to be used later in your client configuration. If you need just the port, you can call container.getMappedPort(9200) and again set that port in the properties.
Regarding the @Testcontainers annotation: you need it if you want Testcontainers to manage your container lifecycle. In that case you also need to annotate the container with the @Container annotation. Your container will then be started either once before all test methods in a class (if the container is a static field) or before each test method (if it's a regular field). You can read more about that here: https://www.testcontainers.org/test_framework_integration/junit_5/#extension.
Or you can start your container manually in @BeforeAll or @BeforeEach setup methods. In other words: no, you don't have to use the @Testcontainers annotation.

Deploying a transaction event listener in a Neo4jDesktop installation

I have created a project that contains an ExtensionFactory subclass annotated as @ServiceProvider that returns a LifecycleAdapter subclass which registers a transaction event listener in its start() method, as shown in this example. The code is below:
@ServiceProvider
public class EventListenerExtensionFactory extends ExtensionFactory<EventListenerExtensionFactory.Dependencies> {

    private final List<TransactionEventListener<?>> listeners;

    public EventListenerExtensionFactory() {
        this(List.of(new MyListener()));
    }

    public EventListenerExtensionFactory(List<TransactionEventListener<?>> listeners) {
        super(ExtensionType.DATABASE, "EVENT_LISTENER_EXT_FACTORY");
        this.listeners = listeners;
    }

    @Override
    public Lifecycle newInstance(ExtensionContext context, Dependencies dependencies) {
        return new EventListenerLifecycleAdapter(dependencies, listeners);
    }

    @RequiredArgsConstructor
    private static class EventListenerLifecycleAdapter extends LifecycleAdapter {
        private final Dependencies dependencies;
        private final List<TransactionEventListener<?>> listeners;

        @Override
        public void start() {
            DatabaseManagementService managementService = dependencies.databaseManagementService();
            listeners.forEach(listener -> managementService.registerTransactionEventListener(
                    DEFAULT_DATABASE_NAME, listener));
            dependencies.log()
                    .getUserLog(EventListenerExtensionFactory.class)
                    .info("Registering transaction event listener for database " + DEFAULT_DATABASE_NAME);
        }
    }

    interface Dependencies {
        DatabaseManagementService databaseManagementService();
        LogService log();
    }
}
It works fine in an integration test:
public AbstractDatabaseTest(TransactionEventListener<?>... listeners) {
    URI uri = Neo4jBuilders.newInProcessBuilder()
            .withExtensionFactories(List.of(new EventListenerExtensionFactory(List.of(listeners))))
            .withDisabledServer()
            .build()
            .boltURI();
    driver = GraphDatabase.driver(uri);
    session = driver.session();
}
Then I copy the jar file in the plugins directory of my desktop database:
$ cp build/libs/<myproject>.jar /mnt/c/Users/albert.gevorgyan/.Neo4jDesktop/relate-data/dbmss/dbms-7fe3cbdb-11b2-4ca2-81eb-474edbbb3dda/plugins/
I restart the database and even the whole desktop Neo4j program but it doesn't seem to identify the plugin or to initialize the factory: no log messages are found in neo4j.log after the start event, and the transaction events that should be captured by my listener are ignored. Interestingly, a custom function that I have defined in the same jar file actually works - I can call it in the browser. So something must be missing in the extension factory as it doesn't get instantiated.
Is it possible at all to deploy an ExtensionFactory in a Desktop installation and if yes, what am I doing wrong?
It works after I added a provider configuration file to META-INF/services, as explained in https://www.baeldung.com/java-spi. Neo4j then finds it.
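For reference, the JDK's ServiceLoader only discovers a provider when a resource named after the service interface's fully qualified name exists under META-INF/services and contains the implementation's fully qualified name. A self-contained sketch of that mechanism (generic ServiceLoader usage with made-up Greeter types, not Neo4j-specific; for Neo4j the file would be named after its ExtensionFactory service interface):

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ServiceLoader;

public class SpiDemo {
    public interface Greeter { String greet(); }

    public static class HelloGreeter implements Greeter {
        public String greet() { return "hello from SPI"; }
    }

    public static void main(String[] args) throws Exception {
        // Simulate the jar layout: META-INF/services/<interface FQCN>,
        // whose content is the implementation's FQCN.
        Path root = Files.createTempDirectory("spi");
        Path services = root.resolve("META-INF").resolve("services");
        Files.createDirectories(services);
        Files.writeString(services.resolve(Greeter.class.getName()),
                HelloGreeter.class.getName());

        // A classloader that sees the provider file; the classes themselves
        // are resolved through the parent classloader.
        try (URLClassLoader loader = new URLClassLoader(
                new URL[]{root.toUri().toURL()}, SpiDemo.class.getClassLoader())) {
            for (Greeter g : ServiceLoader.load(Greeter.class, loader)) {
                System.out.println(g.greet());
            }
        }
    }
}
```

Without the provider file the loop body never runs, which matches the symptom described above: the class is on the classpath (the custom function works) but is never discovered as a service.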

Grails injected services exposed as web methods in cxf

I have a service class that I expose as JAX-WS using the Grails CXF plugin. In my service I have to inject another service class which I use in my web services. If I make the service field public, I get unnecessary service methods generated, like below:
retrieveLastRecordUpdateDate
setPricingContractService
retrieveRecordsUpdatedFromDate
retrieveAllRecordsByInsurance
getPricingContractService
If I make the field private I cannot inject the service class. How can I both inject the service and not expose it as a web service? Simplified code below:
class PricingContractWebService {

    static expose = EndpointType.JAX_WS

    def pricingContractService // private?

    @WebMethod( operationName="retrieveAllRecordsByInsurance" )
    @WebResult( name="pricingContractList" )
    @XmlElement(name="healthCareCompany", required=true)
    List<PricingContractDTO> retrieveAllRecordsByInsurance(@WebParam(partName = "HealthCareCompany", name = "healthCareCompany") final HealthCareCompany healthCareCompany) {
        def pricingContractDTOList = []
        pricingContractDTOList
    }

    @WebMethod( operationName="retrieveLastRecordUpdateDate" )
    @WebResult( name="lastUpdateDate" )
    Date retrieveLastRecordUpdateDate() {
    }

    @WebMethod( operationName="retrieveRecordsUpdatedFromDate" )
    @WebResult( name="pricingContractList" )
    @XmlElement(name="updateDate", required=true)
    List<PricingContractDTO> retrieveRecordsUpdatedFromDate(@WebParam(name = "updateDate") final Date date) {
        def pricingContractDTOList = []
        pricingContractDTOList
    }
}
You should make the injected service private and add @Autowired before the field declaration:
@Autowired
private PricingContractService pricingContractService