Running init script on oracle test container with system privileges - docker

I am struggling with org.testcontainers:oracle-xe:1.14.3.
I am trying to run a test intended to verify schema creation and migration, but I'm getting stuck at the init script when trying to initialize the users for the test as 'sys as sysdba'.
@Before
public void setUp() {
    oracleContainer = new OracleContainer("oracleinanutshell/oracle-xe-11g")
            .withUsername("sys as sysdba")
            .withInitScript("oracle-initscript.sql");
    oracleContainer.start();
}
The above seems to be able to connect, but execution of the init script fails with
ORA-01109: database not open
Using the 'system' user instead does not give the init-script connection sysdba privileges, but it does result in an open database.
I'm looking for a solution that will allow me to initialize multiple users prior to a test. This initialization includes grants that require sysdba privileges. The test, in which some SQL scripts are executed, requires that both users are created in the database and can connect to it.

In my case I'm using:
oracleContainer = new OracleContainer("gvenzl/oracle-xe:18.4.0-slim")
        .withUsername("test")
        .withPassword("test")
        .withEnv("ORACLE_PASSWORD", "s") // sys password is required; withEnv keeps the chain fluent
        .withCopyFileToContainer(MountableFile.forHostPath("oracle-initscript.sql"), "/container-entrypoint-initdb.d/init.sql");
gvenzl/oracle-xe is the default image used by the org.testcontainers oracle-xe module.
The documentation for this image describes how to run initialization SQL on database start, and it works great.
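If the init-script mechanism is not flexible enough (for example, for grants that need sysdba), another option is to open a plain JDBC connection as sys once the container has started and create the users programmatically. This is a sketch only; the user names, passwords and grants below are placeholders:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import org.testcontainers.containers.OracleContainer;

public class TestUserSetup {

    // Creates two test users with sysdba privileges; assumes the sys password
    // was set via the ORACLE_PASSWORD environment variable as shown above.
    static void createTestUsers(OracleContainer container, String sysPassword) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                     container.getJdbcUrl(), "sys as sysdba", sysPassword);
             Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE USER user_one IDENTIFIED BY secret");
            stmt.execute("GRANT CONNECT, RESOURCE TO user_one");
            stmt.execute("CREATE USER user_two IDENTIFIED BY secret");
            stmt.execute("GRANT CONNECT, RESOURCE TO user_two");
        }
    }
}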
Hard to say what the issue is, but here are some tips:
- "sys as sysdba" may not be valid as a username in your code; the documentation is not clear about its usage.
- withLogConsumer can provide clues about what went wrong (see the sketch after this list).
- I recommend the gvenzl/oracle-xe image.
- In some cases withInitScript may not work properly.
- It is useful to test the init script on a container started manually.
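For the withLogConsumer tip above, a minimal sketch (the logger and image tag are just examples):
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.testcontainers.containers.OracleContainer;
import org.testcontainers.containers.output.Slf4jLogConsumer;

public class OracleContainerWithLogging {

    private static final Logger LOG = LoggerFactory.getLogger(OracleContainerWithLogging.class);

    // Pipes the container's stdout/stderr into the test log, which usually
    // reveals why an init script failed.
    static OracleContainer startWithLogging() {
        OracleContainer container = new OracleContainer("gvenzl/oracle-xe:18.4.0-slim")
                .withLogConsumer(new Slf4jLogConsumer(LOG));
        container.start();
        return container;
    }
}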

In the end I settled on this approach: as sysdba, create two different schemas/users.
@SpringBootTest(classes = Main.class)
@Import(DbConfiguration.class)
@Testcontainers
public class ServiceIntegrationTest {

    @Container
    public static final OracleContainer oracleContainer =
            new OracleContainer("gvenzl/oracle-xe:21-slim-faststart");
}
import static com.integrationtests.local_test.service.ServiceIntegrationTest.oracleContainer;

@TestConfiguration
public class DbConfiguration {

    static final String DEFAULT_SYS_USER = "sys as sysdba";
    private static final String ENTITY_MANAGER_FACTORY = "entityManagerFactory";

    @Bean
    public DataSource getDataSource() {
        DataSourceBuilder<?> dataSourceBuilder = DataSourceBuilder.create();
        dataSourceBuilder.driverClassName("oracle.jdbc.OracleDriver");
        dataSourceBuilder.url(oracleContainer.getJdbcUrl());
        dataSourceBuilder.username(DEFAULT_SYS_USER);
        dataSourceBuilder.password(oracleContainer.getPassword());
        return dataSourceBuilder.build();
    }
}
Also, put the scripts in application.yaml:
spring:
  datasource:
    initialization-mode: always
    schema:
      - classpath:/sql/init_schemas/USER_ONE.sql
      - classpath:/sql/init_schemas/USER_TWOT.sql

Related

How do I run a Windows service in Azure Service Fabric?

I have a Windows service for test purposes that I want to migrate to Service Fabric. The service does nothing more than write to a txt file on my drive when it starts and when it stops. It works fine when I manually start and stop the service after installing it on my machine. Can I achieve the same result on Service Fabric, or does the implementation have to be different?
I have created a guest executable with the service and deployed it to a local cluster following this guide.
First of all, I don't like this answer. After playing with it, I'm convinced the best way is to just port the code to a Service Fabric app. I would love to see a better "bolt-on" solution, but I haven't found any others. Every answer I've seen says "just run the exe as a Guest Executable", but a Windows Service exe doesn't "just run". It needs to be run as a Windows Service, which calls the OnStart entry point of the service class (which inherits from ServiceBase).
The code below will allow your Windows Service to run in Service Fabric, but Service Fabric seems to report WARNINGS! So it's FAR from perfect.
It shouldn't require any changes to your OnStart or OnStop methods; however, it does require some basic plumbing to work. This is also helpful if you wish to debug your Windows services, as it allows you to pass in a /console command-line argument and have it run in a console window.
First, either create your own ServiceBase class, or simply paste this code into your Service class (by default it's called Service1.cs in a C# Windows Service project):
// Requires: using System.Runtime.InteropServices;

// Expose public method to call the protected OnStart method
public void StartConsole(string[] args)
{
    // Plumbing...
    // Allocate a console, otherwise we can't properly terminate the console to call OnStop
    AllocConsole();

    // Yuck, better way?
    StaticInstance = this;

    // Handle CTRL+C, CTRL+BREAK, etc. (call OnStop)
    SetConsoleCtrlHandler(new HandlerRoutine(ConsoleCtrlCheck), true);

    // Start service code
    this.OnStart(args);
}

// Expose public method to call the protected OnStop method
public void StopConsole()
{
    this.OnStop();
}

public static Service1 StaticInstance;

private static bool ConsoleCtrlCheck(CtrlTypes ctrlType)
{
    switch (ctrlType)
    {
        case CtrlTypes.CTRL_C_EVENT:
        case CtrlTypes.CTRL_BREAK_EVENT:
        case CtrlTypes.CTRL_CLOSE_EVENT:
        case CtrlTypes.CTRL_LOGOFF_EVENT:
        case CtrlTypes.CTRL_SHUTDOWN_EVENT:
            StaticInstance.StopConsole();
            return false;
    }
    return true;
}

[DllImport("kernel32.dll")]
private static extern bool AllocConsole();

[DllImport("Kernel32")]
public static extern bool SetConsoleCtrlHandler(HandlerRoutine Handler, bool Add);

public delegate bool HandlerRoutine(CtrlTypes CtrlType);

public enum CtrlTypes
{
    CTRL_C_EVENT = 0,
    CTRL_BREAK_EVENT,
    CTRL_CLOSE_EVENT,
    CTRL_LOGOFF_EVENT = 5,
    CTRL_SHUTDOWN_EVENT
}
Now change your Main method in Program.cs to look like this:
// Requires: using System.Linq;
static void Main(string[] args)
{
    var service = new Service1();

    if (args.Length > 0 && args.Any(x => x.Equals("/console", StringComparison.OrdinalIgnoreCase)))
    {
        service.StartConsole(args);
    }
    else
    {
        ServiceBase.Run(new ServiceBase[] { service });
    }
}
You may need to rename 'Service1' to whatever your service class is called.
When calling it through Service Fabric, make sure it's passing in the /console argument in ServiceManifest.xml:
<CodePackage Name="Code" Version="1.0.0">
  <EntryPoint>
    <ExeHost>
      <Program>WindowsService1.exe</Program>
      <Arguments>/console</Arguments>
      <WorkingFolder>Work</WorkingFolder>
    </ExeHost>
  </EntryPoint>
</CodePackage>
If you wish to use this as a debuggable Windows Service, you can also set your 'Command line arguments' to /console under the Project settings > Debug tab.
EDIT:
A better option is to use TopShelf. This works without warnings in Service Fabric; however, it does require some code refactoring, because the project becomes a Console project instead of a Windows Service project.

How to customize an existing Grails plugin functionality, modifying behavior of doWithSpring method

I am new to Grails, and while working with the Spring Security LDAP plugin I found that it accepts the LDAP server password in plain text only. The task at hand is to pass an encrypted password, which is decrypted before it is consumed by the plugin during its initialization phase.
I have already searched all the blogs and Stack Overflow questions I could find, but I could not discover a way to extend the main plugin class and simply override the doWithSpring() method so that I can add the required decryption logic for the LDAP server password. Any help here will be appreciated.
I have already seen and tried the jasypt plugin, but it does not work well if the password is stored in an external file rather than application.yml. So I am looking for a way to extend the Spring Security plugin's main class, add the required behavior, and register the custom class.
EDIT
Adding the snippet from the Grails LDAP Security plugin which I am trying to override. If I can successfully update the value of the securityConfig object before the plugin loads, the purpose is solved.
Some snippet from the plugin:
def conf = SpringSecurityUtils.securityConfig
...
...
contextSource(DefaultSpringSecurityContextSource, conf.ldap.context.server) { // 'ldap://localhost:389'
    authenticationSource = ref('ldapAuthenticationSource')
    authenticationStrategy = ref('authenticationStrategy')
    userDn = conf.ldap.context.managerDn // 'cn=admin,dc=example,dc=com'
    password = conf.ldap.context.managerPassword // 'secret'
    contextFactory = contextFactoryClass
    dirObjectFactory = dirObjectFactoryClass
    baseEnvironmentProperties = conf.ldap.context.baseEnvironmentProperties // none
    cacheEnvironmentProperties = conf.ldap.context.cacheEnvironmentProperties // true
    anonymousReadOnly = conf.ldap.context.anonymousReadOnly // false
    referral = conf.ldap.context.referral // null
}
ldapAuthenticationSource(SimpleAuthenticationSource) {
    principal = conf.ldap.context.managerDn // 'cn=admin,dc=example,dc=com'
    credentials = conf.ldap.context.managerPassword // 'secret'
}
You don't need to override the doWithSpring() method in the existing plugin. You can provide your own plugin which loads after the one you want to affect and have its doWithSpring() add whatever you want to the context. If you add beans with the same names as the ones added by the other plugin, yours will replace the ones provided by the other plugin, as long as you configure your plugin to load after the other one. Similarly, you could do the same thing in resources.groovy of the app if you don't want to write a plugin for this.
You have other options too. You could write a bean post processor or bean definition post processor that affects the beans created by the other plugin. Depending on the particulars, that might be a better idea.
EDIT:
After seeing your comment below I created a simple example that shows how you might use a definition post processor. See the project at https://github.com/jeffbrown/postprocessordemo.
The interesting bits:
https://github.com/jeffbrown/postprocessordemo/blob/master/src/main/groovy/demo/SomeBean.groovy
package demo

class SomeBean {
    String someValue
}
https://github.com/jeffbrown/postprocessordemo/blob/master/src/main/groovy/demo/SomePostProcessor.groovy
package demo

import org.springframework.beans.BeansException
import org.springframework.beans.MutablePropertyValues
import org.springframework.beans.PropertyValue
import org.springframework.beans.factory.config.BeanDefinition
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory
import org.springframework.beans.factory.support.BeanDefinitionRegistry
import org.springframework.beans.factory.support.BeanDefinitionRegistryPostProcessor

class SomePostProcessor implements BeanDefinitionRegistryPostProcessor {

    @Override
    void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {
        BeanDefinition definition = registry.getBeanDefinition('someBean')
        MutablePropertyValues values = definition.getPropertyValues()
        PropertyValue value = values.getPropertyValue('someValue')
        def originalValue = value.getValue()
        // this is where you could do your decrypting...
        values.addPropertyValue('someValue', "MODIFIED: ${originalValue}".toString())
    }

    @Override
    void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
    }
}
https://github.com/jeffbrown/postprocessordemo/blob/master/grails-app/conf/spring/resources.groovy
beans = {
    someBean(demo.SomeBean) {
        someValue = 'Some Value'
    }
    somePostProcessor demo.SomePostProcessor
}
https://github.com/jeffbrown/postprocessordemo/blob/master/grails-app/init/postprocessordemo/BootStrap.groovy
package postprocessordemo

import demo.SomeBean

class BootStrap {

    SomeBean someBean

    def init = { servletContext ->
        log.info "The Value: ${someBean.someValue}"
    }

    def destroy = {
    }
}
At application startup you will see log output that looks something like this...
2017-10-23 19:04:54.356 INFO --- [ main] postprocessordemo.BootStrap : The Value: MODIFIED: Some Value
The "MODIFIED" there is evidence that the bean definition post processor modified the property value in the bean. In my example I am simply prepending some text to the string. In your implementation you could decrypt a password or do whatever you want to do there.
I hope that helps.
After trying the Jasypt plugin and BeanPostProcessor solutions unsuccessfully for my use case, I found that the solution below works perfectly.
To restate the problem:
a) we had to keep the passwords in an encrypted format inside properties files
b) and given that we were packaging a war file, the properties must not be kept inside the war, so that automated deployment scripts can update the encrypted passwords depending on the environment
The Jasypt plugin was a perfect solution for use case a), but it could not cover scenario b).
Moreover, the Grails LDAP Security plugin was loaded quite early, so bean post processors were not helping here either.
Solution:
Created a new class implementing the interface SpringApplicationRunListener, extended its methods, and parsed the properties file using YamlPropertySourceLoader.
Sample code:
YamlPropertySourceLoader loader = new YamlPropertySourceLoader();
PropertySource<?> applicationYamlPropertySource = loader.load(
        "application.yml", new ClassPathResource("application.yml"), "default");
return applicationYamlPropertySource;
Once the properties were loaded into the MapPropertySource object, I parsed them for the encrypted values and applied the decryption logic.
This whole implementation runs before any plugins are initialized during the Grails boot process, which solves the problem.
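As a rough illustration of that parse-and-decrypt step (a sketch only: the ENC(...) marker and the decrypt method are placeholders, not part of the original solution):
import java.util.LinkedHashMap;
import java.util.Map;
import org.springframework.core.env.EnumerablePropertySource;
import org.springframework.core.env.MapPropertySource;

public class PropertyDecryption {

    // Walks a loaded property source, decrypts values wrapped in ENC(...),
    // and returns a new MapPropertySource with the plain-text values.
    static MapPropertySource decryptAll(EnumerablePropertySource<?> source) {
        Map<String, Object> result = new LinkedHashMap<>();
        for (String name : source.getPropertyNames()) {
            Object value = source.getProperty(name);
            if (value instanceof String && ((String) value).startsWith("ENC(") && ((String) value).endsWith(")")) {
                String cipherText = ((String) value).substring(4, ((String) value).length() - 1);
                result.put(name, decrypt(cipherText));
            } else {
                result.put(name, value);
            }
        }
        return new MapPropertySource("decrypted-" + source.getName(), result);
    }

    private static String decrypt(String cipherText) {
        // placeholder: plug in the real decryption logic here
        return cipherText;
    }
}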
Hope it will help others.

Load properties file depending on Spring profiles

I want to use a PropertyPlaceholderConfiguration to load different property files depending on the spring.profiles.active value passed when the web application is launched. I have different stages divided into two groups: read application.properties when the spring profile is 'prod', otherwise read application-dev.properties.
When I launch a non-prod stage, developmentPropertyPlaceholderConfigurer() is called and "Development properties read" is printed, so I assume application-dev.properties is loaded. But when I use @Value("${aws.key}") to read the value, it is the value from application.properties.
I don't know what's wrong.
Forgot to mention: I use Spring Boot.
I did a little test. Say I have the same property name in both files: aws.key=dev in the dev file and aws.key=prod in the default file. Even if I activate the dev stage, aws.key=prod from application.properties is always read. But if I remove aws.key from application.properties, then aws.key=dev is read. I think application-dev.properties is read in first, and then Spring Boot reads application.properties again and overrides the same properties, even though I do not want Spring Boot to read application.properties in non-prod stages. How do I solve this?
@Configuration
public class PropertyPlaceholderConfiguration {

    @Bean
    @Profile({"test", "qa", "demo", "dev", "AWS", "localhost"})
    public static PropertySourcesPlaceholderConfigurer developmentPropertyPlaceholderConfigurer() {
        System.out.println("Development properties read");
        PropertySourcesPlaceholderConfigurer configurer = new PropertySourcesPlaceholderConfigurer();
        configurer.setIgnoreUnresolvablePlaceholders(Boolean.TRUE);
        configurer.setLocation(new ClassPathResource("application-dev.properties"));
        return configurer;
    }

    @Bean
    @Profile("prod") // The default
    public static PropertySourcesPlaceholderConfigurer propertyPlaceholderConfigurer() {
        System.out.println("Production properties read");
        PropertySourcesPlaceholderConfigurer configurer = new PropertySourcesPlaceholderConfigurer();
        configurer.setIgnoreUnresolvablePlaceholders(Boolean.TRUE);
        configurer.setLocation(new ClassPathResource("application.properties"));
        return configurer;
    }
}

Hangfire job on Console/Web App solution?

I'm new to Hangfire and I'm trying to understand how this works.
So I have an MVC 5 application and a console application in the same solution. The console application is a simple one that just updates some data in the database (I originally planned to use the Windows Task Scheduler).
Where exactly do I install Hangfire? In the Web app or the console? Or should I convert the console into a class in the Web app?
If I understand correctly, the console application in your solution is acting like a "pseudo" Hangfire, since, as you said, it performs some database operations over time and you plan to execute it using the Task Scheduler.
Hangfire Overview
Hangfire was designed to do exactly what you want to do with your console app, but with much more power and functionality, so you avoid the overhead of building all of that yourself.
Hangfire Installation
Hangfire is commonly installed alongside ASP.NET applications, but if you read the docs carefully, you will find this:
Hangfire project consists of a couple of NuGet packages available on the NuGet Gallery site. Here is the list of basic packages you should know about:
Hangfire – bootstrapper package that is intended to be installed only for ASP.NET applications that use SQL Server as a job storage. It simply references the Hangfire.Core, Hangfire.SqlServer and Microsoft.Owin.Host.SystemWeb packages.
Hangfire.Core – basic package that contains all core components of Hangfire. It can be used in any project type, including ASP.NET applications, Windows Services, Console applications, any OWIN-compatible web application, Azure Worker Roles, etc.
As you can see, Hangfire can be used in any type of project, including console applications, but you will need to manage and add all the libraries yourself, depending on what kind of job storage you use. See more here:
Once Hangfire is installed, you can configure it to use the dashboard, an interface where you can find all the information about your background jobs. At the company where I work, we have used Hangfire several times with recurring jobs, mostly to import users, synchronize information across applications, and perform operations that would be costly to run during business hours; the dashboard proved very useful when we wanted to know whether a certain job was running. It also uses CRON expressions to schedule operations.
A sample of what we are using right now:
Startup.cs
public partial class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Get the connection string of the Hangfire database
        GlobalConfiguration.Configuration.UseSqlServerStorage(connection);

        // Start the Hangfire server and enable the dashboard
        app.UseHangfireDashboard();
        app.UseHangfireServer();

        // Start the Hangfire recurring jobs
        HangfireServices.Instance.StartSend();
        HangfireServices.Instance.StartDeleteDetails();
    }
}
HangfireServices.cs
public class HangfireServices
{
    // .. dependency injection and other definitions

    // IDs of the recurring jobs
    public static string SEND_SERVICE = "Send";
    public static string DELETE_SERVICE = "Delete";

    public void StartSend()
    {
        RecurringJob.AddOrUpdate(SEND_SERVICE,
            () => Business.Send(), // this is my class that does the actual process
            HangFireConfiguration.Instance.SendCron.Record); // a simple class that reads a CRON configuration file
    }

    public void StartDeleteDetails()
    {
        RecurringJob.AddOrUpdate(DELETE_SERVICE,
            () => Business.SendDelete(), // this is my class that does the actual process
            HangFireConfiguration.Instance.DeleteCron.Record); // a simple class that reads a CRON configuration file
    }
}
HangFireConfiguration.cs
public sealed class HangFireConfiguration : ConfigurationSection
{
    private static HangFireConfiguration _instance;

    public static HangFireConfiguration Instance
    {
        get { return _instance ?? (_instance = (HangFireConfiguration)WebConfigurationManager.GetSection("hangfire")); }
    }

    [ConfigurationProperty("send_cron", IsRequired = true)]
    public CronElements SendCron
    {
        get { return (CronElements)base["send_cron"]; }
        set { base["send_cron"] = value; }
    }

    [ConfigurationProperty("delete_cron", IsRequired = true)]
    public CronElements DeleteCron
    {
        get { return (CronElements)base["delete_cron"]; }
        set { base["delete_cron"] = value; }
    }
}
hangfire.config
<hangfire>
  <send_cron record="0,15,30,45 * * * *"></send_cron>
  <delete_cron record="0,15,30,45 * * * *"></delete_cron>
</hangfire>
The CRON expression above runs at minutes 0, 15, 30, and 45 of every hour, every day.
Web.config
<configSections>
  <!-- Points to the HangFireConfiguration class -->
  <section name="hangfire" type="MyProject.Configuration.HangFireConfiguration" />
</configSections>

<!-- Points to the .config file -->
<hangfire configSource="Configs\hangfire.config" />
Conclusion
Given the scenario you described, I would install Hangfire in your ASP.NET MVC application and remove the console application, simply because it is one less project to worry about. Even though you can install it in a console application, I would rather not follow that path, because if you hit a brick wall (and you will, trust me), chances are you'll find help mostly for cases where it was installed in ASP.NET applications.
There is no need for another console application to update the database; you can use Hangfire in your MVC application itself.
http://docs.hangfire.io/en/latest/configuration/index.html
After adding the Hangfire configuration, you can use a normal MVC method to do the console operations, such as updating the DB.
Based on your requirements you can use:
BackgroundJob.Enqueue --> immediate update to the DB
BackgroundJob.Schedule --> delayed update to the DB
RecurringJob.AddOrUpdate --> recurring update to the DB, like a Windows service
Below is an example:
public class MyController : Controller
{
    public void MyMVCMethod(int Id)
    {
        BackgroundJob.Enqueue(() => UpdateDB(Id));
    }

    public void UpdateDB(int Id)
    {
        // Code to update the database.
    }
}

Spring Data Neo4j - Unit Test - Transaction rolled back but data not deleted

I am building an application using spring-boot (1.1.8.RELEASE) and spring-data-neo4j (3.2.0.RELEASE) in order to connect to a standalone Neo4j server via the REST API. I am using spring-test to test the application. I have implemented a unit test that creates a node and retrieves it. It works, but the new node remains in the database after the test completes, whereas I expect the transaction to be rolled back and the node deleted.
However, in the console I can see the following statement:
"Rolled back transaction after test execution for test context..."
I don't understand why, based on the console, the rollback seems to have occurred while the transaction has in fact been committed to the database.
It would be really appreciated if somebody could help me figure out where the issue is coming from.
Find below my Spring configuration:
@Configuration
@ComponentScan
@EnableTransactionManagement
@EnableAutoConfiguration
public class AppConfig extends Neo4jConfiguration {

    public AppConfig() {
        setBasePackage("demo");
    }

    @Bean
    public GraphDatabaseService graphDatabaseService(Environment environment) {
        return new SpringRestGraphDatabase("http://localhost:7474/db/data");
    }
}
Find below my test class
@SuppressWarnings("deprecation")
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = AppConfig.class)
@Transactional
public class AppTests {

    @Autowired
    private Neo4jTemplate template;

    @Test
    public void templateTest() {
        Person person = new Person();
        person.setName("Benoit");
        person.setBorn(1986);

        Person newPerson = template.save(person);
        Person retrievedPerson = template.findOne(newPerson.getNodeId(), Person.class);

        Assert.assertEquals("Benoit", retrievedPerson.getName());
    }
}
I tried to add the following annotation to my unit test class, but it did not change anything:
@TransactionConfiguration(transactionManager = "transactionManager", defaultRollback = true)
I also tried to add the following to my unit test, based on what I have seen in other posts:
implements ApplicationContextAware
Thank you for your help
Regards
The behavior you are experiencing is to be expected: there is nothing wrong with transaction support in the Spring TestContext Framework (TCF) in this regard.
The TCF manages transactions via the configured transactionManager.
So when you switched to an embedded database and configured the transaction manager with the data source for that embedded database, that works perfectly. The issue is that the transaction support in Neo4J-REST does not tie in with Spring's transaction management facilities. As Michael Hunger stated in the other thread you referenced, an upcoming version of the Neo4J-REST API should address this issue.
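For reference, a minimal sketch of such an embedded setup, assuming the Neo4j 2.x test artifact (which provides TestGraphDatabaseFactory) is on the classpath:
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.test.TestGraphDatabaseFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class EmbeddedTestConfig {

    // An impermanent in-memory database; Spring's transaction manager ties
    // into this, so @Transactional test methods roll back as expected.
    @Bean(destroyMethod = "shutdown")
    public GraphDatabaseService graphDatabaseService() {
        return new TestGraphDatabaseFactory().newImpermanentDatabase();
    }
}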
Note that annotating your test class with @TransactionConfiguration has zero effect, since you are merely overriding the defaults with the defaults, which achieves nothing. Furthermore, implementing ApplicationContextAware in a test class has no effect on transaction management.
Regards,
Sam (spring-test component lead)
