Setting swagger config parameters in SpringDoc

I am trying to override Swagger configuration in SpringDoc as described in:
https://springdoc.org/#swagger-ui-properties
I am setting these in code, in a Kotlin class's init block:
init {
    System.setProperty("springdoc.swagger-ui.path", "/services/$serviceName")
    System.setProperty("springdoc.swagger-ui.url", "/services/$serviceName/v3/api-docs")
    System.setProperty("springdoc.swagger-ui.configUrl", "/services/$serviceName/v3/api-docs/swagger-config")
    // Controls the display of extensions (pattern, maxLength, minLength, maximum, minimum) fields and values for Parameters.
    System.setProperty("springdoc.swagger-ui.showCommonExtensions", "true")
}
However, these seem to be completely ignored and none of the fields are taken into account. Where should they be set instead?
Note that I need to set the config properties according to the SERVICE_NAME env var, so I cannot use the static properties file.
Full config here: https://gist.github.com/knyttl/852f67f1688ea6e808b8eb89068e90d1

As suggested by the SpringDoc author in https://github.com/springdoc/springdoc-openapi/issues/1485, this can be done as follows:
@Bean
open fun swaggerUiConfig(config: SwaggerUiConfigProperties): SwaggerUiConfigProperties {
    // For details, see https://springdoc.org/#swagger-ui-properties
    // Controls the display of extensions (pattern, maxLength, minLength, maximum, minimum) fields and values for Parameters.
    config.showCommonExtensions = true
    // Allows configuring the source of the documentation via query params (?url=/v3/api-docs).
    config.queryConfigEnabled = true
    return config
}
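Because the bean is created in code, its values can also be derived from the SERVICE_NAME environment variable (for example via System.getenv) inside that bean method, which addresses the original requirement of not relying on a static properties file.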

Expected getter for property [tempLocation] to be marked with @Default on all

I am trying to execute a Dataflow pipeline that writes to BigQuery. I understand that in order to do so, I need to specify a GCS temp location.
So I defined options:
private interface Options extends PipelineOptions {
    @Description("GCS temp location to store temp files.")
    @Default.String(GCS_TEMP_LOCATION)
    @Validation.Required
    String getTempLocation();
    void setTempLocation(String value);

    @Description("BigQuery table to write to, specified as "
        + "<project_id>:<dataset_id>.<table_id>. The dataset must already exist.")
    @Default.String(BIGQUERY_OUTPUT_TABLE)
    @Validation.Required
    String getOutput();
    void setOutput(String value);
}
And I try to pass this to the Pipeline.create() function:
public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class));
    ...
}
But I am getting the following error. I don't understand why, because I annotate with @Default:
Exception in thread "main" java.lang.IllegalArgumentException: Expected getter for property [tempLocation] to be marked with @Default on all [my.gcp.dataflow.StarterPipeline$Options, org.apache.beam.sdk.options.PipelineOptions], found only on [my.gcp.dataflow.StarterPipeline$Options]
Is the above snippet your code or a copy from the SDK?
You don't define a new options class for this. You actually want to call withCustomGcsTempLocation on BigQueryIO.Write [1].
Also, I think BQ should determine a temp location on its own if you do not provide one. Have you tried without setting this? Did you get an error?
[1] https://github.com/apache/beam/blob/a17478c2ee11b1d7a8eba58da5ce385d73c6dbbc/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java#L1402
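For illustration, a minimal sketch of that approach (the table and bucket strings below are placeholders, not values from the question):
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;

public class BigQueryTempLocationExample {
    // Builds a BigQueryIO write transform that stages its temp files under an explicit GCS path,
    // so no tempLocation getter needs to be re-declared on a custom options interface.
    static BigQueryIO.Write<TableRow> writeWithTempLocation(String table, String gcsTempPath) {
        return BigQueryIO.writeTableRows()
                .to(table)                                                        // e.g. "project:dataset.table"
                .withCustomGcsTempLocation(StaticValueProvider.of(gcsTempPath));  // e.g. "gs://bucket/tmp"
    }
}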
Most users simply set the staging directory. To set the staging directory, you want to do something like:
DataflowPipelineOptions options = PipelineOptionsFactory.create()
    .as(DataflowPipelineOptions.class);
options.setRunner(BlockingDataflowPipelineRunner.class);
options.setStagingLocation("gs://SET-YOUR-BUCKET-NAME-HERE");
However, if you want to set the GCP temp location (gcpTempLocation), you can do that as well:
GcpOptions options = PipelineOptionsFactory.as(GcpOptions.class);
options.setGcpTempLocation("gs://SET-YOUR-BUCKET-NAME-HERE/temp");
Basically you have to do .as(X.class) to get to the X options. Then once you have that object you can just set any options that are part of X. You can find many additional examples online.
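As a short sketch of that pattern (the bucket path is a placeholder, and in older Beam/Dataflow SDKs GcpOptions lives under org.apache.beam.sdk.options rather than the extensions package):
import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class GcpTempLocationExample {
    public static void main(String[] args) {
        // View the parsed options as GcpOptions to reach the GCP-specific setters.
        GcpOptions options = PipelineOptionsFactory.fromArgs(args).as(GcpOptions.class);
        options.setGcpTempLocation("gs://SET-YOUR-BUCKET-NAME-HERE/temp"); // placeholder bucket
    }
}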

How can I write FlowFile attributes to Avro metadata inside the FlowFile's content?

I am creating FlowFiles that are manipulated and split downstream after being emitted by an ExecuteSql processor. I have populated the FlowFiles' attributes with data that I want to put into the Avro metadata contained within each FlowFile's content.
How can I do this?
I've tried using an UpdateRecord processor configured with an AvroReader and AvroRecordSetWriter and a property with a key of /canary that should be writing a FlowFile attribute to that key somewhere in the Avro document. It does not appear anywhere in the output, though.
It would be acceptable to move the records in the Avro data to a subkey and have a metadata section be a part of the record data. I would prefer not to do this, though, because it does not seem like the correct solution and because it sounds much more complex than simply modifying the Avro metadata.
The record-aware processors (and the Readers/Writers) are not metadata-aware, meaning they cannot currently (as of NiFi 1.5.0) act on metadata in any way (inspect, create, delete, etc.), so UpdateRecord won't work for metadata per se. With your /canary property key, it will try to insert a field named canary into your Avro record at the top level, with the value you specify. However, I believe your output schema needs to have the canary field added at the top level, or it may be ignored (I'm not positive of this; you can check the output schema to see if it is added automatically).
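For illustration, an output schema carrying that extra top-level field might look like the sketch below; the record name and the existing_column field are placeholders, only canary comes from the question:
{
  "type": "record",
  "name": "QueryResult",
  "fields": [
    { "name": "existing_column", "type": ["null", "string"], "default": null },
    { "name": "canary",          "type": ["null", "string"], "default": null }
  ]
}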
There is currently no NiFi processor that can update Avro metadata explicitly (MergeContent does some of this when merging various Avro files together, but you can't choose to set a particular value, for example). However, I have an unpolished Groovy script you could use in ExecuteScript to add metadata to Avro files in NiFi 1.5.0+. In ExecuteScript, set the language to Groovy and use the following as the Script Body, then add user-defined (aka "dynamic") properties to ExecuteScript, where each key will be a metadata key and the evaluated value (the properties support Expression Language) will be the metadata value:
@Grab('org.apache.avro:avro:1.8.1')
import org.apache.avro.*
import org.apache.avro.file.*
import org.apache.avro.generic.*

def flowFile = session.get()
if (!flowFile) return
try {
    // Save off dynamic property values for metadata key/values later
    def metadata = [:]
    context.properties.findAll { e -> e.key.dynamic }.each { k, v ->
        metadata.put(k.name, context.getProperty(k).evaluateAttributeExpressions(flowFile).value.bytes)
    }
    flowFile = session.write(flowFile, { inStream, outStream ->
        DataFileStream<GenericRecord> reader = new DataFileStream<>(inStream, new GenericDatumReader<GenericRecord>())
        DataFileWriter<GenericRecord> writer = new DataFileWriter<>(new GenericDatumWriter<GenericRecord>())
        def schema = reader.schema
        def inputCodec = reader.getMetaString(DataFileConstants.CODEC) ?: DataFileConstants.NULL_CODEC
        // Forward the existing metadata to the output
        reader.metaKeys.each { key ->
            if (!DataFileWriter.isReservedMeta(key)) {
                byte[] metadatum = reader.getMeta(key)
                writer.setMeta(key, metadatum)
            }
        }
        // For each dynamic property, set the key/value pair as Avro metadata
        metadata.each { k, v -> writer.setMeta(k, v) }
        writer.setCodec(CodecFactory.fromString(inputCodec))
        writer.create(schema, outStream)
        writer.appendAllFrom(reader, false)
    } as StreamCallback)
    session.transfer(flowFile, REL_SUCCESS)
} catch (e) {
    log.error('Error adding Avro metadata, penalizing flow file and routing to failure', e)
    flowFile = session.penalize(flowFile)
    session.transfer(flowFile, REL_FAILURE)
}
Note that this script can work with versions of NiFi prior to 1.5.0, but the @Grab at the top is not supported until 1.5.0, so you'd have to download Avro and its dependencies into a flat folder and point to that in the Module Directory property of ExecuteScript.

How to customize an existing Grails plugin functionality, modifying behavior of doWithSpring method

I am new to Grails, and while working with the Spring Security LDAP plugin it was identified that it accepts the LDAP server password in plain text only. The task at hand is to pass an encrypted password which is decrypted before it is consumed by the plugin during its initialization phase.
I have already searched all the blogs and Stack Overflow questions I could find, but could not find a way to extend the main plugin class to simply override the doWithSpring() method so that I can add the required decryption logic for the LDAP server password. Any help here will be appreciated.
I have already seen and tried the Jasypt plugin, but it also does not work well if the password is stored in some external file and not application.yml. So I am looking for a solution that extends the Spring Security plugin main class, adds the required behavior, and registers the custom class.
EDIT
Adding the snippet from the Grails LDAP Security plugin which I am trying to override. So if I am able to successfully update the value of the securityConfig object before the plugin loads, the purpose is solved.
Some snippet from the plugin:
def conf = SpringSecurityUtils.securityConfig
...
...
contextSource(DefaultSpringSecurityContextSource, conf.ldap.context.server) { // 'ldap://localhost:389'
    authenticationSource = ref('ldapAuthenticationSource')
    authenticationStrategy = ref('authenticationStrategy')
    userDn = conf.ldap.context.managerDn // 'cn=admin,dc=example,dc=com'
    password = conf.ldap.context.managerPassword // 'secret'  <-- the value to decrypt
    contextFactory = contextFactoryClass
    dirObjectFactory = dirObjectFactoryClass
    baseEnvironmentProperties = conf.ldap.context.baseEnvironmentProperties // none
    cacheEnvironmentProperties = conf.ldap.context.cacheEnvironmentProperties // true
    anonymousReadOnly = conf.ldap.context.anonymousReadOnly // false
    referral = conf.ldap.context.referral // null
}
ldapAuthenticationSource(SimpleAuthenticationSource) {
    principal = conf.ldap.context.managerDn // 'cn=admin,dc=example,dc=com'
    credentials = conf.ldap.context.managerPassword // 'secret'  <-- the value to decrypt
}
You don't need to override the doWithSpring() method in the existing plugin. You can provide your own plugin which loads after the one you want to affect and have your doWithSpring() add whatever you want to the context. If you add beans with the same name as the ones added by the other plugin, yours will replace the ones provided by the other plugin, as long as you configure your plugin to load after the other one. Similarly, you could do the same thing in resources.groovy of the app if you don't want to write a plugin for this.
You have other options too. You could write a bean post processor or bean definition post processor that affects the beans created by the other plugin. Depending on the particulars, that might be a better idea.
EDIT:
After seeing your comment below I created a simple example that shows how you might use a definition post processor. See the project at https://github.com/jeffbrown/postprocessordemo.
The interesting bits:
https://github.com/jeffbrown/postprocessordemo/blob/master/src/main/groovy/demo/SomeBean.groovy
package demo

class SomeBean {
    String someValue
}
https://github.com/jeffbrown/postprocessordemo/blob/master/src/main/groovy/demo/SomePostProcessor.groovy
package demo

import org.springframework.beans.BeansException
import org.springframework.beans.MutablePropertyValues
import org.springframework.beans.PropertyValue
import org.springframework.beans.factory.config.BeanDefinition
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory
import org.springframework.beans.factory.support.BeanDefinitionRegistry
import org.springframework.beans.factory.support.BeanDefinitionRegistryPostProcessor

class SomePostProcessor implements BeanDefinitionRegistryPostProcessor {

    @Override
    void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {
        BeanDefinition definition = registry.getBeanDefinition('someBean')
        MutablePropertyValues values = definition.getPropertyValues()
        PropertyValue value = values.getPropertyValue('someValue')
        def originalValue = value.getValue()
        // this is where you could do your decrypting...
        values.addPropertyValue('someValue', "MODIFIED: ${originalValue}".toString())
    }

    @Override
    void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
    }
}
https://github.com/jeffbrown/postprocessordemo/blob/master/grails-app/conf/spring/resources.groovy
beans = {
    someBean(demo.SomeBean) {
        someValue = 'Some Value'
    }
    somePostProcessor demo.SomePostProcessor
}
https://github.com/jeffbrown/postprocessordemo/blob/master/grails-app/init/postprocessordemo/BootStrap.groovy
package postprocessordemo

import demo.SomeBean

class BootStrap {

    SomeBean someBean

    def init = { servletContext ->
        log.info "The Value: ${someBean.someValue}"
    }
    def destroy = {
    }
}
At application startup you will see log output that looks something like this...
2017-10-23 19:04:54.356 INFO --- [ main] postprocessordemo.BootStrap : The Value: MODIFIED: Some Value
The "MODIFIED" there is evidence that the bean definition post processor modified the property value in the bean. In my example I am simply prepending some text to the string. In your implementation you could decrypt a password or do whatever you want to do there.
I hope that helps.
After trying the Jasypt plugin and BeanPostProcessor solutions unsuccessfully for my use case, I found the solution below to work perfectly.
To restate the problem:
a) we had to keep the passwords in an encrypted format inside properties files
b) since we were packaging as a war file, the properties could not be kept inside the war, so that automated deployment scripts could update the encrypted passwords depending on the environment
The Jasypt plugin was a perfect solution for use case a), but it could not cover scenario b).
Moreover, the Grails LDAP Security plugin was loaded quite early, so bean post processors were not helping here either.
Solution:
I created a new class implementing the SpringApplicationRunListener interface, overrode its methods, and parsed the properties file using YamlPropertySourceLoader.
Sample code:
YamlPropertySourceLoader loader = new YamlPropertySourceLoader();
PropertySource<?> applicationYamlPropertySource = loader.load(
        "application.yml", new ClassPathResource("application.yml"), "default");
return applicationYamlPropertySource;
Once the properties were loaded into the MapPropertySource object, I parsed them for the encrypted values and applied the decryption logic.
This whole implementation runs before any plugins are initialized during the Grails boot process, which solves the problem.
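A minimal Java sketch of that run-listener approach is below. It assumes Spring Boot 2.2+ (where the other SpringApplicationRunListener methods have default implementations) and uses a hypothetical ENC(...) marker and decrypt() helper; the class would be registered in META-INF/spring.factories under the org.springframework.boot.SpringApplicationRunListener key. This is not the poster's actual code:
package com.example;

import java.io.IOException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.SpringApplicationRunListener;
import org.springframework.boot.env.YamlPropertySourceLoader;
import org.springframework.core.env.ConfigurableEnvironment;
import org.springframework.core.env.EnumerablePropertySource;
import org.springframework.core.env.MapPropertySource;
import org.springframework.core.env.PropertySource;
import org.springframework.core.io.ClassPathResource;

public class DecryptingRunListener implements SpringApplicationRunListener {

    // Spring Boot instantiates run listeners reflectively through this constructor,
    // based on the entry in META-INF/spring.factories.
    public DecryptingRunListener(SpringApplication application, String[] args) {
    }

    @Override
    public void environmentPrepared(ConfigurableEnvironment environment) {
        try {
            // Load application.yml as in the snippet above (Spring Boot 2.x two-argument signature;
            // 1.x uses the three-argument form shown earlier and returns a single PropertySource).
            List<PropertySource<?>> loaded = new YamlPropertySourceLoader()
                    .load("application.yml", new ClassPathResource("application.yml"));
            Map<String, Object> decrypted = new HashMap<>();
            for (PropertySource<?> source : loaded) {
                if (source instanceof EnumerablePropertySource) {
                    for (String name : ((EnumerablePropertySource<?>) source).getPropertyNames()) {
                        Object value = source.getProperty(name);
                        if (value instanceof String && ((String) value).startsWith("ENC(")) {
                            decrypted.put(name, decrypt((String) value));
                        }
                    }
                }
            }
            // Decrypted values take precedence because this source is added first.
            environment.getPropertySources().addFirst(new MapPropertySource("decrypted", decrypted));
        } catch (IOException e) {
            throw new IllegalStateException("Could not load application.yml", e);
        }
    }

    private String decrypt(String encryptedValue) {
        // Hypothetical placeholder: plug in the real decryption logic here.
        return encryptedValue;
    }
}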
Hope it will help others.

log4j2 and custom key value using JSONLayout

I would like to add to my log a String key and an Integer value using Log4j2.
Is there a way to do it? When I added properties to the ThreadContext I was able to add only String:String keys and values, but this doesn't help; I have numbers that I need to present in Kibana (in some graphs).
thanks,
Kobi
The built-in GelfLayout may be useful.
It's true that the default ThreadContext only supports String:String key-values. The work done in LOG4J2-1648 allows you to use other types in ThreadContext:
Tell Log4j to use a ThreadContext map implementation that implements the ObjectThreadContextMap interface. The simplest way to accomplish this is by setting system property log4j2.garbagefree.threadContextMap to true.
The standard ThreadContext facade only has methods for Strings, so you need to create your own facade. The below should work:
import org.apache.logging.log4j.ThreadContext;
import org.apache.logging.log4j.spi.ObjectThreadContextMap;

public class ObjectThreadContext {

    public static boolean isSupported() {
        return ThreadContext.getThreadContextMap() instanceof ObjectThreadContextMap;
    }

    public static Object getValue(String key) {
        return getObjectMap().getValue(key);
    }

    public static void putValue(String key, Object value) {
        getObjectMap().putValue(key, value);
    }

    private static ObjectThreadContextMap getObjectMap() {
        if (!isSupported()) {
            throw new UnsupportedOperationException();
        }
        return (ObjectThreadContextMap) ThreadContext.getThreadContextMap();
    }
}
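A small usage sketch of the facade above; it assumes the JVM was started with -Dlog4j2.garbagefree.threadContextMap=true so the ObjectThreadContextMap implementation is active, and the field name responseTimeMs is just an example:
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class MetricsLoggingExample {

    private static final Logger LOGGER = LogManager.getLogger(MetricsLoggingExample.class);

    public static void main(String[] args) {
        if (ObjectThreadContext.isSupported()) {
            // Store an Integer value (not a String) so it can be graphed in Kibana.
            ObjectThreadContext.putValue("responseTimeMs", 42);
        }
        LOGGER.info("request handled");
    }
}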
It is possible to avoid ThreadContext altogether by injecting key-value pairs from another source into the LogEvent. This is (briefly) mentioned under Custom Context Data Injectors (http://logging.apache.org/log4j/2.x/manual/extending.html#Custom_ContextDataInjector).
I found the default log4j2 implementation somewhat problematic for passing custom fields with values. In my opinion, current Java logging frameworks are not well suited to writing structured log events.
If you like hacks, you can check https://github.com/skorhone/gelfj-alt/tree/master/src/main/java/org/graylog2/log4j2 . It's a library written for GELF. One of the provided features is a layout (ExtGelfjLayout) that supports extracting custom fields (see FieldExtractor) from events. But... in order to send such an event, you need to write your own logging facade on top of log4j2.

How to omit data type tags in SnakeYaml?

I have the following YAML 1.1 output generated by SnakeYAML:
'test_jbgrp1':
  'tags': []
  'jobs':
  - 'test_job1'
  'reserve': []
  'cancel':
  - 'max_duration': !!int '1200'
The !!int tag is breaking another (older) piece of software, and I have a requirement to remove the tag before writing the file. I don't want to resort to silly solutions such as writing the content to a String and postprocessing it before dumping the file, so the question is: is there a setting in SnakeYAML that would remove !!int from the output above?
Assuming you have to remove all occurrences of !!int:
You can take a look at How to skip a property, or do some transformation using Flexible Scalar Type Customization.
In short, you configure the Yaml instance as below:
Yaml yaml = new Yaml(new MyRepresenter());
String output = yaml.dump(new MyJavaBean());
where MyRepresenter is expressed as below
class MyRepresenter extends Representer {
    @Override
    protected NodeTuple representJavaBeanProperty(Object javaBean, Property property,
            Object propertyValue, Tag customTag) {
        if (int.class.equals(property.getType())) { // some better condition
            // construct the NodeTuple as you wish - i.e. keep the element and remove the type
            return null; // this will skip the property
        } else {
            return super.representJavaBeanProperty(javaBean, property, propertyValue, customTag);
        }
    }
}
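If you want to keep the element and only drop the tag (the other branch hinted at in the comment), one possible sketch is below; the class name is made up, and it assumes a SnakeYAML 1.x Representer with a no-argument constructor:
import org.yaml.snakeyaml.introspector.Property;
import org.yaml.snakeyaml.nodes.Node;
import org.yaml.snakeyaml.nodes.NodeTuple;
import org.yaml.snakeyaml.nodes.Tag;
import org.yaml.snakeyaml.representer.Representer;

public class IntTagStrippingRepresenter extends Representer {

    @Override
    protected NodeTuple representJavaBeanProperty(Object javaBean, Property property,
            Object propertyValue, Tag customTag) {
        NodeTuple tuple = super.representJavaBeanProperty(javaBean, property, propertyValue, customTag);
        Node valueNode = tuple.getValueNode();
        if (Tag.INT.equals(valueNode.getTag())) {
            // Re-represent the integer as a plain string scalar so no !!int tag is emitted,
            // while keeping both the key and the value in the output.
            Node retagged = representScalar(Tag.STR, String.valueOf(propertyValue));
            return new NodeTuple(tuple.getKeyNode(), retagged);
        }
        return tuple;
    }
}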
