I use the following command to generate Spring Boot code using Swagger Codegen:
java -jar ./swagger-codegen-cli-2.2.1.jar generate \
-i /swagger.yaml \
-o ./swagger-demo-restful \
-l spring
using the following definitions in swagger.yaml:
definitions:
  AlertDef:
    type: object
    properties:
      alertName:
        type: string
        description: name of the alert
        xml:
          name: AlertName
The generated code looks like this:
package io.swagger.model;
import java.util.Objects;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonCreator;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
/**
* AlertDef
*/
@javax.annotation.Generated(value = "class io.swagger.codegen.languages.SpringCodegen", date = "2016-11-10T13:06:05.500+08:00")
public class AlertDef {
private String alertName = null;
public AlertDef alertName(String alertName) {
this.alertName = alertName;
return this;
}
/**
* name of the alert
* @return alertName
**/
@ApiModelProperty(value = "name of the alert")
public String getAlertName() {
return alertName;
}
public void setAlertName(String alertName) {
this.alertName = alertName;
}
// toString, hashcode, etc...
}
But to make the REST API return XML successfully, I need to add @XmlRootElement(name = "AlertDef") to the class and @XmlElement to each setter.
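For illustration, the class only works for XML after I edit it by hand to look roughly like this (a sketch; mapping the element to AlertName mirrors the xml section of the YAML):
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement(name = "AlertDef")
public class AlertDef {
    private String alertName = null;

    public String getAlertName() {
        return alertName;
    }

    // JAXB picks up the element name from the annotation on the setter
    @XmlElement(name = "AlertName")
    public void setAlertName(String alertName) {
        this.alertName = alertName;
    }

    // toString, hashCode, etc...
}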
Is something missing in the YAML file, or does Swagger Codegen simply not support this yet?
While generating a Swagger client through Codegen, it appends two digit characters at the end of the method name. For example, a Java REST API
HTTP POST /all
generates the method name in the Python client as
create_all_using_postIntValue or create_all_using_post19
I want the generated method name to be create_all_using_post. Why does this happen and how can I fix it?
I am using the command below to generate the client:
swagger-codegen generate -i https://example.com/v3/api-docs -l python -o swagger_test
I found the answer. The method name can be overridden by implementing OperationBuilderPlugin. The code below fixed my issue:
import com.google.common.base.CaseFormat;
import io.swagger.annotations.ApiOperation;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Component;
import springfox.documentation.service.Operation;
import springfox.documentation.spi.DocumentationType;
import springfox.documentation.spi.service.OperationBuilderPlugin;
import springfox.documentation.spi.service.contexts.OperationContext;
import springfox.documentation.swagger.common.SwaggerPluginSupport;
import java.util.Optional;
@Component
@Order(SwaggerPluginSupport.SWAGGER_PLUGIN_ORDER + 1000)
public class GenerateUniqueOperationName implements OperationBuilderPlugin {
@Override
public void apply(OperationContext context) {
Optional<ApiOperation> methodAnnotation = context.findControllerAnnotation(ApiOperation.class);
Operation operationBuilder = context.operationBuilder().build();
if(operationBuilder.getTags().stream().findFirst().get().isEmpty())
throw new RuntimeException("operationBuilder.getTags().stream().findFirst()");
String tag = operationBuilder.getTags().stream().findFirst().get();
String methodType = operationBuilder.getMethod().name().toUpperCase();
String methodName = operationBuilder.getSummary();
String operationId = tag + "_" + methodType + "_" + CaseFormat.UPPER_CAMEL.to(CaseFormat.LOWER_UNDERSCORE, methodName);
context.operationBuilder().uniqueId(operationId);
context.operationBuilder().codegenMethodNameStem(operationId);
}
@Override
public boolean supports(DocumentationType delimiter) {
return SwaggerPluginSupport.pluginDoesApply(delimiter);
}
}
I am looking through the documentation for a sample of how to handle a submit from an Orbeon form in which I gather some data and then submit it to another application via REST. I am not seeing anything that shows how to do that. Does Orbeon provide functionality for this, or do I need to write some JSP or something else on the back end to handle it?
My understanding is that you have to provide/implement the REST service yourself. You aren't restricted to doing it in Java, but if that is your preferred language, here is what a very simple servlet could look like. In this case the REST service saves the form to a file in the temp directory.
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Enumeration;
import java.util.Optional;
import java.util.logging.Logger;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
public class FormDumpServlet extends HttpServlet {
private static final Logger logger = Logger.getLogger(FormDumpServlet.class.getName());
private static final SimpleDateFormat FORMAT = new SimpleDateFormat("yyyy-MM-dd-HH-mm-ss-SSS");
protected Optional<String> makeTempDir() {
final String dir = System.getProperty("java.io.tmpdir");
logger.info(String.format("java.io.tmpdir=%s", dir));
if (dir == null) {
logger.severe("java.io.tmpdir is null, can't create temp directory");
return Optional.empty();
}
final File f = new File(dir,"form-dumps");
if (f.exists() && f.isDirectory() && f.canWrite()) {
return Optional.of(f.getAbsolutePath());
}
if (f.mkdir()) {
return Optional.of(f.getAbsolutePath());
}
logger.severe(String.format("failed to create temp dir <%s>", f.getAbsolutePath()));
return Optional.empty();
}
@Override
protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
String path = req.getPathInfo();
if (path == null || !path.equalsIgnoreCase("/accept-form")) {
resp.setStatus(HttpServletResponse.SC_NOT_FOUND);
return;
}
Enumeration<String> parameterNames = req.getParameterNames();
while(parameterNames.hasMoreElements()) {
final String name = parameterNames.nextElement();
final String value = req.getParameter(name);
logger.info(String.format("parameter: name=<%s>, value=<%s>", name, value));
}
Optional<String> tempPath = makeTempDir();
if (tempPath.isPresent()) {
String fn = String.format("%s.xml", FORMAT.format(new Date()));
File f = new File(new File(tempPath.get()), fn);
logger.info(String.format("saving form to file <%s>", f.getAbsolutePath()));
try(PrintWriter pw = new PrintWriter(new FileWriter(f))) {
req.getReader().lines().forEach((l) -> pw.println(l));
}
}
resp.setStatus(HttpServletResponse.SC_OK);
}
}
You also have to configure a property in properties-local.xml which connects the send action for your form (the form with the name my_form in your application my_application) to the REST endpoint. This property could look as follows:
<property
as="xs:string"
name="oxf.fr.detail.process.send.my_application.my_form"
>
require-valid
then save-final
then send(uri = "http://localhost:8080/my-form-dump-servlet/accept-form")
then success-message(message = "Success: the form was transferred to the REST service")
</property>
I am writing a class to recursively extract files from inside a zip file and produce them to a Kafka queue for further processing. My intent is to be able to extract files from multiple levels of zip. The code below is my implementation of the Tika ContainerExtractor to do this.
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.Stack;
import org.apache.commons.lang.StringUtils;
import org.apache.tika.config.TikaConfig;
import org.apache.tika.detect.DefaultDetector;
import org.apache.tika.detect.Detector;
import org.apache.tika.exception.TikaException;
import org.apache.tika.extractor.ContainerExtractor;
import org.apache.tika.extractor.EmbeddedResourceHandler;
import org.apache.tika.io.TemporaryResources;
import org.apache.tika.io.TikaInputStream;
import org.apache.tika.metadata.Metadata;
import org.apache.tika.mime.MediaType;
import org.apache.tika.parser.AbstractParser;
import org.apache.tika.parser.AutoDetectParser;
import org.apache.tika.parser.ParseContext;
import org.apache.tika.parser.Parser;
import org.apache.tika.parser.pkg.PackageParser;
import org.xml.sax.ContentHandler;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.DefaultHandler;
public class UberContainerExtractor implements ContainerExtractor {
/**
*
*/
private static final long serialVersionUID = -6636138154366178135L;
// statically populate SUPPORTED_TYPES
static {
Set<MediaType> supportedTypes = new HashSet<MediaType>();
ParseContext context = new ParseContext();
supportedTypes.addAll(new PackageParser().getSupportedTypes(context));
SUPPORTED_TYPES = Collections.unmodifiableSet(supportedTypes);
}
/**
* A stack that maintains the parent filenames for the recursion
*/
Stack<String> parentFileNames = new Stack<String>();
/**
* The default tika parser
*/
private final Parser parser;
/**
* Default tika detector
*/
private final Detector detector;
/**
* The supported container types into which we can recurse
*/
public final static Set<MediaType> SUPPORTED_TYPES;
/**
* The number of documents recursively extracted from the container and its
* children containers if present
*/
int extracted;
public UberContainerExtractor() {
this(TikaConfig.getDefaultConfig());
}
public UberContainerExtractor(TikaConfig config) {
this(new DefaultDetector(config.getMimeRepository()));
}
public UberContainerExtractor(Detector detector) {
this.parser = new AutoDetectParser(new PackageParser());
this.detector = detector;
}
public boolean isSupported(TikaInputStream input) throws IOException {
MediaType type = detector.detect(input, new Metadata());
return SUPPORTED_TYPES.contains(type);
}
@Override
public void extract(TikaInputStream stream, ContainerExtractor recurseExtractor, EmbeddedResourceHandler handler)
throws IOException, TikaException {
ParseContext context = new ParseContext();
context.set(Parser.class, new RecursiveParser(recurseExtractor, handler));
try {
Metadata metadata = new Metadata();
parser.parse(stream, new DefaultHandler(), metadata, context);
} catch (SAXException e) {
throw new TikaException("Unexpected SAX exception", e);
}
}
private class RecursiveParser extends AbstractParser {
/**
*
*/
private static final long serialVersionUID = -7260171956667273262L;
private final ContainerExtractor extractor;
private final EmbeddedResourceHandler handler;
private RecursiveParser(ContainerExtractor extractor, EmbeddedResourceHandler handler) {
this.extractor = extractor;
this.handler = handler;
}
public Set<MediaType> getSupportedTypes(ParseContext context) {
return parser.getSupportedTypes(context);
}
public void parse(InputStream stream, ContentHandler ignored, Metadata metadata, ParseContext context)
throws IOException, SAXException, TikaException {
TemporaryResources tmp = new TemporaryResources();
try {
TikaInputStream tis = TikaInputStream.get(stream, tmp);
// Figure out what we have to process
String filename = metadata.get(Metadata.RESOURCE_NAME_KEY);
MediaType type = detector.detect(tis, metadata);
if (extractor == null) {
// do nothing
} else {
// Use a temporary file to process the stream
File file = tis.getFile();
System.out.println("file is directory = " + file.isDirectory());
// Recurse and extract if the filetype is supported
if (SUPPORTED_TYPES.contains(type)) {
System.out.println("encountered a supported file:" + filename);
parentFileNames.push(filename);
extractor.extract(tis, extractor, handler);
parentFileNames.pop();
} else { // produce the file
List<String> parentFilenamesList = new ArrayList<String>(parentFileNames);
parentFilenamesList.add(filename);
String originalFilepath = StringUtils.join(parentFilenamesList, "/");
System.out.println("producing " + filename + " with originalFilepath:" + originalFilepath
+ " to kafka queue");
++extracted;
}
}
} finally {
tmp.dispose();
}
}
}
public int getExtracted() {
return extracted;
}
public static void main(String[] args) throws IOException, TikaException {
String filename = "/Users/rohit/Data/cd.zip";
File file = new File(filename);
TikaInputStream stream = TikaInputStream.get(file);
ContainerExtractor recursiveExtractor = new UberContainerExtractor();
EmbeddedResourceHandler resourceHandler = new EmbeddedResourceHandler() {
@Override
public void handle(String filename, MediaType mediaType, InputStream stream) {
// do nothing
}
};
recursiveExtractor.extract(stream, recursiveExtractor, resourceHandler);
stream.close();
System.out.println("extracted " + ((UberContainerExtractor) recursiveExtractor).getExtracted() + " files");
}
}
It works on multiple levels of zip as long as the files inside the zips are in a flat structure, for example:
cd.zip
- c.txt
- d.txt
The code does not work if the files in the zip are inside a directory, for example:
ab.zip
- ab/
- a.txt
- b.txt
While debugging, I came across the following code snippet in the PackageParser:
try {
ArchiveEntry entry = ais.getNextEntry();
while (entry != null) {
if (!entry.isDirectory()) {
parseEntry(ais, entry, extractor, xhtml);
}
entry = ais.getNextEntry();
}
} finally {
ais.close();
}
I tried commenting out the if condition, but it did not work. Is there a reason why directories are skipped? Is there any way of getting around this?
I am using Tika version 1.6.
Tackling your question in reverse order:
Is there a reason why directories are skipped?
Entries in zip files are either directories or files. If files, they include the name of the directory they come from. As such, Tika doesn't need to do anything with the directory entries; all it needs to do is process the embedded files as and when they come up.
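You can see this for yourself with the standard java.util.zip API (not Tika): file entries such as ab/a.txt carry their directory path in the entry name, while directory entries like ab/ show up separately and can safely be skipped. A minimal listing sketch:
import java.io.IOException;
import java.util.zip.ZipFile;

public class ListZipEntries {
    public static void main(String[] args) throws IOException {
        // Prints each entry, marking which ones are directory placeholders
        try (ZipFile zip = new ZipFile(args[0])) {
            zip.stream().forEach(entry ->
                System.out.println((entry.isDirectory() ? "dir : " : "file: ") + entry.getName()));
        }
    }
}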
The code does not work if there the files in the zip are present inside a directory. for ex. ab.zip - ab/ - a.txt - b.txt
You seem to be doing something wrong then. Tika's recursion and package parser handle zips with folders in them just fine!
To prove this, start with a zip file like this:
$ unzip -l ../tt.zip
Archive: ../tt.zip
Length Date Time Name
--------- ---------- ----- ----
0 2015-02-03 16:42 t/
0 2015-02-03 16:42 t/t2/
0 2015-02-03 16:42 t/t2/t3/
164404 2015-02-03 16:42 t/t2/t3/test.jpg
--------- -------
164404 4 files
Now, make use of the -z extraction flag of the Tika App, which causes Tika to extract all of the embedded contents of a file. Run like that, and we get:
$ java -jar tika-app-1.7.jar -z ../tt.zip
Extracting 't/t2/t3/test.jpg' (image/jpeg) to ./t/t2/t3/test.jpg
Then list the resulting directory, and we see
$ find . -type f
./t/t2/t3/Test.jpg
I can't see what's wrong with your code, but sadly for you we've shown that the problem is there, and not with Tika... You'd be best off reviewing the various examples of recursion that Tika provides, such as the Tika App tool and the Recursing Parser Wrapper, then re-writing your code to be something simpler based on those.
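For reference, the simplest recursion pattern in that style is to register the parser itself in the ParseContext so that AutoDetectParser descends into embedded documents. This is only a minimal sketch of that idea, not a drop-in replacement for your ContainerExtractor:
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import org.apache.tika.metadata.Metadata;
import org.apache.tika.parser.AutoDetectParser;
import org.apache.tika.parser.ParseContext;
import org.apache.tika.parser.Parser;
import org.apache.tika.sax.BodyContentHandler;

public class SimpleRecursiveParse {
    public static void main(String[] args) throws Exception {
        Parser parser = new AutoDetectParser();
        ParseContext context = new ParseContext();
        // Registering the parser in the context is what makes Tika recurse
        // into embedded documents (zips inside zips, files inside folders, ...).
        context.set(Parser.class, parser);
        try (InputStream stream = Files.newInputStream(Paths.get(args[0]))) {
            BodyContentHandler handler = new BodyContentHandler(-1); // no write limit
            parser.parse(stream, handler, new Metadata(), context);
            System.out.println(handler.toString());
        }
    }
}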
I am trying to build a simple component to understand how and why JSF 2.X works the way it does. I have been using the newer annotations and have been trying to piece together a clear example.
So I have built my component and deployed it in an XHTML file as follows:
<kal:day value="K.Day" title="Kalendar" model="#{kalendarDay}"/>
Then, within the UIComponent, I do the following:
ValueExpression ve = getValueExpression("model");
if (ve != null)
{
System.out.println("model expression "+ve.getExpressionString());
model = (KalendarDay) ve.getValue(getFacesContext().getELContext());
System.out.println("model "+model);
}
The expression "#{kalendarDay}" is correctly displayed, indicating that the value has been successfully transmitted from the page to the component. However, the evaluation of the expression results in null.
This seems to indicate that the backing bean is unavailable at this point, although the page correctly validates and deploys. So I am 95% certain that the bean is there at run time.
So perhaps this is a phase thing? Should I be evaluating this in the decode of the renderer and setting the value in the attributes map? I am still a little confused about the combination of actual values and value expressions.
So my question is where should I fetch and evaluate the valueExpression for model and should I store the result of the evaluation in the UIComponent or should I simply evaluate it every time?
SSCCE files below; I think these are the only files required to demonstrate the problem.
Bean Interface -----
/**
*
*/
package com.istana.kalendar.fixture;
import java.util.Date;
/**
* @author User
*
*/
public interface KalendarDay
{
public Date getDate();
}
Bean Implementation ---
/**
*
*/
package com.istana.kalendar.session.wui;
import java.util.Calendar;
import java.util.Date;
import javax.ejb.Stateful;
import javax.inject.Named;
import com.istana.kalendar.fixture.KalendarDay;
/**
* @author User
*
*/
@Named("kalendarDay")
@Stateful
public class KalKalendarDay
implements KalendarDay
{
private Calendar m_date = Calendar.getInstance();
/* (non-Javadoc)
* @see com.istana.kalendar.fixture.KalendarDay#getDate()
*/
@Override
public Date getDate()
{
return m_date.getTime();
}
}
UIComponent ---
/**
*
*/
package com.istana.kalendar.fixture.jsf;
import javax.el.ValueExpression;
import javax.faces.component.FacesComponent;
import javax.faces.component.UIOutput;
import com.istana.kalendar.fixture.KalendarDay;
/**
* @author User
*
*/
@FacesComponent(value = UIDay.COMPONENT_TYPE)
public class UIDay extends UIOutput
{
static final
public String COMPONENT_TYPE = "com.istana.kalendar.fixture.jsf.Day";
static final
public String COMPONENT_FAMILY = "com.istana.kalendar.fixture.jsf.Day";
private KalendarDay m_model;
private String m_title;
@Override
public String getRendererType()
{
return UIDayRenderer.RENDERER_TYPE;
}
@Override
public String getFamily()
{
return COMPONENT_FAMILY;
}
public KalendarDay getModel()
{
KalendarDay model = (KalendarDay) getStateHelper().eval("model");
System.out.println("model "+model);
return model;
}
public void setModel(KalendarDay model)
{
getStateHelper().put("model",model);
}
public String getTitle()
{
return (String) getStateHelper().eval("title");
}
public void setTitle(String title)
{
getStateHelper().put("title",title);
}
}
UIComponentRenderer ---
/**
*
*/
package com.istana.kalendar.fixture.jsf;
import java.io.IOException;
import javax.el.ValueExpression;
import javax.faces.component.UIComponent;
import javax.faces.context.FacesContext;
import javax.faces.context.ResponseWriter;
import javax.faces.render.FacesRenderer;
import javax.faces.render.Renderer;
import com.istana.kalendar.fixture.KalendarDay;
/**
* @author User
*
*/
@FacesRenderer(componentFamily = UIDay.COMPONENT_FAMILY
,rendererType = UIDayRenderer.RENDERER_TYPE
)
public class UIDayRenderer extends Renderer
{
static final
public String RENDERER_TYPE = "com.istana.kalendar.fixture.jsf.DayRenderer";
@Override
public void encodeBegin (FacesContext context,UIComponent component)
throws IOException
{
UIDay uic = (UIDay) component;
ResponseWriter writer = context.getResponseWriter();
writer.startElement("p", uic);
/*
* This is the call that triggers the println
*/
writer.write("Day - "+uic.getModel().getDate());
}
@Override
public void encodeEnd (FacesContext context,UIComponent component)
throws IOException
{
ResponseWriter writer = context.getResponseWriter();
writer.endElement("p");
writer.flush();
}
}
kalendar.taglib.xml ---
<facelet-taglib
id="kalendar"
xmlns="http://java.sun.com/xml/ns/javaee"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-facelettaglibrary_2_0.xsd"
version="2.0"
>
<namespace>http://istana.com/kalendar</namespace>
<tag>
<tag-name>day</tag-name>
<component>
<component-type>com.istana.kalendar.fixture.jsf.Day</component-type>
</component>
</tag>
</facelet-taglib>
I'm not sure why it's null, but the symptoms indicate that #{kalendarDay} is being specified during view render time while you're trying to evaluate it during view build time.
So perhaps this is a phase thing? Should I be evaluating this in the decode of the renderer and setting the value in the attributes map? I am still a little confused about the combination of actual values and value expressions.
You should use the encodeXxx() methods of the component or the associated renderer (if any) to generate HTML based on the component's attributes/properties.
You should use the decode() method of the component or the associated renderer (if any) to set the component's attributes/properties based on HTTP request parameters which have been sent along with an HTML form submit.
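To make that concrete, a decode() in the renderer usually just pulls the submitted request parameter keyed by the component's client ID. This is a generic sketch (the parameter name and where you store it are hypothetical, and an output-only component like yours may not need to decode anything at all):
@Override
public void decode(FacesContext context, UIComponent component) {
    // Hypothetical: assumes the renderer emitted an input named after the client ID.
    String clientId = component.getClientId(context);
    String submitted = context.getExternalContext().getRequestParameterMap().get(clientId);
    if (submitted != null) {
        // Stash it on the component for later processing.
        component.getAttributes().put("submittedValue", submitted);
    }
}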
So my question is where should I fetch and evaluate the valueExpression for model and should I store the result of the evaluation in the UIComponent or should I simply evaluate it every time?
Since JSF 2.x it's recommended to explicitly specify a getter and setter for component attributes, which in turn delegate to UIComponent#getStateHelper().
public String getValue() {
return (String) getStateHelper().eval("value");
}
public void setValue(String value) {
getStateHelper().put("value", value);
}
public String getTitle() {
return (String) getStateHelper().eval("title");
}
public void setTitle(String title) {
getStateHelper().put("title", title);
}
public Object getModel() {
return getStateHelper().eval("model");
}
public void setModel(Object model) {
getStateHelper().put("model", model);
}
That's basically all you need (note that the getter and setter must exactly match the attribute name as per the JavaBeans specification). Then in the encodeXxx() method(s) just call getModel() to get (and evaluate) the value of the model attribute.
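For completeness, this is roughly how that looks from the renderer side (a sketch that mirrors the encodeBegin() already in your question, with a null guard added):
@Override
public void encodeBegin(FacesContext context, UIComponent component) throws IOException {
    UIDay day = (UIDay) component;
    ResponseWriter writer = context.getResponseWriter();
    writer.startElement("p", day);
    // getModel() delegates to getStateHelper().eval("model"), which evaluates
    // the #{kalendarDay} value expression at render time.
    KalendarDay model = day.getModel();
    writer.writeText("Day - " + (model != null ? model.getDate() : "no model"), null);
}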
I'm just getting started with Dropwizard 0.4.0, and I would like some help with HMAC authentication. Has anybody got any advice?
Thank you in advance.
At present Dropwizard doesn't support HMAC authentication right out of the box, so you'd have to write your own authenticator. A typical choice for HMAC authentication is to use the HTTP Authorization header. The following code expects this header in the following format:
Authorization: <algorithm> <apiKey> <digest>
An example would be
Authorization: HmacSHA1 abcd-efgh-1234 sdafkljlkansdaflk2354jlkj5345345dflkmsdf
The digest is built from the content of the body (marshalled entity) prior to URL encoding with the HMAC shared secret appended as base64. For a non-body request, such as GET or HEAD, the content is taken as the complete URI path and parameters with the secret key appended.
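For reference, one plausible reading of that scheme is that the digest is base64(HMAC(secret, contents)). A small sketch with the standard javax.crypto API (the class and method names here are made up for illustration; they are not part of Dropwizard):
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public final class HmacSigner {
    // Computes the base64-encoded HMAC digest over the request contents
    // (the body for POST/PUT, the URI path and parameters for GET/HEAD).
    public static String sign(String algorithm, String contents, String sharedSecret) throws Exception {
        Mac mac = Mac.getInstance(algorithm); // e.g. "HmacSHA1"
        mac.init(new SecretKeySpec(sharedSecret.getBytes(StandardCharsets.UTF_8), algorithm));
        byte[] digest = mac.doFinal(contents.getBytes(StandardCharsets.UTF_8));
        return Base64.getEncoder().encodeToString(digest);
    }
}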
To implement this in a way that Dropwizard can work with, you have to copy the BasicAuthenticator code present in the dropwizard-auth module into your own code and modify it with something like this:
import com.google.common.base.Optional;
import com.sun.jersey.api.core.HttpContext;
import com.sun.jersey.server.impl.inject.AbstractHttpContextInjectable;
import com.yammer.dropwizard.auth.AuthenticationException;
import com.yammer.dropwizard.auth.Authenticator;
import javax.ws.rs.WebApplicationException;
import javax.ws.rs.core.HttpHeaders;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
class HmacAuthInjectable<T> extends AbstractHttpContextInjectable<T> {
private static final String PREFIX = "HmacSHA1";
private static final String HEADER_VALUE = PREFIX + " realm=\"%s\"";
private final Authenticator<HmacCredentials, T> authenticator;
private final String realm;
private final boolean required;
HmacAuthInjectable(Authenticator<HmacCredentials, T> authenticator, String realm, boolean required) {
this.authenticator = authenticator;
this.realm = realm;
this.required = required;
}
public Authenticator<HmacCredentials, T> getAuthenticator() {
return authenticator;
}
public String getRealm() {
return realm;
}
public boolean isRequired() {
return required;
}
@Override
public T getValue(HttpContext c) {
try {
final String header = c.getRequest().getHeaderValue(HttpHeaders.AUTHORIZATION);
if (header != null) {
final String[] authTokens = header.split(" ");
if (authTokens.length != 3) {
// Malformed
HmacAuthProvider.LOG.debug("Error decoding credentials (length is {})", authTokens.length);
throw new WebApplicationException(Response.Status.BAD_REQUEST);
}
final String algorithm = authTokens[0];
final String apiKey = authTokens[1];
final String signature = authTokens[2];
final String contents;
// Determine which part of the request will be used for the content
final String method = c.getRequest().getMethod().toUpperCase();
if ("GET".equals(method) ||
"HEAD".equals(method) ||
"DELETE".equals(method)) {
// No entity so use the URI
contents = c.getRequest().getRequestUri().toString();
} else {
// Potentially have an entity (even in OPTIONS) so use that
contents = c.getRequest().getEntity(String.class);
}
final HmacCredentials credentials = new HmacCredentials(algorithm, apiKey, signature, contents);
final Optional<T> result = authenticator.authenticate(credentials);
if (result.isPresent()) {
return result.get();
}
}
} catch (IllegalArgumentException e) {
HmacAuthProvider.LOG.debug(e, "Error decoding credentials");
} catch (AuthenticationException e) {
HmacAuthProvider.LOG.warn(e, "Error authenticating credentials");
throw new WebApplicationException(Response.Status.INTERNAL_SERVER_ERROR);
}
if (required) {
throw new WebApplicationException(Response.status(Response.Status.UNAUTHORIZED)
.header(HttpHeaders.AUTHORIZATION,
String.format(HEADER_VALUE, realm))
.entity("Credentials are required to access this resource.")
.type(MediaType.TEXT_PLAIN_TYPE)
.build());
}
return null;
}
}
The above is not perfect, but it'll get you started. You may want to refer to the MultiBit Merchant release candidate source code (MIT license) for a more up to date version and the various supporting classes.
The next step is to integrate the authentication process into your ResourceTest subclass. Unfortunately, Dropwizard doesn't provide a good entry point for authentication providers in v0.4.0, so you may want to introduce your own base class, similar to this:
import com.google.common.collect.Lists;
import com.google.common.collect.Sets;
import com.sun.jersey.api.client.Client;
import com.sun.jersey.test.framework.AppDescriptor;
import com.sun.jersey.test.framework.JerseyTest;
import com.sun.jersey.test.framework.LowLevelAppDescriptor;
import com.xeiam.xchange.utils.CryptoUtils;
import com.yammer.dropwizard.bundles.JavaBundle;
import com.yammer.dropwizard.jersey.DropwizardResourceConfig;
import com.yammer.dropwizard.jersey.JacksonMessageBodyProvider;
import com.yammer.dropwizard.json.Json;
import org.codehaus.jackson.map.Module;
import org.junit.After;
import org.junit.Before;
import org.multibit.mbm.auth.hmac.HmacAuthProvider;
import org.multibit.mbm.auth.hmac.HmacAuthenticator;
import org.multibit.mbm.persistence.dao.UserDao;
import org.multibit.mbm.persistence.dto.User;
import org.multibit.mbm.persistence.dto.UserBuilder;
import java.io.UnsupportedEncodingException;
import java.security.GeneralSecurityException;
import java.util.List;
import java.util.Set;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;
/**
* A base test class for testing Dropwizard resources.
*/
public abstract class BaseResourceTest {
private final Set<Object> singletons = Sets.newHashSet();
private final Set<Object> providers = Sets.newHashSet();
private final List<Module> modules = Lists.newArrayList();
private JerseyTest test;
protected abstract void setUpResources() throws Exception;
protected void addResource(Object resource) {
singletons.add(resource);
}
public void addProvider(Object provider) {
providers.add(provider);
}
protected void addJacksonModule(Module module) {
modules.add(module);
}
protected Json getJson() {
return new Json();
}
protected Client client() {
return test.client();
}
@Before
public void setUpJersey() throws Exception {
setUpResources();
this.test = new JerseyTest() {
@Override
protected AppDescriptor configure() {
final DropwizardResourceConfig config = new DropwizardResourceConfig();
for (Object provider : JavaBundle.DEFAULT_PROVIDERS) { // sorry, Scala folks
config.getSingletons().add(provider);
}
for (Object provider : providers) {
config.getSingletons().add(provider);
}
Json json = getJson();
for (Module module : modules) {
json.registerModule(module);
}
config.getSingletons().add(new JacksonMessageBodyProvider(json));
config.getSingletons().addAll(singletons);
return new LowLevelAppDescriptor.Builder(config).build();
}
};
test.setUp();
}
@After
public void tearDownJersey() throws Exception {
if (test != null) {
test.tearDown();
}
}
/**
* @param contents The content to sign with the default HMAC process (POST body, GET resource path)
* @return The Authorization header value
*/
protected String buildHmacAuthorization(String contents, String apiKey, String secretKey) throws UnsupportedEncodingException, GeneralSecurityException {
return String.format("HmacSHA1 %s %s",apiKey, CryptoUtils.computeSignature("HmacSHA1", contents, secretKey));
}
protected void setUpAuthenticator() {
User user = UserBuilder
.getInstance()
.setUUID("abc123")
.setSecretKey("def456")
.build();
//
UserDao userDao = mock(UserDao.class);
when(userDao.getUserByUUID("abc123")).thenReturn(user);
HmacAuthenticator authenticator = new HmacAuthenticator();
authenticator.setUserDao(userDao);
addProvider(new HmacAuthProvider<User>(authenticator, "REST"));
}
}
Again, the above code is not perfect, but the idea is to allow a mocked-up UserDao to provide a standard user with a known shared secret key. You'd have to introduce your own UserBuilder implementation for testing purposes (see the sketch below).
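Such a test-only builder might look like this (entirely hypothetical; the field and setter names are guesses based on the setUpAuthenticator() call above, so adapt it to your own User DTO):
public class UserBuilder {
    private String uuid;
    private String secretKey;

    public static UserBuilder getInstance() {
        return new UserBuilder();
    }

    public UserBuilder setUUID(String uuid) {
        this.uuid = uuid;
        return this;
    }

    public UserBuilder setSecretKey(String secretKey) {
        this.secretKey = secretKey;
        return this;
    }

    public User build() {
        // Assumes User exposes matching setters; adjust as needed.
        User user = new User();
        user.setUUID(uuid);
        user.setSecretKey(secretKey);
        return user;
    }
}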
Finally, with the above code a Dropwizard Resource that had an endpoint like this:
import com.google.common.base.Optional;
import com.yammer.dropwizard.auth.Auth;
import com.yammer.metrics.annotation.Timed;
import org.multibit.mbm.core.Saying;
import org.multibit.mbm.persistence.dto.User;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.MediaType;
import java.util.concurrent.atomic.AtomicLong;
@Path("/")
@Produces(MediaType.APPLICATION_JSON)
public class HelloWorldResource {
private final String template;
private final String defaultName;
private final AtomicLong counter;
public HelloWorldResource(String template, String defaultName) {
this.template = template;
this.defaultName = defaultName;
this.counter = new AtomicLong();
}
@GET
@Timed
@Path("/hello-world")
public Saying sayHello(#QueryParam("name") Optional<String> name) {
return new Saying(counter.incrementAndGet(),
String.format(template, name.or(defaultName)));
}
@GET
@Timed
@Path("/secret")
public Saying saySecuredHello(@Auth User user) {
return new Saying(counter.incrementAndGet(),
"You cracked the code!");
}
}
could be tested with a unit test that was configured like this:
import org.junit.Test;
import org.multibit.mbm.core.Saying;
import org.multibit.mbm.test.BaseResourceTest;
import javax.ws.rs.core.HttpHeaders;
import static org.junit.Assert.assertEquals;
public class HelloWorldResourceTest extends BaseResourceTest {
@Override
protected void setUpResources() {
addResource(new HelloWorldResource("Hello, %s!","Stranger"));
setUpAuthenticator();
}
@Test
public void simpleResourceTest() throws Exception {
Saying expectedSaying = new Saying(1,"Hello, Stranger!");
Saying actualSaying = client()
.resource("/hello-world")
.get(Saying.class);
assertEquals("GET hello-world returns a default",expectedSaying.getContent(),actualSaying.getContent());
}
@Test
public void hmacResourceTest() throws Exception {
String authorization = buildHmacAuthorization("/secret", "abc123", "def456");
Saying actual = client()
.resource("/secret")
.header(HttpHeaders.AUTHORIZATION, authorization)
.get(Saying.class);
assertEquals("GET secret returns unauthorized","You cracked the code!", actual.getContent());
}
}
Hope this helps you get started.