WSO2 GAR file WSDL to Schema dependencies

I am uploading a GAR which has a WSDL and associated schemas. The dependencies (schema imports, includes) between one schema and another show up fine under "Dependencies" and "Associations".
However, the dependency from the WSDL to a schema is not showing up under "Dependencies" or "Associations" for many of them; it shows up for some schemas and not the others. I have spent several hours but have not been able to make out what could cause this issue, or why it works for some and not the others. Can someone help?

This turned out to be a typo in the WSDL: the imports for the schemas used a "schemalocation" attribute (with a lower-case "l") when it should have been "schemaLocation" (with an upper-case "L").
Because of this, the import dependencies were not recognized. After this was fixed, they are working fine.
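For reference, a minimal sketch of the corrected import inside the WSDL's types section (the namespace and file names here are placeholders):

<wsdl:types>
    <xsd:schema targetNamespace="http://example.com/service">
        <!-- XML attribute names are case-sensitive: "schemaLocation", not "schemalocation" -->
        <xsd:import namespace="http://example.com/common"
                    schemaLocation="CommonTypes.xsd"/>
    </xsd:schema>
</wsdl:types>

With the attribute spelled correctly, the registry can resolve the WSDL-to-schema dependency, and it then appears under "Dependencies" and "Associations".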

Related

How do I get SvelteKit and TypeORM to work together?

I know that SvelteKit is still in beta and changing a lot, but I'd still like to ask about getting TypeORM to work with the SvelteKit dev server and production node server.
I made a minimal repository with a basic SvelteKit project (using npm init svelte@next) and TypeORM (using the installation instructions here).
When I try to run npm run dev, I get a warning that reflect-metadata isn't importing correctly:
reflect-metadata doesn't appear to be written in CJS, but also doesn't appear to be a valid ES module (i.e. it doesn't have "type": "module" or an .mjs extension for the entry point). Please contact the package author to fix.
Then I get an exception from TypeORM:
[vite] Error when evaluating SSR module /src/lib/User.ts:
ColumnTypeUndefinedError: Column type for User#id is not defined and cannot be guessed. Make sure you have turned on an "emitDecoratorMetadata": true option in tsconfig.json. Also make sure you have imported "reflect-metadata" on top of the main entry file in your application (before any entity imported). If you are using JavaScript instead of TypeScript you must explicitly provide a column type.
I'm not sure why the reflect-metadata library doesn't seem to be importing correctly. I've tried both import "reflect-metadata" and import * as Reflect from 'reflect-metadata'; but got the same error. Heck, I even tried copying and pasting the JS library into hooks.ts and commenting out the imports, but the same reflect-metadata and TypeORM errors seem to happen.
Right now my assumption is that if the reflect-metadata library can be imported correctly, then that would fix the TypeORM error, but I'm not sure. Here's a relevant issue in the SvelteKit GitHub, and here's a Reddit thread I made without much help. I'm hoping that my issue can be solved without opening a new SvelteKit issue, but I want to make sure that it's possible to use TypeORM with SvelteKit, since it's the most-starred JS ORM on GitHub.
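One workaround I'm considering (only a sketch, with a hypothetical entity) is declaring the column types explicitly, so TypeORM doesn't have to guess them from the decorator metadata that reflect-metadata provides. That would address the ColumnTypeUndefinedError, though not the underlying import warning:

// src/lib/User.ts - hypothetical entity with explicit column types,
// so TypeORM does not depend on design:type metadata being emitted
import { Entity, PrimaryGeneratedColumn, Column } from "typeorm";

@Entity()
export class User {
    @PrimaryGeneratedColumn()
    id!: number;

    // explicit "varchar" avoids ColumnTypeUndefinedError when
    // reflect-metadata isn't loaded before the entity is imported
    @Column({ type: "varchar" })
    name!: string;
}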
If you have any ideas or solutions I'd love to hear them!

Generating code from yaml - more reliable than Swagger codegen?

(Sample) Problem:
When a YAML source contains conflicting definitions, e.g. myClass and MyClass both generating into a Java class MyClass with one silently overwriting the other, no warning or error occurs.
Other similar cases can occur the same way (e.g. via implicitly generated classes from complex entries of type: array). A minimal illustration of the basic collision is sketched below.
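These hypothetical definitions both generate MyClass.java, and whichever is processed last silently wins:

definitions:
  myClass:
    type: object
    properties:
      name:
        type: string
  MyClass:
    type: object
    properties:
      id:
        type: integer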
Framing
has to work without modifying the source file
there is exactly one source file, for whatever that's worth
has to work as a Maven plugin
tested with swagger-codegen-maven-plugin 2.4.0 and 2.4.22
am familiar with JAXB2 and both of its common Maven plugins:
jaxb2-maven-plugin
maven-jaxb2-plugin
am not familiar with swagger-codegen-maven-plugin
did not find solutions (or even corresponding problem reports) using Google searches
swagger codegen strict
swagger-codegen-maven-plugin strict
swagger codegen name collision
swagger-codegen-maven-plugin name collision
swagger codegen resolve name collision
swagger-codegen-maven-plugin resolve name collision
found this closed issue describing a similar problem (silently generating corrupt code), which appears to be mostly unrelated and was allegedly fixed long ago
Question
What is the proper solution that produces
an eagerly failing generation process
amended via externally configured customBindings
without touching the YAML source file?
Solution ideas:
Use JAXB2 instead
JAXB2
fails generation where a generated file already exists
allows a separate customBindings.xjb file to modify the generation of classes (see the sketch after this list)
can be adapted to work with YAML using
JacksonJaxbYAMLProvider
However, it seems that the common Maven plugins jaxb2-maven-plugin and maven-jaxb2-plugin do not support YAML definition files (out of the box?). [1]
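For comparison, the kind of externally configured rename JAXB supports via an .xjb bindings file looks roughly like this (schema and type names are placeholders):

<jxb:bindings version="2.1"
              xmlns:jxb="http://java.sun.com/xml/ns/jaxb"
              xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <jxb:bindings schemaLocation="schema.xsd" node="//xs:complexType[@name='myClass']">
        <!-- rename the lower-case variant so it no longer collides with MyClass -->
        <jxb:class name="MyClassLower"/>
    </jxb:bindings>
</jxb:bindings>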
Use Mustache files to resolve the name collision
The Mustache language seems to provide the necessary facilities to implement rules similar to JAXB's xjb.
However
binding a Mustache template to a class has to be done by name (<class name>.mustache), i.e. a name collision in particular cannot be resolved from an external file [2]
this does not help with the silence of errors in generated code - one cannot fix the error, while unaware of it
1 - Searches for this topic have yielded one result that it does not work out-of-the-box (duh) and no results on whether it can be made to work using the extensions interface both plugins support.
2 - I could not find the relevant part directly within Swagger codegen documentation, but here is a relevant part of the implemented-by-Swagger OpenAPI generator documentation.

Is it possible to generate entities from an existing database model in JHipster?

Is there a way to generate entities from an existing database model or do I have to create all entities with yeoman (yo) on my own?
I've heard about such a technique from the Spring Roo project.
Please check this helper:
https://www.npmjs.com/package/generator-jhipster-db-helper
As per its description, "This JHipster module makes mapping on an existing database easier."
Also, you can use this tool:
https://github.com/Blackdread/sql-to-jdl
which will convert your SQL files into JDL files; then you can use JHipster's import-jdl functionality to import the database.
That will help you speed up the process of generating a JHipster application against an existing database.
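For reference, sql-to-jdl's output is ordinary JDL; a sketch with hypothetical entity names:

// my-database.jdl - hypothetical JDL converted from an SQL schema
entity Customer {
    name String required
}
entity Invoice {
    issuedAt Instant
}
relationship ManyToOne {
    Invoice{customer} to Customer
}

The generated file can then be imported into the application with jhipster import-jdl my-database.jdl.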
No, you cannot, because the JHipster Yeoman generator "only" scaffolds the entities according to the templates plus the given parameters/choices. It does not consult external sources like databases in this step.
The generator creates all files for JPA, Angular and the Liquibase changelogs. Finally, Liquibase creates the tables using the changelogs during startup, if they don't exist yet.
So you can say that JHipster uses an "entity first" instead of a "table first" approach.
Although it would be a nice feature, I don't think it will be integrated into JHipster, because existing databases are so different that it would be too difficult to handle every possibility. There are different choices of primary keys, different datatypes, different realizations of many-to-many relations or generalizations, and so on.
Or you can request a new feature on GitHub and maybe it will be implemented...
But, to give some directions:
I also had a similar situation, where I tried to migrate an existing database with about 50 tables and lots of data to JHipster (this was JHipster 1.6 or so), and I also thought of a "database refactoring" [1]. However, my "solution" was to create a new database using JHipster and then migrate the data from the old database to the new one with some SQL statements.
The main reasons:
I had another database model that differed from the model JHipster expects (e.g. I used other primary keys and references)
Neither the repositories nor the Angular stuff (which was my main reason to use JHipster) would be generated
The Liquibase changelogs would be missing [2]
Such a special refactoring causes a lot of additional changes afterwards, when you try to generate a JHipster entity against an existing table. These changes could be more time-consuming than creating new entities with JHipster.
These changes could also cause problems in future upgrades
And yes, Roo has such a technique for reverse engineering or refactoring a database (http://docs.spring.io/spring-roo/reference/html/base-dbre.html). AFAIK, it only creates Roo-conform entities that are based on JPA, so it also differs from the Spring Data JPA that is used by JHipster (the same problem as other JPA-refactoring tools like [1]).
[1] I used an Eclipse JPA plugin that can create JPA entity classes from an existing database in another Dropwizard-based project before, but I haven't tried it in combination with Spring/JHipster.
[2] It is possible to create Liquibase changelogs from an existing database: http://www.liquibase.org/documentation/generating_changelogs.html
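For reference, a sketch of that Liquibase CLI invocation with placeholder connection details:

# placeholder driver/URL/credentials; generates a changelog from the existing schema
liquibase --driver=com.mysql.jdbc.Driver \
          --url=jdbc:mysql://localhost/mydb \
          --username=dbuser --password=dbpass \
          --changeLogFile=db.changelog.xml \
          generateChangeLog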
Spring Roo includes the DBRE addon, a great tool for database reverse engineering that generates your domain entities automatically.
@eplog, you are wrong: the DBRE lets you use the --repository option to create Spring Data JPA repositories for each entity. Take a look at http://docs.spring.io/spring-roo/docs/1.3.1.RELEASE/reference/html/base-dbre.html#d4e1765
Imho, the benefits that the DBRE provides you are:
You don't need to search, test and learn third-party tools; Roo does it for you.
Reverse engineering is a powerful tool for migrating legacy applications in environments where you cannot migrate the database.
Hope it helps. Enjoy with Roo!
Yes you can!
Check out this stack overflow answer:
How to modify existing entity generated with jhipster?
AND, check out this video for auto-generating JPA-annotated domain objects from an existing schema using Eclipse with JBoss Tools.
After you make the Hibernate config file, you can open up the "Code Generation Tools"; on the "Exporters" tab, make sure you select the checkboxes for "Use Java 5 syntax" and "EJB3 annotations".
https://www.youtube.com/watch?v=KO_IdJbSJkI
Also, make sure your Hibernate jar(s) have the same version number as your config; in my case I was doing Hibernate Spatial and the versions were mismatched with Hibernate Core, and it wouldn't work for a while.
Out of the two options suggested in previous answers
https://www.npmjs.com/package/generator-jhipster-db-helper
https://github.com/Blackdread/sql-to-jdl
the second option worked in a straightforward manner. Below are the steps and prerequisites:
Install Java and Maven (you can use SDKMAN)
Clone the repo
Modify ./src/main/resources/application.yml with your database's configuration, specifically: spring -> datasource -> username, spring -> datasource -> password and application -> database-to-export (see the sketch after these steps)
Run 'mvn'
If you want to generate the JDL with a specific file name, modify application -> export -> path per your needs.
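A sketch of the relevant application.yml entries with placeholder values (the exact key layout should be verified against the repository's sample config):

spring:
  datasource:
    username: dbuser            # placeholder
    password: dbpass            # placeholder
application:
  database-to-export: mydb      # schema to convert
  export:
    path: ./out/my-database.jdl # target JDL file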

`Unable to load the specified metadata resource` suddenly appears without code changes

PLEASE NOTE: None of the answers in the link above (which I don't seem to be able to remove) helped me. As I explain below, I had already tried all that stuff.
I have a web site, developed in VS2013 using ASP.NET MVC5/WebAPI2, which has several related projects such as a service layer, repository layer, etc. Down at the bottom of the stack is a class library that holds an EF model. I have separated the actual entities into another class library, to allow them to be reused without requiring the model library.
All of this has been working fine. The web site was running, and I could make calls to the WebAPI methods as well.
I just uploaded the latest version to the production server, and all is working fine there. Then I came back to VS to carry on working, and when I try to run up the web site, I get an exception:
Unable to load the specified metadata resource
Searching around, it seems that the cure for this is to modify the connection string to point to the actual assembly name instead of using the default wildcard. I have two problems with this: first, none of the config files in the solution have been touched today (by me at least, and the file history from source control confirms this), so there's no reason why it should suddenly stop working after being deployed; and second, even if I add the assembly name, I get the same exception.
Anyone got any ideas what I can do? I'm completely stuffed now. Can't do anything.
Edit: I tried again to specify the assembly in the connection string, and now get the exception Unable to resolve assembly. I have checked the assembly name in a decompiler, and I'm pretty sure I got it right.
Edit again: I just pulled the version that I deployed from source control, and that gives the same exception, so I'm sure this has nothing to do with any files I've changed (or even that have been changed by VS). The version on the production server is still working, but the source code behind that exact same version gives the exception. So I'm certain that the answer is NOT to be found in the myriad other versions of this question, but somewhere else.
Found the problem, and am posting it here in the hope that it will help someone else, as I don't think this was clear in any of the other posts on this issue.
I have a layered solution, with the web project referencing a service layer, which references a repository layer, which in turn references the model project. It seems that for EF to work, whichever layer actually causes the database to be accessed requires a reference to the model project. My service layer project, which was where ToList() was being called (thus enumerating the query and causing the database to be hit), didn't have a reference to the model project, so it was failing to load the assembly.
I didn't need to alter the metadata part of the connection string either, as once the service layer had a reference to the model project, it was able to find the resources by itself. Having said that, one thing I did learn from all of this is that you can speed up the creation of the model (slightly) by specifying the assembly containing the resources explicitly, as this saves the framework having to search through all loaded assemblies to find them. I'm not sure if this will make any noticeable difference, but it can't hurt.
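For reference, a sketch of a connection string that names the resource assembly explicitly; "MyModelAssembly", "MyModel" and the database details are placeholders (the default form uses the * wildcard, which searches all loaded assemblies):

<connectionStrings>
    <!-- placeholder names; the metadata part points at the assembly containing the EF resources -->
    <add name="MyEntities"
         providerName="System.Data.EntityClient"
         connectionString="metadata=res://MyModelAssembly/MyModel.csdl|res://MyModelAssembly/MyModel.ssdl|res://MyModelAssembly/MyModel.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=.;initial catalog=MyDb;integrated security=True&quot;" />
</connectionStrings>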
I still can't explain how this had been working up until now, and suddenly stopped, as I hadn't changed any references, nor the way I was doing the data access. Still, it seems to be working now, which is all that matters.
Hope this helps someone.

Grails migration issues. DBM-UPDATE include not recursing, fails silently

This is part of an ongoing project... splitting out domain objects so they can be consumed by multiple applications. The database migration files for the domain objects live with the plugin... but we want the apps to be able to reference them during a dbm-update.
I can get the application to recognize the plugin changelog, but after that, the changelog does not perform includes and process them as I expect.
Using GrailsPluginUtils, I am able to get the path of the plugin and the plugin changelog, with which I do an include file. If I put the changeSet right in that file, I am good; it runs. If I move it to a separate file in the same folder, or in a sub-folder, and reference it via "./someFile.groovy", it seems to FIND it but does not process it. I say it seems to find it because if I do NOT use a relative file path, the migration process throws an error saying it cannot find the file, e.g. "someFile.groovy"...
I have workarounds, but they are not acceptable because we want to control the order of the DB migrations by using sub-directories with a _changelog.groovy that then includes the actual transformations (changeSets). But they are not being "include"d.
If I use includeAll, it will grab any and all scripts in that one folder but, again, does not process any other includes referenced therein. I could write a script to scan the folders recursively, but again, that requires a lot of coding to parse the _changelogs and grab the appropriate inclusion order, etc.
I really just want "include file:" to work as it does in a given application for its own changelog files.
Has anyone else done this? Am I missing something terribly obvious?
In the app...
databaseChangeLog {
    ...
    include file: "${GrailsPluginUtils.pluginInfos.find { it.name == 'my-plugin' }.pluginDir}/grails-app/migrations/my_plugin_changes"
}
... in the plugin...
databaseChangeLog {
    include file: "./someChangeLogChangeSet.groovy"
}
Thank you...
We had also tried adding changelog files to the plugin where our domains live, but were unable to access them from the main app. However, if you want to access your files from another location, you could specify this property in your config file and give the folder name there, or even copy all the migrations to the appropriate location in the main app:
grails.plugin.databasemigration.changelogLocation = 'migrations'
If you do find an actual solution to this, please post it.
