I am using a plugin for Grails - the Amazon S3 plugin - and the domain object provided by the plugin doesn't specify the id generator. I am using PostgreSQL and need the id generator to be identity.
I could copy the plugin into my plugins directory and mess with the domain object it provides, but that doesn't sound like a clean approach. Could I add the correct id generation at runtime? Or maybe there is a better way.
If you are using Grails 1.2, you can provide a default mapping for all your GORM classes, including the id generator:
grails.gorm.default.mapping = {
    id generator: 'sequence'
}
See the Grails 1.2 release notes for more details.
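Since the question asks for identity generation on PostgreSQL, here is a minimal sketch of the same default mapping using the identity strategy instead of a sequence (an assumption on my part, untested against the S3 plugin's domain class):
grails.gorm.default.mapping = {
    id generator: 'identity'
}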
I think you could copy just the S3Asset.groovy into src/groovy/. From memory, your class should override the one provided by the plugin. I've used this technique to tweak a couple of plugins until bugs were fixed, but I haven't tried it with domain classes, only with *GrailsPlugin.groovy files.
Also, Jean's suggestion above is a good one!
cheers
Lee
I am working on a NopCommerce website that has quite a bit of site-wide customization, so I have created a plugin to handle it all, but I am not sure how to handle the localization. I see there are a couple of ways of updating the localization strings; one way I have found is in the plugin's Install() method:
this.AddOrUpdatePluginLocaleResource("Plugins.Payments.PayPalStandard.Fields.AdditionalFee", "Additional fee");
This looks like it only adds new resource strings for the plugin. Is there a similar way to update the other resources via the Install() method, for example:
Admin.Catalog.Products.List.DownloadPDF
I found that there is a way to export the entire language to a language_pack.xml file. Would it be better to just create an entire language pack instead? Is there a way to add a new language pack from the plugin's Install() method?
I guess I could simply open the language_pack.xml file and add each resource found using AddOrUpdatePluginLocaleResource, but I was hoping that there was a built-in way of doing this using NopCommerce functionality.
Thanks!
As @Raphael suggested in a comment, provide a language pack along with the plugin file to the end users, and give them an option to upload the required resource file from your plugin's configuration page.
As far as I know, there is no built-in way to add a language pack on plugin installation, but you can write some code in the plugin's Install() method to find the language pack file(s) in the plugin folder and import them; I'm not quite sure of the details, but you can take the built-in import methods as a reference. A rough sketch follows.
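The sketch below assumes NopCommerce 3.x-style services; the plugin folder name, the language_pack.xml file name, and the constructor-injected _languageService/_localizationService fields are placeholders to adapt to your own plugin:
public override void Install()
{
    // look for a language pack shipped inside the plugin folder
    var path = CommonHelper.MapPath("~/Plugins/Misc.MyCustomizations/language_pack.xml");
    if (File.Exists(path))
    {
        var xml = File.ReadAllText(path);
        // import the resources into every installed language (or pick a specific one)
        foreach (var language in _languageService.GetAllLanguages(true))
            _localizationService.ImportResourcesFromXml(language, xml);
    }
    base.Install();
}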
Is there a way to generate entities from an existing database model, or do I have to create all entities with Yeoman (yo) on my own?
I've heard about such a technique from the Spring Roo project.
Please check this helper:
https://www.npmjs.com/package/generator-jhipster-db-helper
As per its description, "This JHipster module makes mapping on an existing database easier."
You can also use this tool:
https://github.com/Blackdread/sql-to-jdl
which will convert your SQL files into JDL files; then you can use JHipster's import-jdl functionality to import the database.
That will help you speed up the process of generating a JHipster application from an existing database.
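For example, assuming the conversion produced a file called my-app.jdl (a placeholder name), the import step would look something like this (on older JHipster versions the command may be yo jhipster:import-jdl instead):
jhipster import-jdl ./my-app.jdl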
No, you cannot, because the JHipster Yeoman generator "only" scaffolds the entities according to the templates plus the given parameters/choices. It does not query external sources such as databases in this step.
The generator creates all files for JPA, Angular, and the Liquibase changelogs. Finally, Liquibase creates the tables using those changelogs during startup, if they don't exist yet.
So, you can say that JHipster uses an "entity first" instead of a "table first" approach.
Although it would be a nice feature, I don't think it will be integrated into JHipster, because existing databases differ so much that it would be too difficult to handle every possibility: different choices of primary keys, different datatypes, different realizations of many-to-many relations or generalizations, and so on.
Or you can request a new feature on GitHub and maybe it will be implemented...
But, to give some direction:
I had a similar situation where I tried to migrate an existing database with about 50 tables and lots of data to JHipster (this was JHipster 1.6 or so), and I also thought of a "database refactoring" [1]. However, my "solution" was to create a new database using JHipster and then migrate the data from the old database to the new one with some SQL statements.
The main reasons:
I had another database model that differed from the one JHipster expects (e.g. I used other primary keys and references)
Neither the repositories nor the Angular stuff (which was my main reason to use JHipster) would have been generated
The Liquibase changelogs would be missing [2]
Such a special refactoring causes a lot of additional changes afterwards when you try to generate a JHipster entity against an existing table. These changes could be more time-consuming than creating new entities with JHipster.
These changes could also cause problems in future upgrades.
And yes, Roo has such a technique for reverse engineering or refactoring a database (http://docs.spring.io/spring-roo/reference/html/base-dbre.html). AFAIK, it only creates Roo-conformant entities that are based on JPA, so it also differs from the Spring Data JPA used by JHipster (the same problem as other JPA reverse-engineering tools, like [1]).
[1] I have used an Eclipse JPA plugin that can create JPA entity classes from an existing database in another Dropwizard-based project before, but I haven't tried it in combination with Spring/JHipster.
[2] It is possible to create Liquibase changelogs from an existing database: http://www.liquibase.org/documentation/generating_changelogs.html
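For reference, a hedged example of the Liquibase CLI call for that (the connection options such as --url, --username, --password, and --driver are omitted here, and the changelog file name is a placeholder):
liquibase --changeLogFile=initial-changelog.xml generateChangeLog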
Spring Roo includes the DBRE addon, a great tool for database reverse engineering that generates your domain entities automatically.
@eplog you are wrong, the DBRE lets you use the --repository option to create Spring Data JPA repositories for each entity. Take a look at http://docs.spring.io/spring-roo/docs/1.3.1.RELEASE/reference/html/base-dbre.html#d4e1765
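For reference, a rough sketch of the Roo shell command (the schema and package values are placeholders, and you should check the linked docs for the exact options your Roo version supports):
database reverse engineer --schema PUBLIC --package ~.domain --repository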
Imho, the benefits that the DBRE provides you are:
You don't need to search, test and learn 3rd party tools, Roo does it for you.
Reverse engineering is a powerful tool for migrating legacy applications in those environments in which you cannot migrate the database.
Hope it helps. Enjoy with Roo!
Yes you can!
Check out this Stack Overflow answer:
How to modify existing entity generated with jhipster?
AND, check out this video for auto-generating JPA-annotated domain objects from an existing schema using Eclipse with JBoss Tools.
After you make the Hibernate config file, you can open up the "Code Generation Tools"; on the "Exporters" tab, make sure you select the checkboxes for "Use Java 5 syntax" and "EJB3" annotations.
https://www.youtube.com/watch?v=KO_IdJbSJkI
Also, make sure your Hibernate jar(s) are the same version number as your config; in my case I was using Hibernate Spatial, and the versions were mismatched with Hibernate Core, so it didn't work for a while.
Out of the two options suggested in previous answers,
https://www.npmjs.com/package/generator-jhipster-db-helper
https://github.com/Blackdread/sql-to-jdl
the second option worked in a straightforward manner. Below are the steps and prerequisites:
Install Java and Maven (you can use SDKMAN)
Clone the repo
Modify ./src/main/resources/application.yml with your database's configuration, specifically spring -> datasource -> username, spring -> datasource -> password, and application -> database-to-export (see the sketch after these steps)
Run 'mvn'
If you want to generate the JDL with a specific file name, modify application -> export -> path as needed
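A hedged sketch of the relevant parts of application.yml; the property names follow the steps above, while the datasource url line and all values are placeholders for your own database:
spring:
  datasource:
    url: jdbc:postgresql://localhost:5432/mydb
    username: myuser
    password: mypassword
application:
  database-to-export: mydb
  export:
    path: ./my-app.jdl
Running 'mvn' should then write the JDL file to the configured path, ready to be fed to jhipster import-jdl.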
I'm developing a plugin for Redmine and have run into the question of how to implement plugin-specific settings in Redmine in the neatest way.
Is it possible to have plugin-specific settings in {redmine_home}/plugin/{my_plugin}/config/settings.yml while sharing with the core the model (in MVC terms) logic that reads the YAML file, sets attributes of the model class, provides easy access to them, etc. ({redmine_home}/app/models/setting.rb)?
I think copy-pasting or require'ing the core model in the plugin model is definitely poor design, so right now I'm leaning toward keeping the plugin-specific settings in the core config {redmine_home}/config/settings.yml; when the plugin controller needs to read a setting, it relies on the core model to do that ({redmine_home}/app/models/setting.rb).
Is this a proper design? Is there any better ways to do this?
Thanks.
I just checked 3 different plugins in our project; all used something like:
options = YAML::load( File.open(File.join(Rails.root, 'plugins/fancy_plugin/config', 'settings.yml')))
So, just copy-pasting.
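A small sketch of that same copy-and-paste approach, memoized so the plugin reads its own config/settings.yml only once; the module name, plugin directory, and keys are placeholders:
require 'yaml'

module MyPlugin
  # load the plugin's own config/settings.yml once and cache the parsed hash
  def self.settings
    @settings ||= YAML.load_file(
      Rails.root.join('plugins', 'my_plugin', 'config', 'settings.yml')
    )
  end
end
It can then be read from the plugin's controllers as MyPlugin.settings['some_key'].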
I have 1.3.2 of the database-migration plugin installed. My investigation found this in scripts/DbmDiff.groovy:
// TODO this will fail with JNDI or encryption codec
buildOtherDatabase = { String otherEnv ->
Searching the web led me to this:
https://github.com/grails-plugins/grails-database-migration/commit/ac38a7310fe48ba7b5c4dda4d6e30dd8040dbeb6
which is the code for DbmDiff.groovy and which, in spite of the same TODO comment, appears to handle JNDI.
Does this mean that a 1.3.3 release is coming soon with JNDI support? If so, I can work around it for a while using a temporary environment with URLs etc.
Regards, John
I'm not sure when 1.3.3 will be released since this is the only fix so far; it shouldn't be too soon though. Until then, you can copy the current script into your application's scripts folder. Then, when you run grails dbm-diff, Grails will ask you which of the two scripts to run.
In my Grails 1.3.7 project, I have put all of my classes in com.mycompany.myapp, as you do; this goes for services, controllers, and domain classes. I have a filter that goes in its own package. My app works fine.
However, when I run grails doc, grails decides to create two pages for every class:
one in its proper com.mycompany.myapp package that has all the right GroovyDoc
the other takes all the above classes and pretends that they also live in the default package.
So, target/docs contains two directories: 'DefaultPackage' and 'com', with DefaultPackage holding a copy of everything that lives under com/
Consequently, my GroovyDoc looks messy because there are two copies of each class.
How can I solve this?
It has been documented as a bug at GRAILS-6605. There is no workaround listed there for the bug.
I too faced the same issue, so I created a plugin, "Grails Runtime Docs" (http://grails.org/plugin/grails-runtime-docs), that solves it and generates both Java and Groovy docs properly, with only one copy per class. It's Grails-aware and categorizes the classes into Controllers, Commands, Domains, Services, and Tag Libraries. The Groovy documentation is actually generated at runtime so as to include the dynamic methods as well, adding "Dynamic Method Summary" and "Dynamic Method Detail" sections to the generated HTML docs that provide their source information. Hope you find it useful.