I am new to Grails and I have inherited an existing application. I have a big message.properties file that I would like to prune in order to remove keys that are no longer used.
In Django there is a makemessages command that goes through the whole codebase and collects all strings that need translation, adding them to the messages file and commenting out the entries that no longer exist. Is there a similar tool for Grails? If it helps, the project is based on version 1.3.9.
There is no such tool, but you can create your own Gant script. Take a look at "Getting a list of all i18n properties used in a Grails application" and process that list.
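A quick way to get a first cut is a small standalone Groovy script that compares the keys in your bundle with the literal strings found in the source tree. The paths and file extensions below are assumptions for a typical Grails 1.3 layout, and keys that are built dynamically (default validation messages, codes computed at runtime, etc.) will show up as false positives, so treat the output as candidates only:

// findUnusedKeys.groovy - run from the project root with "groovy findUnusedKeys.groovy"
def props = new Properties()
new File('grails-app/i18n/messages.properties').withInputStream { props.load(it) }

// Concatenate the text of all GSP/Groovy/Java/JS sources
def source = new StringBuilder()
['grails-app', 'src', 'web-app'].findAll { new File(it).directory }.each { dir ->
    new File(dir).eachFileRecurse { f ->
        if (f.file && f.name ==~ /.*\.(gsp|groovy|java|js)/) {
            source.append(f.text).append('\n')
        }
    }
}
def allSource = source.toString()

// A key counts as "used" only if it appears literally somewhere in the sources
props.keySet().collect { it as String }.sort().each { key ->
    if (!allSource.contains(key)) {
        println "possibly unused: $key"
    }
}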
I have a business web application made with Grails 2.5.0.
In this application, I use a GSP form to upload files and to fill a database.
In fact, I upload httpd conf files from a directory and extract information from them.
Now I want to find a way to do the same thing automatically, but I haven't found a solution.
I tried the Spring Batch plugin, but it doesn't work with Grails 2.5.0, and the other batch plugins seem too old to work.
Another option might be to import the files at startup, but I don't know whether you can read files from a directory at startup (e.g. from /var/myapp/conf).
If you want to execute something when the application starts, add it to BootStrap.groovy.
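For example, here is a minimal sketch of grails-app/conf/BootStrap.groovy; the directory path and the importService/importConfFile names are assumptions, so wire in whatever service your upload form already uses:

class BootStrap {

    // hypothetical service that parses a single httpd conf file and fills the database
    def importService

    def init = { servletContext ->
        // read every .conf file from a fixed directory at startup (the path is an assumption)
        def confDir = new File('/var/myapp/conf')
        if (confDir.directory) {
            confDir.eachFileMatch(~/.*\.conf/) { File f ->
                importService.importConfFile(f)
            }
        }
    }

    def destroy = { }
}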
Is there a way to generate entities from an existing database model or do I have to create all entities with yeoman (yo) on my own?
I've heard about such a technique from the Spring Roo project.
Please check this helper
https://www.npmjs.com/package/generator-jhipster-db-helper
As per its description, "This JHipster module makes mapping on an existing database easier."
You can also use this tool:
https://github.com/Blackdread/sql-to-jdl
which will convert your SQL files into JDL files; you can then use JHipster's import-jdl functionality to import the database.
That will help you speed up the process of generating a JHipster application from an existing database.
No, you cannot, because the JHipster Yeoman generator "only" scaffolds the entities according to the templates plus the given parameters/choices. It does not query external sources such as databases in this step.
The generator creates all the files for JPA, Angular and the Liquibase changelogs. Finally, Liquibase creates the tables from the changelogs during startup if they don't exist yet.
So you could say that JHipster uses an "entity first" rather than a "table first" approach.
Although it would be a nice feature, I don't think it will be integrated into JHipster, because existing databases are so different that it would be too difficult to handle each possibility. There are different choices of primary keys, different datatypes, different realizations of many-to-many relations or generalizations, and so on.
Or you can request a new feature on GitHub and maybe it will be implemented...
But to give some directions:
I also had a similar situation where I tried to migrate an existing database with about 50 tables and lots of data to JHipster (this was JHipster 1.6 or so), and I also thought of a "database refactoring" [1]. However, my "solution" was to create a new database using JHipster and then migrate the data from the old database to the new one using some SQL statements.
The main reasons:
I had a database model that differed from the model JHipster expects (e.g. I used different primary keys and references)
No repositories or Angular artifacts (which were my main reason to use JHipster) are generated
The Liquibase changelogs are missing [2]
Such a special refactoring causes a lot of additional changes afterwards when you try to generate a JHipster entity against an existing table. These changes could be more time-consuming than creating new entities with JHipster.
These changes could also cause problems in future upgrades.
And yes, Roo has such a technique for reverse engineering or refactoring a database (http://docs.spring.io/spring-roo/reference/html/base-dbre.html). AFAIK, it only creates Roo-conformant entities that are based on JPA. So it also differs from the Spring Data JPA used by JHipster (the same problem as other JPA reverse-engineering tools like [1]).
[1] I have used an Eclipse JPA plugin that can create JPA entity classes from an existing database in another Dropwizard-based project before, but I haven't tried it in combination with Spring/JHipster.
[2] It is possible to create Liquibase changelogs from an existing database: http://www.liquibase.org/documentation/generating_changelogs.html
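For completeness, generating such a changelog is a single command with the Liquibase CLI; the connection details below are placeholders:

liquibase --driver=com.mysql.jdbc.Driver --url=jdbc:mysql://localhost/mydb --username=dbuser --password=secret --changeLogFile=generated-changelog.xml generateChangeLog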
Spring Roo includes the DBRE addon, a great tool for database reverse engineering that generates your domain entities automatically.
@eplog, you are wrong: the DBRE lets you use the --repository option to create Spring Data JPA repositories for each entity. Take a look at http://docs.spring.io/spring-roo/docs/1.3.1.RELEASE/reference/html/base-dbre.html#d4e1765
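For reference, a typical DBRE run from the Roo shell looks roughly like this (the schema and the package are placeholders; see the linked documentation for the full option list, including --repository):

database reverse engineer --schema myschema --package ~.domain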
IMHO, the benefits that the DBRE provides are:
You don't need to search for, test, and learn third-party tools; Roo does it for you.
Reverse engineering is a powerful tool for migrating legacy applications in environments where you cannot migrate the database.
Hope it helps. Enjoy Roo!
Yes you can!
Check out this Stack Overflow answer:
How to modify existing entity generated with jhipster?
AND check out this video on auto-generating JPA-annotated domain objects from an existing schema using Eclipse with JBoss Tools.
After you create the Hibernate config file, you can open up the "Code Generation Tools"; on the "Exporters" tab, make sure you select the checkboxes for "Use Java 5 syntax" and "EJB3 annotations".
https://www.youtube.com/watch?v=KO_IdJbSJkI
Also, make sure your Hibernate jar(s) have the same version number as your config. In my case I was using Hibernate Spatial, and its version was mismatched with Hibernate Core, so it didn't work for a while.
Out of the two options suggested in previous answers
https://www.npmjs.com/package/generator-jhipster-db-helper
https://github.com/Blackdread/sql-to-jdl
the second option worked in a straightforward manner. Below are the steps and prerequisites:
Install Java and Maven (you can use SDKMAN).
Clone the repo.
Modify ./src/main/resources/application.yml with your database's configuration, specifically spring -> datasource -> username, spring -> datasource -> password and application -> database-to-export.
Run 'mvn'.
If you want to generate the JDL with a specific file name, modify application -> export -> path as needed.
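The generated JDL file can then be pulled into your JHipster application with the standard import command (the file name below is just a placeholder for whatever you configured as the export path):

jhipster import-jdl exported-model.jdl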
This is part of an ongoing project... splitting out domain objects so they can be consumed by multiple applications. The database migration files for the domain objects live with the plugin... but we want the apps to be able to reference them during a dbm-update.
I can get the application to recognize the plugin changelog, but after that, the changelog does not perform includes and process them as I expect.
Using GrailsPluginUtils I am able to get the path of the plugin and the plugin changelog, with which I do an include file. If I put the changeSet right in that file I am good, it runs. If I move it to a separate file in the same folder, or in a sub-folder, and reference it via "./someFile.groovy" it seems to FIND it but does not process it. I say it seems to find it because if I do NOT use a relative file path, the migration process throws an error saying it cannot find the file e.g., "someFile.groovy"...
I have workarounds, but they are not acceptable because we want to control the order in which the DB migrations occur by using sub-directories with a _changelog.groovy that then includes the actual transformations (changeSets). But they are not being included.
If I use includeAll, it will grab any and all scripts in that one folder, but again, does not process any other includes referenced therein. I can write a script to scan the folders recursively but again, that requires a lot of coding to parse the _changelogs and grab the appropriate inclusion order, etc.
I really just want "include file:" to work as it does in a given application for its own changelog files.
Has anyone else done this? Am I missing something terribly obvious?
In the app...
databaseChangeLog {
...
include file: "${GrailsPluginUtils.pluginInfos.find { it.name == 'my-plugin' }.pluginDir}/grails-app/migrations/my_plugin_changes"
}
... in the plugin...
databaseChangeLog {
include file: "./someChangeLogChangeSet.groovy"
}
Thank you...
We had also tried adding changelog files to the plugin where our domains live but were unable to access them from the main app. However, if you want to access your files from another location, you could specify this property in your config file and give the folder name there, or even copy all the migrations to the appropriate location in the main app:
grails.plugin.databasemigration.changelogLocation = 'migrations'
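For example, once the plugin's changelogs have been copied into the main app's migrations folder, the parent changelog can include them directly; the file names here are hypothetical:

databaseChangeLog = {
    // regular application changelogs
    include file: 'initial-schema.groovy'
    // changelogs copied over from the plugin, in the order they should run
    include file: 'my_plugin_changes/someChangeLogChangeSet.groovy'
}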
If you do find an actual solution to this, please post it.
Is there a way to override at runtime the value of a property defined in a message bundle?
My grails application contains a property in the messages.properties file:
page1.para1.text=Some text to display to the user
My Config.groovy defines the following config location:
grails.config.locations = [ "file:${userHome}/.myApp/myApp-config.properties" ]
I currently use this approach to override Config.groovy properties (like db connections, etc.), but it doesn't seem to apply to message bundle properties.
I was hoping/expecting to just make sure that the myApp-config.properties file contains my new property value, restart the Tomcat server where my app is deployed and it would get picked up and displayed on my page:
page1.para1.text=Some DIFFERENT text to display to the user
The Grails docs on internationalization/message bundles (Grails i18n) don't say whether this is possible or not.
Obviously, I'm trying to achieve this change without the need to recompile and redeliver my Grails application.
Any ideas?
Thanks in advance.
When you are already live and don't want to create a new .war file:
I'm not sure, but the .war file can be found unzipped on the server. You might try to replace the message files directly on the server, though a restart of the app might be necessary. But I wouldn't advise doing so.
If you need to change the message bundles at runtime often, I guess it would make sense to store them in the database. But that means you have to change your code a little bit and redeploy it once. There is a blog entry which describes how to do it: http://graemerocher.blogspot.de/2010/04/reading-i18n-messages-from-database.html
Another SO question handles the case that you want to store changes to the messages in a DB but fall back to the files:
Grails i18n From Database but Default Back To File
hth
In theory you should be able to replace the messageSource bean with a ReloadableResourceBundleMessageSource inside resources.groovy. That way you can not only point it at a new location but also declare how often the cached values should be invalidated.
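A minimal sketch of that idea in grails-app/conf/spring/resources.groovy; the external path is an assumption, and note that overriding messageSource like this replaces the plugin-aware message source Grails registers by default:

import org.springframework.context.support.ReloadableResourceBundleMessageSource

beans = {
    messageSource(ReloadableResourceBundleMessageSource) {
        // external bundle first (editable at runtime), packaged bundles as a fallback;
        // the packaged location may vary by Grails version
        basenames = ["file:${System.getProperty('user.home')}/.myApp/messages".toString(),
                     'WEB-INF/grails-app/i18n/messages'] as String[]
        defaultEncoding = 'UTF-8'
        cacheSeconds = 60  // re-check the files at most once a minute
    }
}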
If I create a Grails app called a-b-c-d, doing a grails create-domain-class User will result in Grails creating a class User in the sub-directory grails-app/domain/a/b/c/d, giving it the package a.b.c.d. How do I prevent Grails from creating these package names?
You should definitely use packages, but you can customize the default package by changing the value of grails.project.groupId in Config.groovy. The default value is appName, which is your application name, but you can change it to any valid package, e.g. 'com.foo.bar'.
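For example, in grails-app/conf/Config.groovy (the package name is just an illustration):

grails.project.groupId = 'com.foo.bar'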
In addition you can specify the package when running a create script, and if you do want to create classes in the default package, you can use this syntax:
grails create-service .Person
and it won't use a package.
I have no idea; this sounds like a bug. There are two obvious workarounds:
change your app to have a name without dashes
don't use the Grails commands create-domain-class, create-controller, etc. I never use these commands because they don't actually do anything other than creating the class (and a corresponding empty test class). Personally, I find it easier just to create the class myself than to run the Grails command; a minimal example follows below.
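For example, a hand-written domain class (the package and field are placeholders) is really all the command produces for you, apart from an empty test class:

// grails-app/domain/com/example/User.groovy
package com.example

class User {
    String username

    static constraints = {
        username blank: false, unique: true
    }
}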
You can specify the package when you call the CLI command...
grails create-domain-class your.package.name.User