I'm developing a Grails 2.0.x application that of course has several external dependencies. Since I'm sitting behind a corporate firewall, I've configured my ProxySettings.groovy to allow access to the internet, which works as it should.
Now we also need to include dependencies on some local artifacts (from other projects), which can be found in our local Maven repository. Our corporate network setup uses the proxy only for external sites, not internal ones. So when Grails resolves my dependencies at startup, it downloads all the external artifacts fine but fails when trying to fetch our local dependencies. If I completely remove the contents of my ProxySettings.groovy file, then the opposite is true: Grails can't resolve the external dependencies, but does manage to download the JARs from our local Maven repository.
I've tried to find documentation on how to exclude internal sites from the proxy settings in Grails, but have failed miserably so far.
One other alternative could perhaps be to remove (or change) the proxy settings programmatically in BuildConfig.groovy before the call to mavenRepo?
Currently we are not using Maven to build our Grails projects (since we have previously had some issues with creating release builds on the build server).
Any help would be much appreciated!
Right now I do not think there is an easy way around this.
There is currently an open bug for being able to switch the proxy settings programmatically:
http://jira.grails.org/browse/GRAILS-7658
Another option would be to move the internal dependencies inside your Grails project.
Or you could just set everything in BuildConfig.groovy:
System.properties.putAll([
"http.proxyHost": "myproxy.hostname.com",
"http.proxyPort": "8080",
"http.proxyUserName": "myUser",
"http.proxyPassword": "myPass"
])
Clear those properties out again before resolving the internal dependencies, and then you might be good.
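A sketch of how those pieces could fit together in BuildConfig.groovy follows. The hostnames and the internal repository URL are placeholders. Note also that http.nonProxyHosts is the standard JVM property for excluding hosts from the proxy, which may avoid the need to clear anything; whether the dependency resolver honors it depends on the HTTP client Grails uses, so treat this as something to try, not a guaranteed fix:

```groovy
// BuildConfig.groovy -- sketch only; hostnames and URLs are placeholders
System.properties.putAll([
    "http.proxyHost": "myproxy.hostname.com",
    "http.proxyPort": "8080",
    // standard JVM escape hatch: hosts matching these patterns bypass the proxy
    "http.nonProxyHosts": "*.internal.example.com|localhost"
])

grails.project.dependency.resolution = {
    repositories {
        grailsCentral()
        mavenCentral()   // external, resolved through the proxy

        // internal repository; should bypass the proxy via http.nonProxyHosts,
        // or clear the proxy properties here before this call as described above
        mavenRepo "http://maven.internal.example.com/repository"
    }
}
```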
Good afternoon,
As I understand Jenkins, if I need to install a plugin, it goes out to the Jenkins Plugins site to download it.
The problem I have is Jenkins is installed on a closed network, it cannot access the internet. Is there a way I can download all of the plugins, place them on a web server on my local LAN, and have Jenkins reach out and download plugins as necessary? I could download everything and install one plugin at a time, but that seems a little tedious.
You could follow some or all of the instructions for setting up an Artifactory mirror for the plugin repo.
It will need to be an HTTP/HTTPS server, and you will find that many plugins have a multitude of dependencies.
The closed network problem:
You can take a cue from the Jenkins Docker install-plugins.sh approach ...
This script takes as input a list of plugins, and optionally versions (eg: $0 workflow-aggregator:2.6 pipeline-maven:3.6.5 job-dsl:1.70) and will download all the plugins and dependencies into a working directory.
Our approach is to create a file (under version control) and redirect that to the command-line input (i.e. install-plugins.sh $(< plugins.lst)).
You can download from where you do have internet access and then place on your network, manually copying them to your ${JENKINS_HOME}/plugins directory and restart the instance.
The tedious list problem:
If you only specify top-level plugins (i.e. what you need), every time you run the script it will resolve the latest dependencies. That makes for a short list, but the dependency versions drift whenever they are updated at https://updates.jenkins.io. You can use a two-step approach to address this: use the short list to download the required plugins and dependencies, then store the generated explicit list for future reference and repeatability.
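The pinned-list idea can be sketched as a small shell script that turns a plugins.lst of name:version pairs into download commands to run on a machine with internet access, after which the .hpi files can be carried across and dropped into ${JENKINS_HOME}/plugins. The list contents and file names here are illustrative; the URL layout follows the public updates.jenkins.io download scheme:

```shell
# Sketch: generate download commands from a version-pinned plugin list.
# Run the generated fetch-plugins.sh on an internet-connected machine,
# then copy the resulting .hpi files to ${JENKINS_HOME}/plugins offline.
printf '%s\n' 'workflow-aggregator:2.6' 'job-dsl:1.70' > plugins.lst

while IFS=: read -r name version; do
  echo "curl -fsSL -o ${name}.hpi https://updates.jenkins.io/download/plugins/${name}/${version}/${name}.hpi"
done < plugins.lst > fetch-plugins.sh

cat fetch-plugins.sh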
When calling the 'grails' command for the first time in the console, it tries to resolve additional dependencies via the internet. However, due to our corporate firewall, we can only access those via an internal Artifactory repository that is supposed to act as a mirror. The repository is protected by username and password.
According to the Grails Documentation you can force Grails in the %GRAILS_HOME%\settings.groovy file to look for dependencies at a specific URL. However at the moment it is not possible to add credentials to that (see: https://github.com/grails/grails-core/issues/10013).
Is there any other way to automatically resolve all initial Grails dependencies with an internal artifact repository and credentials?
Note: I'm talking about the general Grails level, not the grails.project level
Update: This issue has been fixed with Grails 3.2; It now works like a charm. No more workarounds needed.
I am currently defining the project structure for a project that I am working on. The project is a simple SOA implementation and as such has a grails app and a number of different services.
I wanted to package these services into separate modules (jars) so that they can easily be deployed separately and there is no risk of cross-contamination of classes.
The project structure and dependencies could be visualised as:
Grails App (war)
|__ Service Gateway (jar)
|__ Service A (jar)
|__ Service B (jar)
Whilst these services will eventually be deployed separately, for ease of local development I want to package them into a single Grails app until such time as it is necessary to break them apart.
My ultimate goal was to be able to develop these services in the same way I would a simple grails app in that I would be able to change any class (within any of the modules) on the fly and have it picked up.
I am struggling though to see the best way in IntelliJ to represent this structure.
I had created separate modules for each of the above and added the dependencies between them, but obviously Grails has no idea of this at runtime.
I have read about and found the following possible solutions, all of which currently feel a bit unsatisfactory, as they would require a jar to be built, meaning that classes cannot be reloaded on the fly.
Install the modules into the local Maven repository and reference this in the Grails build dependencies.
Drop the built jars into the lib directory.
Add them as grails plugins (seems a little heavy handed as they won't require grails functionality).
Find some other way of including the output directories for these modules on the grails classpath (not sure of the cleanest way to do this).
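For the first option above, the Grails side might look like the following BuildConfig.groovy sketch. The group and artifact ids are hypothetical placeholders for the service modules, and it assumes each module has been installed into the local Maven repository (e.g. via mvn install):

```groovy
// BuildConfig.groovy -- sketch for the local-Maven-repository option;
// the coordinates below are hypothetical placeholders
grails.project.dependency.resolution = {
    repositories {
        mavenLocal()   // picks up jars installed into ~/.m2/repository
    }
    dependencies {
        compile "com.example:service-gateway:1.0-SNAPSHOT"
        compile "com.example:service-a:1.0-SNAPSHOT"
        compile "com.example:service-b:1.0-SNAPSHOT"
    }
}
```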
Thanks!
In the end, I went with a multi-module Maven build. The key to on-the-fly code deployment is using JRebel to monitor the output directories and reload the classes when they change.
We are using maven in the development process. Maven provides a nice feature of configuring the repositories. Using this feature I have created a remote internal repository and I can download the dependencies from that repository.
The development machines are pointing to this remote internal repository. Each development machine has its own local repository (~/.m2/repository/), and hence the dependencies of the project are downloaded from the remote internal repository to the local repository (~/.m2/repository/) on each developer machine.
Is there any way that the local repository (~/.m2/repository/) on developer machines can be pointed directly at the internal remote repository that we have created and use it for downloading the dependencies?
If you take a look at Maven's Introduction to Repositories, the first paragraph says:
There are strictly only two types of repositories: local and remote.
There is no way to change this behavior.
Handling it differently would cause many problems. E.g. builds would take much longer because every file would be downloaded over the network, the IDE would not work properly (project dependencies would not be stored locally), and so on.
May I suggest another approach to sharing dependencies and artifacts. In our projects we use Nexus as a proxy and repository for our artifacts. It works well with no issues. A basic configuration I already posted here.
After Nexus is running, you could also set up continuous integration using Jenkins and enjoy a fully automated environment.
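A minimal client-side settings.xml that routes all downloads through such a Nexus instance might look like this sketch; the URL is a placeholder for your internal server:

```xml
<!-- ~/.m2/settings.xml -- sketch; the Nexus URL is a placeholder -->
<settings>
  <mirrors>
    <mirror>
      <id>internal-nexus</id>
      <!-- mirror every repository through the internal Nexus -->
      <mirrorOf>*</mirrorOf>
      <url>http://nexus.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>
```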
Is your requirement to avoid each developer from having to download all dependencies to his local repository?
Assuming your remote internal repository has the same format as a maven local repository, you can achieve this by adding the following line in the settings.xml of all your developers.
<localRepository>shared-drive-location-of-remote-repository</localRepository>
I have a WAR file that was built using grails framework and OSGi plugin. When I try to deploy it in Eclipse Virgo it fails because the related dependencies are not yet deployed.
Is there a way to tell Virgo to fetch the dependency bundles from the SpringSource EBR repository, for example?
You can configure Virgo to use a remote repository to provide the missing dependencies automatically. However, there is an issue in using the SpringSource EBR as a remote repository because Spring framework, which is deployed during Virgo startup, has a very large number of optional/transitive dependencies. When these are pulled in from the EBR they include normally exclusive alternatives and the net effect is that Virgo startup fails.
So the recommendation is either to put the dependencies you want locally in repository/usr or, if you need a remote repository, to set up your own Virgo instance to act as a repository server and put the dependencies in the repository hosted by the repository server.
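For the local option, Virgo's repository chain is configured in configuration/org.eclipse.virgo.repository.properties. The following is only a sketch along the lines of the documented watched/remote repository types; the property names and the hosted-repository URI should be verified against the documentation for your Virgo version:

```
# configuration/org.eclipse.virgo.repository.properties -- sketch only;
# verify names and URI format against your Virgo version's documentation
usr.type=watched
usr.watchDirectory=repository/usr

# remote repository hosted by another Virgo instance (URI is a placeholder)
remote.type=remote
remote.uri=http://repo-host.example.com:8080/org.eclipse.virgo.apps.repository/hosted

chain=ext,usr,remote
```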