Are there any Ant tasks out there for listing running processes and optionally killing them, something that is not platform dependent (i.e. wraps both tasklist and ps)?
I need to change some plugin config files before plugins are loaded. I looked into the init.groovy.d, however it seems to run Groovy scripts in that directory after plugins have been loaded and therefore would require a restart to apply. Is there a way to run Groovy scripts before Jenkins loads the plugins?
What you are requesting is not necessary. Generally, newly added plugins come unconfigured: Jenkins starts, loads the plugins, and then you configure them via init.groovy, CasC, etc., much as you would via the GUI (add, restart, configure).
We start with the war file, a wrapper, init.groovy.d, plus a variant of Docker's install_plugins.sh. Other than the war, everything (the wrapper and wrapper.conf, install_plugins.sh and the plugin list, and all the init scripts) is controlled in a git repo, which we pull down. We dump the plugins into the plugins dir, then launch jenkins.sh.
The init.groovy scripts run automatically after initialization and configure all system, global, tool, and plugin values, as well as credentials, and also create and configure nodes.
NB: it's best to use one init script per section or plugin, as an error in any init script fails quietly, effectively skipping the rest of that script.
You may need to call .save() after setting most parameters via init.groovy. Perhaps that's why you did not see the changes.
If you were really paranoid, you could first invoke Hudson.instance.doQuietDown(), which effectively blocks the queue (multiple init.groovy scripts execute in lexical order), do all the configuration, then invoke doCancelQuietDown(), but we've had no issues without that.
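As a minimal sketch of the configure-then-save pattern (it runs inside Jenkins, so it needs the Jenkins runtime; the location configuration shown here is just an illustration, and the URL and address values are placeholders):

```groovy
// init.groovy.d/10-configure-location.groovy
import jenkins.model.JenkinsLocationConfiguration

def loc = JenkinsLocationConfiguration.get()
loc.url = 'https://jenkins.example.com/'  // placeholder value
loc.adminAddress = 'admin@example.com'    // placeholder value
loc.save()  // without save(), the change may not persist
```

The same shape (get the descriptor or configuration object, set properties, call save()) applies to most plugin and global settings.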
This approach (init.groovy.d) works fine, but we're looking to switch to JCasC now that it's matured. CasC is simpler to manage (again, using separate config files for each plugin) and to read.
I want to unleash the power of parallelism by running some Ivy-related Ant tasks on local Bamboo agents. Our Bamboo machine has plenty of CPU horsepower and RAM.
If I split my build task into parallel jobs, each of which produces a different artifact based on the result of ivy-retrieve, I have solved my problem in theory.
In practice, unfortunately, if two Ant tasks happen, for some reason, to run simultaneously on the same machine against the same organization-artifact, they clash and one gets an XML error.
I don't have the exact error message with me because 1) the problem is random and hard to reproduce and 2) I have already done lots of work to merge all the jobs into a sequential one. But I have a clear idea of what is happening.
When Ant runs an ivy-retrieve (code below), it uses the local user's cache directory, which happens to be /home/bamboo/.ivy2/cache. There I can find lots of resolved-[org]-[artifact]-[version].xml files (one per build version of my project). The problem occurs when I run the ivy-retrieve task twice, for example once for the compile configuration and once for runtime. The two XMLs clash and Ivy reports a SAX error while reading one of the files, because it is apparently being written at that very moment.
If I run the job on remote agents I expect no problem, but hey, I already have 5 local agents and Bamboo won't fire up remote agents while the local ones are free.
Unfortunately all of my jobs, while independent of each other, require a different ivy-retrieve. Currently I run them in sequence.
The question is
Is it possible to tell Ivy, running on a Bamboo agent, to use a temporary, unique cache directory for its work on dependencies.xml files rather than the global cache? Or, failing that, to synchronize access to the files?
The second option would have parallel Ant processes read and write the cached dependencies.xml file mutually exclusively, so what they read would always be a consistent file (it being the same exact file, I don't care if one process overwrites another's output).
Ivy has two caches: the repository cache and the resolution cache. The latter is overwritten on each resolution and should never be used by multiple processes at the same time.
1. Set an environment variable pointing to a temporary directory on your Bamboo agent.
2. Create a separate ivysettings.xml file for your project.
3. Use the environment variable in the project's ivysettings.xml to set up the resolution cache directory.
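Step 1 can be as simple as exporting a throwaway directory before the build runs (TEMP_RESOLUTION_CACHE matches the variable name used in the ivysettings.xml example; pick any name you like):

```shell
# Create a per-build temporary resolution cache and export it
# so that Ivy's ${env.TEMP_RESOLUTION_CACHE} resolves to it.
TEMP_RESOLUTION_CACHE="$(mktemp -d)"
export TEMP_RESOLUTION_CACHE
```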
Here is an example of ivysettings.xml:
<ivysettings>
    <properties environment="env" />
    <caches resolutionCacheDir="${env.TEMP_RESOLUTION_CACHE}" />
    <settings defaultResolver="local" />
    <statuses default="development">
        <status name="release" integration="false"/>
        <status name="integration" integration="true"/>
        <status name="development" integration="true"/>
    </statuses>
    ...
</ivysettings>
Or you can give a lock strategy a try; I haven't tried it myself.
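For the lock-strategy route, Ivy ships a built-in "artifact-lock" strategy that can be named on the caches element. A sketch (check this against your Ivy version's settings documentation before relying on it):

```xml
<ivysettings>
    <properties environment="env" />
    <!-- "artifact-lock" uses lock files so that concurrent processes
         sharing one cache wait for each other instead of clashing -->
    <caches lockStrategy="artifact-lock" />
</ivysettings>
```

Note that locking mainly guards the shared repository cache; the per-build resolution cache directory shown above is the safer fix for the resolved-*.xml clashes.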
I am working on a project and saw the following configuration with a comment in a properties file.
# Forking just invokes the JVM externally, and doesn't exhibit any performance benefit.
javac.fork.mode=no
I am curious about what this means.
After several google searches, I still can't find a specific article about this. Could someone point me to a good resource?
It's an option of Ant's <javac> task. When fork=true, Ant runs the Java compiler in its own JVM.
http://ant.apache.org/manual/Tasks/javac.html
Forking lets javac run as an external process in its own heap space, which confines any memory leak to that external process without affecting the parent Java process.
Check the thread below, which covers other options to use along with 'fork' when using the 'javac' Ant task.
How to tell ant to build using a specific javac executable?
I need to fork a new process for a specific Ant task. I don't see a fork attribute in the taskdef; how do I do it?
I should be clearer: I am not talking about executing Ant itself in a forked process.
I have an Ant task X which I need to run in a forked process. It's some third-party task which I declare with taskdef X and then use this way.
Is there any way to tell Ant that any time I use that task, it should fork a process and run it there?
See Running Ant via Java in the Ant manual.
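One way to apply that advice from inside a build file is to wrap the third-party task in its own target and run that target through <java fork="true"> on Ant's launcher class. A sketch under assumptions: the task name x, its class com.example.XTask, and the target names are all made up for illustration:

```xml
<!-- inner target that actually uses the third-party task -->
<target name="run-x">
    <taskdef name="x" classname="com.example.XTask" classpath="x.jar"/>
    <x/>
</target>

<!-- outer target: runs "run-x" in a freshly forked Ant JVM -->
<target name="run-x-forked">
    <java classname="org.apache.tools.ant.launch.Launcher"
          fork="true" failonerror="true">
        <!-- reuse the current Ant JVM's classpath so the launcher is found -->
        <classpath path="${java.class.path}"/>
        <arg value="run-x"/>
    </java>
</target>
```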
We use TFS 2010 and Automated builds.
We also make use of MSTests.
I would like some concrete information about the build server's test execution method.
Will the test engine (on build server) run the unit tests sequentially or in parallel?
By default it will run them sequentially. You can customize the build workflow by adding a Parallel activity and running different sets of tests in each. Or if you want to parallelize the test run across multiple build machines you can have the build use multiple RunOnAgent activities to do so (http://blogs.msdn.com/b/jimlamb/archive/2010/09/14/parallelized-builds-with-tfs2010.aspx).
Note: If you execute the tests across multiple test runs you will end up with multiple test reports (.trx files) that will not be merged together without further customization of the build.
@Dylan Smith's answer is correct, but does not cover option #3:
Executing Unit Tests in parallel on a multi-CPU/core machine
DANGER WILL ROBINSON: this is only applicable to VS2010 and mstest.exe. VS2012 has a new test runner that does not support parallel test execution (see Visual Studio UserVoice: "Run unit tests in parallel"). The VS2012 test system can use the legacy test runner, and you can make this work if you specify a .testsettings file using the MSTest/SettingsFile element (see "Configuring Unit Tests by using a .runsettings File").
How to: Enable parallel test execution
Ensure you have a multi-core/CPU machine
Ensure you are running only unit tests
Ensure your tests are thread-safe
Ensure you do not have any data adapters turned on
Ensure you are running locally (cannot use TestController/TestAgent)
Modify your test settings file.
Right-click the test setting file and select "Open With" -> Open as Xml
Set the parallelTestCount attribute on the Execution element
Options are:
blank = 1 CPU/Core - this is the default
0 = Auto configure: runs as many tests in parallel as your CPU and core count allow
n = The number of tests to run in parallel
Save your settings file
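After editing, the relevant part of the .testsettings XML looks roughly like this (trimmed to the Execution element; the settings name is a placeholder, and the namespace is the one VS2010 generates):

```xml
<TestSettings name="Local"
    xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <!-- parallelTestCount="0" lets the runner pick the degree of
       parallelism from the machine's core count -->
  <Execution parallelTestCount="0">
  </Execution>
</TestSettings>
```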