At my organization, IT will not permit me to download the APOC jar library. My next option is to download the source and compile it myself. The rub there is that Gradle is not an approved tool, nor do I have Internet access.
Is there a manual process for me to compile the source using Java? What are the other dependencies?
You could use a service like https://travis-ci.org for your own build. APOC uses https://travis-ci.org/neo4j-contrib/neo4j-apoc-procedures for automating tests after a git push (or for testing PRs).
In the same way you should be able to run Travis on your own, perhaps on a forked version of APOC that uses gradle shadowJar. Additionally, you need to tweak the Travis config to deploy your artifact to an appropriate location - see https://docs.travis-ci.com/user/deployment/ for details.
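Whether it runs on Travis or on any other machine with internet access, the build itself boils down to something like the following sketch (the Gradle task comes from the shadowJar suggestion above; the exact output path may vary between APOC versions):
git clone https://github.com/neo4j-contrib/neo4j-apoc-procedures.git
cd neo4j-apoc-procedures
./gradlew shadowJar   # bundles APOC and its dependencies into a single jar
ls build/libs/        # the *-all.jar here is the artifact to deploy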
On a different note: if my organization would not permit me to download jar artifacts, nor provide internet access to run builds, my immediate next action would be updating my CV - just sayin'.
Related
I have a Jenkins job that gets the code from version control and builds it (like a normal pipeline does). After building the project, I download the build and use FTP to transfer it to the client's server, where I unzip it and copy the whole build across. Because I copy the whole build, my application's downtime is very high. (I have to use FTP because, as a service provider, we have some limitations and can't change this policy.)
What I want is for Jenkins to know what changed while building, so it can create a package containing only the changes, with the correct path for each file. I could then download that package, copy it to the server, and apply it, so that only what was changed gets updated.
Is that possible? Is there any plugin that I can use?
This really depends on the build tool/language you are using to build your application. I don't think there is a generic Jenkins plugin for this.
Another idea would be to upload your package to a local Nexus server, download it after the next build, and compare the files from the old and new builds. With this information you can create a patch package for your client's server.
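As a rough illustration of the compare step, a checksum-based diff between the two unpacked builds can produce the patch set (directory names here are hypothetical, and this sketch ignores file deletions):
# dry run: list files in the new build whose contents differ from the old build
rsync -rcn --out-format='%n' new-build/ old-build/ > changed-files.txt
# copy only those files, preserving their relative paths, into a patch directory
rsync -rc --files-from=changed-files.txt new-build/ patch/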
Good afternoon,
As I understand Jenkins, if I need to install a plugin, Jenkins fetches it from the Jenkins Plugins site.
The problem I have is Jenkins is installed on a closed network, it cannot access the internet. Is there a way I can download all of the plugins, place them on a web server on my local LAN, and have Jenkins reach out and download plugins as necessary? I could download everything and install one plugin at a time, but that seems a little tedious.
You could follow some or all of the instructions for setting up an artifactory mirror for the plugin repo.
It will need to be an HTTP/HTTPS server, and you will find that many plugins have a multitude of dependencies.
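Once the mirror is up, Jenkins has to be told to use it. One way - a sketch assuming a default installation and a hypothetical mirror URL - is to edit the update-center definition and restart Jenkins:
# ${JENKINS_HOME}/hudson.model.UpdateCenter.xml normally points at https://updates.jenkins.io/update-center.json
cat > ${JENKINS_HOME}/hudson.model.UpdateCenter.xml <<'EOF'
<?xml version='1.0' encoding='UTF-8'?>
<sites>
  <site>
    <id>default</id>
    <url>http://mirror.example.lan/update-center.json</url>
  </site>
</sites>
EOF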
The closed network problem:
You can take a cue from the Jenkins Docker install-plugins.sh approach ...
This script takes as input a list of plugins, and optionally versions (eg: $0 workflow-aggregator:2.6 pipeline-maven:3.6.5 job-dsl:1.70) and will download all the plugins and dependencies into a working directory.
Our approach is to create a file (under version control) and redirect that to the command line input (ie: install-plugins.sh $(< plugins.lst)).
You can download the plugins from somewhere you do have internet access, place them on your network, manually copy them to your ${JENKINS_HOME}/plugins directory, and restart the instance.
The tedious list problem:
If you only specify top-level plugins (ie: what you need), every time you run the script it will resolve the latest dependencies. That makes for a short list, but the dependencies shift whenever they get updated at https://updates.jenkins.io. You can use a two-step approach to address this: use the short list to download the required plugins and dependencies, then store the generated explicit list for future reference or repeatability.
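A sketch of that two-step approach, reusing the example plugins from above (the file names are ours; the script's output names each plugin version it resolves):
# plugins.lst: short list of top-level plugins, kept under version control, e.g.
#   workflow-aggregator:2.6
#   pipeline-maven:3.6.5
# resolve the short list plus all current dependencies, keeping the log
./install-plugins.sh $(< plugins.lst) | tee resolved.log
# store resolved.log alongside plugins.lst so the exact same set can be reviewed or fetched again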
I'm developing an open source project containing a number of optimization tools. I've uploaded the project to github and I would like to automatically run the test suite every time someone submits a pull request. To this end I was planning on using travis-ci. The problem is that the test suite depends on a 3rd-party solver (IBM CPLEX).
To run the test suite locally on my computer, I would do the following:
Download and install the IBM CPLEX solver
Install cplex.jar in my local maven repository: mvn install:install-file -DgroupId=cplex -DartifactId=cplex -Dversion=12.6.1 -Dpackaging=jar -Dfile=/opt/ILOG/CPLEX_Studio1261/cplex/lib/cplex.jar
Set my LD_LIBRARY_PATH variable to point to the solver's native libraries: export LD_LIBRARY_PATH=/opt/ILOG/CPLEX_Studio1261/cplex/bin/x86-64_linux/:$LD_LIBRARY_PATH
Compile/run the test suite.
Problems:
Cplex is not open source; I don't want to upload it to my github repository. In addition, its unpacked size is quite big (1GB).
Is there a way to upload the necessary solver files to travis-ci without making them publicly available? This stack overflow question describes how I could get my cplex.jar into travis, but as far as I can tell I would have to put the jar on some webserver and add a clearly readable link to it in the .travis.yml file.
Even if I manage to get cplex.jar into travis, how do I get the native libraries there as well? Their size is quite big, so it would be undesirable for travis to download them every time it performs a build. Furthermore, I don't want to make these libraries available to anyone but the travis test system.
If it turns out that the above is not possible: is there another CI system, perhaps one that I can run on a private server, that could do this and run whenever a pull request is submitted through github?
You may want to look at Travis file encryption. You would still need to add the (albeit encrypted) cplex.jar to your git repository, but at least it wouldn't be public. I can see why this would not be ideal in your type of situation, but since you didn't mention it, I wrote this answer just in case.
Alternatively, you could also store the cplex.jar on your own server, and then store the URL in an encrypted environment variable.
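A sketch of that second option, reusing the install steps from the question (the URL and variable name are placeholders; travis encrypt is part of the Travis CLI):
# one-off, on your machine: store the private download location as an encrypted variable
travis encrypt CPLEX_URL=https://your-server.example.com/cplex.jar --add env.global
# then, in the before_install phase: fetch the jar and install it into the local Maven repo
curl -o cplex.jar "$CPLEX_URL"
mvn install:install-file -DgroupId=cplex -DartifactId=cplex -Dversion=12.6.1 -Dpackaging=jar -Dfile=cplex.jar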
In my project we are using QC to execute our test cases (QTP); moving forward we will be eliminating QC (for cost reasons).
As far as I have explored, MSBuild & Jenkins would be suitable.
But MSBuild will trigger the execution when a new build is pushed to the repository, and it will automatically test the latest build.
Is there any other CI tool available to execute test cases through QTP?
I will be executing the automation once per release. Also, we install our application manually, since it requires lots of configuration.
Take a look at HP Application Automation Tools.
This plugin basically replaces the need for QC, and is developed by HP.
Create a Jenkins job using this plugin on the same Jenkins installation used to build your code; then you can configure your job to run your tests as soon as the code is available (e.g. on a nightly basis).
See here for a helpful guide on how to implement a simple Jenkins job using this plugin.
They also host the code on Github, which is very useful if you need to change the behavior of the plugin to suit your needs.
Is there any repository manager that manages binary dll files and also integrates well with Jenkins?
Can Nexus be used to manage dll files? These files are created as part of Embedded C/C++ projects, and I am not sure whether Nexus Artifact Manager supports/integrates well with such projects, since it mainly supports Java projects.
Is there a way to automatically manage the upload and download of such project artifacts from Nexus/other artifact managers without the use of POM file?
Suggestions for other artifact managers that support binary artifacts are also welcome.
Artifactory can be used to store any type of binaries.
Starting with Artifactory 4.0, you can create generic repositories which allows uploading packages of any type. You will not need to upload any POM files and Artifactory will not need to calculate any metadata (for example Maven metadata).
To deploy files you can use the REST API or the UI, for example:
curl -uUSER:PASS -T file.dll http://localhost:8081/artifactory/dll-local/path/to/file.dll
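Retrieving the file later is a plain HTTP GET against the same path, for example:
curl -uUSER:PASS -O http://localhost:8081/artifactory/dll-local/path/to/file.dll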
If you have a certain layout you would like to use for this repository you can create a custom layout and associate it with the repository. This can be useful for automatic snapshot/integration versions cleanup and other module management tasks.
Disclaimer: I'm affiliated with Artifactory
The Nexus repository manager is Java-oriented, but it can be used to store any files you want: binaries of all types, or even just text configuration files.
To automate the file upload process, you can use maven from command line:
mvn deploy:deploy-file -DgroupId=com.you -DartifactId=file -Dversion=1.0 -Dpackaging=exe -Dfile=c:\out\file.exe -Durl=http://yourserver/nexus/content/repositories/releases -DrepositoryId=releases
Then, to get the file, you should be able to get it directly with the following URL:
wget http://yourserver/nexus/content/repositories/releases/com/you/file/1.0/file-1.0.exe
This is a simple approach to using Nexus as a general artifact repository.
I hope this helps.
The open source version of Nexus (Nexus OSS) supports many repository formats out of the box, including Maven, NuGet, NPM, RubyGems and others. Nexus just runs on Java (like Jenkins, for example); it is not Java-only...
Depending on how you plan to get the DLL files from the repository, different formats might be more or less suited to your usage. You could even use a custom format, but then you rely on custom tools.
The scenarios I have seen at many customers are:
using a Maven repo and pulling the files in either in a Maven build together with the Maven NAR Plugin (used for native development with C/C++)
using a Maven repo and pulling via plain HTTP GET calls using your scripting language/build tool of choice
using the NuGet format, storing the DLLs in NuGet packages in the repo, and using nuget to retrieve them for the projects (see the sketch after this list)
All of these work well.
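As a minimal sketch of the NuGet route (the package name, repository name, and server URL are hypothetical; the URL layout shown is the Nexus 2 style):
# fetch the package containing the DLLs from the Nexus-hosted NuGet repository
nuget install MyNativeDlls -Source http://yourserver/nexus/service/local/nuget/nuget-hosted/ -OutputDirectory packages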