My development environment consists of a Git repository, the GitLab repository manager, and Jenkins. During the build process, documentation is generated with Doxygen in HTML format.
Is it possible to store that documentation on the GitLab project's wiki? I know that HTML is not supported in Gollum, which is GitLab's wiki engine. Converting HTML to Markdown is not satisfactory because of the internal links in the HTML files that point to other HTML files.
Should I store the documentation in a separate wiki instead and only add a link to it in the GitLab project wiki?
I guess the answer depends on what you use your HTML documentation for and how widely you distribute it.
First of all, which version of GitLab are you using?
1. If it is only for your developers to access (i.e. these are private documents), then on GitLab >= 8.1.2 there is a service called External Wiki that lets you replace the project's Wiki link with any URL of your choice. Just set up a web server serving your HTML documentation, have your build server upload the newest version after each build, and call it a day.
2. If your documentation is code that you want to version control and distribute, then do so.
Whether you go for 1. or 2., it is still a good idea to keep the whole documentation in a separate Git repository, because you get compression for free and using git pull is much better than using rsync to synchronise local or remote directories. Then it is just a matter of setting up a shell script or Git hooks to automate all of it for you, either at commit or build time; a rough sketch follows below.
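As a sketch only, assuming a hypothetical project-docs repository on your GitLab server and a Jenkins build (repository URL and paths are placeholders; $BUILD_NUMBER is provided by Jenkins):

    # on the build server, after Doxygen has regenerated doc/html/
    git clone --depth 1 git@gitlab.example.com:group/project-docs.git docs-repo
    rsync -a --delete --exclude .git doc/html/ docs-repo/          # local copy into the clone
    cd docs-repo
    git add -A
    git commit -m "Update docs for build $BUILD_NUMBER" || true    # no-op when nothing changed
    git push origin master

    # on the web server behind the External Wiki link, a cron job or hook just runs
    git pull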
Hope this helps!
My former web developer set up my site so that it uses Jenkins and GitHub. I understand the very basics of GitHub and even less of Jenkins. But in theory, when I make minor text changes to my website, can't GitHub manage the process of pushing those changes to the server? Or is there some good reason that Jenkins is also involved?
Thank you.
Yes. It's not a must, but using both Jenkins and GitHub will make your life easier. GitHub and Jenkins are two tools that serve different functions.
GitHub will mainly help you manage your codebase, resolve conflicts, etc., so it basically acts as a repository. You can commit your changes, get others' updates, and always stay up to date. There are tons of other advantages, but I'll keep it simple for the sake of understanding.
Jenkins is an open-source automation server. In your case, you can automate building the product. For example, if you have a test environment, or even when you deploy the changes to live, you can do all of that with just a click. You can also build the test and live environments separately, and with concepts like pipelines you can even integrate the build with tests, etc.
But if you are talking about your local environment, yes, Git is enough because you can build the project manually. In production, however, having both Git and Jenkins is a handy option; a sketch of the kind of steps Jenkins would automate follows below.
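As an illustration only, here is roughly what a Jenkins job would script for a simple site; the server name, paths, and build command are placeholders, since I don't know your stack:

    # roughly what you would otherwise do by hand after every change
    git pull origin master                                     # get the latest commits from GitHub
    npm run build                                              # or whatever your site's build step is
    rsync -az build/ deploy@www.example.com:/var/www/site/    # publish the result to the web server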
Read more on Jenkins
I'm developing an open source project containing a number of optimization tools. I've uploaded the project to GitHub and I would like to automatically run the test suite every time someone submits a pull request. To this end I was planning on using Travis CI. The problem is that the test suite depends on a 3rd-party solver (IBM CPLEX).
To run the test suite locally on my computer, I would do the following:
Download and install the IBM CPLEX solver
Install cplex.jar in my local Maven repository: mvn install:install-file -DgroupId=cplex -DartifactId=cplex -Dversion=12.6.1 -Dpackaging=jar -Dfile=/opt/ILOG/CPLEX_Studio1261/cplex/lib/cplex.jar
Set my LD_LIBRARY_PATH variable to point to the solver's native libraries: export LD_LIBRARY_PATH=/opt/ILOG/CPLEX_Studio1261/cplex/bin/x86-64_linux/:$LD_LIBRARY_PATH
Compile/run the test suite.
Problems:
CPLEX is not open source; I don't want to upload it to my GitHub repository. In addition, its unpacked size is quite large (1 GB).
Is there a way to upload the necessary solver files to Travis CI without making them publicly available? This Stack Overflow question describes how I could get my cplex.jar into Travis, but as far as I can tell I would have to put the jar on some web server and add a clearly readable link to it in the .travis.yml file.
Even if I manage to get cplex.jar into Travis, how do I get the native libraries there as well? Their size is quite big, so it would be undesirable if Travis had to download these libraries every time it performs a build. Furthermore, I don't want to make these libraries available to anyone but the Travis test system.
If it turns out that the above is not possible: is there another CI system, perhaps one that I can run on a private server, that could do this and run whenever a pull request is submitted through GitHub?
You may want to look at Travis file encryption. You would still need to add the (albeit encrypted) cplex.jar to your Git repository, but at least it wouldn't be public. I can see why this would not be ideal in your type of situation, but since you didn't mention it, I wrote this answer just in case.
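For reference, a sketch of how that could look with the Travis CLI (treat this as an outline; the key names Travis generates and the exact decryption step it writes will differ in your case):

    # encrypt the jar locally and commit only the encrypted copy
    gem install travis
    travis login
    travis encrypt-file cplex.jar --add   # writes cplex.jar.enc and appends a decryption step to .travis.yml
    git add cplex.jar.enc .travis.yml
    git commit -m "Add encrypted cplex.jar for CI"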
Alternatively, you could also store the cplex.jar on your own server, and then store the URL in an encrypted environment variable.
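A possible sketch of that variant, with a placeholder URL and variable name; in the build you would then download and install the jar much like you do locally:

    # one-time, on your machine: store the private download URL as an encrypted variable
    travis encrypt CPLEX_URL=https://my-server.example.com/private/cplex.jar --add env.global

    # in .travis.yml (before_install/install), roughly:
    curl -sSfL -o cplex.jar "$CPLEX_URL"
    mvn install:install-file -DgroupId=cplex -DartifactId=cplex -Dversion=12.6.1 -Dpackaging=jar -Dfile=cplex.jar
    # the native libraries would need a similar (much larger) download, or a cached directory

Keep in mind that encrypted variables and encrypted files are not available to builds triggered by pull requests from forks, which matters here since you want to run the suite on pull requests.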
A web application typically consists of code, config and data. The code can often be made open source on GitHub, but per-instance config and data may contain secrets and are therefore inappropriate to store there. Data can be imported into persistent storage, so disregard it for now.
Assuming the configs are file-based and saved in a separate private, secured SVN repo, then in order to deploy the web app to OpenShift and implement CI, I need to merge the config files with the code prior to running the build scripts. In addition, the build strategy should support GitHub webhooks for automated builds.
To be more specific, my questions are:
Does the OpenShift BuildConfig support multiple data sources, especially SVN?
If not, how do I deploy such a web app to OpenShift?
The solution I have come up with so far (sketched below after the steps):
Instead of relying on OpenShift for CI, use Jenkins.
Merge config files with code using Jenkins.
Instead of using the Git source type in the BuildConfig, use the binary source type.
Let Jenkins run
oc start-build --from-dir=<directory>
where <directory> contains the merged code and config.
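A minimal sketch of what that Jenkins shell step could look like, with placeholder repository URLs, paths, and BuildConfig name:

    # check out code and config from their respective repositories
    git clone https://github.com/example/webapp.git app
    svn checkout https://svn.example.com/config/trunk app-config   # the private, secured SVN repo

    # merge the config into the source tree, then hand the result to OpenShift as a binary build
    cp -r app-config/. app/config/
    oc start-build webapp-build --from-dir=app --follow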
I'm wondering if there is an easy way to "publish" p2 update sites in Jenkins (built with Tycho) so that they can easily be accessed in downstream jobs. Currently I'm doing it semi-manually using Jenkins' support for copying artifacts between jobs, and then specifying a repository-mirror element in a job-specific settings.xml which refers to the artifacts copied into the job, but this is all a little tricky and requires configuring jobs and build settings in a number of different places.
Is there any nicer way short of using an external solution such as Artifactory?
The only solution involving a repository manager that I am aware of is to use Nexus and the Unzip Plug-in. (Disclaimer: the Unzip Plug-in is provided by the Tycho project, of which I am a committer.)
With such a setup, you could have one job deploy an update site to Nexus, and the next job use the update site via the unzip URL of the deployed site. Example: If the site was deployed under the GAV project.abc:site:1.0.0-SNAPSHOT, you could then access it via http://<nexus>/content/repositories/<unzip-repo-name>/project/abc/site/1.0.0-SNAPSHOT/site-1.0.0-SNAPSHOT-unzip/.
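For illustration, the upstream job could deploy the zipped site produced by the Tycho build with something like the following (the zip path, repository URL, and repository id are assumptions about your setup); the downstream job then points its repository configuration at the unzip URL shown above:

    # deploy the zipped p2 repository to a Nexus repository that has the Unzip Plug-in enabled
    mvn deploy:deploy-file \
        -DgroupId=project.abc -DartifactId=site -Dversion=1.0.0-SNAPSHOT \
        -Dpackaging=zip -Dfile=site/target/site-1.0.0-SNAPSHOT.zip \
        -Durl=http://<nexus>/content/repositories/snapshots \
        -DrepositoryId=nexus-snapshots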
Note that you are slightly less flexible with such a setup than with what you have set up now: you need to have a version number for what your upstream project is building, so this may become tricky if you have multiple feature branches developing towards the same release version.
If you don't need this, you have the benefit of getting a portable build of your downstream project, i.e. developers build the project in the same way as your Jenkins does.
I'm trying to deploy my MVC4 app to AWS Elastic Beanstalk. The project has several post-build steps which pull together dependencies. The AWS SDK publish wizard therefore does not do the trick: it builds a Web Deploy package behind the scenes, which does not run those post-build steps or preserve the resulting directory structure.
So, I downloaded the command-line EB tools and got a Git repository working, but I can't work out the next step: what do I push to the server with git aws.push? If it's just the resulting files, then I can't specify the "Enable 32-bit applications" flag (required), etc. Do I then push a Web Deploy package from my repository?
I presume so, but if so, how do I include the files copied into the output folder during "normal" builds by my post-build steps?
Here we go. This seems to be in conflict with what Jim Flanagan was saying: below it's a zip file, but Jim says it's the contents of it.
@Jim Flanagan - perhaps you could comment if you have some time. Thanks.
Hi, thanks for contacting AWS Premium Support.
Communication from the Elastic Beanstalk Engineering Team.
When you aws.push an ASP.NET/MVC app you do not push the Web Deploy archive; rather, you push the artifacts as you want them deployed on the machine. From the customer's Stack Overflow question it seems they have already found the local Git repo that the VS deployment wizard created, and looking there should give them a good indication of what is needed in the Git repository.
There isn't a nice way through aws.push to specify what the "Enable 32-bit Applications" app pool setting should be (or any other configuration setting). If you need a specific configuration setting, I would suggest creating the environment (via the console or using the eb command-line tool), which allows you to specify the configuration. Then use git aws.push to deploy to that environment; git aws.push will just use the configuration that is already present on the environment.
The last question about still being incremental is not really valid, since you are not pushing just one big zip file. But even if you were, it could still be incremental depending on what changed in the zip file; it might just send a diff between the two versions of the zip file. As the question implies, though, that use case is not really what incremental deployments were designed to help with.
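Putting the reply above into concrete terms, a rough sketch of the workflow (paths are placeholders; the old EB tools and git aws.push are assumed, as in the question):

    # after the normal build (including the post-build steps) has produced the deployable folder,
    # copy its contents into the working tree of the Elastic Beanstalk Git repository
    cp -r publish-output/. eb-repo/        # robocopy /E on Windows
    cd eb-repo
    git add -A
    git commit -m "Deploy build with post-build artifacts"
    git aws.push                           # deploys the committed tree, not a Web Deploy archive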