I have an app which uses rebar3, e.g. cowboy. My question is: how can I easily configure it to fetch cowboy from another host? That is, I want to point it from GitHub to another host.
You want to build cowboy, which has two dependencies that are git repositories on GitHub (https://github.com/ninenines/cowlib and https://github.com/ninenines/ranch, to be specific). But you want to fetch the repositories from some other host, like your company's own git server where you have mirrored all the public repositories you need.
You have a number of options:
Fork cowboy and change the dependency URLs in rebar.config. This is the only (sane) way I know to make Rebar3 fetch the dependencies from a different location. The other options target the layers below Rebar3 to achieve the same result.
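For example, the deps section of the forked rebar.config could point at your own git server. The host, paths, and branch below are placeholders; use whatever your mirror and the upstream rebar.config actually specify:

{deps, [
    {cowlib, {git, "https://git.mycompany.com/mirrors/cowlib.git", {branch, "master"}}},
    {ranch,  {git, "https://git.mycompany.com/mirrors/ranch.git",  {branch, "master"}}}
]}.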
Configure git to rewrite GitHub URLs to URLs on your server, following e.g. https://stackoverflow.com/a/11383587/9015322
git config --global url.https://git.mycompany.com/.insteadOf https://github.com/
Add an /etc/hosts entry that resolves github.com to the IP of your server. You'll probably have to create a fake self-signed certificate for github.com, and make git on the build machine trust it. But you can do that following the advice here: https://stackoverflow.com/a/16543283/9015322
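As a rough sketch of that last option (the IP address and certificate path are examples, not values from the linked answer):

# /etc/hosts on the build machine
192.0.2.10    github.com

# make git trust the self-signed certificate you created for github.com
git config --global http.sslCAInfo /etc/ssl/certs/fake-github.pem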
By default, rebar3 will only grab packages from hex.pm, a git repository, or a mercurial repository. You can see your options in the rebar3 documentation.
If these defaults are not enough for you, you can create your own dependency resources. This will require you to write some Erlang code in order to tell rebar3 how to find and download the package(s) you are trying to use.
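For illustration only, here is a heavily hedged skeleton of what such a module might look like. It assumes the callback names and arities of the rebar_resource_v2 behaviour as documented for recent rebar3 releases, and that the module is shipped inside a rebar3 plugin; verify all of this against the documentation for your rebar3 version, since none of it comes from the answer above.

%% my_mirror_resource.erl -- untested sketch of a custom dependency resource
-module(my_mirror_resource).
-behaviour(rebar_resource_v2).
-export([init/2, lock/2, download/4, needs_update/2, make_vsn/2]).

init(Type, _RebarState) ->
    {ok, rebar_resource_v2:new(Type, ?MODULE, #{})}.

lock(AppInfo, _ResourceState) ->
    rebar_app_info:source(AppInfo).           %% lock to whatever the dep declared

download(_TmpDir, _AppInfo, _ResourceState, _RebarState) ->
    %% fetch the package from your internal host into the temp dir here
    {error, not_implemented}.

needs_update(_AppInfo, _ResourceState) ->
    false.                                    %% never force a re-fetch in this sketch

make_vsn(_AppInfo, _ResourceState) ->
    {plain, "0.1.0"}.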
Related
Good afternoon,
As I understand Jenkins, if I need to install a plugin, it goes to the Jenkins Plugins site to fetch it.
The problem I have is Jenkins is installed on a closed network, it cannot access the internet. Is there a way I can download all of the plugins, place them on a web server on my local LAN, and have Jenkins reach out and download plugins as necessary? I could download everything and install one plugin at a time, but that seems a little tedious.
You could follow some or all of the instructions for setting up an Artifactory mirror for the plugin repo.
It will need to be an HTTP/HTTPS server, and you will find that many plugins have a multitude of dependencies.
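Once the mirror serves an update-center.json, you can point Jenkins at it instead of the public site. One way (a sketch; the mirror URL is an example for your LAN) is to edit ${JENKINS_HOME}/hudson.model.UpdateCenter.xml and restart Jenkins; the same URL can also be changed in the UI under Manage Jenkins > Plugin Manager > Advanced.

<?xml version='1.0' encoding='UTF-8'?>
<sites>
  <site>
    <id>default</id>
    <!-- was: https://updates.jenkins.io/update-center.json -->
    <url>https://mirror.mylan.local/jenkins/update-center.json</url>
  </site>
</sites>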
The closed network problem:
You can take a cue from the Jenkins Docker install-plugins.sh approach ...
This script takes as input a list of plugins, and optionally versions (e.g.: $0 workflow-aggregator:2.6 pipeline-maven:3.6.5 job-dsl:1.70), and will download all the plugins and dependencies into a working directory.
Our approach is to create a file (under version control) and redirect that to the command line input (i.e.: install-plugins.sh $(< plugins.lst)).
You can download them from where you do have internet access and then place them on your network, manually copying them to your ${JENKINS_HOME}/plugins directory and restarting the instance.
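A minimal sketch of that workflow (the download directory, file extension, and restart command are assumptions about your setup):

# on a machine with internet access
./install-plugins.sh $(< plugins.lst)     # downloads plugins + dependencies into its working directory

# move the downloaded .hpi/.jpi files onto the closed network, then on the Jenkins box:
cp /path/to/downloaded/*.jpi ${JENKINS_HOME}/plugins/
systemctl restart jenkins                 # or however you normally restart the instance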
The tedious list problem:
If you only specify top-level plugins (i.e. what you need), every time you run the script it will resolve the latest dependencies. That makes for a short list, but the resolved dependencies change whenever they get updated at https://updates.jenkins.io. You can use a two-step approach to address this: use the short list to download the required plugins and dependencies, then store the generated explicit list for future reference and repeatability.
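A rough sketch of that two-step flow, assuming the downloaded plugin files end up as .jpi files in the current directory (the location and extension may differ on your setup):

./install-plugins.sh $(< plugins.lst)       # step 1: short list of top-level plugins
for f in *.jpi; do                          # step 2: pin the full resolved set, with versions
  v=$(unzip -p "$f" META-INF/MANIFEST.MF | grep -i '^Plugin-Version' | tr -d '\r')
  echo "${f%.jpi}:${v#*: }"
done | sort > plugins-resolved.lst          # keep this file under version control too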
I am supposed to put up the Nexus Repository Manager for an existing Maven repository. However, the Maven repo is currently in production, and I will not be able to work on it directly. So I am planning to take an exact replica of the Maven repo into a sandbox and start working on that.
Will just copying the folder contents work?
Yes, just copying will work. We use rsync to do this and have no problems. If you are using Windows, you can also set up Cygwin to do this if you want to copy every so often.
At the very least you can just zip it up, copy it, and unzip it on the target machine.
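For example (the paths and hostname are placeholders):

# incremental copy of the repository storage to the sandbox host
rsync -av /srv/maven-repo/ sandbox-host:/srv/maven-repo/

# or a one-off copy
zip -r maven-repo.zip /srv/maven-repo
scp maven-repo.zip sandbox-host:/tmp/      # then unzip it on the sandbox host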
I have a project with several dependencies on remote repositories (all on github.com at the moment, if that helps). The dependencies don't change often. It would be nice if there was a way to keep the existing rebar.config files the same, so that they point to the upstream repositories, but to be able to cache the repos (or a snapshot) locally so that clean builds don't need to go to the internet.
Is there any way of doing this? I.e. rebar command line options, environment settings, git options, etc.?
I suppose you could do a couple of things:
Make your own local clone of all the repositories and change rebar.config to take the deps from there (see the sketch after these two options). At first glance it seems like a horrible solution, but it has a lot of advantages: GitHub is not always available, clone speed will increase, and most valuable of all, projects evolve and one day you will find that everything is broken because one of the deps changed its API on the master branch.
You could keep a local deps folder with all the repos you need and share it via a symlink with every project that needs it.
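A sketch of both options (the paths and URLs are examples):

# option 1: local mirrors you point rebar.config at
git clone --mirror https://github.com/ninenines/ranch.git /srv/git-mirrors/ranch.git
# then in rebar.config: {ranch, ".*", {git, "file:///srv/git-mirrors/ranch.git", "master"}}

# option 2: one shared deps checkout, symlinked into each project
ln -s /home/me/shared_deps /home/me/projects/myapp/deps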
Rebar has a feature that lets you add a custom script file, rebar.config.script, to modify rebar's configuration dynamically. This lets you implement something similar to #danechkin's answer #2 except using an environment variable to switch between the local shared deps folder and the default one for the project. No changes to rebar.config needed. The example at https://github.com/basho/rebar/wiki/Dynamic-configuration shows how to do this.
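A minimal sketch of such a script. The SHARED_DEPS_DIR variable name is made up for this example, and deps_dir is the old-rebar option controlling where deps are checked out; adapt both to your setup:

%% rebar.config.script -- CONFIG is bound to the terms from rebar.config;
%% whatever this script returns becomes the effective configuration.
case os:getenv("SHARED_DEPS_DIR") of
    false -> CONFIG;                                               % default: project-local deps
    Dir   -> lists:keystore(deps_dir, 1, CONFIG, {deps_dir, Dir})  % point rebar at the shared folder
end.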
We are using maven in the development process. Maven provides a nice feature of configuring the repositories. Using this feature I have created a remote internal repository and I can download the dependencies from that repository.
The development machines are pointing to this remote internal repository. Each development machine has its own local repository (~/.m2/repository/) and hence the dependencies of the project are downloaded from the remote internal repository to the local repository (~/.m2/repository/) on each developer machine.
Is there any way that the local repository (~/.m2/repository/) on the developer machines can be pointed directly at the internal remote repository that we have created and use for downloading the dependencies?
If you take a look at the Maven Introduction to Repositories, the first paragraph says:
There are strictly only two types of repositories: local and remote.
There is no way to change this behavior.
If you handled it differently it would cause many problems, e.g. builds would take much longer because all files would have to be downloaded over the network, and the IDE would not work properly (project dependencies would not be stored locally), ...
May I suggest another approach to sharing dependencies and artifacts. In our projects we use Nexus as a proxy and repository for our artifacts. It works well with no issues. I already posted a basic configuration here.
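The relevant part of each developer's settings.xml then looks roughly like this (the URL is a placeholder for your own Nexus instance):

<settings>
  <mirrors>
    <mirror>
      <id>internal-nexus</id>
      <mirrorOf>*</mirrorOf>
      <url>http://nexus.mycompany.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>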
Once Nexus is running you could also set up continuous integration using Jenkins and enjoy a fully automated environment.
Is your requirement to avoid each developer having to download all dependencies to their local repository?
Assuming your remote internal repository has the same format as a Maven local repository, you can achieve this by adding the following line to the settings.xml of all your developers.
<localRepository>shared-drive-location-of-remote-repository</localRepository>
I am developing some school grading software and decided to use Github to host the project. After building some code on my Ubuntu box I pushed it to Github and then cloned it down to my MacBook Pro. After editing the code on the MBP I pushed it back to Github. The next morning I tried to update my repo on the Ubuntu box with a git pull and it gave me all kinds of trouble.
What's the best way to work in this situation? I don't want to fork my own repo and I don't really want to send myself emails or pull requests. Why can't I just treat GitHub like a master and push/pull from it onto all of my personal repos on different computers?
I'll assume your problem was that the machine on which you first created the repo crapped out when you tried to issue the git pull command.
When you clone an existing git repository (like you did on your 2nd machine, the MacBook Pro), you're automatically set up so that your git pull commands will merge the remote changes with your local ones.
However, when you initially create a repo and then share it on a remote repository, you have to issue a few commands to make things as automated as on a cloned repo.
# GitHub gives you that instruction, you've already done that
# git remote add origin git@github.com:user_name/repo_name.git
# GitHub doesn't specify the following instructions
git config branch.master.remote origin
git config branch.master.merge refs/heads/master
These last two commands configure git so that future git pulls from this repo will merge all remote changes automatically.
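As an aside (not part of the original setup above), newer git versions can do the same tracking setup in one step:

git push -u origin master          # pushes and records origin/master as the upstream
# or, if the remote branch already exists:
git branch --set-upstream-to=origin/master master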
The following is a bit of shameless self-promotion. If you use Ruby, I have created a Ruby-based tool that lets you deal with all these kinds of things with git remote branches. The tool is called, unsurprisingly, git_remote_branch :-)
If you don't use Ruby, my tool is probably gonna be too much of a hassle to install. What you can do is look at an old post on my blog, where most of the stuff grb can do for you was explicitly shown. Whip out your git notes file :-)
You can also add multiple SSH public keys to your GitHub account, one for each machine you work from.
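For example, on each machine (the key type and comment are just an illustration):

ssh-keygen -t ed25519 -C "macbook-pro"      # one key pair per machine
cat ~/.ssh/id_ed25519.pub                   # paste this into GitHub -> Settings -> SSH and GPG keys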